This paper presents the development of artificial neural network (ANN) models that compute received signal strength (RSS) for four VHF (very high frequency) broadcast stations from measured atmospheric parameters. The network was trained using the Levenberg-Marquardt back-propagation (LMBP) algorithm. The effects of different activation functions at the hidden and output layers, of varying the number of neurons in the hidden layer, and of different types of data normalisation were systematically evaluated during training. The mean and variance of the computed MSE (mean square error) over ten different iterations were compared for each network. The ANN model performed well: the computed signal strength values fit the measured values closely, and the computed MSEs were very low, ranging between 0.0027 and 0.0043. The accuracy of the trained model was tested on separate datasets and yielded good results, with an MSE of 0.0069 for one dataset and 0.0040 for another. The measured field strength was also compared with the ANN and ITU-R P.526 diffraction models: a strong correlation was found between the measured field strength and the ANN-computed signals, but no correlation between the measured field strength and the field strength predicted by the diffraction model. The ANN has thus proved to be a useful tool for computing signal strength from atmospheric parameters.
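The training setup described above can be sketched as follows. This is a minimal illustration, not the paper's model: the atmospheric features and targets here are synthetic stand-ins, the network is a single tanh hidden layer with a linear output, min-max normalisation is one of the schemes the paper evaluates, and plain gradient descent is used in place of Levenberg-Marquardt back-propagation for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 samples, 3 hypothetical atmospheric
# features (the paper's actual inputs and units are not reproduced here).
X = rng.uniform(size=(200, 3))
y = (0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * np.sin(6.0 * X[:, 2])).reshape(-1, 1)

def minmax(a):
    """Min-max normalisation to [0, 1]."""
    return (a - a.min(axis=0)) / (a.max(axis=0) - a.min(axis=0) + 1e-12)

Xn, yn = minmax(X), minmax(y)

# One hidden layer with tanh activation and a linear output layer.
n_hidden = 10
W1 = rng.normal(scale=0.5, size=(3, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

lr, first_mse, mse = 0.2, None, None
for epoch in range(5000):
    h = np.tanh(Xn @ W1 + b1)      # forward pass: hidden activations
    pred = h @ W2 + b2             # linear output
    err = pred - yn
    mse = float((err ** 2).mean())
    if first_mse is None:
        first_mse = mse
    # Back-propagate the MSE gradient (plain gradient descent, not LMBP).
    g_pred = 2.0 * err / len(Xn)
    gW2, gb2 = h.T @ g_pred, g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    gW1, gb1 = Xn.T @ g_h, g_h.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(first_mse, mse)  # training error should drop substantially
```

Varying `n_hidden`, the hidden/output activations, and the normalisation scheme, then comparing the mean and variance of the final MSE over repeated runs, mirrors the systematic evaluation the abstract describes.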
In today’s technology-driven world, most millennials are tech-savvy. They have neither the time nor the inclination to read textbooks, newspapers or journals; they expect instant answers and clarifications for their doubts and questions, and on many occasions we cannot find the exact word or meaning we are searching for. A clear, concise summary of a piece of literature, whose content can be grasped at a glance, therefore saves a great deal of time. This paper discusses the use of Natural Language Processing (NLP) to summarize a given text, textbook or paper. The state of the art in this field is represented by Google’s Bidirectional Encoder Representations from Transformers (BERT), one of the latest developments in NLP. BERT is believed to understand English better than earlier models because of its underlying bidirectional architecture. The present proposal is to use BERT as a sentence-similarity extractor: by applying the TextRank algorithm, the sentences holding the most important information are extracted. This falls under the domain of extractive summarization. Abstractive summarization is much discussed, but since Google BERT is not built for generating text, we use it in a different way to meet the requirement. This paper intends to promote the use of BERT for the next generation of learners, saving them time and encouraging researchers to keep developing new programs in the future.
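The extractive pipeline described above can be sketched as follows: score each sentence by TextRank over a sentence-similarity graph, then keep the top-ranked sentences. The paper derives sentence similarity from BERT representations; this self-contained sketch substitutes toy bag-of-words vectors so it runs without any model weights, and the sample sentences are illustrative only.

```python
import numpy as np

def summarize(sentences, k=2, damping=0.85, iters=50):
    """Rank sentences with TextRank over a similarity graph; return the top k."""
    tokens = lambda s: [w.strip('.,;:!?') for w in s.lower().split()]
    vocab = sorted({w for s in sentences for w in tokens(s)})
    def vec(s):
        # Toy bag-of-words vector (stand-in for a BERT sentence embedding).
        v = np.zeros(len(vocab))
        for w in tokens(s):
            v[vocab.index(w)] += 1.0
        return v
    E = np.stack([vec(s) for s in sentences])
    # Cosine similarity matrix = edge weights of the sentence graph.
    norms = np.linalg.norm(E, axis=1, keepdims=True)
    S = (E @ E.T) / (norms @ norms.T + 1e-12)
    np.fill_diagonal(S, 0.0)
    # Row-normalise into transition probabilities, then power-iterate
    # the PageRank-style update to obtain TextRank scores.
    P = S / (S.sum(axis=1, keepdims=True) + 1e-12)
    r = np.full(len(sentences), 1.0 / len(sentences))
    for _ in range(iters):
        r = (1.0 - damping) / len(sentences) + damping * (P.T @ r)
    top = sorted(np.argsort(r)[::-1][:k])  # keep original document order
    return [sentences[i] for i in top]

sents = [
    "BERT produces contextual sentence representations.",
    "TextRank scores sentences by graph centrality.",
    "Extractive summarization keeps the highest scoring sentences.",
    "The weather was pleasant yesterday.",
]
summary = summarize(sents, k=2)
```

In the BERT-backed version, only `vec` changes: each sentence is encoded by the model and the same cosine-similarity graph and TextRank ranking apply unchanged, which is the sense in which BERT serves as a sentence-similarity extractor.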