Currently, there is a growing demand to determine the protective status of vaccinated fish in order to prevent disease outbreaks. A set of parameters, including the infectious and immunological status of vaccinated salmonids from 622 Chilean farms, was analyzed during 2011–2014. The aim of this study was to optimize the vaccination programs of these farms by determining the protective state of vaccinated fish following oral immunization. This state was determined from the association between serum immunoglobulin M (IgM) concentration and the mortality rate of vaccinated fish. Salmonids were vaccinated with different commercial mono- or polyvalent vaccines against salmonid rickettsial septicemia (SRS) and infectious salmon anemia (ISA), first by the intraperitoneal injection of oil-adjuvanted antigens and then by the stimulation of mucosal immunity using oral vaccines as a booster vaccination. High levels of specific IgM antibodies were observed after injectable vaccination, reaching a maximum concentration at 600–800 degree-days. Similar antibody levels were observed when oral immunizations were administered. The high concentration of antibodies [above 2750 ng/mL for ISA virus (ISAv) and 3500 ng/mL for SRS] was maintained for a period of 800 degree-days after each vaccination procedure. In this regard, oral immunizations maintained a long-term high concentration of anti-SRS and anti-ISAv specific IgM antibodies. When the antibody concentration decreased below 2000 ng/mL, a window of susceptibility to SRS infection was observed in the farm, suggesting a close association between antibody levels and the protective status of the fish. These results demonstrate that, in the field, repeated oral immunizations are essential to maintain a high level of specific anti-pathogen antibodies and, therefore, the protective status throughout the whole productive cycle.
Dynamic Spectrum Access allows opportunistic use of the spectrum by identifying the wireless technologies sharing the same medium. However, detecting a given technology is often not enough to increase spectrum efficiency and mitigate coexistence problems due to radio interference. As a solution, recognizing traffic patterns can help select the optimal time to access the shared spectrum. To this end, we present a traffic recognition approach that, to the best of our knowledge, is the first non-intrusive method to detect traffic patterns directly from the radio spectrum, in contrast to traditional packet-based analysis methods. In particular, we designed a Deep Learning (DL) architecture that differentiates between Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) traffic, burst traffic with different duty cycles, and traffic with varying transmission rates. As input to these models, we explore the use of images representing the spectrum in time and in time-frequency. Furthermore, we present a novel data randomization approach that combines two state-of-the-art simulators to generate realistic synthetic data. Finally, we show that after training and testing our models on the generated dataset, we achieve an accuracy of ≥96% and outperform state-of-the-art DL methods based on IP packets.
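The core idea of inferring traffic patterns from the spectrum rather than from packets can be illustrated with a minimal sketch. The code below is not the paper's DL model: it synthesizes a per-time-bin channel-power trace for burst traffic with a chosen duty cycle and classifies it with a simple threshold rule, purely to show what "duty cycle seen from the spectrum" means. All names and parameters are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's model): distinguish burst traffic
# with different duty cycles directly from a time-domain view of the
# spectrum, instead of inspecting IP packets.

def synth_burst_signal(duty_cycle, n_bins=1000, period=50, seed=0):
    """Synthetic per-time-bin channel power: periodic 'on' bursts covering
    duty_cycle of each period, with a low noise floor elsewhere."""
    rng = np.random.default_rng(seed)
    on = (np.arange(n_bins) % period) < duty_cycle * period
    return np.where(on, 1.0, 0.01) + 0.005 * rng.standard_normal(n_bins)

def estimate_duty_cycle(power, threshold=0.5):
    """Fraction of time bins whose power exceeds a detection threshold."""
    return float(np.mean(power > threshold))

def classify_traffic(power):
    """Toy rule over the estimated duty cycle; the paper instead trains a
    DL classifier on time and time-frequency spectrum images."""
    dc = estimate_duty_cycle(power)
    return "high-duty-cycle burst" if dc > 0.5 else "low-duty-cycle burst"
```

A real pipeline would feed spectrogram images of such traces to a trained network, but the occupancy statistic above is the signal that makes the classes separable in the first place.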
IEEE 802.11 (Wi-Fi) is one of the technologies that provide high performance with a high density of connected devices to support emerging demanding services, such as virtual and augmented reality. However, in highly dense deployments, Wi-Fi performance is severely affected by interference. This problem is even worse in recent standards, such as 802.11n/ac, where features such as Channel Bonding (CB) are introduced to increase network capacity, but at the cost of using wider spectrum channels. Finding the best channel assignment in dense deployments under dynamic environments with CB is challenging, given its combinatorial nature. Therefore, the use of analytical or system models to predict Wi-Fi performance after potential changes (e.g., dynamic channel selection with CB, or the deployment of new devices) is not suitable, due to either low accuracy or high computational cost. This paper presents a novel, data-driven approach to speed up this process, using a Graph Neural Network (GNN) model that exploits the information carried in the deployment’s topology and the intricate wireless interactions to predict Wi-Fi performance with high accuracy. The evaluation results show that preserving the graph structure in the learning process yields a 64% accuracy increase over a naive approach, and around 55% over other Machine Learning (ML) approaches, when using all training features.
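The phrase "preserving the graph structure in the learning process" can be made concrete with a single message-passing step. The sketch below is illustrative only, under assumed toy features and fixed weights, and is not the paper's architecture: nodes stand for Wi-Fi devices, edges mark pairs that interact (e.g., share part of a bonded channel), and each node's embedding is computed by averaging its neighborhood before a learned transform.

```python
import numpy as np

# Minimal sketch of the GNN idea: a deployment is a graph, and node
# features are propagated over the topology so each node's prediction
# depends on its wireless neighborhood. Weights here are fixed, not learned.

def gnn_layer(adj, features, weight):
    """One message-passing step: mean-aggregate each node's neighborhood
    (including itself), then apply a linear transform and ReLU."""
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)
    propagated = (a_hat @ features) / deg        # mean aggregation
    return np.maximum(propagated @ weight, 0.0)  # ReLU

# Toy deployment: 4 nodes; edges mark interfering/overlapping pairs.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
# Assumed per-node features: [normalized channel width, traffic load].
features = np.array([[0.8, 0.5],
                     [0.4, 0.9],
                     [0.2, 0.1],
                     [0.4, 0.3]])
w = np.array([[0.5, -0.2], [0.3, 0.7]])          # stand-in "learned" weights
embeddings = gnn_layer(adj, features, w)
# A readout head would map each embedding to a throughput estimate.
throughput_pred = embeddings.sum(axis=1)
```

Because aggregation follows the adjacency matrix, two nodes with identical features but different neighborhoods get different embeddings, which is exactly the structural signal a feature-vector-only ML model discards.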
Recently, the operation of LTE in unlicensed bands has been proposed to cope with the ever-increasing mobile traffic demand. However, the deployment of LTE in such bands implies sharing spectrum with mature technologies such as Wi-Fi. Several studies have discussed this coexistence problem, suggesting that LTE implement adaptation mechanisms that leave transmission opportunities for Wi-Fi. While such adaptation mechanisms exist, they still negatively impact Wi-Fi performance, mainly due to the lack of collaboration/coordination mechanisms that inform about co-located networks' activities. In this paper, we propose a distributed spectrum management framework that enhances the performance of Wi-Fi, as a particular case, by detecting harmful co-located wireless networks and changing Wi-Fi's operating center frequency to avoid them. The framework is based on a Convolutional Neural Network (CNN) that identifies different wireless technologies and provides spectrum usage statistics. Experiments were carried out in a real-life testbed, and the results show that Wi-Fi maintains its performance when using our framework. This translates into an increase of at least 40% in overall throughput compared to non-managed Wi-Fi operation.
With the advent of Artificial Intelligence (AI)-empowered communications, industry, academia, and standardization organizations are progressing on the definition of mechanisms and procedures to address the increasing complexity of future 5G and beyond communications. In this context, the International Telecommunication Union (ITU) organized the first AI for 5G Challenge to bring industry and academia together to introduce and solve representative problems related to the application of Machine Learning (ML) to networks. In this paper, we present the results gathered from Problem Statement 13 (PS-013), organized by Universitat Pompeu Fabra (UPF), whose primary goal was to predict the performance of next-generation Wireless Local Area Networks (WLANs) applying Channel Bonding (CB) techniques. In particular, we overview the ML models proposed by participants (including Artificial Neural Networks, Graph Neural Networks, Random Forest regression, and gradient boosting) and analyze their performance on an open dataset generated using the IEEE 802.11ax-oriented Komondor network simulator. The accuracy achieved by the proposed methods demonstrates the suitability of ML for predicting the performance of WLANs. Moreover, we discuss the importance of abstracting WLAN interactions to achieve better results, and we argue that there is certainly room for improvement in throughput prediction through ML.
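Of the model families listed above, gradient boosting is compact enough to sketch end to end. The example below fits boosted decision stumps to a synthetic throughput-like target; it is a from-scratch illustration of the technique under made-up features, not a participant's model and not the Komondor dataset.

```python
import numpy as np

# Sketch of gradient boosting for throughput regression: repeatedly fit a
# depth-1 tree (stump) to the current residuals and add it with a small
# learning rate. Data here is synthetic and purely illustrative.

def fit_stump(x, residual):
    """Best single-feature threshold split minimizing squared error."""
    best = (np.inf, 0, 0.0, 0.0, 0.0)
    for j in range(x.shape[1]):
        for t in np.unique(x[:, j])[:-1]:       # keep both sides non-empty
            left = x[:, j] <= t
            lv, rv = residual[left].mean(), residual[~left].mean()
            sse = (((residual[left] - lv) ** 2).sum()
                   + ((residual[~left] - rv) ** 2).sum())
            if sse < best[0]:
                best = (sse, j, t, lv, rv)
    return best[1:]

def stump_predict(x, stump):
    j, t, lv, rv = stump
    return np.where(x[:, j] <= t, lv, rv)

def boost(x, y, n_rounds=20, lr=0.3):
    """Start from the mean and correct residuals round by round."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_rounds):
        pred = pred + lr * stump_predict(x, fit_stump(x, y - pred))
    return pred

rng = np.random.default_rng(1)
# Assumed features: [channel width, contention level]; toy throughput target.
X = rng.uniform(0, 1, size=(200, 2))
y = 50 * X[:, 0] / (1 + 3 * X[:, 1]) + rng.normal(0, 1, 200)
pred = boost(X, y)
mse = np.mean((y - pred) ** 2)
```

Production entries would use tuned libraries rather than hand-rolled stumps, but the residual-fitting loop is the whole mechanism.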