Abstract-In this paper, we present a novel approach for improving the performance of a large class of CPU- and memory-intensive passive network monitoring applications, such as intrusion detection systems, traffic characterization applications, and NetFlow export probes. Our approach, called locality buffering, reorders the captured packets by clustering packets with the same destination port before they are delivered to the monitoring application, resulting in improved code and data locality and, consequently, in an overall increase in packet processing throughput and a decrease in the packet loss rate. We have implemented locality buffering within the widely used libpcap packet capturing library, which allows existing monitoring applications to transparently benefit from the reordered packet stream without any changes to application code. Our experimental evaluation shows that locality buffering significantly improves the performance of popular applications such as the Snort IDS, which exhibits a 40% increase in packet processing throughput and a 60% reduction in packet loss rate.
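The core idea of locality buffering can be illustrated with a short sketch. The function below is a hypothetical, simplified model of the reordering step (the real system operates inside libpcap on raw captured frames): it groups a batch of packets by destination port while preserving arrival order within each group, so the monitoring application processes long runs of same-protocol packets and benefits from warmer instruction and data caches.

```python
from collections import OrderedDict

def locality_buffer(packets):
    """Reorder a batch of packets so that packets with the same
    destination port are delivered back to back.

    `packets` is a list of (dst_port, payload) tuples; in a real
    libpcap-based implementation the port would be parsed from the
    TCP/UDP header of each captured frame.
    """
    # port -> packets, keyed in first-seen order; arrival order is
    # preserved within each port group
    buckets = OrderedDict()
    for dst_port, payload in packets:
        buckets.setdefault(dst_port, []).append((dst_port, payload))
    # Flush: deliver all packets of one port before moving to the
    # next, so the application repeatedly exercises the same code
    # path (e.g., the same protocol decoder) in long runs.
    reordered = []
    for group in buckets.values():
        reordered.extend(group)
    return reordered

# Example: a mixed batch of web (port 80) and DNS (port 53) traffic
batch = [(80, "p1"), (53, "p2"), (80, "p3"), (53, "p4"), (80, "p5")]
print(locality_buffer(batch))
# all web packets are now clustered together, followed by DNS packets
```

In practice the batch size trades latency for locality: larger buffers yield longer same-port runs but delay packet delivery to the application.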
Bandwidth usage monitoring is important for network troubleshooting and planning. Traditionally, used bandwidth is computed from router interface byte counters read via SNMP. This method permits only checking long-term averages of the total used bandwidth, with no information about short-term dynamics and no knowledge of which applications are consuming the most bandwidth. We describe the architecture of a novel passive bandwidth usage monitoring application. This application uses packet capture and advanced processing to continuously provide real-time information about bandwidth usage. The produced characteristics include information about short-term peaks and about the percentage of bandwidth used by different protocols in different layers of the OSI model hierarchy, including detection of application protocols that use dynamic ports.
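The accounting core of such a monitor can be sketched as follows. This is a hypothetical, minimal model (the class name, bucket scheme, and protocol labels are assumptions, not the paper's implementation): each captured packet contributes its wire length to a per-protocol counter for the time bucket covering its timestamp, so short-term peaks stay visible instead of being averaged away as with SNMP counters. In a real system the protocol label would come from a classifier capable of handling dynamic ports.

```python
from collections import defaultdict

class BandwidthAccumulator:
    """Accumulate per-protocol byte counts in fixed-size time buckets
    (a simplified sketch of passive bandwidth accounting)."""

    def __init__(self, bucket_seconds=1):
        self.bucket_seconds = bucket_seconds
        # (bucket_start_time, protocol) -> total bytes
        self.counters = defaultdict(int)

    def add_packet(self, timestamp, protocol, wire_len):
        # Truncate the timestamp to the start of its bucket
        bucket = int(timestamp) - int(timestamp) % self.bucket_seconds
        self.counters[(bucket, protocol)] += wire_len

    def peak_bytes(self, protocol):
        """Highest observed bytes-per-bucket for one protocol,
        i.e., the short-term peak invisible to long-term averages."""
        return max((b for (_, proto), b in self.counters.items()
                    if proto == protocol), default=0)

acc = BandwidthAccumulator(bucket_seconds=1)
acc.add_packet(10.2, "http", 1500)
acc.add_packet(10.7, "http", 1500)  # same one-second bucket as above
acc.add_packet(11.1, "dns", 90)
print(acc.peak_bytes("http"))  # prints 3000
```

Dividing each bucket's byte count by the bucket length yields a bytes-per-second rate; shrinking the bucket size sharpens peak detection at the cost of more counter state.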
Abstract-With the advent of dynamic and elusive distributed applications such as peer-to-peer file sharing systems, network administrators find it increasingly difficult to understand the types of applications running in their networks and the amount of traffic each application produces. In this paper, we present measurement results from the deployment of an accurate traffic characterization application in three National Research and Education Networks over a period of two months. Our observations go beyond traffic distribution; we explore application usage in terms of active IP addresses, the existence of IP addresses generating massive amounts of traffic, the asymmetry of incoming and outgoing traffic, and the existence of SPAM-sending mail servers.
Monitoring applications provide an important service in network-related activities, such as network monitoring, network management, and network software engineering. They address the need to understand exactly what occurs inside our networks and how each network interacts with the rest of the Internet. From private and local networks to large-scale corporate networks and intranets, there is an ever-growing need to characterize and analyze network traffic. Unfortunately, network monitoring applications have the side effect of generating huge amounts of real-time data that need to be processed, stored, and presented in an effective fashion. If this is done correctly and efficiently, network administrators, researchers, and users can extract useful information from the data, such as traffic patterns and newly deployed network protocols. In this paper we present our experiences with the combination of two tools, AppMon and Stager, and the study of the resulting system. AppMon is a network monitoring toolkit that performs per-application traffic classification. Stager is a tool that stores, aggregates, and presents long-term network statistics coming from multiple monitoring sites. We modified, combined, and extended these tools so that real-time data produced by AppMon are transferred, converted, and stored through Stager. The resulting system gives access to valuable aggregated long-term network data that were not available through existing tools and methods.