Enterprises own a significant fraction of the hosts connected to the Internet and possess valuable assets, such as financial data and intellectual property, which may be targeted by attackers. They suffer attacks that exploit unpatched hosts and install malware, resulting in breaches that may cost millions in damages. Despite the scale of this phenomenon, the threat and vulnerability landscape of enterprises remains under-studied. The security posture of enterprises remains unclear, and it is unknown whether enterprises are indeed more secure than consumer hosts. To address these questions, we perform the largest and longest enterprise security study to date. Our data covers nearly 3 years and is collected from 28K enterprises, belonging to 67 industries, which own 82M client hosts and 73M public-facing servers. Our measurements comprise two parts: an analysis of the threat landscape and an analysis of enterprise vulnerability patching behavior. The threat landscape analysis studies the encounter rate of malware and PUP on enterprise client hosts. It measures, among other findings, that 91%-97% of the enterprises, and 13%-41% of the hosts, encountered at least one malware or PUP file over the length of our study; that enterprises encounter malware much more often than PUP; and that some industries, such as banks and consumer finance, achieve significantly lower malware and PUP encounter rates than the most-affected industries. The vulnerability analysis examines the patching of 12 client-side and 112 server-side applications on enterprise client hosts and servers. It measures, among other findings, that it takes over 6 months on average to patch 90% of the population across all vulnerabilities in the 12 client-side applications; that enterprise hosts are faster to patch vulnerabilities than consumer hosts; and that the patching of server applications is much worse than the patching of client-side applications.
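The "time to patch 90% of the population" metric described above can be sketched as follows. This is a minimal illustration, not the paper's actual methodology: the function name, the input shape (one patch date per host, `None` for hosts that never patch), and the example dates are all assumptions for demonstration.

```python
from datetime import date, timedelta

def time_to_patch(patch_dates, fraction=0.90):
    """Days from the first observed patch until `fraction` of hosts are patched.

    patch_dates: one date per host, the day that host applied the fix.
    Hosts that never patch are passed as None and count against coverage.
    """
    patched = sorted(d for d in patch_dates if d is not None)
    needed = int(len(patch_dates) * fraction)
    if needed == 0 or len(patched) < needed:
        return None  # the population never reached the target fraction
    # The day the needed-th host patched, relative to the earliest patch.
    return (patched[needed - 1] - patched[0]).days

# Hypothetical population: nine hosts patch over ~6 months, one never does.
dates = [date(2023, 1, 1) + timedelta(days=d)
         for d in (0, 2, 5, 9, 20, 40, 80, 120, 190)]
print(time_to_patch(dates + [None]))  # → 190
```

A long tail of slow or absent patchers dominates this metric, which is why the abstract reports averages exceeding 6 months even when most hosts patch quickly.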
The Transport Layer Security (TLS) protocol is the de-facto standard for encrypted communication on the Internet. However, it has been plagued by a number of attacks and security issues in recent years. Addressing these attacks requires changes to the protocol, to server- or client-side software, or to all of them. In this paper we conduct the first large-scale longitudinal study examining the evolution of the TLS ecosystem over the last six years. We place a special focus on the ecosystem's evolution in response to high-profile attacks. For our analysis, we use a passive measurement dataset with more than 319.3B connections since February 2012, and an active dataset that contains TLS and SSL scans of the entire IPv4 address space since August 2015. To identify the evolution of specific clients, we also create the largest TLS client fingerprint database to date, to our knowledge, consisting of 1,684 fingerprints. We observe that the ecosystem has shifted significantly since 2012, with major changes in which cipher suites and TLS extensions clients offer and servers accept. Where possible, we correlate these changes with the timing of specific attacks on TLS. At the same time, our results show that while clients, especially browsers, are quick to adopt new algorithms, they are slow to drop support for older ones. We also encounter significant amounts of client software that probably offer unsafe ciphers unwittingly. We discuss these findings in the context of long-tail effects in the TLS ecosystem.
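TLS client fingerprinting of the kind mentioned above generally works by hashing the stable, client-chosen fields of the ClientHello message. The sketch below follows the well-known JA3 approach (an MD5 over the protocol version, cipher suites, extensions, elliptic curves, and point formats); the paper builds its own fingerprint database, so the exact fields and hash here are an assumption, and the example values are hypothetical.

```python
import hashlib

def client_fingerprint(version, ciphers, extensions, curves, point_formats):
    """JA3-style fingerprint: MD5 over ClientHello fields joined in a fixed order.

    All inputs are the raw integer code points as sent on the wire, in the
    order the client sent them; ordering is part of the fingerprint.
    """
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(fields).encode()).hexdigest()

# A hypothetical browser ClientHello: TLS 1.2 (0x0303 = 771) with a few
# ECDHE cipher suites and common extensions.
fp = client_fingerprint(771, [49195, 49199, 52393], [0, 10, 11, 13], [29, 23], [0])
print(fp)  # stable across connections from the same client build
```

Because the fingerprint changes whenever a client adds or drops a cipher suite or extension, tracking fingerprints over time is what lets a longitudinal study attribute ecosystem shifts to specific client software updates.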