2018
DOI: 10.1007/978-3-030-05171-6_6

ProPatrol: Attack Investigation via Extracted High-Level Tasks

Abstract: Kernel audit logs are an invaluable source of information in the forensic investigation of a cyber-attack. However, the coarse granularity of dependency information in audit logs leads to the construction of huge attack graphs which contain false or inaccurate dependencies. To overcome this problem, we propose a system, called ProPatrol, which leverages the open compartmentalized design in families of enterprise applications used in security-sensitive contexts (e.g., browser, chat client, email client). To ach…

Cited by 11 publications (3 citation statements)
References 19 publications
“…While a number of experimental audit frameworks have incorporated notions of data provenance [27], [31], [76], [98] and taint tracking [45], [29], the bulk of this work is also based on commodity audit frameworks such as Linux Audit. Techniques have also been proposed to efficiently extract threat intelligence from voluminous log data [99], [100], [46], [23], [30], [24], [101], [102], [32], [33], [34], [25], [85], [28], [103], [35], [104], [105]; in this work, we make the use of such techniques applicable to RTS through the design of a system audit framework that is compatible with temporally constrained applications. Our approach to template generation in Ellipsis shares similarities with the notion of execution partitioning of log activity [84], [32], [85], [23], [24], which decomposes long-lived applications into autonomous units of work to reduce false dependencies in forensic investigations.…”
Section: Related Work
confidence: 99%
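The citation above describes execution partitioning: decomposing a long-lived application's audit trail into autonomous units of work so that a forensic query only links events within the same unit, reducing false dependencies. A minimal sketch of the idea, assuming hypothetical event records and an illustrative boundary predicate (neither taken from ProPatrol or any real audit framework):

```python
def partition_events(events, is_unit_boundary):
    """Group a process's ordered audit events into units of work.

    A new unit starts at each boundary event (e.g., accepting a new
    connection in a server's worker loop).
    """
    units, current = [], []
    for ev in events:
        if is_unit_boundary(ev) and current:
            units.append(current)
            current = []
        current.append(ev)
    if current:
        units.append(current)
    return units

# Illustrative worker loop: each 'accept' begins handling a new request.
events = [
    {"syscall": "accept", "fd": 4},
    {"syscall": "read", "fd": 4},
    {"syscall": "write", "fd": 5},
    {"syscall": "accept", "fd": 6},
    {"syscall": "read", "fd": 6},
]
units = partition_events(events, lambda ev: ev["syscall"] == "accept")
# Without partitioning, the write to fd 5 would appear causally linked to
# every later read; with partitioning it stays inside its own unit.
```

This is the core intuition behind the false-dependency reduction the quoted work refers to: coarse process-level provenance conflates all of a daemon's activity, while unit-level provenance does not.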
“…Not only does this approach take the responsibility of event logging out of the hands of the application developer, it also provides a unified view of system activity in a way that application-specific logging simply cannot. In particular, systems logs can be iteratively parsed into a connected graph based on the shared dependencies of individual events, facilitating causal analysis over the history of events within a system [26], [27], [28], [29], [30], [31], [32], [32], [33], [34], [35]. This capability is invaluable to defenders when tracing suspicious activities [23], [25], [24], to the point that the vast majority of cyber analysts consider audit logs to be the most important resource when investigating threats [22].…”
Section: Introduction
confidence: 99%
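The citation above describes parsing system logs into a connected graph over shared event dependencies and then performing causal analysis, e.g., tracing backward from a suspicious artifact. A minimal sketch using stdlib only; the edge direction convention (file read creates file→process, file write creates process→file) and all names are illustrative assumptions, not any specific audit format:

```python
from collections import defaultdict

def build_graph(events):
    """parents[node] = set of nodes that causally precede it."""
    parents = defaultdict(set)
    for src, dst in events:
        parents[dst].add(src)
    return parents

def backward_trace(parents, artifact):
    """Return all nodes on which `artifact` causally depends."""
    seen, stack = set(), [artifact]
    while stack:
        node = stack.pop()
        for p in parents.get(node, ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

# Illustrative audit events as (source, target) dependency edges.
events = [
    ("malicious.sh", "bash"),    # bash reads the script
    ("bash", "dropper.bin"),     # bash writes a binary
    ("dropper.bin", "dropper"),  # the binary is executed
    ("config.ini", "editor"),    # unrelated activity
]
deps = backward_trace(build_graph(events), "dropper")
# deps == {"dropper.bin", "bash", "malicious.sh"}
```

Backward tracing of this kind is what makes audit graphs valuable to defenders: unrelated activity ("config.ini" above) is excluded automatically because it shares no dependency path with the artifact under investigation.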
“…The need to aggregate and analyze various data sources has been well researched, including network traffic [18], [19], processes/system events [20], and audit logs [21]. However, the collection of large quantities of data introduces storage and searching challenges, which requires techniques to reduce data quantity and dimensionality, as explored by [22].…”
Section: Related Work
confidence: 99%