2017
DOI: 10.1007/s11859-017-1224-7
A forensic method for efficient file extraction in HDFS based on three-level mapping

Cited by 8 publications (5 citation statements)
References 13 publications
“…The experiment undertaken in the paper uses the 3L mapping previously established; the authors write: "Without 3L mapping, it is difficult to overcome the problems caused by the features of cloud and HDFS to implement file extraction." [7]. This is of specific relevance to the work in this project: any analysis of RAM or processes must either mitigate or solve this mapping issue.…”
Section: Literature Review
confidence: 96%
“…Gao and Li [7] put forward a three-level mapping for efficient file extraction in their paper A Forensic Method for Efficient File Extraction in HDFS Based on Three-Level Mapping. They state the importance of discovering the mapping between files and nodes, which can aid in the extraction of files.…”
Section: Literature Review
confidence: 99%
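The file-to-node mapping that the citing paper highlights can be pictured as a chain of lookups: a file resolves to its blocks, and each block resolves to the datanodes that store replicas of it. The sketch below is a toy illustration of that idea only; all names and structures are hypothetical and do not reproduce the authors' implementation or the HDFS NameNode API.

```python
# Toy model of a file -> blocks -> datanodes lookup chain.
# Resolving the chain tells an investigator which nodes must be
# visited to extract every piece of a target file.
file_to_blocks = {"/evidence/log.txt": ["blk_1", "blk_2"]}
block_to_nodes = {
    "blk_1": ["node-a", "node-b"],  # replicas of block 1
    "blk_2": ["node-b", "node-c"],  # replicas of block 2
}

def nodes_holding(path):
    """Collect every datanode that stores at least one block of `path`."""
    nodes = set()
    for blk in file_to_blocks.get(path, []):
        nodes.update(block_to_nodes.get(blk, []))
    return sorted(nodes)

print(nodes_holding("/evidence/log.txt"))  # → ['node-a', 'node-b', 'node-c']
```

Without such a mapping, an investigator would have to sweep every node in the cluster; with it, extraction can target only the nodes that actually hold the file's blocks.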
“…HDFS is a large distributed file system that can build an efficient and stable storage cluster from personal computers. HDFS uses partitioned storage; that is, files are split into blocks that are distributed across different nodes [18]. In addition to the above two core technologies, the Hadoop architecture also includes Hive, HBase, and other components.…”
Section: Key Technologies of the Realization of Information Management System
confidence: 99%
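The partitioned storage described above can be made concrete with a minimal sketch of how a file is cut into fixed-size blocks. This is an illustration of the splitting arithmetic only, not HDFS code; the 128 MB figure is HDFS's default block size, and the function name is our own.

```python
# Illustrative only: split a file of `file_size` bytes into fixed-size
# blocks, as HDFS does before distributing them across datanodes.
BLOCK_SIZE = 128 * 1024 * 1024  # HDFS default block size (128 MB)

def split_into_blocks(file_size, block_size=BLOCK_SIZE):
    """Return (offset, length) pairs, one per block of the file."""
    blocks = []
    offset = 0
    while offset < file_size:
        length = min(block_size, file_size - offset)  # last block may be short
        blocks.append((offset, length))
        offset += length
    return blocks

# A 300 MB file becomes three blocks: 128 MB, 128 MB, and 44 MB.
print(split_into_blocks(300 * 1024 * 1024))
```

Because each block may land on a different node, reassembling a file for forensic purposes requires exactly the kind of block-to-node mapping the surveyed paper addresses.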
“…There have been several studies on the collection and analysis of cloud evidence, 4‐7 all of which monitor cloud networks, processes, data access, and other behaviors through an API (application programming interface) and provide the log information needed for evidence collection; Reference 8 discusses log monitoring models for IaaS (infrastructure as a service) and PaaS (platform as a service). Taking the Hadoop distributed file system (HDFS) as an example, Reference 9 proposes a forensics method based on three‐layer mapping to achieve effective file extraction. Reference 10 proposes a time correlation method for heterogeneous virtual machine event logs.…”
Section: Related Work
confidence: 99%