The huge demand for business intelligence applications over large volumes of enterprise data has resulted in the rapid adoption of high-performance data analytics. Hadoop-based high-performance computing environments are optimized for large files: data-centric execution, which localizes computation close to the data, delivers high performance when files are large. For small files, however, performance degrades and overhead grows because of the way Hadoop handles files, so Hadoop's resource allocation and scheduling policies must be improved to handle small files. This work proposes an integrated data, task, and resource management technique to speed up the processing of small files in a Hadoop-based high-performance computing environment. For data management, data placement is made dynamic with respect to access frequency and inherent data semantics. For task management, tasks are grouped based on the fine-grained semantic correlation of the data they process. For resource management, data blocks or storage nodes are replicated to improve access latency with minimal cost overhead. With this integrated management, the proposed solution increases the speedup of handling small files by at least 8.7% compared to the most recent works that manage data, tasks, and resources individually.
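The core idea of combining data semantics with access frequency can be illustrated with a minimal sketch. The function and field names below are hypothetical, not the paper's actual algorithm: small files sharing a coarse "semantic key" are packed together into block-sized groups, with the most frequently accessed files placed first, mimicking the placement of semantically correlated hot files close together so that one task can process them locally.

```python
# Hypothetical sketch of semantics- and frequency-aware grouping of
# small files into HDFS-block-sized units (not the paper's algorithm).
from collections import defaultdict

def group_small_files(files, block_size=128 * 1024 * 1024):
    """files: list of (name, size_bytes, semantic_key, access_count)."""
    groups = defaultdict(list)
    for name, size, key, hits in files:
        groups[key].append((name, size, hits))
    blocks = []
    for key, members in groups.items():
        # Hot (frequently accessed) files are packed first within a group.
        members.sort(key=lambda m: m[2], reverse=True)
        current, used = [], 0
        for name, size, hits in members:
            if used + size > block_size and current:
                blocks.append((key, current))
                current, used = [], 0
            current.append(name)
            used += size
        if current:
            blocks.append((key, current))
    return blocks

files = [
    ("a.log", 40 * 2**20, "sales", 90),
    ("b.log", 50 * 2**20, "sales", 10),
    ("c.log", 60 * 2**20, "sales", 50),
    ("d.csv", 30 * 2**20, "hr", 5),
]
print(group_small_files(files))
# → [('sales', ['a.log', 'c.log']), ('sales', ['b.log']), ('hr', ['d.csv'])]
```

Grouping by semantic key before packing means a single map task reads correlated files from one block, which is the intuition behind reducing per-file scheduling overhead for small files.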