“…Security is another challenge for the stream data ingestion process. It arises from the rapid growth of the internet and of web-based systems, which face malicious and suspicious files that threaten their security; the ingestion process should therefore provide security, auditing, and provenance. The analytical value of stream data depends on the accuracy and completeness of the data, so achieving good and accurate stream data ingestion is a complicated and challenging task that requires careful planning and expertise (Yadranjiaghdam, B., Yasrobi, S., & Tabrizi, N., 2017) (Pal, G., Li, G., & Atkinson, K., 2018) (Gurcan, F., & Berigel, M., 2018).

3.4.1 Apache Flume
Apache Flume is a distributed, reliable, available, and efficient service for importing, collecting, aggregating, and moving large amounts of streaming data, ingesting it in a form that is easy for processing tools to consume. It is fault tolerant, with tunable reliability and consistency mechanisms, and its data model is particularly suited to online analytic applications. Flume plays an important role in data ingestion for real-time data analytics, supporting data refining and data visualization (Yadranjiaghdam, B., Pool, N., & Tabrizi, N., 2016). The data flow in Flume resembles a pipeline that ingests data from a source and delivers it to a destination. As shown in Figure 5, which illustrates the Flume architecture, data is transferred from source to destination by a Flume agent, a JVM process that hosts the components through which data flows from the source to the next hop; an agent consists of a source, a channel, and a sink.…”
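The source-channel-sink structure of a Flume agent described above can be sketched as a minimal agent configuration. This is an illustrative example, not taken from the cited works: the agent name `a1`, the netcat source listening on a local port, the in-memory channel, and the logger sink are all assumptions chosen for simplicity.

```properties
# Name the components of this agent (agent name "a1" is illustrative)
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: receive lines of text over a TCP port (netcat-style source)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Channel: buffer events in memory between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: write events to the log (useful for testing the pipeline)
a1.sinks.k1.type = logger

# Wire the source and sink to the channel to form the pipeline
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

In a real deployment the memory channel would typically be replaced by a durable channel (e.g. a file-backed one) when fault tolerance matters, since buffered events in memory are lost if the agent's JVM process crashes.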