2019
DOI: 10.1016/j.procs.2019.09.316
Streaming Social Media Data Analysis for Events Extraction and Warehousing using Hadoop and Storm: Drug Abuse Case Study

Cited by 12 publications (3 citation statements); references 7 publications.
“…They used NLP, machine learning and unigram Naïve techniques for classification, and extracted tweets are loaded into Hadoop platform. The work in [42] developed a large scale architecture by combining Storm and Hadoop to process social media data and facilitate their integration into the traditional data warehouse. Recent researches started working on semantic driven approaches for TSA.…”
Section: Other Approachesmentioning
confidence: 99%
“…GraphLab, Giraph, and GraphX use the Pregel model [6]. Storm can deal with flowing data [7]. Each system uses threads and basic ways to establish a parallel and distributed computing environment.…”
Section: Introductionmentioning
confidence: 99%
“…In GraphLab, Giraph, and GraphX, the Pregel model is implemented [5]. Storm [6] handles data streaming. Each system uses threads and simple methods to create a parallel and distributed computing environment.…”
Section: Introductionmentioning
confidence: 99%