2017
DOI: 10.1016/j.eswa.2017.04.028
Detecting anomalies in time series data via a deep learning algorithm combining wavelets, neural networks and Hilbert transform

Cited by 92 publications (48 citation statements)
References 41 publications
“…In time series data, anomalies can be categorized into outliers, i.e. unusual data points significantly dissimilar to the remaining points in the data set, and anomaly patterns, i.e. groups of data points that differ from the majority of normal data. To deal with these categories, various anomaly detection algorithms have been developed, which are classified into five major groups: probabilistic, distance-based, reconstruction-based, domain-based, and information-theoretic-based [15]. Different types of algorithms are more applicable in different scenarios of data analysis, which motivates the idea of a generic framework capable of combining several analytic tools and approaches.…”
Section: Methods (mentioning)
confidence: 99%
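As a hedged illustration of one of the five groups named in the statement above, the sketch below implements a simple distance-based (k-nearest-neighbour) outlier score over sliding windows of a time series. The window size, k, and the injected anomaly are illustrative assumptions, not details taken from the cited paper.

```python
# Hypothetical sketch: distance-based outlier scoring on sliding windows.
# Window size, k, and the injected anomaly are illustrative assumptions.
import numpy as np

def knn_outlier_scores(series, window=16, k=5):
    """Score each window by its mean distance to its k nearest other windows."""
    n = len(series) - window + 1
    X = np.stack([series[i:i + window] for i in range(n)])
    # Pairwise Euclidean distances between all windows (fine for short series).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # ignore distance to self
    return np.sort(d, axis=1)[:, :k].mean(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 8 * np.pi, 600)
    x = np.sin(t) + 0.1 * rng.standard_normal(t.size)
    x[300:308] += 3.0                         # injected anomalous segment
    scores = knn_outlier_scores(x)
    print("most anomalous window starts:", np.argsort(scores)[-3:])
```

Windows overlapping the injected segment receive the highest scores because no similar windows exist elsewhere in the series; a reconstruction-based detector would instead score windows by how poorly a learned model reproduces them.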
“…These complex systems share at least two characteristics: a lack of a mathematical model that accurately describes the system behaviour, and a large amount of monitoring data that can be both historical and real-time. Therefore, for effective anomaly detection it is necessary to rely on data-based approaches and on smart methods of condition monitoring that often use computational intelligence and machine learning techniques [15]. The latter is especially important when constraints are present that cannot be satisfied by human intervention with regard to decision-making speed in life-threatening situations (e.g.…”
Section: Introduction (mentioning)
confidence: 99%
“…A new training dataset is obtained by combining the classified dataset with the original data, and from that the maximum shortage details are also identified [18]. Based on future queries, a new classifier will be developed that can be widely used in the medical field with data collected from patients, which consumes a large amount of time and occupies a vast area [19]. The artificial-intelligence-based neural network technique provides a step towards making a final decision using the information received from neurons, which are interconnected nerve cells in the surroundings [20]. In an epoch of big data, anomaly detection should be well organized to process large volumes of data in real time without any loss of dynamic packet flow.…”
Section: Introduction (mentioning)
confidence: 99%
“…Wavelets were used to detect spikes in the wavelet details of the responses of a framed structure subjected to strong earthquake excitation, indicating damage occurrence [29]. A combination of wavelets, neural networks, and the Hilbert transform, inspired by the deep learning paradigm, was presented as a new signal-processing algorithm [30]. This was further studied to identify the fault condition of a roller bearing using three types of deep neural network models: deep Boltzmann machines, deep belief networks, and stacked auto-encoders [31].…”
Section: Introduction (mentioning)
confidence: 99%
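To make the wavelet/Hilbert combination referenced in [29] and [30] concrete, the following is a minimal sketch, not the cited paper's algorithm: it inspects the finest wavelet detail band and its Hilbert-transform envelope to localise spikes in a 1-D signal. The wavelet family, decomposition level, and threshold are assumptions chosen for illustration.

```python
# Hedged sketch of the general wavelet + Hilbert-transform idea: spikes show up
# as bursts in the finest wavelet detail band, whose envelope is obtained from
# the analytic signal. Wavelet family, level and threshold are assumptions.
import numpy as np
import pywt                       # PyWavelets
from scipy.signal import hilbert

def spike_indicator(x, wavelet="db4", level=3, z_thresh=4.0):
    # Multilevel DWT: coeffs = [cA_level, cD_level, ..., cD_1]
    coeffs = pywt.wavedec(x, wavelet, level=level)
    d1 = coeffs[-1]                           # finest detail band reacts to spikes
    env = np.abs(hilbert(d1))                 # envelope via the analytic signal
    z = (env - env.mean()) / env.std()
    # Each detail sample covers roughly two signal samples (approximate mapping).
    return np.flatnonzero(z > z_thresh) * 2

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.05 * rng.standard_normal(1024)
    x[700] += 1.5                             # injected spike
    print("spike near sample(s):", spike_indicator(x))
```

The smooth sinusoid contributes little energy to the finest detail band, so the injected spike dominates the envelope and exceeds the z-score threshold near its location; a learning stage, as in the cited combined approach, would replace the fixed threshold.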