2008
DOI: 10.1016/j.engstruct.2008.03.013
Structural damage identification by using wavelet entropy

Cited by 126 publications (102 citation statements)
References 18 publications
“…For example, the trend of information entropy of the vibration signals is studied by Elforjani [5] for fault diagnosis of ball bearings. Ren and Sun [6] also proposed a structural damage identification method based on the entropy of the wavelet transformed vibration signals. This paper aims at studying the behavior of the information entropy of the AE signals during the fatigue damage evolution of titanium specimens tested.…”
Section: Introduction
confidence: 99%
“…Other specifications of the generalized information entropy such as wavelet information entropy and multiscale permutation information entropy have also been formulated and applied for applications of mechanical and structural damage diagnosis [34][35][36].…”
Section: Generalized Information Entropy
confidence: 99%
“…Information entropy only depends on the probability mass function of the system symbol sequence and it increases with the increasing uniformity of the probability mass function. The above concept of information entropy has been generalized in a number of different ways by different researchers [10,[33][34][35][36] since the pioneering work of Shannon. In all, generalized information entropy H can be formulated as:…”
Section: Generalized Information Entropy
confidence: 99%
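The quoted passage notes that information entropy depends only on the probability mass function and grows as that pmf becomes more uniform. A minimal illustrative sketch of the Shannon form (the function name and example pmfs are ours, not taken from the cited works):

```python
import math

def shannon_entropy(pmf):
    """Shannon information entropy H = -sum(p_i * log(p_i)) of a pmf."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

# A uniform pmf maximizes entropy; a peaked pmf yields a lower value.
uniform = [0.25, 0.25, 0.25, 0.25]   # H = log(4)
peaked = [0.85, 0.05, 0.05, 0.05]    # H < log(4)
```

This reproduces the stated behavior: as probability mass concentrates on fewer symbols, the entropy decreases toward zero.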
“…Entropy, for instance, has been used to quantify nonlinearities and complexity in electroencephalogram (EEG) signals [40], event-related potentials in neuroelectrical signals [41,42], structural damage identification [43] and characterization of complexity in random processes [44][45][46]. In order to obtain an entropy estimate from measured data, a probability mass function (pmf) is required.…”
Section: Wavelet-Tsallis q-Entropy
confidence: 99%
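The quoted passage points out that any entropy estimate from measured data first requires a pmf. One common route, echoed in the wavelet-entropy literature this page indexes, is to use the relative wavelet energy per scale as that pmf. The sketch below combines a pmf of this kind with a Tsallis q-entropy; the Haar transform, the `levels` parameter, and the q = 2 default are illustrative assumptions, not the exact formulation of any cited paper:

```python
import math

def haar_dwt(x):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return a, d

def wavelet_pmf(signal, levels=3):
    """pmf over scales: relative energy of each detail band plus the final approximation."""
    energies, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(sum(c * c for c in d))
    energies.append(sum(c * c for c in a))
    total = sum(energies)
    return [e / total for e in energies]

def tsallis_entropy(pmf, q=2.0):
    """Tsallis q-entropy H_q = (1 - sum(p_i**q)) / (q - 1); recovers Shannon as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return -sum(p * math.log(p) for p in pmf if p > 0)
    return (1.0 - sum(p ** q for p in pmf)) / (q - 1.0)
```

For a constant signal all energy concentrates in the approximation band and the entropy is zero; a signal with components at several scales spreads energy across bands and raises it, which is the property the damage-identification methods above exploit.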