2018
DOI: 10.20944/preprints201710.0166.v2
Preprint

Using the Quantization Error from Self-Organizing Map (SOM) Output for Fast Detection of Critical Variations in Image Time Series

Abstract: The quantization error (QE) from SOM applied on time series of spatial contrast images with variable relative amount of white and dark pixel contents, as in monochromatic medical images or satellite images, is proven a reliable indicator of potentially critical changes in images across time and image homogeneity. The QE is shown to increase linearly with the variability in spatial contrast contents of images across time when contrast intensity is kept constant across images.
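The preprint page does not reproduce code, so the following is a minimal sketch of the idea described in the abstract, assuming a from-scratch NumPy SOM, a 4x4 map, and a synthetic monochrome image series; these choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_som(pixels, grid=(4, 4), n_iter=5000, lr0=0.5, sigma0=1.5, seed=0):
    """Train a small SOM on pixel vectors (N x D) with winner-take-all updates
    and a Gaussian neighborhood whose radius and learning rate decay over time."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    weights = pixels[rng.integers(0, len(pixels), n_nodes)].astype(float)
    for t in range(n_iter):
        x = pixels[rng.integers(0, len(pixels))]
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))                  # neighborhood weights
        weights += lr * h[:, None] * (x - weights)
    return weights

def quantization_error(pixels, weights):
    """Mean Euclidean distance between each pixel vector and its best-matching node."""
    d = np.linalg.norm(pixels[:, None, :] - weights[None, :, :], axis=2)
    return d.min(axis=1).mean()

# Reference image: a mostly dark 64 x 64 monochrome frame.
rng = np.random.default_rng(1)
reference = rng.uniform(0.0, 0.1, size=(64, 64, 1))
som_weights = train_som(reference.reshape(-1, 1))

# "Time series": frames in which a growing fraction of pixels turns white.
for frac in (0.00, 0.05, 0.10, 0.20, 0.40):
    frame = reference.reshape(-1, 1).copy()
    idx = rng.choice(frame.shape[0], int(frac * frame.shape[0]), replace=False)
    frame[idx] = 1.0
    print(f"white fraction {frac:.2f} -> QE {quantization_error(frame, som_weights):.4f}")
```

Because the map is trained only on the dark reference frame, every added white pixel maps to a distant prototype, so the mean mapping distance (the QE) scales roughly linearly with the fraction of changed pixels, which is the relationship the abstract describes.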

Cited by 12 publications (22 citation statements)
References 21 publications
“…There is a right balance between structural and functional complexity, and this balance conveys functional system plasticity; adaptive learning allows the system to stabilize but, at the same time, remain functionally dynamic and able to learn new data. Activity-dependent functional systemic growth in minimalistic sized network structures is probably the strongest advantage of self-organization; it reduces structural complexity to a minimum and promotes dimensionality reduction [111,112], which is a fundamental quality in the design of "strong" Artificial Intelligence [5]. Bigger neural networks akin to those currently used for deep learning [113] do not necessarily learn better or perform better [114].…”
Section: Discussion (mentioning; confidence: 99%)
“…The results from this study show that the unsupervised, self-organizing-map-based automatic classification by SOM-QE [8–15] […] To provide a time course model for viral proliferation in terms of hours/days, the authors of the reference study [7] further reduced the complexity of information in their experimentally derived images, which had a considerably poorer resolution than our model image simulations here, by partitioning them into blocks of 20x20 pixels, which is not very precise, and then averaging the pixels contained in each block. This reduction in image complexity was necessary to reduce the total number of pixels while retaining prominent features of the infection spread for further analysis by their own mathematical model.…”
Section: Discussion (mentioning; confidence: 99%)
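The 20x20-pixel block averaging mentioned in the quoted passage is a standard downsampling step; a minimal sketch follows. The block size comes from the quote, while the array names and the reshape-based implementation are assumptions for illustration, not the cited authors' code.

```python
import numpy as np

def block_average(img, block=20):
    """Downsample a 2-D grayscale image by averaging non-overlapping
    block x block tiles (the complexity-reduction step described above).
    The image is cropped so both dimensions are multiples of the block size."""
    h, w = img.shape
    h, w = h - h % block, w - w % block
    tiles = img[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))

# Example: a 600 x 800 synthetic image reduced to a 30 x 40 grid of block means.
img = np.random.default_rng(0).random((600, 800))
print(block_average(img).shape)   # -> (30, 40)
```

Each block of 400 pixels is replaced by its mean, which is the loss of spatial precision the citing authors contrast with the single-pixel resolution of the SOM-QE approach.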
“…Figure 2 shows the image state (ground truth) before any changes simulating further viral particle growth/recession were implemented. 159 further images were drawn directly from this high-resolution computer model image to simulate the finest possible […] [13,13,13]: Computer-generated image model reconstruction based on an original image from experimentally obtained cell imaging data [7]. The model possesses the same clinically relevant image data variations as the original (Figure 1) at the same scale.…”
Section: Image Simulations (mentioning; confidence: 99%)
“…A simple functional architecture of the self-organizing map may be applied to the unsupervised classification of massive amounts of patient data from different disease entities ranging from inflammation to cancer, as shown recently [35]. Other recent work [102–105] has shown that the quantization error (QE) in the output of a basic self-organizing neural network map (SOM), in short the SOM-QE, is a parsimonious and highly reliable measure of the smallest local change in contrast or color data in random-dot, medical, satellite, and other potentially large image data. The SOM is easily implemented, learns the pixel structure of any target image in about two seconds by unsupervised "winner-take-all" learning, and detects local changes in contrast or color in a series of subsequent input images with single-pixel precision, in less than two seconds for a set of 20 images [106,107].…”
Section: SOM for Single-Pixel Change Detection in Large Sets of Images (mentioning; confidence: 99%)
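As a companion to the quoted description of single-pixel change detection, the sketch below uses the third-party MiniSom library (an illustrative choice, not necessarily what the cited studies used) to train a small map on a reference image and flag frames of a simulated series whose QE rises above the reference baseline; the image size, map size, training length, and detection margin are all assumptions.

```python
import numpy as np
from minisom import MiniSom   # third-party SOM library, chosen here for illustration

rng = np.random.default_rng(0)

# Reference image: 30 x 30 grayscale frame; pixels are treated as 1-D feature vectors.
reference = rng.random((30, 30))
pixels = reference.reshape(-1, 1)

# A small 4 x 4 map learns the pixel intensity structure of the reference image.
som = MiniSom(4, 4, input_len=1, sigma=1.0, learning_rate=0.5, random_seed=42)
som.random_weights_init(pixels)
som.train_random(pixels, 2000)
baseline_qe = som.quantization_error(pixels)

# Simulated time series: copies of the reference with (at most) one altered pixel.
for t in range(5):
    frame = reference.copy()
    if t > 0:                                  # frame 0 is left unchanged
        i, j = rng.integers(0, 30, size=2)
        frame[i, j] = 2.0                      # a single out-of-range intensity
    qe = som.quantization_error(frame.reshape(-1, 1))
    flag = "CHANGE" if qe > baseline_qe + 1e-4 else "ok"
    print(f"frame {t}: QE = {qe:.5f}  {flag}")
```

The detection margin used here is arbitrary; in practice it would be calibrated against the QE fluctuation observed on unchanged frames before flagging a change as critical.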