“…Later, other researchers began to explore the regularizing properties of random projection, for example, for classification problems and machine learning [20] and, more recently, for solving inverse problems [21]. Since random projection not only improves the accuracy of the solution through regularization but also reduces its computational cost, we were able to develop algorithms that provide an accurate and fast solution of discrete inverse problems [22]–[28].…”
Section: Distributed Representations Based On Random Projections For…
“…One of the approaches to ensuring the stability of solutions of ill-posed problems is the use of an integer regularization parameter: the number of summands in the model (linear in its parameters) that approximates the original data. To obtain a stable solution (the estimate x*), methods such as truncated singular value decomposition [32], truncated QR decomposition, and the random-projection-based method [25], [26], [33] can be used.…”
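As an illustration of the first of these methods, the following is a minimal sketch (not the authors' code) of truncated SVD applied to an ill-conditioned system A x = y; the matrix construction and noise level are illustrative assumptions, and the integer parameter k is the number of singular triplets retained:

```python
import numpy as np

# Illustrative sketch: truncated SVD as a regularizer for an
# ill-conditioned linear system A x = y. The integer regularization
# parameter k is the number of singular triplets kept in the model.

rng = np.random.default_rng(0)

# Build an ill-conditioned matrix with rapidly decaying singular values
# (an assumed test setup, not the problem instance from the paper).
n = 20
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** -np.arange(n)                      # singular values 1, 0.1, 0.01, ...
A = U @ np.diag(s) @ V.T

# Signal lying in the leading singular subspace (discrete Picard condition).
x_true = V[:, :5] @ rng.standard_normal(5)
y = A @ x_true + 1e-8 * rng.standard_normal(n)  # noisy right-hand side

def tsvd_solve(A, y, k):
    """Stable estimate x* using only the k largest singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])

# A suitable k filters the noise-dominated components;
# the full inverse (k = n) amplifies the noise enormously.
for k in (3, 5, n):
    err = np.linalg.norm(tsvd_solve(A, y, k) - x_true)
    print(f"k = {k:2d}, error = {err:.3g}")
```

Choosing k too small discards part of the signal; choosing it too large re-admits noise-dominated components, which is exactly why k acts as a regularization parameter.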
“…The number of columns N of the matrix Q_k is determined by the size of the original matrix A. The number of rows k is not fixed a priori and can vary from 1 to N. The dependence of the error components (e_x, e_y) on the number of rows k of Q_k was studied analytically in [25]. That study is based on the representation of the matrix…”
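A hedged sketch of the projection scheme as we understand it from the text: the system A x = y is compressed by a k-by-N random matrix Q_k and the compressed system is solved by least squares, with k playing the role of the integer regularization parameter. The matrix construction and error measurement below are illustrative choices, not the exact setup of [25]:

```python
import numpy as np

# Sketch: random-projection regularization of A x = y.
# The system is multiplied by a k-by-N random matrix Q_k and the
# compressed system (Q_k A) x = Q_k y is solved by least squares.

rng = np.random.default_rng(1)

N = 20
U, _ = np.linalg.qr(rng.standard_normal((N, N)))
V, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = 10.0 ** -np.arange(N)                   # ill-conditioned spectrum (assumed)
A = U @ np.diag(s) @ V.T

x_true = V[:, :5] @ rng.standard_normal(5)  # recoverable part of the signal
y = A @ x_true + 1e-8 * rng.standard_normal(N)

def rp_solve(A, y, k, rng):
    """Estimate x* from the randomly projected system; k may vary from 1 to N."""
    Q = rng.standard_normal((k, A.shape[0]))    # random Q_k with i.i.d. entries
    return np.linalg.lstsq(Q @ A, Q @ y, rcond=None)[0]

# The recovery error e_x first falls with k, then grows again as
# noise-dominated directions re-enter the solution.
for k in (3, 6, N):
    e_x = np.linalg.norm(rp_solve(A, y, k, rng) - x_true)
    print(f"k = {k:2d}, e_x = {e_x:.3g}")
```

This reproduces qualitatively the dependence of e_x on k discussed above: an intermediate k balances the deterministic (truncation) and stochastic (noise) error components.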
“…In [25], recursive expressions for the stochastic and deterministic error components were obtained. For the stochastic component of the true signal recovery error, the recursive expression has the form:…”
Introduction. Distributed representation (DR) of data is a form of vector representation in which each object is represented by a set of vector components, and each vector component can belong to the representations of many objects. In ordinary vector representations the meaning of each component is defined, which cannot be said of DRs. However, the similarity of DR vectors reflects the similarity of the objects they represent. DR is a neural network approach based on modeling the representation of information in the brain that grew out of ideas about "distributed" or "holographic" representations. DRs have a large information capacity, allow the use of the rich arsenal of methods developed for vector data, scale well to large amounts of data, and have a number of other advantages. Methods for transforming data into DRs have been developed for data of various types, from scalars and vectors to graphs.

The purpose of the article is to provide an overview of part of the work of the Department of Neural Information Processing Technologies (International Center) in the field of neural network distributed representations. The approach develops the ideas of Nikolai Mikhailovich Amosov and his scientific school of modeling the structure and functions of the brain.

Scope. The formation of distributed representations from the original vector representations of objects using random projection is considered. With the help of DRs, the similarity of the original objects represented by numerical vectors can be estimated efficiently.

Gritsenko V.I., Rachkovskij D.A., Revunova E.G. ISSN 2519-2205 (Online), ISSN 0454-9910 (Print). Киб. и выч. техн. 2018. № 4 (194)
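The similarity estimation just mentioned can be sketched as follows (an illustration under assumed parameters, not code from the article): a Gaussian random projection forms a DR of numeric vectors, and the Euclidean distance computed between the DRs approximates the distance between the original vectors.

```python
import numpy as np

# Illustrative sketch: similarity of original vectors is estimated
# from their lower-dimensional distributed representations (DRs)
# obtained by random projection.

rng = np.random.default_rng(0)

n, d = 1000, 100                        # original and DR dimensionality (assumed)
x = rng.standard_normal(n)
y = x + 0.3 * rng.standard_normal(n)    # a vector similar to x

R = rng.standard_normal((d, n)) / np.sqrt(d)   # random projection matrix
dx, dy = R @ x, R @ y                          # distributed representations

d_orig = np.linalg.norm(x - y)
d_dr = np.linalg.norm(dx - dy)
print(f"original distance {d_orig:.2f}, estimated from DR {d_dr:.2f}")
```

The 10-fold dimensionality reduction preserves the distance up to a relative error on the order of 1/sqrt(d), which is what makes DR-based similarity estimation computationally attractive.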
The use of DRs allows developing regularization methods for obtaining stable solutions of discrete ill-posed inverse problems, increasing the computational efficiency and accuracy of their solution, and analyzing the solution accuracy analytically. Thus DRs make the information technologies that apply them more efficient.

Conclusions. DRs of various data types can be used to improve the efficiency and intelligence level of information technologies. DRs have been developed both for weakly structured data, such as vectors, and for complex structured representations of objects, such as sequences, graphs of knowledge-base situations (episodes), etc. Transforming different types of data into the DR vector format allows unifying the basic information technologies for their processing and achieving good scalability as the amount of processed data grows.

In the future, distributed representations will naturally combine information on structure and semantics to create computationally efficient and qualitatively new information technologies in which relational structures from knowledge bases are processed by the similarity of their DRs. The neurobiological relevance of distributed representations opens up the possibility of creating intelligent information technologies based on them that function similarly to...
“…The shortcomings of supervised dimension reduction methods motivated the development of an approach that transforms vector data without adaptation, the so-called random projection [19]–[27]. In this method, input vectors are transformed into output vectors by multiplication by a random matrix whose elements are numbers drawn at random from some distribution and then fixed.…”
We analyze the estimation of the angle, scalar product, and Euclidean distance of real-valued vectors using binary vectors with controlled sparsity. The transformation is carried out by projection using a binary random matrix with elements {0, 1}, followed by a threshold transformation of the output. We also provide a comparative analysis of the error obtained when the similarity measures of the input vectors are estimated by similarity measures of the output binary vectors based on their scalar product.

Keywords: binary random projections, sparse binary representations, estimation of vector similarity.
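A hedged sketch of the scheme described in the abstract: inputs are projected with a binary {0, 1} random matrix and thresholded to give sparse binary DRs, and the overlap (scalar product) of two binary DRs then serves as a proxy for the similarity of the inputs. The matrix density and the threshold rule below are illustrative assumptions, not the parameters analyzed in the paper.

```python
import numpy as np

# Sketch: binary random projection with controlled output sparsity.
# Similar inputs should produce binary DRs with a large overlap;
# unrelated inputs should produce DRs with a small overlap.

rng = np.random.default_rng(0)

n, d = 200, 2000
R = (rng.random((d, n)) < 0.1).astype(float)  # binary {0,1} matrix, density 0.1

def binary_dr(v, sparsity=0.05):
    """Project v and threshold so that a controlled fraction of bits is 1."""
    z = R @ v
    t = np.quantile(z, 1.0 - sparsity)        # threshold giving target sparsity
    return (z > t).astype(int)

x = rng.standard_normal(n)
y_sim = x + 0.2 * rng.standard_normal(n)      # similar to x
y_dis = rng.standard_normal(n)                # unrelated to x

bx, bs, bd = binary_dr(x), binary_dr(y_sim), binary_dr(y_dis)
print("overlap with similar vector:  ", int(bx @ bs))
print("overlap with unrelated vector:", int(bx @ bd))
```

Because the scalar product of two sparse binary vectors is just a count of shared active bits, this similarity estimate is cheap to compute and scales well, which is one of the advantages of sparse binary DRs noted above.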