A three-dimensional (3D) matrix multiplication algorithm for massively parallel processing systems is presented. The P processors are configured as a "virtual" processing cube with dimensions p1, p2, and p3 proportional to the matrices' dimensions M, N, and K. Each processor performs a single local matrix multiplication of size M/p1 x N/p2 x K/p3. Before the local computation can be carried out, each subcube must receive a single submatrix of A and B. After the local matrix multiplication has completed, K/p3 submatrices of this product must be sent to their respective destination processors and summed together to form the resulting matrix C. The 3D parallel matrix multiplication approach requires a factor of P^(1/6) less communication than the 2D parallel algorithms. This algorithm has been implemented on IBM POWERparallel(TM) SP2(TM) systems (up to 216 nodes) and has yielded close to the peak performance of the machine. The algorithm has been combined with Winograd's variant of Strassen's algorithm to achieve performance which exceeds the theoretical peak of the system. (We assume the MFLOPS rate of matrix multiplication to be based on an operation count of 2MNK.)
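The decomposition described above can be illustrated with a minimal serial sketch that simulates the p1 x p2 x p3 virtual processor grid (the function name and loop structure are illustrative only, not the paper's implementation; the real algorithm performs the p3 partial products concurrently and sums them via communication):

```python
import numpy as np

def matmul_3d(A, B, p1, p2, p3):
    """Simulate the 3D algorithm on a p1 x p2 x p3 virtual processor grid.

    'Processor' (i, j, k) multiplies an (M/p1) x (K/p3) block of A by a
    (K/p3) x (N/p2) block of B; the p3 partial products along the k axis
    are then summed to form block (i, j) of C.
    """
    M, K = A.shape
    K2, N = B.shape
    assert K == K2 and M % p1 == 0 and N % p2 == 0 and K % p3 == 0
    mb, nb, kb = M // p1, N // p2, K // p3
    C = np.zeros((M, N))
    for i in range(p1):
        for j in range(p2):
            for k in range(p3):
                # Local multiply on processor (i, j, k); the sum over k
                # models the reduction of partial products into C.
                Aik = A[i*mb:(i+1)*mb, k*kb:(k+1)*kb]
                Bkj = B[k*kb:(k+1)*kb, j*nb:(j+1)*nb]
                C[i*mb:(i+1)*mb, j*nb:(j+1)*nb] += Aik @ Bkj
    return C
```

Each local multiply touches only O(MNK / (p1 p2 p3)) work and O-sized blocks of A and B, which is what yields the P^(1/6) communication savings over 2D layouts.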
In this paper, we present ScalParC (Scalable Parallel Classifier), a new parallel formulation of a decision tree based classification process. Like other state-of-the-art decision tree classifiers such as SPRINT, ScalParC is suited for handling large datasets. We show that the existing parallel formulation of SPRINT is unscalable, whereas ScalParC is shown to be scalable in both runtime and memory requirements. We present experimental results of classifying up to 6.4 million records on up to 128 processors of a Cray T3D, in order to demonstrate the scalable behavior of ScalParC. A key component of ScalParC is the parallel hash table. The proposed parallel hashing paradigm can be used to parallelize other algorithms that require many concurrent updates to a large hash table.
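The idea behind such a parallel hash table can be sketched as hash-based partitioning: each key is owned by exactly one processor's shard, so every update is applied locally by its owner rather than contending on one shared structure. The class below is a toy serial model of that routing scheme, not ScalParC's actual data structure:

```python
class PartitionedHashTable:
    """Toy model of a partitioned hash table: keys are assigned to
    shards by hash, so each 'processor' owns one shard and applies
    updates to it locally, avoiding concurrent writes to shared state."""

    def __init__(self, num_partitions):
        self.num_partitions = num_partitions
        self.shards = [dict() for _ in range(num_partitions)]

    def _owner(self, key):
        # Deterministic owner: every processor can compute where a
        # given key's update must be routed.
        return hash(key) % self.num_partitions

    def update(self, key, value):
        self.shards[self._owner(key)][key] = value

    def lookup(self, key):
        return self.shards[self._owner(key)].get(key)
```

In a real distributed setting, updates for remote shards would be batched and exchanged in a communication phase, then applied by each owner without locks.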
Abstract: We propose a single-frame, learning-based super-resolution restoration technique that uses the wavelet domain to define a constraint on the solution. Wavelet coefficients at finer scales of the unknown high-resolution image are learned from a set of high-resolution training images, and the learned image in the wavelet domain is used for further regularization while super-resolving the picture. We use an appropriate smoothness prior with discontinuity preservation, in addition to the wavelet-based constraint, to estimate the super-resolved image. The smoothness term ensures spatial correlation among the pixels, whereas the learning term chooses the best edges from the training set. Because this amounts to extrapolating the high-frequency components, the proposed method does not suffer from oversmoothing effects. The results demonstrate the effectiveness of the proposed approach.
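The core idea of the wavelet-domain constraint can be sketched with a single-level Haar transform: keep the coarse band of the current estimate, but take the finer-scale detail bands from the learned image. This is a simplified illustration under the assumption of a Haar basis (the paper's full method embeds this as a regularization term in an optimization, alongside the smoothness prior):

```python
import numpy as np

def haar2d(x):
    """Single-level 2D Haar transform; returns (LL, LH, HL, HH) subbands."""
    a = x[0::2, 0::2]; b = x[0::2, 1::2]
    c = x[1::2, 0::2]; d = x[1::2, 1::2]
    LL = (a + b + c + d) / 2          # coarse approximation
    LH = (a - b + c - d) / 2          # horizontal detail
    HL = (a + b - c - d) / 2          # vertical detail
    HH = (a - b - c + d) / 2          # diagonal detail
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Inverse of haar2d (exact reconstruction)."""
    h, w = LL.shape
    x = np.empty((2 * h, 2 * w))
    x[0::2, 0::2] = (LL + LH + HL + HH) / 2
    x[0::2, 1::2] = (LL - LH + HL - HH) / 2
    x[1::2, 0::2] = (LL + LH - HL - HH) / 2
    x[1::2, 1::2] = (LL - LH - HL + HH) / 2
    return x

def inject_learned_details(estimate, learned):
    """Keep the coarse (LL) band of the current estimate; substitute the
    finer-scale detail bands learned from high-resolution training data."""
    LL, _, _, _ = haar2d(estimate)
    _, LH, HL, HH = haar2d(learned)
    return ihaar2d(LL, LH, HL, HH)
```

Substituting learned finer-scale coefficients is what extrapolates the missing high-frequency content, which is why the approach avoids the oversmoothing typical of pure smoothness priors.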
Abstract: We propose a technique for super-resolution imaging of a scene from observations at different camera zooms. Given a sequence of images of a static scene at different zoom factors, we obtain a picture of the entire scene at a resolution corresponding to the most zoomed observation. The high-resolution image is modeled through appropriate parameterization, and the parameters are learned from the most zoomed observation. Assuming homogeneity of the high-resolution field, the learned model is used as a prior while super-resolving the scene. We suggest the use of either a Markov random field (MRF) or a simultaneous autoregressive (SAR) model to parameterize the field, depending on the computation one can afford. We substantiate the suitability of the proposed method through a large number of experiments on both simulated and real data.
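Learning such a model from the most zoomed observation can be illustrated for the SAR case with a least-squares fit of neighborhood coefficients (a hypothetical helper for a first-order, four-neighbor SAR model; the paper's estimator and neighborhood structure may differ):

```python
import numpy as np

def fit_sar(img):
    """Least-squares fit of a four-neighbor SAR model
       x[i,j] ~ t1*x[i-1,j] + t2*x[i+1,j] + t3*x[i,j-1] + t4*x[i,j+1],
    returning the coefficient vector (t1, t2, t3, t4)."""
    center = img[1:-1, 1:-1].ravel()
    neigh = np.stack([img[:-2, 1:-1].ravel(),   # up
                      img[2:, 1:-1].ravel(),    # down
                      img[1:-1, :-2].ravel(),   # left
                      img[1:-1, 2:].ravel()],   # right
                     axis=1)
    theta, *_ = np.linalg.lstsq(neigh, center, rcond=None)
    return theta
```

Under the homogeneity assumption, the coefficients fitted on the most zoomed (highest-resolution) region can then serve as the prior model for the rest of the field during super-resolution.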