Abstract: We present two graph-based algorithms for multiclass segmentation of high-dimensional data on graphs. The algorithms use a diffuse interface model based on the Ginzburg-Landau functional, related to total variation and graph cuts. A multiclass extension is introduced using the Gibbs simplex, with the functional's double-well potential modified to handle the multiclass case. The first algorithm minimizes the functional using a convex splitting numerical scheme. The second algorithm uses a graph adaptation of the classical Merriman-Bence-Osher (MBO) numerical scheme, which alternates between diffusion and thresholding. We demonstrate the performance of both algorithms experimentally on synthetic data, image labeling, and several benchmark data sets such as MNIST, COIL, and WebKB. We also use fast numerical solvers to compute the eigenvectors and eigenvalues of the graph Laplacian, taking advantage of the matrix's sparsity. Experiments indicate that the results are competitive with or better than the current state of the art in multiclass graph-based segmentation algorithms for high-dimensional data.
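The diffusion-then-threshold alternation of the graph MBO scheme can be pictured with a minimal sketch. This is an illustration only, not the paper's implementation: it uses a dense matrix inverse for the diffusion step (the paper uses fast eigensolvers and sparse matrices), and all names and parameter values are illustrative.

```python
import numpy as np

def graph_mbo(W, labels_init, n_classes, dt=1.0, n_iter=10):
    """Toy graph MBO sketch: alternate graph diffusion with thresholding
    to the nearest Gibbs-simplex vertex (i.e. one-hot projection).

    W           symmetric weight matrix (n x n)
    labels_init rows are class-probability vectors; known points one-hot,
                unknown points uniform (semi-supervised seeding)
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    A = np.linalg.inv(np.eye(n) + dt * L)   # implicit diffusion operator
    U = labels_init.astype(float)
    for _ in range(n_iter):
        U = A @ U                           # diffusion step
        idx = U.argmax(axis=1)              # threshold: nearest vertex
        U = np.eye(n_classes)[idx]          # project back to one-hot
    return U.argmax(axis=1)
```

On a graph of two dense clusters with a weak link between them, seeding two nodes per cluster and leaving the rest uniform recovers the cluster labels after a few iterations.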
A comprehensive three‐dimensional fully coupled thermo‐electro‐mechanical finite element framework is developed for modeling spark plasma sintering (SPS). The finite element model is applied to the simulation of spark plasma processing with four different tooling sizes and various temperature regimes. The comparison of modeling and experimental results shows that the model is reliable for qualitative predictions of the densification behavior and of the grain growth in powder specimens subjected to SPS with a given temperature regime. The conducted modeling indicates the possibility of changing the heating pattern of the specimen (warmer central areas of the specimen's volume and cooler outside areas, or vice versa) depending on the size of the tooling. High heating rates and large specimen sizes elevate the temperature gradients and, in turn, the material structure gradients during SPS processing. The obtained results suggest that the industrial implementation of SPS techniques should be based on the predictive capability of reliable modeling approaches.
Convolutional sparse representations are a form of sparse representation with a dictionary that has a structure that is equivalent to convolution with a set of linear filters. While effective algorithms have recently been developed for the convolutional sparse coding problem, the corresponding dictionary learning problem is substantially more challenging. Furthermore, although a number of different approaches have been proposed, the absence of thorough comparisons between them makes it difficult to determine which of them represents the current state of the art. The present work both addresses this deficiency and proposes some new approaches that outperform existing ones in certain contexts. A thorough set of performance comparisons indicates a very wide range of performance differences among the existing and proposed methods, and clearly identifies those that are the most effective. We do not consider the analysis form [2] of sparse representations in this work, focusing instead on the more common synthesis form, and we also do not consider the very recent online CDL algorithms [18], [19], [20], [21].
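The sparse coding subproblem that sits inside every CDL algorithm can be illustrated with a minimal 1-D ISTA sketch. This is not one of the methods compared in the work; the filters are fixed (no dictionary update), the signal is one-dimensional, and all names and step sizes are illustrative.

```python
import numpy as np

def csc_ista(s, D, lam=0.01, step=0.2, n_iter=300):
    """Toy ISTA sketch for 1-D convolutional sparse coding:

        min_x  0.5 || s - sum_m d_m * x_m ||_2^2 + lam * sum_m ||x_m||_1

    where * is 'same'-mode convolution with the filters in D (a list of
    odd-length 1-D arrays, so the adjoint alignment below is exact).
    """
    X = [np.zeros_like(s) for _ in D]
    for _ in range(n_iter):
        # residual of the current reconstruction
        r = s - sum(np.convolve(x, d, mode="same") for x, d in zip(X, D))
        for m, d in enumerate(D):
            g = np.convolve(r, d[::-1], mode="same")   # adjoint: correlation
            z = X[m] + step * g                        # gradient step
            # soft threshold (proximal step for the l1 term)
            X[m] = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return X
```

Driving this with a signal synthesized from a single filter and a one-spike code recovers a sparse code concentrated at the spike, with a small reconstruction residual.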
Background: The National Cancer Institute drug pair screening effort against 60 well-characterized human tumor cell lines (NCI-60) presents an unprecedented resource for modeling combinational drug activity.
Results: We present a computational model for predicting cell line response to a subset of drug pairs in the NCI-ALMANAC database. Based on residual neural networks for encoding features as well as predicting tumor growth, our model explains 94% of the response variance. While our best result is achieved with a combination of molecular feature types (gene expression, microRNA and proteome), we show that most of the predictive power comes from drug descriptors. To further demonstrate value in detecting anticancer therapy, we rank the drug pairs for each cell line based on model predicted combination effect and recover 80% of the top pairs with enhanced activity.
Conclusions: We present promising results in applying deep learning to predicting combinational drug response. Our feature analysis indicates screening data involving more cell lines are needed for the models to make better use of molecular features.
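The residual networks mentioned above are built from blocks with skip connections, where each block learns a correction added to its input. A minimal NumPy sketch of one such block (hypothetical shapes, no training code, not the paper's architecture):

```python
import numpy as np

def residual_block(x, W1, b1, W2, b2):
    """One residual block: the block computes a nonlinear correction f(x)
    and returns x + f(x), so with zero weights it is the identity map."""
    h = np.maximum(0.0, x @ W1 + b1)   # ReLU hidden layer
    return x + (h @ W2 + b2)           # skip connection adds input back
```

The identity-at-zero property is what makes very deep stacks of such blocks trainable in practice.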
We present two graph-based algorithms for multiclass segmentation of high-dimensional data, motivated by the binary diffuse interface model. One algorithm generalizes Ginzburg-Landau (GL) functional minimization on graphs to the Gibbs simplex. The other algorithm uses a reduction of GL minimization, based on the Merriman-Bence-Osher scheme for motion by mean curvature. These yield accurate and efficient algorithms for semi-supervised learning. Our algorithms outperform existing methods, including supervised learning approaches, on the benchmark data sets that we used. We refer to [1] for a more detailed illustration of the methods, as well as different experimental examples.
Background: Current multi-petaflop supercomputers are powerful systems, but they present challenges when faced with problems requiring large machine learning workflows. Complex algorithms running at system scale, often with different computational patterns that require disparate software packages and complex data flows, make it difficult to assemble and manage large experiments on these machines.
Results: This paper presents a workflow system that makes progress on scaling machine learning ensembles, specifically, in this first release, ensembles of deep neural networks that address problems in cancer research across the atomistic, molecular, and population scales. The initial release of the application framework, which we call CANDLE/Supervisor, addresses the problem of hyper-parameter exploration of deep neural networks.
Conclusions: Initial runs of CANDLE on DOE systems at ORNL, ANL, and NERSC (Titan, Theta, and Cori, respectively) demonstrate both scaling and multi-platform execution.
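At a much smaller scale, the hyper-parameter exploration pattern that such a workflow system coordinates can be pictured with a serial random-search sketch. This is illustrative only, not the CANDLE/Supervisor API: on an HPC system each evaluation would be dispatched as an independent job rather than run in a loop.

```python
import random

def random_search(objective, space, n_trials=50, seed=0):
    """Sample configurations from a discrete search space, evaluate each,
    and keep the best (lowest) objective value."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = {k: rng.choice(v) for k, v in space.items()}  # sample a config
        val = objective(cfg)                                # e.g. val. loss
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val
```

The per-trial evaluations are independent, which is what makes the pattern embarrassingly parallel and a natural fit for a supervisor/worker workflow.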
The present paper shows the application of a three-dimensional coupled electrical, thermal, and mechanical finite element macro-scale modeling framework for Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shaped spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration, with the avoidance of overheating as the final aim. Joule heating driven by electric currents, heat transfer, mechanical conditions, and densification are embedded in the model, which utilizes the finite-element software COMSOL™ and its distinguishing ability to couple multiple physics. The result is an implementation of a finite element method applicable to a broad range of SPS procedures, together with the more specific optimization of SPS tooling design when dealing with excessive heating phenomena.
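The electro-thermal coupling at the heart of such models can be illustrated in drastically simplified form. The sketch below is a toy 1-D explicit finite-difference model, not the paper's 3-D COMSOL framework; all symbols, material values, and boundary conditions are illustrative.

```python
import numpy as np

def joule_heated_rod(n=21, k=1.0, rho_c=1.0, sigma=1.0, E=1.0,
                     dx=0.05, dt=1e-4, n_steps=2000):
    """Toy 1-D electro-thermal coupling: a rod carrying current density
    J = sigma*E generates a volumetric Joule source q = J**2 / sigma
    = sigma*E**2, while both ends are clamped at T = 0 (schematically,
    water-cooled electrodes).  Explicit finite differences in time;
    stability requires dt * k / dx**2 < 0.5 (here 0.04)."""
    T = np.zeros(n)
    q = sigma * E**2                                   # uniform Joule source
    for _ in range(n_steps):
        lap = (T[:-2] - 2 * T[1:-1] + T[2:]) / dx**2   # second difference
        T[1:-1] += dt * (k * lap + q) / rho_c          # interior update
        # boundary nodes T[0], T[-1] stay at 0
    return T
```

Even this toy model reproduces the qualitative point made above: the interior of the conductor runs hotter than its cooled boundary, so temperature (and hence microstructure) gradients are intrinsic to the process.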