We propose simple protocols for performing quantum noise spectroscopy based on the method of transfer tensor maps (TTM) [Phys. Rev. Lett. 112, 110401 (2014)]. The TTM approach is a systematic way to deduce the memory kernel of a time-nonlocal quantum master equation via quantum process tomography. With access to the memory kernel it is possible to (1) assess the non-Markovianity of a quantum process, (2) reconstruct the noise spectral density beyond pure dephasing models, and (3) investigate collective decoherence in multiqubit devices. We illustrate the usefulness of TTM spectroscopy on the IBM Quantum Experience platform, and demonstrate that the qubits in the IBM device are subject to mild non-Markovian dissipation with spatial correlations.
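The transfer tensor recursion from the cited work can be sketched numerically. The toy channel below (a single-qubit dephasing map in the Pauli-transfer-matrix representation, with an illustrative retention parameter `p`) is an assumption for demonstration, not part of the abstract; for a time-homogeneous Markovian process the n-step dynamical maps are powers of the one-step map, so all transfer tensors beyond the first vanish, and the magnitude of the higher tensors is one diagnostic of non-Markovian memory:

```python
import numpy as np

# Hypothetical single-qubit dephasing channel in the Pauli-transfer-matrix
# representation; p is an illustrative per-step coherence retention.
p = 0.9
E = np.diag([1.0, p, p, 1.0])  # one-step dynamical map E_1

# For a time-homogeneous Markovian process, the n-step map is E_1^n.
n_steps = 5
maps = [np.linalg.matrix_power(E, k) for k in range(1, n_steps + 1)]

# Transfer tensors (PRL 112, 110401): T_1 = E_1 and
#   T_n = E_n - sum_{m=1}^{n-1} T_{n-m} E_m.
T = [maps[0]]
for n in range(2, n_steps + 1):
    correction = sum(T[n - m - 1] @ maps[m - 1] for m in range(1, n))
    T.append(maps[n - 1] - correction)

# For this Markovian toy channel, T_1 = E_1 and T_n = 0 for n >= 2;
# nonzero higher tensors would signal memory effects.
```

In an experiment the maps `E_n` would instead come from process tomography at successive times, and the decay profile of the `T_n` feeds the memory-kernel and spectral-density reconstruction described in the abstract.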
The graph Laplacian regularization term is commonly used in semi-supervised node classification to provide graph structure information to a model f(X). However, with the recent popularity of graph neural networks (GNNs), directly encoding graph structure A into a model, i.e., f(A, X), has become the more common approach. We show that the graph Laplacian regularization f(X)^⊤∆f(X) brings little-to-no benefit to existing GNNs, and propose a simple but non-trivial variant of graph Laplacian regularization, called Propagation-regularization (P-reg), to boost the performance of existing GNN models. We provide formal analyses to show that P-reg not only infuses extra information (not captured by the traditional graph Laplacian regularization) into GNNs, but also has capacity equivalent to an infinite-depth graph convolutional network. We demonstrate that P-reg can effectively boost the performance of existing GNN models on both node-level and graph-level tasks across many different datasets. The code is available at https://github.com/yang-han/P-reg.
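The two regularizers mentioned above can be contrasted in a minimal NumPy sketch. The toy graph, the random stand-in for the model output f(X), the row-normalized propagation matrix, and the squared-error discrepancy are all illustrative assumptions; the paper's actual P-reg admits other normalizations and discrepancy measures:

```python
import numpy as np

# Toy 4-node undirected path graph 0-1-2-3 (hypothetical example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                      # combinatorial graph Laplacian Delta

rng = np.random.default_rng(0)
Z = rng.normal(size=(4, 3))    # stand-in for model outputs f(X), 3 classes

# Classical graph Laplacian regularization tr(f(X)^T Delta f(X)):
# equals the sum of squared output differences over the graph's edges.
laplacian_reg = np.trace(Z.T @ L @ Z)

# P-reg-style propagation regularization: penalize the discrepancy
# between the output Z and its one-step propagation A_hat @ Z
# (A_hat is the row-normalized adjacency here; a modeling choice).
A_hat = A / A.sum(axis=1, keepdims=True)
p_reg = 0.5 * np.sum((Z - A_hat @ Z) ** 2)
```

The design difference is visible in the formulas: the Laplacian term only compares each node to its individual neighbors, while the propagation term compares each node's output to the aggregate of its neighborhood, which is what lets repeated propagation mimic a deep graph convolutional network.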