HCDs–DOX was prepared with low cytotoxicity and sustained, pH-targeted release properties. In vitro release conformed to the Weibull model and Fickian diffusion.
In distributed training of deep neural networks or Federated Learning (FL), one typically runs Stochastic Gradient Descent (SGD) or its variants on each machine and communicates with other machines periodically. However, SGD might converge slowly in training some deep neural networks (e.g., RNN, LSTM) because of the exploding gradient issue. Gradient clipping is usually employed to address this issue in the single-machine setting, but exploring this technique in the FL setting is still in its infancy: it remains mysterious whether the gradient clipping scheme can take advantage of multiple machines to enjoy parallel speedup. The main technical difficulty lies in dealing with a nonconvex loss function, a non-Lipschitz-continuous gradient, and skipped communication rounds simultaneously. In this paper, we explore a relaxed-smoothness assumption on the loss landscape, which LSTM was shown to satisfy in previous works, and design a communication-efficient gradient clipping algorithm. This algorithm can be run on multiple machines, where each machine employs a gradient clipping scheme and communicates with other machines after multiple steps of gradient-based updates. Our algorithm is proved to have O(1/(Nε⁴)) iteration complexity for finding an ε-stationary point, where N is the number of machines. This indicates that our algorithm enjoys linear speedup. We prove this result by introducing novel analysis techniques for estimating truncated random variables, which we believe are of independent interest. Our experiments on several benchmark datasets and various scenarios demonstrate that our algorithm indeed exhibits fast convergence speed in practice and thus validate our theory.
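The scheme described above — each machine taking local clipped-gradient steps and synchronizing only every few iterations — can be sketched as follows. This is a minimal illustration on a toy quadratic objective, not the paper's exact algorithm; the function names, the synchronization-by-averaging step, and the problem setup are all assumptions made for the example.

```python
# Hypothetical sketch of communication-efficient clipped SGD: N workers each
# take local clipped-gradient steps on their own objective and average their
# parameters every `sync_every` steps (i.e., communication rounds are skipped
# in between). Toy objective per worker i: f_i(w) = ||w - c_i||^2.
import numpy as np

def clip(g, threshold):
    """Rescale gradient g to norm `threshold` if its norm exceeds it."""
    norm = np.linalg.norm(g)
    return g if norm <= threshold else g * (threshold / norm)

def local_clipped_sgd(centers, steps=100, sync_every=10,
                      lr=0.1, threshold=1.0, seed=0):
    """Run clipped SGD on each worker; average parameters periodically."""
    rng = np.random.default_rng(seed)
    dim = centers[0].shape[0]
    params = [np.zeros(dim) for _ in centers]        # one model copy per machine
    for t in range(steps):
        for i, c in enumerate(centers):
            # noisy gradient of f_i at the worker's current iterate
            grad = 2 * (params[i] - c) + 0.01 * rng.standard_normal(dim)
            params[i] -= lr * clip(grad, threshold)
        if (t + 1) % sync_every == 0:                # communication round
            avg = np.mean(params, axis=0)
            params = [avg.copy() for _ in params]
    return np.mean(params, axis=0)

# Each worker's local optimum is c_i; the consensus minimizer of the average
# objective is the mean of the centers.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
w = local_clipped_sgd(centers)
```

On this toy problem the returned iterate approaches the mean of the workers' optima, illustrating how periodic averaging recovers a consensus solution despite each machine clipping and updating locally between communication rounds.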
Dilated cardiomyopathy (DCM) is a major cause of heart failure and has a poor prognosis. Accumulating evidence points to an essential role of the inflammatory component in the process of DCM. Inhibitors of sodium-glucose cotransporter 2 (SGLT2) are widely used to treat heart failure patients due to their cardiac benefits. However, their role in DCM remains unclear. We used the doxorubicin (Dox)-induced DCM model for our study. The SGLT2 inhibitor dapagliflozin (Dapa) improved cardiac function in mice treated with doxorubicin and attenuated the activation of the nucleotide-binding oligomerization domain-like receptor family protein 3 (NLRP3) inflammasome pathway and the expression of inflammatory factors. In addition, dapagliflozin suppressed NLRP3 activation by decreasing p38-dependent toll-like receptor 4 (TLR4) expression. In conclusion, dapagliflozin improves cardiac function in DCM by inhibiting the activity of the NLRP3 inflammasome.
Daily iron absorption and loss are small, and iron metabolism in humans is characterized by limited external exchange and efficient reutilization of iron from internal sources. The mononuclear phagocyte system (MPS) plays a key role in recycling iron from the hemoglobin of senescent or damaged erythrocytes, which is important in maintaining iron homeostasis. Many iron-related proteins are expressed in the MPS, including heme oxygenase (HO) for heme degradation, the iron importers transferrin receptor 1 (TfR1) and divalent metal transporter 1 (DMT1), the iron exporter ferroportin 1 (FPN1), and the iron-regulatory hormone hepcidin. Insights into the regulatory mechanisms that control iron metabolism proteins in the MPS will deepen our understanding of the molecular mechanisms of iron homeostasis and iron-related diseases.
Over the last decade, research on automated parameter tuning, often referred to as automatic algorithm configuration (AAC), has made significant progress. Although the usefulness of such tools has been widely recognized in real-world applications, the theoretical foundations of AAC are still very weak. This paper addresses this gap by studying the performance estimation problem in AAC. More specifically, this paper first identifies the universal best performance estimator in a practical setting, and then establishes theoretical bounds on the estimation error, i.e., the difference between the training performance and the true performance of a parameter configuration, considering finite and infinite configuration spaces respectively. These findings were verified in extensive experiments conducted on four algorithm configuration scenarios involving different problem domains. Moreover, insights for enhancing existing AAC methods are also identified.