Recent advances in domain adaptation show that deep self-training presents a powerful means for unsupervised domain adaptation. These methods often involve an iterative process of predicting on the target domain and then taking the confident predictions as pseudo-labels for retraining. However, since pseudo-labels can be noisy, self-training can put overconfident label belief on wrong classes, leading to deviated solutions with propagated errors. To address the problem, we propose a confidence regularized self-training (CRST) framework, formulated as regularized self-training. Our method treats pseudo-labels as continuous latent variables jointly optimized via alternating optimization. We propose two types of confidence regularization: label regularization (LR) and model regularization (MR). CRST-LR generates soft pseudo-labels while CRST-MR encourages smoothness of the network output. Extensive experiments on image classification and semantic segmentation show that CRSTs outperform their non-regularized counterpart with state-of-the-art performance. The code and models of this work are available at https://github.com/yzou2/CRST.

Figure 1: Illustration of proposed confidence regularization. (a) Self-training without confidence regularization generates and retrains with hard pseudo-labels, resulting in sharp network output. (b) Label regularized self-training introduces soft pseudo-labels, therefore enabling outputs to be smooth. (c) Model regularized self-training also retrains with hard pseudo-labels, but incorporates a regularizer to directly promote output smoothness.

More recently, self-training with networks has emerged as a promising alternative for domain adaptation [4,5,25,29,49,54,69].
Self-training iteratively generates a set of one-hot (or hard) pseudo-labels corresponding to large selection scores (i.e., prediction confidence) in the target domain, and then retrains the network on these pseudo-labeled target data. Recently, [69] proposed class-balanced self-training (CBST), which formulates self-training as a unified loss minimization with pseudo-labels that can be solved in an end-to-end manner. Instead of reducing the domain gap by minimizing both the task loss and a domain adversarial loss, the self-training loss implicitly encourages cross-domain feature alignment for each class by learning from both labeled source data and pseudo-labeled target data.

Early work [29] shows that the essence of deep self-training is entropy minimization: pushing the network output to be as sharp as the hard pseudo-labels. However, 100% accuracy cannot always be guaranteed for pseudo-labels. Trusting all selected pseudo-labels as "ground truth" by encoding
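The selection-then-smoothing step described above can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: it picks target samples whose prediction confidence clears a threshold, and then applies one simple form of label regularization (mixing the one-hot pseudo-label with a uniform distribution) so that retraining does not push the network output to be maximally sharp. The threshold and mixing weight `alpha` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def soft_pseudo_labels(probs, threshold=0.9, alpha=0.1):
    """Select confident target predictions and smooth them into soft labels.

    probs: (N, C) softmax outputs on unlabeled target data.
    Returns indices of selected samples and their smoothed label vectors.
    """
    num_classes = probs.shape[1]
    confidence = probs.max(axis=1)                  # per-sample confidence
    selected = np.where(confidence >= threshold)[0]
    hard = probs[selected].argmax(axis=1)           # hard pseudo-labels
    # Label regularization: mix each one-hot label with a uniform
    # distribution so the target is no longer a sharp one-hot vector.
    soft = np.full((len(selected), num_classes), alpha / num_classes)
    soft[np.arange(len(selected)), hard] += 1.0 - alpha
    return selected, soft

# Toy softmax outputs for two target samples over three classes:
# only the first sample is confident enough to be pseudo-labeled.
probs = np.array([[0.95, 0.03, 0.02],
                  [0.40, 0.30, 0.30]])
selected, soft = soft_pseudo_labels(probs)
```

In a full self-training loop, the network would then be retrained on `(selected, soft)` together with the labeled source data, and the whole procedure repeated.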
The Atg4 cysteine proteases are required for processing Atg8 so that the latter can be conjugated to phosphatidylethanolamine on autophagosomal membranes, a key step in autophagosome biogenesis. Notably, whereas there are only one atg4 and one atg8 gene in yeast, mammals have four Atg4 homologues and six Atg8 homologues. The Atg8 homologues seem to play different roles in autophagosome biogenesis, and previous studies indicated that they could be differentially processed by Atg4 homologues. The present study provided the first detailed kinetic analysis of all four Atg4 homologues against four representative Atg8 homologues. The data indicated that Atg4B possessed the broadest spectrum against all substrates, followed by Atg4A, whereas Atg4C and Atg4D had minimal activities, as did the catalytic mutant of Atg4B (C74S). On the other hand, GATE-16 seemed to be the overall best substrate for the Atg4 proteases. The kinetic parameters of Atg4B were also affected by its structure and that of the substrates, indicating a process of induced fit. The determination of the kinetic parameters of the various Atg4-Atg8 pairs provides a basis for understanding the potential selective impact of the reaction on autophagosome biogenesis.
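For readers unfamiliar with how kinetic parameters like those reported here are typically obtained, the sketch below fits the Michaelis-Menten model to initial-rate data via a Lineweaver-Burk linearization. The substrate concentrations and rates are purely synthetic illustrations, not values from this study, and for real noisy data a direct nonlinear fit is generally preferred over the linearized form.

```python
import numpy as np

# Lineweaver-Burk linearization of Michaelis-Menten kinetics:
#   1/v = (Km/Vmax) * (1/s) + 1/Vmax
def fit_michaelis_menten(s, v):
    """Estimate Vmax and Km from substrate concentrations and initial rates."""
    slope, intercept = np.polyfit(1.0 / s, 1.0 / v, 1)
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km

# Synthetic initial rates generated with Vmax = 2.0 and Km = 4.0
# (illustrative values only).
s = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])  # substrate concentration
v = 2.0 * s / (4.0 + s)                               # initial velocities
vmax, km = fit_michaelis_menten(s, v)                 # recovers ~2.0 and ~4.0
```

Comparing such Km and Vmax (or kcat) estimates across the Atg4-Atg8 pairs is what allows substrate-preference statements like those in the abstract.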
This study analyzed temporal precipitation variations in arid Central Asia (ACA) and their regional differences during 1930-2009 using monthly gridded precipitation from the Climatic Research Unit (CRU). Our results showed that annual precipitation in this westerly-circulation-dominated arid region generally increased over the past 80 years, with an apparent increasing trend (0.7 mm/10 a) in winter. Precipitation variations in ACA also differ regionally, and the region can be divided into five distinct subregions (I West Kazakhstan, II East Kazakhstan, III Central Asia Plains, IV Kyrgyzstan, and V Iran Plateau). Annual precipitation is distributed fairly evenly across the seasons in the two northern subregions (I and II, approximately north of 45°N), whereas it falls mainly in winter and spring (accounting for up to 80% of the annual total) in the three southern subregions. Annual precipitation increased in all subregions except southwestern ACA (subregion V) during the past 80 years, with significant increases in subregions I and III. The long-term trends in annual precipitation in all subregions are determined mainly by trends in winter precipitation. Additionally, precipitation in ACA shows significant interannual variations: a 2-3-year cycle is identified in all subregions, while a 5-6-year cycle is also found in the three southern subregions. Beyond the interannual variations, there were 3-4 episodic precipitation changes in all subregions, with the latest episode starting in the mid-to-late 1970s. Precipitation in most of the study region has increased rapidly since the late 1970s. Overall, the responses of ACA precipitation to global warming are complicated. Variations in the westerly circulation are likely the major factor influencing precipitation in the study region.

Keywords: arid Central Asia, annual and seasonal precipitation, changing tendency, regional difference
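A trend expressed in mm/10 a, as reported above, is simply a least-squares slope scaled from years to decades. The sketch below shows the computation on a synthetic 80-year winter series constructed to carry a trend of the reported magnitude; the data are illustrative, not the CRU series used in the study.

```python
import numpy as np

def decadal_trend(years, precip):
    """Least-squares linear trend in precipitation, scaled to mm per 10 a."""
    slope = np.polyfit(years, precip, 1)[0]   # slope in mm per year
    return slope * 10.0                        # express as mm per decade

# Synthetic winter precipitation for 1930-2009 with a built-in
# 0.7 mm/10 a trend (i.e., 0.07 mm per year on a 50 mm baseline).
years = np.arange(1930, 2010)
winter_precip = 50.0 + 0.07 * (years - 1930)
trend = decadal_trend(years, winter_precip)
```

On real station or gridded data the slope would be accompanied by a significance test (e.g., Mann-Kendall), which is presumably how the "significant increase" in subregions I and III was established.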
Alterations in MUC expression occur during colorectal tumorigenesis. The transformation process in mucinous carcinoma (MC) and signet-ring cell carcinoma (SRCC) may differ from that in the traditional adenoma-carcinoma sequence.
Whether and how warming alters functional traits of absorptive plant roots remains to be answered across the globe. Tackling this question is crucial to better understanding terrestrial responses to climate change, as fine-root traits drive many ecosystem processes.

We carried out a detailed synthesis of fine-root trait responses to experimental warming by performing a meta-analysis of 964 paired observations from 177 publications.

Warming increased fine-root biomass, production, respiration and nitrogen concentration, and decreased the root carbon:nitrogen ratio and nonstructural carbohydrates. Warming effects on fine-root biomass decreased with greater warming magnitude, especially in short-term experiments. Furthermore, the positive effect of warming on fine-root biomass was strongest in deeper soil horizons and in colder and drier regions. Total fine-root length, morphology, mortality, life span and turnover were unresponsive to warming.

Our results highlight the significant changes in fine-root traits in response to warming, as well as the importance of warming magnitude and duration in understanding fine-root responses. These changes have strong implications for global soil carbon stocks in a warmer world, given increased root-derived carbon inputs into deeper soil horizons and increases in fine-root respiration.
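Meta-analyses of warming experiments like this one typically summarize each warmed/control pair with a log response ratio (lnRR) and then back-transform the mean effect into a percent change. Whether this particular synthesis used lnRR is an assumption on my part; the sketch below just illustrates the standard computation on made-up paired means, not data from the study.

```python
import numpy as np

def log_response_ratio(treatment_mean, control_mean):
    """lnRR effect size for a warmed/control pair: ln(X_warmed / X_control)."""
    return np.log(treatment_mean / control_mean)

# Illustrative paired means for a single trait (e.g., fine-root biomass
# in g m^-2); these are invented values, not observations from the paper.
warmed = np.array([120.0, 95.0, 140.0])
control = np.array([100.0, 100.0, 100.0])

lnrr = log_response_ratio(warmed, control)          # per-pair effect sizes
mean_effect = float(lnrr.mean())                    # unweighted mean lnRR
percent_change = (np.exp(mean_effect) - 1.0) * 100  # back-transformed % change
```

A published meta-analysis would additionally weight each pair by its sampling variance and fit a random-effects model, which the unweighted mean above deliberately omits for brevity.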