Transcription factors (TFs) bind DNA target sites within promoters to activate gene expression. TFs target their DNA recognition sequences with high specificity, binding with residence times of up to hours in vitro. In vivo, however, TFs can exchange on the order of seconds. The factors that regulate TF dynamics in vivo and increase dissociation rates by orders of magnitude are not known. We investigated TF binding and dissociation dynamics at recognition sequences within duplex DNA, single nucleosomes, and short nucleosome arrays using single-molecule total internal reflection fluorescence (smTIRF) microscopy. We find that the rate of TF dissociation from its site within either nucleosomes or nucleosome arrays is increased 1000-fold relative to duplex DNA. Our results suggest that TF binding within chromatin could be responsible for the dramatic increase in TF exchange in vivo. Furthermore, these studies demonstrate that nucleosomes regulate DNA–protein interactions not only by preventing DNA–protein binding but also by dramatically increasing the dissociation rate of protein complexes from their DNA-binding sites.
Eukaryotic genomes are repetitively wrapped into nucleosomes, which then regulate the access of transcription and DNA repair complexes to DNA. The mechanisms that regulate extrinsic protein interactions within nucleosomes are unresolved. We demonstrate that modulation of the nucleosome unwrapping rate regulates protein binding within nucleosomes. Histone H3 acetyl-lysine 56 [H3(K56ac)] and DNA sequence within the nucleosome entry–exit region additively influence nucleosomal DNA accessibility by increasing the unwrapping rate without impacting rewrapping. These combined epigenetic and genetic factors influence transcription factor (TF) occupancy within the nucleosome by at least one order of magnitude and enhance nucleosome disassembly by the DNA mismatch repair complex, hMSH2–hMSH6. Our results, combined with the observation that ∼30% of Saccharomyces cerevisiae TF-binding sites reside in the nucleosome entry–exit region, suggest that modulation of nucleosome unwrapping is a mechanism for regulating transcription and DNA repair.
Background: There are many benefits to open datasets. However, privacy concerns have hampered the widespread creation of open health data. There is a dearth of documented methods and case studies for the creation of public-use health data. We describe a new methodology for creating a longitudinal public health dataset in the context of the Heritage Health Prize (HHP). The HHP is a global data mining competition to predict, using claims data, the number of days patients will be hospitalized in a subsequent year. The winner will be the team or individual with the most accurate model past a threshold accuracy and will receive a US $3 million cash prize. The HHP began on April 4, 2011, and ends on April 3, 2013.

Objective: To de-identify the claims data used in the HHP competition and ensure that they meet the requirements of the US Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule.

Methods: We defined a threshold risk, consistent with the HIPAA Privacy Rule Safe Harbor standard, for disclosing the competition dataset. Three plausible re-identification attacks that could be executed on these data were identified. For each attack, the re-identification probability was evaluated; if it was deemed too high, a new de-identification algorithm was applied to reduce the risk to an acceptable level. We performed an empirical evaluation of re-identification risk using simulated attacks and matching experiments to confirm the results of the de-identification and to test sensitivity to our assumptions. The main metric used to evaluate re-identification risk was the probability that a record in the HHP data can be re-identified given an attempted attack.

Results: An evaluation of the de-identified dataset estimated that the probability of re-identifying an individual was 0.0084, below the 0.05 probability threshold specified for the competition. The risk was robust to violations of our initial assumptions.

Conclusions: It was possible to ensure that the probability of re-identification for a large longitudinal dataset was acceptably low when it was released to a global user community in support of an analytics competition. This is an example of, and a methodology for, achieving open data principles for longitudinal health data.
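The per-record risk metric above can be illustrated with a simple equivalence-class (prosecutor-risk) calculation over quasi-identifiers: records sharing the same quasi-identifier values form a class of size f, and a matching attack re-identifies any one of them with probability 1/f. Below is a minimal Python sketch; the column names are hypothetical stand-ins, not the actual HHP quasi-identifiers, and the real evaluation also modeled specific attacks and data errors.

# Minimal sketch of equivalence-class re-identification risk.
# Column names are hypothetical, not the actual HHP quasi-identifiers.
import pandas as pd

def reid_risk(df, quasi_identifiers):
    # Size f of each record's equivalence class (records sharing QI values).
    sizes = df.groupby(quasi_identifiers).size().rename("f").reset_index()
    merged = df.merge(sizes, on=quasi_identifiers)
    per_record = 1.0 / merged["f"]        # attacker matches 1 of f records
    return {"avg_risk": per_record.mean(), "max_risk": per_record.max()}

df = pd.DataFrame({
    "age_group": ["40-49", "40-49", "40-49", "60-69"],
    "sex": ["F", "F", "M", "M"],
    "days_in_hospital": [2, 0, 5, 1],
})
print(reid_risk(df, ["age_group", "sex"]))
# A release rule can then require the chosen risk metric to fall below a
# threshold such as the 0.05 used for the competition.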
Interest remains in reconstruction-algorithm research and development for possible improvement of image quality in current PET imaging and for enabling innovative PET systems to enhance existing, and facilitate new, preclinical and clinical applications. Optimization-based image reconstruction has been demonstrated in recent years to have potential utility for CT imaging applications. In this work, we investigate tailoring optimization-based techniques to image reconstruction for PET systems with standard and non-standard scan configurations. Specifically, given an image-total-variation (TV) constraint, we investigated how the selection of different data divergences and associated parameters impacts the optimization-based reconstruction of PET images. Reconstruction robustness was also explored with respect to different data conditions and activity uptakes of practical relevance. A study was conducted in particular for image reconstruction from data collected with a PET configuration with sparsely populated detectors. Overall, the study demonstrates the robustness of the TV-constrained, optimization-based reconstruction for considerably different data conditions in PET imaging, as well as its potential to enable PET configurations with reduced numbers of detectors. Insights gained in the study may be exploited for developing algorithms for PET-image reconstruction and for enabling PET-configuration designs of practical usefulness in preclinical and clinical applications.
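To make the setup concrete, the sketch below reconstructs a toy 1D "image" with either a least-squares or a Kullback-Leibler data-fidelity term plus a TV term. Note the hedges: the paper formulates TV as a constraint, whereas this sketch uses the simpler penalized form with a smoothed TV gradient, and the system matrix and data are random stand-ins, not a PET model.

# Toy sketch: optimization-based reconstruction with a TV term and a
# selectable data divergence. This is a penalized (not constrained) form
# with a smoothed TV, and A, y are random stand-ins, not a PET model.
import numpy as np

rng = np.random.default_rng(0)
m, n = 128, 64
A = rng.random((m, n)) / n                    # stand-in system matrix
x_true = np.zeros(n); x_true[20:40] = 1.0
y = rng.poisson(50.0 * (A @ x_true)) / 50.0   # noisy stand-in data

def tv_grad(x, eps=1e-6):
    # Gradient of the smoothed TV: sum_i sqrt((x[i+1]-x[i])**2 + eps).
    d = np.diff(x)
    g = d / np.sqrt(d**2 + eps)
    out = np.zeros_like(x)
    out[:-1] -= g
    out[1:] += g
    return out

def reconstruct(y, A, divergence="ls", lam=0.02, step=0.05, iters=3000):
    x = np.full(A.shape[1], y.mean())
    for _ in range(iters):
        Ax = A @ x
        if divergence == "ls":                # least-squares data divergence
            g = A.T @ (Ax - y)
        else:                                 # Kullback-Leibler (Poisson) divergence
            g = A.T @ (1.0 - y / np.maximum(Ax, 1e-9))
        x = np.clip(x - step * (g + lam * tv_grad(x)), 0.0, None)
    return x

for div in ("ls", "kl"):
    x_rec = reconstruct(y, A, divergence=div)
    print(div, "RMSE:", np.sqrt(np.mean((x_rec - x_true) ** 2)))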
The proposed incremental algorithms prove effective and efficient for iterative image reconstruction in low-dose CT applications, particularly with sparse-view projection data.
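For context, "incremental" here means that each update uses only one projection measurement (or a small subset) rather than the full dataset. The following is a generic Kaczmarz/ART-style Python sketch of that idea on a sparse-view toy system; it is illustrative only and not the specific algorithms proposed in the paper.

# Generic incremental (row-action) reconstruction sketch: one projection
# row per update. Illustrative only; not the paper's proposed algorithms.
import numpy as np

def incremental_recon(A, b, sweeps=100, relax=0.5):
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):           # one measurement per update
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
        x = np.clip(x, 0.0, None)             # nonnegativity after each sweep
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 100))                     # sparse-view: fewer rows than unknowns
x_true = np.zeros(100); x_true[30:60] = 1.0
x_rec = incremental_recon(A, A @ x_true)
print("data misfit:", np.linalg.norm(A @ x_rec - A @ x_true))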
Purpose: Simulation-based image quality metrics are adapted and investigated for characterizing the parameter dependences of linear iterative image reconstruction for DBT.

Methods: Three metrics based on a 2D DBT simulation are investigated: (1) a root-mean-square error (RMSE) between the test phantom and reconstructed image, (2) a gradient RMSE where the comparison is made after taking a spatial gradient of both image and phantom, and (3) a region-of-interest (ROI) Hotelling observer (HO) for signal-known-exactly/background-known-exactly (SKE/BKE) and signal-known-exactly/background-known-statistically (SKE/BKS) detection tasks. Two simulation studies are performed using the aforementioned metrics, varying voxel aspect ratio and regularization strength for two types of Tikhonov-regularized least-squares optimization. The RMSE metrics are applied to a 2D test phantom with resolution bar patterns at varying angles, and the ROI-HO metric is applied to two tasks relevant to DBT: lesion detection, modeled by use of a large, low-contrast signal, and microcalcification detection, modeled by use of a small, high-contrast signal. The RMSE metric trends are compared with visual assessment of the reconstructed bar-pattern phantom. The ROI-HO metric trends are compared with 3D reconstructed images from ACR phantom data acquired with a Hologic Selenia Dimensions DBT system.

Results: Sensitivity of the image RMSE to mean pixel value is found to limit its applicability to the assessment of DBT image reconstruction. The image gradient RMSE is insensitive to mean pixel value and appears to track better with subjective visualization of the reconstructed bar-pattern phantom. The ROI-HO metric shows an increasing trend with regularization strength for both forms of Tikhonov-regularized least squares; however, this metric saturates at intermediate regularization strength, indicating a point of diminishing returns for signal detection. Visualization of the reconstructed ACR phantom images appears to show a similar dependence on regularization strength.

Conclusions: From the limited studies presented, it appears that image gradient RMSE trends correspond with visual assessment better than image RMSE for DBT image reconstruction. The ROI-HO metric for both detection tasks also appears to reflect visual trends in the ACR phantom reconstructions as a function of regularization strength. We point out, however, that the true utility of these metrics can only be assessed after amassing more data.
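The two RMSE metrics are simple to state in code, and a toy example makes the key observation concrete: a constant mean-pixel-value offset inflates the image RMSE but leaves the gradient RMSE unchanged. The sketch below is a minimal numpy illustration; the ROI Hotelling observer, which requires covariance estimation over background ensembles, is omitted.

# Minimal sketch of the two RMSE-based metrics on 2D numpy images.
# The ROI Hotelling observer is omitted for brevity.
import numpy as np

def image_rmse(recon, phantom):
    return np.sqrt(np.mean((recon - phantom) ** 2))

def gradient_rmse(recon, phantom):
    # Compare spatial gradients; insensitive to a constant pixel-value offset.
    gr = np.stack(np.gradient(recon))
    gp = np.stack(np.gradient(phantom))
    return np.sqrt(np.mean((gr - gp) ** 2))

phantom = np.zeros((64, 64)); phantom[24:40, 24:40] = 1.0
noise = 0.05 * np.random.default_rng(0).normal(size=phantom.shape)
recon = phantom + 0.1 + noise             # 0.1 mean offset plus noise
print(image_rmse(recon, phantom))         # inflated by the 0.1 offset
print(gradient_rmse(recon, phantom))      # unaffected by the offset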
This paper investigates the randomized version of the Kaczmarz method for solving linear systems in the case where the adjoint of the system matrix is not exact, a situation we refer to as a "mismatched adjoint". We show that the method may still converge in both the over- and underdetermined consistent cases under appropriate conditions, and we calculate the expected asymptotic rate of linear convergence. Moreover, we analyze the inconsistent case and obtain results for the method with mismatched adjoint analogous to those for the standard method. Finally, we derive a method to compute optimized probabilities for the choice of the rows and illustrate our findings with numerical examples.
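As an illustration of the iteration in question, the sketch below runs randomized Kaczmarz on a consistent system, computing residuals with the rows of A but taking update directions from a perturbed matrix V standing in for an inexact adjoint. Rows are sampled with the standard norm-squared probabilities; the optimized probabilities derived in the paper are not reproduced here.

# Randomized Kaczmarz with a mismatched adjoint: residuals use rows of A,
# update directions use rows of a perturbed V. Standard norm-squared
# sampling; not the paper's optimized probabilities.
import numpy as np

rng = np.random.default_rng(1)
m, n = 200, 50
A = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
b = A @ x_true                            # consistent, overdetermined system
V = A + 0.01 * rng.normal(size=(m, n))    # mismatched adjoint rows, V != A

p = (A**2).sum(axis=1); p /= p.sum()      # standard ||a_i||^2 sampling

x = np.zeros(n)
for _ in range(20000):
    i = rng.choice(m, p=p)
    a, v = A[i], V[i]
    x += (b[i] - a @ x) / (v @ a) * v     # step along v_i instead of a_i
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))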