Illumina-based next-generation sequencing (NGS) has accelerated biomedical discovery through its ability to generate thousands of gigabases of sequencing output per run at a fraction of the time and cost of conventional technologies. The process typically involves four basic steps: library preparation, cluster generation, sequencing, and data analysis. In 2015, a new cluster-generation chemistry called exclusion amplification (ExAmp) was introduced on the newer Illumina machines (HiSeq 3000/4000/X Ten); it was a fundamental shift from the earlier method of random cluster generation by bridge amplification on a non-patterned flow cell. The ExAmp chemistry, in conjunction with patterned flow cells containing nanowells at fixed locations, increases cluster density on the flow cell, thereby reducing the cost per run. It also increases sequence read quality, especially for longer read lengths (up to 150 base pairs). This advance has been widely adopted for genome sequencing because greater sequencing depth can be achieved at lower cost without compromising the quality of longer reads. We show, however, that this promising chemistry is problematic when multiplexing samples. We discovered that up to 5-10% of sequencing reads (or signals) are incorrectly assigned from a given sample to other samples in a multiplexed pool. We provide evidence that this "spreading-of-signals" arises from low levels of free index primers present in the pool. These index primers can prime pooled library fragments at random via complementary 3' ends and are extended by DNA polymerase, creating a new library molecule with a new index before binding to the patterned flow cell to generate a cluster for sequencing. The resulting read from that cluster is then assigned to a different sample, spreading signals among the multiplexed samples.
We show that low levels of free index primers persist after the most common library purification procedure recommended by Illumina, and that the amount of signal spreading among samples is proportional to the level of free index primer present in the library pool. This artifact causes homogenization and misclassification of cells in single-cell RNA-seq experiments. Therefore, all data generated in this way must now be carefully re-examined to ensure that "spreading-of-signals" has not compromised data analysis and conclusions. Re-sequencing samples on an older platform that uses conventional bridge amplification for cluster generation, or improving library cleanup strategies to remove free index primers, can minimize or eliminate this signal-spreading artifact.
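The mechanism described above can be illustrated with a toy simulation. This is only a schematic sketch: the function name, the pool size, and the assumption that each read is re-indexed with probability equal to the free-primer fraction are our own illustrative simplifications, not parameters from the study. It shows how a higher residual free-primer level in the pool produces proportionally more misassigned reads.

```python
import random

def simulate_index_hopping(n_reads, primer_fraction, n_samples=4, seed=0):
    """Toy model of spreading-of-signals: each read has a true sample of
    origin; with probability equal to the free index primer fraction
    (a simplifying assumption), the fragment is re-primed and receives a
    random index, possibly assigning it to a different sample."""
    rng = random.Random(seed)
    misassigned = 0
    for _ in range(n_reads):
        true_sample = rng.randrange(n_samples)
        if rng.random() < primer_fraction:  # free primer re-primes the fragment
            new_sample = rng.randrange(n_samples)  # new index chosen at random
            if new_sample != true_sample:
                misassigned += 1
    return misassigned / n_reads

# More residual free primer -> proportionally more signal spreading.
low = simulate_index_hopping(100_000, primer_fraction=0.01)
high = simulate_index_hopping(100_000, primer_fraction=0.10)
```

In this toy model the misassignment rate is roughly primer_fraction × (n_samples − 1)/n_samples, so a tenfold increase in residual primer yields a tenfold increase in spreading, mirroring the proportionality reported above.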
Turbulent drag reduction induced by lambda-DNA is studied. The double-stranded DNA is found to be a good drag reducer compared with ordinary linear polymers. However, this drag-reducing power disappears when the DNA denatures into two single-stranded molecules. Mechanical degradation of DNA also differs from that of ordinary linear-chain polymers: DNA is always cut in half by the turbulence. Our results suggest that the mechanism for turbulent degradation of DNA is different from that of ordinary flexible long-chain polymers.
Effects of interstitial air on the motions of a large intruder in a shaken granular bed are studied experimentally as a function of ambient air pressure, particle size of the bed, and the density of the intruder. It is found that the intruder always rises from the granular bed in the absence of air. However, the intruder can acquire both positive and negative buoyancy in the presence of air. Negative buoyancy can be observed only when both the density of the intruder and the particle size of the bed are small enough. This negative buoyancy can be explained by the unusual air pressure distribution found in the bed.
Network connectivities (k̄) of cortical neural cultures are studied by synchronized firing and determined from measured correlations between the fluorescence intensities of firing neurons. The bursting frequency (f) during synchronized firing of the networks is found to be an increasing function of k̄. Taking f to be proportional to k̄, a simple random model with a k̄-dependent connection probability p(k̄) has been constructed that successfully explains our experimental findings.
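A minimal sketch of the kind of random model described above: an Erdős–Rényi-style graph whose connection probability is chosen so the mean degree matches a target k̄, plus a linear f ∝ k̄ bursting-frequency relation. The function names and the proportionality constant are our own illustrative choices, not values from the study.

```python
import random

def build_network(n, kbar, seed=0):
    """Random network in which each neuron pair is connected with
    probability p = kbar / (n - 1), so the mean degree is ~kbar."""
    rng = random.Random(seed)
    p = kbar / (n - 1)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def mean_degree(n, edges):
    """Average connectivity of the generated network."""
    return 2 * len(edges) / n

def bursting_frequency(kbar, c=0.1):
    """Illustrative f = c * kbar relation; the constant c is hypothetical."""
    return c * kbar

edges = build_network(300, kbar=10.0, seed=1)
```

Because p depends on k̄, sweeping k̄ in this sketch sweeps both the realized mean degree and the modeled bursting frequency together, which is the qualitative behavior the abstract reports.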
The technique of photon-correlation spectroscopy (PCS) was exploited to study turbulent flow behind a grid. We measure the correlation function g(t) of the light intensity scattered by small particles suspended in the turbulent fluid. The results imply that the probability distribution function for the small relative velocity fluctuations in the turbulent grid flow is Lorentzian-like. The statistical properties of the small velocity fluctuations over varying length scales possess a self-similar feature. This self-similarity was seen only when the Reynolds number becomes larger than a specific value. All the measurements suggest that the flow changes its character at this point. g(t) is an incoherent sum of these ensemble-averaged (or time-averaged) phase factors over all the particle pairs in the scattering volume. The ensemble average of the phase factor cos[q·V(R)t] involves the velocity distribution function P(V(R)). When the distribution function P(V(R)) has the scaling form P(|V(R)|/u(R)) discussed below, the ensemble average of the phase factor cos[q·V(R)t] becomes the Fourier cosine transform of P(V(R)). Therefore the measurement of g(t) yields a weighted integral of the Fourier cosine transform of P(V(R)). (This weighting is required because the detector is sensitive to all particle pairs in the scattering volume, and for small R more pairs will be found in the scattering volume than for larger R.) When the direction of the scattering vector q is fixed, the one-dimensional distribution function P(V_q(R)) can be measured, where V_q(R) is the component of V(R,t) along the scattering vector q (we drop the subscript q hereafter when no confusion arises). The PCS technique yields information about velocity fluctuations without introducing an invasive probe, such as a hot-wire anemometer.
Nor is it necessary to invoke Taylor's "frozen turbulence" assumption to interpret the measurements. In theories of fully developed turbulence, the dynamic process of turbulence is considered as a cascade of turbulent kinetic energy from large scales to small scales. Energy fed into the turbulence goes primarily into the large eddies. The size of these eddies is determined by the boundary of the system and establishes the outer scale L_0 of turbulence. From these large eddies, energy is passed down to successively smaller eddies through the cascade.
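The weighted-transform relation described above can be written schematically as follows, where N(R) denotes the pair-number weighting over separations R (our own notation for the weighting the text describes; this is a sketch of the stated relation, not the paper's exact expression):

```latex
g(t) \;\propto\; \int dR \, N(R) \int dV \, P\!\big(V(R)\big)\,
      \cos\!\big[\,q\,V(R)\,t\,\big],
```

so that g(t) is a weighted integral of the Fourier cosine transform of P(V(R)). Note that for a Lorentzian-like distribution, P(V) ∝ [1 + (V/u)²]⁻¹, the cosine transform is an exponential, e^{−qut}, which is consistent with inferring a Lorentzian-like P(V) from the measured decay of g(t).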