Phase-separated biomolecular droplets form in cells to regulate various biological processes. This phenomenon can be harnessed to construct self-assembled dynamic molecular systems such as artificial cells and molecular robots. Recently, programmable phase-separated droplets called DNA droplets have been reported as a possible route to such dynamic molecular systems. This study reports a computational DNA droplet that recognizes a specific combination of tumor-biomarker microRNAs (miRNAs) as molecular inputs and outputs the result of a DNA logic computation through physical phase separation of the droplet. A mixed DNA droplet is proposed, consisting of three DNA nanostructures with orthogonal sticky-end sequences cross-bridged by two linker DNAs. Hybridization of the miRNAs with the linkers abolishes their cross-bridging ability, causing the mixed DNA droplet to phase-separate into three DNA droplets and thereby executing a miRNA pattern recognition described by the logical expression ((miRNA-1 ∧ miRNA-2) ∧ (miRNA-3 ∧ ¬miRNA-4)). As a model experiment, it is demonstrated that the computational DNA droplets recognize this specific pattern of chemically synthesized miRNA sequences. In the future, this method could enable applications such as diagnosis and therapy through integration with biomolecular robots and artificial cells.
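The logic implemented by the droplet can be sketched as a minimal truth-table model. This is a hedged illustration only (the linker names and function names are hypothetical, not from the article): each linker cross-bridges two of the three orthogonal DNA nanostructures, input miRNAs disable a linker by hybridizing with it, and the mixed droplet divides (logical TRUE) only when every linker is disabled.

```python
def linker_ab_disabled(mirna1: bool, mirna2: bool) -> bool:
    """AND linker: both miRNA-1 and miRNA-2 must hybridize to disable bridging."""
    return mirna1 and mirna2

def linker_bc_disabled(mirna3: bool, mirna4: bool) -> bool:
    """AND-NOT linker: miRNA-3 must hybridize while miRNA-4 is absent."""
    return mirna3 and not mirna4

def droplet_divides(mirna1: bool, mirna2: bool,
                    mirna3: bool, mirna4: bool) -> bool:
    """Evaluate (miRNA-1 AND miRNA-2) AND (miRNA-3 AND NOT miRNA-4).

    The mixed droplet phase-separates into three droplets only when
    both cross-bridging linkers are disabled by their input miRNAs.
    """
    return linker_ab_disabled(mirna1, mirna2) and linker_bc_disabled(mirna3, mirna4)

# Only the target input pattern (1, 1, 1, 0) triggers droplet division.
print(droplet_divides(True, True, True, False))   # → True
print(droplet_divides(True, True, True, True))    # → False (miRNA-4 present)
print(droplet_divides(False, True, True, False))  # → False (miRNA-1 absent)
```

In this toy model, each Boolean input stands for the presence or absence of one chemically synthesized miRNA; the actual system evaluates the same expression physically, through loss of linker-mediated cross-bridging.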
Computational DNA Droplets
In article number 2202322, Masahiro Takinoue and co-workers develop a computational DNA droplet using programmable phase-separated droplets consisting of DNA nanostructures. They demonstrate that the computational DNA droplet can recognize a specific combination of tumor-marker microRNAs as molecular inputs and output the results of DNA logic operations through DNA droplet division, achieving the fusion of biosensing and molecular computation in DNA droplets.