After a large-scale radiological accident, early-response biomarkers that assess radiation exposure over a broad dose range are not only the basis of rapid radiation triage but also the key to the rational use of limited medical resources and to improving treatment efficiency. Because of its high throughput, rapid assays, and minimally invasive sample collection, metabolomics has been applied to the search for radiation exposure biomarkers in recent years. Owing to the complexity of radiobiological effects, most potential biomarkers are both dose- and time-dependent, and in practice it is very difficult to find a single biomarker that is both sensitive and specific in a given exposure scenario. A multi-parameter approach to radiation exposure assessment is therefore more realistic in actual nuclear accidents. In this study, untargeted metabolomic profiling based on gas chromatography-mass spectrometry (GC-MS) and targeted amino acid profiling based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) were combined to investigate early urinary metabolite responses within 48 h post-exposure in a rat model. Several key early-response metabolites were identified, revealing the most relevant metabolic pathways. Furthermore, a panel of potential urinary biomarkers was selected through a multi-criteria approach and applied to early triage following irradiation. Our study suggests that a multi-parameter approach to triaging radiation damage is feasible, and that the urinary excretion levels of the relevant metabolites provide insights into radiation damage and repair.
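The multi-criteria panel selection described above can be illustrated with a minimal sketch. The metabolite names, measurements, and thresholds below are hypothetical placeholders, not the study's data; the two criteria (a rank-based AUC for group separation and a fold-change cutoff, applied direction-agnostically so both elevated and depleted metabolites qualify) are one plausible instantiation of such an approach:

```python
# Hypothetical sketch of multi-criteria urinary biomarker panel selection.
# All metabolite names and values are illustrative, not the study's data.

def auc_separation(control, exposed):
    """Rank-based AUC: probability an exposed sample exceeds a control one."""
    wins = sum((e > c) + 0.5 * (e == c) for e in exposed for c in control)
    return wins / (len(exposed) * len(control))

def fold_change(control, exposed):
    """Ratio of mean exposed level to mean control level."""
    return (sum(exposed) / len(exposed)) / (sum(control) / len(control))

def select_panel(candidates, min_auc=0.8, min_fc=1.5):
    """Keep metabolites passing both criteria, in either direction."""
    panel = []
    for name, (ctrl, exp) in candidates.items():
        auc = auc_separation(ctrl, exp)
        fc = fold_change(ctrl, exp)
        if max(auc, 1 - auc) >= min_auc and max(fc, 1 / fc) >= min_fc:
            panel.append(name)
    return panel

candidates = {
    "citrate":  ([1.0, 1.1, 0.9], [2.1, 2.4, 1.9]),  # elevated post-exposure
    "taurine":  ([2.0, 2.2, 1.8], [0.8, 0.7, 0.9]),  # depleted post-exposure
    "creatine": ([1.0, 1.2, 0.8], [1.1, 0.9, 1.2]),  # essentially unchanged
}
print(select_panel(candidates))  # ['citrate', 'taurine']
```

In a real workflow the thresholds would be tuned per time point, since the abstract notes that candidate biomarkers are both dose- and time-dependent.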
Textiles are a vital part of the clothing we use daily. They are flexible, often lightweight, and suited to a variety of applications. Today, with rapid developments in small and flexible sensing materials, textiles can be enhanced and used as input devices for interactive systems. Clothing-based wearable interfaces are well suited to in-vehicle controls: they can combine various modalities to enable users to perform simple, natural, and efficient interactions while minimizing any negative effect on their driving. However, research on clothing-based wearable in-vehicle interfaces remains underexplored, and there is consequently a lack of understanding of how to use textile-based input for in-vehicle controls. As a first step towards filling this gap, we conducted a user-elicitation study to involve users in the process of designing in-vehicle interactions via a fabric-based wearable device. We distilled a taxonomy of wrist and touch gestures for in-vehicle interactions using a fabric-based wrist interface in a simulated driving setup. Our results help drive forward the investigation of the design space of clothing-based wearable interfaces for in-vehicle secondary interactions.
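A gesture taxonomy of this kind is ultimately consumed by software as a mapping from recognized gestures to in-vehicle secondary commands. The sketch below illustrates that idea only; the gesture and command names are assumptions for illustration, not the taxonomy the study actually elicited:

```python
# Illustrative only: mapping (modality, gesture) pairs from a fabric
# wrist interface to in-vehicle secondary controls. Names are assumed,
# not taken from the paper's elicited taxonomy.

GESTURE_MAP = {
    ("wrist", "flick_up"):    "volume_up",
    ("wrist", "flick_down"):  "volume_down",
    ("touch", "swipe_left"):  "previous_track",
    ("touch", "swipe_right"): "next_track",
    ("touch", "double_tap"):  "accept_call",
}

def dispatch(modality, gesture):
    """Resolve a recognized gesture to a command, or None if unmapped."""
    return GESTURE_MAP.get((modality, gesture))

print(dispatch("touch", "double_tap"))  # accept_call
```

Keeping the mapping as data rather than code makes it easy to swap in the user-preferred bindings that an elicitation study produces.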
Advanced developments in handheld devices' interactive 3D graphics capabilities, processing power, and cloud computing have provided great potential for handheld augmented reality (HAR) applications, which allow users to access digital information anytime, anywhere. Nevertheless, existing interaction methods are still confined to the touch display, device camera, and built-in sensors of these handheld devices, which suffer from obtrusive interactions with AR content. Wearable fabric-based interfaces promote the subtle, natural, and eyes-free interactions that are needed when performing interactions in dynamic environments. Prior studies explored the possibilities of using fabric-based wearable interfaces for head-mounted AR display (HMD) devices, but the interface metaphors of HMD AR devices are inadequate for handheld AR, as a typical HAR application requires users to perform interactions with only one hand. In this paper, we investigate the use of a fabric-based wearable device as an alternative interface option for performing interactions with HAR applications. We elicited user-preferred gestures that are socially acceptable and comfortable to use with HAR devices. We also derived an interaction vocabulary of wrist and thumb-to-index touch gestures, and present broader design guidelines for fabric-based wearable interfaces for handheld augmented reality applications. (Appl. Sci. 2019, 9, 3177)

…and portable enough to be carried wherever users go. With this ubiquitous availability, HAR allows us to develop and design innovative applications in navigation, education, gaming, tourism, interactive shopping, production, marketing, and others [3].
Thus, smartphones have been identified as an ideal platform for HAR experiences in various outdoor and indoor environments [4][5][6]. In order to interact with the virtual world using HAR displays, a user needs to position and orient the device with one hand and manipulate the virtual 3D objects with the other. In general, the touchscreen is used as the primary interface for interacting with AR content [7,8]. In addition, the various built-in sensors in handheld devices, such as cameras, GPS, compass, accelerometers, and gyroscopes, make it possible to precisely determine the position and orientation of the device in the real world (e.g., [8][9][10]). Furthermore, the device's camera can naturally capture the user's mid-air hand movements while holding the device [11,12]. As in HMD AR, manipulations such as selecting and moving virtual 3D information are primary interactions on HAR devices [13]. Existing HAR interaction methods, such as touch input, offer promising solutions for manipulating virtual content (e.g., [14]). However, they still have substantial limitations. For instance, touch input is limited by the device's physical boundary, and usability suffers as on-screen content becomes occluded by the finger (i.e., finger occlusion [15,16]). Also, 2D inputs on the touch surface do not directly support manipulating the six degrees of freedom of a virt...
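The mismatch between 2D touch input and six-degree-of-freedom manipulation is typically bridged by mapping each drag onto a subset of the degrees of freedom. As a minimal sketch of that idea (not a method from the paper; the sensitivity constant and function are assumptions), a horizontal/vertical drag can be mapped to yaw/pitch rotation deltas:

```python
# Sketch of mapping a 2D touch drag onto two of the six degrees of
# freedom (yaw/pitch rotation). Sensitivity value is an assumption.

import math

def drag_to_rotation(dx_px, dy_px, deg_per_px=0.5):
    """Map a screen-space drag (in pixels) to yaw/pitch deltas (radians)."""
    yaw = math.radians(dx_px * deg_per_px)
    pitch = math.radians(dy_px * deg_per_px)
    return yaw, pitch

yaw, pitch = drag_to_rotation(90, -30)  # 90 px drag -> 45 degrees of yaw
```

Translation and roll would need a separate mode or multi-finger input, which is precisely the indirection that motivates exploring off-device input such as a fabric wrist interface.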