Stability and performance during rhythmic motor behaviors such as locomotion are critical for survival across taxa: falling down would bode well for neither cheetah nor gazelle. Little is known about how haptic feedback, particularly during discrete events such as the heel-strike event during walking, enhances rhythmic behavior. To determine the effect of haptic cues on rhythmic motor performance, we investigated a virtual paddle juggling behavior, analogous to bouncing a table tennis ball on a paddle. Here, we show that a force impulse to the hand at the moment of ball-paddle collision categorically improves performance over visual feedback alone, not by regulating the rate of convergence to steady state (e.g., via higher gain feedback or modifying the steady-state hand motion), but rather by reducing cycle-to-cycle variability. This suggests that the timing and state cues afforded by haptic feedback decrease the nervous system's uncertainty of the state of the ball to enable more accurate control but that the feedback gain itself is unaltered. This decrease in variability leads to a substantial increase in the mean first passage time, a measure of the long-term metastability of a stochastic dynamical system. Rhythmic tasks such as locomotion and juggling involve intermittent contact with the environment (i.e., hybrid transitions), and the timing of such transitions is generally easy to sense via haptic feedback. This timing information may improve metastability, equating to less frequent falls or other failures depending on the task.
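The mean first passage time (MFPT) mentioned above can be illustrated with a toy Monte Carlo sketch (this is not the paper's juggling model; the map, gain, and thresholds are illustrative assumptions): a stochastic cycle-to-cycle map with a fixed feedback gain, where only the noise level differs between conditions, as in the haptic-versus-visual comparison.

```python
import random

def mean_first_passage_time(noise_std, trials=500, max_cycles=2000):
    """Estimate the MFPT of a toy stochastic cycle-to-cycle map
    x' = 0.5*x + w, with w ~ N(0, noise_std): the mean number of
    cycles before |x| first exceeds a failure threshold."""
    threshold = 1.0
    total = 0
    for _ in range(trials):
        x = 0.0
        cycles = max_cycles  # survived the whole trial
        for cycle in range(1, max_cycles + 1):
            x = 0.5 * x + random.gauss(0.0, noise_std)
            if abs(x) > threshold:
                cycles = cycle
                break
        total += cycles
    return total / trials

random.seed(0)
mfpt_low_noise = mean_first_passage_time(noise_std=0.2)   # e.g., haptics on
mfpt_high_noise = mean_first_passage_time(noise_std=0.4)  # visual only
```

Note that the feedback gain (0.5) is identical in both conditions; reducing cycle-to-cycle variability alone is enough to increase the MFPT substantially, mirroring the paper's central claim.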
Purpose Stereotactic body radiation therapy (SBRT) allows high radiation doses to be delivered to pancreatic tumors with limited toxicity. Nevertheless, respiratory motion of the pancreas introduces major uncertainty during SBRT. Ultrasound imaging is a non-ionizing, non-invasive, real-time technique for intrafraction monitoring, but no configuration is currently available for placing an ultrasound probe during pancreas SBRT. Methods and Materials An arm-bridge system was designed and built. A CT scan of the bridge-held ultrasound probe was acquired and fused to the CTs of ten previously treated pancreatic SBRT patients to create virtual simulation CTs. Both step-and-shoot intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) planning were performed on the virtual simulation CTs. The accuracy of the tracking algorithm was evaluated with a programmed motion phantom executing simulated breath-hold 3D movement. An IRB-approved volunteer study was also performed to evaluate the feasibility of the system setup: three healthy subjects underwent the same patient setup required for pancreas SBRT with active breath control (ABC), and 4D ultrasound images were acquired for monitoring. Ten breath-hold cycles were monitored for both the phantom and the volunteers. For the phantom study, the target motion tracked by ultrasound was compared with the motion tracked by an infrared camera. For the volunteer study, the reproducibility of ABC breath-holds was assessed. Results The volunteer study showed that the arm-bridge system allows placement of the ultrasound probe, and ultrasound monitoring showed less than 2 mm reproducibility of ABC breath-holds in healthy volunteers. The phantom monitoring accuracy was 0.14 ± 0.08 mm, 0.04 ± 0.1 mm, and 0.25 ± 0.09 mm in the three directions. For dosimetry, 100% of the virtual simulation plans passed the protocol criteria. Conclusions Our ultrasound system can potentially be used for real-time monitoring during pancreas SBRT without compromising planning quality.
The phantom study demonstrated high monitoring accuracy of the system, and the volunteer study demonstrated the feasibility of the clinical workflow.
Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.
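The bias-correction idea can be sketched in one dimension (a minimal illustration, not the paper's filter; the gains and dynamics are assumptions): while the optical tracker sees the markers, its pose is trusted and any residual inertial signal is attributed to sensor bias; during occlusion, the system dead-reckons from the bias-corrected IMU.

```python
# Toy 1-D sketch of optical/inertial fusion with bias estimation.
dt = 0.01          # IMU sample period (s)
true_acc = 0.0     # target is stationary in this toy example
imu_bias = 0.3     # constant accelerometer bias (m/s^2)

pos, vel, bias_est = 0.0, 0.0, 0.0
alpha = 0.05       # bias-estimate smoothing gain (assumed value)
for k in range(1000):
    meas_acc = true_acc + imu_bias
    ots_visible = k < 500          # markers become occluded after 5 s
    if ots_visible:
        # Optical tracker provides the pose; since the target is known
        # stationary here, any measured acceleration must be bias.
        bias_est += alpha * (meas_acc - bias_est)
        pos, vel = 0.0, 0.0        # snap to optical position (truth = 0)
    else:
        # Occluded: integrate the bias-corrected IMU measurement.
        acc = meas_acc - bias_est
        vel += acc * dt
        pos += vel * dt

# Without bias correction, 5 s of dead reckoning would drift by
# 0.5 * 0.3 * 5**2 = 3.75 m; with correction the drift is near zero.
```

This captures why the reported position error stays small even under total occlusion: the drift term that normally grows quadratically with time has been removed before integration begins.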
Ultrasound can provide real-time image guidance of radiation therapy, but the probe-induced tissue deformations cause local deviations from the treatment plan. If placed during treatment planning, the probe causes streak artifacts in required computed tomography (CT) images. To overcome these challenges, we propose robot-assisted placement of an ultrasound probe, followed by replacement with a geometrically identical, CT-compatible model probe. In vivo reproducibility was investigated by implanting a canine prostate, liver, and pancreas with three 2.38-mm spherical markers in each organ. The real probe was placed to visualize the markers and subsequently replaced with the model probe. Each probe was automatically removed and returned to the same position or force. Under position control, the median three-dimensional reproducibility of marker positions was 0.6 to 0.7 mm, 0.3 to 0.6 mm, and 1.1 to 1.6 mm in the prostate, liver, and pancreas, respectively. Reproducibility was worse under force control. Probe substitution errors were smallest for the prostate (0.2 to 0.6 mm) and larger for the liver and pancreas (4.1 to 6.3 mm), where force control generally produced larger errors than position control. Results indicate that position control is better than force control for this application, and the robotic approach has potential, particularly for relatively constrained organs, with reproducibility errors smaller than established treatment margins.
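The reproducibility metric described above can be computed as the median 3-D distance of each repeated marker position from the mean position over repeated probe placements. A minimal sketch with synthetic coordinates (mm); the numbers are illustrative, not the study's data:

```python
import numpy as np

# Marker position (x, y, z) in mm after four repeated probe placements.
placements = np.array([[ 0.0,  0.1, -0.2],
                       [ 0.2, -0.1,  0.1],
                       [-0.1,  0.0,  0.3],
                       [ 0.1,  0.2, -0.1]])

centroid = placements.mean(axis=0)                      # mean position
dists = np.linalg.norm(placements - centroid, axis=1)   # 3-D deviations
median_reproducibility = float(np.median(dists))        # summary metric
```

Reporting the median rather than the mean makes the metric robust to an occasional outlier placement, which matters when comparing against fixed treatment margins.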
Image-guided radiation therapy (IGRT) involves two main procedures, performed in different rooms on different days: (1) treatment planning in the simulator room on the first day, and (2) radiotherapy in the linear accelerator room over multiple subsequent days. Both the simulator and the linear accelerator include CT imaging capabilities, which enable treatment planning and reproducible patient setup but do not provide good soft tissue contrast or allow monitoring of the target during treatment. We propose a cooperatively-controlled robot to reproducibly position an ultrasound (US) probe on the patient during simulation and treatment, thereby improving soft tissue visualization and allowing real-time monitoring of the target. A key goal of the robotic system is to produce consistent tissue deformations for both CT and US imaging, which simplifies registration of these two modalities. This paper presents the robotic system design and describes a novel control algorithm that employs virtual springs to implement guidance virtual fixtures during “hands on” cooperative control.
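The virtual-spring guidance fixture can be sketched as an admittance-control law (a minimal illustration; the gains, saturation limit, and function names are assumptions, not the paper's implementation): the commanded probe velocity blends the clinician's applied hand force with a saturated spring force pulling the probe toward the recorded reference position.

```python
import numpy as np

def guidance_vf_velocity(hand_force, probe_pos, ref_pos,
                         admittance=0.01, k_spring=50.0, f_max=10.0):
    """Cooperative ('hands on') admittance control with a guidance
    virtual fixture implemented as a saturated virtual spring toward
    the reference position. All gains are illustrative.
    Units: forces in N, positions in m, velocity in m/s."""
    spring_force = k_spring * (ref_pos - probe_pos)
    # Saturate so the fixture guides rather than overpowers the user.
    norm = np.linalg.norm(spring_force)
    if norm > f_max:
        spring_force *= f_max / norm
    return admittance * (hand_force + spring_force)

# Example: probe 0.1 m from the reference along x, no hand force applied;
# the fixture pulls the probe back toward the reference.
v = guidance_vf_velocity(np.zeros(3), np.array([0.1, 0.0, 0.0]), np.zeros(3))
```

The saturation is the key design choice: near the reference the spring dominates and holds the pose, while far away the clinician can still override it, which is what allows deviation from the recorded position when the US image calls for it.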
We are developing a cooperatively controlled robot system for image-guided radiation therapy (IGRT) in which a clinician and robot share control of a 3-D ultrasound (US) probe. IGRT involves two main steps: 1) planning/simulation and 2) treatment delivery. The goals of the system are to provide guidance for patient setup and real-time target monitoring during fractionated radiotherapy of soft tissue targets, especially in the upper abdomen. To compensate for soft tissue deformations created by the probe, we present a novel workflow where the robot holds the US probe on the patient during acquisition of the planning computerized tomography image, thereby ensuring that planning is performed on the deformed tissue. The robot system introduces constraints (virtual fixtures) to help to produce consistent soft tissue deformation between simulation and treatment days, based on the robot position, contact force, and reference US image recorded during simulation. This paper presents the system integration and the proposed clinical workflow, validated by an in vivo canine study. The results show that the virtual fixtures enable the clinician to deviate from the recorded position to better reproduce the reference US image, which correlates with more consistent soft tissue deformation and the possibility for more accurate patient setup and radiation delivery.
Objective Acoustic radiation force (ARF)-based approaches to measure tissue elasticity require transmission of a focused high-energy acoustic pulse from a stationary ultrasound probe and ultrasound-based tracking of the resulting tissue displacements to obtain stiffness images or shear wave speed estimates. The method has established benefits in biomedical applications such as tumor detection and tissue fibrosis staging. One limitation, however, is the dependence on applied probe pressure, which is difficult to control manually and prohibits standardization of quantitative measurements. To overcome this limitation, we built a robot prototype that controls probe contact forces for shear wave speed quantification. Methods The robot was evaluated with controlled force increments applied to a tissue-mimicking phantom and in vivo abdominal tissue from three human volunteers. Results The root-mean-square error between the desired and measured forces was 0.07 N in the phantom and higher for the fatty layer of in vivo abdominal tissue. The mean shear wave speeds increased from 3.7 to 4.5 m/s in the phantom and 1.0 to 3.0 m/s in the in vivo fat for compressive forces ranging from 2.5 to 30 N. The standard deviation of shear wave speeds obtained with the robotic approach was low in most cases (< 0.2 m/s) and comparable to that obtained with a semiquantitative landmark-based method. Conclusion Results are promising for the introduction of robotic systems to control the applied probe pressure for ARF-based measurements of tissue elasticity. Significance This approach has potential benefits in longitudinal studies of disease progression, comparative studies between patients, and large-scale multidimensional elasticity imaging.
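A common way to obtain the shear wave speed estimates referenced above is time-of-flight fitting: the wave's arrival time is measured at several lateral positions away from the ARF push, and the slope of a position-versus-time fit gives the speed. A minimal sketch with synthetic numbers (the specific values and noiseless fit are illustrative, not the paper's data):

```python
import numpy as np

# Lateral tracking positions (mm) and synthetic shear wave arrival
# times (ms) for a medium with a true shear wave speed of 4.5 m/s.
lateral_mm = np.array([2.0, 4.0, 6.0, 8.0])
arrival_ms = lateral_mm / 4.5

# Linear fit of position vs. arrival time; slope in mm/ms equals m/s.
slope, intercept = np.polyfit(arrival_ms, lateral_mm, 1)
shear_wave_speed = slope  # ≈ 4.5 m/s
```

Because the slope is fit across several positions, this estimator averages out per-position timing noise; with real tracked displacements the residuals of the fit also give a quality check on the estimate.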
Radiation therapy typically begins with the acquisition of a CT scan of the patient for planning, followed by multiple days in which radiation is delivered according to the plan. This requires that the patient be reproducibly positioned (set up) on the radiation therapy device (linear accelerator) such that the radiation beams pass through the target. Modern linear accelerators provide cone-beam computed tomography (CBCT) imaging, but this does not provide sufficient contrast to discriminate many abdominal soft-tissue targets, and therefore patient setup is often done by aligning bony anatomy or implanted fiducials. Ultrasound (US) can be used both to assist with patient setup and to provide real-time monitoring of soft-tissue targets. However, one challenge is that the ultrasound probe contact pressure can deform the target area and cause discrepancies with the treatment plan. Another challenge is that radiation therapists typically do not have ultrasound experience and therefore cannot easily find the target in the US image. We propose cooperative control strategies to address both challenges. First, we use cooperative control with virtual fixtures (VFs) to enable acquisition of a planning CT that includes the soft-tissue deformation. Then, for the patient setup during the treatment sessions, we propose to use real-time US image feedback to dynamically update the VFs; this co-manipulation strategy provides haptic cues that guide the therapist to correctly place the US probe. A phantom study is performed to demonstrate that the co-manipulation strategy enables inexperienced operators to quickly and accurately place the probe on a phantom to reproduce a desired reference image. This is a necessary step for patient setup and, by reproducing the reference image, creates soft-tissue deformations that are consistent with the treatment plan, thereby enabling real-time monitoring during treatment delivery.
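The abstract does not specify how "reproducing the reference image" is scored; one plausible choice (an assumption for illustration, not necessarily the authors' measure) is normalized cross-correlation (NCC) between the live US frame and the reference frame recorded at simulation, which could then drive the dynamic VF update:

```python
import numpy as np

def ncc(live_frame, ref_frame):
    """Normalized cross-correlation between the live US frame and the
    reference frame recorded at simulation. A score of 1.0 means the
    reference image (and hence the reference tissue deformation) has
    been reproduced exactly, up to brightness and contrast changes."""
    a = live_frame - live_frame.mean()
    b = ref_frame - ref_frame.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))
```

NCC is attractive here because it is invariant to global gain and offset, so overall brightness differences between simulation-day and treatment-day imaging do not penalize an otherwise well-placed probe.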