We investigated the effect of acarbose, an alpha-glucosidase and pancreatic alpha-amylase inhibitor, on gastric emptying of solid meals of varying nutrient composition and on plasma responses of gut hormones. Gastric emptying was determined with scintigraphy in healthy subjects, and all studies were performed with and without 100 mg of acarbose, in random order, at least 1 wk apart. Acarbose did not alter the emptying of a carbohydrate-free meal, but it delayed emptying of a mixed meal and of a carbohydrate-free meal given 2 h after sucrose ingestion. In meal groups with carbohydrates, acarbose attenuated responses of plasma insulin and glucose-dependent insulinotropic polypeptide (GIP) while augmenting responses of CCK, glucagon-like peptide-1 (GLP-1), and peptide YY (PYY). With the mixed meal + acarbose, the area under the curve (AUC) of gastric emptying was positively correlated with the integrated plasma response of GLP-1 (r = 0.68, P < 0.02). With the carbohydrate-free meal after sucrose and acarbose ingestion, the AUC of gastric emptying was negatively correlated with the integrated plasma response of GIP, implying that prior alteration of carbohydrate absorption modifies gastric emptying of a meal. The results demonstrate that acarbose delays gastric emptying of solid meals and augments release of CCK, GLP-1, and PYY mainly by retarding or inhibiting carbohydrate absorption. Augmented GLP-1 release by acarbose appears to play a major role in the inhibition of gastric emptying of a mixed meal, whereas CCK and PYY may have contributory roles.
In a realistic mobile push-manipulation scenario, building analytical models that capture the complexity of the interactions between the environment, each of the objects, and the robot becomes infeasible as the variety of objects to be manipulated increases. We present an experience-based push-manipulation approach that enables the robot to acquire experimental models of how pushable real-world objects with complex 3D structures move in response to various pushing actions. These experimentally acquired models can then be used either (1) to track a collision-free guideline path generated for the object, by reiterating the pushing actions that result in the best locally matching object trajectories until the goal is reached, or (2) as building blocks for constructing achievable push plans via a Rapidly-exploring Random Trees (RRT) variant planning algorithm we contribute, and executing them by reiterating the corresponding trajectories. We extensively experiment with these two methods in a 3D simulation environment and demonstrate the superiority of the achievable planning and execution concept through safe and successful push-manipulation of a variety of passively mobile pushable objects. Additionally, our preliminary tests in a real-world scenario, in which the robot is asked to arrange a set of chairs around a table through achievable push-manipulation, also show promising results despite the increased perception and action uncertainty, and verify the validity of our contributed method.
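The core idea of option (1), selecting the stored pushing action whose recorded object trajectory best matches the desired local motion along the guideline path, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the action names, the toy experience database of recorded displacements, and the cosine-similarity matching criterion are all assumptions made for the example.

```python
import numpy as np

# Hypothetical experience database: each discrete push action maps to the
# object displacement (dx, dy) it produced in earlier trials.
experience = {
    "push_forward": np.array([0.30, 0.00]),
    "push_left":    np.array([0.20, 0.15]),
    "push_right":   np.array([0.20, -0.15]),
}

def best_matching_push(desired_step):
    """Return the action whose recorded displacement best matches the
    desired local step along the guideline path (cosine similarity)."""
    desired = np.asarray(desired_step, dtype=float)

    def score(disp):
        # Cosine of the angle between the recorded and desired motion.
        return np.dot(disp, desired) / (
            np.linalg.norm(disp) * np.linalg.norm(desired)
        )

    return max(experience, key=lambda a: score(experience[a]))
```

Reiterating `best_matching_push` on each local segment of the guideline path, and replanning from the object's actual pose after each push, gives the tracking behavior described above; option (2) instead composes these recorded trajectories as edges inside an RRT-style search.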
Brain-computer interfaces (BCIs) are systems that allow human subjects to interact with the environment by interpreting brain signals into machine commands. This work provides a design for a BCI to control a humanoid robot using signals obtained from the Emotiv EPOC, a portable electroencephalogram (EEG) device with 14 electrodes and a sampling rate of 128 Hz. The main objective is to process the neuroelectric responses to an externally driven stimulus and generate control signals for the humanoid robot Nao accordingly. We analyze the steady-state visually evoked potential (SSVEP) induced by one of four groups of light-emitting diodes (LEDs), using two distinct signals obtained from the two channels of the EEG device that reside over the occipital lobe. An embedded system is designed to generate pulse-width-modulated square-wave signals that flicker each group of LEDs at a different frequency. The subject chooses a direction by looking at one of these groups of LEDs, which represent four directions. A fast Fourier transform and a Gaussian model are used to detect the dominant frequency component by utilizing harmonics and neighboring frequencies. A control signal is then sent to the robot to draw a fixed-size line in the selected direction. Experimental results show satisfactory performance: the correct target is detected 75% of the time on average across all test subjects, without any training.
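The frequency-detection step, picking the flicker frequency whose fundamental and harmonic carry the most spectral power in the occipital EEG channels, can be sketched as below. This is a simplified illustration under stated assumptions: the candidate frequency set, the ±0.25 Hz neighborhood width, and the use of only the second harmonic are hypothetical choices, and the paper's Gaussian model is replaced here by a plain band-power sum.

```python
import numpy as np

def detect_ssvep_frequency(signal, fs=128.0,
                           candidates=(6.0, 7.5, 8.57, 10.0)):
    """Return the candidate flicker frequency whose fundamental plus
    second harmonic carry the most spectral power in `signal`."""
    n = len(signal)
    # Windowed power spectrum of the single-channel EEG segment.
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    def band_power(f, half_width=0.25):
        # Sum power in a small neighborhood around f to tolerate
        # spectral leakage and slight frequency mismatch.
        mask = (freqs >= f - half_width) & (freqs <= f + half_width)
        return spectrum[mask].sum()

    scores = [band_power(f) + band_power(2.0 * f) for f in candidates]
    return candidates[int(np.argmax(scores))]
```

In the BCI loop, the detected frequency would be mapped to one of the four directions and forwarded to the robot as a movement command.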