This paper presents the software framework established to facilitate cloud-hosted robot simulation. The framework addresses the challenges of conducting a task-oriented, real-time robot competition designed to mimic reality: the Defense Advanced Research Projects Agency (DARPA) Virtual Robotics Challenge (VRC). The core of the framework is the Gazebo simulator, a platform for simulating robots, objects, and environments, together with the enhancements made for the VRC to maintain a high-fidelity simulation of a high-degree-of-freedom, multisensor robot. The other major component is the CloudSim tool, designed to automate robotics simulation using existing cloud technologies. The results from the VRC and a discussion are also detailed in this work. Note to Practitioners: Advances in robot simulation, cloud-hosted infrastructure, and web technology have made it possible to accurately and efficiently simulate complex robots and environments on remote servers while providing realistic data streams for human-in-the-loop robot control. This paper presents the software and hardware frameworks established to facilitate cloud-hosted robot simulation, and addresses the challenges associated with conducting a task-oriented robot competition designed to mimic reality. The competition that spurred this innovation was the VRC, a precursor to the DARPA Robotics Challenge, in which teams from around the world used custom human-robot interfaces and control code to solve disaster-response tasks in simulation. Winners of the VRC received both funding and access to Atlas, a humanoid robot developed by Boston Dynamics. The Gazebo simulator, an open-source, high-fidelity robot simulator, was improved to meet the needs of the VRC competition. Additionally, CloudSim was created to act as an interface between users and the cloud-hosted simulations.
Humanoids have increasingly become a focus of robotics research in recent years, especially in service and personal-assistance robotics. This paper presents an application of humanoid robots as a cognitive stimulation tool in the therapy of dementia patients. The behaviour of the robot during therapy sessions is visually programmed in a session script that lets the robot play music, perform physical movements (dancing, exercises, etc.), synthesise speech, and interact with the human monitor. The application includes the control software on board the robot as well as tools such as the visual script generator and a monitor to supervise the robot's behaviour during the sessions. The robot application's impact on patients' health has been studied: experiments with real patients were performed in collaboration with a centre for research in neurodegenerative diseases. Initial results show a slight to mild improvement in neuropsychiatric symptoms compared with traditional therapy methods.
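The session-script idea above can be illustrated with a minimal interpreter sketch. The step format, action names, and `run_session` helper here are hypothetical assumptions for illustration, not the paper's actual visual scripting tool.

```python
# Minimal sketch of a therapy session-script interpreter.
# The script format and action names are hypothetical; the paper's
# visual script generator and on-robot control software are not public.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    action: str    # e.g. "music", "move", "say"
    argument: str  # song name, movement id, or text to speak

def run_session(script: list[Step], handlers: dict[str, Callable[[str], None]]) -> int:
    """Execute each step with its registered handler; return the number of steps run."""
    executed = 0
    for step in script:
        handler = handlers.get(step.action)
        if handler is None:
            continue  # skip unknown actions rather than aborting the session
        handler(step.argument)
        executed += 1
    return executed

# Record actions instead of driving hardware, so the sketch is self-contained.
log: list[str] = []
handlers = {
    "music": lambda arg: log.append(f"playing {arg}"),
    "move":  lambda arg: log.append(f"performing {arg}"),
    "say":   lambda arg: log.append(f"saying: {arg}"),
}
script = [Step("music", "waltz"), Step("move", "arm exercise"), Step("say", "Well done!")]
run_session(script, handlers)
```

Separating the script (data) from the handlers (robot capabilities) is what makes a visual script generator practical: the generator only needs to emit steps, while the on-robot software supplies the handlers.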
No abstract
Francisco Martín, Carlos Agüero, José María Cañas and Eduardo Perdices, Rey Juan Carlos University, Spain

Introduction

The focus of robotic research continues to shift from industrial environments, in which robots must perform a repetitive task in a very controlled setting, to mobile service robots operating in a wide variety of environments, often human-inhabited ones. There are robots in museums (Thrun et al.), domestic robots that clean our houses, and robots that present news, play music, or even act as our pets. These new applications raise many problems that must be solved in order to increase robot autonomy, including, but not limited to, navigation, localisation, behavior generation, and human-machine interaction. These problems are the focus of autonomous robot research. In many cases, research is motivated by the accomplishment of a difficult task. In Artificial Intelligence research, for example, a milestone was to beat the chess world champion; this milestone was achieved when Deep Blue defeated Kasparov in 1997. In robotics there are several competitions that pose a problem to be solved by robots. For example, the Grand Challenge requires a robotic vehicle to cross hundreds of kilometers autonomously; this competition also has an urban version, named the Urban Challenge.

Fig.: Standard Platform League at RoboCup (Aibo robot).

A new platform, called Nao, has since been introduced (see figure). Nao is a biped humanoid robot; this is the main difference with respect to Aibo, which is a quadruped. This fact has had a big impact on the way the robot moves and on its stability while moving. The sizes of the two robots also differ: Aibo is much shorter than Nao, which is about 58 cm tall. This causes a big difference in perception. In addition, both robots use a single camera to perceive. In Aibo, perception was 2D because the camera was very near the floor; Nao perceives in 3D because its camera is at a higher position, which enables the robot to calculate the position of elements located on the floor with one single camera.

Many problems have to be solved before having a fully featured soccer player. First of all, the robot has to get information from the environment, mainly using the camera: it must detect the ball, the goals, the lines, and the other robots. Having this information, the robot has to self-localise and decide its next action: move, kick, search for another object, etc. The robot must perform all these tasks very fast in order to be reactive enough to be competitive in a soccer match. It makes no sense in this environment to have a good localisation method if it takes several seconds to compute the robot's position, or to decide the next movement based on old perceptions. The whole sense-think-act cycle must complete in milliseconds to be truly efficient. This is a tough requirement for any behavior architecture that is to be applied to this problem. With this work we propose a behavior-based architecture that meets the requirements needed to develop a soccer player. Every behavior i...
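A sense-think-act control loop with a per-cycle time budget, as required for a competitive soccer player, can be sketched as below. The `sense`/`think`/`act` stubs and the 100 ms budget are illustrative assumptions, not the paper's actual architecture or figures.

```python
# Sketch of a time-budgeted sense-think-act cycle. The stubs stand in
# for camera processing, decision making, and motion commands; the
# 100 ms budget is an illustrative choice, not the paper's requirement.
import time

CYCLE_BUDGET_S = 0.1  # assumed per-iteration budget

def sense() -> dict:
    return {"ball_visible": True}  # stand-in for image processing

def think(percept: dict) -> str:
    return "kick" if percept["ball_visible"] else "search"

def act(action: str) -> None:
    pass  # stand-in for sending motion commands

def run_cycles(n: int) -> list[float]:
    """Run n sense-think-act iterations at a fixed rate; return each cycle's work time."""
    durations = []
    for _ in range(n):
        start = time.perf_counter()
        act(think(sense()))
        elapsed = time.perf_counter() - start
        durations.append(elapsed)
        # Sleep off the remaining budget so cycles run at a fixed rate;
        # an overrun here is exactly the failure mode the text warns about.
        if elapsed < CYCLE_BUDGET_S:
            time.sleep(CYCLE_BUDGET_S - elapsed)
    return durations

durations = run_cycles(3)
```

In a real player each stage would do substantial work, so the design question is how to keep their combined cost under the budget on every iteration, not just on average.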
Cameras are one of the most relevant sensors in autonomous robots. One challenge with them is managing the small field of view of regular cameras. A way of coping with this, similar to attention systems in humans, is to use mobile cameras to cover the robot's entire surroundings and to perceive all the objects relevant to the robot's tasks, even if they do not lie in the same snapshot. A gaze control algorithm is then required that continuously selects where the camera should look. This paper presents three different attention mechanisms that have been designed and compared: one based on round-robin sharing, one based on dynamic salience, and one with fixed-pattern camera movements. Several experiments have been performed with a humanoid robot to validate them and to give an objective comparison in the context of RoboCup, where robots have several perceptive needs, such as localization and object tracking, that must all be satisfied and may not be fully compatible.
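The round-robin and salience-based gaze policies can be sketched as simple selection rules over the robot's objects of interest. The target names, growth rates, and salience-reset rule below are illustrative assumptions, not the paper's implementation.

```python
# Toy comparison of two gaze-selection policies over perceptive targets.
# Target names and salience dynamics are assumptions for illustration.
from itertools import cycle

TARGETS = ["ball", "own_goal", "opponent_goal", "field_lines"]

# Round-robin: attend each target in turn, regardless of urgency.
round_robin = cycle(TARGETS)

def next_round_robin() -> str:
    return next(round_robin)

# Dynamic salience: attend the target whose accumulated salience is
# highest; looking at a target resets its salience, the others keep growing.
salience = {t: 0.0 for t in TARGETS}

def next_by_salience(growth: dict[str, float]) -> str:
    for t, g in growth.items():
        salience[t] += g       # urgency accumulates while a target is unattended
    chosen = max(salience, key=salience.get)
    salience[chosen] = 0.0     # attending a target satisfies its perceptive need
    return chosen

# The ball's salience grows fastest, so it is revisited most often,
# while slower-growing targets still get attended eventually.
growth = {"ball": 3.0, "own_goal": 1.0, "opponent_goal": 1.0, "field_lines": 0.5}
sequence = [next_by_salience(growth) for _ in range(4)]
```

The contrast is the point of the comparison: round-robin guarantees fairness across perceptive needs, while the salience policy trades fairness for responsiveness to whichever need is currently most urgent.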