This paper presents Robotics-Academy, an open-access platform for practical learning of intelligent robotics in engineering degrees. It comprises a collection of exercises based on recent real-life service robot applications, featuring different robots such as autonomous cars, drones, and vacuum cleaners. It uses the Robot Operating System (ROS) middleware, the de facto standard in robot programming, the 3D Gazebo simulator, and the Python programming language. For each exercise, a software template has been developed that performs all the auxiliary tasks (graphical interface, connection to sensors and actuators, code timing, etc.) and hosts the student's code. Using this template, the student focuses solely on the robot's intelligence (for instance, perception and control algorithms) without wasting time on auxiliary details of little educational value. The templates are coded as ROS nodes or as Jupyter Notebooks ready to use in the web browser. Reference solutions for illustrative purposes and automatic assessment tools for gamification have also been developed. An introductory course on intelligent robotics has been elaborated and its contents are available and ready to use at Robotics-Academy, covering reactive behaviors, path planning, local/global navigation, and self-localization algorithms. Robotics-Academy provides a valuable complement to master classes in blended learning, massive open online courses (MOOCs), and online video courses devoted to theoretical content: it connects that theory with practical robot applications and is suitable for distance education. Robotics-Academy has been successfully used in several subjects of undergraduate and master's engineering degrees, as well as in a pre-university pilot course.
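The template idea described in the abstract can be illustrated with a minimal sketch. All class and function names below are hypothetical, not the platform's actual API: the point is only the separation of concerns, where the template owns sensor access and the control-loop timing, and the student supplies a single callback with the robot's intelligence.

```python
# Hypothetical sketch of an exercise template: the platform supplies the
# infrastructure (sensor access, fixed-rate timing), and the student only
# fills in a callback with the robot's intelligence.
import time

class ExerciseTemplate:
    """Hosts student code and handles auxiliary tasks (timing, I/O stubs)."""

    def __init__(self, student_callback, cycle_hz=10):
        self.student_callback = student_callback
        self.cycle_s = 1.0 / cycle_hz
        self.motor_commands = []          # stands in for real actuators

    def read_sensors(self):
        # In the real platform this would query ROS topics; stubbed here.
        return {"laser_min_m": 1.5}

    def send_velocity(self, v, w):
        self.motor_commands.append((v, w))

    def run(self, cycles):
        for _ in range(cycles):
            start = time.monotonic()
            data = self.read_sensors()
            self.student_callback(self, data)   # the student's code runs here
            # keep a fixed control rate, like a ROS rate loop
            time.sleep(max(0.0, self.cycle_s - (time.monotonic() - start)))

# Student code: a trivial reactive behavior (slow down near obstacles).
def avoid(template, sensors):
    v = 0.2 if sensors["laser_min_m"] < 1.0 else 0.8
    template.send_velocity(v, 0.0)

robot = ExerciseTemplate(avoid, cycle_hz=50)
robot.run(cycles=3)
print(robot.motor_commands)  # → [(0.8, 0.0), (0.8, 0.0), (0.8, 0.0)]
```

In the actual templates the loop and sensor plumbing would be a ROS node or a Jupyter Notebook cell; the student-facing surface is the same in either case: one function reading percepts and issuing actuator commands.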
Francisco Martín, Carlos Agüero, José María Cañas and Eduardo Perdices — Rey Juan Carlos University, Spain

Introduction

The focus of robotics research continues to shift from industrial environments, in which robots perform a repetitive task in a very controlled setting, to mobile service robots operating in a wide variety of environments, often human-inhabited ones. There are robots in museums (Thrun et al.), domestic robots that clean our houses, and robots that present news, play music, or even act as our pets. These new applications raise many problems that must be solved in order to increase robot autonomy, including, but not limited to, navigation, localization, behavior generation, and human-machine interaction. These problems are the focus of autonomous-robot research. In many cases research is motivated by the accomplishment of a difficult task. In Artificial Intelligence research, for example, a milestone was to beat the world chess champion, achieved when Deep Blue defeated Kasparov in 1997. In robotics there are several competitions that pose a problem to be solved by robots. For example, the Grand Challenge requires a robotic vehicle to cross hundreds of kilometers autonomously; this competition also has an urban version, the Urban Challenge.

(Figure: the Aibo robot, used in the Standard Platform League at RoboCup.)

More recently a new platform called Nao has been adopted. Nao is a biped humanoid robot; this is its main difference with respect to Aibo, which is a quadruped. That fact has a big impact on the way the robot moves and on its stability while moving. The two robots also differ in size, Nao standing noticeably taller than Aibo, which causes a big difference in the way they perceive. Both robots use a single camera to perceive. With Aibo, perception was essentially 2D because the camera was very near the floor; Nao perceives in 3D because its camera sits at a higher position, which enables the robot to calculate the position of elements located on the floor with one single camera.

Many problems have to be solved before having a fully featured soccer player. First of all, the robot has to get information from the environment, mainly using the camera: it must detect the ball, the goals, the field lines, and the other robots. With this information the robot has to self-localize and decide its next action (move, kick, search for another object, etc.). The robot must perform all these tasks very fast in order to be reactive enough to be competitive in a soccer match. Within this environment it makes no sense to have a good localization method that takes several seconds to compute the robot's position, or to decide the next movement based on an outdated perception. The full sense-think-act cycle must run in milliseconds to be truly efficient. This is a tough requirement for any behavior architecture that aims to solve the problem. With this work we propose a behavior-based architecture that meets the requirements needed to develop a soccer player. Every behavior i...
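The latency requirement above concerns the sense-think-act loop. The following sketch (illustrative only, not the authors' architecture) shows the structure of one such cycle with an explicit time budget; all names and the budget value are assumptions for the example.

```python
# Minimal sense-think-act cycle with a latency budget (illustrative sketch;
# the budget of 50 ms is an assumption, not a figure from the paper).
import time

def sense(world):
    # e.g., detect the ball in the camera image; stubbed as a lookup
    return {"ball_seen": world.get("ball_visible", False)}

def think(percept):
    # pick the next action from the current percept
    return "kick" if percept["ball_seen"] else "search"

def act(action, log):
    log.append(action)

def run_cycle(world, log, budget_s=0.05):
    start = time.monotonic()
    act(think(sense(world)), log)
    elapsed = time.monotonic() - start
    return elapsed <= budget_s   # did we meet the real-time budget?

log = []
on_time = run_cycle({"ball_visible": True}, log)
print(log, on_time)  # → ['kick'] True
```

A slow localization step would make `run_cycle` miss its budget, which is exactly the failure mode the paragraph warns against: a correct but late decision is useless in a soccer match.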
Cameras are among the most relevant sensors in autonomous robots. Two of their challenges, however, are extracting useful information from captured images and coping with the small field of view of regular cameras. This paper proposes a dynamic visual memory to store the information gathered from a moving camera on board a robot, an attention system to choose where to look with that mobile camera, and a visual localization algorithm that incorporates the visual memory. The visual memory is a collection of relevant task-oriented objects and 3D segments, and its scope is wider than the current camera field of view. The attention module balances the need to reobserve objects in the visual memory against the need to explore new areas. The visual memory is also useful in localization tasks, as it provides more information about the robot's surroundings than the current instantaneous image alone. This visual system is intended as underlying technology for service robot applications in real people's homes. Several experiments have been carried out, both with simulated and real Pioneer and Nao robots, to validate the system and each of its components in office scenarios.
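The two drives the attention module balances, reobservation of stored elements versus exploration of uncovered directions, can be sketched as a simple scoring rule. Everything below (class names, the pan-angle neighborhood of 0.3 rad, the weights) is an assumption for illustration, not the paper's actual formulation.

```python
# Illustrative sketch: score candidate gaze directions by (a) the urgency
# of re-observing stored memory elements, growing with time since last
# seen, and (b) a novelty bonus for unexplored directions.
import math

class VisualMemory:
    def __init__(self):
        self.elements = {}   # id -> {"pan": angle in rad, "age_s": time since seen}

    def insert(self, eid, pan):
        self.elements[eid] = {"pan": pan, "age_s": 0.0}

    def tick(self, dt):
        for e in self.elements.values():
            e["age_s"] += dt

def next_gaze(memory, explored_pans, candidates, w_reobserve=1.0, w_explore=0.5):
    """Return the candidate pan angle with the highest attention score."""
    best, best_score = None, -math.inf
    for pan in candidates:
        # urgency of re-observing nearby stored elements grows with their age
        reobserve = sum(e["age_s"] for e in memory.elements.values()
                        if abs(e["pan"] - pan) < 0.3)
        # novelty bonus for directions never looked at yet
        explore = 0.0 if any(abs(p - pan) < 0.3 for p in explored_pans) else 1.0
        score = w_reobserve * reobserve + w_explore * explore
        if score > best_score:
            best, best_score = pan, score
    return best

mem = VisualMemory()
mem.insert("door", pan=0.8)
mem.tick(5.0)                       # the door was last seen 5 s ago
print(next_gaze(mem, explored_pans=[0.0], candidates=[0.0, 0.8, -0.8]))  # → 0.8
```

With no stale elements nearby, the novelty term dominates and the camera sweeps new areas; as stored estimates age, reobservation wins and the gaze returns to them, which is the behavior the abstract describes.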
Visual Simultaneous Localization and Mapping (SLAM) approaches have achieved a major breakthrough in recent years. This paper presents a new monocular visual odometry algorithm able to localize a robot or a camera in 3D inside an unknown environment in real time, even on slow processors such as those used in unmanned aerial vehicles (UAVs) or cell phones. The so-called semi-direct visual localization (SDVL) approach is focused on localization accuracy and uses semi-direct methods to increase feature-matching efficiency. It uses inverse-depth 3D point parameterization. The tracking thread includes a motion model, direct image alignment, and optimized feature matching. Additionally, an outlier rejection mechanism (ORM) has been implemented to rule out misplaced features, improving accuracy especially in partially dynamic environments. A relocalization module is also included while preserving real-time operation. The mapping thread performs an automatic map initialization with homography, a sampled integration of new points, and a selective map optimization. The proposed algorithm was experimentally tested on international datasets and compared to state-of-the-art algorithms.
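The inverse-depth parameterization mentioned in the abstract stores a landmark as the camera position where it was first seen, a bearing direction, and the inverse of its depth, rho = 1/d, which keeps distant points (rho near zero) numerically well behaved. A minimal sketch of the standard encoding-to-Euclidean conversion, with illustrative function names:

```python
# Sketch of the inverse-depth 3D point parameterization: a point is
# anchored at the camera position where it was first seen and stored as
# azimuth/elevation angles plus inverse depth rho = 1/d.
import math

def inverse_depth_to_xyz(anchor, azimuth, elevation, rho):
    """Recover the Euclidean 3D point from its inverse-depth encoding."""
    # unit bearing vector from the anchor camera
    m = (math.cos(elevation) * math.cos(azimuth),
         math.cos(elevation) * math.sin(azimuth),
         math.sin(elevation))
    d = 1.0 / rho                       # depth along the bearing
    return tuple(a + d * mi for a, mi in zip(anchor, m))

# A point first seen from the origin, straight ahead, 4 m away (rho = 0.25)
p = inverse_depth_to_xyz(anchor=(0.0, 0.0, 0.0), azimuth=0.0,
                         elevation=0.0, rho=0.25)
print(p)  # → (4.0, 0.0, 0.0)
```

The practical benefit is that uncertainty in rho is close to Gaussian even for points initialized at effectively infinite depth, which is why monocular systems like the one described favor it over a direct (x, y, z) representation for newly observed features.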