2020 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra40945.2020.9197373
Where and When: Event-Based Spatiotemporal Trajectory Prediction from the iCub’s Point-Of-View

Cited by 3 publications (3 citation statements). References 15 publications.
“…That seed could grow into a mature field thanks to the availability of hardware that could be more easily tuned by nonexperts with standard software tools [428] and of quality DVS [429] and neuromorphic computing chips and systems [412,430] featuring many instances of neurons and (learning) synapses that could be used as computational primitives for perception and decision making. Since then, neuromorphic robotics followed three main paths, with the development of visual perception for robots using event-driven (dynamic) vision sensors [375,431], proof-of-concept systems linking sensing to control [432] and SNN for the control of motors [433,434]. At the same time, the neurorobotics community started developing models of perception, cognition and behaviour based on SNN, with recent attempts to implement those on neuromorphic platforms [435][436][437].…”
Section: Status
confidence: 99%
“…After an offline evaluation of the accuracy, online experiments show the quadrotor avoiding the ball under different scenarios. In [15] an event camera with an Encoder-Decoder [16] LSTM network predicts the giver's motion during a handover task from the iCub robot perspective. The pipeline predicts both spatial and temporal future points, allowing the robot to know in advance where to move, compensating for internal delays in the perception-action loop.…”
Section: Introduction
confidence: 99%
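The pipeline described above — an encoder that consumes the observed trajectory and a decoder that unrolls future points in both space and time — can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the cell, the hidden size, the random (untrained) weights, and the function name `predict_future` are all assumptions made for the example, and the input is taken to be a sequence of (x, y, t) points.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMCell:
    """Minimal LSTM cell (random, untrained weights) for illustration only."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hid_dim)
        self.W = rng.uniform(-scale, scale, (4 * hid_dim, in_dim + hid_dim))
        self.b = np.zeros(4 * hid_dim)
        self.hid_dim = hid_dim

    def step(self, x, state):
        h, c = state
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hid_dim
        i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])          # input / forget gates
        g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])  # candidate / output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, (h, c)

def predict_future(past_points, k_future, hid=32):
    """Encode observed (x, y, t) points, then autoregressively decode
    k_future predicted (x, y, t) points — 'where and when' at once."""
    enc = LSTMCell(3, hid, seed=1)
    dec = LSTMCell(3, hid, seed=2)
    W_out = np.random.default_rng(3).normal(0.0, 0.1, (3, hid))  # hidden -> (x, y, t)
    state = (np.zeros(hid), np.zeros(hid))
    for p in past_points:          # encoder consumes the observed trajectory
        _, state = enc.step(p, state)
    preds, x = [], past_points[-1]
    for _ in range(k_future):      # decoder unrolls future points from the last input
        h, state = dec.step(x, state)
        x = W_out @ h
        preds.append(x)
    return np.stack(preds)

past = np.stack([[0.1 * i, 0.05 * i, 0.01 * i] for i in range(20)])
future = predict_future(past, k_future=5)  # shape (5, 3): five future (x, y, t) points
```

Because each predicted point carries a time coordinate as well as a position, a robot consuming such output knows in advance both where and when to move, which is how the quoted paper describes compensating for perception-action-loop delays.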
“…This suits the event-driven paradigm, for which the number of data points cannot be defined before the action unfolds. This "continuous memory" also allows feeding as input only the latest point to update the prediction at each time step, avoiding redundant computation in the network due to possible buffers [15]. To boost the learning process, and cope with the data-hungry nature of the chosen model-free approach, similarly to [10], we train on simulated trajectories, parametrized to capture the same statistics of the real ones, before fine-tuning the models in real-world experiments.…”
Section: Introduction
confidence: 99%
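The "continuous memory" point above — feeding only the latest event into a recurrent cell whose state is carried over, rather than replaying a growing buffer — can be demonstrated with a small sketch. This is an illustrative numpy example under assumed names and sizes, not the cited system: the cell is random and untrained, and the check simply confirms that an O(1)-per-event stateful update yields the same hidden state as reprocessing the whole buffer.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMCell:
    """Minimal LSTM cell (random, untrained weights) for illustration only."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hid_dim)
        self.W = rng.uniform(-scale, scale, (4 * hid_dim, in_dim + hid_dim))
        self.b = np.zeros(4 * hid_dim)
        self.hid_dim = hid_dim

    def step(self, x, state):
        h, c = state
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hid_dim
        i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
        g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, (h, c)

HID = 16
cell = LSTMCell(3, HID, seed=0)
events = [np.array([0.1 * i, 0.2 * i, 0.01 * i]) for i in range(10)]

def replay(buffer):
    """Buffered baseline: reprocess the entire history for every new event (O(n) per event)."""
    state = (np.zeros(HID), np.zeros(HID))
    h = np.zeros(HID)
    for e in buffer:
        h, state = cell.step(e, state)
    return h

# Continuous memory: carry (h, c) across events and feed only the latest one (O(1) per event).
state = (np.zeros(HID), np.zeros(HID))
for e in events:
    h_inc, state = cell.step(e, state)

h_buf = replay(events)
assert np.allclose(h_inc, h_buf)  # same state, without re-running the buffer
```

This equivalence is what makes the stateful update a natural fit for the event-driven paradigm: the number of incoming points need not be known in advance, and each new event costs a single cell update instead of a full-buffer replay.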