EV-Catcher: High-Speed Object Catching Using Low-Latency Event-Based Neural Networks

Ziyun Wang, Fernando Cladera Ojeda, Anthony Bisulco, Daewon Lee, Camillo J. Taylor, Kostas Daniilidis, M. Ani Hsieh, Daniel D. Lee, Volkan Isler

Research output: Contribution to journal › Article › peer-review

Abstract

Event-based sensors have recently drawn increasing interest in robotic perception due to their lower latency, higher dynamic range, and lower bandwidth requirements compared to standard CMOS-based imagers. These properties make them ideal for real-time perception in highly dynamic environments. In this work, we demonstrate an application where event cameras excel: accurately estimating the impact location of fast-moving objects. We introduce a lightweight event representation, the Binary Event History Image (BEHI), that encodes event data at low latency, together with a learning-based approach that infers a confidence-enabled control signal for the robot in real time. To validate our approach, we present an experimental system that catches fast-flying ping-pong balls. The system achieves an 81% success rate in catching balls launched at different target locations at velocities of up to 13 m/s, even on compute-constrained embedded platforms such as the Nvidia Jetson NX.
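
The abstract does not spell out how a BEHI is constructed. As a rough illustration only, the sketch below interprets it as a per-pixel binary record of whether any event arrived within a time window, which is one plausible reading of the name. The function name, the (x, y, t, polarity) event layout, and the choice to ignore polarity are assumptions for this sketch, not the paper's definition.

```python
import numpy as np

def binary_event_history_image(events, height, width, t_start, t_end):
    """Build a binary image marking pixels that fired in [t_start, t_end).

    events: (N, 4) array of (x, y, t, polarity) rows. Polarity is
    ignored here (an assumption); the image only records *whether* a
    pixel saw any event in the window, keeping one bit per pixel.
    """
    behi = np.zeros((height, width), dtype=np.uint8)
    # Select events whose timestamp falls inside the window.
    in_window = (events[:, 2] >= t_start) & (events[:, 2] < t_end)
    xs = events[in_window, 0].astype(int)
    ys = events[in_window, 1].astype(int)
    behi[ys, xs] = 1  # set every pixel that saw at least one event
    return behi

# Example: three synthetic events on a 4x4 sensor; the third falls
# outside the 10 ms window and is excluded.
events = np.array([
    [0, 1, 0.001, 1],   # (x, y, t, polarity)
    [2, 3, 0.004, -1],
    [1, 1, 0.020, 1],
])
print(binary_event_history_image(events, 4, 4, t_start=0.0, t_end=0.010))
```

A single-bit-per-pixel image like this is cheap to build and transmit, which is consistent with the low-latency, low-bandwidth framing in the abstract, though the paper's actual encoding may differ.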

Original language: English (US)
Pages (from-to): 8737-8744
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 7
Issue number: 4
DOIs
State: Published - Oct 1 2022
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Sensor-based control
  • Visual tracking
