Human Gesture Robot Control Using a Camera/Accelerometer-in-Palm Sensor

Jared Floersch, Perry Y. Li

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper considers the use of an in-palm sensor to enable intuitive robot control through human gesture. With the sensor, the human operator can control a robot simply by pointing the palm at it and directing the robot's motion with the palm's movement. The sensor consists of a Raspberry Pi Zero W single-board computer, an accelerometer, and an infrared camera module. Infrared LED targets are placed on the robot and observed by the camera. Using the LED image locations and gravity information from the accelerometer, the gesture-control device calculates the global position desired by the user in real time and broadcasts that information to a target computer via WiFi or Bluetooth. The target computer then controls the position of the robot via a low-level controller implemented on a real-time operating system. A key contribution is an accelerometer-assisted, vision-based localization that allows the algorithm to be executed on low-power single-board computers. The point-and-follow control was implemented on a desktop-scale robot with fast response and reasonable accuracy.
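The accelerometer-assisted localization mentioned in the abstract can be sketched as follows. The core idea is that a static accelerometer reading gives the gravity direction in the sensor frame, which fixes two of the three rotational unknowns (roll and pitch) before the camera measurements are used. The function name and axis convention below are illustrative assumptions, not the authors' implementation:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Recover roll and pitch from a static accelerometer reading.

    At rest, the accelerometer measures the gravity vector in the
    sensor frame; its direction pins down roll and pitch, leaving
    only yaw and translation for the vision-based solver to recover
    from the LED image points -- a much cheaper problem, which is
    what makes real-time execution on a Pi Zero class board feasible.
    Axis convention (an assumption): z points up when the palm is level.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Level palm: gravity is measured along +z, so roll and pitch vanish.
print(roll_pitch_from_accel(0.0, 0.0, 9.81))
```

With roll and pitch known, the remaining yaw-plus-translation pose can be estimated from fewer LED correspondences and with a simpler solver than a full six-degree-of-freedom pose computation.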

Original language: English (US)
Pages (from-to): 284-289
Number of pages: 6
Journal: IFAC-PapersOnLine
Volume: 54
Issue number: 20
DOIs
State: Published - Nov 1 2021
Event: 2021 Modeling, Estimation and Control Conference, MECC 2021 - Austin, United States
Duration: Oct 24 2021 - Oct 27 2021

Bibliographical note

Publisher Copyright:
Copyright © 2021 The Authors. This is an open access article under the CC BY-NC-ND license.

Keywords

  • Computer vision
  • Human machine interaction
  • Robot motion control
