Human-Robot Interface by Pointing with Uncalibrated Stereo Vision
Roberto Cipolla and Nicholas J. Hollinghurst

Department of Engineering,
University of Cambridge,
Cambridge CB2 1PZ, UK.


Here we present the results of an investigation into the use of a pointing-based interface for robot guidance. The system requires no physical contact with the operator: it uses uncalibrated stereo vision with active contours to track the position and pointing direction of a hand in real time. Under a ground-plane constraint, the indicated position in the robot's workspace can be found by considering only two-dimensional collineations.

Experimental and simulation data show that an accuracy of within 1 cm can be achieved in a 40 cm workspace, allowing simple pick-and-place operations to be specified by finger pointing.
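The abstract's key geometric idea is that, under a ground-plane constraint, image coordinates are related to workspace coordinates by a two-dimensional collineation (a plane-to-plane homography), which can be estimated from a few reference points with no camera calibration. The following is a minimal sketch of that idea, not the authors' implementation; all point values and function names are illustrative.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography_from_4pts(img_pts, world_pts):
    """Estimate the 3x3 collineation H mapping image -> ground plane from
    four point correspondences (H is fixed up to scale, so set h33 = 1)."""
    A, b = [], []
    for (u, v), (x, y) in zip(img_pts, world_pts):
        # x = (h11 u + h12 v + h13) / (h31 u + h32 v + 1), similarly for y
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, u, v):
    """Map an image point (u, v) to ground-plane coordinates."""
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return ((H[0][0] * u + H[0][1] * v + H[0][2]) / w,
            (H[1][0] * u + H[1][1] * v + H[1][2]) / w)
```

Given such a collineation for each camera (estimated once from four markers of known position on the workspace plane), the image of the indicated point can be mapped directly to workspace coordinates without ever recovering the cameras' intrinsic or extrinsic parameters.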


  • 1 Introduction
  • 2 Geometrical framework
  • 3 Tracking a pointing hand
  • 4 Implementation
  • 5 Conclusion
  • References


    We are indebted to Mr. Masaaki Fukumoto and Dr. Yasuhito Suenaga of the NTT Human Interface Laboratories, Yokosuka, Japan, who collaborated with Roberto Cipolla on an earlier implementation of this stereo pointing algorithm; and to Paul Hadfield, who helped implement an active-contour-based pointing system.