The Visual Panel Project

In many intelligent environments, users want an intuitive, immersive, and cost-efficient interaction device instead of conventional mice and keyboards. This project presents a vision-based gesture interface system, VisualPanel, which employs an arbitrary quadrangle-shaped panel and a tip pointer, such as a fingertip, as an intuitive input device. Taking advantage of the panel, the system can fulfill many UI tasks, such as controlling a remote large display and simulating a physical keyboard. Users can naturally use their fingers or other tip pointers to issue commands and type text. The system is built on accurate and reliable tracking of the panel and the tip pointer, together with detection of clicking and dragging actions. It runs at about 22 Hz on a PIII 800 MHz PC and is scalable and extensible.
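
The key geometric idea is that the panel's four tracked corners define a homography from the camera image to the display, so a fingertip position observed on the panel can be mapped to a cursor position on the screen. Below is a minimal sketch of that mapping using OpenCV; the corner values, function names, and screen size are hypothetical illustrations, not the original system's implementation (which also handles click and drag detection, not shown here).

```python
import numpy as np
import cv2

def panel_to_screen_homography(panel_corners, screen_w, screen_h):
    """Homography mapping the tracked panel quadrangle to the full screen.

    panel_corners: four (x, y) corners of the panel in camera-image
    coordinates, ordered top-left, top-right, bottom-right, bottom-left.
    (Hypothetical ordering; the original system's convention is not given.)
    """
    src = np.float32(panel_corners)
    dst = np.float32([[0, 0], [screen_w, 0],
                      [screen_w, screen_h], [0, screen_h]])
    return cv2.getPerspectiveTransform(src, dst)

def map_fingertip(H, tip_xy):
    """Map a fingertip position (camera coords) to screen coords via H."""
    pt = np.float32([[tip_xy]])           # shape (1, 1, 2), as OpenCV expects
    out = cv2.perspectiveTransform(pt, H)
    return tuple(out[0, 0])

# Example: a tilted panel seen by the camera, mapped to a 1024x768 display.
corners = [(210, 120), (540, 150), (520, 400), (180, 360)]  # made-up values
H = panel_to_screen_homography(corners, 1024, 768)
print(map_fingertip(H, (365, 260)))       # cursor position on the display
```

Because the homography is recomputed as the panel moves, the paper-based "mouse pad" can be held and tilted freely while the cursor mapping stays consistent.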



Demo Sequences:

  The Visual Panel System (short version) (33 MB, MPG)

Note [1]: Joint work with Dr. Zhengyou Zhang, Dr. Ying Shan, and Dr. Steve Shafer.

Note [2]: A longer (and larger, 85 MB) version can be found on Dr. Zhengyou Zhang's web site at MSR.


Publication:

[1] Zhengyou Zhang, Ying Wu, Ying Shan, and Steven Shafer, "Visual Panel: Virtual Mouse, Keyboard, and 3D Controller with an Ordinary Piece of Paper", in Proc. ACM Perceptive User Interface Workshop (PUI'01), Florida, November 2001.  [PDF]

Return to HCI Research


Updated 09/2000. Copyright © 2001-2003 Ying Wu