Multi-Touch Interaction Research
© 2006, Jeff Han
Bi-manual, multi-point, and multi-user interactions on a graphical interaction surface.
While touch sensing is commonplace for single points of contact, multi-touch sensing enables a user to interact with a system with more than one finger at a time, as in chording and bi-manual operations. Such sensing devices are inherently also able to accommodate multiple users simultaneously, which is especially useful for larger interaction scenarios such as interactive walls and tabletops.
Since refining the FTIR (frustrated total internal reflection) sensing technique, we've been experimenting with a wide variety of application scenarios and interaction modalities that utilize multi-touch input information. These go far beyond the "poking" actions you get with a typical touchscreen, or the gross gesturing found in video-based interactive interfaces. It is a rich area for research, and we are extremely excited by its potential for advances in efficiency, usability, and intuitiveness. It's also just so much fun!
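At the heart of an FTIR pipeline is a simple idea: an IR camera behind the surface sees bright blobs wherever fingertips frustrate the total internal reflection, and each connected bright region becomes one contact point. The sketch below shows just that contact-extraction step on a thresholded grayscale frame; it is an illustrative simplification (a real tracker would also do background subtraction, smoothing, and frame-to-frame blob matching), and the function and parameter names are our own, not part of any published implementation.

```python
def extract_contacts(frame, threshold=128):
    """Return (x, y) centroids of bright blobs in a grayscale frame.

    `frame` is a list of rows of pixel intensities (0-255). Uses a
    4-connected flood fill to group bright pixels into blobs; each
    blob's centroid is reported as one touch point.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    contacts = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one blob, collecting its pixels.
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    cx, cy = stack.pop()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                # Centroid of the blob = one touch point.
                contacts.append((sum(p[0] for p in pixels) / len(pixels),
                                 sum(p[1] for p in pixels) / len(pixels)))
    return contacts
```

Because fingertip pressure spreads the contact patch and brightens the blob, the same frame also carries the force information mentioned below; a weighted centroid over intensities would recover it.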
Our technique is force-sensitive, and provides unprecedented resolution and scalability, allowing us to create sophisticated multi-point widgets on surfaces large enough to accommodate both hands and multiple users.
The drafting-table-style implementation shown here measures 36" × 27", is rear-projected, and has a sensing resolution of better than 0.1" at 50 Hz. Stroke event information is sent to applications using the lightweight OSC protocol over UDP.
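OSC over UDP keeps the sensing layer decoupled from applications: each stroke event is a small, self-describing, connectionless datagram. As a rough illustration, here is a minimal OSC 1.0 message encoder; the address pattern `/touch` and the (id, x, y, pressure) argument layout are assumptions for the sketch, not the actual message schema used by our system.

```python
import socket
import struct

def _osc_string(s):
    """OSC-pad a string: NUL-terminated, length a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_touch_message(stroke_id, x, y, pressure):
    """Encode one stroke event as an OSC 1.0 message (layout assumed)."""
    msg = _osc_string("/touch")          # address pattern (hypothetical)
    msg += _osc_string(",ifff")          # type tags: int32, 3 x float32
    msg += struct.pack(">i", stroke_id)  # OSC numbers are big-endian
    msg += struct.pack(">fff", x, y, pressure)
    return msg

def send_touch(sock, host, port, stroke_id, x, y, pressure):
    """Fire one event over UDP -- connectionless and low-latency."""
    sock.sendto(osc_touch_message(stroke_id, x, y, pressure), (host, port))
```

An application listening on the receiving socket can decode each datagram independently, so dropped packets cost one sample rather than stalling a stream, which suits a 50 Hz event source well.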
We'll be working with other, more interesting form factors (both larger and smaller). Wouldn't it also be nice to identify which finger (e.g. thumb, index, etc.) is associated with each contact?