While touch-sensitive frames have been around for years, their size and responsiveness have been limited. A new prototype called ZeroTouch, developed at Texas A&M University’s Interface Ecology Lab, creates possibilities for interaction beyond typical interfaces such as the glass touchscreens on smartphones and laptops.
The 28-inch ZeroTouch frame with scalloped edges can detect whatever moves inside it: fingertips, hands, arms, and inanimate objects all pass through an invisible two-dimensional optical web that tracks them. Arrayed around the frame’s four edges are infrared LEDs whose invisible beams shine across the open interior, interleaved with 256 modulated infrared sensors that register the beams from the LEDs opposite them. Placed over a computer screen, ZeroTouch turns it into an interactive surface that can be manipulated with a stylus. The user simply has to break the light beams; no force is required to trigger the sensors.
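The idea behind this kind of break-beam sensing can be sketched in a few lines of code. The function below is an illustrative assumption, not the lab’s actual algorithm: it supposes the frame reports which vertical and horizontal beams an object interrupts, and estimates the object’s position and extent by intersecting the two sets of blocked beams.

```python
def locate_touch(blocked_cols, blocked_rows):
    """Estimate the centre and extent of one occluding object.

    blocked_cols: indices of vertical beams (left to right) that an
                  object interrupts.
    blocked_rows: indices of horizontal beams (top to bottom) that
                  the same object interrupts.
    Returns (x, y, width, height) in beam units, or None if nothing
    is inside the frame.
    """
    if not blocked_cols or not blocked_rows:
        return None  # no beams broken: frame is empty
    x = sum(blocked_cols) / len(blocked_cols)  # centroid column
    y = sum(blocked_rows) / len(blocked_rows)  # centroid row
    w = max(blocked_cols) - min(blocked_cols) + 1  # horizontal extent
    h = max(blocked_rows) - min(blocked_rows) + 1  # vertical extent
    return (x, y, w, h)

# A fingertip interrupting columns 10-12 and rows 40-41:
print(locate_touch([10, 11, 12], [40, 41]))  # (11.0, 40.5, 3, 2)
```

The real system has to disambiguate multiple simultaneous objects, which single-axis intersection alone cannot do; this sketch only conveys why breaking beams, with no pressure at all, is enough to register an input.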
When a user places one or more fingers or other objects within the frame, the system’s software calculates the size, shape, and location of those objects and maps them to corresponding inputs on a Windows 7 computer screen. The research prototype was built from commercially available sensors of the kind usually found in TV remote controls, so it cost only about $450 to construct. Beyond improving device interaction, ZeroTouch could potentially serve as a training aid for surgeons, since it can track fine hand movements, or provide interactive instructions for assembling complicated machinery. ZeroTouch was presented at last week’s 2011 Conference on Human Factors in Computing Systems in Vancouver.