October 21, 2008 / by John Heylin / In association with Future Blogger.net
Tired of mucking about with your touchscreen? Constantly worrying about scratching the screen in your pocket? Wiping it with your t-shirt to get your greasy fingerprints off? Microsoft may have an answer.
SideSight, a prototype from Microsoft, uses infrared proximity sensors to determine which way you want to spin or expand the screen of your smartphone. "The sensors can read inputs up to 10 centimeters away, just through reflected infrared light." This way you can browse through your phone without having to worry about mucking up your screen.
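To get a feel for the idea, here is a minimal sketch of how a row of proximity sensors along the phone's edge could be turned into a swipe gesture. This is purely illustrative and assumes invented details (a five-sensor strip, a reflectance threshold, the function names), not Microsoft's actual SideSight implementation: each frame, the sensor with the strongest infrared reflection marks the finger's position, and the direction that peak drifts over successive frames gives the swipe.

```python
# Hypothetical sketch of SideSight-style sensing, not Microsoft's actual
# code: a strip of infrared proximity sensors along the phone's edge
# reports reflected-light intensity per sensor; tracking the strongest
# reflection across frames yields a swipe direction.

def finger_position(readings):
    """Index of the sensor with the strongest reflection, or None if no
    finger is in range (all readings below an assumed threshold)."""
    THRESHOLD = 0.2  # assumed minimum reflectance for a detection
    peak = max(range(len(readings)), key=lambda i: readings[i])
    return peak if readings[peak] >= THRESHOLD else None

def classify_swipe(frames):
    """Given successive sensor-array snapshots, name the swipe direction."""
    positions = [p for p in (finger_position(f) for f in frames)
                 if p is not None]
    if len(positions) < 2:
        return "none"   # finger never tracked long enough to move
    delta = positions[-1] - positions[0]
    if delta > 0:
        return "right"
    if delta < 0:
        return "left"
    return "hold"       # finger present but stationary

# A finger sweeping across a five-sensor strip, one snapshot per frame:
frames = [
    [0.9, 0.3, 0.1, 0.0, 0.0],
    [0.3, 0.9, 0.3, 0.1, 0.0],
    [0.1, 0.3, 0.9, 0.3, 0.1],
    [0.0, 0.1, 0.3, 0.9, 0.3],
]
print(classify_swipe(frames))  # → right
```

A real device would of course filter sensor noise and sample much faster, but the core mapping from reflected-light peaks to gestures can stay this simple.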
While this technology is limited (for instance, the sensors need a flat surface to work), it shows amazing potential for future phone interactions. With sensors placed all around the phone, you could shuffle through images or browse sites by moving your hands directly in front of the screen. Knowing exactly where your hands are adds the bonus of controlling the interface with individual fingers or with hand position itself, something a touchscreen can only do through physical contact.