*A simple hardware innovation, but why not? If your mobile is seeing something, it might as well see everything all the time.*
(...)
"Xing-Dong Yang at the University of Alberta in Edmonton and colleagues reasoned that, if placed on a phone's user-facing camera instead of the main, rear one – which mostly points at the ground – such a set-up could monitor activity and track objects in the user's surroundings. So the team fitted a Dot to the front camera on an HTC Butterfly phone and trained an Android app to recognise locations and gestures.
"With the app, users can control laptops, wireless speakers and other objects using pinch and swipe gestures in the air. The app can also ask users whether they have forgotten their phone when they walk away from it. If you don't want to take your phone with you, a sweeping hand gesture sets it to voicemail mode. When it recognises multiple faces in a gathering, the phone automatically mutes itself, and if it identifies the interior of your car and hears engine sounds, it blocks calls as a safety measure, Yang told New Scientist.
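The behaviours described above amount to a set of context-to-action rules: the app observes cues (gestures, faces, location, sounds) and picks a phone policy. A minimal sketch of that idea, in Python, with entirely hypothetical names and rules (this is not Surround-See's actual code or API):

```python
def choose_action(context):
    """Pick a phone action from detected context cues.

    `context` is a hypothetical dict of detections, e.g.
    {"gestures": ["sweep_gesture"], "faces": 0,
     "location": "desk", "engine_sound": False}.
    Rules mirror the behaviours reported in the article.
    """
    if "sweep_gesture" in context["gestures"]:
        return "voicemail_mode"           # user waved the phone away
    if context["faces"] > 1:
        return "mute"                     # gathering detected: silence phone
    if context["location"] == "car_interior" and context["engine_sound"]:
        return "block_calls"              # driving: block calls for safety
    if {"pinch", "swipe"} & set(context["gestures"]):
        return "control_nearby_device"    # e.g. laptop or wireless speaker
    return "no_op"                        # nothing relevant detected
```

The interesting design point is priority ordering: an explicit user gesture overrides passive context cues, so a deliberate sweep wins even if the camera also sees faces.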
"Called Surround-See, Yang's combination of lens and app gives just a glimpse of what smartphones could offer if fitted with depth-sensing cameras. That became a distinct possibility last week after it emerged that Apple is in talks to purchase PrimeSense, (((oh really))) the Israel-based firm that pioneered the depth camera at the heart of Microsoft's Kinect sensor. The news is fuelling speculation that future Apple devices will be fitted with gesture-sensing technology...."