I’ve been briefing organizations for years on the future of mobility and how our interaction with mobile devices will evolve. Samsung’s Galaxy line is starting to bring some of those predictions to reality, in a limited way. Our phones are packed with sensors, from the camera and GPS to the gyroscope and microphone, and they are going to get smarter and smarter about what they present to us based on that sensor input.
Samsung ships a batch of attention-aware features in the Galaxy line. One, ‘Smart Pause’, suspends video playback when you look away from the screen. Yes, that means the phone is constantly watching to see where you are looking; we can start calling this “Little Brother”, since now our phones are watching us. A related feature, ‘Smart Stay’, controls the backlight: the screen stays lit while you’re looking at the device instead of going dark after a fixed delay.
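Samsung hasn’t published how Smart Pause works, but the behavior described above can be sketched as a small state machine: a face/gaze detector reports, frame by frame, whether the viewer is looking, and playback pauses only after a short run of “not looking” frames so a blink or a missed detection doesn’t interrupt the video. Everything here (the class, the threshold) is a hypothetical illustration, not Samsung’s implementation.

```python
class AttentionAwarePlayer:
    """Minimal sketch of Smart Pause-style playback control (hypothetical)."""

    def __init__(self, look_away_threshold=3):
        # Consecutive "not looking" frames required before pausing,
        # so a single blink or dropped detection doesn't pause playback.
        self.look_away_threshold = look_away_threshold
        self.missed_frames = 0
        self.playing = True

    def on_frame(self, viewer_is_looking):
        """Feed one detector result per video frame; returns the action taken."""
        if viewer_is_looking:
            self.missed_frames = 0
            if not self.playing:
                self.playing = True
                return "resume"
        else:
            self.missed_frames += 1
            if self.playing and self.missed_frames >= self.look_away_threshold:
                self.playing = False
                return "pause"
        return "no-op"
```

In use, the camera pipeline would call `on_frame` with each detection result; tuning `look_away_threshold` trades responsiveness against false pauses.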
Another interaction method is making gestures in mid-air rather than touching the screen. Samsung’s Air Gesture lets you scroll and navigate the phone without touching it, and Air View lets a finger hovering over the screen, like a stylus, trigger previews and other functions. I’m looking forward to trying these in person, but from the videos I’ve digested the set of gestures and controls doesn’t seem very intuitive yet. It’s going to take some refinement before the average user is really comfortable with mid-air gestures.
There have been a lot of internet rumors about Samsung employing eye-tracking technology to add even more intelligence to the phone (the Smart Stay features look to pay attention to more of the full face). While not apparent in the Galaxy S4, we can certainly expect that sort of capability soon. Of course, what we do with that input will require serious thought about the user experience. Scrolling a page with the eyes sounds like a bad idea: asking the eyes that are supposed to be reading the page to also move it around invites double duty and confusion. Maybe that will mean a twitch of the mouth to skip forward a page.
A big leap we’re going to see in the coming years is our phones touching us back. Research prototypes have already shown that a high-voltage electric field behind the screen can create a touch sensation out in front of it. That means we’ll get genuine tactile feedback and be able to touch the objects on our screens in mid-air.