One of the limitations of smartphones is that they don't know what's around them: the cameras only face front or back. Forget to set the ringer to vibrate before a meeting, and the phone can go off and irritate everyone. It can't tell when you're driving, can't tell whether you're using a stylus or a finger on the touch screen, and can't pick up gestures from across the room.
That has now changed. Xing-Dong Yang, a graduate student in the Department of Computing Science at the University of Alberta, has made his phone (an HTC, as it happens) able to see its surroundings and learn what it sees. He calls the device Surround-See.
“We can train the phone,” Yang said. “It takes a number of pictures of the environment, as samples.” A machine-learning algorithm then lets the phone compare what it currently sees against that library of sample pictures. The algorithm itself is off-the-shelf, but no one had combined it with a smartphone and a panoramic lens before.
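The article doesn't describe Yang's actual pipeline, but as a rough sketch of how training on sample pictures might work, here is a minimal example using assumed stand-ins: color-histogram features and a k-nearest-neighbor classifier playing the role of the off-the-shelf learner, with randomly generated arrays in place of real panoramic frames.

```python
# Illustrative sketch only: Surround-See's real algorithm and features are
# not specified in the article. We assume simple color-histogram features
# and a k-nearest-neighbor classifier as the "off-the-shelf" learner.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def histogram_features(image, bins=8):
    """Flatten an RGB image into normalized per-channel color histograms."""
    features = []
    for channel in range(3):
        hist, _ = np.histogram(image[..., channel], bins=bins, range=(0, 255))
        features.append(hist / hist.sum())
    return np.concatenate(features)

# Stand-ins for the "sample pictures" taken during training; in practice
# these would be panoramic frames captured in each environment.
rng = np.random.default_rng(0)
meeting_room = [rng.integers(0, 120, (64, 256, 3)) for _ in range(5)]    # darker scenes
car_interior = [rng.integers(100, 255, (64, 256, 3)) for _ in range(5)]  # brighter scenes

X = [histogram_features(img) for img in meeting_room + car_interior]
y = ["meeting room"] * 5 + ["car"] * 5

classifier = KNeighborsClassifier(n_neighbors=3)
classifier.fit(X, y)

# At run time, the phone compares a fresh panoramic frame to the library.
new_frame = rng.integers(100, 255, (64, 256, 3))
print(classifier.predict([histogram_features(new_frame)]))  # -> ['car']
```

The appeal of this kind of approach is that the heavy lifting is done by a standard classifier; the novelty Yang describes is in the capture side, feeding it panoramic views of the phone's surroundings.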
Now Yang's phone can recognize gestures he makes from across the room, so he can turn it on or off or answer a call. The phone even warns him when he is in a car, and asks whether he wants to take it with him when he leaves the room. He has a video showing some of the phone's capabilities.