System recognizes hand gestures to expand computer input beyond the keyboard

Researchers are developing a new technology that uses hand gestures to perform commands on computers.

The prototype, called “Typealike”, works through an ordinary laptop webcam with a simple mirror attached. The program recognizes the user’s hands beside or near the keyboard and triggers operations based on different hand positions.

A user could, for example, place their right hand with the thumb pointing up next to the keyboard, and the program would recognize this as a signal to turn up the volume. Different gestures and combinations of gestures can be programmed to perform a wide range of operations.
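As a purely hypothetical illustration of that kind of mapping, a gesture-to-command table could be expressed as in the sketch below. The gesture names and commands are placeholders, the volume calls assume a Linux desktop with PulseAudio, and the article does not describe Typealike’s actual programming interface.

```python
# Hypothetical sketch: map recognized gesture labels to desktop commands.
# Gesture names and actions are illustrative only, not Typealike's API.
import subprocess


def volume_up():
    # Example for a Linux desktop with PulseAudio; other platforms differ.
    subprocess.run(["pactl", "set-sink-volume", "@DEFAULT_SINK@", "+5%"])


def toggle_mute():
    subprocess.run(["pactl", "set-sink-mute", "@DEFAULT_SINK@", "toggle"])


# One entry per label the gesture recognizer can emit.
GESTURE_ACTIONS = {
    "right_thumb_up": volume_up,
    "right_fist": toggle_mute,
}


def dispatch(gesture_label: str) -> None:
    """Run the command bound to a recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture_label)
    if action is not None:
        action()
```

In a setup like this, adding a new operation would only require registering another label-to-function pair in the table.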

Innovation in human-machine interaction aims to make the user experience faster and smoother, with less reliance on keyboard shortcuts or on switching to a mouse and trackpad.

“It all started with a simple idea for new ways to use a webcam,” said Nalin Chhibber, a recent master’s graduate from the Cheriton School of Computer Science at the University of Waterloo. “The webcam is pointed at your face, but most of the interactions that happen on a computer are around your hands. So we thought: what could we do if the webcam could pick up hand gestures?”

The initial idea led to the development of a small mechanical attachment that redirects the webcam downward toward the hands. The team then created software that can recognize distinct hand gestures under varying conditions and for different users, using machine learning techniques to train the Typealike program.
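On the capture side, a minimal sketch of how such a setup might be read in software is shown below. It assumes OpenCV and that the clip-on mirror delivers a vertically flipped view of the keyboard area; the frame handling and the classifier call are illustrative placeholders, not Typealike’s actual pipeline.

```python
# Minimal capture sketch (assumes OpenCV; details are illustrative).
import cv2

cap = cv2.VideoCapture(0)  # ordinary laptop webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The clip-on mirror reflects the keyboard area, so the image arrives
    # flipped; undo that before handing it to the gesture recognizer.
    hands_view = cv2.flip(frame, 0)  # 0 = flip vertically
    # gesture = classifier.predict(hands_view)  # hypothetical model call
    cv2.imshow("hands", hands_view)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```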

“It’s a neural network, so you have to show the algorithm examples of what you’re trying to detect,” said Fabrice Matulic, senior researcher at Preferred Networks Inc. and former postdoctoral researcher at Waterloo. “Some people will gesture a little differently, and the hands vary in size, so you have to collect a lot of data from different people with different lighting conditions.”
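As a rough illustration of the kind of training described here, the sketch below assumes PyTorch, a folder of labelled hand images, and simple brightness and contrast jitter to mimic varying lighting. The architecture, paths, and labels are placeholders rather than the team’s actual model.

```python
# Rough training sketch (PyTorch assumed; architecture, paths, labels are placeholders).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Randomly vary brightness/contrast to mimic different lighting conditions.
train_tf = transforms.Compose([
    transforms.Resize((96, 96)),
    transforms.ColorJitter(brightness=0.4, contrast=0.4),
    transforms.ToTensor(),
])

# Expects one subfolder per gesture label, e.g. data/gestures/thumb_up/*.jpg
train_set = datasets.ImageFolder("data/gestures", transform=train_tf)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Small convolutional classifier: two conv blocks, then a linear layer.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 24 * 24, len(train_set.classes)),
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```

The data augmentation step is one simple way to reflect the point in the quote: examples gathered from many people under many lighting conditions (and varied further during training) help the classifier generalize.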

The team recorded a database of hand gestures with dozens of research volunteers. They also asked the volunteers to complete tests and surveys to help the team understand how to make the program as functional and versatile as possible.

“We’re always looking to create things that people can easily use,” said Daniel Vogel, associate professor of computer science at Waterloo. “People look at something like Typealike, or other new technology in the human-machine interaction arena, and they say that makes sense. That’s what we want. We want to create technology that is intuitive and simple, but sometimes it takes a lot of complex research and sophisticated software.”

Researchers say there are further applications for Typealike in virtual reality, where it could eliminate the need for hand-held controllers.

Story source:

Materials provided by University of Waterloo. Note: Content may be edited for style and length.


