New Technology Lets You Control TV or Tablet with a Mug, Spoon or Your Foot

Cooking along to a YouTube recipe video and need to rewind, but your fingers are covered in goo? Sitting on the sofa with a tray holding a bowl of food and a glass of wine on your lap, but the remote control is out of reach? These first-world problems may soon be a thing of the past thanks to new technology being developed at Lancaster University, as recently reported by the technology publication Wired.

The research team, part of the university's computing and communications department, hopes that as well as taking the mild inconvenience out of scenarios like those above, their work could be of huge benefit to people with disabilities. The system, named Matchpoint, is still an academic prototype, but it is showing plenty of promise.

Combining a webcam and motion sensor with a custom digital interface, the system lets users temporarily designate any object or body part as a 'pointer' that then controls an on-screen cursor. It is a little like Microsoft's Kinect technology, which never gained real traction, but it takes advantage of recent technological advances to improve on the general concept.

Rather than looking for a specific body part or object, the Matchpoint system simply looks for generic motion. Moving targets orbit a small circle displayed on the screen of the device being controlled, and when the user wants to hand the 'remote' over to another body part or object, they simply sync its movement with the circle.
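The underlying idea, often described as motion correlation, is simple enough to sketch. The snippet below is only an illustration of that general principle, not code from the Lancaster project: it compares the recent trajectory of any tracked point (a hand, a mug, a foot) against the on-screen target's circular orbit and 'couples' it as the pointer once the two motions match closely enough. The window length, orbit parameters and correlation threshold are assumed values chosen for the example.

```python
import numpy as np

# Illustrative sketch of motion-correlation selection: whichever tracked
# motion best follows the on-screen orbiting target becomes the pointer.
# WINDOW, THRESHOLD and the orbit parameters are assumptions, not figures
# from the Matchpoint research.

WINDOW = 60       # number of recent samples to compare (~2 s at 30 fps)
THRESHOLD = 0.8   # minimum correlation needed to accept a coupling


def target_trajectory(n, radius=20.0, period=90):
    """x/y positions of a target orbiting a small on-screen circle."""
    t = np.arange(n)
    angle = 2 * np.pi * t / period
    return radius * np.cos(angle), radius * np.sin(angle)


def correlation(tracked_x, tracked_y, target_x, target_y):
    """Mean of the x and y Pearson correlations between two motions."""
    cx = np.corrcoef(tracked_x, target_x)[0, 1]
    cy = np.corrcoef(tracked_y, target_y)[0, 1]
    return (cx + cy) / 2.0


def couples_with_target(tracked_x, tracked_y):
    """True if the last WINDOW samples of tracked motion follow the target."""
    x = np.asarray(tracked_x)[-WINDOW:]
    y = np.asarray(tracked_y)[-WINDOW:]
    if len(x) < WINDOW:
        return False  # not enough motion history yet
    tx, ty = target_trajectory(WINDOW)
    return correlation(x, y, tx, ty) >= THRESHOLD


# Example: motion perfectly in phase with the target couples immediately.
tx, ty = target_trajectory(WINDOW)
print(couples_with_target(tx, ty))  # -> True
```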

Technology giants such as Amazon and Google are also working on motion-based control systems, but it looks like the academics on the Matchpoint team might currently have an edge over their big-budget commercial rivals. Perhaps the most significant potential value of the system is how it could make life easier for people with disabilities. As the university's Chris Clarke, part of the Matchpoint team, explains:

“They might not be able to use their hands to interact, so they could use their foot instead. It sounds silly, but most gesture control systems rely on you having hands in the first place. The Matchpoint system doesn’t make any assumptions about the user.”
