After the hands-free phone – the hands-free computer. Microsoft has secretly developed a prototype handheld computer that lets users navigate around large documents or images on screens by moving their hands or heads – but without touching the computer.
The system, developed at the company’s British research centre in Cambridge, was taken to the US last week by its British-born inventor to be shown off at the company’s headquarters, where its commercial potential will be assessed.
Rather than the usual mouse or keyboard, the new system – dubbed “Sonarscreen” – senses movement using ultrasonic emitters and sensors attached to the screen. The sound is inaudible to humans, but the computer can detect changes in the echoes and interpret them as movement, much as a bat senses how near walls and objects are.
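The bat analogy describes time-of-flight ranging: a pulse goes out, reflects off a hand or face, and the delay before the echo returns gives the distance. A minimal sketch of that calculation is below; the function name and the example timings are illustrative assumptions, not details of Microsoft's implementation.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C


def distance_from_echo(emit_time_s: float, echo_time_s: float) -> float:
    """Estimate the distance to a reflecting object from a pulse's round trip.

    The ultrasonic pulse travels out to the object and back again, so the
    one-way distance is half the round-trip time times the speed of sound.
    """
    round_trip_s = echo_time_s - emit_time_s
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0


# A hand about 30 cm from the screen returns an echo after roughly 1.75 ms:
print(distance_from_echo(0.0, 0.00175))  # ≈ 0.30 m
```

Tracking how this estimate changes from one pulse to the next is what lets the system read motion – a shrinking distance means the user is leaning in, a sideways-moving echo means a hand gesture.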
Its developer, Lyndsay Williams, hit on the idea when she was trying to navigate her way through a map of Cambridge on the Web to find a specific location.
The result – familiar to many Web users – was a frustrating episode in which she kept clicking to zoom in and then out and then left and then right, but kept missing the image she wanted.
“I said to myself, there has to be an easier way to handle large images and documents without depending on a mouse and without so many clicks,” she noted in an internal Microsoft briefing document that has been seen by The Independent.
Although Microsoft is best known as a software company, it has recently moved into making hardware with its Xbox games console, to be released in Europe in March, and says “major hardware manufacturers” will offer pen-operated tablet-shaped PCs, made to its specification, “in the second half of 2002”. The tablets would be the ideal format for the new ultrasonic system – although some observers have cast doubts on whether the shape will prove popular with users.
Ms Williams, who has a long pedigree in computing – including having designed the hardware for the first sound card in an IBM-compatible PC in 1987 – rigged up a working model of her system in just four weeks.
At present it is just a demonstration model, believed to be based on the Tablet PC, which is essentially a large screen with a computer processor attached. In one software setup, for map viewing, moving the face nearer to the screen enlarges the onscreen image, and gesturing left or right with the hands scrolls the image in the same direction.
Another setup will enlarge onscreen text as the face gets further away – so that one can view the screen, say for reading an electronic document, from a comfortable distance.
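The two setups invert the same mapping from sensed face distance to magnification: the map viewer enlarges as the user leans in, while the text viewer enlarges as the user moves away. A sketch of both mappings follows; the reference distance and scale factors are assumptions for illustration, not values from the prototype.

```python
REFERENCE_DISTANCE_M = 0.5  # assumed comfortable viewing distance
MIN_DISTANCE_M = 0.05       # clamp to avoid division by near-zero readings


def map_zoom(face_distance_m: float) -> float:
    """Map-viewing mode: a nearer face produces a larger image."""
    return REFERENCE_DISTANCE_M / max(face_distance_m, MIN_DISTANCE_M)


def text_zoom(face_distance_m: float) -> float:
    """Text mode: a farther face produces larger text, keeping it readable."""
    return max(face_distance_m, MIN_DISTANCE_M) / REFERENCE_DISTANCE_M


print(map_zoom(0.25))  # face at half the reference distance -> 2.0x image
print(text_zoom(1.0))  # face at twice the reference distance -> 2.0x text
```

Either function would be driven by the continuously updated distance estimate from the ultrasonic sensors, so the zoom tracks the user's head in real time rather than requiring repeated clicks.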
“We proved the technology works and now we really need to test users to find out if this method of manipulating information is easy and comfortable for customers to use on a handheld device,” Ms Williams notes in the internal document.
But Andy Brown, research manager of mobile computing at IDC, a computing market research and analysis company, said: “It sounds interesting, but over the years there have been a lot of things that are invented, but very few actually materialise. The trouble is that their invention is driven by engineers rather than end-users. And though tests might find that people say they like it, if it’s at all expensive to implement then it won’t get taken up.”
However he said that it could have interesting applications once “pervasive” computing – where computers are built into homes and react with people constantly – becomes commonplace. “But that’s about five to ten years away,” he added.
Microsoft Cambridge estimates that the ultrasonic sensors cost no more than a few pounds each, and are just 3 millimetres across – which ought to mean they would add no significant cost for PC manufacturers.
The Microsoft Cambridge laboratories were established in June 1997.