

Improving ways to interact with your computer — Natural User Interface

What major advancements do you want to see with PC operating systems? "Ease of use" may be a vague catch-all phrase, but it is a continual quest: "How can PCs be easier?" "How can more people benefit?" Consider how most people primarily interact with PCs today: keyboard and touchpad. Speech, touch, and pen have been wonderful emerging methods, even offered in some products, but we’re only just beginning to delve into natural interaction.

For example, an adult male speaking his native language to his PC will get impressive speech recognition results, whereas an 8-year-old girl with a high-pitched voice speaking a foreign language may not, to say nothing of conditions that affect everyone at one time or another, like a cold and cough. As usable as today’s speech technology is, there is still a tremendous way to go to build out the possibilities. Wouldn’t it be great for a PC to identify and recognize multiple voices singing in a choir, or the voices in a conversation? Or, shifting to touch, what about multiple people interacting with objects on a surface at the same time, with the PC smart enough to know the angles they’re reaching from, or perhaps who is touching it? What about vision sensing in robots, not only for location but also for facial recognition?

So, when I read transcripts of Bill Gates or others challenging people to think beyond the traditional interaction methods we’ve accepted in PCs, I think about how to rally the community to express ideas around them. These paradigm shifts won’t happen overnight. I’m sure we’ll continue to see incremental business models and technology development around these ideas, picking up what worked in the past and adding new features, until we get closer to these ideals. There will be some convergence, as well as plenty of expansion. Then it’s a matter of taking that incremental progress and improving it in stages too.

Today at TechEd, Bill Gates stated:

There’s a number of technologies that our research group and others have been working on for these decades that are now moving into the mainstream. It’s a combination of software advances, and hardware power that allow us to bring new interaction techniques to a mainstream world. We collectively refer to these as natural user interface, but it’s several different things. It’s the idea of touch panel, and we gave a glimpse just last week of some of Windows 7, and the thing we chose to highlight there was this touch support, and how we built that in and made that easy for developers, and how end users will like that.

We’ve also got the pen capability that we’re taking to a whole new level in terms of easy recognition, and how that is implemented in the hardware. I think of every student having a device that avoids the need for paper textbooks. The tablet device will let them take notes, record audio, connect to the Internet. It will be superior in every way, and yet it can’t be purely keyboard based. It has to have this touch and pen as well.

We also have the speech recognition. On the phone today, if you call up information, that is a piece of software from a Microsoft group called TellMe that’s taking those information calls, and we’re building up the database, the speech model, of people in general, and people specifically to allow that speech interaction to be very rich. And so as we look out over the next decade, the way you interact with that cell phone, speech will be a major part of it.

The final natural interface piece, one that I think is perhaps the most important of all, is vision. A camera is very inexpensive, and putting software behind it that can tell what it’s seeing allows you to have gestures, and movements, things that will be used in a variety of settings. We put out our Microsoft Surface product that actually uses a camera to project up onto a table surface, and there you can point with your hands, or put objects on the table, and the software sees them. It’s being used in retail stores, and as that price comes down, that would be in every office, it will be in every home. Your desk won’t just have a computer on it, it will have a computer in it. And your whiteboard will be intelligent. You can walk up, take information, expand it, point to somebody’s name, start a teleconference with them, sit there and exchange information. And so natural interface really has a pretty dramatic impact on making these tools of empowerment, the personal computer, making them pervasive, and looking at them in new ways.

Although you could interpret these comments as directed at specific products, they really are inspirational to the industry and indicative of this next level of advancement required to push PCs to "what’s next." What do you want next?

Lora
Lora is passionate about student access to technology and information, particularly 1:1 computing environments. She also has a strong interest in natural user input, user experience, and interaction behavior patterns.
