The coming revolution in computer usability


One of the biggest complaints among doctors and other healthcare providers is how much time they must spend entering data into devices connected to their electronic health record (EHR) systems — time that could be better spent focusing on their patients.

Two health experts writing in Harvard Business Review call for a “revolution in usability,” arguing that “voice and gesture-based interfaces” should replace the keyboard and mouse as the means of building and interacting with the record.

But you don’t have to be logging endless hours inputting patient data into an EHR to long for a revolution in usability. Enterprise workers and consumers everywhere suffer from the inefficiencies of traditional computing interfaces and tools such as mice, keyboards and drop-down menus. These technologies are decades old, people! Yet here the vast majority of users are today, typing and pointing and clicking and scrolling and swiping, like lower-form primates.

To be sure, voice commands are gaining a foothold thanks to improved technology and steady consumer adoption of voice assistants for the home. Still, we’re probably a long way off from the day when most computer users will intuitively choose voice interactions over physical manipulation of screens and interfaces.

The thing is, it probably won’t be a binary choice because even more ways to interact with computing devices are in the works. In addition to gesture-based computing (as mentioned above), virtual and augmented reality (VR and AR) technologies already are being embraced in the enterprise in a number of roles. Applying them directly to device interaction will open up a world of 3D computing that might look quite familiar to Tony Stark.

Over at Digital Trends, writer Luke Dormehl flags some other emerging technologies through which humans may someday routinely interact with computers. The two that really grabbed my attention are 1) emotion sensing and 2) brain interface.

“While it’s more of a way of improving interfaces, rather than an interface in its own right, emotion sensing can assist users by pulling up relevant suggestions based on how you’re feeling at that precise moment,” Dormehl writes. Emotion sensing could enable a computer to know “the optimal time for you to do work based on your productivity levels” or determine a user’s moods by analyzing their typing, he adds.
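To make the typing-analysis idea concrete, here is a minimal, purely illustrative sketch of what a mood-from-typing heuristic might look like. Everything in it — the function name, the timing thresholds, and the state labels — is an assumption for illustration, not a description of any real emotion-sensing product or a validated model.

```python
# Hypothetical sketch: guessing a user's state from typing rhythm.
# Thresholds and labels are illustrative assumptions only.
from statistics import mean, pstdev

def classify_typing(intervals_ms):
    """Guess a coarse state from inter-keystroke intervals (milliseconds)."""
    if len(intervals_ms) < 2:
        return "unknown"
    avg = mean(intervals_ms)       # average pause between keystrokes
    jitter = pstdev(intervals_ms)  # how erratic the rhythm is
    if avg < 120 and jitter < 40:
        return "focused"           # fast, even typing
    if jitter > avg:
        return "distracted"        # wildly uneven rhythm
    return "neutral"

# Steady, quick typing reads as "focused"
print(classify_typing([100, 110, 95, 105, 100]))  # focused
```

A real system would of course need far richer signals (and careful validation), but the basic shape — turn raw input timing into features, then map features to a suggested state — is the pattern Dormehl is describing.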

Brain interfaces are the ultimate frictionless computer communications vehicle. Think about a task, and the computer gets to work! This one’s going to require some effort, though, not only on the technology side, but on the human side. That’s because our thoughts aren’t nearly as linear as our words. We may start out thinking, “I’d like to see the sales spreadsheet for Q2,” only to suddenly be musing about how much we’d like a venti caramel macchiato right about now! Also, I wonder why my foot is so itchy? If the computer can’t sort all that out, you may have a dermatologist delivering a large Starbucks drink to your office. Not the worst outcome, but not exactly on point. And you still won’t have that spreadsheet.

While we may not see brain interfaces or emotion sensing until after whatever generation follows Generation Z is in the workforce, VR and AR, gesture-based commands (including pre-touch sensing) and other computer interface technologies are being actively explored by technology researchers today. It’s only a matter of time before they’re in use.
