Tuesday, April 10, 2018

Road to Zanzibar

No, I'm not going to talk about the Bob Hope-Bing Crosby movie (ask your grandparents).  Rather, I want to cover some new kinds of interfaces, the new platforms upon which our computing needs are going to be based -- including but not limited to Microsoft's Project Zanzibar.

Voice is the new interface, right?  Wired said so a couple of years ago, and Farhad Manjoo recently agreed, both picking Alexa as the likely "winner" (although Siri and others aren't conceding anything).  In Forbes, Ilker Koksal predicted that voice-first devices are the "next big thing," since: "Finally, there will be a single interface for interacting with a diverse variety of devices at home or on-the-go, making it easier than ever for users to accomplish any task hands-free."

The Harvard Business Review recently speculated on what healthcare would look like in a world of smart speakers, and Healthcare IT News wrote a "Special Report" explaining that "AI voice assistants have officially arrived in healthcare."  The case for voice, it might seem, is officially closed.

Not so fast.  There certainly will be voice interfaces, but that's not all there's going to be, and even voice interfaces won't necessarily be what we're now getting used to.

For one thing, "voice" interfaces may not even require you to actually speak.  MIT researchers have developed a wearable device and associated computing system that "can transcribe words that the user verbalizes internally but does not actually speak aloud."  It can also "talk" back to you silently, transmitting vibrations directly to your inner ear.  You could, in effect, have a complete conversation with your AI without either of you uttering a sound. 

Image: Lorrie Lejeune/MIT

The system had an average transcription accuracy of 92% within 2 hours of use, which is remarkable.  Alexa, did you hear that?
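
Just to make the idea concrete, here's a toy sketch of what "classifying sub-vocal signals into words" might look like.  The actual MIT system uses facial electrodes and a neural network; everything below -- the vocabulary, the signal shapes, the nearest-centroid classifier -- is my own invention for illustration, a cartoon rather than their method.

```python
# Toy sketch: classify short windows of neuromuscular (EMG-like) signals
# into words. All signals here are synthetic; the classifier is a simple
# nearest-centroid stand-in for the neural network the MIT team describes.
import numpy as np

VOCAB = ["yes", "no", "up", "down"]  # hypothetical silent-speech vocabulary

def featurize(window: np.ndarray) -> np.ndarray:
    """Reduce a (channels x samples) signal window to a feature vector."""
    return np.concatenate([window.mean(axis=1), window.std(axis=1)])

def train_centroids(examples: dict) -> dict:
    """Average the feature vectors of each word's training windows."""
    return {w: np.mean([featurize(x) for x in xs], axis=0)
            for w, xs in examples.items()}

def classify(window: np.ndarray, centroids: dict) -> str:
    """Pick the word whose centroid is nearest to this window's features."""
    f = featurize(window)
    return min(centroids, key=lambda w: np.linalg.norm(f - centroids[w]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake training data: each word gets a distinct baseline signal level.
    examples = {w: [rng.normal(i, 0.1, size=(4, 250)) for _ in range(20)]
                for i, w in enumerate(VOCAB)}
    centroids = train_centroids(examples)
    test = rng.normal(2, 0.1, size=(4, 250))  # drawn to resemble "up"
    print(classify(test, centroids))          # -> "up"
```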

All right, the device makes Google Glass look cool, but it is still a prototype.  One can imagine that, at some point, it will be much smaller -- perhaps even simply implanted -- and this kind of technology could become ubiquitous.  We all have experienced loud smartphone users who we wish had this technology.

All right, very impressive, but still a voice interface, just very quiet.  Example number two, then, is "virtual wearables," such as Project North Star from Leap Motion.  Keiichi Matsuda, their VP of design, told Fast Company, "Our hands are our original interface with the world, and foundational to any immersive experience."

They've designed two tools to give our hands more control in our cyber-world, whether AR, VR, or screens.  Power Tools gives our hands "superpowers" through an array of apps with discrete functions -- "a virtual palette attached to your wrist."  They couple that with "Virtual Wearables," which Fast Company describes as:
Think of it as a smartwatch that exists in virtual space, and therefore can morph based on context. Instead of memorizing an innumerable amount of gestures to call up different menus or buttons, Virtual Wearables look and act like familiar interfaces. Users can click, open, twist, turn, or swipe them, just like they would in real life.
Mr. Matsuda sees these as integral to our future:
We are on the verge of a new era of human-computer interaction, that will ultimately supersede not only mobile and desktop, but also many of the physical interfaces that we rely on.  The more the technology progresses, the more absurd it will be to have to rely on controllers, keyboards, and touch screens.
Pretty cool.
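
That "morph based on context" idea is simple enough to sketch.  Below is a minimal Python cartoon of a wrist-worn virtual widget that swaps its controls to match the task at hand; the contexts, controls, and gesture names are all hypothetical stand-ins of mine, not Leap Motion's actual design or API.

```python
# Minimal sketch of a context-morphing virtual wearable: one widget,
# several familiar "faces," each triggered by everyday gestures.
from dataclasses import dataclass

@dataclass
class Control:
    label: str
    gesture: str   # the familiar physical action that triggers it

# Hypothetical layouts: which controls the wearable shows in each context.
CONTEXT_LAYOUTS = {
    "browsing":  [Control("scroll", "swipe"), Control("open", "tap")],
    "painting":  [Control("color", "twist"), Control("brush size", "turn")],
    "messaging": [Control("reply", "tap"),   Control("dismiss", "swipe")],
}

class VirtualWearable:
    """A watch-like palette that swaps its face to match the current task."""
    def __init__(self) -> None:
        self.context = "browsing"

    def morph(self, context: str) -> None:
        if context in CONTEXT_LAYOUTS:
            self.context = context

    def render(self) -> str:
        controls = CONTEXT_LAYOUTS[self.context]
        return " | ".join(f"{c.label} ({c.gesture})" for c in controls)

wearable = VirtualWearable()
wearable.morph("painting")
print(wearable.render())  # color (twist) | brush size (turn)
```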

Then there is Project Zanzibar, which Microsoft will formally announce later this month but which has already started leaking out.  They describe their goal as wanting to blur the lines between the physical and digital worlds.

As currently constituted, Project Zanzibar is a flexible, portable mat that "has the ability to locate, sense and communicate with objects as well as sense a user's touch," using Near Field Communication (NFC).  It uses Bluetooth to connect with whatever screen you happen to have handy.

For example, toy pieces could be on the mat and then be represented on the screen, interacting with their virtual counterparts.  Toys could "come alive."  The use cases now include toys, games, and learning, but one suspects we're barely scratching the surface of what we might be able to use it for.
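
To make that flow a little more concrete, here's a rough Python sketch of how NFC-tagged toys might be mirrored on screen.  Microsoft hasn't published a Project Zanzibar API, so the event format, tag registry, and scene handling below are purely my assumptions.

```python
# Rough sketch of the mat-to-screen flow: NFC-tagged toys placed on the
# mat produce (tag id, position) events, and a scene on the connected
# screen mirrors them as virtual counterparts. Everything here is a
# hypothetical stand-in, not Microsoft's API.
from dataclasses import dataclass

# Registry mapping NFC tag IDs to virtual objects (invented IDs).
TAG_REGISTRY = {
    "04:a2:5f": "knight",
    "04:b7:13": "dragon",
}

@dataclass
class MatEvent:
    tag_id: str   # unique ID read from the toy's NFC tag
    x: float      # position on the mat, normalized 0..1
    y: float

class Scene:
    """On-screen mirror of whatever is sitting on the mat."""
    def __init__(self) -> None:
        self.objects: dict = {}

    def handle(self, event: MatEvent) -> None:
        name = TAG_REGISTRY.get(event.tag_id, "unknown toy")
        self.objects[name] = (event.x, event.y)
        print(f"{name} is at ({event.x:.2f}, {event.y:.2f})")

scene = Scene()
scene.handle(MatEvent("04:a2:5f", 0.25, 0.60))  # knight placed on the mat
scene.handle(MatEvent("04:b7:13", 0.75, 0.40))  # dragon joins it
```
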
Microsoft's description notes:
One reads all this and senses that the moment for tangible user interfaces indeed is upon us. And if the passion of this research can be as inspiring in the outside world, then we may indeed be at a turning point in the realization of natural interactions with computers that lead to the next era of how we think of – or don’t think of! – computers when we are using them to create, learn, play and build.

Again, pretty cool.

Healthcare is full of bad interfaces.  The industry can't quite figure out who the "user" of its UIs is, what that user is trying to do, how they do it, or how the interface should make things better.

If anything, interfaces get in the way of better care -- too many error messages, confusing visual displays, and too much keying.  Patients and physicians alike complain that physicians spend too much time inputting into the EHR and not enough listening, looking, or touching.

Voice inputs -- spoken or sub-vocal -- could clearly help with the input problem, but it may take a little longer to figure out how to use Power Tools, Virtual Wearables, or the Zanzibar mat in healthcare.  But figure it out we will.

For example, I keep wondering how NFC could be used in, say, a physical exam or even a surgery, tying the physical reality of our bodies to digital information about them.
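
Sketching that speculation in Python: imagine an exam-room reader that senses an NFC tag -- a patient wristband, a tagged instrument -- and surfaces the matching record.  The tag IDs, the record store, and the lookup below are all invented for illustration; a real version would query the EHR.

```python
# Speculative sketch: an exam-room NFC reader surfaces the record
# behind whatever tag comes near the table. All data here is fake.
RECORDS = {
    "wristband:7f31": {"patient": "Jane Doe", "allergies": ["penicillin"]},
    "instrument:22a9": {"item": "retractor", "last_sterilized": "2018-04-09"},
}

def on_scan(tag_id: str) -> None:
    """Called whenever the reader senses a tag near the exam table."""
    record = RECORDS.get(tag_id)
    if record is None:
        print(f"Unknown tag {tag_id}; flag for manual check.")
    else:
        for key, value in record.items():
            print(f"{key}: {value}")

on_scan("wristband:7f31")   # surfaces allergies before the exam starts
on_scan("instrument:22a9")  # confirms the tool's sterilization date
```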

So, you can keep using your keyboard, or perhaps your game controller, if you want.  You can even start using Alexa or Siri or Google Now to free your hands for some tasks.  But don't get too used to any of these, because there are some new interfaces on the road ahead.

It may even be, indeed, the Road to Zanzibar, but it almost certainly won't stop there.
