Computers are everywhere: in toasters, washing machines, refrigerators, cars, and thermostats, to name a few. At last year's Consumer Electronics Show (CES), the Mecca for tech geeks, Intel CEO Brian Krzanich showed off 'Curie', a computer so small he wore it as a suit button on stage.
We live in a world of ubiquitous computers. But right now, the way we interact with them is so awkward that people question their very utility.
As UX/UI design guru Golden Krishna describes the modern connected oven: “We slap a seven-inch, Android-powered touchscreen on it so you can watch YouTube while you bake cookies.”
The problem here is a failure of metaphor. We think about the ‘computer’ inside the oven as if it were a desktop computer. We assume that, to talk to it, we need a few basic ingredients: a screen, a desktop of some kind, and, of course, a QWERTY keyboard. To change how we interact with these new forms of computers, we have to change how we think about computing itself.
What if, when you thought about using your computer at home, the thing that jumped to mind wasn’t the familiar setup of screen, keyboard, pointer, and processor? What if it was just your home itself: many interconnected pieces that you could control simply and easily from wherever you are?
Your screen could exist anywhere, at your whim, and vanish when you didn’t need it. The way you interact with that screen - to start a movie, launch a game, or browse news and weather - could be the same as the way you interact with your oven, microwave, and refrigerator. As you walk around, your house would change which part of the computer you talk to, and you would control everything effortlessly with natural, intuitive inputs. The same could be true for shared screens in public places.
What will it take to get there? A few ingredients: wearable computers, contextually aware systems, natural input methods, and a new metaphor for computing.
All of these ingredients exist today, save the last one.
We have wearable computers now, and they’re getting more fashionable, more popular, and smaller. We have contextually aware systems, like Google Now and Siri, designed to give us the information we need without our having to ask. We have a growing ecosystem of natural controllers: gesture-control inputs like the Myo armband that read our muscles, voice-control inputs that grow more robust each year, and truly exotic inputs like the Muse headband, designed to translate brainwaves into commands.
Right now, today’s technology has not yet realized its potential. Researchers have created a suite of astonishing technologies that break computing out of the old paradigm and release it into the air. Computers are all around us and connected, enabling interactions that would have been unimaginable in the heyday of the GUI.
What hasn’t caught up is our metaphor. We’re stuck in the office, thinking about computers in a way that limits their capability.
Stephen Lake is Co-Founder and CEO of Thalmic Labs
Read the July issue of Business Review USA & Canada here