Technology & Innovation

How to think about computers in the 21st century

August 02, 2016

Global

Stephen Lake

Co-founder and CEO

Stephen Lake is a biomedical engineer, inventor, and entrepreneur in emerging technologies. He is the Co-founder and CEO of Thalmic Labs, the company behind the groundbreaking Myo armband, which measures the electrical activity in your muscles to wirelessly control computers, phones, and other digital devices with gestures and motion. He envisions a future of seamless interaction between humans and machines, in which wearable computing interfaces blend the real and digital worlds. Stephen has taken his concept from idea to shipping product, with more than US$15 million in funding, close to 100 employees, many patent filings, and over 50,000 users worldwide.

Has our perception of computers really changed?

In 2014, two huge investments in wearable technology made headlines in the tech world: Google led a US$542 million funding round for the startup Magic Leap, and Facebook bought Oculus VR, maker of the Rift headset, for the hefty sum of US$2 billion.

In 2015, investment in wearable technology continued. Fossil Group agreed to acquire Misfit, a maker of wearable activity trackers, for US$260 million, and Intel acquired Recon Instruments, a manufacturer of wearable heads-up displays, for US$175 million. Fitbit, another activity tracker maker, raised US$732 million in an IPO that valued the company at over US$4 billion.

Some of the smartest minds are making serious bets that the way people interact with technology is about to radically transform.

And why shouldn’t it? Our interactions with technology are long overdue for a makeover.

Stagnant innovation

Thanks to Moore’s law, computers have been getting exponentially faster and smaller. Their form, however, hasn’t changed nearly as quickly.

The smartphone takeover of the early 2000s was the most recent revolution in computing, but despite the powerful new pocket-sized form, the interaction has barely changed. We still type text on a miniaturized version of the classic “QWERTY” keyboard. We still open and close an array of applications on a miniature desktop, with taps rather than clicks. It’s exactly what we’re used to, just smaller.

The mother of all demos

1968 was a remarkable year. Doug Engelbart and his team of 17 at the Augmentation Research Center at Stanford Research Institute in Menlo Park had just created their own radical vision for the future.

On December 9th, Engelbart showed off their work in a seminal 90-minute live demonstration, now lovingly referred to as “The Mother of All Demos”. In it, he demonstrated virtually all of the major aspects of a modern Human-Computer Interaction (HCI) system. From visual, structured menus for navigating information to the mouse and keyboard as inputs, Engelbart revealed it all.

Which is more remarkable: how right Engelbart and his team got it the first time, or how little has changed in the nearly five decades since?

A metaphor is a powerful thing

The answer to that question lies in another: if we had all the basic ingredients for a modern computer in 1968, why did we have to wait until the 1980s to get one?

The IBM 5150 didn’t arrive in homes and offices until 1981, even though Engelbart had given us the ingredients in 1968, and computers had already been used to plan the Apollo 11 moon landing and to predict the 1968 presidential election. If the hardware existed and the value was obvious, why was adoption so slow?

The main answer is that people simply did not get it. The only way to interact with a computer back then was through a command-line interface (CLI): you had to tell the computer, action by action and step by step, exactly what you wanted it to do. We had no idea what was happening inside the machine, and no mental model to make its inner workings understandable. Before the CLI, human-computer interaction meant putting punch cards in the right order.

What Engelbart and his team were missing was a metaphor: a way for humans and machines to understand each other. They got one in the world’s most famous Graphical User Interface (GUI): the desktop.

This was the single most important breakthrough in making computers mainstream, creating enough demand to drive down costs and make personal computers accessible to the average consumer. The Apple Lisa introduced a menu bar and window controls in 1983, and the Atari ST and Commodore Amiga followed in 1985. Since then, apart from performance and size, little has changed. We still have keyboards for text input, pointers for targeting, and flat desktops for storing things. Making computers smaller has drastically transformed where people do their computing, but not how. The biggest change in input has been learning to type on a QWERTY keyboard with our thumbs.

A new metaphor for a new computer

It’s time for a new metaphor. We have changed what counts as a computer, but not how people think about one.

Maybe this time, the technology world can start with the metaphor, making technological marvels accessible to all from day one.

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the views of The Economist Intelligence Unit Limited (EIU) or any other member of The Economist Group. The Economist Group (including the EIU) cannot accept any responsibility or liability for reliance by any person on this article or any of the information, opinions or conclusions set out in the article.
