Technology and innovation perspectives from The Economist Intelligence Unit

Money with no middleman

The defining innovation of the cryptocurrency Bitcoin is not that it is digital – only a tiny proportion of the world’s money takes the form of physical cash today – but that it is decentralised. No central authority governs, monitors or controls its use, which is one reason it has proved so popular in the criminal underground.

Whatever happens to Bitcoin itself – whether it gains mainstream adoption or fizzles out as a fad – the point has been made that a decentralised currency is possible, and the technological mechanism that underpins it has been proven.
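That mechanism is the blockchain: a shared ledger in which each block commits to a cryptographic hash of its predecessor, so no single participant can quietly rewrite history. The Python sketch below is a toy illustration of that hash-chaining idea only – it is not Bitcoin’s actual code, it omits proof-of-work, peer-to-peer networking and digital signatures, and all names are our own.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    """Link a new block to the chain via the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    """A retroactive edit breaks every later link, making tampering evident."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, ["alice pays bob 5"])
append_block(chain, ["bob pays carol 2"])
print(verify(chain))   # True
chain[0]["transactions"] = ["alice pays bob 500"]  # rewrite history
print(verify(chain))   # False: the chain no longer validates
```

Because every copy of the ledger can be checked this way, participants need only agree on the longest valid chain rather than trust a central record-keeper.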

Power to the people

Letting employees have a voice is the future of business, says Cheryl Burgess, chief executive and chief marketing officer at Blue Focus Marketing. But do managers dare give it to them?

UK report

Alongside the global report, we have written a UK country report.

To brain or not to brain

Artificial intelligence researchers haven’t always looked to the brain for inspiration. That is changing, although many experts still focus on purely mechanical approaches.

It may be surprising in retrospect, but the pioneers of artificial intelligence did not look to the brain for inspiration.

The third era of IT

There have so far been only two eras of computing, says John Gordon, vice president of IBM Watson Solutions: the tabulation era and the programmable computer era. Now, he says, we are embarking on a third: the era of cognitive computing.

AI comes of age

AI finally comes of age (just don’t call it artificial intelligence)

Two new concepts in IT – cognitive and neuromorphic computing – may finally bring the AI fantasies of the past 50 years to life.

The history of attempts to reproduce human intelligence in machines is riddled with pitfalls. What used to be called artificial intelligence (AI) fell into disrepute in the 1970s, but it’s making a comeback. It’s been rebranded as “cognitive computing”, and it’s advancing very quickly.

Learning from the brain

When Alan Turing first conceived his theoretical ‘universal computer’, the idea that spawned the information technology revolution, he was trying to devise a machine that could solve mathematical problems like a human being. IT, therefore, has always mimicked the human mind to some degree.

In the intervening years, our understanding of the workings of the brain has become much more sophisticated. Now researchers are transferring insight from neuroscience into computing, in the hope of developing systems that can learn and spot patterns as well as we can.
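One concrete example of that transfer is the artificial neuron: a unit that sums weighted inputs and strengthens or weakens its connections in response to error, loosely analogous to a synapse. The sketch below is our own minimal Python illustration of the idea – a classic perceptron learning the logical OR pattern – not any particular research system.

```python
import random

random.seed(0)  # reproducible toy example

def predict(weights, bias, x):
    """Fire (1.0) if the weighted sum of inputs crosses the threshold."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 if s > 0 else 0.0

def train(samples, epochs=50, lr=0.1):
    """Classic perceptron rule: whenever the neuron's output is wrong,
    nudge each weight in proportion to its input."""
    weights = [random.uniform(-1, 1) for _ in range(2)]
    bias = 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Learn logical OR: the neuron should fire if either input is active.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
weights, bias = train(data)
print([predict(weights, bias, x) for x, _ in data])  # [0.0, 1.0, 1.0, 1.0]
```

Single neurons of this kind date back to the 1950s; what has changed is the scale at which they can now be stacked and trained.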

The century of automation

There are evident parallels between the early days of the automobile and the current state of industrial automation.

At the turn of the 20th century, cars were driven predominantly by the super-rich, with little regard for safety. As a result, they were far from popular. It took another thirty years and Henry Ford’s affordable automobiles for the public to come around to the idea.
