Spotlight

Bringing free education to township girls and women

Infographic

Finding a niche

Devising an Internet-connected object with a viable business model has proved elusive for many organisations 

Every technology trend carries with it a degree of hype, and at present the "Internet of things" (IoT) is carrying more than most. In August 2014 IT analyst company Gartner placed the IoT at the very Peak of Inflated Expectations in its annual Hype Cycle for Emerging Technologies.

Designing for the Internet of things

The IT industry is alight with buzz about the Internet of things, the idea that objects embedded with sensors and communications components can interact with each other and their owners via the Internet. So far, the use cases have focused primarily on established product categories, such as cars, fridges or energy meters, and with good reason. But what new products could be devised for the Internet-connected era?

Power to the people

Giving employees a voice is the future of business, says Cheryl Burgess, chief executive and chief marketing officer at Blue Focus Marketing. But do managers dare give them one?

To brain or not to brain

Artificial intelligence researchers haven't always looked to the brain for inspiration. That is changing, although many experts still focus on purely mechanical approaches.

It may be surprising in retrospect, but the pioneers of artificial intelligence did not look to the brain for inspiration.

The third era of IT

There have so far been only two eras of computing, says John Gordon, vice president of IBM Watson Solutions: the tabulation era and the programmable computer era. Now, he says, we’re beginning to embark on a third: the era of cognitive computing.

AI comes of age

AI finally comes of age (just don’t call it artificial intelligence)

Two new concepts in IT, cognitive and neuromorphic computing, may finally bring the AI fantasies of the past 50 years to life.

The history of attempts to reproduce human intelligence in machines is strewn with pitfalls. What used to be called artificial intelligence (AI) fell into disrepute in the 1970s, but it is making a comeback. Rebranded as "cognitive computing", it is advancing very quickly.

Learning from the brain

When Alan Turing first devised his theoretical ‘universal computer’, the idea that spawned the information technology revolution, he was trying to design a machine that could solve mathematical problems like a human being. IT, therefore, has always mimicked the human mind to some degree.

In the intervening years, our understanding of the workings of the brain has become much more sophisticated. Now researchers are transferring insights from neuroscience into computing, in the hope of developing systems that can learn and spot patterns as well as we can.

What is social business?

It's much more than tweeting, says Brian Solis, principal analyst at Altimeter Group.

Should undervalued employees be more Scottish and less Welsh?

Threatening to leave should not be the only way to get rewarded. Businesses in post-crisis mode should do more to encourage loyalty.
