The amount of information flowing across networks has mushroomed in recent years, and its variety has multiplied, thanks to the growth of social media, peer-to-peer websites, mobile Internet use and other modes of digital communication. Data is now termed “big” not only because of its enormous quantity and multiplicity of types (photos and video, for example, in addition to conventional spreadsheet data). “Big” also refers to the potential opportunities for organisations that can mine the data mountains and extract the insights they contain.
The advent of the Internet of Things (IoT) means that big data is likely to get a lot bigger. The IoT links wireless networks of tagged objects as diverse as automotive components, clothing, appliances, medical products and packaging. Data volumes handled by networks and servers are certain to expand enormously as the IoT grows. The potential value to public and commercial organisations able to analyse and act upon the data is considerable. Unlocking such value, however, requires that a range of technology and non-technology issues be addressed.
The mobile connection
The Internet of Things is already with us due to the technologies we now carry, maintains David Carrera, a professor at the Barcelona Supercomputing Centre. "Every smartphone is a collection of sensors, continually connected to the Internet, reporting interesting information," he notes. Such devices, connected to sensors in all manner of objects, are creating a flood of data which begs to be interpreted.
Sensors and monitoring systems, to be sure, have existed for decades. The difference now is that data can be transported from a broader range of device types and locations than before, due to the increasing prevalence of mobile and wireless technologies.
At the back-end, crunching and analysing the volumes of data generated by such devices has only recently become possible—and affordable. "Historically information has been discarded rather than analysed, principally because there were not the tools available to analyse it in a cost-effective manner," says Philip Howard, a data expert at Bloor Research, a UK-based analyst firm.
As the prices of data storage and transport, sensors, devices and analytics technologies fall, the threshold for IoT adoption by organisations is also falling, and the range of possible applications is broadening. Not only can sensors be attached to increasing numbers of physical objects, but historical data can also be analysed in new ways. In Boston, for example, city authorities are using sensors, video cameras and GPS (global positioning system) devices in taxis to report potholes in roads.
While innovative, such examples may quickly become old hat as both the public and private sectors become smarter in how they interpret and then use such information. The IoT, for example, underpins the capacity of “smart grids” to distribute electricity according to demand, or of smartcard-controlled taps to manage water distribution in drought areas. In a “smart” home, “learning” thermostats can upload data about how specific rooms heat and cool; the results can then be fed into smartphone apps, which in turn control the thermostats.
A question of reliability
For the IoT to deliver on its promise, all the links in the technology chain need to deliver and collate data in a sufficiently timely manner, keeping data “latency” sufficiently low. This puts particular pressure on the transmission network, which needs to ensure not only that sensor data can get through, but also that a response can be delivered and acted upon within the necessary time frame.
Meanwhile, analytics tools must be able to “ingest” the data and deliver actionable insights—the timeliness of which depends on the storage, software and data architecture. Philip Howard explains: "If you have to index the data as it is loaded, this will significantly slow down the loading process and it will add to the size of the database, not to mention adding to administrative costs."
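Mr Howard's point can be illustrated with a minimal sketch. The table, columns and synthetic records below are hypothetical (nothing of the sort appears in the article); the sketch simply contrasts loading into a table whose index is maintained on every insert with loading first and building the index afterwards.

```python
# Sketch only: contrasts bulk-loading into an indexed table with loading
# first and indexing afterwards. Table, columns and data are hypothetical.
import sqlite3
import time

rows = [(i, f"sensor-{i % 50}", i * 0.1) for i in range(200_000)]

def load(index_first: bool) -> float:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (id INTEGER, device TEXT, value REAL)")
    if index_first:
        # Index is maintained on every insert, slowing the load
        conn.execute("CREATE INDEX idx_device ON readings (device)")
    start = time.perf_counter()
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    if not index_first:
        # Index is built once, after the bulk load completes
        conn.execute("CREATE INDEX idx_device ON readings (device)")
    conn.commit()
    conn.close()
    return time.perf_counter() - start

print(f"index maintained during load: {load(True):.2f}s")
print(f"index built after load:       {load(False):.2f}s")
```

On most systems the second approach loads noticeably faster, which is the trade-off Mr Howard describes.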
These factors require architectural decisions that take all elements of the chain into account. Some data processing could take place on a local device or server before key data elements are uploaded to a central server. A number plate recognition camera, for example, could extract the plate number at the sensor and transmit only that result. Such decisions involve trade-offs in processing and power requirements at the device level, within the network, and in server-side processing and storage.
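As a rough sketch of that trade-off, and assuming a hypothetical on-device recognition function (the names, payload format and figures below are illustrative, not taken from the article), the sensor might reduce each camera frame to a few bytes of structured data before anything crosses the network:

```python
# Sketch of processing at the sensor: extract only the key data element
# (the plate number) and upload a small record instead of the raw image.
# recognise_plate() is a placeholder for whatever model the device runs.
import json
import time

def recognise_plate(frame: bytes) -> str:
    """Placeholder for on-device number plate recognition."""
    return "AB12 CDE"  # illustrative value only

def handle_frame(frame: bytes) -> bytes:
    record = {
        "plate": recognise_plate(frame),  # processed locally, at the sensor
        "captured_at": time.time(),
    }
    return json.dumps(record).encode()    # tens of bytes, not megapixels

# A raw frame might run to megabytes; the uploaded payload is tiny.
payload = handle_frame(b"\x00" * 2_000_000)
print(f"{len(payload)} bytes to upload instead of 2,000,000")
```

The saving in network traffic comes at the cost of doing the recognition work, and spending the power budget, at the device itself.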
Understanding such trade-offs holds the key to linking big data with the IoT to deliver maximum benefit, believes Niall Murphy, founder and CEO of Internet of Things software company EVRYTHNG: "The challenge is to develop the competencies and systems to use data in real time, to make engagements and applications smarter."
Questions of data ownership also have yet to be ironed out. Who owns the data generated by an electrical smart meter installed in a house, for instance? Issues regarding the security, privacy and ownership of data "are major challenges that need to be addressed before the IoT becomes widely adopted," believes Mr Carrera.
The success of the IoT is predicated on reducing the friction between the data created and our ability to make sense of it, both in technological and non-technological terms. Both the IoT and big data are likely to be waypoints on a larger journey towards sensory, reactive, “smart” environments. The journey will take time, and there will be numerous stoppages along the way. But as the issues outlined in this article are untangled, “big” may not be sufficient to describe the data-intensive world that emerges.