Data to dollars: A data interoperability challenge

weeve · Published in weeve's World · Mar 24, 2021


Industry 4.0 promises huge value to many industries and addresses challenges with wide-reaching business impact. Many companies look to IoT to improve processes, reduce waste, and even create new business models. But while the potential benefits are frequently spoken of, realising them is rarely as simple as you would hope.

Whether you are still struggling to improve your OEE (overall equipment effectiveness) or already implementing cutting-edge business models, many barriers to successful IoT projects remain: data privacy and security concerns, implementation cost and scalability, lack of subject-matter expertise, inadequate infrastructure; the list goes on.

According to a Penton study published in IoT World Today, 28% of survey participants said “data interoperability” was a major challenge for IoT adoption within their organisations. Data interoperability clearly remains a familiar, unsolved problem for industrial players hoping to capitalise on their data and turn it into dollars.

In this article, we will focus on the data interoperability challenge: the challenge companies face when trying to understand and transform data from various systems, with differing standards and formats, into insightful, actionable information that can in turn generate value for a business.
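To make the challenge concrete, here is a minimal sketch in Python of what “transforming data from various systems” looks like in practice. The payload formats, field names, and device IDs below are invented for illustration: two machines report the same kind of reading in incompatible formats and units, and a small translation layer maps both onto one common schema.

```python
import json
from datetime import datetime

# Hypothetical payloads: the same kind of reading, reported by two
# vendors in incompatible formats and units (both invented here).
VENDOR_A = '{"dev": "press-01", "ts": 1616544000, "temp_c": 71.3}'
VENDOR_B = "press-02;2021-03-24T00:00:00+00:00;160.3;F"

def normalise_a(raw):
    """Vendor A already sends JSON in Celsius; just rename the fields."""
    msg = json.loads(raw)
    return {"machine_id": msg["dev"], "ts": msg["ts"], "temp_c": msg["temp_c"]}

def normalise_b(raw):
    """Vendor B sends a delimited string in Fahrenheit; parse and convert."""
    machine_id, iso_ts, value, unit = raw.split(";")
    temp = float(value)
    if unit == "F":
        temp = (temp - 32) * 5 / 9  # harmonise units before merging streams
    ts = int(datetime.fromisoformat(iso_ts).timestamp())
    return {"machine_id": machine_id, "ts": ts, "temp_c": round(temp, 2)}

# One unified stream, regardless of which vendor produced the reading.
readings = [normalise_a(VENDOR_A), normalise_b(VENDOR_B)]
print(readings)
```

Multiply this by dozens of vendors, protocols, and firmware revisions, and the size of the problem becomes clear.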

Head in the clouds

For those of us who have been working in IoT for most of our careers, the lines blur between 1999, when the term “Internet of Things” was first coined, and the point when falling technology prices made sensors more and more ubiquitous. While IoT has been a buzzword ever since it first entered the Gartner “Hype Cycle” in 2011, many consider 2016 to be the beginning of the IoT movement.

In 2016, when “IoT Platform” was at the peak of the Gartner hype cycle, data interoperability came with a special set of challenges. From 2016 until about 2019, the idea was to merge data sets on a central server (the cloud) and process the data there. The concept was simple: propose a few clever business ideas, connect your machines to the cloud, and analyse the data.

As we have learned, this approach does not work. Even in small-scale operations, the sheer number of data points is too large to process, and internet throughput becomes a bottleneck. And even if you could process the data, you would still face the challenge of data interoperability: if data streams can’t be united and streamlined, you can’t extract business value.
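A back-of-envelope sketch shows how fast “small-scale” turns into a throughput problem. All the numbers below are assumptions picked for illustration (the article quotes none):

```python
# Back-of-envelope: uplink needed to ship raw sensor data to the cloud.
# Every figure below is an illustrative assumption, not a measurement.
machines = 50             # a modest shop floor
sensors_per_machine = 20  # temperature, vibration, pressure, RPM, ...
sample_hz = 100           # vibration monitoring often samples far faster
bytes_per_sample = 64     # timestamped, labelled reading incl. overhead

bytes_per_sec = machines * sensors_per_machine * sample_hz * bytes_per_sample
print(f"Sustained uplink: {bytes_per_sec * 8 / 1e6:.1f} Mbit/s")
print(f"Per day: {bytes_per_sec * 86400 / 1e9:.0f} GB")
```

Under these assumptions, one site needs a sustained ~51 Mbit/s uplink and ships over half a terabyte per day, before a single insight has been produced.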

We have also seen ongoing conflicts over data ownership that have resulted in platform wars. Businesses were, and still are, too scared to hand their data to platform vendors for fear of vendor lock-in and loss of IP and data rights.

Now more than ever, it is important to address data interoperability on-premise first, before deciding where the data insights go and with whom you choose to share them.

White is the new black

Today’s exploding number of IoT vendors has turned the IoT ecosystem into a highly complex landscape. True interoperability is still an unchecked box in the IoT ecosystem, which leads to many other issues:

  1. Vendor Lock-In — A large number of existing IoT solutions are proprietary and designed to operate only within a predefined hardware or infrastructure environment.
  2. Security — The heterogeneous architecture and devices of the IoT involve sharing information and collaboration between things across many networks. This poses serious challenges to end-to-end security.
  3. Scalability — A key challenge in IoT deployments is operating at scale. Monitoring a single piece of equipment can entail communicating with tens or hundreds of nodes.
  4. Technical Reliability — Getting the various datasets you need to communicate seamlessly often requires additional technical components, which increases the number of failure points and reduces overall system reliability.

It was initially thought that many of these problems would be solved by creating new standards and implementing IoT cloud platforms that aggregated protocols and made data sets communicate seamlessly; we now know better. We will not create a “single standard” for IoT protocols, because such protocols have specific applications that are useful and meaningful in different contexts.

As an example, LoRa has range benefits over wM-Bus, while wM-Bus is more computationally efficient — these tradeoffs mean there are scenarios where one makes more sense than the other, and cases where it makes sense to deploy both. Inherently, this means we will have more hardware and software options, not fewer.

As a business, the move is clear. Use as little custom hardware and as much “off-the-shelf” hardware as possible. Push your suppliers and vendors to conform strictly to hardware requirements that are purposefully as generic as possible. Failing to do this may mean staying tied to custom hardware forever, and ultimately failing in your digital transformation.

Oftentimes the idea is to make or buy custom hardware in the hope of saving pennies on the dollar to “meet the company margin goals”. Keep in mind that these installations will be around for a while. You may want to spend those extra pennies on processing power at the edge. This ensures that your data is “speaking the same language” at the point where it is processed, or as we like to call it, at the “data custody edge”.

If you avoid spending the money to correctly implement digital initiatives now, you may miss out on business opportunities in the future. Over the long term, the return the right data can provide greatly outweighs any additional upfront cost. This is the concept of “Data to Dollars”.

Picture perfect

Consider this: a machine supplier or OEM wants access to data from the machines its customers are using, in order to understand performance and deliver value-added services. As a customer, you may not want to share complete raw data, but only data from a subset of field machines, for a specific time period. So you grant access to only that data — just like when you grant your food delivery app permission to use your location only while you place an order.
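A minimal sketch of such scoped sharing, assuming a simple in-process policy filter (the field names and policy shape are invented for illustration, not weeve’s actual mechanism): only readings from the agreed machine subset and time window pass through; everything else stays on-premise.

```python
from dataclasses import dataclass

@dataclass
class SharingPolicy:
    """What the customer has agreed to share with the OEM (illustrative)."""
    allowed_machines: set
    start_ts: int  # inclusive, epoch seconds
    end_ts: int    # exclusive

    def permits(self, reading):
        return (reading["machine_id"] in self.allowed_machines
                and self.start_ts <= reading["ts"] < self.end_ts)

# Hypothetical grant: two presses, readings from March 2021 only.
policy = SharingPolicy(allowed_machines={"press-01", "press-07"},
                       start_ts=1_614_556_800, end_ts=1_617_235_200)

def forward_to_oem(stream, policy):
    """Yield only the readings the policy permits; the rest never leave site."""
    return (r for r in stream if policy.permits(r))
```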

With access to parameters such as temperature, oil pressure, vibration, and RPM data, the supplier can optimize asset performance and deliver value-added services such as usage-based pricing. Combining data from multiple users of the same type of machinery builds a more complete picture of how products and services are used, and allows OEMs to improve ML algorithms that, for example, enable predictive maintenance. All the while, raw data never leaves the “data custody edge”.
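One common pattern for keeping raw data at the edge while still feeding the supplier’s models is to share only derived features. The sketch below is an illustration under assumptions, not weeve’s implementation: a high-frequency vibration window is reduced to a few summary numbers before anything leaves the site.

```python
import statistics

def vibration_features(raw_samples):
    """Reduce a high-frequency vibration window to a few shareable features."""
    return {
        "rms": (sum(s * s for s in raw_samples) / len(raw_samples)) ** 0.5,
        "peak": max(abs(s) for s in raw_samples),
        "stdev": statistics.pstdev(raw_samples),
    }

# Thousands of raw samples stay on-premise; three numbers go to the
# OEM's predictive-maintenance model.
window = [0.02, -0.03, 0.05, -0.04, 0.01]  # toy data
print(vibration_features(window))
```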

Imagine a world where this is all possible — sounds amazing, right? Data interoperability is a challenge that must be overcome for this picturesque scenario to be realised, but it will not be solved in the way we imagined in the past. There will always be different players, vendors, and equipment in the market; when we think about a future of interoperability, we should envision one that focuses on enabling services and business functions. Instead of focusing on the short-term, hardware-specific goals you “think” you need to achieve to stay relevant in the market, choose infrastructure that also takes data interoperability and future-proofing into account, then use a platform that orchestrates the right data and allows for seamless integration.

Originally published at https://www.weeve.network.


weeve’s mission is to enable pioneering companies to securely extract new value from an increasingly connected machine economy.