IoT Data Interoperability POC: A Pragmatic Feasibility Proof

IoT data interoperability has become increasingly important over the last few years: it is widely acknowledged, touted by standards bodies and vendors, and frequently discussed by industry analysts. McKinsey’s Unlocking the Potential of the Internet of Things report estimates that achieving interoperability in IoT could unlock an additional 40% of the market value. So, how can semantic IoT data interoperability be accomplished in practice?

Milan Milankovic, IoT Technology and Strategy Advisor at HTEC

It does not seem likely to happen through standards – there are too many competing ones in the IoT data space, fragmented and in some cases conflicting, with no discernible dominant player. They focus mostly on intra-specification interoperability among compliant devices, not on interoperability across specifications and domains, which is where the estimated 40% of additional value lies.

An alternative, top-down approach would be to construct a meta-ontology from the existing standards definitions. This tends to be rather cumbersome in practice, as the problems encountered by the semantic web efforts show, and it is too slow to keep pace with evolving IoT standards and the explosive growth in the types of IoT devices.

We explored an alternative, pragmatic approach to cross-standard interoperability: selecting several representative IoT standards and prototyping their translation into a common interoperable data format. This proved feasible and practical, as well as instructive, and we are in the process of expanding it and promoting it as an added activity to the leading IoT standards bodies.

Implementation: data interop POC

To explore the feasibility and complexity of IoT data interoperability in practice, we created an interoperability proof of concept (POC). The basic idea was to mimic real-world IoT diversity by having a collection of sensors whose data are encoded in different standards and converted in real time to a common format consumed by IoT applications and services in the cloud.

The setup consists of three gateways (Intel Galileo boards) acting as sensor-data aggregators, each with a set of five sensors that measure the same physical entities – air temperature, (simulated) chilled-water temperature, humidity, CO2, and electrical current flow. The sensor selection was driven by an actual smart-building engagement and use case, and the specific sensor types were chosen for their wide use across domains such as industrial, weather, commercial buildings, and homes.
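To make the topology easier to picture, here is a minimal sketch of how it could be described as data; the gateway names and sensor identifiers are hypothetical, chosen only for this illustration, and do not come from the POC code.

```python
# Hypothetical sketch of the POC topology (identifiers are illustrative):
# three aggregator gateways, each carrying the same five sensor types but
# encoding its readings in a different IoT standard.
SENSOR_TYPES = [
    "air_temperature",
    "chilled_water_temperature",  # simulated in the POC
    "humidity",
    "co2",
    "electrical_current",
]

AGGREGATOR_GATEWAYS = {
    "gateway-1": {"encoding": "IPSO",     "sensors": SENSOR_TYPES},
    "gateway-2": {"encoding": "OCF",      "sensors": SENSOR_TYPES},
    "gateway-3": {"encoding": "Haystack", "sensors": SENSOR_TYPES},
}
```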

The code running on each gateway samples the sensor data periodically, encodes it in one of three popular IoT standards – IPSO, OCF, and Haystack, respectively – and sends it to a fourth gateway acting as the format-conversion unit. The format-translation gateway receives data encoded in the specific standards and converts/translates it into a common interoperable data format that is the same regardless of the differences in the input data, thus providing a consistent interface to cloud services such as analytics or ML.
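As a rough sketch of what the format-translation step amounts to, the Python fragment below maps standard-specific readings onto a single common record. It is a minimal illustration, assuming simplified payload shapes; the field names and the InteropReading structure are inventions for this example, not the POC's actual code or wire formats.

```python
from dataclasses import dataclass

@dataclass
class InteropReading:
    """Common interoperable record produced for every sensor, regardless of
    the standard its gateway used on the wire."""
    sensor_id: str   # globally unique, e.g. "gipso/3303/1"
    quantity: str    # physical quantity, e.g. "temperature"
    value: float
    unit: str
    timestamp: float

# One small parser per supported input standard. The payload keys below are
# simplified stand-ins, not the exact messages used in the POC.
def from_ipso(p: dict) -> InteropReading:
    # IPSO implies the quantity through the object ID (3303 = temperature);
    # resource 5700 carries the sensor value, 5701 the units.
    quantity = {"3303": "temperature"}.get(p["object"], "unknown")
    return InteropReading(p["id"], quantity, float(p["5700"]), p["5701"], p["ts"])

def from_ocf(p: dict) -> InteropReading:
    # OCF names the quantity through the resource type, e.g. "oic.r.temperature".
    quantity = p["rt"].rsplit(".", 1)[-1]
    return InteropReading(p["id"], quantity, float(p["temperature"]), p["units"], p["ts"])

def from_haystack(p: dict) -> InteropReading:
    # Haystack marks points with tags; here the "temp" tag maps to "temperature".
    quantity = "temperature" if "temp" in p["tags"] else "unknown"
    return InteropReading(p["id"], quantity, float(p["curVal"]), p["unit"], p["ts"])

PARSERS = {"IPSO": from_ipso, "OCF": from_ocf, "Haystack": from_haystack}

def translate(standard: str, payload: dict) -> InteropReading:
    """Dispatch an incoming standard-specific payload to its parser."""
    return PARSERS[standard](payload)
```

The shape of the solution is the point of the sketch: each standard contributes one thin parser, and everything downstream of the translation gateway sees only the common record.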

Note that the format-conversion function could be placed anywhere in the data path before the data-retrieval APIs, e.g., as a container service running in a gateway, a fog node, or the cloud. In our implementation, we placed this function in a separate, larger gateway to make the demo self-contained. We also used that gateway to host and drive the POC UI (live link).

The reading highlighted in the figure shows the IPSO encoding for a temperature sensor whose ID is gipso/3303/1. The encodings of the equivalent sensors in OCF, Haystack, and BMP are greyed out. While there are obvious differences in the encodings between standards, the interop format is the same for all of them, with only the sensor IDs being unique, since they refer to different physical sensors attached to different gateways.

[Figure: IoT POC data interoperability]
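To make the "same format, different IDs" observation concrete, the interop records for three equivalent temperature sensors might look roughly like this; apart from gipso/3303/1, which appears in the figure, the IDs and values below are invented for illustration.

```python
# Three equivalent temperature readings after translation: identical structure,
# with only the sensor IDs differing because they name different physical sensors.
interop_records = [
    {"sensor_id": "gipso/3303/1", "quantity": "temperature", "value": 22.4, "unit": "Cel", "timestamp": 1538750400},
    {"sensor_id": "gocf/temp/1",  "quantity": "temperature", "value": 22.7, "unit": "Cel", "timestamp": 1538750400},
    {"sensor_id": "ghay/temp/1",  "quantity": "temperature", "value": 22.1, "unit": "Cel", "timestamp": 1538750400},
]
```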

Conclusions and next steps

The POC demonstrated that our ground-up approach of translating individual standard definitions is feasible and practical. In retrospect, this stands to reason: all of the standards model the same physical entities, real-world “objects”, and thus result in (mostly) comparable abstract object models. Coding the translation function was further simplified by the fact that all three standards provide variants of the popular JSON key-value form of data serialization.
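Because every input is ultimately a JSON key-value document, much of the translation can be expressed declaratively as a key map rather than as bespoke parsing code. The snippet below is a hypothetical illustration of that idea; the keys on the left are assumed stand-ins, not the POC's actual mapping tables.

```python
import json

# Per-standard mapping from the standard's own keys to the common field names.
KEY_MAP = {
    "IPSO":     {"5700": "value", "5701": "unit"},
    "OCF":      {"temperature": "value", "units": "unit"},
    "Haystack": {"curVal": "value", "unit": "unit"},
}

def normalize(standard: str, raw_json: str) -> dict:
    """Rename standard-specific keys to the common ones; pass the rest through."""
    payload = json.loads(raw_json)
    return {KEY_MAP[standard].get(key, key): value for key, value in payload.items()}
```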

On the negative side, we experienced the drawbacks of non-isomorphic and rigid modeling of smart objects in several standards. In some cases we were forced to provide superfluous data fields to satisfy arbitrarily clustered resource properties; in others we were unable to express the available numerical sensor readings and had to use Boolean values instead. We plan to communicate these findings and other insights gained from the POC to the related standards bodies.
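As a hypothetical illustration of these mismatches (the property names and threshold are invented for this example, not taken from any particular standard): a numeric CO2 reading loses information when the target model only offers a Boolean property, and a rigidly clustered resource can force placeholder fields that were never measured.

```python
CO2_ALARM_THRESHOLD_PPM = 1000  # illustrative threshold, not from any standard

def degrade_co2_reading(ppm: float) -> dict:
    """Map a numeric CO2 reading onto a target model that exposes only a
    Boolean property, discarding the measured value in the process."""
    return {
        "co2_high": ppm > CO2_ALARM_THRESHOLD_PPM,  # the Boolean is all we can keep
        # A rigidly clustered resource may also demand sibling properties we
        # never measured, forcing superfluous placeholder fields like this one.
        "co2_peak_high": False,
    }

print(degrade_co2_reading(850))   # {'co2_high': False, 'co2_peak_high': False}
print(degrade_co2_reading(1250))  # {'co2_high': True, 'co2_peak_high': False}
```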

We also implemented an expanded version of the POC with an additional, proprietary sensor data format to test the effectiveness of the proposed approach beyond standards, and the outcome was successful.

Encouraged by this experience, we intend to work with several standards bodies to promote adding semantic data interoperability to their charters and deliverables.

Architecture of the solution by: Milan Milankovic, HTEC IoT Strategy Advisor

POC implementation by: Dusan Nastic, Ivan Petrovic, Srdjan Jovanovic, Dragomir Krstic
