IoT Data Handling with Microsoft Azure Platform

Background

Low-cost sensors are producing massive amounts of data that are used in many health settings. Many organizations in the healthcare industry, including Technosoft clients, are integrating IoT sensors to monitor patients' conditions and/or their environment in real time. Processing this volume of data through custom-made software is not a viable option. Many companies provide IoT platforms for building apps that collect, manage, and process IoT data from different sensors. Giants like Amazon and Microsoft entered this market earlier than most. Sensor vendors also provide utilities that can be used alongside these IoT platforms to consume or process sensor data.

In this article, we discuss the Microsoft Azure IoT platform and the Monnit sensor options that Technosoft engineers used to develop a web application that processes large volumes of data from various types of sensors. The purpose of this web application was to consume highly heterogeneous data from the sensors, filter it by sensor type, and finally convert it to a standard format that external systems can use for analysis and reporting.

Types of Sensors

For this project, we used the following sensors from Monnit:

  1. Temperature Sensor
  2. Vibration Sensor
  3. Air Quality Sensor
  4. Door Data Sensor
  5. Activity Sensor
  6. Pressure Sensor
  7. Light Sensor
  8. Humidity Sensor
  9. Motion Sensor
  10. Voltage Detect Sensor
  11. Water Detect Sensor
  12. Push Button Sensor

Getting Data from Sensors

There are multiple ways to collect sensor data.

1. Using Vendor Gateway

Monnit sensors can be configured with Monnit's own gateway, called the IMonnit Gateway, or with a third-party gateway such as the Option Gateway from Option (https://www.option.com/ecosystem/). Through the gateway, we can configure sensor settings such as the heartbeat interval.

Once sensor data is received by a gateway, it can be sent to an IoT data-processing hub such as Azure IoT Hub or the IMonnit Mine Server. Both approaches have pros and cons.

  • Azure IoT Hub

    Azure IoT Hub is a managed, cloud-hosted IoT service that allows bidirectional communication with various IoT devices. It provides device libraries for the most commonly used platforms and languages for easy device connectivity. The hub receives sensor data from the gateway and passes it to Azure Stream Analytics, which in turn passes it to a customized Azure Function. That Azure Function can process the data as programmed or send it to an external app's API for further reporting and processing.
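To make the device-connectivity step concrete, here is a minimal sketch in Python. The JSON field names (`sensorId`, `sensorType`, `value`) and the connection-string placeholder are illustrative assumptions, not the project's actual schema; sending requires the `azure-iot-device` package and a device connection string from your own IoT Hub.

```python
import json
import time

def build_telemetry(sensor_id, sensor_type, value):
    """Package one reading as a JSON string (field names are illustrative)."""
    return json.dumps({
        "sensorId": sensor_id,
        "sensorType": sensor_type,
        "value": value,
        "timestamp": time.time(),
    })

def send_reading(connection_string, payload):
    """Hedged sketch: push one message to Azure IoT Hub.

    Requires the azure-iot-device package; connection_string comes
    from the device registration in your own IoT Hub.
    """
    from azure.iot.device import IoTHubDeviceClient, Message
    client = IoTHubDeviceClient.create_from_connection_string(connection_string)
    client.connect()
    client.send_message(Message(payload))
    client.shutdown()
```

A gateway-side process would typically call `send_reading("<device-connection-string>", build_telemetry("temp-01", "Temperature", 22.4))` for each reading it receives.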

  • Role of Azure Stream Analytics

Azure Stream Analytics is a serverless, scalable event-processing engine that helps developers run real-time analytics on multiple simultaneous IoT sensor data streams.

IoT Hub itself is event-based, i.e. it triggers events such as "sensor connected" and "data received", and it is the developers' responsibility to catch those events and parse the required data. Furthermore, if an event is missed, the data cannot be recovered to find out what happened. Also, sensor data cannot be sent directly from Azure IoT Hub to an external application. At Technosoft, we have written many custom functions to process the heterogeneous data coming from different sensors and to create Jobs in the IBM Maximo Asset Management system.

Azure Stream Analytics receives data from Azure IoT Hub as input and can send it to an Azure Function as output. That Azure Function can in turn pass the data to an external app, for example via its API.
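The function-side of that pipeline step can be sketched as follows. This is not the project's actual Azure Function; the grouping helper and the external endpoint URL are illustrative assumptions about what such a function might do with a batch of events delivered by Stream Analytics.

```python
import json
import urllib.request

def split_by_type(events):
    """Group incoming events by sensorType so each group can be
    forwarded or processed according to its sensor kind."""
    groups = {}
    for event in events:
        groups.setdefault(event.get("sensorType", "unknown"), []).append(event)
    return groups

def forward_batch(events, api_url):
    """Hedged sketch: POST a batch of events to an external app's API.

    api_url is a placeholder; the real endpoint and authentication
    scheme depend on the external system.
    """
    body = json.dumps(events).encode("utf-8")
    request = urllib.request.Request(
        api_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request)
```

Splitting by sensor type before forwarding lets each heterogeneous stream be handled by its own processing path, which matches the filter-by-sensor-type goal described above.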

Moreover, Stream Analytics helps maintain history: it stores data in the cloud as a built-in feature, so a user can recover historical data at any time. Finally, Stream Analytics can be integrated with Microsoft's analytical tools for useful insights.

Handling IoT data with Azure Stream Analytics

  • IMonnit Mine Server

    Alternatively, a vendor-specific hub such as the IMonnit Mine Server can be used. It listens for messages and processes them as they arrive, but as previously mentioned, vendor-specific hubs usually lack a data-caching facility: in the event of a server crash, data can be lost forever.

The benefit of using the sensor vendor's own hub is that it avoids the cost of Azure IoT Hub and Stream Analytics.

2. MQTT Server

  1. MQTT is a publish-subscribe connectivity protocol for IoT environments with small code footprints or limited network bandwidth. Some sensors on the market are MQTT-enabled; their data can be processed by an MQTT server without the need for a gateway.
  2. With these sensors, we face the same issue: data caching is usually not available, so if the server crashes, the sensor data for that period can be lost.
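A minimal MQTT subscriber for such sensors can be sketched with the widely used `paho-mqtt` package (an assumption here, as the article does not name the MQTT library used); the broker host, port, and topic below are placeholders.

```python
import json

def parse_mqtt_payload(raw):
    """Decode one MQTT message payload (bytes) into a reading dict;
    return None for malformed data instead of crashing the loop."""
    try:
        return json.loads(raw.decode("utf-8"))
    except (ValueError, UnicodeDecodeError):
        return None

def run_subscriber(broker_host, topic):
    """Hedged sketch: subscribe to a topic and print each valid reading.

    Requires the paho-mqtt package; broker_host and topic are
    placeholders for your own deployment.
    """
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        reading = parse_mqtt_payload(msg.payload)
        if reading is not None:
            print(reading)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(broker_host, 1883)
    client.subscribe(topic)
    client.loop_forever()
```

Note that because the broker here has no caching, anything published while the subscriber is down is simply gone, which is the data-loss risk described above.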

Common Problems in Azure IoT Environment Setup

Following is a list of some challenges we faced while configuring the IoT platform to meet business needs:

  •  Running Monnit Mine Server and MQTT Server on Azure App Service

For MQTT-enabled sensors, we set up an MQTT server. However, Azure App Service only allows ports 80 (HTTP) and 443 (HTTPS) to be opened, while an MQTT server usually runs on a different port. This is a limitation of Azure App Service: no other port can be opened. We had to move our web application to a dedicated virtual machine, which gives you the liberty to open any port as needed.

  •  Receiving Different Data Formats

When you receive data from two different sources, e.g. Azure IoT Hub and the Monnit Mine Server, it arrives in two different JSON formats. In this case, you have to normalize the data so that the application can process it correctly. To resolve this issue, we created a mapping table that holds each source-specific field and the corresponding standard field to map it onto.
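The mapping-table idea can be sketched as below. The field names in this table are hypothetical examples, not the actual fields used in the project; the point is that adding a new source only requires a new mapping entry, not new parsing code.

```python
# Illustrative mapping table: source-specific field -> standard field.
# These field names are hypothetical, not the project's real schema.
FIELD_MAP = {
    "iothub": {
        "deviceId": "sensorId",
        "sensorKind": "sensorType",
        "reading": "value",
    },
    "monnit": {
        "SensorID": "sensorId",
        "ApplicationName": "sensorType",
        "CurrentReading": "value",
    },
}

def normalize(record, source):
    """Map one source-specific JSON record onto the standard field names,
    silently skipping fields the record does not contain."""
    mapping = FIELD_MAP[source]
    return {std: record[src] for src, std in mapping.items() if src in record}
```

With this in place, downstream filtering and reporting code only ever sees the standard `sensorId`/`sensorType`/`value` shape, regardless of which hub delivered the record.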

  •  How to Test the IoT Application

During the development phase, you may or may not have access to the actual sensors at all times. To work around this, you can collect JSON-formatted sample data for each sensor and then send the samples to the application via a tool such as Postman.
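The same replay approach can be scripted instead of clicked through in Postman. The sample payloads and the ingest endpoint below are made-up placeholders; in practice you would capture real payloads from the sensors once and replay them against your own URL.

```python
import json
import urllib.request

# Hypothetical sample payloads, standing in for data captured
# from real sensors during development.
SAMPLES = {
    "Temperature": {"sensorId": "temp-01", "sensorType": "Temperature", "value": 22.4},
    "Humidity": {"sensorId": "hum-01", "sensorType": "Humidity", "value": 41.0},
}

def post_sample(endpoint, sensor_type):
    """Replay one captured sample against the application's ingest
    endpoint (endpoint URL is a placeholder)."""
    body = json.dumps(SAMPLES[sensor_type]).encode("utf-8")
    request = urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request)
```

Looping `post_sample` over every entry in `SAMPLES` gives a quick smoke test of the whole ingest path without any physical sensors attached.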