Azure IoT : From RaspberryPi with Sensor to Azure Storage Table by using a serverless architecture

Introduction

A few days ago my connectors arrived for my latest PoC on Azure. So today I'm writing about my experience in using a Raspberry Pi with a temperature & humidity sensor and saving the telemetry data in Azure. For this we'll be using Azure Event Hub as the ingress mechanism, and an Azure Function to store the events in an Azure Storage Account. My next venture will be to use this data to create reports and maybe (in the long run) do some machine learning. For the latter, I'm pondering linking this system to the ebus interface of my heating system. That way I could correlate the data from the various sensors (RPi, thermostat & outside sensor) with the heater information & heating schedules. Basically… creating my own Google Nest. 🙂

 

Sensor : Physical Connection (I2C)

The guys from ThingTank had a spare sensor lying around, which they lent to me for my PoC… It was a “Grove – Temperature&Humidity Sensor (High-Accuracy & Mini)”. As you can see in the picture below, this one has an I2C connector with four connections ; GND, VCC, SDA & SCL.

[Image : the Grove Temperature & Humidity sensor (High-Accuracy & Mini) with its I2C connector]

I'll be using a Raspberry Pi. The GPIO pins on my board are as follows ;

[Image : Raspberry Pi GPIO pinout]

So the SDA, SCL & GND are pretty clear ; SDA goes to pin #3, SCL to pin #5 and GND to any ground pin (e.g. #6). The VCC you'll be mapping to pin #2 or #4 (both 5V). Which results in the following look ;

[Image : the Grove sensor wired to the Raspberry Pi I2C pins]

 

Sensor : Software Configuration

I started out with Raspbian as the operating system on the Raspberry Pi. When checking out the software repository for the Grove sensors, I concluded that Python was my best bet to get it working. So I installed the following packages…

apt-get install python-smbus i2c-tools libi2c-dev git python-pip

And I added the “i2c_dev” module to the list of modules to load automatically at startup ;

root@raspberrypi:~# cat /etc/modules
# /etc/modules: kernel modules to load at boot time.
#
# This file contains the names of kernel modules that should be loaded
# at boot time, one per line. Lines beginning with "#" are ignored.

i2c_dev

Next up is to ensure that the “/boot/config.txt” file contains the following parameters ;

dtparam=i2c_arm=on
device_tree_param=i2c1=on
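
After a reboot, you can verify that the I2C bus is active and that the sensor is visible on it. The i2c-tools package we installed earlier provides i2cdetect for this ; assuming the sensor sits on bus 1 (which is the case for recent Raspberry Pi boards), it should show up at its address in the output of ;

i2cdetect -y 1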

Next up, let’s get the needed code & libraries ;

git clone https://github.com/DexterInd/GrovePi.git
pip install azure-servicebus
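
Before involving Azure, it's worth a quick sanity check that the sensor can actually be read. A minimal sketch, assuming you run it from the directory in the cloned GrovePi repository that contains the grove_i2c_temp_hum_mini driver (or have that directory on your PYTHONPATH) ;

#!/usr/bin/env python
# Quick test : print a single temperature & humidity reading
import grove_i2c_temp_hum_mini

t = grove_i2c_temp_hum_mini.th02()
print("Temperature : %s" % t.getTemperature())
print("Humidity : %s" % t.getHumidity())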

 

Azure Event Hub

We'll be using the Azure Event Hub as the queueing mechanism to which we'll send all our telemetry ;

[Screenshot : the Event Hub in the Azure portal]

For this PoC, I'm using one basic throughput unit. Be sure to note the secret information of the “Shared Access Policies”, as you'll be needing the key name & key value for the next bit.
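
For reference ; the connection string the portal shows for such a policy has the shape below, and it's the namespace, key name and key value from it that we'll plug into the Python code further down (all values here are placeholders) ;

Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key name>;SharedAccessKey=<key value>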

 

Sensor Code

So now to read the sensor… and report back to our Event Hub. We'll do this with the following piece of simple code ;

root@raspberrypi:~# cat /sensing.py
#!/usr/bin/env python
import grove_i2c_temp_hum_mini
import time
import json
import socket
from azure.servicebus import ServiceBusService

# Event Hub credentials & settings (placeholders)
key_name = 'mysecretkeyname'
key_value = 'mysecretkeyvalue'
service_namespace = 'eventhub namespace'
eventhub_name = 'eventhub name'
interval = 60  # seconds between two samples

host = socket.gethostname()
sbs = ServiceBusService(service_namespace,
                        shared_access_key_name=key_name,
                        shared_access_key_value=key_value)
sbs.create_event_hub(eventhub_name)

t = grove_i2c_temp_hum_mini.th02()
while True:
    # Read the sensor and build a JSON message with timestamp info
    temp = t.getTemperature()
    humi = t.getHumidity()
    unix = int(time.time())
    hour = time.strftime("%H")
    min = time.strftime("%M")
    sec = time.strftime("%S")
    day = time.strftime("%d")
    month = time.strftime("%m")
    data = {'Hostname': host, 'Timestamp': unix, 'Temperature': temp, 'Humidity': humi, 'Month': month, 'Day': day, 'Hour': hour, 'Minute': min, 'Second': sec}
    msg = json.dumps(data)
    print(msg)
    # Send the message to the Event Hub and wait for the next interval
    sbs.send_event(eventhub_name, msg)
    time.sleep(interval)
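
Each pass through the loop sends one JSON document to the Event Hub. With made-up sample values, such a message looks like this ;

{"Hostname": "raspberrypi", "Timestamp": 1481989243, "Temperature": 21.5, "Humidity": 48.2, "Month": "12", "Day": "17", "Hour": "20", "Minute": "54", "Second": "43"}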

So what does that look like when we run it? Or maybe better said, how should it look…

[Screenshot : sensing.py output on the Raspberry Pi]

And yes… the telemetry data is being sent to our event hub!

[Screenshot : incoming messages shown on the Event Hub]

 

Azure Function

So now our telemetry data is arriving on our Event Hub. Next up, we'll be using an Azure Function to pick up the messages from the queue and push them towards a storage account.

For those unfamiliar with Azure Functions… Imagine you have a piece of code you want to run. Around it you define a trigger, zero or more inputs and zero or more outputs. In the end, you just pay for your compute time. In my honest opinion, this is a VERY nice example of why cloud rocks! 😀

Anyhow, what will we be doing here? The trigger will be a new message on the Event Hub, which also serves as the input. We'll be using three outputs ;

  • Blob : Write the message (as JSON) to a blob container (for archival purposes)
  • Blob : Write the message (as CSV) to a blob container (for archival purposes)
  • Table : Insert the message into an Azure Storage Table (which we'll be using as a data source for Power BI later on)

And that kinda looks like this…

 

Integration part

In the GUI ;

[Screenshot : the function bindings in the Azure portal]

or in code…

{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "myEventHubTrigger",
      "direction": "in",
      "path": "kvaeshomesensor",
      "connection": "kvaesiot"
    },
    {
      "type": "blob",
      "name": "outputBlobCsv",
      "path": "sensorinfocsv/{rand-guid}",
      "connection": "kvaesiot_STORAGE",
      "direction": "out"
    },
    {
      "type": "blob",
      "name": "outputBlobJson",
      "path": "sensorinfojson/{rand-guid}",
      "connection": "kvaesiot_STORAGE",
      "direction": "out"
    },
    {
      "type": "table",
      "name": "outputTable",
      "tableName": "sensorInformation",
      "connection": "kvaesiot_STORAGE",
      "direction": "out"
    }
  ],
  "disabled": false
}

 

Function Code

And the code for the function that writes to these three outputs is also very simple!

module.exports = function (context, myEventHubTrigger) {

    // Archive the raw message as JSON
    context.bindings.outputBlobJson = myEventHubTrigger;

    // Convert the message to CSV and archive that as well
    var json2csv = require('json2csv');
    var fields = ['Second', 'Temperature', 'Hour', 'Month', 'Timestamp', 'Hostname', 'Day', 'Minute', 'Humidity'];
    context.bindings.outputBlobCsv = json2csv({ data: myEventHubTrigger, fields: fields });

    // Insert the message into the Azure Storage Table ;
    // hostname as partition key, timestamp as row key
    context.bindings.outputTable = {
        "partitionKey": myEventHubTrigger.Hostname,
        "rowKey": myEventHubTrigger.Timestamp,
        "Values": myEventHubTrigger
    };

    context.done();
};

As you will probably have noticed already, there is a library (“json2csv”) in there that isn't present by default on the system.

You are able to add your own libraries via the “Kudu” console ;

[Screenshot : opening the Kudu console from the Function App]

Here you can do the install of your library…

[Screenshot : installing the library from the Kudu diagnostic console]

Which will be propagated to all systems where your function runs.
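
In the Kudu console, the install boils down to something like the following ; the function folder name is whatever you called your function, and the npm step assumes outbound internet access from the Function App ;

cd D:\home\site\wwwroot\<your function name>
npm install json2csv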

 

Action Pics

So how does this look in action? We can see in the logs of our function that it gets triggered every time a message is sent to our event hub ;

[Screenshot : the function log showing triggered executions]

As also shown in the monitor logging ;

[Screenshot : the function's monitor logging]

When we take a look at our storage account, we see that everything is going great. Our table is filling up…

[Screenshot : the sensorInformation table filling up]

And the same for both our storage containers ;

[Screenshots : the sensorinfojson & sensorinfocsv blob containers]

 

TL;DR

  • Doing a PoC like this doesn't need to be expensive. The hardware I used (mostly recycled material) comes to about 30€, and on the Azure side the running cost is also very low.
  • Try to decouple your architecture into small parts, and use the language that fits each part best. Python was the easiest for my sensor, though Node.js was the best in terms of supportability for my Function.
  • Serverless architectures are the way to go in the future. They are PaaS v2 (so to speak), and will really enable you to increase your velocity in terms of time to market.
