Brosteins

Developers, Technology Evangelists, Bros.

Integrating an Azure IoT Hub into the Particle Cloud

This is the second post in a series of posts on the IoT weather monitoring solution I built for Locust Hill Farm. If you haven’t read the first post, check it out.

If you’ve read the first post, you’ll recall that I built the IoT weather station using a Particle Electron. I walked you through selecting the hardware and assembling everything, but I didn’t cover the software and integration with the Particle and Azure clouds.

In this post, I’m taking a deep dive into the technical side of the solution. I’ll walk through how to connect a Particle Electron to the Particle Cloud, submit data from the Electron to the cloud, and integrate the Particle Cloud with an Azure IoT Hub. I’ll follow up this post with another that covers how I process messages from the IoT Hub with an Azure Function.

Connect a Particle Electron to the Particle Cloud

Getting started with a Particle Electron is easy, and Particle makes it even easier with their getting started documentation at https://docs.particle.io/guide/getting-started/intro/?start.

Right away, you’re greeted with a graphic to select your device. I had an Electron, so I clicked on the Electron image to get started. After selecting the Electron, I was brought to the Particle documentation and Getting Started guide. There’s a lot to read in the documentation, but it’s broken down into consumable pieces. I recommend taking 10-15 minutes to read through the introductory sections.

When you’ve read a little bit, it’s time to get your Electron connected to the Particle cloud. I returned to the starting page, and clicked on the Setup my Electron button.

On the setup page, I chose the Setup an Electron w/SIM Card option and clicked Next.

The next step has you assemble the pieces and parts of your Electron that you’ll need to get it registered: the Electron, the SIM card that came with it, the 2000mAh LiPo battery, cellular antenna, and USB cable.

I won’t walk through the remaining steps in detail because it’s straightforward: you’ll enter your SIM card number, complete some billing information for your Electron, and connect it to your Particle account. After entering the information, assemble your Electron by inserting the SIM into the Electron, and connecting the battery, antenna, and USB cable.

When everything is connected, it’s a waiting game while the SIM is registered with a cellular carrier and the Particle Cloud. I had to wait ~10 minutes for this to happen.

TIP #1: It’s important you don’t power cycle your unit unless Particle instructs you to do so. You may also need to wait more than 10 minutes.

TIP #2: You should have the battery connected to the Electron at all times, even when the USB cable is connected. This is because most USB ports on computers don’t output enough current for the Electron to operate at its peak energy consumption. The battery provides that extra boost when needed.

Exploring the Particle Cloud Console

After the Electron is registered, you can manage it from the Particle console.


The Devices page is the landing page, where you can manage your connected devices: name them and see the last time they checked in with the Particle cloud to start a secure session. Clicking into my Electron, I can see various details and the data events it triggered.

As data is sent from a device to the Particle cloud, you’ll see it appear in the console in real-time. Above, you can see that my Electron sent weather data around 11:12:25 AM on 9/8/2017. Note the event name “w” and the data sent for that event. In the event shown above, the event data is JSON-formatted – this isn’t the default, though. Event data can be any string, and as a developer you control the events published by setting an event name and data via code.

Now that you’ve seen what events look like in the console, let’s take a closer look at how you can send events.

Sending Data to the Particle Cloud

Sending data to the Particle cloud is easy from a connected device. You use Particle’s built-in API to send an event. Let’s start with a simple Particle app.

All Particle apps follow the same form: a setup() function and a loop() function. The setup() function runs once and is used for configuring any global settings, initializing variables, etc. The loop() function runs continuously, over, and over, and over… You’ll also typically see a delay(1000) call inside loop(). This pauses execution for 1000 milliseconds. This construct is typical for Particle development because it slows down the loop, limiting it to run once every second.
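Here’s a minimal sketch of that skeleton:

```cpp
// Minimal Particle app skeleton: setup() runs once, then loop() runs forever.
void setup() {
    // One-time configuration goes here (pin modes, initializing variables, etc.).
}

void loop() {
    // This code runs over, and over, and over...
    delay(1000); // pause for 1000 milliseconds, so loop() runs about once per second
}
```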

Now that you know the basics of a Particle app, let’s add in code to publish an event to the Particle cloud.

The Particle.publish() method is a built-in function that sends an event to the Particle cloud. The first parameter to the function is the event name, and the second is the event data. As I’ve said previously, you control the data format. It can be anything, as long as it’s a string. Using a JSON-formatted string is a concise way of transmitting data, so it’s my preferred method.

Putting everything together, this simple Particle app sends an event to the Particle cloud every 1000 milliseconds. The event is named weather, and the data is a JSON-formatted string with a temperature reading of 83.58 degrees.
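As a sketch (the exact snippet may differ slightly from what ran on my Electron), the complete app looks like:

```cpp
// Publish a "weather" event with a JSON-formatted payload every second.
void setup() {
    // No configuration needed for this example.
}

void loop() {
    // Particle.publish(eventName, eventData) sends an event to the Particle cloud.
    Particle.publish("weather", "{\"temp\":83.58}");
    delay(1000); // wait 1000 milliseconds between publishes
}
```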

And, that’s it. Pretty easy. But, there are some things you should consider before writing code to publish to the Particle cloud.

Planning for Data Usage on Electrons

If you have a Particle Photon (the WiFi model), you don’t really need to be concerned with the size of your event data. But, if you’re writing code for a Particle Electron that runs off of the cellular network, every byte of data counts.

Particle Electron data plans are $2.99 for the first MB of data each month, and $0.99 for each additional MB. Event data you send to the Particle cloud may not seem very large, but each time you publish, you incur a small amount of data overhead. And, it adds up. Fast.

Particle offers a lot of guidance around data usage and planning how your device accesses the Particle cloud here. Particle has done a lot to optimize the data usage of Electrons, such as using UDP instead of TCP, reducing the frequency of device check-ins, reducing the number of handshakes, and extending the session length. I recommend you read it thoroughly before you start building your first project. I’m glad I did. Generally, I recommend you do several things:

  1. Reduce the size of your data as much as possible (by using abbreviated event names and concise JSON-formatted data)
  2. Reduce the frequency at which you transmit data to the Particle cloud (my solution posts data every 15 minutes)
  3. Reduce function and variable calls from the API to Electrons
  4. Never flash firmware updates to your Electron over cellular
  5. Don’t use 3rd party libraries that make TCP connections from an Electron to 3rd party platforms (like Azure, AWS, etc.) – they use a lot of data. Instead, publish an event to the Particle cloud and use Particle cloud integrations to propagate your event data to other platforms
  6. Bundle multiple data transmissions into a single transmission (if it makes sense)
  7. Place your Electron in deep sleep mode to reduce power consumption and prevent device pings from consuming data

I’ve used a combination of these recommendations in my project, and have gotten my data usage down to just over 1 MB each month. The biggest changes I made were reducing the publish interval to every 15 minutes and shortening the JSON as much as possible by abbreviating the field names in my event data.


Sweet. Now, I’ve got lots of data in the Particle cloud, but what can I do with it? Uhh…not much (at least in the Particle cloud). The Particle cloud is intended to be your gateway to the cloud, providing reliable event delivery. Once the data is in the Particle cloud, you need to ship it somewhere else to perform additional processing. Let’s take a look at how to do that in the Integrations area.

Integrating with the Particle Cloud

We’ve already started looking at the Particle console on the Devices page. There are other pages accessible from the navigation bar on the left, like the Products page (where you can bundle devices together and manage them as groups) and the Events page that shows an aggregated view of all device events. Let’s take a deeper dive into the fourth page: Integrations.

The Integrations page allows you to connect the Particle cloud to a variety of other public clouds and endpoints.

My farm monitoring solution is connected to an Azure IoT Hub, which is a Microsoft Platform-as-a-Service (PaaS) offering to connect, manage, and collect data from IoT devices (I’ll cover this more later).

Although I already have an integration set up, let’s explore the various integration options. To set up a new integration, click the large New Integration button.

There are four types of integrations to choose from: Google Maps, Azure IoT Hub, Google Cloud Platform, and a webhook. The first three integrations are platform-specific, and you need to know a bit about the platform to use them properly, but the presence of a webhook integration is great. With the webhook integration, the Particle cloud will post event data to an HTTP endpoint in real-time. So, if you’re unfamiliar with Azure or Google’s cloud, you can always integrate with a webhook.
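For reference, a webhook delivery is an HTTP POST whose body (with the default JSON template) looks roughly like the following; the field values here are invented for illustration:

```json
{
  "event": "w",
  "data": "{\"t\":83.58}",
  "published_at": "2017-09-08T15:12:25.000Z",
  "coreid": "3c0021000447343337373737"
}
```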

Planning your integration

Before I go much further down the integration path, I want to set the stage for how to work with Particle event data. I’ve said previously that the Particle cloud platform is really just an ingestion point for your event data. Before you start building your project, you should plan where your data is going to go. A common pattern is to collect data from your IoT device, ingest it via the Particle cloud, then push it to another cloud (via a direct integration or a webhook).

On the receiving end, you’ll process the incoming event data from the Particle cloud. Your processing will typically follow two paths: immediate streaming analysis and long-term storage. For the immediate data path, you’ll be analyzing the stream of events, comparing it to other recent events, producing aggregates, and reacting to the data, raising new events and/or alerts. You can author your own engine to process streaming event data, but there are a variety of technologies that can do this for you. I’m not going to go into the details of these systems, but I can point you to one I’m familiar with: Azure Stream Analytics. For the long-term storage path, your receiving program should process the data and store it for later analysis.

Now, back to the integration.

Configuring an Azure IoT Hub

Before I took a detour, I described the various integrations with the Particle cloud. In my weather monitoring solution, I integrated with an Azure IoT Hub. Let’s step through the process by starting with creating an Azure IoT Hub.

Log into the Azure portal at https://portal.azure.com. Then, create a new IoT Hub.


Give your IoT Hub a name, select a pricing tier, assign it to a resource group, select a location, and accept the remaining defaults.


In my weather monitoring solution, I named the IoT Hub locust-hill-iot-hub, and selected the Free pricing tier. I won’t get into the details of pricing, but you should know that there is a free version of the IoT Hub that allows you to process up to 8,000 device events per day. My solution only sends data every 15 minutes, generating 96 events per day, so I’m well below the 8,000-event threshold.

After you click the create button, an IoT Hub will be queued for provisioning.

What is an IoT Hub?

I’ve talked a lot about Azure IoT Hubs, but haven’t really described one. IoT Hubs are an Azure PaaS service that allow you to connect, monitor, and manage IoT devices at scale. The IoT Hub is built on top of Azure Event Hubs, which are designed to accept and process millions of data points and events per second. IoT Hubs inherit the event processing capabilities of Event Hubs, but add device management and bi-directional (device-to-cloud and cloud-to-device) communication capabilities. There are a variety of other useful management and security tools, but I won’t go into those details.

As a developer, you can think of an IoT Hub as a cloud-based queue. But, a cloud-based queue that can scale massively.

When you integrate the Particle cloud with an IoT Hub, messages pushed from the Particle cloud are enqueued inside of the IoT Hub. After a message is enqueued, it’s up to you to dequeue and process it accordingly.

Continuing the Azure IoT Hub Integration

Now that you know a little bit about IoT Hubs and have created one, let’s return to the Particle cloud and finish the integration.

After selecting the IoT Hub integration, the Particle cloud portal gives you three tasks to complete:

  1. Sign up for an Azure account (done)
  2. Create an Azure IoT Hub (done)
  3. Add a shared access policy

We’ve already created an IoT Hub, so that takes care of steps 1 and 2. Step 3 asks you to create a shared access policy, which grants the Particle cloud access to submit event data to the Event Hub underlying the IoT Hub.


Expanding each of the steps gives detailed instructions. Let’s walk through adding the shared access policy. Jump back to the Azure portal and open your IoT Hub.

Select the Shared access policies setting, then click the Add button. Create a policy named particle-iot-hub and give it full permissions.


A shared access policy is like a username/password with permissions attached. After the policy is created, select the policy and retrieve its Primary key (the password). Save this.


Return to the Particle cloud portal, and click the I have done all these things button, then fill in the required information about your IoT Hub.

The Event Name field allows you to target an integration with an event. You’ll recall that the Particle.publish() method takes two parameters: an event name and event data. Enter the value of the first parameter into this field. For my solution, I named my weather events w.

Enter the IoT Hub name, the shared access policy name, and the primary key.

Finally, select the device(s) which this event will be raised from (I selected only my Particle Electron at the farm, but you can allow this event to be raised from multiple devices).

Click the enable integration button to finish. The integration should now display in the Particle cloud portal.

Verifying the IoT Hub Integration

After you’ve integrated with an IoT Hub, you likely want to test that it’s working. To test the integration, head back to the Integrations page and click on the integration.

On the integration detail page, click the Test button.

If everything is configured properly, you’ll see a success message.

That’s it. I really like the Particle cloud portal and the direction it gives you in integrating with other platforms. Their step-by-step approach makes it easy to get started and makes the experience feel very polished.

The integration with the IoT Hub doesn’t stop here. In my next post, I’ll walk you through how I process the weather data that’s pushed to the IoT Hub.


1 comment for “Integrating an Azure IoT Hub into the Particle Cloud”

  1. Mike
    October 31, 2017 at 2:06 pm

    Hi,
    This is a great guide.
    Up to this point I have been trying to put together a setup just like this, but with the Google services. Unfortunately I am stuck where this guide leaves off (setup with intra-cloud services). Ideally I would like to be able to stream data from my electron in real time, then store it and visualize it.

    I’m looking forward to the next segment of the post where these types of things are handled.
    And if using Azure ends up being more conducive to this type of thing, I may have to make the switch.
