---
layout: post
title: "Microsoft teamed up with 17 Minds to create an Azure IoT solution for their sleep management system"
author: "James Earle"
author-link: "https://twitter.com/ItsJamesIRL"
#author-image: "{{ site.baseurl }}/images/17minds/james.jpg"
date: 2017-05-01
categories: [IoT]
color: "blue"
image: "images/17minds/feat_architecture.jpg"
excerpt: Microsoft worked together with the startup 17 Minds to create an Azure IoT solution complete with a gateway for their child sleep management system.
verticals: [Health]
language: [English]
geolocation: [North America]
#permalink: /.html
---

Microsoft worked together with the startup *17 Minds* to create an Azure IoT solution, complete with a gateway, for their child sleep management system, which collects clinical data on sleeping children. The data is used to track sleep apnea events and, through subsequent analysis, to predict behavioral events such as an outburst in an autistic child. Understanding such associations is important to understanding how to help.

**Key technologies** used include the Raspberry Pi as the hardware for the field gateway and the Microsoft Azure IoT Gateway SDK for the software. Other technologies used include Azure IoT Hub, Azure Stream Analytics, Microsoft Power BI, Azure DocumentDB, Azure Blob storage, and Bluetooth Low Energy (LE).

### Core team

- Julie Ann Lockner – CEO, 17 Minds ![Julie Ann Lockner]({{ site.baseurl }}/images/17minds/julie.jpg)
- [Jeremy Foster](https://twitter.com/codefoster) – Project Lead, Microsoft ![Jeremy Foster]({{ site.baseurl }}/images/17minds/codefoster.png)
- [James Earle](https://twitter.com/ItsJamesIRL) – Developer, Microsoft ![James Earle]({{ site.baseurl }}/images/17minds/james.jpg)
- [Kevin Leung](https://twitter.com/KSLHacks) – Developer, Microsoft ![Kevin Leung]({{ site.baseurl }}/images/17minds/kevin.png)
## Customer profile

[17 Minds](https://17minds.life/), based in Cambridge, Massachusetts, is an innovation-driven startup in the MIT ecosystem whose mission is to help children with special needs reach their full potential by empowering their support teams. The 17 Minds platform allows families, educators, therapists, psychiatrists, and care providers to share, collaborate on, and analyze behavioral and physiological trends over time to ensure the best possible outcomes for a child's educational and prescriptive care plans. The team is composed of experts in technology, consumer products, and social work.

17 Minds is a collection of applications built on a common collaboration platform to address the communication challenges associated with supporting individuals with special needs throughout life. It provides the ability to create and tailor the individual's experience by collecting, analyzing, and sharing information relevant to their personal challenges, goals, and progress. The platform works with third-party sensors and wearables to augment data collection and reporting for a comprehensive view of the individual's well-being.

## Problem statement

The project's current solution provides users with a means to manually enter information about their sleep habits into a database so that these habits can be analyzed and users advised on how to be healthier. This single manual channel for entering sleep data leaves room for human error and inconsistency and reduces the flexibility of the application.

The program would benefit from a software solution that augments manual data entry with sensors that record actual biometric data while a user sleeps. This creates an opportunity to collect data at a significantly higher frequency with increased reliability and accuracy. 17 Minds also needs a hub for data reporting that can be easily installed in a clinic and configured to accept data from a wide variety of sensor devices. With increased granularity in the data, we could also implement data intelligence services that would not only analyze the incoming data after collection, but also provide clinicians with live insights and alerts when necessary.

## Solution and steps

17 Minds data collection can be improved by making it easy to collect sensor data in a clinic setting. What we need is a complete data pipeline from the actual devices to the cloud. This is facilitated by a field gateway device that has broad support for sensor devices in its proximity and the ability to aggregate, filter, and send data to Azure. The following diagram depicts an overview of the project's architecture.

![architecture diagram]({{ site.baseurl }}/images/17minds/architecture.jpg)

### The gateway

To keep things simple, we selected the Raspberry Pi as the hardware for our field gateway, and for the software we're using the Azure IoT Gateway SDK. This combination of hardware and software gives us a lot of flexibility and extensibility for future sensor device support and the addition of business logic.

It should be noted that the IoT Gateway SDK is compatible with a huge array of edge devices. The choice to use a Raspberry Pi was mostly one of convenience because codebases for both the Raspberry Pi 2 and Raspberry Pi 3 were already created. Besides acting as a connection point, the gateway gives us a chance to intelligently filter, batch, or otherwise manipulate or analyze sensor data before sending it to Azure.
#### Node.js gateway modules

Because it's written in C, the IoT Gateway SDK can be installed on almost any gateway hardware. Its flexibility isn't limited to the hardware it can run on, either: developers can implement a data workflow in any of several programming languages. In this project, we've decided to implement gateway modules that use Node.js. This language choice is mostly preference, but one strong advantage is that Node.js is quite simple in comparison to C and is broadly familiar to developers. Chances are good that when this project is expanded, developers will either know the language or have no trouble picking it up.

Modules for the gateway are easy to write: just drop a new `.js` file in the project's `modules` folder that contains the following interface.

``` js
module.exports = {
    broker: null,
    configuration: null,
    create: function (broker, configuration) {},
    start: function () {},
    receive: function (message) {},
    destroy: function () {}
};
```
The functions perform the following tasks:

- The `create()` function determines what happens when the gateway is started and this module is created.
- The `start()` function determines what happens after this module has been created and then started by the gateway.
- The `receive()` function determines what happens when an incoming message arrives addressed to this module.
- The `destroy()` function provides the opportunity to clean up memory when this module is taken down.
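To make the interface concrete, here's a minimal sketch of a simple logging module, loosely in the spirit of the `printer` sink that appears in the link configuration shown later in this section. The decoding via Node's `Buffer`, the log output, and the `return true` convention are our own assumptions for illustration, not code from the project.

``` js
// modules/printer.js -- an illustrative sketch of a module that logs
// every message it receives. Not the project's actual implementation.
'use strict';

module.exports = {
    broker: null,
    configuration: null,

    // Called when the gateway creates the module; stash the broker and config.
    create: function (broker, configuration) {
        this.broker = broker;
        this.configuration = configuration;
        return true; // signal successful creation (a common Gateway SDK sample convention)
    },

    // Called after creation, once the gateway starts the module.
    start: function () {
        console.log('printer module started');
    },

    // Called for each incoming message addressed to this module.
    receive: function (message) {
        // Message content arrives byte-encoded; decode it before logging.
        console.log('printer received:', Buffer.from(message.content).toString('utf8'));
    },

    // Called when the gateway shuts the module down.
    destroy: function () {
        console.log('printer module destroyed');
    }
};
```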
After all the modules have been created, they need to be _configured_ and _linked_ in the project's `config.json` file. A module configuration looks something like the following.

``` json
{
    "name": "filter",
    "loader": {
        "name": "node",
        "entrypoint": {
            "main.path": "./modules/filter.js"
        }
    },
    "args": {
        "eda_min": "20",
        "eda_max": "40"
    }
}
```

Notice that this gives an operations manager the opportunity to define configuration variables (in this case `eda_min` and `eda_max`) that will affect the behavior of the module in operation.
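As an illustration of how a module might consume those arguments, the sketch below shows a hypothetical `receive()` function that drops out-of-range readings; it would slot into the module interface shown earlier. This is a simplified approximation for explanation's sake, not the project's actual `filter.js` source.

``` js
// A simplified, hypothetical take on a filter module's receive() function.
// The real implementation lives in modules/filter.js in the project repository.
receive: function (message) {
    var reading = JSON.parse(Buffer.from(message.content).toString('utf8'));

    // The "args" block in config.json ("eda_min", "eda_max") arrives as this
    // module's configuration, so the filter range is operator-tunable.
    var min = parseInt(this.configuration.eda_min, 10);
    var max = parseInt(this.configuration.eda_max, 10);

    // Only republish readings whose EDA value falls inside the configured range.
    if (reading.eda >= min && reading.eda <= max) {
        this.broker.publish({
            properties: { 'source': 'filter' },
            content: message.content
        });
    }
}
```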
The module linking looks something like the following.

``` json
"links": [
    { "source": "*", "sink": "logger" },
    { "source": "simulated-device", "sink": "filter" },
    { "source": "filter", "sink": "batch" },
    { "source": "batch", "sink": "printer" },
    { "source": "batch", "sink": "native-iothub" }
]
```

This controls the flow of the data through the gateway. In this case, messages generated by the `filter` module are received by the `batch` module.

#### Device support

One unknown in this project so far is specifically which devices will be used as data collection sensors. The reality is that a final solution will likely have a variety of device types sending data, so we should maintain the ability to easily add support for more at any time. So far, the gateway simply facilitates communication by serving an HTTP API. In reality, most sensor devices will not have the ability to submit to an HTTP endpoint. In the future, adding support for new device types and their hardware and communication protocols (such as Bluetooth) would simply involve adding more gateway modules.

Using the API, data can be submitted to the gateway with a simple HTTP POST to the gateway's `/api/messages` route with the JSON data as the payload (see the sketch at the end of this section).

Because we still know so little about the devices to be used in a final solution, we created a module for our gateway called the `simulated-device` that generates random messages with the general format of content that we expect. In reality, a clinic might have 1-10 devices, and we anticipate messages being about 1,100 bytes and arriving at the gateway from each device at a frequency of 4 hertz (Hz). The information payload of each message is the measurement of a variety of biometric sensors related to sleep, such as electrodermal activity (EDA), heart rate, and motion.
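For example, a reading could be posted to the gateway using nothing but Node's built-in `http` module. The gateway host, port, and payload fields below are assumptions for this sketch; only the `/api/messages` route comes from the project.

``` js
// Post a sample reading to the gateway's HTTP API.
// The host/port and the payload fields are assumed for this sketch;
// /api/messages is the route the gateway serves.
var http = require('http');

var payload = JSON.stringify({ eda: 42, bvp: 27 });

var request = http.request({
    host: 'gateway.local',   // hypothetical address of the Raspberry Pi gateway
    port: 8080,               // hypothetical port
    path: '/api/messages',
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(payload)
    }
}, function (response) {
    console.log('gateway responded with status', response.statusCode);
});

request.write(payload);
request.end();
```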
#### Data aggregation

One of the advantages of setting up a field gateway is the opportunity to "sweeten" the data before sending it to the cloud. In this project, although we're not yet sure exactly how the data needs to be sweetened, we wanted to create some facility for doing so that can be expanded easily in the future. We chose to add simple, configurable _filter_ and _time batch_ functionality, implemented in [`filter.js`](https://github.com/17minds/gateway/blob/master/modules/filter.js) and [`batch.js`](https://github.com/17minds/gateway/blob/master/modules/batch.js) respectively.

These modules use a technology called Reactive Extensions for JavaScript (RxJS). Rx is very good at handling streams of data, so it's ideal in this scenario. For the basic filtering that we do in the filter module, Rx is a bit of overkill, and its advantages likely won't show until something more complicated is requested, but in the batch module it's fairly easy to see how advantageous this library is. Take a look at this significant statement from the `batch.js` file in the solution.

``` js
this.subscription = this.messages
    .map(m => JSON.parse(utf8.decode(m.content)))
    .bufferTime(this.configuration.batch_time)
    .subscribe(b => {
        this.broker.publish({
            properties: { 'source': 'batch' },
            content: utf8.encode(b)
        })
    })
```

In this case, the `messages` object is the stream that all incoming messages are pushed into. Messages going through the data flow have byte-encoded content, so the first thing we do is decode each message so that we can pull out just the content.

``` js
.map(m => JSON.parse(utf8.decode(m.content)))
```
The functions in Rx are chainable, so the result of this mapping is still a _stream_, and the next function can be appended.

``` js
.bufferTime(this.configuration.batch_time)
```
This buffers the messages for the configured amount of time. All of the data for every five-second window of time (for example) is then combined into a single array of data points. Finally, we _subscribe_ to this buffered data to determine what to do with the results.

``` js
.subscribe(b => {
    this.broker.publish({
        properties: { 'source': 'batch' },
        content: utf8.encode(b)
    })
})
```
In this case, we build a new JavaScript object to submit to the gateway pipeline, encoding our buffered batch as the message content. This means that whatever module(s) are configured as sink links in the `config.json` will pick this message up and process it.

### The hub

After the gateway transforms and batches data, it sends it on to [Azure IoT Hub](https://azure.microsoft.com/en-us/services/iot-hub/). IoT Hub is an ideal cloud endpoint for IoT data. With security as well as manageability in mind, IoT Hub requires that all devices be registered. However, the devices in our solution are not connecting directly to Azure IoT Hub, but are going through our gateway. For this reason, we need to _map_ device identifiers to actual IoT Hub device IDs. This is best done in a separate module in the gateway data flow. In this solution, however, we don't yet know what kind of device identifiers we'll actually have, so we've left this module out for now (a sketch of what such a module might look like appears at the end of this section). Consequently, IoT Hub will currently see all incoming data as being from a device ID that corresponds to the gateway itself.

Most of the modules in this solution use Node.js. However, there's an issue in the IoT Gateway SDK's Node.js IoT Hub module that causes intermittent failures, so for now we're opting to use the native IoT Hub module instead. When this issue is resolved, it might be a good idea to swap the native module back out for the Node.js one for consistency.
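Purely as an illustration of what that omitted identity-mapping module might look like, the sketch below maps hypothetical Bluetooth LE MAC addresses to IoT Hub device identities and stamps each message before republishing it. Every name, property, and key value here is an assumption for illustration.

``` js
// modules/identity-map.js -- a hypothetical identity-mapping module.
// The project intentionally omits this module; everything here is illustrative.
'use strict';

module.exports = {
    broker: null,
    configuration: null,

    create: function (broker, configuration) {
        this.broker = broker;
        this.configuration = configuration;
        // A hypothetical lookup table from a device's MAC address to the
        // device ID and key it was registered with in IoT Hub.
        this.deviceMap = {
            'AA:BB:CC:DD:EE:01': { deviceId: 'sensor-01', deviceKey: '<iot-hub-key-1>' },
            'AA:BB:CC:DD:EE:02': { deviceId: 'sensor-02', deviceKey: '<iot-hub-key-2>' }
        };
        return true;
    },

    start: function () {},

    receive: function (message) {
        // Replace the local identifier with the registered IoT Hub identity
        // before forwarding the message along the pipeline.
        var identity = this.deviceMap[message.properties.macAddress];
        if (identity) {
            this.broker.publish({
                properties: {
                    'source': 'mapping',
                    'deviceName': identity.deviceId,
                    'deviceKey': identity.deviceKey
                },
                content: message.content
            });
        }
    },

    destroy: function () {}
};
```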
### Stream processing

IoT solutions such as this one tend to generate a lot of data. When, as in our case, a field gateway is employed, there is a prime opportunity to make sure that the data generated is interesting and valuable, but often it's not yet known what interesting and valuable data is at the point in time that the data arrives at the gateway. Let's have a look at the sample data for this project.

``` json
[{"eda":42,"bvp":27},{"eda":54,"bvp":29},{"eda":44,"bvp":28},{"eda":41,"bvp":27},{"eda":47,"bvp":23}]
```

Notice that the messages are batched. The length of the batch is variable due to the possibility of timing boundary issues as well as any edge filtering we may configure to occur. Real measurements would likely include a few more data points than just these two.

When data starts arriving in the cloud, it can be viewed as a constant stream. Often, we'll want to somehow manipulate that stream to:

- Further filter or aggregate data beyond how it's been filtered or aggregated in the field gateway.
- Shred data in the cloud that's been batched in the field gateway.
- Divert data to various other cloud endpoints for further analysis or storage.

In Azure, streams are handled by the [Azure Stream Analytics](https://azure.microsoft.com/en-us/services/stream-analytics/) service. In this solution, we currently don't have a need for further in-stream analysis, so we're simply using "pass through" queries to divert data to Azure Blob storage, to an Azure DocumentDB database, and to Microsoft Power BI. We're using separate jobs due to the likelihood of these stream queries evolving separately from each other. We could later decide, for example, that Blob storage would continue to retain every message, whereas the DocumentDB database would only receive aggregate data. Following are all three queries.

``` sql
SELECT eventprocessedutctime, eda, bvp INTO blob FROM iothub

SELECT eventprocessedutctime, eda, bvp INTO docdb FROM iothub

SELECT eventprocessedutctime, eda, bvp INTO pbi FROM iothub
```
#### Data visualization

It's always helpful to have a live view of your data, and Power BI enables this without a lot of work. Streaming data in Azure can be plugged into Power BI and then used to feed live charts or KPI indicators. Power BI allows us to build our own custom visuals; however, for this project so far, we're using a simple line chart and some value cards to show current levels.
![Power BI charts]({{ site.baseurl }}/images/17minds/pbi.png)
#### Data storage

A major area of consideration for this project is precisely how the resulting data will be combined or overlaid with existing data to add value to sleep studies. It wouldn't be a good idea to spend too much time implementing something yet, but we also don't want to limit data collection for future analysis, so the current solution is to put all collected data into DocumentDB for later analysis. Furthermore, a separate Azure Stream Analytics query dumps all data to Blob storage, the idea being that the project may eventually require different sets or shapes of data to go to Blob storage versus DocumentDB.

### Security

Security and privacy are massive concerns in almost any IoT solution, and certainly in solutions that involve medical data for underage patients. Our solution involves a hardware gateway that aims to facilitate the connection of multiple devices. Fortunately, the devices will be under the control of the clinic. That doesn't alleviate our concerns, though. We need to think about our data's entire travel path and make sure it's secure.

Azure IoT Hub includes a device registry that generates either a symmetric key or an X.509 certificate for each device. Normally, this key is configured on each device that will communicate with the hub. In our case, however, the likely devices are purpose-made, wrist-mounted sensor arrays with proprietary computational units that usually communicate over Bluetooth Low Energy (LE). We cannot run an Azure IoT Hub SDK on them and use keys to encrypt communication. Instead, this becomes a job for the gateway.

The gateway device is configured and programmed to receive device communication using each device's proprietary means, and the security of that channel must remain outside the scope of this project. After the data is transferred from device to gateway, however, we have two options: we can treat the gateway as a single device that just happens to be getting its data from disparate devices, or we can consider each of the connected devices to be an IoT device. The latter is preferable. To implement it, we need _identity mapping_, where each device's unique identifier (likely its MAC address, as in the case of Bluetooth LE) is mapped to the security key for a device that's been registered in our IoT Hub. Because we're focusing on the gateway in this project and know relatively little about the devices that will eventually relay data, we have left the mapping module out for now.

Any device with a key will then encrypt its data (and in fact _must_ encrypt its data) before sending it to IoT Hub. This ensures security in transit. After data arrives in Azure IoT Hub, securing it becomes largely a matter of trust in Azure as a secure cloud provider. For more information, see [Microsoft Azure - The Trusted Cloud](http://azure.microsoft.com/support/trust-center/).
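For reference, here's a brief sketch of how devices could be registered ahead of time with the `azure-iothub` Node.js package so that per-device keys exist for the identity mapping described above. The connection string and device ID are placeholders, and the exact shape of the returned device record may vary by SDK version.

``` js
// Register a device in the IoT Hub device registry and read back its
// generated symmetric key. The connection string and device ID below are
// placeholders; this sketch uses the azure-iothub Node.js package.
var iothub = require('azure-iothub');

var connectionString = '<iot-hub-service-connection-string>';
var registry = iothub.Registry.fromConnectionString(connectionString);

registry.create({ deviceId: 'sensor-01' }, function (err, deviceInfo) {
    if (err) {
        console.error('could not create device:', err.message);
        return;
    }
    // The hub generates a symmetric key for the new device; a gateway
    // identity-mapping module could store this against the device's MAC address.
    console.log('registered', deviceInfo.deviceId);
    console.log('key:', deviceInfo.authentication.symmetricKey.primaryKey);
});
```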
## Conclusion

This Microsoft solution with 17 Minds highlights the flexibility of Azure's IoT offerings both in the cloud and on the edge (in this case, in the sleep clinic). The use of multiple Azure services that integrate together easily makes for an excellent fit for the customer. With nothing more than a choice of physical device (Raspberry Pi 2 or 3), the customer can stand up a gateway that collects large amounts of input data from multiple devices. This data can then be processed and sent to the cloud for more advanced analysis and storage.

The chosen services create a data pipeline exactly in line with the customer's outlined needs. The solution also shows the excellent integration possible within the Azure IoT Suite for data aggregation, manipulation, and visualization.

### Learnings from the IoT project with 17 Minds

- The Azure IoT Gateway SDK is tremendously flexible. It offers support for a huge array of devices, and modules can be developed in many popular languages. Being able to use a Raspberry Pi as our gateway makes the solution inexpensive and easy to replicate.
- Data ingested by Azure IoT Hub can easily be configured to flow to other Azure services, making data manipulation in the cloud very easy.
- According to Julie from 17 Minds, their group is optimistic that the gateway solution we've created here will add value to their project and will end up in production. Specifically, the ability to batch and filter messages before they're sent to the cloud lets them control the trade-off between message size and message frequency, which gives them additional control over their messaging costs.

### Shareable assets

- **Node.js gateway**: The Azure IoT Gateway SDK appears difficult to approach due to its low-level C source code and the need to set up a dev box to build it. We were able to produce the essential binaries for both a Raspberry Pi 2 and 3 so that this customer and any future customers need only pull a small (70 MB) repository, compose modules by using Node.js, and then configure the data pipeline by using the `config.json` file.
- **Filter.js module**: A useful feature of the Azure IoT Gateway SDK is the ability to add custom modules. These can be set up and included in the pipeline by using `config.json`, and module arguments or parameters can be established to perform a wide array of tasks. The first significant example that we've included for the customer and future customers is the _filter_ module, implemented in `filter.js`, where a simple data filter ensures that only data within a certain range is passed to the next stage in the pipeline. This gives the customer control over parametric data, and because the filtered range is set with user-supplied arguments, it can be reconfigured at any time by using `config.json`.
- **Batch.js module**: A second module that we created for this implementation of the IoT Gateway SDK takes a configuration variable for batch time, collects sleep data for that configured amount of time, and then sends it to IoT Hub as a single batch. Requirements that may necessitate this feature include controlling ingestion costs, reducing network traffic, and grouping related data.

## Additional resources

- [Project repository (GitHub)](http://github.com/17minds/gateway)
- [Azure IoT Gateway SDK (GitHub)](https://github.com/Azure/azure-iot-gateway-sdk)
- [Azure IoT Gateway SDK](https://azure.microsoft.com/en-us/services/iot-hub/iot-gateway-sdk/)
- [Raspberry Pi Home](https://www.raspberrypi.org/)