This commit is contained in:
kimwolk 2021-07-08 17:13:22 -07:00 committed by GitHub
Parent 0a8abad8f4
Commit 47e23ba50d
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
1 changed file with 2 additions and 2 deletions


@@ -1,6 +1,6 @@
# iot-central-compute
A simple way to do data transformation, compute, and data enhancement on data sent to Azure IoT Central
A simple way to do data transformation, compute, and data enhancement on data sent to Azure IoT Central ([demo video](https://youtu.be/aV3fNHhyZN4))
# Introduction
@@ -145,4 +145,4 @@ send status: MessageEnqueued [{"data":"40.5, 36.41, 14.6043, 14.079"}]
Because the data passes through the IoT hub twice when using the Azure Function in this way, it consumes two messages for every message the device sends. This may exceed your monthly message allotment and increase the cost of using Azure IoT Central. One way around this is to use the Azure IoT Central device bridge in its original form: from your device, call the Azure Function via HTTPS and send the telemetry and device identity, where it can be transformed, computed, augmented, and finally sent to the Azure IoT Central application. The limitation is that your device will be one-way only, with telemetry flowing from the device to the cloud but no device twin or commands enabled for that device.
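The device-side HTTPS call described above could look something like the following sketch. The function URL, the function key placeholder, and the payload shape are illustrative assumptions for this sketch, not the device bridge's documented contract:

```python
import json
import urllib.request

# Hypothetical function endpoint -- replace with your own deployment's URL and key.
FUNCTION_URL = "https://my-compute-fn.azurewebsites.net/api/telemetry?code=<function-key>"

def build_payload(device_id: str, telemetry: dict) -> bytes:
    """Bundle the device identity with its telemetry so the function can
    transform, compute, and augment it before forwarding to IoT Central."""
    return json.dumps(
        {"device": {"deviceId": device_id}, "telemetry": telemetry}
    ).encode("utf-8")

def send_telemetry(device_id: str, telemetry: dict) -> None:
    req = urllib.request.Request(
        FUNCTION_URL,
        data=build_payload(device_id, telemetry),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # One-way only: the device never receives twin updates or commands back.
    with urllib.request.urlopen(req) as resp:
        resp.read()

payload = build_payload("device-001", {"temp": 40.5, "humidity": 36.41})
```

Because the call is a plain HTTPS POST, any device SDK or HTTP client can make it; the trade-off, as noted above, is that the channel carries telemetry only.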
The processing time in the Azure Function must not exceed 5 minutes, so ensure that the compute performed stays within this limit. Ideally, for performance and scaling, the processing and augmentation of data should take on the order of seconds. If you really need long-running jobs, you could look at converting this code to use [Azure Durable Functions](https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp), especially if you are looking at data aggregation over time windows, for example.
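The core of time-window aggregation is independent of which Functions flavor hosts it. A minimal sketch of a tumbling-window average follows; the window size and the shape of the samples are illustrative choices, not anything imposed by Azure Functions:

```python
from collections import defaultdict

def tumbling_window_avg(samples, window_seconds=60):
    """Group (unix_timestamp, value) samples into fixed, non-overlapping
    windows and return the average per window, keyed by window start time."""
    buckets = defaultdict(list)
    for ts, value in samples:
        # Integer division assigns each sample to exactly one window.
        buckets[int(ts // window_seconds)].append(value)
    return {w * window_seconds: sum(v) / len(v) for w, v in sorted(buckets.items())}

# Two samples land in the 0-60s window, one in the 60-120s window.
averages = tumbling_window_avg([(0, 10.0), (30, 20.0), (90, 40.0)], window_seconds=60)
```

State like `buckets` cannot survive between ordinary function invocations, which is exactly the gap Durable Functions' orchestration and entity state are meant to fill for long-lived aggregation.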