When visualizing a new IoT application, carefully consider not what data flows you want, but what data flows your application needs to be successful.
Data flows are one of the key constraints in the design of any IoT application. Data flows drive not just communications cost; they also indirectly control communication technology selection, power needs, and the basic paradigm of an application’s functionality. As you begin visualizing your new IoT application, think carefully about the data and communication patterns that support your planned features.
IoT is the confluence of smart and connected in a remote device. I assume that if you’re reading this, you have a “device” side and a “user” application side that you are thinking about connecting. Your devices could be in close proximity to your user, in their home or office, or anywhere at all. The user application where the device data initially lands could be a smartphone or, in many cases, a cloud platform. Throughout this discussion, assume the connection is over a wireless link; this is by far the predominant pattern for typical IoT (non-IIoT, non-manufacturing) use cases.
First, let’s differentiate between “data” and “data flows.”
Data: what is measured.
Data flows: what is communicated.
Sure, more measurements will likely ramp up the amount of data sent to and from your device, but more data is not a linear predictor of how much an application needs to communicate with a user. As the power of IoT MCU chips increases, devices can steadily do more processing locally and communicate only a summary of relevant events and periodic data points.
Communications are power-hungry compared with computation and memory on an IoT device. The more you can keep your radio turned off, the more battery life remains. There are low-power wireless technologies like Bluetooth Low Energy (BLE) for short-range communications, but what if your device is far away? Radios vary in their performance profiles, and there are numerous articles out there comparing WiFi vs. LoRa vs. LTE. Know your communications stack. Next, I lay out some concepts to consider regardless of which type of radio is in your device.
Most IoT projects fall into two broad categories: interactive applications and remote monitoring. These two patterns dictate many aspects of the data flows your application will need to perform and when your communications hardware needs to be turned on.
Interactive applications place the user and device in virtual proximity, with physical distance ranging from a few feet to miles to wherever. The communication flows bridge that physical distance. This application pattern is the most demanding from a communications perspective.
Interactive communications require that a device radio stay on to listen for user input. This could be constrained to a specific interval of interest rather than 24 hours a day: maybe the communications channel can be predictably enabled during “business hours” or only during predicted device “usage” times. The key point is that a radio that is on all the time increases power consumption considerably, which in turn makes it harder for off-grid or solar applications to harvest and store enough power.
The interactive application pattern is so demanding that you may find variations necessary to make things work. Consider delaying user input by minutes or even hours, opening windows for user control of the device.
The second pattern, remote monitoring, is much easier to implement from a communication and power budget perspective. Devices can wake up occasionally, gather and locally store data, assess the situation, and then decide if communication is required. The radio stays off until it is needed to send data to the user, then goes back off until the next communication.
The remote monitoring pattern can be integrated with an interactive application by making use of the moments when the radio is already turned on: when you periodically send data, check for user directives. This approach is standardized in LoRaWAN Class A devices, which open receive windows for user input 1 and 2 seconds after transmitting their data.
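The piggyback pattern above can be sketched in a few lines. This is a minimal illustration, not a real radio driver: `send_uplink` and `receive_downlink` are hypothetical stand-ins for whatever transmit/receive calls your communications stack provides.

```python
# Sketch of piggybacking downlink checks on a scheduled uplink,
# in the spirit of LoRaWAN Class A receive windows.
# send_uplink() and receive_downlink() are hypothetical stand-ins
# for your radio stack's transmit/receive calls.

RX_WINDOW_SECONDS = 2  # listen briefly right after each transmit

def report_and_check(payload, send_uplink, receive_downlink):
    """Transmit sensor data, then listen briefly for user directives."""
    send_uplink(payload)
    # The radio is already powered up, so a short receive window is cheap
    # compared with keeping the radio on all day waiting for commands.
    command = receive_downlink(timeout=RX_WINDOW_SECONDS)
    return command  # None if the user had nothing queued
```

The trade-off is latency: a user directive waits in a queue until the device’s next scheduled uplink, which is exactly the “delayed user input” variation described earlier.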
IoT applications typically use protocols such as MQTT or HTTP to package their data in transit. MQTT, HTTP, AMQP, and other IoT communication protocols add protocol data on top of the payload data being transmitted between the device and the user. The amount of data communicated typically increases in two ways: framing overhead and keepalives.
Framing overhead is the extra data sent along with an application’s data to make communications more robust and reliable. Think of protocol framing as the envelope you put your physical correspondence in. In the case of MQTT, the per-message overhead is 6 bytes plus the MQTT topic name your device is publishing to. This can add up and, in some cases, exceed the size of the payload data you are sending to the user side. It is important to note that while MQTT transmits these extra bytes with your message payload, MQTT is still more efficient than AMQP and HTTP, which is why it is so often used in IoT systems.
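To make the envelope concrete, here is a rough size estimate for a small MQTT 3.1.1 PUBLISH packet. The 6-byte figure above corresponds to QoS 1 (2-byte fixed header, 2-byte topic length prefix, 2-byte packet identifier); the topic name `home/garage/temp` is an illustrative assumption, and larger packets add a few more bytes for the remaining-length field.

```python
def mqtt_publish_bytes(topic: str, payload: bytes, qos: int = 1) -> int:
    """Rough on-the-wire size of a small MQTT 3.1.1 PUBLISH packet.

    Assumes the remaining-length field fits in one byte (packets
    under 128 bytes); larger packets add 1-3 more bytes.
    """
    fixed_header = 2                       # packet type/flags + remaining length
    topic_field = 2 + len(topic.encode())  # 2-byte length prefix + topic name
    packet_id = 2 if qos > 0 else 0        # QoS 1/2 carry a packet identifier
    return fixed_header + topic_field + packet_id + len(payload)

# A 4-byte reading on a 16-character topic: 6 bytes of framing plus the
# 16-byte topic name = 22 bytes of overhead, dwarfing the payload itself.
overhead = mqtt_publish_bytes("home/garage/temp", b"21.5") - len(b"21.5")
```

Shortening topic names is therefore one of the cheapest data-budget optimizations available.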
The other protocol tax is keepalive messaging (sometimes referred to as heartbeats). MQTT implementations typically perform a keepalive action every 1–4 minutes; this period is referred to as the keepalive interval. Keepalives are not required if data has been transmitted recently. To keep communications active, MQTT sends a 2-byte PING when the keepalive interval ends. The interval timer is reset with each transmission, whether a PING or payload data.
Most implementations let you lengthen the keepalive interval (reducing the number of keepalives sent), but each system typically imposes an upper limit. Azure IoT Hub uses MQTT extensively and limits the keepalive interval to a maximum of 1177 seconds, or once every 19 minutes and 37 seconds (Understand Azure IoT Hub MQTT Support).
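A quick calculation shows what stretching the keepalive interval buys on an otherwise idle connection. The 60-second default used here is an assumption for comparison (many MQTT clients default to 60 seconds), and only the device-side 2-byte PINGREQ is counted; the broker’s PINGRESP adds a similar amount on the downlink.

```python
# Daily PINGREQ traffic for an otherwise idle MQTT connection.
# Each keepalive is a 2-byte PINGREQ from the device; the broker's
# 2-byte PINGRESP comes back on the downlink and is not counted here.
SECONDS_PER_DAY = 24 * 60 * 60

def daily_ping_bytes(keepalive_seconds: int, ping_size: int = 2) -> int:
    pings = SECONDS_PER_DAY // keepalive_seconds
    return pings * ping_size

default = daily_ping_bytes(60)      # 1,440 pings -> 2,880 bytes/day
stretched = daily_ping_bytes(1177)  # IoT Hub's maximum: 73 pings -> 146 bytes/day
```

The byte counts are tiny, but each ping also wakes the radio, so the real saving is in power, not bandwidth.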
When reviewing data and deciding what to send back and forth, think about ways to eliminate or reduce application data flows. Take note of how big each flow is, how often data is sent, and what your communications channel is doing when nothing is happening.
There are tools online (IoT Bandwidth Estimation Tool) to help visualize your data budget and be proactive in planning your data communications.
Time adds up… fast! Sending 500 characters of data every 20 seconds:
180 times / hour 90KB / hour
4,320 times / day 2,160KB (≈2.2MB) / day
30,240 times / week 15,120KB (≈15.1MB) / week
129,600 times / month 64,800KB (≈64.8MB) / month
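The table’s arithmetic is worth carrying with you; it is just sends-per-period times payload size (assuming a 30-day month):

```python
# Data-budget arithmetic for the table above: 500 bytes sent
# every 20 seconds, counted over successive periods.
PAYLOAD_BYTES = 500
INTERVAL_SECONDS = 20

def budget(period_seconds: int) -> tuple[int, int]:
    """Return (number of sends, total bytes) for one period."""
    sends = period_seconds // INTERVAL_SECONDS
    return sends, sends * PAYLOAD_BYTES

hour = budget(3600)                # (180, 90_000)          ->  90 KB/hour
day = budget(24 * 3600)            # (4_320, 2_160_000)     -> ~2.2 MB/day
week = budget(7 * 24 * 3600)       # (30_240, 15_120_000)   -> ~15.1 MB/week
month = budget(30 * 24 * 3600)     # (129_600, 64_800_000)  -> ~64.8 MB/month
```

Plugging in your own payload size and interval makes it easy to see whether a planned flow fits your connectivity plan before any hardware exists.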
Remember every character you send can increase costs and draws down your device’s battery.
- After sampling remote data, look for ways to summarize prior to sending. For example: consider sending maximum, minimum, average, and number of data points over a specific period.
- Similarly, once a maximum and minimum are established, consider sending data only when a new outlying maximum or minimum has been observed.
- For remote sensing, consider only sending data once a day or even once a week, but send data events when something significant has been observed at the device.
- Consider building normal limits in your device software. When the data being sensed leaves these limits, then communicate and report the event to the user.
- Log your data locally on the device and send a block of data (a day’s or week’s worth) at one time. Once the radio is on, use it, then shut it off. Every time the radio is turned on and off, power is wasted before and after the data is actually sent.
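Several of the tips above combine naturally: summarize a local log into one small report, and break the schedule only when a reading leaves a “normal” band. This is a minimal sketch; the temperature limits and field names are illustrative assumptions, not a prescribed format.

```python
# Sketch of on-device summarization with "normal limits": log readings
# locally, then send either a compact summary or an immediate alert.
# The limits and field names here are illustrative assumptions.

LOW_LIMIT, HIGH_LIMIT = 0.0, 35.0  # example "normal" band for a temperature sensor

def summarize(readings):
    """Collapse a window of raw readings into one small report."""
    return {
        "min": min(readings),
        "max": max(readings),
        "avg": sum(readings) / len(readings),
        "count": len(readings),
    }

def needs_alert(reading):
    """True when a reading leaves the normal band and warrants an event."""
    return reading < LOW_LIMIT or reading > HIGH_LIMIT

readings = [18.2, 19.0, 21.5, 20.3]
report = summarize(readings)  # one ~4-field message instead of 4 raw sends
```

Here four raw transmissions collapse into a single report, and the radio only interrupts the schedule when `needs_alert` fires.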
Only you can determine when a piece of data being sent is valuable. Is that piece of data something you want… or is it something you need?