In an age where billions of devices are connected to the Internet, there is growing potential in tapping the big data these devices generate and processing it efficiently through various applications. IoT devices carry multiple sensors and connect to the cloud, typically via gateways. IoT cloud platforms can also be extended with services that apply advanced machine learning algorithms to predictive analysis, especially in disaster prevention and recovery planning based on data from edge devices.
Typical features include connectivity and network management, device management, data acquisition, processing, analysis and visualization, application enablement, integration, and storage.
Driven by business needs, these software blocks cover four functional areas:
- Registration, administration and updating of connected equipment.
- Acquisition and contextualization of events generated by the sensors.
- Recording and processing of data, converting it into business transactions and transmitting it in the form of commands to controllers, or any combination of the above functions.
- Hosting of IoT application components.
All major public cloud providers offer tools to address each of these areas. In some cases, these tools can be used without modifying existing applications.
In others, development teams need to integrate the tools with their applications, which is often the case for IoT deployments in industrial environments. It is therefore necessary to understand each of the four functional areas and the level of integration each one requires.
Register and update equipment
It is possible to register, manage and update connected objects as long as they communicate over Web protocols (HTTPS, MQTT). Platforms must also secure interactions between devices, enable the decommissioning of devices taken out of service, and perform other routine tasks. In this way, IoT cloud applications do not have to manage endpoints themselves.
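As a rough illustration of what such a platform handles on an application's behalf, here is a minimal registry sketch in Python. The class and field names (`DeviceRegistry`, `firmware`, `decommission`) are hypothetical, not the API of any specific cloud provider:

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Device:
    # Hypothetical device record: a real platform would also track
    # credentials, protocol (HTTPS/MQTT), and connection status.
    device_id: str
    firmware: str
    active: bool = True


class DeviceRegistry:
    """Tracks connected devices so applications need not manage endpoints."""

    def __init__(self) -> None:
        self._devices: Dict[str, Device] = {}

    def register(self, device_id: str, firmware: str) -> None:
        self._devices[device_id] = Device(device_id, firmware)

    def update_firmware(self, device_id: str, firmware: str) -> None:
        # Routine update task handled by the platform, not the application.
        self._devices[device_id].firmware = firmware

    def decommission(self, device_id: str) -> None:
        # Devices taken out of service are deactivated, not deleted,
        # so their history remains auditable.
        self._devices[device_id].active = False

    def active_devices(self) -> List[Device]:
        return [d for d in self._devices.values() if d.active]
```

A managed IoT platform exposes equivalents of these operations as services, typically over HTTPS or MQTT.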
Contextualization means ensuring that an event is correlated with the state of the system from which it originated. Developers, administrators and Ops alike should treat IoT as a source of events: devices send signals in response to real-world conditions, and those signals then trigger application processes. In some cases an event is a self-contained signal or request; in others, its context matters.
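A contextualization step can be sketched as enriching a raw event with the state of the device that produced it. The state store and field names below (`device_state`, `mode`, `line`) are illustrative assumptions, not part of any particular platform:

```python
from typing import Dict, Any

# Hypothetical state store: in practice this would be a device-shadow
# or digital-twin service maintained by the IoT platform.
device_state: Dict[str, Dict[str, Any]] = {
    "press-7": {"line": "A", "mode": "running"},
}


def contextualize(event: Dict[str, Any],
                  state: Dict[str, Dict[str, Any]] = device_state) -> Dict[str, Any]:
    """Attach the originating device's state to a raw event so downstream
    processing sees the signal together with its context."""
    ctx = state.get(event["device_id"], {})
    return {**event, "context": ctx}
```

For example, a raw `{"device_id": "press-7", "signal": "temp_high"}` event is enriched with the press's current line and operating mode, letting an application decide whether a high temperature during normal running is actually anomalous.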
Simple event flows don’t necessarily require contextualization. These pipelines can feed analytical tools directly when the main objective is simply to count events (for example, the number of finished parts passing in front of a sensor on a production line).
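The counting case above needs no context at all; a minimal sketch, assuming each event carries a hypothetical `sensor_id` field:

```python
from collections import Counter
from typing import Iterable, Dict


def count_events(stream: Iterable[Dict[str, str]]) -> Counter:
    """Count events per sensor -- e.g. finished parts passing each
    production-line sensor. No device state or context is needed."""
    counts: Counter = Counter()
    for event in stream:
        counts[event["sensor_id"]] += 1
    return counts
```

A real pipeline would consume events from a message broker rather than an in-memory list, but the aggregation logic is the same.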
These pipelines can also feed complex event processing applications. Some of these can be deployed with little or no customization, but results improve markedly when the software is tailored to the specific signals coming from the flows. This ingestion method has the advantage of making massive volumes of data easier to process.
Cloud IoT platforms support event processing in different ways: some are IoT-specific, others were originally designed for more traditional applications. The difference lies in how variable the flows are. Organizations can handle steady, predictable event volumes well with containerized solutions embedded in vendor platforms or with managed services powered by Kubernetes.
However, containers or traditional IaaS from cloud providers are poorly suited to processing events subject to strong variations: such architectures risk running out of resources during peaks and wasting them during off-peak periods. Under these conditions, hosted microservices are the better approach, and the associated economic model lets companies pay only for the resources they actually consume.
Edge computing and on-board machine learning
In principle, cloud provider services run in the provider's data centers rather than locally. However, most providers offer edge computing features that can deploy algorithms to terminals and cache collected data there, allowing events to be handled locally during a network cut or an unreliable connection. These products will, however, require a custom development phase.
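The local-handling behavior described above amounts to store-and-forward: buffer events while the uplink is down, then flush them to the cloud when connectivity returns. A minimal sketch, where `send` stands in for a hypothetical cloud-upload callable:

```python
from collections import deque
from typing import Any, Callable, Deque


class EdgeBuffer:
    """Store-and-forward sketch: queue events locally while the uplink
    is down, flush them in order once connectivity returns."""

    def __init__(self, send: Callable[[Any], None]) -> None:
        self.send = send              # uploads one event to the cloud
        self.pending: Deque[Any] = deque()

    def handle(self, event: Any, online: bool) -> None:
        if online:
            # Drain anything buffered during the outage first,
            # preserving event order, then send the new event.
            self.flush()
            self.send(event)
        else:
            self.pending.append(event)

    def flush(self) -> None:
        while self.pending:
            self.send(self.pending.popleft())
```

Production edge runtimes add bounded queue sizes, persistence across restarts, and retry logic, but the core pattern is the same.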
OpenStack for Edge Computing
Among infrastructure software, OpenStack is already the most widely distributed. It provides the building blocks for a robust infrastructure that can be deployed in a location of your choice, which makes it a natural fit for the network edge. OpenStack and the edge therefore have a long journey ahead of them.
With the work the OSF Edge Computing Group is putting in, use of OpenStack at the network edge is on the rise in several sectors. Do not miss out on the benefits of edge computing combined with the adaptability of an OpenStack-based cloud. Reach out to us to learn more about how OpenStack can take you to the edge.