Containers help enterprises decentralize services and consolidate hardware
Cloud computing’s role in enterprise network infrastructure is immense and still evolving. Network architectures built on the traditional cloud computing model alone are not sufficient for applications that require real-time response and low latency at the edge. Additionally, the widespread adoption of IoT is forcing enterprises to consider complementary network architecture models to reduce the network bandwidth and cost associated with transferring huge amounts of data to centralized cloud applications.
As a result, edge-oriented IoT architectures, in which intelligence moves from the cloud to the edge, are gaining prominence as more organizations adopt edge computing.
What is Edge Computing?
Edge computing is a computing paradigm in which compute resources process data at or near the source where it is generated. Until a few years ago, the role of edge computing in the IoT context was confined to gathering, filtering, and sending data to applications in the cloud. Thanks to plummeting silicon costs and the miniaturization of silicon devices, far more computing power now fits into smaller-footprint edge devices such as IoT gateways and routers. As a result, modern edge infrastructure is capable of processing analytics, running machine learning models, and taking action at the edge in real time.
As a result, edge computing is playing an increasingly important role in IoT deployments across industries. Gartner predicts that by 2022, 75% of enterprise-generated data will be created and processed outside of the traditional, centralized cloud or data center.
Edge containers enable enterprises to decentralize services by moving key components of their application workloads and services to the edge environment, while they can still be managed centrally from the cloud and integrated with a core set of services running in the cloud or data center. By moving intelligence to the edge of the network, enterprises can reduce network costs drastically and deliver low latency for applications that require real-time response.
Why Edge Containers?
Historically, organizations deployed applications on “bare metal” servers, where a single operating system has full control of the underlying hardware. But such a server is confined to a single tenant and utilizes hardware resources poorly. Additionally, applications built on bare metal servers are hard to move and upgrade because they are tied to an operating system installed on specific hardware.
As a result, “virtualization” became popular: a hypervisor running on a physical server creates multiple virtual machines (VMs), each with its own operating system, that share the underlying hardware resources. Over the years, VM-based applications have become dominant in cloud and data center infrastructure, as they are portable and resource-efficient compared to those built on bare metal. But because each VM carries a full operating system (OS) along with the application and its associated binaries and libraries, VMs have a lot of overhead and are slow to start.
This led to the rise of containers. Containers virtualize at the operating system level rather than at the hardware level, as VMs do. Docker is the most popular container technology, and it became the industry standard with its open source Docker Engine. So, what is a container? It is a unit of software that packages application code and all its dependencies so the application runs independently and can be moved from one computing environment to another consistently and easily. As a result, containers continue to see widespread adoption as enterprises deploy container-based applications virtually everywhere: in data center, cloud, and edge infrastructure.
Both containers and VMs are popular choices for developers building applications, as both offer clear isolation and allocation of resources for applications running on the same platform.
Containers are particularly well suited for edge computing because they are:
- Lightweight — Containers virtualize at the OS level and share the host kernel, so they are far more lightweight than VMs and well suited to edge environments.
- More Portable — Because containers are based on the widespread Docker image format, they are highly portable and run anywhere: data center, cloud, or edge. This makes it easy to “lift and shift” container applications from cloud to edge. For example, customers can develop and train machine learning models in the cloud and deploy them at the edge.
- Faster Boot Time — Containers do not include an OS image, so they start in seconds, whereas VM-based applications take minutes to boot.
- More Efficient — Because they are lightweight, containers can be deployed, migrated, and upgraded faster on distributed edge infrastructure than VM-based applications.
NetCloud Container Orchestrator for Deploying Edge Containers
Cradlepoint’s NetCloud Container Orchestrator (NCCO) is a new capability that allows customers to run lightweight containers on Cradlepoint routers deployed at the edge. NCCO has two key components:
- NetCloud Edge Container — A runtime on Cradlepoint routers for running Open Container Initiative (OCI)-compatible Docker container workloads at the edge.
- NetCloud Container Orchestration — The ability to deploy, monitor, and manage these container workloads from the cloud with NetCloud Manager.
Partners and customers who want to leverage NetCloud Container Orchestrator can host and pull container images from any Docker container registry, such as Docker Hub.
NCCO supports Docker Compose, a tool for defining and running multi-container Docker applications. With Docker Compose, customers use a YAML file to configure and start application services.
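For illustration, a minimal Compose file of the kind described above might look like the sketch below. The service name, image, port, and environment variable are hypothetical placeholders, not part of any Cradlepoint or NCCO specification:

```yaml
version: "3"
services:
  sensor-filter:                       # hypothetical edge workload
    image: example/sensor-filter:1.0   # placeholder image name
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      - CLOUD_ENDPOINT=https://example.com/ingest   # placeholder endpoint
```

A file like this defines the services to run, how they restart, and how they reach the network, which is what an orchestrator then deploys and monitors at scale.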
With NCCO, Cradlepoint turns its Wireless WAN routers into an edge compute platform and enables customers to deploy lightweight edge containers onto it. NetCloud Manager provides a single pane of glass for managing WAN connectivity and container workloads at the edge at scale.
Use Cases for Containers on Cradlepoint Devices
There are several edge compute use cases for containers running on Cradlepoint’s Wireless WAN edge infrastructure across branch, mobile, and IoT segments:
- Preprocess and Filter IoT Data — Containers running on a Cradlepoint router at the edge can process data from sensors and devices connected to the router. These containers can preprocess and/or filter much of the raw data from all sensors and send only relevant data to the cloud. Customers deploying IoT solutions in manufacturing floors, retail environments, smart buildings, and smart cities can save bandwidth and costs associated with transferring raw IoT data from sensors to the cloud.
- Real-Time Response at the Edge — In mobile environments such as school or public buses, customers can deploy edge containers for public safety solutions. For example, containers running on Cradlepoint routers can trigger video recording from connected cameras in the buses when a driver or passenger presses a panic button and send the footage to the cloud for a public safety agency to respond to the incident in real time.
- Offline Device Operation — Cradlepoint supports AWS IoT Greengrass containers for IoT deployments based on AWS Cloud. Greengrass is designed to operate when connectivity to the AWS Cloud is intermittent or lost. As a result, Greengrass containers are ideal for remote monitoring applications that gather, process, and store IoT data locally when the connection to the cloud is lost. When the device is back online, Greengrass containers can synchronize the data stored on the local device with cloud services, providing seamless functionality regardless of connectivity.
- Low-Latency Applications — Applications that require time-sensitive action suffer when data must travel to the cloud for processing and analysis before action can be taken at the edge. For example, industrial customers can use NCCO to deploy containers that make decisions rapidly based on events generated by SCADA systems at the edge and control (turn on or off) other systems in real time, without sending data to the cloud.
- Privacy and Regulatory Compliance — In industries such as healthcare, there are privacy and regulatory requirements for enterprises sending sensitive personal data of patients to applications in the cloud. Containers running at the edge help these enterprises filter Personally Identifiable Information (PII) and send only preprocessed data to the cloud.
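As a minimal sketch of the preprocessing and filtering idea in the first use case above, the following Python snippet keeps only sensor readings that deviate from a baseline, so that just the relevant data would be forwarded to the cloud. The field names, threshold, and baseline are illustrative assumptions, not part of any Cradlepoint API:

```python
# Edge-side filtering sketch: forward only readings that deviate from a
# baseline by more than a threshold, discarding routine data locally.
# All names and values here are hypothetical examples.

def filter_readings(readings, baseline, threshold=5.0):
    """Return only readings whose temperature deviates from baseline
    by more than threshold degrees."""
    return [r for r in readings if abs(r["temp_c"] - baseline) > threshold]

readings = [
    {"sensor": "s1", "temp_c": 21.0},
    {"sensor": "s2", "temp_c": 35.5},   # anomaly worth sending upstream
    {"sensor": "s3", "temp_c": 22.4},
]

# Only the anomalous reading from "s2" survives the filter.
relevant = filter_readings(readings, baseline=22.0)
```

A container running logic like this on the router would send `relevant` upstream instead of the full raw stream, which is where the bandwidth and cost savings come from.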
In addition to edge computing, there are other use cases for edge containers on Cradlepoint routers. One worth highlighting is box consolidation at the edge. For example, customers use legacy protocols such as Modbus to connect and manage SCADA systems in industrial IoT, or CAN bus or BACnet for building management systems in branch deployments. These customers traditionally use multiple boxes: one to run protocol-converter applications and another for WAN connectivity. With NCCO, a single Cradlepoint router can provide both connectivity and compute for containers that translate these legacy protocols into standard IP formats, so these systems can be managed from the cloud.
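To make the protocol-translation idea concrete, here is a hedged Python sketch of the kind of mapping such a container might perform: raw Modbus-style holding-register values converted into a JSON payload an IP-based cloud service could consume. The register addresses, field names, and scaling factors are all hypothetical:

```python
# Protocol-translation sketch: map raw Modbus-style register values to a
# named, scaled JSON payload. The register map below is a made-up example,
# not a real device profile.
import json

REGISTER_MAP = {
    40001: ("pump_pressure_kpa", 0.1),  # raw value scaled by 0.1
    40002: ("flow_rate_lpm", 1.0),      # raw value used as-is
}

def registers_to_json(raw_registers):
    """Convert {address: raw_int} into a JSON string of named, scaled values."""
    payload = {
        name: raw_registers[addr] * scale
        for addr, (name, scale) in REGISTER_MAP.items()
        if addr in raw_registers
    }
    return json.dumps(payload)

# Example: two raw register reads translated into a cloud-ready message.
msg = registers_to_json({40001: 523, 40002: 42})
```

In a real deployment, the raw values would come from polling the serial or TCP side of the device, and `msg` would be published to a cloud endpoint over the router's WAN link.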
Likewise, IoT deployments in branch or mobile environments that use separate appliances to run applications such as IoT security or Node-RED can now run those applications as containers on Cradlepoint routers, consolidating boxes at the edge.
We look forward to seeing how Cradlepoint customers and partners will leverage NetCloud Container Orchestrator to solve customer problems we have not thought of yet.