By Michel Chabroux
At Wind River, we’re moving on several fronts toward one key goal: helping embedded systems developers build better software, faster. For example:
• Our recent debut of Wind River Studio introduced a new cloud-native platform for the development, deployment, operations, and servicing of mission-critical intelligent systems.
• VxWorks® recently became the first and only real-time operating system (RTOS) to support C++17, Boost, Python, and Rust.
• With Wind River Labs, we share innovative new technologies. For VxWorks, for example, we offer the Robot Operating System 2 (ROS 2) framework and Internet of Things (IoT) software development kits (SDKs) for Amazon Web Services (AWS), Microsoft® Azure, and Google Cloud.
Now I want to tell you about another initiative we’ve been working on that will change the way embedded real-time systems are developed and deployed.
Real-time systems are being incorporated into larger and more complex environments every day. A fighter jet, for example, is built from many different compute systems that may each be running a different operating system. (I discuss this fighter jet example further in this video, which you should really check out. But the same principles apply to self-driving cars, automated factories, and any number of other scenarios.)
The question is, how do we accelerate the deployment of software on such large systems? How do we make the deployment process uniform so that there’s no workflow change from one subsystem to another? The answer that we envision is a containerized local infrastructure — or edge cloud — in the plane, car, or factory, ready to serve the software for the various subsystems. That edge cloud connects in turn to another cloud, which allows you to push information and software updates in order to manage and orchestrate your heterogeneous software subsystems.
To pursue this vision, we’re delivering support for Open Container Initiative (OCI)–compliant containers in VxWorks. These containers let you use the same cloud infrastructure, the same tooling, and the same workflow that you would use for any other application in a more traditional IT environment. With container support, the RTOS world becomes more accessible to modern application development practices, IT methods, and DevOps workflows.
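To make the "same tooling, same workflow" point concrete, here is a hedged sketch using skopeo, a standard OCI image tool; the registry address and image name are hypothetical, but the commands are the same ones an IT team would use for any Linux container image:

```shell
# Copy a (hypothetical) application image from an ordinary OCI registry
# into a local OCI layout directory -- no RTOS-specific tooling involved.
skopeo copy docker://registry.example.com/myapp:1.0 oci:./myapp-oci:1.0

# Inspect the image metadata with the same workflow as any other container.
skopeo inspect oci:./myapp-oci:1.0
```

Because the image format is standard OCI, the same registries, CI pipelines, and signing and scanning tools apply unchanged.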
VxWorks containers work with the Wind River Studio cloud, enabling deployment and management of applications at scale across heterogeneous systems using a unified technology. For the first time, an RTOS becomes a “full citizen” in a cloud-native infrastructure.
Given Docker’s large footprint, we couldn’t simply add it to VxWorks. The VxWorks container runtime has a small footprint of less than 100 KB, compared to Docker’s footprint of more than 2 GB. That’s four orders of magnitude of difference! Even when you add container management into the mix, the VxWorks container engine is still under 400 KB. The challenge was not only to work within a small footprint but also to preserve the deterministic and certifiable environment at the heart of the VxWorks RTOS.
From a tooling point of view, it was imperative that developers not have to use anything different. So we standardized on Buildah and use it exactly as it is. Nothing extra!
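As a sketch of what that looks like in practice, the following uses stock Buildah commands; the application binary name (myapp.vxe) and image name are hypothetical placeholders, not names from the VxWorks product:

```shell
# Start a new working container from an empty base image.
ctr=$(buildah from scratch)

# Copy the (hypothetical) application binary into the image root.
buildah copy "$ctr" myapp.vxe /

# Set the entrypoint so the runtime knows what to launch.
buildah config --entrypoint '["/myapp.vxe"]' "$ctr"

# Commit the working container as an OCI image.
buildah commit "$ctr" myapp-image

# Export the image as a local OCI layout (or push it to a registry).
buildah push myapp-image oci:./myapp-oci
```

Every step here is plain, unmodified Buildah, which is exactly the point: the build workflow is the one IT teams already know.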
This means the time is rapidly approaching when you won’t need a VxWorks expert to deploy VxWorks applications; instead, you’ll be able to deploy at scale with standard IT tools and methods.
Want to learn more about our vision for the intelligent edge? Check out this paper.