What would you do with
15,000 more hours?      

Calculate the benefits of digital transformation with Wind River Studio developer capabilities. With this calculator, you can see how your annual productivity and return on investment can improve.

 

Answer the questions below to see your results.

What is your company’s total revenue?
Company Revenue (range: $0 to $10,000,000,000)
How many developers work on your projects?
Number of Developers (range: 0 to 1,000)
How far along are you in your digital transformation journey?
Digital Transformation Progress
Digital transformation is the adoption of digital technology to transform business operations by replacing non-digital or manual processes with digital ones.
 
 
1 IN PROCESS
2 SEEING RESULTS
3 MAINSTREAMING
4 OPTIMIZING
5 LEADER

Estimated First Year ROI

Returns: $18,741,120
Investments: $5,640,000
Return on Investment: x32%

BENEFIT METRICS

Annual productivity improvement: $3,714,120
Risk mitigation savings: $27,000
Time-to-market improvement: $15,000,000

Investment

Implementation fees (one-time): $1,080,000
Subscription fees (ongoing): $4,560,000
Hosting fees (annual): $3,300

Your ROI was calculated using average annual development costs.

To get a customized report using your specific development cost, get in touch with a Wind River expert.
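For readers who want to see how figures like the ones above fit together, here is a minimal sketch in Python. The line items are taken from the example results shown above; the ROI formula itself (net gain divided by investment) is an assumption made for illustration, not Wind River's published calculation.

```python
# Illustrative only: how the example figures above could combine.
# The ROI convention used here (net gain / investment) is an assumption.

benefits = {
    "annual_productivity_improvement": 3_714_120,
    "risk_mitigation_savings": 27_000,
    "time_to_market_improvement": 15_000_000,
}

investments = {
    "implementation_fees_one_time": 1_080_000,
    "subscription_fees_ongoing": 4_560_000,
    # Hosting fees ($3,300) appear to be excluded from the rounded
    # "Investments" total shown above.
}

returns = sum(benefits.values())        # 18,741,120, matching the "Returns" figure
investment = sum(investments.values())  # 5,640,000, matching the "Investments" figure

roi = (returns - investment) / investment  # assumed first-year ROI convention
print(f"Returns: ${returns:,}")
print(f"Investment: ${investment:,}")
print(f"Estimated first-year ROI: {roi:.0%}")
```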

Talk to an expert
 

On the Cutting Edge: Part 4
Robot MD

Robots in the operating room19

Surgeons use million-dollar robotic devices in operations that require more precision, an expanded range of motion, and greater control than their hands alone can provide. The robot assists with suturing, dissecting, and retracting tissue, but surgeons remain in control. Dr. Benjamin Tee, at the National University of Singapore, has long been captivated by a scene in Star Wars: The Empire Strikes Back in which a surgical droid replaces Luke Skywalker’s hand after Darth Vader slices it off with a lightsaber.

“A fully autonomous robot surgeon is the Holy Grail,” although still many years off, says Dr. Tee. But he and other researchers are developing devices that can perform surgical tasks with minimal human oversight. Dr. Tee’s latest project is an artificial skin that would give robots a sense of touch, allowing them to differentiate between healthy tissue and tumors and make surgical incisions. Other researchers are working on robotic surgical assistants that use machine learning and AI to avoid soft tissue, for less invasive surgeries.

To date, fully autonomous surgical robots remain in the sphere of science fiction. Still, early signs show that robots could eventually perform certain surgical procedures quickly and more accurately than humans. “That’s something where robots really shine — precision, repeatability. And they don’t get tired,” says Axel Krieger, a robotics research professor at Johns Hopkins University.


From science fiction to reality: The scene in Star Wars: The Empire Strikes Back in which a surgical droid replaces Luke Skywalker's hand with a prosthetic may not be so far off.

On the Cutting Edge: Part 5
Key Questions to Ask
About Your Robotics System

The business implications of the new intelligent systems world are clear: the dynamics of decision-making around robotic systems are evolving rapidly. What we might once have seen as incremental steps now become opportunities for transformation.

Work through these five key questions as a team that includes business, strategy, development, and operations members. Using a simple scale from 1 (somewhat irrelevant) to 5 (a significant opportunity), you can gauge how fully you are capturing the opportunity in each area.

If you score between 1 and 2 on any question, think through the ideas in this article to consider whether your robotic systems are missing significant opportunities.

 

Contact Wind River to discuss your plans for your robotics system with an intelligent systems expert.

On the Cutting Edge: Part 3
3 Ways to Simplify

The McKinsey Simplification Model: 3 ways to incorporate robots in the automation continuum

Advanced sensor technologies, more potent computing power, and edge processing provide robots with AI capabilities. To drive broader adoption, greater value, and further growth, McKinsey has developed a model15 that synthesizes industry recommendations into a single concept — simplification in three essential areas:

1. Simpler to Apply

Robot developers and integrators need to make it easier for potential end users to envision compelling scenarios. Simplification in this realm could mean something as basic as providing software that closes the gap between conceivability and installation, helping end users prove their design concepts before committing to a final investment. A prime example comes from ABB Robotics:

Visitors to the company’s website are given access to a build-your-own cobot application. Working with intuitive menus, users can browse for functions they need, with options including part handling, screwdriving, visual inspection, and “tell us more.” Users go on to select how the cobot picks up parts and puts them down; where its vision sensors are placed; what communications protocols are used; and whether the cobot will be mounted on a wall, table, or ceiling. Illustrations clarify the choices throughout. Once the selections are complete, the program evaluates them and delivers a customized video simulation of how the fully installed cobot would perform.

2. Simpler to Connect

McKinsey advises that robot manufacturers need to deliver secure, flexible connectivity. A key goal is to achieve interoperability. The robots should be able to readily connect not only with other robots but also with the full range of intelligent systems, edge, cloud, analytics, and similar tools and devices.

Cobots rely on multiple sensors and tools such as AI to make sense of and operate safely in the world around them. Simultaneously, the environments they are installed in or travel through feature multiple sensor-intensive intelligent devices. The challenge is that IoT and robotics technology are often considered separate fields.16 Thus the synergies across the two disciplines go unexplored. But reimagined together, IoT and industrial robotics become the Internet of Robotic Things, or IoRT.

To date, robotics and IoT have been driven by varying yet highly related objectives. IoT focuses on supporting services for pervasive sensing, monitoring, and tracking, while the robotics community focuses on production, action, interaction, and autonomous behavior. Fusing the two fields creates a wider-scale digital presence in which intelligent sensors and data analytics feed better situational awareness to robots, helping them execute their tasks more effectively. In short, the robots have access to more data for analysis and decision-making. Then edge computing opens the door for even more intimate collaborations between machines and between man and machine.17
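As a rough illustration of the IoRT idea, the sketch below wires a hypothetical cobot to a small in-process publish/subscribe bus so that readings from environmental IoT sensors adjust how the robot paces its task. Everything here (the Broker class, topic names, and the conveyor example) is an illustrative assumption; a real deployment would use an industrial messaging layer such as MQTT or DDS rather than this toy bus.

```python
from collections import defaultdict

class Broker:
    """Toy in-process pub/sub bus standing in for an IoRT messaging layer."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subs[topic]:
            handler(payload)

class PickingCobot:
    """Hypothetical cobot whose pick cadence follows ambient sensor data."""
    MAX_PICKS_PER_MIN = 30

    def __init__(self, broker):
        self.picks_per_min = self.MAX_PICKS_PER_MIN
        broker.subscribe("line/conveyor_speed", self.on_conveyor_speed)

    def on_conveyor_speed(self, reading):
        # Situational awareness from IoT sensors informs the robot's pacing:
        # never try to pick faster than parts actually arrive.
        self.picks_per_min = min(self.MAX_PICKS_PER_MIN, reading["parts_per_min"])

broker = Broker()
cobot = PickingCobot(broker)
broker.publish("line/conveyor_speed", {"parts_per_min": 18})
print(cobot.picks_per_min)  # 18: the cobot adapted to upstream conditions
```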

“Potential end users will be increasingly likely to envision use cases, as robots and the talent needed to deploy them become more available.”
 
McKinsey & Company
The estimated share of the industrial robot market taken by cobots by 2027 (Robotic Industries Association) 14

3. Simpler to Run

Paradoxically, as robots become ever more sophisticated, capable, and flexible, the effort required by end users to train them often declines. Leading manufacturers understand that shortening the learning cycle is an important means of elevating the appeal of industrial robots:

“Fanuc wants to make robots easier to train, therefore making automation more accessible to more industries,” observes TechCrunch reporter Catherine Shu.18 Accordingly, the company is harnessing AI and related technologies to accelerate teaching and learning processes. Similarly, Locus Robotics advertises warehouse robots that are so easy to train they can be deployed in just four weeks. Interfaces and tools that drive robotic learning are becoming simpler, clearer, and more efficient for end users. Such improvements are a key focus across the industry.

Improvements in interfaces and tools that drive robotic learning are a key focus across the industry.

On the Cutting Edge: Part 2
Top 4 Challenges

Solving the top 4 challenges of sensor-driven collaboration

The potential payoffs of sensor-driven collaboration are huge — from protecting workers during the pandemic to boosting profitability and productivity to opening new revenue streams through innovative breakthroughs. But the automation continuum, which involves many different players and reams of data, comes with challenges. Fortunately, the four principal challenges can be solved using the same technologies that made the automation continuum possible in the first place.

1. Proximity to Humans

Vulnerable humans working amid powerful machines is an inherently risky combination. The traditional approach has been to effectively bar humans from working around active robots: shields, guardrails, and even completely separate rooms or buildings are employed. But in an era of cobots this will no longer be feasible, as humans will increasingly share close quarters with machines.

The good news is that edge technologies can be harnessed to protect humans. Ryan Braman, director of commercial products at TÜV Rheinland, a global robotics certification consultancy, says, “New types of sensing technologies, such as laser scanners, radars, and other types of electro-sensitive protective equipment, allow robots to sense where safety-related objects such as humans or stairs are and adjust their path or stop movement until the object is removed, enabling robots to work safely alongside humans. As the technology improves, the range of tasks that robots will be able to take on will also greatly improve.”9
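The behavior Braman describes (sense a safety-related object, then slow or stop until it clears the area) reduces to a small piece of control logic. The sketch below is purely illustrative: the thresholds and sensor interface are assumptions, and a real collaborative cell would rely on certified, safety-rated hardware and standards such as ISO/TS 15066 rather than application code like this.

```python
# Illustrative sketch of a sensor-driven safety reaction, assuming a
# hypothetical proximity sensor that reports the nearest object's distance.

STOP_DISTANCE_M = 0.5     # stop if an object is closer than this
SLOW_DISTANCE_M = 2.0     # reduce speed inside this radius

def safety_speed_scale(nearest_object_distance_m: float) -> float:
    """Return a speed multiplier (0.0 = stop, 1.0 = full speed)."""
    if nearest_object_distance_m < STOP_DISTANCE_M:
        return 0.0
    if nearest_object_distance_m < SLOW_DISTANCE_M:
        # Scale speed linearly between the stop and slow thresholds.
        span = SLOW_DISTANCE_M - STOP_DISTANCE_M
        return (nearest_object_distance_m - STOP_DISTANCE_M) / span
    return 1.0

for distance in (3.0, 1.25, 0.3):
    print(distance, "->", round(safety_speed_scale(distance), 2))
# 3.0 -> 1.0, 1.25 -> 0.5, 0.3 -> 0.0
```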

Rigorous testing is required before a system can be considered safe. “Manufacturers will tell you they have a safe system right out of the box. But all they’re really giving you is the ability to create a safe cell. You have to consider the application and interact with it and test it rigorously,” says Braman.

Reality dictates that safety needs to be an integral part of design and development from the start. YuMi is a basic but highly programmable service robot from ABB Robotics. According to Nick O’Donnell, ABB global external affairs manager, “YuMi was designed to safely work in the immediate vicinity of human coworkers, even in the event of unintended contact. It has physical and software safety features including lightweight soft-padded arms; motion control software; speed-limited hardware; and no pinch points on each of its dual, seven-axis arms.”

85%
Idle-time reduction when people work collaboratively with a human-aware robot compared to working in all-human teams (Professor Julie Shah, director, Interactive Robotics Group, MIT) 6
50%
The cycle time for simple assembly tasks can be roughly halved using human-robot collaboration as opposed to humans alone (Veo Robotics) 7
30–40%
The rise in efficiency of certain industrial operations due to human-robot collaboration (Robotics Market Trends) 8

2. Data Overload

A vastly higher level of machine awareness makes for an industrial environment rich in sensor-derived data but potentially stretched too thin in the processing and analytics departments. Traditional computing strategies and frameworks can be overwhelmed, sabotaging the benefits of a robot-intensive workforce.

The solution is at the edge. “Pushing all this data out somewhere else for processing — into the cloud — is no longer practical, nor does it make sense,” says Wind River Senior Director of Product Management Michel Chabroux. At the edge in robotics, “we increase productivity, because with artificial intelligence and access to so much data, the robot can make decisions much faster than humans — and statistically speaking, the robot will always make the best possible decision.”

A robot driven by data gathered and processed at the edge can detect that it is likely to break down or, at the very least, to fall short of quality standards. Communicating with the other robots on the assembly line, the at-risk machine shuts down while the others adapt their workflow in real time to make up for the missing worker. The production line slows but doesn’t stop. A human technician intervenes, making the needed adjustment or repair, and then the system returns to full speed. The only way this and related capabilities can be realized is through the edge.
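The closed loop just described can be pictured with a short sketch: each robot runs simple analytics on its own sensor data at the edge, and when one predicts a failure it takes itself offline while its peers pick up part of the load. The threshold, rates, and rebalancing rule are illustrative assumptions, not a description of any particular product.

```python
from dataclasses import dataclass

VIBRATION_LIMIT = 4.0   # assumed anomaly threshold for this sketch
MAX_RATE = 75.0         # assumed per-robot ceiling, so the line slows but doesn't stop

@dataclass
class EdgeRobot:
    name: str
    rate: float              # units processed per minute
    vibration: float = 0.0   # latest local sensor reading
    online: bool = True

    def predicts_failure(self) -> bool:
        """Edge-side analytics: flag a likely breakdown from local data."""
        return self.vibration > VIBRATION_LIMIT

def rebalance(line):
    """If a robot predicts failure, it stops and its peers share the load."""
    for robot in line:
        if robot.online and robot.predicts_failure():
            robot.online = False
            share = robot.rate
            robot.rate = 0.0
            peers = [r for r in line if r.online]
            for peer in peers:
                peer.rate = min(MAX_RATE, peer.rate + share / len(peers))

line = [EdgeRobot("A", 60), EdgeRobot("B", 60), EdgeRobot("C", 60)]
line[1].vibration = 5.2   # robot B's readings drift toward failure
rebalance(line)
print([(r.name, r.online, r.rate) for r in line])
# [('A', True, 75.0), ('B', False, 0.0), ('C', True, 75.0)]  slower, not stopped
```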

“With so many machines, sensors, and data in the mix, computing will have to take place increasingly on the edge. The robots themselves will be better equipped to perform more activities and make more decisions autonomously.”
 
—Michel Chabroux
Senior Director, Product Management
Wind River

3. Cybersecurity

As robots become mobile, collaborative, edge resident, and connected to internal and external sensors and IoT devices, the data-rich ecosystem opens itself to multiple access points for would-be hackers. Companies may find themselves vulnerable to malware, cyber ransom, production delays, and business disruption. What’s more, cyberattacks targeting highly nimble, powerful robotic systems also come with some serious physical safety concerns.

The solution? A comprehensive, unified, end-to-end approach to cybersecurity. It starts with device manufacturers: “When designing a product, developers need to make sure they are implementing security measures throughout the design process and writing firmware that is as secure as possible,” says Nigel Stanley, CTO at TÜV Rheinland.10 From there, systems integrators need to understand the machines they’re installing and the overall environment, with an eye toward identifying potential access points and hardening vulnerable targets. Finally, the operator’s IT team needs to be actively engaged, monitoring threats and updating security measures.

Security issues can arise even when a device no longer needs to be in service, points out Arlen Baker, principal security architect at Wind River. “It is vital to have a decommissioning process that ensures that sensitive software or data on the device is cryptographically sanitized, so there is nothing remaining on the device to reverse-engineer,” he says.

4. Price

The cost of robots is becoming less of an issue, as advanced technologies and new business models drive economies of scale. According to McKinsey, 16% of prospects for industrial robotics see cost as their number-one challenge — with 53% viewing the issue as one of their top five concerns.11 But with the rise of Robots-as-a-Service (RaaS), more manufacturers are becoming service providers, allowing customers to scale the number of running units depending on demand.

Computing, data communication, and storage advances continue to deliver more capabilities at a lower cost. “AI and machine learning algorithms have become more efficient, making it easier to program robots, devise new use cases, and reduce the energy required to run them,” notes Richardson at Wind River. “While some believe that Moore’s law is no longer true in terms of the quantity of transistors, it continues to be valid in terms of the cost of computing, as more and more capabilities become available at lower processing prices.”

$45K
The top price currently for most collaborative robots, making them a viable solution in numerous applications inside and outside of factory settings (Robotics Online) 12
2–3 years
Time needed to repay investments in service robots, assuming 24-hour operation (International Federation of Robotics) 13
On the Cutting Edge
Robots Join the Sensor-Driven Revolution

Transforming robots from repetitive machines into cognitive collaborators

Today’s industrial robots are transforming radically, metamorphosing from rote machines into cognitive collaborators. They’ve become an integral link in a dynamic continuum that encompasses humans, other machines, and the digital environments in which they operate: In a factory in Turin, Italy, a three-ton robotic arm senses through human-inspired “skin,” deciding whether to reduce speed or stop operations altogether. Its 3-D camera eyes materials that need to be retrieved, and it relies on its sense of touch to grip them properly. An advanced laser scanner monitors the workspace while the arm is in operation and alerts the robot to be careful when humans are nearby.1

A nexus of sensors, including computer vision, sensing “skin,” positional feedback, and work cell monitoring, will ensure safe cobotic interaction.

The relationship between humans and robots will soon deepen, thanks to affective computing. “Today we have rudimentary capabilities in software and hardware to perform sensory responses within given environments,” says Wind River® CPO Cyra Richardson. “As robots are taught how to respond to human emotions, and as engineers abstract lessons from human evolution about how people see, move, balance, hear, and feel, we will be able to better understand how cognition works. The sensing abilities of robots will grow and automation ecosystems will expand.”

The scope of environments that can be monitored will also grow as tiny sensors, aka smart dust, collect and analyze information from vast terrains. Implanted in humans, smart dust could help control pain or cure deadly diseases. Sprinkled around a city, it could monitor vibrations, temperatures, and magnetic or electrical fields. Therein lies the foundation of a smart city.

These advancements mean that automation ecosystems currently operating in controlled environments, such as a factory floor or a space satellite, will be adaptable to more chaotic and unpredictable environments — think autonomous vehicles driving on busy city streets, or humanoid robots helping the disabled navigate crowded sidewalks.

The cognitive capabilities of robots are already becoming indispensable as the COVID-19 pandemic reveals an urgent need to create more resilient supply chains and protect human workers. Work-related restrictions are driving the creation of an automated continuum involving humans and robots in industrial manufacturing. In warehouses, collaborative robots — cobots — work alongside humans to accelerate tasks such as reading orders, picking and packing products, and shipping them. At a DHL facility in Wilkes-Barre, Pennsylvania,2 workers averaged from 70 to 80 picks per hour prior to the arrival of a fleet of cobots. Now cobot-assisted picks range from 150 to 180 per hour.

“Robots and humans are collaborating in a state of continuous communication, orchestration, and optimization. A closed loop uses sensors and AI running on the edge at ultra-high speed to interpret sensor data in real time.”
 
—Cyra Richardson
Chief Product Officer
Wind River
Human picks/hr: 80
Robotic picks/hr: 180

Advanced robotics are expected to add up to $4.5 trillion to the global economy by 2025, according to McKinsey.3 By 2026, cobots designed to share workspaces with humans will see an estimated compounded annual growth rate of 45%, according to ResearchMarkets. And the International Federation of Robotics recently reported that service robot sales were up 32% worldwide, largely driven by COVID-19.4

One subset experiencing particularly high growth is logistics robots — up 110% year on year. Such autonomous mobile robots were initially used in warehouses, but with the ongoing digitalization of production, they’re quickly becoming part of today’s smart factories. Another category surging in the COVID-19 era is robotic surgery systems. By 2022, medical robot sales have the potential to more than double.

“People are talking about cobots more and more. Every customer and every industry — everybody’s trying to do something with them.”5
 
—Andie Zhang
Global Product Market Manager
ABB
Large six-axis industrial robots are used in places such as car factories to perform repetitive, laborious tasks. They are usually bolted in place at the base, but the arm of the robot still has a wide range of movement in all directions.
Each robot has a sensory perimeter or work area that extends in all directions to the limits the robot can reach.
These robots are increasingly working in proximity with human counterparts, rather than being isolated from people.
To protect human workers, the robot collapses to a safe position when a human enters the robot’s sensory perimeter.

Cybersecurity for the Intelligent Edge

A Microweb Tech Series with Arlen Baker

Watch them all

According to International Data Corporation (IDC), there will be an estimated 42 billion connected devices by 2025. Each of these devices represents a point of entry that can be exploited by a cyberattack. For devices and systems with safety-critical functionality, a security breach can have catastrophic consequences.

Join Wind River® Principal Security Architect Arlen Baker for a fast-paced look at how to design cybersecurity into can’t-fail devices and systems at the edge.

You can have a secure system that is not safety critical, but you cannot have a safety-critical system that isn’t secure.
 
Matt Jones, Chief Architect, Wind River

The Cybersecurity Journey Through the Full Product Lifecycle

Architecting a secure device starts well before the first line of code is written and ends only when the device is taken out of service. Learn how and where security is applied at different stages of the journey.

 
Watch Now

Capturing Use Case Security Requirements

Planning for security requirements up front greatly reduces friction and cost throughout development and deployment. Learn how they will influence the direction your project takes, and what you will need to prove you’ve met them.

 
Watch Now

Building a Security Policy

Building your security policy starts with determining the device’s assets, identifying threats to those assets, and defining mitigations to those threats. Your policy has to factor in your risk tolerance as you determine which mitigations to implement.
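One illustrative way to capture that asset-to-threat-to-mitigation mapping, with a risk-tolerance cutoff applied, is sketched below. The entries, scoring scale, and structure are assumptions made for the example, not a Wind River template.

```python
from dataclasses import dataclass

@dataclass
class ThreatEntry:
    asset: str          # what you are protecting
    threat: str         # how it could be compromised
    risk: int           # 1 (low) .. 5 (high), per your own scoring scheme
    mitigation: str     # the control you would implement

# Hypothetical entries for an edge device; contents are illustrative only.
policy = [
    ThreatEntry("Device firmware", "Tampered update image", 5,
                "Signed updates verified before installation"),
    ThreatEntry("Sensor data in transit", "Eavesdropping on the network", 3,
                "TLS between device and edge gateway"),
    ThreatEntry("Debug interface", "Unauthorized local access", 2,
                "Disable or lock debug ports in production builds"),
]

RISK_TOLERANCE = 3  # mitigate anything at or above this score

for entry in (e for e in policy if e.risk >= RISK_TOLERANCE):
    print(f"[risk {entry.risk}] {entry.asset}: {entry.mitigation}")
```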

 
Watch Now

Designing with a Trusted Foundation

Your device is only as secure as its weakest link, so you must build on top of a proven and trusted platform with hardware-based security features as well as trustworthy software vendors, code pedigree, and secure software development practices.

 
Watch Now

Hardening and Fortifying

You must anticipate that a breach will happen. Learn to model different threat scenarios and put mechanisms in place to protect your applications, data, and IP and to preserve the resiliency of the device. Your threat model should assume that an attacker will get root (admin) access.

 
Watch Now

Ongoing Threat Prevention

Securing your device or system is a complete lifecycle effort. Threats must be actively monitored and resolutions must be implemented. Learn why proactive endpoint integrity monitoring is a must in today’s interconnected world.
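As one small, concrete piece of endpoint integrity monitoring, the sketch below records a hash baseline for a set of protected files and later flags any that have changed. The file paths are hypothetical, and real monitoring covers much more than files (processes, configurations, boot measurements), but the pattern is the same.

```python
import hashlib
from pathlib import Path

# Hypothetical files to watch; substitute the artifacts that matter on your device.
WATCHED = [Path("/opt/app/firmware.bin"), Path("/etc/app/config.yaml")]

def digest(path: Path) -> str:
    """SHA-256 of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def take_baseline(paths):
    """Record the known-good hash of each existing watched file."""
    return {p: digest(p) for p in paths if p.exists()}

def check_integrity(baseline):
    """Return the files whose contents no longer match the baseline."""
    drifted = []
    for path, expected in baseline.items():
        current = digest(path) if path.exists() else None
        if current != expected:
            drifted.append(path)
    return drifted

baseline = take_baseline(WATCHED)
# ... later, on a schedule or triggered by an event ...
for path in check_integrity(baseline):
    print(f"ALERT: {path} has changed since the baseline was recorded")
```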

 
Watch Now

Putting It All Together

Baking security into your system is not easy, but it must be done. Learn where to go to get the help you need to build a secure and safe device.

 
Watch Now

Security Assessment

Walk through a brief set of questions to see where you stand.

 
Take an assessment

More security questions?

Contact us

ACCELERATION PROGRAM SUCCESS POINTS