Levels of Robot Intelligence: How AI Shapes Machine Thinking

Key Takeaways of this Article

✅ The Levels of Robot Intelligence

✅ Typical Examples of Non-intelligent and Intelligent Applications

✅ The Maturity Stage of AI Implementation

Introduction to Intelligent Robotics

Artificial Intelligence and Machine Learning are the top technologies of 2024! Everyone talks about them, and every week another startup presents its new AI solution. In the robotics world, several sophisticated robots have been presented recently: smart factory robot solutions, fruit-picking robots, mobile dog-like legged robots and even humanoid robots. Some of these robots seem to have come straight out of a sci-fi movie, and we all have both positive utopian and negative dystopian examples in the back of our minds - though this little guy is quite cute.

AI-based Robot Mimics Human Behavior and Emotions

One may ask:

How intelligent are robots today, and how will they develop in the future?

This article presents and explains the different levels of robot intelligence and autonomy.

Levels of Robot Autonomy & Intelligence

The International Federation of Robotics (IFR) classifies five different levels of robot autonomy and intelligence, based on the degree of Artificial Intelligence (AI) implementation. These levels are:

  • Level 1: No autonomy, remote control
  • Level 2: No autonomy, no sense & respond
  • Level 3: No autonomy, but sense & respond
  • Level 4: Autonomy
  • Level 5: Advanced autonomy
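The ordering of these levels can be captured in code as a simple ranked type. The following is a minimal sketch (the class and member names are my own invention, not IFR terminology); it also encodes the fact, explained below, that autonomy begins at Level 4:

```python
from enum import IntEnum

class RobotAutonomyLevel(IntEnum):
    """The five IFR levels, ordered by degree of AI implementation."""
    REMOTE_CONTROL = 1     # no autonomy, remotely operated
    NO_SENSE_RESPOND = 2   # no autonomy, pre-programmed behind fences
    SENSE_RESPOND = 3      # no autonomy, but senses and reacts to contact
    AUTONOMY = 4           # perceives the environment via sensors + AI
    ADVANCED_AUTONOMY = 5  # interprets situations, intuitive task input

def is_autonomous(level: RobotAutonomyLevel) -> bool:
    """A robot counts as autonomous from Level 4 upward."""
    return level >= RobotAutonomyLevel.AUTONOMY
```

Using an ordered enum rather than plain strings makes comparisons like "at least Level 4" trivial.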

Levels of Robot Autonomy & Intelligence Arranged by the Degree of AI Implementation

Level 1: No autonomy, remote control

At the lowest level of robot intelligence, there is no autonomy and the robot is remotely controlled. A good example is the surgical robot, which is often operated manually and remotely: a surgeon controls the robot's movements from a distance using an input device, such as a data glove. The surgeon's motions are then translated into movements that the robot mimics.

Remotely controlled robot performs surgery, demonstrated on a grape
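The core of such a master-slave mapping can be sketched in a few lines. This is an illustrative simplification, not a real surgical control loop; the scale and limit parameters are invented. Input motions are typically scaled down for precision, and each commanded step is clamped for safety:

```python
def teleop_command(glove_delta_mm, motion_scale=0.2, max_step_mm=1.0):
    """Map a surgeon's hand displacement (x, y, z in mm) to a
    scaled-down, clamped robot step. Scaling down improves precision;
    clamping limits how far any single command can move the tool."""
    scaled = [axis * motion_scale for axis in glove_delta_mm]
    # Clamp each axis so no single step exceeds the safety limit.
    return [max(-max_step_mm, min(max_step_mm, s)) for s in scaled]
```

With these example values, a 10 mm hand movement would be scaled to 2 mm and then clamped to a 1 mm robot step.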

Level 2: No autonomy, no sense & respond

The second level of robot autonomy includes classic industrial robot cells that neither sense their environment nor respond to it. Instead, these robots operate behind safety fences, following pre-programmed, hard-coded motion patterns. Any required flexibility is engineered into the system from the start, and motions are optimized in offline simulation environments in favor of process parallelization, cycle time and overall profitability. Material is provided in a fixed manner, typically at a predefined position, allowing the robot to grip the workpiece blindly without the need for vision systems. As a result, the environment is not perceived, no external forces are measured and the robots do not react to their surroundings (except for signal exchange, process-related sensing, program interruptions or safety-related issues). The following video shows a typical example of such a complex, cycle time-optimized cell for arc welding.

Pre-programmed multi-robot arc welding system

Level 3: No autonomy, but sense & respond

The next level still does not imply autonomy but already integrates sense & respond capabilities. Collaborative robots, unlike their fenced-in colleagues, are developed for direct human-robot interaction and fenceless operation. By integrating sensor technology, such as torque sensors, into the robot arm, the robot can sense external forces and react to its environment in a touch-based way - by detecting contact with an operator, by checking for resistance, or for sensitive joining in assembly tasks. By adding external safety devices, such as laser scanners, the robot system detects the distance to an operator and adjusts its operating speed to the proximity. In the following example of a collaborative assembly station, the robot detects contact with the operator, stops accordingly, and checks for resistance when approaching the workpiece area.

Collaborative robot reacts to operator contact with integrated torque sensors
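The distance-based speed adjustment described above can be sketched as a simple linear ramp. The thresholds here are invented for illustration (real systems derive them from a safety assessment): the robot stops inside a protective zone, runs at full speed when the operator is far away, and scales linearly in between:

```python
def scaled_speed(distance_m, stop_dist=0.5, full_speed_dist=2.0, max_speed=1.0):
    """Sketch of proximity-based speed scaling: return the allowed speed
    fraction given the measured operator distance (e.g. from a laser scanner)."""
    if distance_m <= stop_dist:
        return 0.0            # operator too close: safety stop
    if distance_m >= full_speed_dist:
        return max_speed      # operator far away: full operating speed
    # Linear ramp between the stop distance and the full-speed distance.
    return max_speed * (distance_m - stop_dist) / (full_speed_dist - stop_dist)
```

For example, an operator at 1.25 m (halfway along the ramp) would limit the robot to half speed.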

Level 4: Autonomy

With Level 4, robots reach autonomy: they are equipped with additional sensors as well as intelligent machine learning and AI algorithms. This type of robot can actively perceive its environment and react to it. Cameras attached to the robot, or mounted independently above the operating space, use intelligent vision algorithms to identify objects and detect their position and orientation. With this capability, material provision becomes much more flexible, allowing workpieces to arrive in an irregular fashion, such as on a conveyor belt with varying positions and orientations. Motion paths between the pick and place positions are no longer pre-coded but are generated and flexibly adjusted by AI if the place position changes, for example when a lattice box is accidentally moved. Typical examples of this application type are bin picking and vision-based pick&place tasks, as shown in the following video.

Robot reacts autonomously to flexible material position and changing environment
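The structural difference from Level 2 can be illustrated in a short sketch: the pick target is produced by perception at runtime rather than taught as a fixed position. The `detect_workpiece` stand-in below is hypothetical; a real system would call a vision pipeline here:

```python
import random

def detect_workpiece():
    """Stand-in for a camera + vision algorithm (hypothetical): returns
    the detected (x, y, angle) pose of the next workpiece on the conveyor."""
    return (round(random.uniform(0.0, 0.8), 3),   # x position in m
            round(random.uniform(0.0, 0.4), 3),   # y position in m
            round(random.uniform(-180.0, 180.0), 1))  # orientation in degrees

def pick_and_place(place_pose, detect=detect_workpiece):
    """One Level-4-style cycle: the pick pose comes from perception at
    runtime instead of a hard-coded taught position, so the same program
    handles workpieces arriving in varying positions and orientations."""
    pick_pose = detect()  # perceive the workpiece, don't assume its pose
    return {"pick": pick_pose, "place": place_pose}
```

The place pose is passed in per cycle for the same reason: if the target container moves, the next cycle simply receives an updated pose.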

Level 5: Advanced autonomy

At the highest level there is advanced autonomy, building upon the technology of Level 4 with even more advanced robot capabilities. Robots can react to tasks given to them in intuitive ways, such as via voice recognition - similar to Siri or Alexa, but with much more complexity behind it. Advanced algorithms enable the robot to actively perceive its environment, differentiate between individual object classes, and interpret various situations. The following video shows such a demo case, in which the robot autonomously sorts randomly presented trash into a bin.

Robot perceives and interprets environment and reacts to it autonomously
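The decision step behind such a sorting demo can be sketched as "act only on confident classifications." The labels, bin names and threshold below are invented for illustration; the point is that a Level 5 system interprets what it sees and declines to act on uncertain detections:

```python
# Hypothetical label-to-bin assignment for a trash-sorting demo.
BIN_FOR_CLASS = {"plastic": "yellow", "paper": "blue",
                 "metal": "yellow", "glass": "green"}

def sort_detection(label, confidence, threshold=0.8):
    """Route a recognized object to a bin only if the classifier is
    confident; otherwise return None so the system can skip the object
    or escalate to a human instead of acting on a guess."""
    if confidence < threshold or label not in BIN_FOR_CLASS:
        return None  # unknown class or uncertain detection
    return BIN_FOR_CLASS[label]
```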

Conclusion: How intelligent are robots today?

Now that we have examined the five levels of robot intelligence & autonomy, you may ask: at what level are we today, and what is already possible?

Level 1 tele-operated robots are standard solutions already working in many hospitals today. Beyond surgery and other medical applications, tele-operated robot systems have also been around for a while at a high level of maturity.

Level 2 non-autonomous industrial robot systems constitute the majority of today's installed robots - machines that are designed, planned and programmed in custom-engineered cells, or that are even available as standard plug&play systems.

Level 3 non-autonomous collaborative or sensitive robot systems already operate in many production environments without a fence, in more or less direct interaction with the operator. Since collaborative robots have been on the market for many years, they have evolved into a distinct robot class with many advantages in terms of flexibility, but also limitations in allowed speed, size, reach, and payload.

Level 4 autonomous robots with additional sensors and AI/ML technology are especially prevalent in bin picking and vision-based applications. These solutions already operate in factories for selected cases and are usually custom solutions designed for a specific workpiece or group of workpieces.

Level 5 advanced autonomous robots are currently in development, with first demonstrators having been presented. Due to rapid advancements in AI technology, this kind of robotic application is growing fast, with an increasing number of fascinating features and technology integrations. Today, the capabilities of these robots are still limited and far from science fiction. But in the near future, robots will become more intelligent and thus capable of solving tasks that could not be automated until now. New AI and ML advancements open up a completely new class of robotic applications!

Maturity Stage of AI Implementation

References

International Federation of Robotics (IFR) (2022). Position Paper: Artificial Intelligence in Robotics. Frankfurt, Germany.

Manning, C. (2020). Artificial Intelligence Definitions. [online] Stanford University. Stanford University.

Stack Overflow (2023). How Machine Learning Works: Types & Applications. [online]

Soori, M., Arezoo, B. and Dastres, R. (2023). Artificial intelligence, machine learning and deep learning in advanced robotics, a review. Cognitive Robotics, 3, pp.54–70.

PA Consulting Group (2018). AI and Robotics Automation in Consumer-driven Supply Chains - A rapidly evolving source of competitive advantage. London, United Kingdom: PA Consulting Group.

Roland Berger (2019). Rise of the machines – How robots and artificial intelligence are shaping the future of autonomous production. Munich, Germany: Roland Berger.

Walt Disney Imagineering (2023). A New Approach to Disney’s Robotic Character Pipeline [online] YouTube.

da Vinci Surgery (2014). da Vinci Robot Stitches a Grape Back Together [online] YouTube.

Yaskawa Europe GmbH (2023). Schlüsselfertige Riegelschweißanlage bei PERI in Günzburg [online] YouTube.

ZKW (2021). ZKW COBOT: Mensch und Roboter arbeiten im Team [online] YouTube.

Yaskawa Europe GmbH (2022). Yaskawa & Robotcloud | Collaborative packaging and vision guided robotics [online] YouTube.

Figure AI (2024). Figure Status Update - OpenAI Speech-to-Speech Reasoning. [online] YouTube.

