Work collaboratively for flexible factory automation
A collaborative robot (cobot) is a robot designed to work alongside humans through direct interaction, without conventional safeguarding fences. Direct human interaction with cobots enables:
- Safe execution of complex tasks
- High production quality
- Intuitive and user-friendly teaching and programming of cobots
The concept of cobots, or “intelligent assist devices,” emerged from research projects and companies in the automotive industry, where cobots provided the power to move heavy objects under human control through direct interfaces. These systems ensured safe use of the cobots’ assistive capabilities. Over the years, cobots have been developed to perform an increasingly broad range of tasks.
In conventional industrial automation, robots must be separated from physical human contact to ensure reliable functionality without causing physical harm to human operators. In these systems, robots operate in entirely human-free zones or within cages.
Constraining robots to cages limits their capabilities. Current markets demand reduced lead times and mass customization. These demands have stimulated interest in flexible, multipurpose manufacturing systems built on human-robot collaboration that does not endanger workers. In flexible and collaborative automation, cobots augment human capabilities with strength, precision, and data analytics capabilities that add value for cobot end users. Cobot development aims for:
- Coexistence — shared workspace with human workers to optimize a process
- Collaboration — flexible automation for various tasks with human engagement
Safety fences present a technological barrier to the broader adoption of robots. Cobots are designed to satisfy safety requirements with intrinsic safety designs that allow safe interaction between the cobot and objects in its workspace (e.g., the ISO® 10218-1 standard). Cobots reduce the inertia involved in potential collisions and contain compliant components, such as joint torque sensors, to absorb the energy of undesired impacts. Furthermore, cobot developers employ a large variety of external sensors (e.g., camera, laser, and depth sensors) and fuse the acquired data to enable reliable recognition of human-robot proximity and gestures.
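One common way proximity sensing feeds into safe behavior is speed-and-separation monitoring: the cobot slows as a person approaches and stops inside a protective distance. The sketch below illustrates that idea in Python; the function name, speeds, and distance thresholds are illustrative assumptions, not values from any standard or cobot API.

```python
def scale_speed(human_distance_m, full_speed=0.25,
                stop_dist=0.3, slow_dist=1.0):
    """Illustrative speed-and-separation monitoring rule.

    Returns a commanded tool speed (m/s) based on the measured
    human-robot distance: a full stop inside stop_dist, a linear
    ramp between stop_dist and slow_dist, and full speed beyond.
    """
    if human_distance_m <= stop_dist:
        return 0.0            # person inside protective distance: halt
    if human_distance_m >= slow_dist:
        return full_speed     # person far away: run at full speed
    # Linear ramp between the two thresholds
    frac = (human_distance_m - stop_dist) / (slow_dist - stop_dist)
    return full_speed * frac
```

In practice the distance estimate would come from the fused external sensor data described above, and the thresholds would be derived from a risk assessment rather than hard-coded.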
Cobot Programming with Advanced Algorithms and AI
There are cobot applications and technology gaps that hinder full cobot deployment. Advanced algorithms are needed for cobots to achieve their great potential for manufacturing in high-mix, low-volume production environments. Cobots must be able to perform in unfamiliar situations without explicit instructions, perceiving their environment using deep learning. The cobot’s motion planner moves the cobot to a target position, while collision avoidance algorithms provide reactive behavior in dynamic environments, based on local knowledge supplied by sensors as the cobot moves.
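One classic way to obtain the reactive behavior described above is an artificial potential field: the goal attracts the end effector while nearby obstacles repel it, using only local sensor knowledge. The 2D sketch below is a minimal illustration of that technique in Python; the function, gains, and influence radius are assumptions for the example, not part of any toolbox API.

```python
import math

def potential_field_step(pos, goal, obstacles, step=0.05,
                         attract=1.0, repulse=0.2, influence=0.5):
    """One reactive step in 2D: attracted to the goal, repelled by
    obstacles inside the influence radius (all coordinates in meters)."""
    # Attractive force pulls toward the goal
    fx = attract * (goal[0] - pos[0])
    fy = attract * (goal[1] - pos[1])
    # Repulsive forces push away from nearby obstacles only
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:
            mag = repulse * (1.0 / d - 1.0 / influence) / d**2
            fx += mag * dx
            fy += mag * dy
    # Move a fixed step along the resulting force direction
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)
```

Potential fields are purely local, so they can get trapped in local minima; production planners typically combine such reactive layers with a global planner (e.g., sampling-based methods) that computes the nominal path to the target.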
Universal Robots Cobot Support from MATLAB
Design, simulate, test, and deploy UR Cobot applications
Cobot Design with MATLAB and Simulink
MATLAB® and Simulink® provide a full set of tools that allow you to:
- Use sensor models, such as camera, lidar, and IMU, to prototype how your cobot senses an environment.
- Perceive the environment for cobot applications using deep learning and computer vision.
- Teach your cobot motions using Inverse Kinematics Designer and motion planning.
- Design, iterate, and optimize motion controllers for safe interaction with your cobots.
- Model system control logic and evaluate autonomous algorithms for your cobot applications.
- Connect and control cobots from Kinova® and Universal Robots using MATLAB.
- Automatically generate production code to deploy to cobot controllers and onboard computers.
See also: MATLAB and Simulink for robotics, MATLAB and Simulink for robot manipulators, Robotics System Toolbox™, Navigation Toolbox™, ROS Toolbox, Simscape Multibody™, Deep Learning Toolbox™, robot programming
“The integration of MATLAB, Simulink, and Deep Learning Toolbox gave us the confidence to move forward with the MBSE digital twin project.” — Dr. T. John Koo, ASTRI