• New AI-enabled navigation technology allows AMRs to make intelligent decisions in dynamic and challenging environments
• Technology reduces commissioning time by up to 20 percent, requiring less calibration and no change to infrastructure
• Increases flexibility for fast-changing intralogistics requirements in production, logistics, healthcare and retail
ABB Robotics has transformed its Autonomous Mobile Robots (AMRs) with the addition of Visual Simultaneous Localization and Mapping (Visual SLAM) technology, enabling its AMRs to make intelligent navigation decisions based on their surroundings. Using AI-enabled 3D vision to perform localization and mapping functions, ABB’s Visual SLAM AMRs make production faster, more flexible, more efficient and more resilient, while taking on dull, dirty and dangerous tasks so people can focus on more rewarding work.
Visual SLAM combines AI and 3D vision technologies to deliver superior performance compared with other guidance techniques for AMRs. Offering significant advantages over forms of navigation that require additional infrastructure to function, such as magnetic tape, QR codes and traditional 2D SLAM, Visual SLAM AMRs are being embraced by companies to handle an expanding range of production and distribution tasks.
“Our introduction of Visual SLAM AMRs radically enhances companies’ operations, making them faster, more efficient and more flexible, while freeing up employees to take on more rewarding work,” said Marc Segura, President of ABB Robotics Division. “Offering more autonomy and intelligence, our new AMRs operate safely in dynamic, human-populated environments. Visual SLAM technology provides a new level of intelligence for AMRs that transforms robotic applications, from production and distribution through to healthcare.”
Visual SLAM uses cameras mounted on the AMR to create a real-time 3D map of all objects in the surrounding area. The system can differentiate between fixed navigation references, such as floors, ceilings and walls, that need to be added to the map, and objects, such as people or vehicles, that move or change position. The cameras detect and track natural features in the environment, enabling the AMR to adapt dynamically to its surroundings and determine the safest and most efficient route to its destination. Unlike 2D SLAM, Visual SLAM requires no additional references such as reflectors or markers, saving cost and space, and offers accurate positioning to within three millimeters.
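For readers curious how a camera-based system might separate fixed structure from moving objects, the minimal Python sketch below illustrates the general principle using OpenCV’s ORB feature matching: features whose motion between two frames agrees with the dominant (camera-induced) shift are treated as static, and outliers as dynamic. This is an illustrative toy only, not ABB’s or Sevensense’s implementation; the function name, threshold and synthetic test frames are assumptions made for the example.

```python
# Illustrative sketch only: a toy static-vs-dynamic feature split between two
# camera frames, in the spirit of visual SLAM front-ends. Not ABB/Sevensense code.
import cv2
import numpy as np

def split_static_dynamic(frame_a, frame_b, threshold_px=3.0):
    """Match ORB features between two grayscale frames and label each match as
    'static' (consistent with the dominant image motion, e.g. camera/ego motion)
    or 'dynamic' (an outlier, e.g. a person or vehicle moving on its own)."""
    orb = cv2.ORB_create(nfeatures=500)           # detect natural corner-like features
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)
    if des_a is None or des_b is None:
        return [], []

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)

    # Displacement of each matched feature between the two frames.
    shifts = np.array([np.subtract(kp_b[m.trainIdx].pt, kp_a[m.queryIdx].pt)
                       for m in matches])
    if len(shifts) == 0:
        return [], []

    # The median shift approximates the motion shared by fixed structure
    # (floors, walls, ceilings); features far from it are treated as moving.
    dominant = np.median(shifts, axis=0)
    residual = np.linalg.norm(shifts - dominant, axis=1)
    static = [m for m, r in zip(matches, residual) if r <= threshold_px]
    dynamic = [m for m, r in zip(matches, residual) if r > threshold_px]
    return static, dynamic

if __name__ == "__main__":
    # Synthetic test: a scene of rectangles shifted 5 px to mimic camera motion,
    # plus one rectangle that moves independently to mimic a dynamic object.
    frame_a = np.zeros((240, 320), dtype=np.uint8)
    for x in range(20, 300, 60):
        cv2.rectangle(frame_a, (x, 40), (x + 30, 90), 255, -1)
    cv2.rectangle(frame_a, (40, 150), (80, 190), 180, -1)    # the "moving object"

    frame_b = np.roll(frame_a, shift=5, axis=1)               # global 5 px shift
    cv2.rectangle(frame_b, (45, 150), (85, 190), 0, -1)       # erase object's old position
    cv2.rectangle(frame_b, (120, 150), (160, 190), 180, -1)   # object has moved further

    static, dynamic = split_static_dynamic(frame_a, frame_b)
    print(f"static features: {len(static)}, dynamic features: {len(dynamic)}")
```

A production visual SLAM pipeline would of course go much further, estimating full 6-degree-of-freedom camera poses and maintaining a persistent 3D map, but the same underlying idea applies: natural features that move consistently with the camera anchor the map, while inconsistent features are discounted as transient objects.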
By eliminating the need to change the environment, stop production or add infrastructure, Visual SLAM technology cuts commissioning time by up to 20 percent compared with 2D SLAM, significantly reducing the time needed to introduce a new AMR into an existing fleet. The technology can be used at scale, with fleets updated remotely. It is also secure, analyzing raw sensor data only, with no visual images saved on either the AMR or a server.
ABB developed Visual SLAM AMRs in collaboration with its partner Sevensense Robotics, a leading provider of AI and 3D vision technology. The technology is being incorporated into ABB’s latest-generation AMRs: the AMR T702V from Q3 2023 and the AMR P604V from Q4 2023. Further AMR products incorporating Visual SLAM will follow, rolling out through 2025. The technology is already deployed in industrial projects for customers in automotive and retail, with the potential to replace conventional production lines with intelligent, modular production cells served by AMRs.
ABB will introduce Visual SLAM at booth G07 in Hall 6 at LogiMAT 2023, which takes place from April 25 to 27 in Stuttgart, Germany. ABB will also host a keynote presentation, ‘AI and Robotics enabling the next generation of logistics automation’, on April 26 at 15.00 in the ICS, room 5.3.