Walk along a modern automotive manufacturing line and you might think you’ve stepped onto the set of a “Terminator” movie. Everywhere you look, you’ll see robots, and very few humans, diligently building cars.
That’s because automotive manufacturing has always been a leader in the adoption of automation technology, including machine vision and robots — and for good reason. The use of automation has made automobiles more affordable to the masses and significantly safer due to higher-quality construction and advanced automotive systems, many of which wouldn’t be possible without the assistance of automation technology.
Given the automotive industry’s leading-edge adoption of automation technology, it’s no surprise that growth in vision and other advanced automation solutions is driven less by applications being automated for the first time than by retooling and retrofits of existing production lines. Today, integrated vision systems packed with intelligence to simplify setup and operation are driving vision’s penetration into the motor vehicle market, helping automotive manufacturers reach new heights in productivity and profitability.
A list of automotive systems that use vision technology during assembly or quality inspection reads like the table of contents from a service manual, covering every aspect of the automobile from chassis and power trains to safety, electronics, and tire and wheel. In most cases, machine vision is tracking the product through the use of 1D and 2D barcodes and performing quality inspections. But it’s also helping to assemble the products.
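The tracking side of this can be sketched in a few lines. The snippet below is a minimal, illustrative model (not any vendor's actual software): decoded 1D/2D codes arrive as plain serial strings from a reader, and each station appends an inspection event to that part's history. All names and serial formats here are invented.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PartHistory:
    serial: str
    events: list = field(default_factory=list)

class TraceLog:
    """Keyed on the decoded part serial, so every station appends to one record."""
    def __init__(self):
        self.parts = {}

    def record(self, serial, station, passed):
        # The reader hands us the decoded code as a string; we log the
        # station, the pass/fail result, and a UTC timestamp.
        history = self.parts.setdefault(serial, PartHistory(serial))
        history.events.append({
            "station": station,
            "passed": passed,
            "time": datetime.now(timezone.utc).isoformat(),
        })

    def genealogy(self, serial):
        # The ordered list of stations this part has passed through.
        return [e["station"] for e in self.parts[serial].events]

log = TraceLog()
log.record("VIN123-GEAR-0001", "laser_mark_verify", True)
log.record("VIN123-GEAR-0001", "final_inspection", True)
print(log.genealogy("VIN123-GEAR-0001"))  # ['laser_mark_verify', 'final_inspection']
```

In a real plant this record would live in a manufacturing execution system rather than in memory, but the principle is the same: the code on the part is the key that ties every process step together.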
“Most of the applications we’re solving today involve material handling, moving parts and racks to assembly lines using either 2D or 3D vision,” explains David Bruce, Engineering Manager for General Industry & Automotive Segment for FANUC America (Rochester Hills, Michigan). “But the biggest buzzword right now is ‘3D.’”
FANUC’s iRVision machine vision package has long been a staple of the automotive industry, especially in the U.S. and Asia. In recent years, FANUC introduced a fully integrated 3D Area Sensor vision product that uses two cameras and structured light to generate 3D point clouds of the camera’s field of view.
“Today, one of the last manual processes on the automotive manufacturing line involves part feeding, getting parts out of bins, and so on,” Bruce says. “Our 3D Area Sensor isn’t just a hardware solution. It includes a lot of software developed just for bin picking applications.”
In some of the most advanced material handling work cells, one robot with a 3D sensor picks the parts out of the bin and places them on a table so that a second robot with a 2D vision system can easily pick up the part and feed another machine, conveyor, or other process. Bruce also notes that end-of-arm tooling is one of the toughest challenges for bin picking applications; magnets and vacuum work best.
“By having the vision system controller directly integrated with the robot instead of using a PC, the engineers can focus on the mechanical engineering challenges and developing a bin picking system with buffering to make sure acceptable cycle times are achieved,” Bruce says.
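The buffering Bruce describes can be modeled as a simple producer-consumer queue. The sketch below is an illustrative simulation, not FANUC code: a 3D-guided robot picks from the bin at a variable rate, stages parts on a buffer table, and a 2D-guided robot feeds them downstream at a fixed cycle time. All timings and part names are invented.

```python
import queue
import random
import threading
import time

# Slots on the staging table between the two robots; the producer blocks
# when the table is full, the consumer blocks when it is empty.
buffer_table = queue.Queue(maxsize=5)

def bin_picking_robot(n_parts):
    """3D-guided picker: locate-and-pick time varies from part to part."""
    for i in range(n_parts):
        time.sleep(random.uniform(0.01, 0.05))  # variable 3D locate + pick
        buffer_table.put(f"part-{i}")

def feeding_robot(n_parts, fed):
    """2D-guided feeder: a simple, fixed-cycle pick from a flat surface."""
    for _ in range(n_parts):
        part = buffer_table.get()
        time.sleep(0.02)  # fixed downstream cycle time
        fed.append(part)

fed = []
t1 = threading.Thread(target=bin_picking_robot, args=(10,))
t2 = threading.Thread(target=feeding_robot, args=(10, fed))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(fed))  # 10
```

The design point is that the buffer decouples the slow, variable 3D pick from the fast, deterministic 2D pick, so the downstream process sees a steady cycle time even when individual bin picks take longer.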
Tighter integration between vision system and robot also makes it easier for end users to train the FANUC-based workstation. “The way you set up iRVision has gotten a lot simpler,” says Bruce. “You can take images of the robot in 7, 8, or 10 different poses and the system will guide you through programming. Or if you’re looking at a large part that won’t fit in the field of view — not uncommon in automotive manufacturing — you can take images from several small fields of view of the part, and the robot controller can determine the full 3D location of the part.”
3D vision is also enhancing the latest class of assembly robot: lightweight robots, also called collaborative robots due to their low-force operation and ability to work next to humans with minimal safety systems.
While the automotive industry is boosting the number of collaborative vision work cells, “right now the killer application is kitting,” says Bruce. Kitting is the process of collecting parts into a bin for a specific product configuration or assembly.
The Path to Full Traceability
Any kitting or assembly task is only as good as the quality and accuracy of the incoming parts, which is why track-and-trace vision applications are so important to the automotive industry. “Over the last 31 years, the industry average was 1,115 car recalls for every 1,000 sold, according to the National Highway Traffic Safety Administration,” says Adam Mull, Business Development Manager, Machine Vision/Laser Marking, for Datalogic (Telford, Pennsylvania). The recall rate can exceed 1,000 per 1,000 sold because a single car can have more than one recall.
“While we’re seeing applications across the board from inspection to vision-guided robotics [VGR], we’re definitely seeing a trend toward full traceability,” adds Bradley Weber, Application Engineering Leader and Industry Product Specialist – Manufacturing Industry at Datalogic. “There’s always been traceability of the most critical components of the car, but now it’s going everywhere. Every part is being laser marked or peened, read, and tracked. That’s part of what has opened a lot of doors for Datalogic because we have many types of laser markers, vision systems to verify those marks, and then both handheld and fixed barcode readers to read and track those marks all through the process.”
According to Mull, while one manufacturing plant used to manufacture only one type of vehicle, today each plant either makes parts for multiple vehicles or assembles different vehicles.
Consumer demand is driving the need for more automation in the factory. “When you go to a dealership, there are so many more options than there were years ago, from the color of the dashboard to the onboard electronics,” Weber says. “With all those choices, OEMs need a strong manufacturing execution system that is being fed data from every part along the manufacturing process.”
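One way to picture that MES role is a per-vehicle build configuration that each station queries for just the options it needs. The sketch below is purely illustrative; the VINs, option codes, and lookup function are all invented for the example.

```python
# Hypothetical MES lookup for a mixed-model line: the MES holds one build
# configuration per vehicle, and a station resolves its instruction from
# the VIN code scanned on the carrier.

BUILD_ORDERS = {
    "VIN0001": {"dash_color": "black", "infotainment": "premium"},
    "VIN0002": {"dash_color": "tan", "infotainment": "base"},
}

def station_instruction(vin, option):
    """Return which variant of a part this station should install."""
    # A fixed reader decodes the VIN from the carrier; the station then
    # asks the MES for the matching option.
    return BUILD_ORDERS[vin][option]

print(station_instruction("VIN0002", "dash_color"))  # tan
```

With every part and carrier marked and readable, the same lookup pattern extends down to individual components, which is what makes the full traceability described above practical on a line building many configurations at once.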
As machine-readable codes appear on more and more components, manufacturers also gain the option of reworking problem parts instead of scrapping them.
As automation and machine-to-machine communication continue to blur the lines between robot, vision, marking system, and production equipment, the benefit to the manufacturer is greater ease of use, leading to greater machine vision adoption.
Advanced vision software such as Matrox Design Assistant is aiding new adopters to quickly set up ID, VGR, and inspection routines using simple flow-chart programming and automated sample image acquisition, according to Fabio Perelli, Product Manager at Matrox Imaging (Dorval, Quebec).
Better automation integration is also helping to educate engineers, opening up even more opportunities for vision and other automation solutions.
“In automotive, engineers often work in bubbles,” says Datalogic’s Mull. “Everyone’s running with their own part of the project. But as one system works more closely with another system, the team members start to cross-pollinate, opening up the opportunity to teach the engineer who only knows about smart cameras how to use embedded controllers and advanced vision or marking systems. And since our systems all use the same software environment, it makes it seamless for an engineer to move from smart cameras to embedded controllers to other Datalogic solutions.”
>> This article by Winn Hardin was re-posted from AIA Vision Online (11/21/17)