Artificial intelligence technology is now making its way into manufacturing, and the machine-learning technology and pattern-recognition software at its core could hold the key to transforming factories of the near future.
While AI is poised to radically change many industries, the technology is well suited to manufacturing, says Andrew Ng, the creator of the deep-learning Google Brain project and an adjunct professor of computer science at Stanford University.
“AI will perform manufacturing, quality control, shorten design time, and reduce materials waste, improve production reuse, perform predictive maintenance, and more,” Ng says.
The term artificial intelligence is used today as something of a catch-all for software that can train itself to perform certain tasks and to get better at those tasks over time, he says.
For example, AI is behind the software that identifies your friends’ faces in photographs. Those systems eventually get better at facial recognition as you “train” them by continuing to tag and identify friends in a variety of poses and situations.
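The tag-and-improve loop Ng describes is ordinary supervised learning: each tag adds a labeled example, and predictions improve as examples accumulate. A minimal, hypothetical sketch (a nearest-neighbor "tagger" with made-up feature vectors standing in for face embeddings):

```python
import numpy as np

# Hypothetical sketch: a 1-nearest-neighbor "face tagger" whose answers
# improve as more labeled examples (tags) are added -- the training loop
# described above. Feature vectors stand in for real face embeddings.

class SimpleTagger:
    def __init__(self):
        self.examples = []  # list of (feature_vector, label)

    def tag(self, features, label):
        """Add one labeled example, as a user does when tagging a photo."""
        self.examples.append((np.asarray(features, dtype=float), label))

    def identify(self, features):
        """Return the label of the closest stored example."""
        features = np.asarray(features, dtype=float)
        distances = [np.linalg.norm(features - x) for x, _ in self.examples]
        return self.examples[int(np.argmin(distances))][1]

tagger = SimpleTagger()
tagger.tag([0.9, 0.1], "alice")
tagger.tag([0.1, 0.9], "bob")
print(tagger.identify([0.8, 0.2]))  # prints alice: closest to alice's example
```

Production systems use deep neural networks rather than nearest-neighbor lookup, but the principle is the same: more labeled data, better recognition.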
The same AI process can be used to inspect parts in factories, Ng says. In another AI application, a robotic prototype from Siemens automatically reads and follows CAD instructions to build parts without programming.
Ng made his own move into AI with the founding of his company, Landing.AI, late last year. The company’s goal is to help manufacturers incorporate AI into their workflows.
For visual inspection, Landing.AI’s system recognizes patterns of imperfections after “viewing” only five product images. Visual inspection systems that don’t depend on AI must be trained on massive data sets of around one million images to ensure they recognize all potential imperfections, Ng says.
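One way to build intuition for few-shot inspection is to learn what a “good” part looks like from a handful of reference images and flag anything that deviates too far. The sketch below is hypothetical and uses raw pixel statistics; real systems such as Landing.AI’s rely on learned deep features, not a fixed template:

```python
import numpy as np

# Hypothetical sketch of few-shot visual inspection: estimate a "good part"
# template from only five reference images, set a tolerance from how much
# those references vary, then flag parts that deviate beyond it.

rng = np.random.default_rng(0)
good_refs = [rng.normal(0.5, 0.01, (8, 8)) for _ in range(5)]  # 5 good images
template = np.mean(good_refs, axis=0)
tolerance = 3 * max(np.abs(r - template).mean() for r in good_refs)

def inspect(image):
    """Return True if the part passes inspection."""
    return np.abs(image - template).mean() <= tolerance

good_part = rng.normal(0.5, 0.01, (8, 8))
scratched = good_part.copy()
scratched[2, :] = 1.0  # a bright scratch across the part

print(inspect(good_part))   # passes
print(inspect(scratched))   # fails
```

The contrast with non-AI inspection is that the template and tolerance here come from five examples rather than an exhaustive catalog of every possible defect.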
And employees at many factories still inspect parts themselves. “Today, thousands of people in a single factory work together to spot defects, an incredibly tiring task,” Ng says. “But our deep-learning algorithm takes half a second to inspect a part, and in many applications it is more accurate than humans.”
In their own move to bring AI into manufacturing, a team of researchers at Siemens’ Corporate Technology division in Munich, Germany, announced in December that they had developed a two-armed robot that can manufacture products without having to be programmed.
The robot’s arms automatically work together, dividing tasks as needed in the same way humans use their own arms.
While conventional robots cannot decipher a CAD model, the Siemens robot can interpret various CAD models, which eliminates the need to program its movements and processes, says Kai Wurm, who helmed the project along with George von Wichert. The pair research autonomous systems at Siemens.
“In the future, robots will no longer have to be expensively programmed in a time-consuming manner with pages of code that provide them with a fixed procedure for assembling parts,” Wurm says. “We will only have to specify the task and the system will then automatically translate these specifications into a program.”
The robot itself decides which task each arm should perform. To make this possible, the developers have enabled the prototype to raise information from the product development software to a semantic level.
“Product parts and process information are semantically converted into ontologies and knowledge graphs,” says Wurm. “This makes implicit information explicit. Until now the things that people simply know from experience when they are told to snap component X onto rail Y have had to be taught to robots in the form of code. However, our prototype analyzes the problem by itself and finds a corresponding solution.”
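The semantic lifting Wurm describes can be pictured as storing part and process facts as subject-predicate-object triples, so a task like “snap component X onto rail Y” is resolved by querying relationships rather than following hard-coded motion steps. The triples, names, and planning function below are invented for illustration:

```python
# Hypothetical sketch of a tiny knowledge graph: explicit facts about parts
# and processes, queried to derive an assembly step instead of hard-coding it.

triples = {
    ("component_X", "attaches_by", "snap_fit"),
    ("component_X", "mounts_on", "rail_Y"),
    ("snap_fit", "requires_motion", "press_down"),
    ("rail_Y", "located_in", "assembly_zone_3"),
}

def query(subject, predicate):
    """Return all objects related to `subject` by `predicate`."""
    return {o for s, p, o in triples if s == subject and p == predicate}

def plan_assembly(component):
    """Derive an assembly step purely from explicit graph facts."""
    target = query(component, "mounts_on").pop()
    method = query(component, "attaches_by").pop()
    motion = query(method, "requires_motion").pop()
    return f"move {component} to {target}, then {motion}"

print(plan_assembly("component_X"))
# prints: move component_X to rail_Y, then press_down
```

The point of the graph is exactly what Wurm says: knowledge a human assembler carries implicitly (snap fits need a press-down motion) becomes an explicit fact the robot can look up.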
The robot can manufacture single parts or prototypes, a process called “batch-size one” in the manufacturing sector. The term refers to manufacturing or assembling a variety of products, each of which contains different components and setups.
The robot can also correct faults. If a part slips, one of its arms will locate it, as long as the part remains within the camera’s field of vision. The arm then picks up the component and adjusts its subsequent movements so the part can still be installed correctly. It may, for example, transfer the part to the other arm if that position works better for placement, Wurm says.
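The recovery behavior just described amounts to a simple decision procedure: re-locate the slipped part with the camera, then hand it to whichever arm can reach its new position. A hypothetical sketch, with invented reach limits and a one-dimensional workspace for simplicity:

```python
# Hypothetical sketch of fault recovery for a two-armed robot: if a part
# slips, choose an arm based on where the part now lies. Positions are
# 1-D fractions of the workbench width; all numbers are invented.

def recover(part_pos, camera_fov=(0.0, 1.0),
            left_reach=(0.0, 0.6), right_reach=(0.4, 1.0)):
    """Pick the arm that can reach a slipped part, or report it lost."""
    lo, hi = camera_fov
    if not (lo <= part_pos <= hi):
        return "part lost: outside camera field of vision"
    if left_reach[0] <= part_pos <= left_reach[1]:
        return "left arm picks up part and resumes placement"
    if right_reach[0] <= part_pos <= right_reach[1]:
        return "transfer to right arm for placement"
    return "reposition robot"

print(recover(0.2))   # within the left arm's reach
print(recover(0.9))   # only the right arm can reach it
print(recover(1.5))   # slipped out of the camera's view
```

The real robot plans in three dimensions with full kinematic models, but the arm-handover logic follows the same shape.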
Siemens is also using AI to predict when factory equipment will need maintenance, says Roland Busch, Siemens AG chief technology officer.
The company installs “smart boxes” on older motors and transmissions that include sensors and a communications interface for data transfer, Busch says.
“By analyzing the data, our artificial intelligence systems can draw conclusions regarding a machine’s condition and detect irregularities in order to make predictive maintenance possible,” he says.
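A minimal way to picture this kind of irregularity detection: learn a healthy baseline from a motor’s sensor stream, then flag readings that drift far from it. The sketch below is a stand-in for Siemens’ actual analytics, using a plain statistical z-score rather than a learned model, with invented numbers throughout:

```python
import numpy as np

# Hypothetical sketch of predictive maintenance: a "smart box" streams
# vibration readings from an older motor; readings far from the healthy
# baseline are flagged so maintenance can be scheduled before failure.

rng = np.random.default_rng(1)
healthy = rng.normal(1.0, 0.05, 500)   # baseline vibration readings (mm/s)
mu, sigma = healthy.mean(), healthy.std()

def needs_maintenance(reading, threshold=4.0):
    """Flag a reading whose z-score exceeds the threshold."""
    return abs(reading - mu) / sigma > threshold

print(needs_maintenance(1.03))  # normal vibration: no alert
print(needs_maintenance(1.60))  # sharply elevated vibration: alert
```

The value of the approach is in the condition being inferred from data the machine already emits, so older equipment gains predictive maintenance without being redesigned.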
Ng says changes like the Siemens robot and his own visual inspection technology mean the manufacturing process may not be recognizably the same in the near future. He compared AI to the way electricity altered industry more than 100 years ago.
“Electricity transformed every business. It changed communication with the telegraph, and manufacturing through the electric motor,” Ng said in a Stanford talk. “Now deep learning and AI have advanced to the point that they, too, have the potential to transform every industry.”
>> Originally posted by Jean Thilmany, ASME.org, May 2018