Just How Effective Is VR for Industrial Training?

Simulation training is one of the most talked about enterprise applications for virtual reality. But how effective is it? FS Studio’s Robert Milton discusses in a video interview with Design News.

(Image source: Eddie Kopp on Unsplash)

When you really think about it, simulated training has been around for decades now. Anyone familiar with aviation will know about flight simulators; the military has used video games and other tools for combat simulation; and NFL teams even use the popular video game Madden to help train players.

But when it comes to the industrial space, Robert Milton—a designer and educator working with software solutions company, FS Studio—says manufacturing is lagging behind other industries, such as healthcare and aerospace, when it comes to adopting VR. Yet there is great potential for VR as well as augmented reality (AR) to transform how workers are trained and perform their jobs. And there’s a good amount of academic research to back this claim.

As an educator, Milton has consulted with Fortune 500 companies on recreating interactive learner experiences. Ahead of his talk at the Atlantic Design & Manufacturing Show, “Leveraging Virtual Reality for Industrial Training,” Milton sat down with Design News to discuss the current state of enterprise VR, research into the efficacy of VR training…and that one time VR made him homeless.

Watch the full video on DesignNews.com.

>> Originally posted by Chris Wiltz, Design News, June 9, 2018

Artificial Intelligence Transforms Manufacturing

Artificial intelligence technology is now making its way into manufacturing, and the machine-learning technology and pattern-recognition software at its core could hold the key to transforming factories of the near future.

While AI is poised to radically change many industries, the technology is well suited to manufacturing, says Andrew Ng, the creator of the deep-learning Google Brain project and an adjunct professor of computer science at Stanford University.

“AI will perform manufacturing, quality control, shorten design time, reduce materials waste, improve production reuse, perform predictive maintenance, and more,” Ng says.

The term artificial intelligence is used today as something of a catch-all for software that can train itself to perform certain tasks and to get better at those tasks over time, he says.

For example, AI is behind the software that identifies your friends’ faces in photographs. Those systems eventually get better at facial recognition as you “train” them by continuing to tag and identify friends in a variety of poses and situations.

Siemens researchers test their robot prototype, which uses artificial intelligence to decipher CAD instructions and assemble parts. Image: Siemens

The same AI process can be used to inspect parts in factories, Ng says. In another AI application, a robotic prototype from Siemens automatically reads and follows CAD instructions to build parts without programming.

Ng made his own move into AI with the founding of his company, Landing.AI, late last year. The company’s goal is to help manufacturers incorporate AI into their workflows.

For visual inspection, Landing.AI’s system recognizes patterns of imperfections after “viewing” only five product images. Visual inspection systems that don’t depend on AI must be trained with massive data sets of around one million images to ensure they recognize all potential imperfections, Ng says.

And employees at many factories still inspect parts themselves. “Today, thousands of people in a single factory work together to spot defects, an incredibly tiring task,” Ng says. “But our deep-learning algorithm takes half a second to inspect a part and in many applications is more accurate than humans.”

In their move to bring AI into manufacturing, a team of researchers at Siemens’ Corporate Technology division in Munich, Germany, announced in December that they had developed a two-armed robot that can manufacture products without having to be programmed.

The robot’s arms automatically work together, dividing tasks as needed in the same way humans use their own arms.

While conventional robots cannot decipher a CAD model, the Siemens robot can interpret various CAD models, which eliminates the need to program its movements and processes, says Kai Wurm, who helmed the project along with George von Wichert. The pair research autonomous systems at Siemens.

“In the future, robots will no longer have to be expensively programmed in a time-consuming manner with pages of code that provide them with a fixed procedure for assembling parts,” Wurm says. “We will only have to specify the task and the system will then automatically translate these specifications into a program.”

The robot itself decides which task each arm should perform. To make this possible, the developers have enabled the prototype to raise information from the product development software to a semantic level.

“Product parts and process information are semantically converted into ontologies and knowledge graphs,” says Wurm. “This makes implicit information explicit. Until now the things that people simply know from experience when they are told to snap component X onto rail Y have had to be taught to robots in the form of code. However, our prototype analyzes the problem by itself and finds a corresponding solution.”
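The idea of making implicit assembly knowledge explicit can be illustrated with a toy knowledge graph. The triples, part names, and helper functions below are invented for illustration only; they are not Siemens' actual data model.

```python
# Toy illustration: representing "snap component X onto rail Y" as explicit
# (subject, predicate, object) triples, so a planner can query what a human
# would otherwise just know from experience.

def build_graph(triples):
    """Store (subject, predicate, object) facts for simple lookup."""
    return set(triples)

def query(graph, subject=None, predicate=None, obj=None):
    """Return all triples matching the given (possibly wildcard) pattern."""
    return [
        t for t in graph
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# Implicit human knowledge, written down explicitly:
facts = [
    ("component_X", "attaches_to", "rail_Y"),
    ("component_X", "attach_method", "snap_fit"),
    ("rail_Y", "is_a", "mounting_rail"),
    ("snap_fit", "requires", "press_until_click"),
]

graph = build_graph(facts)

# A planner can now ask: what does component_X attach to, and how?
target = query(graph, subject="component_X", predicate="attaches_to")
method = query(graph, subject="component_X", predicate="attach_method")
print(target)  # [('component_X', 'attaches_to', 'rail_Y')]
print(method)  # [('component_X', 'attach_method', 'snap_fit')]
```

A real system would use a richer ontology language, but the principle is the same: once the facts are explicit, the robot can derive its own procedure instead of executing hand-written code.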

The robot can manufacture single parts or prototypes, a process called “batch-size one” in the manufacturing sector. The term refers to manufacturing or assembling a variety of products, each of which contains different components and setups.

The robot can also correct faults. If a part slips, one of its arms will find the part as long as it’s within its camera’s field of vision. The arm will then pick up the component and adjust all of its subsequent movements so that it can still install it correctly. It may, for example, transfer the part to its other arm if that position works better for part placement, Wurm says.

Siemens is also using AI to predict when factory equipment will need maintenance, says Roland Busch, Siemens AG chief technology officer.

The company installs “smart boxes,” which include sensors and a communications interface for data transfer, on older motors and transmissions, Busch says.

“By analyzing the data, our artificial intelligence systems can draw conclusions regarding a machine’s condition and detect irregularities in order to make predictive maintenance possible,” he says.

Ng says changes like the Siemens robot and his own visual inspection technology mean the manufacturing process may not be recognizably the same in the near future. He compared AI to the way electricity altered industry more than 100 years ago.

“Electricity transformed every business. It changed communication with the telegraph, and manufacturing through the electric motor,” Ng said in a Stanford talk. “Now deep learning and AI have advanced to the point that they, too, have the potential to transform every industry.”

>> Originally posted by Jean Thilmany, ASME.org, May 2018

Removing the Roadblocks of Machine to Machine Communication in Manufacturing

A big part of maximizing the value of industrial automation, and of fully realizing the benefits of Industry 4.0, is the flow of data and information. This has applications in many environments, such as manufacturing plants, warehouses, and distribution centers.

Also known as interoperability, this presents a common challenge for companies, since overcoming the difficulties of connecting machines and equipment made by different manufacturers is rarely simple. Companies design products to make data exchange easy between their own equipment, but place less importance on how those products communicate with the products of other companies.

Of course, making the connections that enable information to be shared is just the first step. The goal of Industry 4.0 is really the ability to generate meaningful and useful data for the entire operation.

The Interoperability of Self-Driving Vehicles

In the context of material handling, this challenge can relate to the connectivity of self-driving vehicles (SDVs) and the role they play within a facility. Self-driving vehicles can be a core part of Industry 4.0 for many businesses, and they are integral to the Industrial Internet of Things (IIoT) and the overall efficiency of a facility.

Connectivity that allows operators to communicate with, and receive information from, SDVs is a must. Practical examples include an SDV calling for more inventory based on real-time lineside inventory levels, reporting obstructions that may affect the flow of materials, and sharing other details of what else is happening in production at that moment.

Overcoming the challenge of interoperability starts by creating a consistent framework for different machines to talk.

How does the OPC Unified Architecture (UA) platform help connectivity?

As we’ve identified, it’s a common problem for factories that operate multiple PLCs (programmable logic controllers), such as those in SDVs, to end up with equipment that cannot readily communicate with other equipment. “Out of the box,” machines can talk only to other products made by the same company. To make them work together, operators are often forced to custom-program connection drivers that enable these different systems to communicate with each other. This is an inherently time-, resource- and cost-intensive process.

To make these connections easier, the industry is moving towards a broadly adopted communication protocol called OPC Unified Architecture.

What is OPC UA?

From the OPCFoundation.org website: “The OPC Unified Architecture (UA), released in 2008, is a platform independent service-oriented architecture that integrates all the functionality of the individual OPC Classic specifications into one extensible framework.”

This multi-layered approach accomplishes the original design specification goals of:

  • Functional equivalence: all COM OPC Classic specifications are mapped to UA
  • Platform independence: from an embedded micro-controller to cloud-based infrastructure
  • Secure: encryption, authentication, and auditing
  • Extensible: ability to add new features without affecting existing applications
  • Comprehensive information modeling: for defining complex information
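The practical payoff of a common protocol can be sketched with a toy example: two “machines” from different vendors expose data through one shared node-based interface instead of pairwise custom drivers. The classes and node IDs below are purely illustrative; they mimic OPC UA's node idea but are not the actual OPC UA API.

```python
# Toy sketch of the interoperability idea behind OPC UA: every device,
# regardless of vendor, exposes its data through the same node-based
# read/write interface, so no vendor-specific drivers are needed.

class Node:
    def __init__(self, node_id, value):
        self.node_id = node_id
        self.value = value

class Server:
    """A vendor-neutral address space of named nodes."""
    def __init__(self, name):
        self.name = name
        self.nodes = {}

    def add_node(self, node_id, value):
        self.nodes[node_id] = Node(node_id, value)

    def read(self, node_id):
        return self.nodes[node_id].value

    def write(self, node_id, value):
        self.nodes[node_id].value = value

# Two machines from different vendors, same interface:
plc = Server("vendor_A_plc")
plc.add_node("ns=2;s=Conveyor.Speed", 1.2)

sdv = Server("vendor_B_sdv")
sdv.add_node("ns=3;s=Mission.State", "delivering")

# A supervisory application reads both the same way, with no custom drivers:
for server, node_id in [(plc, "ns=2;s=Conveyor.Speed"),
                        (sdv, "ns=3;s=Mission.State")]:
    print(server.name, server.read(node_id))
```

A real deployment would use an actual OPC UA client/server stack over the network; this sketch only shows why a single shared interface removes the need for one driver per vendor pair.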

Self-Driving Vehicles and Machine to Machine Communication

What does this look like in practice for manufacturers using SDVs?

A large part of the value of self-driving vehicles like OTTO derives from their ability to make decisions and take action based on changing circumstances in real time. When SDVs are on the job, it is vital that information on what they are doing and experiencing is fed back, especially when it involves any deviation from normal operating circumstances. There is value in other information that SDVs can feed into the network as well. SDVs can also be programmed to provide advance notice that helps facilitate material flow, such as by signaling the shipping department that a pallet of completed product is on its way.

With connectivity established over OPC UA, control over SDVs improves as well. Users can pause or unpause SDVs and switch autonomy mode on or off. They can also query the state of a mission and understand exactly what the OTTOs are doing.

In many situations, solving the challenge of data interoperability falls to the IT function. For the connected factory, this means enabling the flow of information between machines, which in the past has been easier said than done. Leveraging a unified architecture in smart equipment enables tools like SDVs to become integrated and to enhance the benefits of the IIoT. OTTO is on a mission to create more efficient material flow in industrial environments. Part of that commitment is creating ways in which the entire manufacturing facility, warehouse or distribution center can benefit from the self-driving vehicle. For the connected facility, this means enabling the flow of information between machines with the use of OPC UA and the OTTO Industrial API.

>> Originally posted by Clearpath Team, OTTO Motors, March 8, 2018

Simulation in the Cloud

Demand for simulation resources has expanded as engineering firms increase their use of simulation in the design process and throughout product development. That has taxed the IT infrastructure at many firms and left smaller companies struggling to find ways to run more complex simulations absent their own high-performance computing (HPC) resources.

Cloud-based simulation is slowly emerging as a solution to both problems. Transparency Market Research says the cloud-based simulation market reached $3.3 billion in 2016, and is expected to have a compound annual growth rate of 11.4% through 2025, reaching $8.5 billion—a faster rate of growth than the simulation market overall, according to the company’s data.
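A quick compound-growth check (assuming the 11.4% rate compounds over the nine years from 2016 to 2025) roughly reproduces the projection:

```python
# Sanity check: $3.3B growing at 11.4% per year over 2016 -> 2025.
base = 3.3           # market size in 2016, in billions of dollars
cagr = 0.114         # compound annual growth rate
years = 2025 - 2016  # nine compounding periods

projected = base * (1 + cagr) ** years
print(round(projected, 1))
```

This yields roughly $8.7 billion, slightly above the reported $8.5 billion figure, which suggests the research firm's compounding window or rounding differs slightly.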

The need for cloud-based simulation is growing, particularly within the past year, because companies are searching for more elasticity when it comes to simulation, according to Todd McDevitt, director of product management at ANSYS. Companies are also looking for burst capabilities during peak demand periods without having to invest in additional hardware or software licenses.

“Customers may have some form of on-premise HPC infrastructure, but they want to reduce queue times, or they are going through a refresh cycle on their hardware and want to move workloads off of those resources,” McDevitt says. “But it usually begins with a bursting need.”

As a result, traditional simulation providers are branching out into the cloud, and new cloud-only startups are gaining traction in the market.

“There is an educated market around cloud engineering software now,” says David Heiny, CEO at SimScale. “The large vendors have made steps toward offering cloud solutions, so the industry is more familiar with the approach.”

The appeal of cloud-based simulation varies based on an individual company’s existing investments. “If they have already invested in existing hardware, then the impetus to go to the cloud is not as great,” says Ravi Shankar, director of simulation product marketing for Siemens PLM Software. “Cloud options get more interesting when [companies] are purchasing new hardware. They may have larger models and want to take advantage of better processing.”

Cloud computing allows this NX Nastran automotive body model to be simulated efficiently using parallel processing on a large number of cores. Image courtesy of Siemens PLM Software.

That bottleneck of queuing up for access to on-site HPC resources was the impetus behind the founding of OnScale, according to CEO Ian Campbell. “Legacy CAE is too costly, too risky and too slow,” he explains. “Every engineering firm goes through cycles of engineering workloads, but the way existing CAE models work [is] you have variable workloads and a fixed computer infrastructure that can’t scale to meet those workloads. If demand exceeds supply, you are wasting time waiting for access. If supply exceeds demand, you are wasting budget and resources.”

“The simulation load is never stable,” Heiny says. “You have peak demands. On the cloud, if you want to run 20 simulations on 100 cores, it’s just there. You just run it. You don’t have to talk to the vendor about getting a burst license. With a cloud-based solution like SimScale, it’s there out of the box.”

Different Approaches

Simulation providers have taken slightly different approaches to their cloud products. SimScale was created to provide structural, mechanical, flow and thermal simulation in the cloud, as well as some multiphysics simulations. The company’s platform can be integrated with other design tools and fundamentally supports most general exchange formats. “Everything produced in SimScale can be downloaded and used in another solver,” Heiny says.

SimScale also offers a community plan for free that can be used for open source and hobby projects. A professional pricing level is targeted at proprietary projects and includes unlimited data storage, encryption and real-time support. The enterprise level is for multi-user and more computationally demanding simulations.

The community plans make projects public to other members. “Sharing this know-how in simulation becomes more intense and effective within the same software and with the community,” Heiny says. “There is collaboration built into the product.”

The company is targeting customers that are new to simulation. “Our approach is to make simulation more widely accessible,” Heiny says. Cloud-based simulation not only lowers the cost of entry, but also enables a greater degree of collaboration. “You can collaborate in real time with other stakeholders, who can all view the same simulation projects in real time,” Heiny says. “We have also leveraged data science to automate some of the work, and these things were not possible in the desktop realm.”

ANSYS has worked closely with its own network of cloud hosting partners to optimize its products to run on their infrastructure to ensure reliable availability via their data centers. “We make sure that our solvers and our products are robust and scalable enough to take advantage of cloud resources, that we are parallelizing on the new GPU instances and other HPC architectures,” McDevitt says. “We also don’t force our customers into a particular business model. Our customers can take their paid licenses to the cloud and use them there, and we’ve introduced a usage-based licensing system.”

Moving streamlines show airflow around a Chevy Traverse rendered using photorealistic ray tracing. Visualized using ANSYS Ensight. Image courtesy of ANSYS and General Motors.

Siemens offers cloud-based simulation primarily through its partnership with Rescale, providing access to NX Nastran, Simcenter 3D and Star CCM+. Customers can either purchase a fully software-as-a-service (SaaS) based license through Rescale, or they can purchase a traditional license from Siemens and use Rescale to host the software.

“Our philosophy and strategy is to offer as much flexibility as possible in terms of the way customers can interact with the software solutions we offer,” Shankar says. “The cloud model adds another layer of that flexibility.”

Moving forward, he says the company plans to eventually offer solutions that were built natively for the cloud.

“Demand for cloud has varied,” Shankar says. “It’s not a major part [of] our business today, but we are making sure we have everything in place for customers who do need it today, and we anticipate future growth.”

OnScale offers its software for free (for 10 core hours per month) and leverages Amazon Web Services (AWS) for its compute platform. “Users don’t have to pay all of that money per seat. The service is sold on a subscription basis,” Campbell says.

OnScale is targeting markets that Campbell notes are overlooked by other simulation companies, like internet of things (IoT) organizations that don’t necessarily have access to tools that meet their requirements. OnScale claims it can outperform traditional simulation tools thanks to its access to nearly unlimited compute resources on AWS, as well as perform tasks that would be difficult or impossible using traditional infrastructure, like 3D versions of surface acoustic wave filters.

Some experts believe that a full SaaS model is the future of engineering. “Engineers are enthusiastic about the removal of the licensing and computational bottlenecks,” says Gerry Harvey, vice president of engineering at OnScale.

Other new tools are emerging, and traditional players are partnering with cloud-based firms as well. Earlier this year, SimScale announced it was integrating Siemens Parasolid software and HOOPS Exchange to provide a more seamless simulation workflow and improved accuracy.

Autodesk offers cloud simulation for Autodesk CFD, Moldflow and Fusion360. Plastic simulation provider Moldex3D has also released a cloud extension of its software on AWS, allowing users to offload larger simulations without the expense of investing in new hardware or software licenses.

Rescale provides cloud-based access to a wide variety of applications, different cloud providers and hybrid on-premise data centers. In addition to working with ANSYS and Siemens, Rescale also offers X2 Firebird CAE software from Xplicit Computing, charging a flat, hourly rate. Other partners include Autodesk, CAE Solutions, COMSOL and Dassault Systèmes.

Greater Flexibility

Customers take different approaches to cloud deployments, based on the size of their models and their own existing infrastructure. Often they want burst access or are looking for other ways to augment their own computing infrastructure. Smaller firms may rely entirely on cloud solutions for their simulation needs.

ANSYS Fluent simulation of oil volume in a gerotor pump showing the extent of cavitation (red) on the gear wall. Visualized using ANSYS Ensight. Image courtesy of ANSYS.

The cloud also provides hardware elasticity. Different solvers and physics are usually optimized to run on different types of hardware. Cloud-based solutions allow engineers to access a much wider variety of hardware instances than would be possible to support on premise (at least without significant costs). How much memory per core a user needs is different for computational fluid dynamics (CFD) than for high-frequency electromagnetic or mechanical simulations—the cloud provides access to all of these configurations.
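The memory-per-core point can be sketched as a simple matching rule. The solver categories, memory figures, and instance shapes below are invented for illustration; they are not vendor recommendations.

```python
# Toy sketch: pick a cloud instance shape from a solver's memory-per-core
# requirement. All numbers here are illustrative, not real sizing guidance.

MEM_PER_CORE_GB = {
    "cfd": 2,              # CFD often runs well with modest memory per core
    "electromagnetic": 8,  # high-frequency EM solves can be memory-hungry
    "mechanical": 4,
}

INSTANCES = [
    {"name": "compute_opt", "cores": 64, "mem_gb": 128},
    {"name": "general",     "cores": 32, "mem_gb": 128},
    {"name": "memory_opt",  "cores": 16, "mem_gb": 256},
]

def pick_instance(solver):
    """Return the first instance with enough memory per core for the solver."""
    need = MEM_PER_CORE_GB[solver]
    for inst in INSTANCES:
        if inst["mem_gb"] / inst["cores"] >= need:
            return inst["name"]
    return None  # no listed shape satisfies the requirement

print(pick_instance("cfd"))             # compute_opt (128/64 = 2 GB/core)
print(pick_instance("electromagnetic")) # memory_opt (256/16 = 16 GB/core)
```

On premise, a firm would have to own every one of these shapes to cover all three solver types; in the cloud, the right shape can simply be rented per job.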

Companies with existing on-premise infrastructure may take a hybrid approach, running some simulations on their own equipment and others in the cloud, based on demand. Data is a factor as well, since simulation creates large amounts of data that users don’t necessarily want to move back and forth between systems.

Whether a simulation is run on premise or on a cloud-based infrastructure should be invisible to the engineer. “The platform should send the job to the appropriate resources, so the process looks the same to the engineer,” McDevitt says. Those decisions about resources can be made at the application layer rather than the platform level.

At ANSYS, McDevitt says that for CFD customers, many users do some pre-processing locally and then push out the solve portion to the cloud so that they can achieve faster turnaround times. That has been driven, in part, by model size and complexity.

“On the mechanical side, we see customers want[ing] to be more interactive and do more pre- and post-processing on the cloud,” McDevitt says. In the electronic space, customers may solve for a particular frequency on their workstation, but move to the cloud to repeat the same test across multiple other frequencies.

“That’s similar to a design point scenario, where customers want to look at different geometry parameters or different material configurations to optimize design points,” McDevitt says. “They’ll solve one design point and then go back for a hundred or a thousand different variations.”

Shankar explains that the Siemens customers most interested in cloud-based offerings have been those with larger models that want to take advantage of parallel processing or burst simulations. “They are able to use the extra capacity that the cloud provides almost instantaneously, without the need for IT to set up new hardware,” Shankar says.

That doesn’t mean that the cloud will work for every company. Users looking for steady-state usage won’t find an economic benefit to cloud platforms. “It’s like selling your car, and then paying for an Uber driver to sit outside your house 24/7,” McDevitt says. “The economics are not going to work out.”

Going Off-Cloud

Some aerospace and defense customers cannot utilize certain types of cloud services because of certification requirements or security issues. Companies that primarily are working with smaller models that can compute quickly, or those that already have unused compute capacity in house, are not likely to benefit from a move to the cloud. “If the models are not taking advantage of parallel processing, then those would be situations where the cloud doesn’t offer an advantage,” Shankar says.

“And if you just invested in a new HPC architecture, you first want to make sure you can leverage that hardware,” Heiny adds.

He says that the biggest challenge for cloud simulation customers is one shared by their on-premise counterparts—users must trust the simulation results.

“If they aren’t ready to take advantage of those insights, then there’s no point in doing it, and that’s true of both on-premise and cloud solutions,” he says. “That’s where collaboration and support come in handy. We have a dedicated onboarding program to work with customers to get them to the point that they can trust in what they are doing. That’s mission critical. The cloud doesn’t remove that obstacle, but the tools that make the user successful allow us to get there faster than in the desktop realm.”

>> Originally posted by Brian Albright, Digital Engineering, June 1, 2018

What Manufacturers Should Do With IoT Data Once They Gather It

Manufacturers are pumping the IoT full of billions of dollars every year — and several claim great success as a result. While some have undoubtedly benefitted more from the IoT than others, there is one problem: What do we do with all the incoming data?

To address this growing issue, manufacturers turn to various data management strategies.

While some find one works best in their case, others use a combination of tactics to deal with the flood of new information. Here are four real-world examples of tactics some companies have tried.

Improve Production Through Automation

Tech-savvy manufacturers already use the IoT in production flow planning and monitoring. Not only does this eliminate unnecessary waste and improve factory sustainability, but it also increases profitability in the long run.

Lido Stone Works recently automated much of its production activities and has already experienced a 30 percent increase in productivity. Its new equipment features built-in diagnostics and communications, which minimizes strain on maintenance technicians.

Bolster Safety and Security

While breakthroughs like the IoT are a boon to manufacturers, they pose a myriad of new risks and dangers that weren’t necessarily a concern in the past. It’s best to tackle digital safety on a case-by-case basis. In an industrial plant, digital security might use systems that require two-factor authentication (2FA) or video analytics to detect chemical spills.

Smart factories often use cameras that automatically focus on a specific area during an alert or alarm. In many ways, both digital and physical security complement one another in the next-gen factory or warehouse.

Wearable devices are equipped with built-in sensors that monitor air quality levels, detect the presence of dangerous chemicals and alert the user when they’re approaching unsafe conditions. Other devices, like a tool by SmartCap Technologies out of Australia, monitor truck drivers and equipment operators for levels of fatigue or drowsiness.

Enhance the Supply Chain

Next-gen technology presents a striking contrast in product visibility. While consumers can track deliveries like pizza and their next Amazon shipment, many Fortune 500 companies can’t even monitor the status of shipments worth millions of dollars.

The IoT hopes to solve that issue by providing increased visibility throughout every step of the modern supply chain.

In its current form, the IoT is ideal for tracking and managing assets — including raw materials, finished goods and even personnel. Tech-savvy manufacturers use sensors to collect, monitor and transmit data to their analysts.

Such data has the potential to uncover new, faster delivery routes, maintain temperature-controlled environments more efficiently and ensure product integrity before a shipment leaves the warehouse.

Such transparency and visibility wouldn’t be possible without a highly mobilized workforce and innovations like the IoT.

Strengthen the Customer Experience

Manufacturers are also using the IoT to strengthen the overall customer experience. In the retail sector, customer service often involves face-to-face interaction with the general public. But in manufacturing, this usually amounts to negotiating with third-party distributors over pricing, shipping logistics or project scheduling.

Regardless, paying customers are the lifeblood of any manufacturer. That’s why many companies use the IoT to cultivate strong professional relationships with their customers — and it has much to offer. From computer systems that accurately forecast inventory needs to greater visibility of products and services, the IoT is already changing the way manufacturers interact with customers.

Kaeser Kompressoren, known as Kaeser Compressors in the United States, now offers compressed air-as-a-service — which spares customers the expense of having to purchase industrial-scale air compressors and machinery.

Find What Works and Stick to It

The IoT isn’t a standardized solution to any specific problem. Instead, it’s a highly flexible and adaptable tool that caters to several individual needs. While some use the IoT for rapid costing, plant load optimization or advanced reporting, others use it on the factory floor and right alongside human workers. Find what works for your company, then stick with it.

>> Originally posted by Kayla Matthews, Manufacturing.Net, 5/24/2018

3D-Printed Tooling Offers Durability for Precast Concrete

As an alternative to wooden tooling, 3D-printed forms for precast concrete are proving to be more durable and better able to support a large-scale renovation project.

There are certain applications today where 3D printing makes sense. An injection molder might choose to print a small batch of plastic parts that would be cost-prohibitive to mold. A machine shop might invest in a 3D printer to make jigs to aid in inspecting short runs of parts. A service bureau might rely on 3D printing for product development work, as a way to make prototypes quickly and easily.

What these scenarios have in common is that 3D printing makes it easier to produce a small, custom quantity. A few dozen parts. A couple of temporary fixtures. One prototype. There are exceptions, but manufacturers today don’t necessarily see 3D printing as the solution for repeatable, high-volume jobs.

One of those exceptions is proving to be the precast concrete industry. Gate Precast, a supplier of precast structural and architectural concrete, is finding that 3D-printed tooling is exactly the right solution for a job requiring high repeatability over many concrete pours: manufacturing hundreds of punched windows for the façade of a 42-story building in New York City. For this large-scale project, 3D-printed forms have proved their worth in terms of faster lead times, increased durability and better quality in the end product.

Enter 3D Printing

Pouring these precast punched windows for the Domino Sugar Refinery apartment complex was a large job for Gate Precast, requiring multiple concrete casts in three different profiles. Durable forms 3D-printed with Big Area Additive Manufacturing (BAAM) printers have enabled the company to deliver quality precast pieces within a shorter timeline than wooden forms.

Gate Precast manufactures precast concrete in nine locations nationwide. Some of these facilities focus on structural concrete—weight-bearing items like the beams and columns that make up parking garages—which is produced with metal forms, often in very large quantities. Others, like its Winchester, Kentucky, plant, specialize in architectural pieces which are typically made in smaller batches.

For most jobs, the Winchester plant builds its own concrete forms from plywood and fiberglass through an in-house carpentry department. These forms are not highly durable, but they don’t need to be. A wood form will start to break down, typically after 15 to 20 castings. But for a typical job where only 5 to 10 castings might be needed, this is no problem. It’s larger jobs that are the challenge, when multiple forms must be built to support many concrete pours—which is where 3D printing comes in.

Oak Ridge National Laboratory (ORNL) brought the idea to use 3D printing for concrete tooling to the Precast/Prestressed Concrete Institute (PCI), the industry’s technical institute and trade association, as the result of a collaborative research project. No printing was involved in that project, which focused on developing a lighter, thinner precast insulated panel. But ORNL is famously a co-developer and user of Big Area Additive Manufacturing (BAAM), the 3D printing system now sold commercially by Cincinnati Inc. that uses fiber-reinforced polymer to rapidly build large structures such as automotive parts and lay-up molds for composites.

ORNL hypothesized that 3D-printed molds might help precasters keep up with demand for concrete forms as master carpenters and form builders retreat from the labor force, while also shortening lead times, offering improved durability and potentially reducing costs. To test these ideas, the lab began 3D printing some test forms in a cornice shape, which were then sent to Gate Precast’s Ashland City, Tennessee, plant, where between 30 and 40 cornice segments were successfully cast with no deterioration of the tooling.

The Right Job Comes Along

This testing was taking place around the same time that Gate Precast had won a major new job: the production of punched windows for the Domino Sugar Refinery development. This Brooklyn, New York, development will feature several new buildings and recreational space surrounding the historic Domino Sugar Refinery. One of these new constructions is a 42-story complex that includes office and retail space as well as apartments. Gate Precast is making 993 precast concrete panels for the residential portion at the facility in Winchester, Kentucky. (Its Oxford, North Carolina, plant is making an additional 612 panels for the commercial and retail parts of the building.)

Gate Precast won the Domino Sugar Refinery job with a quote based on the cost of making the forms in wood, but knew that this would be a major undertaking. Just to build the wooden forms would have taken 9 months, says Steve Schweitzer, vice president of operations. It wasn’t a given that Gate Precast would be able to build all these forms itself, and farming the work out to subcontractors would likely drive the price up. But with the success of the cornice forms in testing in Tennessee, 3D printing began to look like a viable option.

3D-printed forms may not be worthwhile for a lot of precast concrete jobs. Architectural precast work, like what the Winchester plant performs, can be small batches and one-offs for which a wooden form can be built quickly and cheaply. But the Domino Sugar Refinery project is a job on a completely different scale, illustrating the transformative possibilities of 3D printing for concrete applications. Casting the hundreds of punched windows needed would demand durability and repeatability from the tooling—two things difficult to achieve with forms made of wood.

Gate Precast approached ORNL and Two Trees, the contractor and owner of the Domino Sugar development, to discuss using 3D-printed tooling for the precast windows. Doing so also meant proposing a design change: reducing the number of different window profiles used on the building to make it cost-effective and practical to 3D print the forms. Fortunately, the owner liked the idea of using an unusual technology in the project and approved the change.

For the Domino Sugar Refinery project, Gate Precast is using forms provided by Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, and Additive Engineering Solutions (AES) in Akron, Ohio. The forms are 3D-printed from carbon fiber-filled polymer and then machined on the critical surfaces.

ORNL went to work manufacturing 40 3D-printed forms in five different window profiles at its facility in Knoxville, Tennessee, and eventually Additive Engineering Solutions (AES), a commercial BAAM user in Akron, Ohio, was brought on to help make the forms as well. About half of the forms are being manufactured in each location.

It takes between 8 and 11 hours to print each concrete form to its “as-printed” model in the Big Area Additive Manufacturing (BAAM) 3D printer, says Austin Schmidt, AES president. Image courtesy AES.

The process starts with developing two CAD models for each mold: an “as-printed” version and an “as-machined” version, says Austin Schmidt, AES president. The “as-printed” model incorporates sufficient stock for the machining process and is fed into the printer as an STL file. Each form weighs between 450 and 750 lbs and takes 8 to 11 hours to print, Schmidt says. Following printing, a large five-axis router machines each form to the “as-machined” model. Light sanding and inspection follows before the forms are shipped to Kentucky.
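The relationship between the two models can be sketched as a simple offset calculation. This is purely an illustration of the concept: the function name, the 0.25-inch stock allowance, and the dimension values are invented here, not taken from AES’s actual CAD workflow.

```python
# Hypothetical sketch of the "as-printed" vs. "as-machined" mold workflow:
# the printed geometry carries extra stock on each critical surface so the
# five-axis router can machine it down to the smooth final profile.
# The 0.25 in allowance is an assumed, illustrative value.

def as_printed_dims(as_machined_dims, stock_per_side=0.25):
    """Add machining stock (inches) on both sides of every critical dimension."""
    return {name: dim + 2 * stock_per_side for name, dim in as_machined_dims.items()}

# Overall envelope of the largest form cited in the article (inches).
target = {"length": 104.0, "width": 67.0, "height": 21.0}
printed = as_printed_dims(target, stock_per_side=0.25)

print(printed)  # {'length': 104.5, 'width': 67.5, 'height': 21.5}
```

In practice the stock would be applied per-surface in the CAD model rather than per-dimension, but the principle is the same: print oversized, then machine to net shape.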

On one of the largest forms, measuring 104 by 67 by 21 inches, AES enlisted help from Thermwood. The company’s Large Scale Additive Manufacturing (LSAM) system, which features both an extrusion head and five-axis router on independent gantries, made it possible to manufacture this form completely on one machine.

Pouring Concrete with a 3D-Printed Form

In practice, pouring concrete with a 3D-printed form is not much different from pouring with a wooden one. The windows are typically cast three together on top of a large platform in Gate Precast’s Winchester facility. The 3D-printed forms actually represent the front and inner sides of the window cavities, with the outer edges formed by removable plywood walls. Both the forms and sides are treated with a form release oil to help with the unmolding process.

Although they are not structural features, the punched windows still require reinforcement in the form of rebar, which helps to support the windows during stripping and shipping. The rebar is set into the empty form before the pour, along with lifting inserts and tiebacks that will serve to enable shipping, lifting and eventually hold the windows onto the building.

An overhead crane carries buckets of concrete to the platform; it takes about three buckets of concrete to fill one of the three-window forms, Schweitzer says, with each concrete cast weighing 20,000 to 30,000 lbs depending on the profile. The pour stops periodically to allow the platform to vibrate and consolidate the concrete around the rebar, removing any air pockets from the face. Once the form is filled, workers manually smooth the top surface of the concrete (which will be the back of the windows) and clear the areas around the lifting and tieback inserts.

Concrete pours take place each day in the afternoon, and then cure for 12 to 14 hours. The plywood walls are stripped from the cast first, and then the concrete pieces are lifted from the forms around 3 a.m. each morning. Each window frame is manually cleaned with an acid wash spray that exposes the sand in the mix. Then, some of the front faces are polished to expose the sand and aggregate. The resulting finish sparkles—quite intentionally—like sugar cubes.

Finally, before shipping the panels off to Brooklyn, Gate Precast is also installing the glass for many of the windows. This is a new assembly step for the company, says Schweitzer, but a good example of how precast concrete can provide savings in cost and labor by shifting assembly tasks off-site.

Lessons Learned

Getting to the point of producing windows with these forms took a bit of R&D. Pours on early 3D-printed forms resulted in what Schweitzer calls “the corduroy effect”—visible bead lines from the 3D print that transferred into the concrete. ORNL solved this problem by increasing the bead size in the BAAM and overprinting the forms so that the excess could be machined down to a smooth surface. Gate Precast also discovered that the bottom faces of the forms were not necessarily square as-printed, so the manufacturers have to flip each form after milling the walls to then CNC the bottom to a flat surface.

AES also cites thermal distortion as a challenge in working with parts of this size. “Even though we are using carbon fiber-filled ABS, the parts still have a tendency to warp slightly,” Schmidt says, “and that must be accounted for during the design process.” (Like Gate Precast, AES typically deals in small quantities but has benefited from the repetitive nature of this job. Manufacturing multiple pieces that are very similar “has given us the ability to really hone in our process,” Schmidt says. “This was a welcomed bonus as we are usually making one-off parts.”)

There were also challenges with some specific features. For instance, one of the window profiles has an indentation in the frame which must be supported by the 3D-printed form. Early forms lacked reinforcement under this feature, and the weight of the concrete caused sagging in the form and final product. It took about half a dozen design iterations to achieve a workable solution, Schweitzer says. The forms for this profile now include ribs under the indentation to support the concrete during the pour and curing process.

Maintenance has also been an ongoing process of discovery. While the 3D-printed molds are far more durable than their wooden counterparts, they still require regular maintenance as they age. “We know we’re going to ding it up,” Schweitzer says, so “How do we repair it?” is a necessary question to answer. The most common maintenance issue is separation of the 3D-printed layers at the bead line, which Gate Precast mitigates with Bondo (commonly used on the wooden forms as well) and the use of a heat gun to deposit polymer pellets (which are then sanded down to a smooth finish).

A Piece of the Pie

Gate Precast estimated that each wood form necessary for the Domino Sugar building would cost between $1,500 and $1,800, and require substantial skill and labor to produce. With the sheer volume of forms needed, this approach would have been costly and time-consuming. In addition, a wooden form would be good for about 15 to 20 pours before requiring maintenance or scrapping.

Each 3D-printed concrete form, by contrast, costs about $9,000—by no means a small investment. However, this project should show that a printed form can support as many as 200 concrete pours in its lifetime. For the consistency and high volume required on the Domino Sugar project, the repeatability and durability of these forms make sense. The 3D-printed forms also provide an aesthetic benefit: Compared side-by-side with concrete casts made on wood forms, the 3D-printed precast window frames can be smoother, with sharper corners.
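A quick back-of-the-envelope calculation using the figures cited above shows why the economics work at this volume, even with the higher up-front price:

```python
# Cost per pour, using the midpoints of the figures cited in the article.
wood_cost = (1500 + 1800) / 2   # ~$1,650 per wooden form
wood_life = (15 + 20) / 2       # ~17.5 pours before maintenance or scrapping

printed_cost = 9000             # per 3D-printed form
printed_life = 200              # pours a printed form may support

wood_per_pour = wood_cost / wood_life           # ≈ $94 per pour
printed_per_pour = printed_cost / printed_life  # $45 per pour

print(round(wood_per_pour), round(printed_per_pour))  # 94 45
```

On a per-pour basis, the printed tooling comes out at roughly half the cost of wood—before counting the labor and lead-time savings of not building forms in-house.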

The project has gone so well so far that PCI has been using Gate Precast as an example for its members. The Winchester plant has hosted several groups of precasters who are interested in seeing how 3D printing can support their work. These tours include representatives from companies that could be considered Gate Precast’s competitors—but that doesn’t worry them.

“Our competitors are not other precasters, but cast-in-place concrete and other materials,” Schweitzer explains. For Gate Precast, “It’s about getting a bigger piece of the building pie, not just the precast pie.” The 3D-printed forms for the Domino Sugar windows are providing proof of concept that could win more of that pie for precast concrete.

>> Originally posted by Stephanie Hendrixson, Additive Manufacturing Magazine, 5/15/2018

15 ‘Facts’ About Robot Integration That Everyone Thinks Are True

Integration… It’s hard, right? You need an integrator, years of robotics experience, lots of time… Nope! These common “facts” are all fiction.

There are a lot of misconceptions about robotics.

Some of these myths are so maddeningly common and misguided that we’ve covered them many times before (e.g. robots steal jobs, robots will kill us all, etc). I won’t bore you by covering these again. They are the type of myth that the general public believes, but those of us with engineering experience know that they are not true.

However, some misguided “facts” are believed even by engineers and business owners. These concern robot integration.

A lot of people wrongly believe that robotics is not for them because they are misled by these integration myths.

Here are 15 of the most pervasive “facts” which stop intelligent people like you from experiencing the great benefits of robotics.

“If you can work a smartphone, you can work a robot.” – Sabrina Thompson, production employee, Scott Fetzer Group
  1. Integration requires an integrator
    It makes sense, right? You need to hire an external robot integrator to integrate your robot. Wrong!

    In the past, yes, it was true. In the past, robot integration was a complex and highly-skilled job requiring years of programming experience. In some specific cases, you do still need an integrator these days. However, for many applications, you can get started with robotics with no external integrator at all. This is particularly true with collaborative robots.

  2. Integration requires robotics experience
    Nope! We’ve seen countless professionals who had no previous robotics experience manage to integrate robots with no problems. Take the example of engineer Victor Canton from our case study of Continental in Spain. Even with no experience in robotics, he was able to integrate a robot for their Quality Testing task.
  3. Robots are only suitable for mass production
    This pervasive myth originates from an out-of-date idea of robotics. From the 1960s (when the first industrial robot was built) through to the 1990s, robots were solely the tools of mass production. They were only used by industries like automotive, where the high volume, low mix environment suited the inflexible industrial robots of the time. These days, however, robots are just as applicable to low volume, high mix environments as they are for mass production.
  4. Robots need vision sensors
    People often think that robots need to have vision sensors because we humans need our eyes to see what we’re doing. However, this is not necessarily true. Robot vision can make integration harder than it needs to be (even with easy-to-use vision sensors). Many tasks do not require vision. As a result, such tasks are very simple to integrate.
  5.  Integration takes a long time
    Despite what many people think, there is no need for robot integration to take a long time. This misconception stems from the fact that traditional industrial robotics integration was a very complex and time-consuming affair. At Robotiq, we are committed to helping people get robots up-and-running quickly. We have seen people integrate the bare-bones of a robot application within just a few hours.
  6. Integration is expensive
    The cost of integration has been falling over the last decade or so. Integration is now much more cost-effective, even for industrial robots, which are more expensive to integrate than collaborative robots. With industrial robots, integration costs increase the overall cost of the robot by 300%. With collaborative robots, integration costs can be negligible.
  7. Integration is complex
    Robots sound complicated, right? However, it is no longer the case that integration has to be complex. Even if you have never used a robot before, integration can be very simple. Sabrina Thompson from our case study of the Scott Fetzer group said: “If you can work a smartphone, you can work a robot.” Robot programming is much easier than it used to be.
  8. Robotics means changing production layout
    “We can’t integrate a robot because we don’t want to change our process.”

    This is a common concern. People think that robots will cause a major upheaval and require the production layout to be changed.

    In reality, you can slot a robot into a small space in an existing work cell without changing the rest of the process at all.

  9. Robots need a lot of technology
    Sensors, security fences, controller add-ons, advanced fixings, part positioners… when many people think of a robot cell, they imagine it with lots of added technology. This may be because we are used to seeing images and videos of industrial robots kitted out with all the technological extras.

    You can actually get a robot running with nothing more than the robot itself, a gripper and a controller. Simple.

  10. Robots are only for new tasks
    People sometimes think that robots are only applicable to new tasks in their business. They can’t imagine how a robot could be used for their current, manual tasks. They wrongly assume that they will only be able to automate once they scale their business and design new processes.

    In actual fact, manual tasks are often the best tasks to automate. Once you have identified a suitable application, it is much quicker and easier to automate an existing task than to create a new application from scratch.

  11. Robots are too dangerous to integrate
    Safety is at the top of some people’s list as a reason that they can’t integrate a robot themselves. They worry that they might make a mistake and create a robot which is dangerous.

    With traditional industrial robots, safety was always a huge concern. However, with collaborative robots, safety features are built in. This means that safe integration is almost guaranteed.

  12. Safety standards are too complex
    Although collaborative robots are inherently safe, they still need a risk assessment. Some people are put off by this, thinking that they will have to learn about robotic safety standards in order to integrate their own robots.

    Cobot safety standards are actually simple to get your head around. You can download our eBook How to Perform a Risk Assessment for Collaborative Robots for a practical guide to the essential safety standards.

  13. High changeovers are a no-go
    A lot of the companies we hear from think that they can’t integrate a robot into their process because they have too many changeovers. They think that robots are only useful for long runs of product. However, environments with a high number of changeovers can benefit a lot from adding a robot. The trick is to pick the right robot gripper and tooling.
  14. Robots have limited uses
    Sometimes, people can only think of one or two potential applications for robots and assume that robots have limited applications. For example, they might only be able to think of welding or painting because these are classic examples of traditional industrial robotic tasks.

    In reality, there are so many potential uses of robots that there are too many for us to count. We are regularly amazed by the many new and inventive applications that people come up with when they first start using collaborative robots.

  15. Getting started with robots is hard
    Probably the most pervasive misconception is that it’s hard to start using robots.

    Nothing could be further from the truth! These days, it is easier than ever to start using a robot with almost no training at all. You can integrate your own robots quickly with no previous experience.

    At Robotiq, we’re committed to doing all that we can to help you get started with robotics.

    Got a task you’re thinking about automating? Enter the detail into our Blueprints form and we’ll help you out.

>> Originally posted by Alex Owen-Hill, Robotiq, May 8, 2018

Zoomorphism: Making More Natural Robots To Answer Industrial Woes

Zoomorphism is the attribution of animal characteristics to inanimate objects, an approach with the potential to change the way humans design robots. Jonathan Wilkins, marketing director at equipment supplier EU Automation, explains how zoomorphism can shape the design of better industrial robots.

The nation was touched when the groundbreaking BBC documentary, Spy in the Wild, broadcast a group of Indian langur monkeys mourning the death of a robotic baby monkey they had accepted into their group. The robotic monkey was filming the group when it fell from a height and was taken out of action. As it lay still, silence spread through the group and, one by one, the monkeys began to hug and console each other in a show of grief.

This touching story has helped scientists learn about group behaviour in a novel way, and it demonstrates that engineers can create a robot so natural in its movements and mannerisms that a group of relatively intelligent animals could not tell it apart from one of their own.

While engineers have yet to overcome the uncanny valley for humanoid androids, there is a lot we can learn from the development of robots that exhibit animal characteristics. Three areas are particularly interesting: grippers, limbs and artificial intelligence software.


Grippers

One of the biggest barriers to the adoption of industrial robots on picking and packing lines has been finding adequate grippers that can pick objects of varying size, shape and weight quickly and accurately without damaging or deforming the product.

This is especially important in the food and beverage sector. In supermarket fulfilment centres, for example, soft hand-like grippers covered in tiny suckers are used to pick and pack items of food, such as heads of lettuce, without damaging the product. At the same time, the grippers are durable enough to handle glass bottles and heavier metal cans of soup.


Limbs

Although there is a tendency to create robots in our own image, why build in human limitations? Modelling the limb movements of robots on those of arthropods, insects and four-legged mammals, such as dogs and cheetahs, allows engineers to create robots that can traverse rough terrain quickly and recover quickly from falls and setbacks.

While this is particularly useful for military applications, it also offers opportunities for industrial use in factories and plants. Here, such robots could serve a more diverse range of applications and replace single-purpose machines such as automated guided vehicles (AGVs), cranes and forklifts.

Software and AI

Creating hardware that is capable of mimicking animal movements is only half the battle. Creating the software and algorithms that can mimic the subtle nuances of human and animal interaction is the other challenge.

Michael Mendelson, a curriculum developer at the NVIDIA Deep Learning Institute, was quoted in Autodesk’s Redshift publication explaining, “Without flexible algorithms, computers can only do what we tell them. Many tasks, especially those involving perception, can’t be translated into rule-based instructions. In a manufacturing context, some of the more immediately interesting applications will involve perception”.

High-resolution machine vision sensors already exist; the challenge is making sense of the high volumes of data they capture in fractions of a second, and progress here will continue to improve areas such as quality control. Imagine a robot capable of seeing microscopic defects in an integrated circuit board. Also, envision a collaborative robot (CoBot) that can stop an accident when working alongside a human being by catching a falling object or swerving to avoid a collision, without having to bring the factory to a halt.

There are already companies leading the way in zoomorphism-based research and development. Companies such as German automation giant Festo and US robotics expert Boston Dynamics are already pushing the boundaries of what robots can do, having developed examples of birds, sea creatures and mammals in robot form.

By learning the right lessons and embracing what the natural world offers, engineers can go beyond the ordinary and create robots that elicit a truly emotional response.

>> Originally posted by Jonathan Wilkins, Product Design & Development, 5/25/2018

How data will build the factories of the future

With the continuing rise of robotics and automation on the factory floor, it’s not just UK manufacturing that needs to get to grips with these technological advances. Leo Craig, General Manager of Riello UPS, explains the impact Industry 4.0 will have on the data centre and power protection sectors too.

‘Made Smarter’, the Government’s industry-led review of industrial digitalisation, claims Industry 4.0 could boost UK manufacturing by £455 billion and create 175,000 jobs whilst cutting CO2 emissions by 4.5%. On a global scale, Accenture research reveals the ‘Industrial Internet of Things’ (IIoT) could add more than $14 trillion to the global economy by 2030, the equivalent of the current GDP of the UK, France, Germany, Italy, Spain, Canada, the Netherlands, and Belgium combined.

When you comprehend the sheer scale of those figures, it’s no surprise why the majority of UK manufacturers are keen to embrace the possibilities provided by increased automation, artificial intelligence, machine learning, and robotics.

The factory floor is home to hundreds of machines, from industrial plant and production lines, through to air conditioning units and the vital uninterruptible power supply (UPS) systems keeping the electricity flowing. Each of these devices is fitted with countless sensors that produce priceless data and enable them to interact with each other.

Combine this constant flow of data with intelligent, real-time analysis and insight, and the outcome is obvious. Reduced processing flaws, improved production quality, enhanced efficiency, optimised supply chains, better maintenance – working smarter offers some serious time and money savings.

As with any industrial revolution, there are questions and concerns. Nearly a third of UK jobs (30%) are said to be at risk from the ‘rise of the robots’, although the reality is likely to see many traditional roles displaced rather than mass unemployment.

There are big questions to answer for us in the datacentre and critical power protection industries too, namely how we contend with the enormous volumes of data that smart factory, IoT-connected devices will create. The performance logs from a single works machine can generate around 5 gigabytes (GB) of data per week, and a typical smart factory produces around 5 petabytes (PB) per week – that’s 5 million GB, the equivalent of more than 300,000 16 GB iPhones!
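The scale of those figures is easy to verify with a line or two of arithmetic (using decimal units, i.e. 1 PB = 1,000,000 GB, as the article does):

```python
# Data volumes cited above, in decimal units (1 PB = 1,000,000 GB).
GB_PER_MACHINE_WEEK = 5                # one works machine's performance logs
PB_IN_GB = 1_000_000

factory_gb_per_week = 5 * PB_IN_GB     # a typical smart factory: 5 PB/week
iphones_16gb = factory_gb_per_week / 16  # how many 16 GB iPhones that fills

print(int(iphones_16gb))  # 312500 -- i.e. "more than 300,000" iPhones
```

It also means a 5 PB/week factory is producing the logs of a million individual machines' worth of data, which is why the storage question below is not hypothetical.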

Big Data, big opportunities

For manufacturers to fully tap into the potential of Industry 4.0, they must combine data from their connected devices with AI, processing, and analytics. Whereas a traditional factory produces goods, a smart factory produces goods and data; the two must go hand-in-hand to realise the advances in productivity and efficiency.

These big data benefits are far-reaching across the factory floor and used in all areas of operations, from product quality and stock control, through to supply chain optimisation and improved health and safety. The data produced by a smart, connected UPS can feed into AI-influenced decisions that impact a plant’s power consumption, energy efficiency, and machinery maintenance regime.

As much as 30% of a manufacturer’s annual revenues can be lost through defects in the production process. By analysing real-time data from sensors on the production line, many quality issues are spotted and rectified as soon as they arise.

Tech giant Intel famously used data analytics to help predict equipment failure in its microchip manufacturing. The outcomes were exceptional: a 50% reduction in maintenance time, 25% higher yields, and a 20% reduction in the cost of spare parts, all adding up to a saving of $3 million. The flip side is that producing this positive outcome required processing 5 terabytes (TB) of machine data per hour!

Another crucial area where data from IIoT devices is having a major impact is machinery maintenance. Industrial equipment tends to be serviced on a fixed schedule (i.e. monthly or yearly) regardless of its operating condition. Analysing performance statistics produced by the machines enables preventive maintenance to be carried out based on needs rather than time, leading to less wasted labour and reducing the risk of unexpected failure.
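The condition-based approach can be sketched very simply: service is triggered when a rolling average of a sensor reading drifts past a limit, rather than when the calendar says so. The function name, threshold, and readings below are invented for illustration; real predictive-maintenance systems use far richer models.

```python
# Minimal sketch of condition-based (rather than calendar-based) maintenance.
# Thresholds and readings are illustrative assumptions, not from any vendor.
from collections import deque

def needs_service(readings, window=5, limit=80.0):
    """Flag maintenance when the mean of the last `window` readings exceeds `limit`."""
    recent = deque(maxlen=window)
    for r in readings:
        recent.append(r)
        if len(recent) == window and sum(recent) / window > limit:
            return True
    return False

# e.g. bearing vibration trending upward (arbitrary units)
vibration = [60, 62, 65, 70, 78, 83, 85, 88, 90, 92]
print(needs_service(vibration))  # True -- schedule service before failure
```

A machine whose readings stay flat never triggers the flag, so no labour is spent on it, which is exactly the "maintenance based on needs rather than time" idea described above.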

Whether it’s measuring damage or deterioration during transit to optimise packaging materials, studying consumer trends and buyer behaviour in real-time to inform production output, or monitoring KPIs such as staff absence or workplace injuries to improve in-house processes, the impact of connected devices and data analysis impacts every aspect of industrial life.

Manufacturers even glean valuable insight from sensors built-in to many products long after they’ve been sold to the end-user. Data demonstrates how an item is being used – is its performance impacted by the surrounding environment? What features are customers utilising the most? Precious insight that feeds back into the ongoing product development process.

The rise of the robots – or should that be the Co-Bots?

Of course, whenever there’s discussion about Industry 4.0 and smart factories, the talk soon turns to robots. Robotic process automation (RPA) has always played a huge part in driving forward manufacturing efficiencies.

Robots have automated so many of the dull, manual, and even unsafe tasks that we humans used to perform. They perform these tasks better, quicker, and more accurately than we ever could. And unlike us, robots don’t get tired towards the end of a shift, so productivity levels remain consistently high.

Here in the UK, we lag behind many of our competitors. International Federation of Robotics statistics show that there are only 33 robots per 10,000 employees here, well behind Japan (213 per 10,000), Germany (170 per 10,000) and even Sweden (154 per 10,000). And with the Boston Consulting Group predicting a quarter of manufacturing tasks will be carried out by a robot by 2025 (up from 10% at present), that prevailing mindset, and those ratios, need to change.

Advances in computing power and networking technologies mean today’s robots are far ‘smarter’ than their predecessors, not just in the functions they can perform but how they can adapt on the go. Whereas early robots simply carried out the same function again and again, modern robots can adjust their movements in real-time, learn lessons, and even collaborate with each other – all thanks to the quantities of data they and their fellow machines produce.

In factories of the future, it won’t be an either-or choice between humans or robots. Co-bots and workers will be side-by-side on the plant floor. Indeed, by 2020 The Manufacturer predicts 60% of human factory workers will be working alongside automated assistance technologies such as robotics or AI.

What are the Industry 4.0 implications for data centres?

At one time, datacentres were purely a means of storing data. But we all know the industrial internet of things makes everything far more dynamic and fluid. In smart factories across the country, data is being captured, analysed, and processed, all in real-time.

All this additional processing requires extra storage capacity and extra power. It does, of course, provide the data centre industry with fantastic opportunities, but at the same time raises legitimate questions for managers and operators to consider if they are to fully capitalise.

Is your datacentre capable of handling the 5 GB of data that a single smart machine will create each week? Probably. But what about 100 machines? Or 1,000 machines? Or even more, when you think how many devices and sensors can be housed on the factory floor? And how will you balance the need for additional power to keep up with processing demands without requiring a huge expansion in footprint? The classic conundrum of ‘doing more with less’.

Smaller micro datacentres utilising a modular approach are making it possible for data processing facilities to be based either on-site or as close to the location as possible. This edge computing is essential as it gives manufacturers the capabilities to run real-time analytics, rather than vast volumes of data needing to be shipped all the way to the cloud and back for processing.

Modular datacentres give operators the scope to ‘pay as you grow’ as and when the time comes for expansion. The rise of modular UPS provides similar benefits in terms of power protection requirements too. And with all the additional revenues a datacentre could make from Industry 4.0, the need for a reliable and robust continuous supply of electricity becomes even more imperative.

Transformerless modular UPSs deliver higher power density in less space, run far more efficiently at all power loads so they waste less energy, and don’t need as much energy-intensive air conditioning to keep them cool.

Any data centre manager planning to take advantage of manufacturers’ growing data demands would be wise to review their current power protection capabilities. If the existing UPS units are older, bigger, and less efficient models, upgrading to modern, modular versions would be a prudent move to ensure they’re fully prepared for the demands smart factories will bring.

Of course, for certain manufacturers, particularly those with multiple sites spread across several locations, there’ll still be a requirement for some sort of centralised data storage capability, whether that’s an onsite data centre or the cloud.

Increased automation will also leave us questioning many of the fundamental factors usually required for running a factory or datacentre. If processes are automated to the stage where facilities are practically unmanned, do they need lighting or heating? This opens up opportunities for significant energy and cost savings.

Today’s consumers are more demanding than ever. They prefer personalised or custom products to standard ones, and they want them almost instantaneously. Of course, they also expect these unique products to be as accessible and as cheap as mass-produced ones.

The rise of machine learning and Industry 4.0 has the potential to kick-start an ‘Uber-isation’ of manufacturing, where custom products are made on-demand locally, rather than being shipped throughout the world.

>> Originally posted by Leo Craig, The Stack, May 14, 2018

How Digital Twins for Metrology Enable Smart Manufacturing

A Digital Twin (DT) is a software model of a process and/or product. Though we don’t often think of it as such, a 3D CAD model with semantic tolerances (a Model-Based Definition) is a digital twin of a to-be-manufactured part.

Essential in discrete part manufacturing, Computer Numerical Controlled (CNC) Coordinate Measuring Machines (CMMs) are a type of automated machine designed to verify a part’s dimensional compliance with a 3D CAD model, validate the associated manufacturing process, and provide actionable data for process correction and design optimization.

CNC CMMs are very effective at executing programs for these dimensional metrology tasks, but each machine has an incomplete awareness of itself and of the overall context of the manufacturing process in which it resides.

Completing a Digital Twin for a CNC CMM first requires that a 3D software model of the physical machine be created by the CMM’s manufacturer and made available to the CMM’s owner. Functional capabilities (speed, accuracy, working volume) for any configured CMM are then associated with the 3D software model. To further complete the Digital Twin, the machine’s kinematic capabilities are also integrated into the model.

At this point, when synchronized to its real counterpart on the shop floor, the CMM’s Digital Twin is driven by the actual CMM’s operation and can include real-time measured results and process data. This driven Digital Twin can be displayed locally, at a multi-machine monitoring station, or on a tablet while the responsible operator or owner is elsewhere.
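As a rough illustration of what a driven Digital Twin might look like in software, here is a minimal sketch of a twin object that mirrors a machine’s reported state and measurement stream. All class, field, and method names are hypothetical; real CMM vendors expose their own interfaces:

```python
from dataclasses import dataclass, field

@dataclass
class CmmState:
    position_mm: tuple  # current probe position (x, y, z), in millimetres
    program_step: int   # index into the currently running part program

@dataclass
class CmmDigitalTwin:
    model_id: str
    state: CmmState = field(default_factory=lambda: CmmState((0.0, 0.0, 0.0), 0))
    measurements: list = field(default_factory=list)

    def on_machine_update(self, position_mm, program_step, measured_value=None):
        """Called whenever the physical CMM reports fresh state or results."""
        self.state = CmmState(position_mm, program_step)
        if measured_value is not None:
            self.measurements.append(measured_value)

# A dashboard or tablet app would query the twin rather than the machine:
twin = CmmDigitalTwin(model_id="example-cmm")
twin.on_machine_update((120.0, 45.0, 10.0), program_step=3, measured_value=9.998)
```

Because the twin holds a copy of the live state, monitoring clients never need direct access to the machine controller itself.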

To enable a CMM’s Digital Twin to support driving a physical CMM, it is necessary to add another layer of awareness to the Digital Twin’s 3D software model. Humans are quite adept at task-based planning, and a CMM Digital Twin should have complementary capabilities in the highly technical topic of dimensional measurement process planning.

The complex task of measurement planning first requires that the CMM’s Digital Twin be able to import and process the 3D model of the part, including its precise nominal boundaries and tolerances.

The CMM’s Digital Twin then applies a company’s measurement best practices (a set of editable rules), together with its own associated capabilities, to determine what can be measured automatically and what cannot. The final steps in this Digital Twin workflow are collision avoidance calculations, graphical 3D simulation, and the automatic generation of a CNC CMM part program. The person overseeing the measurement process plan generation has the final option to approve or override the Digital Twin’s recommended program.
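The rule-based step of this workflow can be sketched very simply: each part feature is checked against the editable best-practice rules, and anything that fails a rule is flagged for human review. The feature fields and the two sample rules below are invented for illustration and stand in for a company’s real measurement practices:

```python
def plan_measurements(features, rules):
    """Split part features into automatically measurable vs. flagged-for-review."""
    measurable, manual = [], []
    for feature in features:
        if all(rule(feature) for rule in rules):
            measurable.append(feature)
        else:
            manual.append(feature)
    return measurable, manual

# Example rules: the probe must physically fit the feature, and the feature
# must lie inside the configured CMM's working volume.
rules = [
    lambda f: f["diameter_mm"] >= 2.0,   # minimum probe access size (assumed)
    lambda f: f["z_mm"] <= 600.0,        # within working volume (assumed)
]
features = [
    {"name": "bore_A", "diameter_mm": 10.0, "z_mm": 120.0},
    {"name": "micro_hole", "diameter_mm": 0.5, "z_mm": 80.0},
]
auto, review = plan_measurements(features, rules)
```

Because the rules are plain data rather than hard-coded logic, editing a company’s best practices amounts to editing the rule list, which matches the article’s description of the rules as editable.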

In this extended use case, the smarter and more complete CMM Digital Twin can now drive its physical counterpart. The CMM Digital Twin can be locally connected to the physical CMM, or disconnected and operating independently on another software platform or at another location.

When disconnected, this now-smart CMM Digital Twin operates in a software-based manufacturing process planning ‘what if’ system. As a CMM manufacturer’s library of Digital Twins can encompass hundreds of available models and configurations, narrowing down the choice of CMM in this larger context becomes practical, if not essential. The selected Digital Twin can furthermore operate in simulation within a Smart Manufacturing planning system to optimize the overall manufacturing process before actual execution on the shop floor.

Another emerging, but not yet widely implemented, use case for Digital Twins includes not just CMMs but also Machining Centers, Material Handling Robots, and other Smart Machines connected peer-to-peer (machine to machine). Human-mediated communication between CNC machines is the gold standard but can be tedious for a person. Programmable Logic Controllers (PLCs) are reliable for low-level machine-to-machine communication, but are generally considered far less adaptable than people or a machine’s Digital Twin.

Heterogeneous Smart Digital Twins communicating peer-to-peer can be reliable as well as adaptable (to a certain extent) to changing manufacturing requirements. This frees people to attend to exceptions and more effectively orchestrate these capable and smart engines of manufacturing.
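A toy sketch makes the peer-to-peer idea concrete: a machining-centre twin notifies a CMM twin that a part is ready for inspection, with no human or PLC in the loop. The message format and class names here are invented for illustration; real deployments would use an interoperability standard such as MTConnect or OPC UA:

```python
class MachineTwin:
    """Minimal stand-in for a Smart Machine's Digital Twin with a message inbox."""

    def __init__(self, name):
        self.name = name
        self.inbox = []

    def send(self, peer, message):
        # Deliver a tagged message directly to a peer twin (machine to machine).
        peer.inbox.append({"from": self.name, **message})

mill = MachineTwin("machining_centre_1")
cmm = MachineTwin("cmm_7")
mill.send(cmm, {"event": "part_ready", "part_id": "P-1042"})
```

In practice the interesting engineering is in the transport, schema, and error handling, but the pattern is the same: twins exchange structured events and people step in only for exceptions.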

The National Institute of Standards and Technology (NIST) in particular has taken the lead in the digital transformation of U.S. manufacturing by developing a working reference implementation with their Smart Manufacturing Systems (SMS) Test Bed project.

It’s contemplated that ‘Factory Intelligence’, whatever that may be, will emerge from the interactions between skilled staff and cyber-physical systems, with Digital Twins for metrology playing a key role.

>> Originally posted by Larry Maggiano, Manufacturing Business Technology, May 30, 2018