Using Digital Twins to Reduce Costs in Machine Commissioning

The push to develop and release disruptive products ahead of the competition can come unstuck during the early commissioning phases of product development. Too often, design problems only come to light as components or subsystems are being integrated into the machine during the first customer build, forcing the need for late-stage design changes that cause costs to escalate and projects to overrun significantly.

The Business Case for Machine-Level Virtual Commissioning

Machine designers can test a virtual system and catch errors before commissioning. (Image courtesy of Maplesoft.)

In the manufacturing field, virtual commissioning is typically understood to mean the use of software tools for visualizing the complete assembly line or factory. However, commissioning the individual machine before integration into the assembly line allows users to investigate how the mechanisms are driven by actuation systems to provide the required motion.

The primary driver for adopting this approach is the cost of design changes forced by problems identified during the commissioning phases of machine development. Considering the increased labor alone, an issue discovered in the integration and commissioning phases requires 60-100 times more effort to address than it would if it were discovered in the conceptual and design phases. These costs continue to multiply if issues persist after the system is delivered to a customer, resulting in lost production, labor downtime, and major damage to reputations.

Investing in the right tools for virtual commissioning can reduce many of these costs; cost reductions can range from 50 to 100%. Yet most machine manufacturers still view the commissioning stages as the moment of truth and fully expect things to go wrong, treating the time and dollars spent fixing these issues as part of the cost of doing business.

The Virtual Commissioning Process

The digital twin can be defined as a physics-based model that encompasses the dynamics and kinematics of a machine that is made of numerous subsystems from different engineering domains – mechanical, electrical, hydraulic, and so on. The digital twin’s purpose is to predict the response of the machine to input power and varying loads.

Virtual commissioning, in this context, is the act of integrating the various subsystems into the digital twin and then testing its operation through a wide range of duty cycles. Essentially, this process is performing the same tests one would carry out on the real machine, but much earlier and without the outlay of the hardware.
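As a concrete illustration of the idea, the minimal sketch below steps a lumped-parameter twin of a single axis (an inertia driven by a motor through a crude speed controller, with viscous friction) through a short duty cycle of commanded speeds and checks that each setpoint is reached. All parameter values are hypothetical, and the model is far simpler than the multidomain twins the article describes; it is meant only to show the shape of the workflow.

```python
# A minimal "virtual commissioning" pass on a single axis: a lumped-parameter
# twin (motor torque driving an inertia with viscous friction) is stepped
# through a duty cycle, and the simulated motion is checked against the
# requirement before any hardware exists. Parameter values are hypothetical.

J = 0.02      # rotor + load inertia [kg*m^2]
B = 0.05      # viscous friction coefficient [N*m*s/rad]
KT = 0.8      # motor torque constant [N*m/A]
DT = 0.001    # integration step [s]

def run_duty_cycle(duty_cycle, settle_tol=0.05):
    """duty_cycle: list of (duration_s, commanded_speed_rad_s) segments."""
    omega, results = 0.0, []
    for duration, target in duty_cycle:
        for _ in range(int(duration / DT)):
            current = 5.0 * (target - omega)          # crude proportional speed loop
            torque = KT * current
            omega += DT * (torque - B * omega) / J    # forward-Euler integration
        reached = abs(omega - target) <= settle_tol * max(abs(target), 1.0)
        results.append((target, omega, reached))
    return results

if __name__ == "__main__":
    cycle = [(0.5, 10.0), (0.5, 25.0), (0.5, 0.0)]    # hypothetical duty cycle
    for target, actual, ok in run_duty_cycle(cycle):
        print(f"target {target:5.1f} rad/s -> reached {actual:5.2f} rad/s  {'OK' if ok else 'MISS'}")
```

A real digital twin would add the electrical, hydraulic, and structural subsystems and a far better integrator, but the pass/fail check against a duty cycle is the same in spirit.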

Having a better handle on the system behavior can provide engineers with a rigorous platform for developing the control system by connecting and testing the controller with the digital twin. Then, by integrating the control systems into the digital twin and running it on a real-time automation platform, engineers can start validating the hardware against the twin as the components become available from the suppliers.

Using Virtualization to Reduce Costs

Early insight into machine behavior helps reduce the substantial costs incurred during commissioning. For instance, model-based actuator sizing can prevent undersized motors from failing in service and avoid the expense of purchasing oversized components. Quick tests can confirm proper performance for any changes in the machine’s duty cycle. Catching a single late-stage design flaw can justify the cost of virtual commissioning.
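To make the actuator-sizing point concrete, here is a hedged sketch of one common approach: compute the peak and RMS torque a hypothetical move profile demands, then flag candidate motors that are undersized or heavily oversized. The motor ratings, inertia, and profile are invented for illustration, not drawn from the article.

```python
# Sketch of model-based actuator sizing: compute peak and RMS torque for a
# motion profile, then compare against candidate motor ratings. All numbers
# are hypothetical placeholders.
import math

def required_torques(inertia, friction, profile):
    """profile: list of (acceleration_rad_s2, speed_rad_s, duration_s) segments."""
    peak, sum_sq, total_t = 0.0, 0.0, 0.0
    for accel, speed, duration in profile:
        torque = inertia * accel + friction * speed
        peak = max(peak, abs(torque))
        sum_sq += torque ** 2 * duration
        total_t += duration
    return peak, math.sqrt(sum_sq / total_t)          # peak torque, RMS torque

profile = [(200.0, 10.0, 0.2), (0.0, 20.0, 1.0), (-200.0, 10.0, 0.2)]
peak, rms = required_torques(inertia=0.05, friction=0.02, profile=profile)

candidates = [("motor_A", 0.5, 1.0), ("motor_B", 12.0, 30.0)]  # (name, continuous, peak rating)
for name, continuous_rating, peak_rating in candidates:
    if continuous_rating < rms or peak_rating < peak:
        verdict = "undersized"
    elif continuous_rating > 3 * rms:
        verdict = "oversized"
    else:
        verdict = "suitable"
    print(f"{name}: duty cycle needs peak {peak:.1f} N*m / RMS {rms:.1f} N*m -> {verdict}")
```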

The full implementation of twin-based design techniques won’t happen overnight, but it is inevitable. As early adopters begin integrating the digital twin into their design and development processes, we will see growing model-driven capabilities being incorporated into their products. We expect companies to gear up for tighter integration with global data pools and machine learning technologies over the next 10 years.

Implementing virtual commissioning in the design process is a powerful entry point for this journey, bringing almost immediate benefits in reduced integration costs and product risk, while providing a solid foundation for future digital twin development.

>> This article by Paul Goossens was re-posted from Design News, January 2, 2018

Siemens Digitizes Industrial Machines to Speed Development

Siemens PLM has created the Advanced Machine Engineering (AME) solution to provide a platform that connects mechanical, electrical, and software engineering data, giving engineers access to a completely digital machine-build prototype. This digital twin represents an industrial machine’s operation and can be tested virtually throughout the development process. The goal of the engineering platform is to increase collaboration and reduce development time, while also reducing risk and allowing for the reuse of existing designs.

The AME uses modularized product development to establish common parts and processes among a family of products while defining functional modules that can be easily modified to meet specific requirements and support changes. In other words, you can build the manufacturing process like a collection of Legos (chunks of software), then customize the configuration and test it before you begin banging equipment into place.

Mechatronic design provides a common platform for concurrent product development. Image courtesy of Siemens PLM

By involving mechanical engineering, electrical engineering, and software development processes simultaneously, you shift away from the more time-consuming serial development process. You create a concurrent method that effectively turns the process into mechatronics.

Siemens developed the AME in order to speed the time it takes to set up plant equipment while also making machine configurations easier to customize. “We created this for companies that are making automation control equipment, packaging machines, printing machines, anything that has a lot of mechanical systems and components, as well as sensors, and drives,” Rahul Garg, senior global director of industrial machinery and heavy equipment at Siemens PLM, told Design News. “Typically, these are the companies making products and machines that go into a plant.”

Creating the Modular Plant

One of the goals in developing AME was to make plant equipment modular, so the overall configuration of plant processes could be done more quickly and with greater flexibility. The digitized modular plant concept was also designed to reduce risk and engineering time. The process can be designed and tested digitally. “Many of these companies need to serve their end customers with increasing customization,” said Garg. “We wanted to create the ability to modularize the machine structure to deal with customization and quickly respond to engineering or systems changes.”

Leverage a digital twin to virtually test complex machine requirements. Image courtesy of Siemens PLM

The modular approach to managing plant equipment also supports change, especially since much of the engineering to support the change is worked out on a digital level using existing modules that are already validated. “This improves the way the machine builders manage the end-customer requirements. Those requirements change. How do you manage that change? Get the engineering communicated to the shop floor and to those who service the products,” said Garg. “We are trying to improve the way they manage the engineering process and schedules to better control and reduce risk while working on large projects.”

Mechatronics on the Machine Level

The idea is to build new functionality into the equipment, driven by automation and analytics, and to make delivering that functionality an easy and rapid process. “You have to deliver the innovation in a fast process and reuse it,” said Garg. “The idea is to create a digital twin of the machine where you can simulate the entire behavior of the machine using control software and control applications. You drive the systems with the software.”

The AME contributes to the concept of the digital twin, which digitizes a product from design, through configuration, and into operation at the customer’s plant. “What we are trying to do is create manufacturing functions through the visualization process,” said Garg. “Then we want to take digitization further, by closing the loop with the physical product. Once the plant equipment is out in the field and the customers start using the equipment and machines, we want the ability to see and monitor the performance of the equipment and see how it’s performing.”

>> This article by Rob Spiegel was reposted from DesignNews.com (November 23, 2017)

Test Drive Watson IoT in Factory Simulation

The singular defining moment in Artificial Intelligence (AI)—even bigger than when IBM’s Deep Blue beat chess master Garry Kasparov—happened back in 2011, when the unbeatable Jeopardy! champion Ken Jennings was finally taken down by IBM’s new supercomputer, Watson.

“I felt obsolete,” the rust-haired computer programmer said after IBM’s supercomputer Watson usurped his position. “I felt like a Detroit auto worker of the ’80s seeing a robot that could do his job on the assembly line.”

Jennings, who won 74 matches of the trivia game show in a row, explained in a TED Talk that he thought there was no way the AI could pick up on the various clues’ nuances and double meanings.

Watson, it turns out, was not only a fast thinker (16 TB of RAM) with a great memory (1 TB), it also understood context. It cleaned the stage floor with Jennings, beating him by $53,147.

With that victory under its belt, Watson has moved on from the game show circuit to use its powers and contextual brilliance to take on something even bigger: manufacturing.

Inviting Watson to connect with all your Industry 4.0 machines and sensors has been proven to reduce equipment downtime and extend equipment life, improve process and product quality, and optimize product development. According to IBM, one manufacturer was even able to double its output without doubling assets.

Game Theory

To show how this is possible, IBM gamified Watson in the style of one of those classic “Choose Your Own Adventure” books. It’s called Industry 4.0 Model Factory and you can play it here. It works on PCs and smart devices.

As the resident gamer here in the office, I gladly volunteered to test it out and see what it’s all about.

The simulation puts me in charge of a shoe plant where I’m tasked with producing 2,000 blue and 3,000 tan units.

Everything is humming along smoothly, until out of nowhere, the evil villain of unpredictable weather comes knocking. It uses blizzard attack on my tan materials supplier, and it’s super effective. Watson Weather Alert estimates a 35% risk of disruption.

In gaming terms, this is basically like Donkey Kong dropping a barrel on your head.

My factory’s throughput drops from 99% to 81% and my on-time deliverables fall from near perfect to 72%.

The prompt asks me to choose between acting at 50% or 75%. In most of my life, I’m a procrastinator, but there seem to be a lot of simulated workers and their families in the Matrix counting on me, so I act immediately.

Hooray, tan units are ahead of schedule! My orders are 91% on-time and my risk is low. Crisis averted.

Immediately, this makes me wonder what would have happened if Watson was on the job in 2010 when Iceland’s Eyjafjallajökull volcano erupted. The plume of ash spread to cover much of the North Atlantic, disrupting flights across the globe. The International Air Transport Association estimated the airline industry lost $1.7 billion, so you can guess what impact that had on the world’s supply chain. Kenya alone had to destroy 400 metric tons of flowers earmarked for Europe.

Which means, at just one disaster in, I’m already seeing the point of this system. In the real world, harnessing this smart game’s brain could have a dramatic impact across the whole supply chain.

Watson can determine how inclement weather on the other side of the globe can affect your bottom line.

For the next quest in this simulation, Watson has noticed irregular vibration and temperature readings in my tan shoe assembly machine. Now I have to decide whether to fix this one machine and avoid any tan delays, or retool all lines for blue.

I figure, if one machine has a problem, there might be more problems afoot. So I switch the whole line to blue, and the simulation says I prevented a disruption and boosted blue output. The game doesn’t give me any points or extra lives, but we can only hope my virtual shareholders take notice and give me a hefty pretend raise.

IBM’s AI solution doesn’t just deal with performance metrics in complex blockchains. It’s there right with your worker, diagnosing and predicting equipment failures on the floor.

Finally, when it’s time to make repairs, Watson is my Sherlock, ready to deduce what’s going on simply by talking to it. In the simulation, the HMI is a tablet. After using that amazing processing power first developed to crush Ken Jennings, it informs me that “Part A12 may fail after 370 hours of operation if humidity tops 50%.”

Way to almost ruin it for everybody, A12.

Once Watson discovers an issue on one machine, it can apply a fix to all like equipment within the network.

Well, now Watson knows the real culprit and applies that knowledge to the rest of the factory. For at least the next month, no outages are predicted and everything is back to being on time and on budget.

It’s not as satisfying as rescuing a princess or saving the planet from alien invasion, but this game does have a satisfying ending.

Potential Impact

As a bonus game, IBM offers a quick four-question survey to assess where you stand on the road to Industry 4.0. The biggest takeaway here is that according to the survey, one out of every three manufacturers is collecting machine equipment data (good), but isn’t using it to its full advantage. That’s like running your conveyors at half-speed for no apparent reason. That’s at least better than the 26% still collecting data manually, but not by much.

Once everyone starts looking at data as potential tools, and not trivial points, that’s when we can say we’re all on the path to winning. And at least for now, it seems Watson isn’t trying to take our jobs in the process, but ensuring that we still have them.

Play Industry 4.0 Model Factory for yourself here.

>> This article by John Hicks was re-posted from IndustryWeek.com, November 13, 2017

Reexamining the Role of SCADA Systems in Digital Manufacturing Operations

The data revolution is firmly underway within today’s manufacturing industry. Those companies that capture their “big data” and leverage that analyzed data as a framework for making faster, better decisions will lead the industry in productivity and time to market.

Let’s put into perspective the scale of this data revolution. Consider the following: In just one manufacturing site, an estimated 3 billion sensor-related events occur during a 24-hour period. Each of those sensor-related transactions represents a piece of data, and most of that data can and should be used to improve operational efficiency.

As greater numbers of smart field sensors and actuators are deployed across manufacturing sites, these formerly “dumb” devices become “connected” and begin to add to the data stream. Like tributaries flowing into a giant pool of data, those data elements are converted into useful information, which aids the decision-making process and, ultimately, results in improved production.

But all of this does not happen automatically, and some sophisticated tools are required. The good news for manufacturers is that most operators are already familiar with the core tools that provide this data capture and analysis service. The industry calls them supervisory control and data acquisition (SCADA) systems. However, in this new era of full-fledged digitization, traditional SCADA systems have improved and their business value has taken on new meaning.

At Siemens, the developers have recognized that SCADA is now a critical solution for connecting a plant’s distributed assets in order to generate actionable intelligence. To support this role, Siemens is evolving its SCADA applications for deployment on a smaller and more condensed scale. For instance, in traditional applications system nodes might have been physically located miles apart. In the new connected and data-dense environments, these nodes may now be inches apart. Siemens’ newest SCADA platform, WinCC, for instance, has the ability to tie together data from both closely coupled and widely deployed assets. The result is a more flexible, reliable and transparent environment, with more intelligent automation and the ability to collect and analyze big data in real time for actionable information and better business decisions.

What is the digital factory big picture?

The way products are made is the same within both a traditional factory and a digital factory. Holes are drilled, parts molded, bottles filled—but the difference is information. In the case of the digital factory, the “smart” devices work together, and the control system interconnects the disparate processes, all with one objective in mind: to build competitiveness. Efficiency, time to market, and manufacturing flexibility all improve because of an underlying system that is optimized to process data.

The digitized approach of data gathering, data centralization and data analysis helps to integrate the five basic stages of the product lifecycle:  design, production planning, engineering, production, and services. While the product is being designed, all the subsequent stages are already planned so that the overall process operates more efficiently. For example, manufacturing offers feedback on product design from the earliest stages to ensure smooth production. Using simulated modeling through every phase of manufacturing, it is easy to identify critical elements and potential risks and address issues as early as possible for maximum efficiency.

The SCADA contribution to the data flow

Smart devices throughout the plant become part of the SCADA network facilitating the data flow.  Listed below are five areas where SCADA systems like Siemens WinCC add value to a functioning digitalized plant:

1.    Data management—Each of a plant’s many field devices generates its own data. To make this incoming data useful, data formats need to be consistent. That’s where the data management component comes in. The output of a good data management system is the rationalization of data so that it is both comparable and storable. A system such as WinCC presents data in real time and also archives it for subsequent analysis. The system can then identify trends or engage in troubleshooting. If a problem occurred in one section of the packaging line at 3:00 pm last Tuesday, what information was being generated by devices up- and downstream of that problem area during that time period? The WinCC system can provide such information in a quick and straightforward manner (see the archive-query sketch after this list).

2.    Information management—Data needs to be translated into production information so it can help optimize manufacturing. WinCC’s Information Server tool can create dashboards that provide real-time displays and visibility to plant operations. Managers can access the dashboards either locally, or remotely. Automated reports are generated that monitor critical process elements across any desired time interval.

3.    Energy management—Energy management has emerged as both a regulatory and a cost-control issue. Adhering to standards such as ISO 50001 helps to conserve resources, tackle climate change, and lower electricity, gas, and water costs. To reduce energy consumption, the first step is being able to measure how much energy is being consumed. WinCC can act as a mechanism for capturing energy consumption data from devices such as transformers, meters, circuit breakers, and motor drives—all places where power consumption can be measured. Then, understanding these energy use patterns, operations teams can avoid utility peak charges by reducing consumption during the times of day when rates are highest (see the tariff sketch after this list).

4.    Diagnostics management— Tools within the WinCC environment allow users to view system and device diagnostic information. The easy access to this information speeds up the process of troubleshooting and repair. Everyday issues such as identifying shorts, wire breakage, missing voltage load, limit violations and other system defects can be quickly identified and addressed, avoiding long delays in both locating the problem and identifying the solution. WinCC provides alarms for immediate notification when problems emerge, and displays clear-text information pertinent to all devices, including sensors, PLCs, HMIs and servers. If a programming error exists within a PLC, the system identifies which line of code caused a trip.

A Totally Integrated Automation Portal (TIA Portal) provides a consistent look and feel as users navigate across plant functionality areas including process and component diagnostics. Simulation tools within the TIA portal allow for more proactive approaches to both diagnostics and energy management functions. In essence, the TIA portal acts as a gateway to the automation within the digital factory.

5.    Open communication—Digitalization is driving a merger of automation systems with the IT world, so more systems, even those traditionally considered incompatible, are interconnected. A system such as WinCC serves as a data bridge between the core Operations Technology (OT) and Information Technology (IT).  To access even more operational data through the value chain, WinCC leverages MindSphere, Siemens cloud-based, open IoT operating system, which enables powerful industry applications and digital services to drive business success.
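Picking up the forward reference in item 1, here is a generic sketch of the archive-query pattern described there. It does not use the WinCC API; it simply holds a few archived tag readings in a pandas DataFrame (tag names, values, and timestamps are invented) and pulls everything recorded within ten minutes of a known fault time so up- and downstream devices can be compared.

```python
# Generic archive-query sketch (not the WinCC API): find every tag recorded
# in a window around a known fault time. All data here is invented.
import pandas as pd

archive = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2017-10-24 14:58", "2017-10-24 15:00",
        "2017-10-24 15:01", "2017-10-24 15:30"]),
    "tag":   ["filler_speed", "capper_torque", "labeler_temp", "filler_speed"],
    "value": [120.0, 8.7, 64.2, 0.0],
})

fault_time = pd.Timestamp("2017-10-24 15:00")        # "3:00 pm last Tuesday"
window = pd.Timedelta(minutes=10)
nearby = archive[(archive.timestamp >= fault_time - window)
                 & (archive.timestamp <= fault_time + window)]
print(nearby.sort_values("timestamp"))
```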
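And picking up the forward reference in item 3, a toy illustration of the peak-charge idea: given hourly energy readings and a two-tier tariff (both invented), total the day's cost and report how much of the load falls inside the expensive window, which is the quantity an operations team would try to shrink.

```python
# Toy peak-charge calculation with an invented two-tier tariff and load profile.
PEAK_HOURS = range(14, 19)             # hypothetical 2 pm to 7 pm peak window
PEAK_RATE, OFF_PEAK_RATE = 0.28, 0.11  # $/kWh, hypothetical

hourly_kwh = {hour: (90 if hour in PEAK_HOURS else 60) for hour in range(24)}

cost = sum(kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
           for hour, kwh in hourly_kwh.items())
peak_share = (sum(kwh for hour, kwh in hourly_kwh.items() if hour in PEAK_HOURS)
              / sum(hourly_kwh.values()))
print(f"daily energy cost ${cost:.2f}; {peak_share:.0%} of load in peak hours")
```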

First steps, and a path to digitalization payback

Digitalization is a competitive manufacturing advantage that can be adopted over time. As a manufacturer works to modernize a plant, adding a SCADA system such as WinCC is a critical first step in establishing interoperability. The advantages of interoperability aid in facilitating a more competitive manufacturing environment. Some of the benefits include:

  • Plant and IT systems begin to communicate – A more direct exchange of information can occur as plant-level functions connect with MES, ERP and other management platforms.
  • Management can make decisions more quickly – More up-to-date and detailed information allows management to drive more optimized plant processes.
  • Energy savings – Energy use can be measured and reduced as consumption data becomes more transparent. Implementation of ISO 50001 standards becomes simpler.
  • Improved production uptime – Effective use of diagnostic information allows for more streamlined maintenance, and resources spend time where it’s needed most.
  • Synergistic improvement – Initial successes encourage wider deployment of smart devices at all levels, increasing the flow of information to support better decision-making.

As the digitalization process evolves, SCADA systems such as WinCC expand easily to accommodate and support new integration and communication phases. As digitalization intensifies, the system maintains its role as the primary facilitator of networking and information flow for a more connected and competitive plant.

>> This article by Alan Cone, Siemens, was re-posted from Automation.com, November 3, 2017

Simulation-as-a-Service On-Ramp

Simulation is transitioning from a highly specialized operation performed by experts at the end of the design process to a more ubiquitous activity that can help optimize products throughout the design cycle. That means companies are doing more simulation and asking for results much faster.

Cloud-based solutions that enable simulation-as-a-service—either through complete outsourcing of simulation or opportunistically accessing simulation tools on an as-needed basis—have made this easier.

“Whereas a few years ago clients were really not averse to bringing simulation tools in house, development schedules and markets are moving at such a fast pace that clients are willing to pay a premium to leverage outside expertise and resources,” says Scott Herndon, manager of simulation client development for CFD (computational fluid dynamics) and FEA (finite element analysis) at IMAGINiT Technologies. IMAGINiT offers cloud-based Autodesk products, and has also developed its own mini-cloud resources on its own supercomputers.

Streamlines passing over the rear wing of a dirt track vehicle. (Images courtesy of TotalSim.)

“The costs of simulation were too big of a barrier for smaller companies,” says Ray Leto, president of TotalSim. “The idea of simulation-as-a-service is to take all of the expertise and resources [and] put it in a black box with an easy-to-use interface so that customers can walk through it and use it as they need it.”

“With smaller companies that have not used HPC in the past, by having access to the cloud they are able to run bigger jobs once they outgrow what they can do on a workstation,” says Gabriel Broner, vice president and general manager of HPC at Rescale.

Although the availability of cloud-based simulation-as-a-service solutions is expanding, in some cases these operations require the transfer of large amounts of data or the use of simulation tools that require very low latency. This can prove to be challenging, particularly for small- to mid-size businesses (SMBs) that may not have access to direct, high-speed connections to cloud providers or other network infrastructure.

“The reliability of the network is critical because the last thing you want is to put a lot of work into putting a solver into the cloud and have it fail because of a bandwidth or data connection issue,” Herndon says. “We rarely have issues with that, though, and it happens much less now than it did in years past.”

The availability of new high-speed networks can enable new types of services. Simulation services company TotalSim, for example, has been able to leverage access to a local gigabit fiber network at its home base in Dublin, OH, as well as the resources of the statewide OARnet (Ohio Academic Resources Network) 100G regional network and the Ohio Supercomputer Center.

“I can’t overstate how important it is for us within the city of Dublin to have access to these networking infrastructures,” says Leto.

Leto’s company was involved in a project to make simulation applications available to customers using those high-speed network resources. Working with researchers at Ohio State University and other technology partners, TotalSim was able to offer simulation capabilities to customers with very little latency. “We’re not networking people, so we’re not always thinking about the challenges on that side of the problem,” Leto says.

The push for municipalities to expand their gigabit network infrastructure, and then offer access to those networks like they offer access to water, sewer and other utilities, will help make these cloud-based applications easier to use and access.

That is because the network plays an important role in making these applications work for users that need real-time responsiveness, says Prasad Calyam, assistant professor of computer science at the University of Missouri College of Engineering, who worked with TotalSim on the project while he was acting as research director at the Ohio Supercomputer Center/OARnet at Ohio State University. “Integrating the various desktops and HPC systems, storage and other collaborators, and providing connectivity across them can provide a huge improvement in workflows,” Calyam says. “It helps improve the gains in time, cost, effort and convenience. Networking really drives this transformation to the cloud.”

Companies are also finding other ways to tap into cloud-based simulation resources. Workload data can be placed directly in the cloud so that the data is not moving back and forth between the user and the cloud infrastructure.

At Rescale, customers can connect via the public internet, while Rescale manages connections to cloud services providers. “We also have the ability to offer dedicated high-speed links,” Broner says. “For customers that already have some systems on premise, we can work collaboratively with them to determine what jobs are good candidates to move to the cloud.”

High-Speed Access in Ohio

In Dublin, TotalSim teamed with the city, Ohio State University, the OARnet network and other entities to create an app-based approach to providing access to simulation and compute resources to local businesses.

The team at OSU worked with TotalSim, which uses HPC resources to test virtual prototypes for clients in the aerospace, automotive and manufacturing markets. The problem TotalSim wanted to solve was to find a way to provide access to data-intensive services without overwhelming the public network and bogging down simulation activities for its clients.

The project was funded in part through an award from US Ignite, a nonprofit focused on helping to create services and applications that leverage advanced networking technology. Launched in 2012 by the National Science Foundation and White House Office of Science and Technology Policy, the initiative has helped launch and support Smart Gigabit Communities (funded by the NSF) that offer access to high-bandwidth networks (among other things). Those cities include Cleveland, Austin, Chattanooga, Kansas City and others.

The project was named “Best Application for Advanced Manufacturing” at the US Ignite Next Generation Application Summit in Chicago. Researchers from the Ohio Supercomputer Center (OSC), OARnet, Ohio State University, the City of Dublin, Metro Data Center (MDC) and the University of Missouri-Columbia (MU), in partnership with TotalSim, VMware and HP, were awarded $25,000 to develop “Simulation-as-a-Service for Advanced Manufacturing.”

The application allows users to remotely access the software and compute resources through a virtual desktop-as-a-service system for manufacturing. Users access results of simulations via a thin-client connection to a virtual desktop, whereas the heavy lifting of the large data sets is handled on the DubLink and OARnet fiber networks.

Dublin rolled out its own gigabit network called DubLink, which serves as the backbone of the project. It connects to Metro Data Center, a regional supercomputing facility in the city and runs parallel to OARnet, Ohio’s statewide 100 gigabit network.

Calyam says the project built on existing networking resources that had not, to that point, been operationalized with any applications. Funds from winning a Mozilla competition helped launch the prototyping phase, and Calyam was able to obtain a donation of cloud data center GENI racks and last-mile fiber connections.

“They created a modeling and simulation service that people can access as if they had their own solution because of the speed and low latency of the fiber network,” says Glenn Ricart, founder and CTO of US Ignite. “They can rotate models in real time.”

By reducing latency, the application allows better real-time collaboration between TotalSim and its customers, which can speed up iterations.

Having local infrastructure was critical, as distance between the companies using the app and the compute resources can affect latency. “Even if you have a gigabit network all the way from here to Oregon, you can’t necessarily provide the responsiveness you’d get having the application hosted in Dublin and distributed over local Dublin fiber,” Ricart says.

Network Alternatives

Although high-speed fiber networks are expanding, and the number of Smart Gigabit Communities is growing, not every small or mid-sized company has an on-ramp to this infrastructure. Fortunately, there are other ways to successfully access cloud-based or hosted simulation solutions.

At Rescale, customers submit simulation jobs and are able to match them to the most effective compute architectures for that particular project. “Having more bandwidth is always going [to] help things for SMBs, but even without a gigabit connection, there are a lot of workloads that can run in the cloud really well,” says Ryan Kaneshiro, chief architect at Rescale. “The sweet spot is probably CFD jobs that have small input file sizes and generate a lot of output data. You can do post-processing remotely through a remote visualization node or batch load processing script to whittle down the amount of data that needs to be transferred back to the local workstation.”

For other jobs with larger data transfer needs, more bandwidth is crucial. “That said, it isn’t the only option,” Kaneshiro says. “If you are talking about static data sets, a lot of that information is already sitting in the cloud. If it’s a one-time transfer, then there’s also the option of shipping hard drives to Amazon or Microsoft or whatever provider you are using.”

More direct connections are also available. “We see direct connections filtering down to SMBs as cloud usage starts to grow,” Kaneshiro says. For example, services like Megaport can help companies establish those direct connections to cloud services.

Example interface of 3D viewer for F3 vehicle using TotalSim’s results web application hosted at Ohio Supercomputer Center.

US Ignite’s Ricart says that high-speed networks will always have some latency limitations because of the speed of light and the way the fiber networks are designed (which is typically not in a straight line and requires several hops). “Every time you take a signal and put it through a router, that creates a delay,” Ricart says. “Both the speed of light and the number of times you have to make a connection run the clock when it comes to latency.”
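A rough back-of-the-envelope version of that arithmetic, with assumed distances, hop counts, and a per-hop router delay, looks like this; light travels at roughly two-thirds of c in fiber, so propagation alone sets a floor that no amount of bandwidth removes.

```python
# Back-of-the-envelope latency estimate: fiber propagation delay plus a
# per-hop router delay. Distances, hop counts, and the per-hop figure are
# rough assumptions for illustration.
def round_trip_ms(distance_km, hops, per_hop_ms=0.5):
    fiber_speed_km_per_ms = 200.0                     # ~2/3 the speed of light in glass
    one_way = distance_km / fiber_speed_km_per_ms + hops * per_hop_ms
    return 2 * one_way

print(f"Dublin, OH metro loop (~30 km, 2 hops):        {round_trip_ms(30, 2):5.1f} ms")
print(f"Ohio to Oregon (~3500 km fiber path, 12 hops): {round_trip_ms(3500, 12):5.1f} ms")
```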

TotalSim has successfully worked with clients across the country by using its app design to work around the latency issue. “What we’ve found is that the way we’ve designed the UI (user interface) side of the web application, together with the latest software stack, has allowed rendering and visualization and transfer of data to be faster than it used to be,” Leto says. “A person in California that is accessing the apps is still looking at a web page coming out of the Ohio Supercomputer Center. We’re just sending images and charts and graphs to be rendered in a browser, which is very lightweight and fast.”

Where the company does see a challenge is with interactive 3D post-processing or manipulation. “But every year the tools are getting better, and people are figuring out how to make the rendering work faster and the remote visualization capabilities through the browser are getting better,” Leto adds.

Taking advantage of solutions that offer different access options is important for making simulation-as-a-service work for smaller companies. Leto says that “bare metal” HPC installations provide the best bang for the buck when it comes to CFD simulations. Public cloud services like Amazon or Google can provide greater scalability.

It’s also important that the various high-speed networks typically available in larger cities be coordinated and integrated as they expand, and that’s been a big part of US Ignite’s efforts. “We are rapidly seeing the ‘gigafication’ of the American internet,” Ricart says.

With a local hub that connects those networks, communities can reduce latency; otherwise, traffic may flow hundreds of miles away before it can be relayed and exchanged between network providers. Cities can establish digital “town squares” where a variety of applications and services are available for use. That presents an opportunity for companies with heavy simulation needs to access compute resources they would otherwise be unable to afford, while helping cities attract and retain high-tech businesses.

“In the case of Dublin, the city is already invested in data centers and has resources they can use as incentives for companies,” Calyam says. “If TotalSim is using this, then other companies will look at it and be able to do a cloud-based transformation. Those companies are much less likely to relocate to another city because they have these resources.”

>> Read more by Brian Albright, Digital Engineering, November 1, 2017

DIGITALIZATION: The New Critical Success Factor

The terms Industry 4.0, Big Data, the Internet of Things, and the Digital Factory are being pitched around like a rugby ball, and almost always with a decided lack of clear definition. Let’s set the record straight.

History

MindSphere, the Siemens Cloud for Industry, is an open operating system for IoT that links physical products and production facilities with digital data. (Source: TechBriefs.com)

After German Chancellor Angela Merkel, in conjunction with her ministers of industry and education, ordered a study about the manufacturing environment, the German Academy of Science & Engineering drafted the vision of Industrie 4.0. It was planned as a coordinated initiative among the IT world, universities, and various manufacturing associations designed to reshape industry. It would seek to combine the physical, virtual, IT, and cyber systems, thereby creating a new working environment between the worker and machine. The 4.0 part of the name, incidentally, derives from the fourth industrial revolution — the predecessors being the emergence of mechanization through steam/water power, the impact of electricity on mass production, and the invention of the computer, which led to our modern concepts of IT and automation.

Industry 4.0 (English spelling) has been adopted worldwide as a functional goal in industry — especially the manufacturing world. Industry 4.0 represents a high point of dynamic achievement, where every company — whether a large OEM, major tier supplier, or smaller job shop — can implement and benefit from the technologies and communications platforms available today.

Without question, Industry 4.0 is less a vision of the future and more a vibrant collaboration among IT, machine builders, industrial automation integrators, and especially motion control suppliers that function at the heart of the machines, simultaneously effecting motion, then gathering and transmitting the relevant data to the appropriate control link in the company’s infrastructure, all at speeds measured in nanoseconds.

To work effectively, this concept requires a standardization of platforms in both communications and languages used.

Integration in Practice

While the Big Data idea overwhelms most managers, technicians, and operators alike, the key is the manipulation of that data in a hierarchy of need, to borrow a term from the psychology world. The mobile device, tablet, cellphone, and now the human machine interface (HMI) screen itself, can all be useful tools in transmitting the most important data from the shop floor to the top floor, or just down the hall to the front office. We say that for a reason, as the small shop owner would be well advised to heed this trend and respond appropriately. That action might take the form of using an integrator to tie all the machine functions and outputs together for that day when his OEM or upper tier customer demands it. In many industrial sectors, that day has already arrived.

The mobile device, tablet, cellphone, and human machine interface (HMI) screen can all be useful tools in transmitting the most important data from the factory floor. (Source: TechBriefs.com)

Also, the cybersecurity issue cannot be overstated, as we will soon see a shift from the open to the closed cloud for data storage in a factory or shop network. The protection of intellectual property remains paramount, on a global scale, today. To overlook that reality is to compromise the stability and security of your company.

“Remaining competitive” takes on many meanings, depending on your location in the world, but here are some thoughts on how manufacturers can do it better today. By the time you finish reading this article, another entrepreneur will have figured out a way to make it happen for his or her company.

Time-to-market reduction is as critical today as ever. Shorter innovation cycles — the result of new product lifecycle management software and services available to companies both big and small — mean the savvy product companies can take their concept and make it fly in just a fraction of the time spent in the past. And by past, we mean compared to about ten years ago.

With the recent, rapid expansion of application-specific integrated circuit (ASIC) capability, much more functionality can be built into a product today, and this means the manufacturing community must be even more flexible and responsive — not merely reactive — than ever before.

With the Big Data impact that has resulted from the above scenario, both machine and component manufacturers are challenged in many ways, not the least of which is the daunting task of deciphering the important or exceptional from the nominal. A quality ERP or MES system can tell you what you need to know, but the keys are the determining factors that make up the inputs to these systems and how their priorities are set.

From the perspective of the motion control and communication platform world — which focuses on the control, generation, or application of movement on everything from a machine tool to a packaging line, from an automotive assembly line to a turnkey book printing facility — a great variety of needs is seen among OEMs as well as end-users in these various segments. All of them require flexibility and often highly customized solutions to their manufacturing or processing challenges. Plus, maintaining high productivity on aging equipment is a constant concern for every company. Do you need to retrofit an existing machine or invest in a new one? Are enhanced robotics and transfer mechanisms or more personnel required on the line? Should you focus on better asset management or an entirely new business model when thinking about factories or processing facilities? Today, as the digital factory emerges in all industries and at companies of all sizes, we find ourselves providing answers to these questions, based not only on product, but also software, communication, bus protocol, and other areas of manufacturing expertise.

Utilizing Data to Remain Competitive


It’s now a popular saying that “data drives utilization.” Using data smartly, however, requires an educated workforce that can take product design and turn it into viable and profitable production for the employer, regardless of the machine, widget, chemistry, or package being produced. In a world dictated by product lifecycle management needs, the correlation among design, production planning, output, and delivery — plus the monitoring of usage and returns in the field — has never been more important, but also never more manageable, given the new tools available from both product and service providers in the market today.

With IT as the link, today’s digital factory will tie the shop floor to the top floor. A word about security: The involvement of suppliers, especially as it pertains to the cybersecurity of Big Data, is a critical factor today. While technology is key, so is the old-fashioned but highly underrated notion of trust. Companies are most productive when they can trust their suppliers, especially those who promote a “defense in depth” approach to cyber-security.

That value can often come in unseen ways, such as the access provided to your workforce for prompt and effective answers to questions. Perhaps it’s a 24-hour hotline, perhaps it’s an onboard technical manual in the machine controller with troubleshooting capability on-screen, or perhaps it’s a supplier-provided training webinar that will expand the way your operators and maintenance personnel use their machines. Taking full advantage of these services will improve the productivity of your factory floor. You hear about total cost of ownership (TCO), and this is one of those subtle but very real factors that drives that calculation.

Another key area in remaining competitive is the cost of energy. The more a machine can do with less energy, the more efficient and profitable it becomes. That’s the obvious part. How to get there can take many forms. For example, the simple notion of regenerative energy — a concept in play in the electrical world since Sprague’s regenerative braking motor in 1886 — can be monitored and manipulated by today’s drives, putting power back onto the grid or using it to drive other equipment. By simply implementing “smart” motors, drives, and other equipment, manufacturers of all types can improve their productivity and the bottom line — a win-win, to be sure.

Lastly, safety must be paramount, not only as it protects the workforce, but also as it contributes to overall efficiency and the profit picture. Fewer accidents result when there is a reduction in the mean time to repair, and equipment is replaced before it malfunctions and hurts someone. This requires implementing both preventive and predictive maintenance protocols.

Examples from Industry Today

Digitalization, and its proper implementation, is now emerging as a critical success factor for industry, changing the way products evolve through cloud technology, knowledge automation, and big data analytics. (Source: TechBriefs.com)

Confidence in digital manufacturing is higher than ever among leading companies these days, and for good reason. Industry leaders are beginning to realize benefits from their investments in digital technologies and next-generation robotics. One car maker offers a prime example of how the benefits of digitalization can accrue. In their case, everything from design to execution planning is implemented digitally. They once required 30 months to manufacture their luxury sports sedan, from start to finish. Thanks to digitalization, production time was reduced to 16 months, and the company succeeded in achieving a threefold manufacturing productivity increase.

Another successful application of digitalization can be found at another car plant equipped with more than 1,000 robots, all of which help to weld vehicle bodies with accuracy within a tenth of a millimeter. Robots also control the first fully automated hang-on assembly line, which attaches the doors, hoods, and hatches to the vehicles — a process that previously was entirely manual. The plant also has an automated engine marriage process and a new integrated paint process that uses 30% less energy and produces 40% fewer emissions.

Digitalization, and its proper implementation, is now emerging as a critical success factor for industry. It means gathering more data and analyzing that data in a virtual context so that better decisions and, in many cases, predictive decisions can be made. It’s changing the way products are developed, built, and delivered through machine learning, additive manufacturing, and advanced robotics. And it’s changing the way products evolve through cloud technology, knowledge automation, and Big Data analytics.

Digital technologies present a billion-dollar opportunity for manufacturers to transform their production and reorient their value proposition to meet the needs of today’s digital consumers. The competitiveness of the manufacturer increases because digitalization introduces even higher speed into the product development lifecycle, thus enabling faster response to consumer demand.

Simulation is one digitalization tool that drives shorter innovation cycles, even when highly complex products and large volumes of manufacturing data are involved. In a simulation environment, a virtual model of each component in a device or machine is generated, which allows designers and builders to explore what-if scenarios easily and quickly. These virtual models have come to be known as “digital twins.” They analyze the gathered data and then use it to run simulations and benchmark performance, allowing plant operators to pinpoint where gains can be made. By pairing both virtual and physical worlds (the twins), analysis of data and monitoring of systems can actively avert problems before they occur, which prevents downtime, develops new efficiency opportunities, and enables planning for the future. Existing assets can be modeled against their digital twins and new designs can be tested in the virtual world, saving time, money, and resources. Testing the interaction on a screen can verify a modification to a car engine, for instance, before new holes need to be drilled. Such scenarios are occurring at every supply chain step in the auto, aero, medical, off-highway, appliance, and other industries.
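One simple way to picture the virtual-physical pairing is residual monitoring: run the same operating point through the twin and compare its prediction with the measurement, flagging drift before it becomes a failure. The pump curve, readings, and tolerance below are invented for the sketch.

```python
# Hedged sketch of twin-versus-physical residual monitoring. The "twin" is a
# hypothetical linear pump curve; measurements and tolerance are invented.
def twin_prediction(flow_lpm):
    return 6.0 - 0.02 * flow_lpm                     # predicted discharge pressure [bar]

measurements = [(50, 5.02), (100, 4.01), (150, 2.65), (200, 1.40)]  # (flow L/min, measured bar)
TOLERANCE = 0.25

for flow, measured in measurements:
    residual = measured - twin_prediction(flow)
    status = "investigate" if abs(residual) > TOLERANCE else "ok"
    print(f"flow {flow:3d} L/min: twin {twin_prediction(flow):.2f} bar, "
          f"measured {measured:.2f} bar, residual {residual:+.2f} -> {status}")
```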

Conclusion

A connected digital factory, and the Big Data it generates, provides manufacturers with the insight and agility required to compete. Digitalization gives manufacturers the capability to increase productivity across their entire value chain — from design and engineering to production, sales, and service — with integrated feedback throughout the process. In practical terms, this means faster time-to-market, greater flexibility, and enhanced availability of systems on the plant floor.

The integration of digitalization into operations is also a flexible process. Digitalization can be adopted at any pace that fits the needs of the organization. Some manufacturers start with retrofits or may begin by digitalizing one assembly line or even one machine at a time. By whatever means a company chooses to begin its path to digitalization, the critical challenge is to start now.

>> This article from Tech Briefs, September 1, 2017,  was prepared with contributions from Arun Jain, Vice President, Siemens Industry, Motion Control Business, and Alisa Coffey, MarCom Manager of Aerospace, Automotive and OEMs, Siemens Industry, Atlanta, GA; and Bernd Heuchemer, Vice President of Marketing, Siemens, Munich, Germany. For more information, go to Siemens.com

 

 

From Advanced Robotics to Rapid Prototyping: 10 of the Varied Faces of Smart Manufacturing

Rob Spiegel, from Design News, highlights some of the varied technologies that are expanding under the umbrella of advanced manufacturing.

Smart Manufacturing by Any Name

What’s in a name? The nomenclature for manufacturing technology comes in many colors: advanced manufacturing, smart factories, the digital plant, Industry 4.0. The term Industry 4.0 comes from a German government program announced in 2011 that is designed to encourage manufacturers to digitize manufacturing in order to improve Germany’s global competitiveness. In 2015, China launched “Made in China 2025,” an effort to advance its manufacturing technology, again for competitiveness.

In the US, smart manufacturing is viewed as a collection of emerging technologies based on digital communication and high-powered computer processing. Software is the key.

The Big Promise of Big Data

Big Data is the term given to data processing that takes advantage of high processing speeds and inexpensive memory. With increased processing speeds, simulation and analysis that previously took days has been reduced to hours, even minutes. Analytic challenges that were not possible just a few years ago – or not affordable – have become inexpensive and quick.

As applied to manufacturing, Big Data makes it possible for control engineers to move from standard preventive maintenance – much like automobile maintenance that’s done on a mileage schedule – to predictive maintenance, where the maintenance schedule is based on real-time data collection and analysis. The savings come from doing maintenance when the equipment actually needs work rather than adhering to a predetermined schedule. Big Data monitoring also detects and alerts to potential machine malfunctions before the machine breaks down.
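A minimal sketch of that preventive-versus-predictive distinction, assuming a made-up set of vibration readings, an alarm limit, and a look-ahead window: instead of servicing on a fixed calendar, fit a trend to recent readings and schedule work only when the trend is projected to cross the limit.

```python
# Predictive-maintenance sketch: project a vibration trend forward and decide
# whether to schedule work. Readings, alarm limit, and horizon are illustrative.
ALARM_LIMIT = 7.0        # mm/s vibration velocity, hypothetical
HORIZON_H = 72           # look-ahead window in hours

readings = [(0, 3.1), (24, 3.4), (48, 4.0), (72, 4.8), (96, 5.8)]  # (hour, mm/s)

# least-squares slope of vibration versus time
n = len(readings)
mean_t = sum(t for t, _ in readings) / n
mean_v = sum(v for _, v in readings) / n
slope = (sum((t - mean_t) * (v - mean_v) for t, v in readings)
         / sum((t - mean_t) ** 2 for t, _ in readings))

latest_t, latest_v = readings[-1]
projected = latest_v + slope * HORIZON_H
if projected >= ALARM_LIMIT:
    print(f"schedule maintenance: projected {projected:.1f} mm/s within {HORIZON_H} h")
else:
    print(f"no action: projected {projected:.1f} mm/s stays below {ALARM_LIMIT} mm/s")
```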

Maintenance is the low-hanging fruit of Big Data. Other functions include determining the manufacturability of products during the design process, as well as altering the product to accommodate the specific manufacturing equipment. Big Data is also used to analyze and adjust production in order to speed the process, reduce inefficiencies, improve throughput, reduce scrap and defects, and drive down energy consumption.

The Futuristic World of Advanced Robotics

Developments in robotics have been stunning in recent years, including applications in surgery, health care support, agriculture, military robots that swarm, and robots that detect and deactivate bombs. Recently ABB programmed a robot to conduct a symphony orchestra. Yet manufacturing was the birthplace of modern robotics applications, particularly in automotive. Robots on the car line took over welding and painting decades ago.

Robot applications in manufacturing have matured substantially since the turn of the century. Aided by processing intelligence and sometimes internet connectivity, manufacturing robots have moved into specialized roles such as food processing – where the robots are built to withstand wash-down – and inspection where the robotic eye can detect cracks that can’t be seen by human vision. In packaging, robots have taken over most functions.

Grippers have evolved as well, becoming their own specialized mini-industry. You can find grippers that lift heavy weights or use suction to move a windshield. You can also swap those for robotic hands that can lift a delicate orchid or move an egg.

The Friendly World of Collaborative Robots

Cobots and humans work together on the assembly line. (Image source: rethinkrobotics.com)

The collaborative robot – sometimes called a “cobot” – is the steel within the velvet glove. The robot can help you lift, but it won’t hurt you if you bump into it. These robots are designed to work side-by-side with humans on the assembly line. These friendly robots were invented in 1996 by J. Edward Colgate and Michael Peshkin, professors at Northwestern University. A 1997 US patent filing describes them as “an apparatus and method for direct physical interaction between a person and a general-purpose manipulator controlled by a computer.”

While assembly is the collaborative robot’s forte, they’ve been deployed for lifting, moving heavy objects, and recently, to conduct a symphony orchestra, which ABB’s YuMi did over the summer. These robots are produced by traditional robot companies such as ABB and companies such as Rethink Robotics that were launched specifically to build and market collaborative robots.

The Virtual Factory: The Plant on a Computer Screen

(Image source: www.sew-eurodrive.de)

In the past, factory lines were set up by moving equipment around until it was arranged in an optimal manner to support manufacturing. Some plant managers used forethought before moving heavy equipment, but basically, it was a bit of trial and error to get things right. The virtual factory occurs in a computer model that simulates the optimal configuration – including everything from wiring to stampers, robots, and electrical boxes. The layout is completed before any physical equipment is touched. The idea is to use simulation to gain the just-right configuration.

As well as creating a simulated configuration of an individual production line, the virtual plant is also portable. A manufacturer can take an optimized plant configuration and use it for plants across the globe. A well-configured plant simulation can also be used as a teaching tool. Ford and Siemens created a virtual model of a highly optimized plant that showed detail down to individual work stations. Users across the globe are able to use the model to teach workers how to efficiently run the work station.

IoT Connectivity: The World of Connected Devices

IoT connectivity on the factory floor went from zero to sixty at lightning speed. One very simple advantage of internet-connected devices – such as sensors on plant equipment – is the ability to move beyond wiring. Hard-to-get-to sensors were suddenly easy to connect. Another major benefit of plant IoT is the ability to gather plant data into one dashboard that can be read from anywhere via a browser.

While the overall benefits of IoT are still nascent in wearables, healthcare, and buildings, the IoT delivered nearly instant return on investment to manufacturing. IDC estimated that the 2016 global spend on IoT by manufacturers was $102 billion.

Use of IoT by manufacturers includes monitoring of plant assets for maintenance (catching ill machines before they fail), monitoring all plant operations for efficiency and optimization, and collecting field-service data. Some automakers are monitoring products after they reach the consumer, gathering data to determine consumer use and to track product problems.

Machine Learning: Machines Teaching Machines

(Image source: autodesk.com)

Machine learning in manufacturing is a form of artificial intelligence (AI). Computers use algorithms to monitor and optimize plant operations. Forms of AI can be deployed to measure plant efficiency and make adjustments to optimize operations. This can include everything from energy usage to production speeds and maintenance schedules.

Machine learning came into play as tremendous amounts of data became available from plant operations. Plant managers have started to use that information to analyze all aspects of plant operations, including how machines can best work together.
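
As a rough illustration of the idea, the sketch below flags an unusual machine reading against a baseline of historical plant data using a simple z-score check. The temperatures are invented, and a production system would use far richer models than this.

```python
# Minimal sketch: flagging unusual machine behaviour from historical plant data.
# A simple z-score check stands in for more sophisticated machine-learning models;
# the readings below are made up for illustration.
from statistics import mean, stdev

def anomaly_scores(baseline, new_readings):
    """Score new readings against a baseline of known-good history (z-scores)."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [abs(x - mu) / sigma for x in new_readings]

baseline = [61.2, 60.8, 61.5, 62.0, 61.1, 60.9, 61.3, 61.6]   # known-good spindle temps (C)
latest = [61.4, 79.4]                                          # newest readings
flags = [score > 3.0 for score in anomaly_scores(baseline, latest)]
print(flags)  # -> [False, True]; the spike could indicate a failing bearing
```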

Data from the plant can also be deployed to the product design team so product engineers can alter designs to optimize the manufacturing process. Computers can figure out the best match between design and plant equipment to get the optimal result. This includes letting the computer help design the product and allowing the software to plot the manufacturing process.

M2M Connectivity: The Fully Connected Plant

M2M (machine-to-machine) connectivity can be deployed by wire, through a wireless local network, or via the internet. The link from machine to machine enables a sensor or meter to communicate the data it records – such as temperature or inventory level – to application software that can use it to make appropriate adjustments.
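
As a rough sketch of that pattern, the example below uses MQTT – a common M2M messaging protocol – with the paho-mqtt client library: a sensor publishes a temperature reading and an application subscribes and reacts. The broker address, topic name, and setpoint are hypothetical, and the code assumes the paho-mqtt 1.x-style client API.

```python
# Minimal M2M sketch over MQTT: a sensor publishes a reading, an application reacts.
# Broker, topic, and the 35 C setpoint are hypothetical; 1.x-style paho API assumed.
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "broker.plant.example"          # hypothetical plant broker
TOPIC = "plant/line1/oven/temperature"   # hypothetical topic

def on_message(client, userdata, message):
    reading = json.loads(message.payload)
    if reading["celsius"] > 35.0:
        # In a real plant this would command a controller, not just print.
        print(f"Temperature high ({reading['celsius']} C) - easing line speed")

# Application side: subscribe and react in a background network loop.
app = mqtt.Client()
app.on_message = on_message
app.connect(BROKER)
app.subscribe(TOPIC)
app.loop_start()

# Sensor side (normally a separate device): publish a retained reading.
sensor = mqtt.Client()
sensor.connect(BROKER)
sensor.publish(TOPIC, json.dumps({"celsius": 36.2}), retain=True)
sensor.disconnect()

time.sleep(2)   # give the broker time to deliver
app.loop_stop()
```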

In manufacturing, the M2M vision is a fully connected plant, from the product design team, to suppliers, to plant operations, to maintenance, to finance, to the customer. Any of these teams can look into the plant’s operations to see production status, inventory levels, finished products, and shipping status.

M2M began life in the hard-wired world, where the connections from plant to enterprise traveled along cables. The IoT made the process less onerous and less expensive while also allowing a greater number of stakeholders to view plant data via browser. The downside of IoT connectivity is greater security vulnerability.

Additive Manufacturing: 3D Printing on the Production Floor

Additive manufacturing – also known as 3D printing – is having an effect on plant operations. For decades, the technology was primarily used to create prototypes. The 3D-printing process was simply too slow for production. Plus, the products were typically weaker than those created by traditional manufacturing processes such as machining, casting and plastic molding.

That has changed somewhat as 3D-printing technology and materials have advanced. The 3D-printing process is getting faster, and materials are now available that render products as strong as those made with traditional production methods. These advances have allowed for some low-volume manufacturing applications that use additive manufacturing as part of production. One advantage of additive manufacturing is the ability to create customized products.

The Digital Twin: Virtual Products in Virtual Production

Physical and digital representations of a GE turbine. (Image source: geglobalresearch.com)

Large manufacturing technology companies such as GE and Siemens have been talking about the Digital Twin for a few years. The concept is that each physical product has a collection of data to accompany it. Given the low cost of memory and processing, the kind of data that can be collected into the digital twin is vast. The digital twin can include design data that stretches from early sketches to fully formed iterations. Collected alongside the design data are simulation results on materials as well as finished-product test simulations.

The digital twin includes the bill of materials (from suppliers to costs), the manufacturing configuration, and production data. Rounding out the content are sales records, field information on how the product is used by the consumer, and maintenance and defect records on the product through its end-of-life.
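
A minimal sketch of what such a record might look like as a data structure appears below. The field names and categories simply mirror the description above and are illustrative, not any vendor’s schema.

```python
# Minimal sketch of the kind of record a product's digital twin might aggregate,
# following the categories described above. Field names are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BomItem:
    part_number: str
    supplier: str
    unit_cost: float

@dataclass
class ProductDigitalTwin:
    serial_number: str
    design_iterations: List[str] = field(default_factory=list)       # sketches to final CAD revisions
    simulation_results: Dict[str, str] = field(default_factory=dict) # material and product-test runs
    bill_of_materials: List[BomItem] = field(default_factory=list)
    manufacturing_config: Dict[str, str] = field(default_factory=dict)
    production_data: List[dict] = field(default_factory=list)
    sales_record: dict = field(default_factory=dict)
    field_usage: List[dict] = field(default_factory=list)             # consumer-use telemetry
    maintenance_log: List[dict] = field(default_factory=list)         # defects through end-of-life

twin = ProductDigitalTwin(serial_number="TRB-00042")
twin.bill_of_materials.append(BomItem("BLADE-7", "Acme Castings", 412.50))
twin.maintenance_log.append({"date": "2017-05-01", "event": "blade inspection passed"})
```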

Cyber/Physical Systems: The Blend of Real and Digital

Cyber-Physical Systems (CPS) include the integration of data computation, networking, and physical processes. Embedded computers and networks monitor and control the physical processes, with feedback loops where physical processes affect computations and vice versa.

In manufacturing, the CPS brings together asset management, configurability, and productivity analysis. CPS in manufacturing provides data to keep equipment users aware of networked asset status as well as alerts for possible risks that can include asset stress indications or security breaches. These systems combine engineering and computation to provide analysis of the manufacturing process, paths toward optimization and efficiency, and data on equipment health.
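
The sketch below illustrates that feedback loop in miniature: a simulated physical process (a heater) is read by the “cyber” side, which computes a correction that in turn changes the physical state. The thermal model, controller gain, and alert threshold are invented for illustration.

```python
# Minimal sketch of a cyber-physical feedback loop: an embedded controller reads a
# simulated physical process, computes a correction, and the correction in turn
# changes the physical state. The simple thermal model and gains are illustrative.
def simulate_cps(setpoint=80.0, steps=30):
    temperature = 20.0          # physical state (degrees C)
    heater_power = 0.0          # actuator command (0..1)
    for _ in range(steps):
        # Physical process: heats with applied power, loses heat to surroundings.
        temperature += 5.0 * heater_power - 0.05 * (temperature - 20.0)
        # Cyber side: proportional controller closes the loop.
        error = setpoint - temperature
        heater_power = max(0.0, min(1.0, 0.1 * error))
        if temperature > setpoint + 10:
            print("alert: asset stress - temperature overshoot")
    return temperature

print(round(simulate_cps(), 1))
```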

>> Learn more by Rob Spiegel, Design News, September 18, 2017

Simulation Visualization: Seeing is Understanding

>> Article by Randall Newton re-posted from Digital Engineering, August 1, 2017

Computer-aided simulation and analysis for engineering (CAE) has become widely used throughout industry and is no longer restricted to a few key analysts. Along with this greater usage comes a greater need to share results with a wider audience. The current trend of increased digitalization—such as internet of things (IoT) and Industry 4.0—also places new demands on the use of simulation and analysis. More than ever before, visualization is key to extending the usefulness of CAE data.

Data visualizations provide context, enabling engineering teams to find flaws or help explain complicated issues. As digital twin technology becomes commonplace, the simulation results will become as important as the geometric model for the ongoing relationship between the digital model and the physical instantiation. Simulation tools create visual results, but often these graphics are densely technical and require refinement to make their information accessible to a wider audience.

The growing complexity of simulation results and the use of simulation data further up and down the design cycle are two reasons why ANSYS recently acquired CEI (Computational Engineering International, Inc.), known for its EnSight simulation visualization software.

Tecplot software organizes sets of simulations and can provide data plots in three dimensions as well as visualizations that can be explored. (Image courtesy of Tecplot.)
Tecplot Chorus allows multiple simulations to become part of a larger study. (Image courtesy of Tecplot.)

“Simulation is going from what was the verification stage to more and more engineers using it upfront in the design cycle, then also further down the cycle for additive manufacturing and digital twins,” says Mark Hindsbo, ANSYS vice president and general manager. “All that data is great, but if you can’t use it effectively, what good is it?”

Understanding the Uncertainties

There are two forms of uncertainty being studied by engineering analysis, says Scott Imlay, chief technical officer of Tecplot. The first is random uncertainty, more commonly explored in scientific inquiry but also an influence in engineering. The second is deterministic, the cause-and-effect processes inherent in CAE. “If you visualize the data, you must understand the uncertainties,” says Imlay. “It is a risk-discovery process. If a simulation doesn’t tell you about risk, you have to find ways to figure it out.”

When multiple simulations are run, each with the same product specs but with changes in flow rates, speed and so on, the results need to be linked. The relationship of each CAE visualization result needs to be seen in the context of others in the set. Sometimes blank spots in the data set become obvious only when organized.

Ceetron products process large CAE sets on the server then send the results to the client using WebGL technology. (Image courtesy of Ceetron.)

Imlay says Tecplot Chorus is most commonly used to optimize designs, develop comprehensive long-term databases that include CAE results, predict performance over time and investigate specific engineering problems by being able to review multiple simulations as a single set.

“In all these scenarios, engineers need to discover the trends and anomalies in output variables,” says Imlay, “and to understand the underlying physics that cause these variations.”

Not a Typical Big Data Problem

Big Data is a buzzword today in enterprise IT. Most of the time, the big databases being mined by new algorithms are unstructured in nature. It becomes the job of the inquiring business intelligence software to make sense of it before it offers insights to the user. CAE data is already highly structured; the algorithms used for business investigations aren’t suitable for pass-through to engineering, thus the need for solutions specific to engineering.

Engineering data is not only different in nature from business data, but it is also inherently much larger in scale. “If you compute large amounts of data, FedEx is still the best way to transfer it,” notes Fredrik Viken, technical director at Ceetron AS, a software developer specializing in post-processing simulation data for visualization, in-depth analysis and interpretation/presentation.

“The largest engineering teams are generating thousands of simulations daily,” he says. This adds up to terabytes of data.

Most of Ceetron’s customers are vendors of CAE software, but they also sell some products directly to end users. Around 2011 Ceetron made the decision to rebuild its product line using cloud technology. A specific customer may install on a private server, but the inherent advantages of cloud technology—flexibility, infrastructure costs, location-independent access, security and reliability—are available whether using a private cloud, a service like Amazon Web Services or Microsoft Azure, or a hybrid of the two.

Ceetron applications leave the actual storage and management of CAE data to the product lifecycle management system; its products focus on the creation and use of the visualizations. Because they follow a cloud paradigm, Ceetron applications do their processing on the server rather than the client, and results are rendered and presented to the client computer using WebGL. Thus the results can be viewed on any device that runs a web browser. By separating the computation of the data from the visualization, a Ceetron product like GLview Inova allows model slicing, rotation and so on at the speed of the local device processor, without depending on a round trip to the server.
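
The sketch below illustrates that client/server split in generic terms: heavy reduction of a large simulation field happens on the server, and a lightweight payload is returned for a browser-based (for example, WebGL) viewer to render locally. It uses Flask and NumPy purely for illustration and is not Ceetron’s actual API.

```python
# Illustrative sketch of the server-side/client-side split: the server reduces a
# large simulation field to a lightweight payload that a WebGL client could render
# locally. This is a generic pattern, not Ceetron's actual product interface.
import numpy as np
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a large CAE result: a stress field on a 1-million-node mesh.
stress_field = np.random.default_rng(0).normal(200.0, 25.0, size=1_000_000)

@app.route("/slices/<int:step>")
def slice_payload(step: int):
    # Heavy lifting stays on the server: keep every Nth node so the payload is small.
    decimated = stress_field[::step or 1]
    return jsonify({
        "count": int(decimated.size),
        "min": float(decimated.min()),
        "max": float(decimated.max()),
        "values": decimated[:1000].round(2).tolist(),  # first chunk for the viewer
    })

if __name__ == "__main__":
    app.run(port=8050)  # a browser-side WebGL viewer would fetch /slices/100
```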

“We are now able to simulate almost in real time the data from IoT sources, such as stresses and strains, and see the results in the web browser,” Viken says, adding that this linking of simulation to real-time performance data “will make simulation part of the lifecycle and not just design. Visualization will continue to be more important to see and understand the results and observations.” Ceetron GLview Inova works directly with most major CAE systems. It visualizes dynamic/transient results and creates time/frequency domain and mode shape animations. Once a database has been loaded, the user can rotate, zoom and translate interactively. Both image plots and 2D plots can be stored in various common formats.

More Info:

ANSYS

Ceetron

Computational Engineering International (CEI)

Tecplot

An insight into the future of making things

Seen through a British lens, design, manufacturing, and construction have changed with every decade. More than ever, designers must continually innovate to keep up with new demands, using technology to respond to the shifting needs of clients.

The cloud has rapidly advanced computational design and given rise to completely new means of designing products. Design simulation, generative design and reality capture are all being employed to produce better, stronger and more predictable designs.

In manufacturing, additive now complements subtractive techniques, modern materials and composites have become commonplace, and a new breed of self-aware, intelligent robots are tearing up the production playbook.

As sensors become embedded in everything, the Internet of Things (IoT) has begun to trigger a massive influx of data; data which is being harnessed to drive innovation, create and connect ecosystems, and provide substantial new business opportunities.

Construction is leveraging a whole new generation of concrete – from ultra-durable and high-performing, to self-compacting and 3D printed. Bio-inspired plastics are being used to create office blocks and virtual technologies like augmented reality (AR) are being used to monitor and optimise operations.

Autodesk’s Chris Bradshaw noted that the landscape of these three industries is being radically transformed into one in which traditional boundaries are disappearing and industries are converging. Construction companies are acting like manufacturers, and manufacturers are increasingly becoming service providers.

“When you combine, not separate, industries, innovation abounds. We need only look at the global success of Apple’s iPhone, which combines art and science, to understand that convergence can bring about magic; that is the Future of Making Things,” Bradshaw proclaimed.

How is additive driving real innovation?

Manufacturers are focused on three key objectives: innovation, quality and speed to market. This is true whether they are producing a consumer product or a piece of industrial machinery. Collaborative design coupled with advanced production techniques not only helps achieve these goals, it also provides a unique opportunity to break away from the pack.

The first pilot component, a ship’s propeller, has been made using this hybrid production process, combining wire and arc additive manufacturing (Image courtesy of Autodesk.)

The Port of Rotterdam is the largest in Europe and currently handles more than 460 million tonnes of cargo every year. As such, efficiency and cost-effective time management are imperative. However, when vessels come into the port and require maintenance, the downtime can potentially cost millions and repairs can take weeks.

Compounding the issue is the added expense of having to manage a considerable stockpile of spare parts in either a centralised or disparate warehouse network. To overcome these issues, the Port opened its innovative additive manufacturing field lab, ‘RAMLAB’.

The on-site facility includes a pair of six-axis robotic arms capable of additively manufacturing large scale metal industrial parts, saving time and money without sacrificing precision or performance. RAMLAB has developed, with the support of Autodesk, a hybrid manufacturing approach, which combines both additive and subtractive manufacturing.

This approach enables the facility to pursue faster fabrication: 3D printing large ship components in metal and then finishing the pieces using traditional CNC milling and grinding methods within a matter of days.

The first pilot component, a ship’s propeller, has been made using this hybrid production process, combining wire and arc additive manufacturing. The next step will be a final to-scale version, manufactured and fitted to a ship.

But that is just the beginning. RAMLAB aims to inspire other industries through this point-of-demand printed approach, including oil & gas, transportation and space exploration.

Computational design

This new landscape may provide an abundance of new growth prospects, yet an equal number of competitive threats is emerging. The digital disruption which has already shaped the winners and losers of the media and entertainment industries has now truly arrived in design, manufacturing, and construction.

Innovative vehicle chassis – the ‘Hackrod’ – created using Autodesk’s Dreamcatcher generative design software.

Today, success lies in constantly anticipating the future and adopting the technology to realise it. Despite the prevalence of advanced technologies in almost every aspect of our lives, there still exists a level of fear, uncertainty and doubt about what they represent. Martha Tsigkari, of British architects Foster + Partners, offered some reassurance.

“Machines, whether hardware or software, robots or algorithms, are not there to replace human creativity, but rather to complement our human intuition and enable us to push the boundaries of construction, engineering and design.”

Computational design (the most advanced evolution of which is ‘generative design’) has become intrinsic to the way industry professionals go about their everyday work. It aids model creation, drawing sheet production and productivity gains, but fundamentally it helps carve out more time for creativity – the ultimate goal for any design department or business.

Tsigkari went so far as to say that without computational design, we wouldn’t be able to handle the design complexities routinely demanded by clients, nor the frequent revisions as a project progresses, in a timely fashion – if at all.

Data mining

It was mentioned by several speakers that the future doesn’t stand still; the moment one future is reached, a new one is created. If today’s future will be tomorrow’s reality, what comes next? According to Autodesk’s Chris Bradshaw, the next wave of innovation is going to be driven by data as the technology platform.

Fuel has always acted as an accelerant to how we make our ideas real. In the past, that fuel has been coal, steam and oil. Now, data has become the commodity being fought over, with the cloud serving as its pipeline. It’s not so much a case of the next ‘industrial’ revolution, but rather the first ‘information’ or ‘digital’ revolution, with a corresponding shift in focus towards amplifying brain power, not muscle power.


However, as we’re often told, data by itself – whether generated internally or externally by an organisation – is relatively worthless. The real value of data can only be leveraged once it becomes information and knowledge.

Today, the availability of greater volumes and sources of data is for the first time enabling capabilities in generative design and artificial intelligence that have remained dormant for decades due to an inability to analyse and utilise all this data. Thanks to the vast, scalable power of the cloud, those constraints no longer exist; the cloud has swiftly changed not only what gets designed, but how.

Previously, high-compute tasks such as computational design, simulation, design optimisation and additive manufacturing were expensive, technically challenging and perhaps accessible only to the few. Cloud has fundamentally rebalanced that equation by democratising access to advanced technology, helping to create brand new ecosystems in the process.

Self-learning robotics

The physical capability of the world’s industrial robots has barely changed in decades, and arguably neither have their environments. Robots were typically large, bulky and blind, performing repetitive tasks either in isolation or behind heavy-duty security fences. They were expensive and largely incapable of getting any smarter at their existing tasks or of learning new ones.

Manufacturing assets and processes were rigidly locked down in an effort to protect the sizeable investments placed in them. This equation was balanced through volume, i.e. mass production and economies of scale. Now, that equation is being flipped on its head.

In the same way that design no longer has to remain trapped in the boxes on top of or underneath our desks, production assets and processes no longer have to subscribe to being conditional, conformist, calculated and directive.

The new breed of collaborative robots (‘cobots’) are agile, flexible machines capable of forging a close relationship with humans.

The new breed of collaborative robots (‘cobots’) are agile, flexible machines capable of forging a close relationship with humans. Freed from their cages and taught via virtual reality techniques and advanced algorithms, robotic systems have become insightful, intuitive, capable of reason and flexible.

It’s no surprise to see that these are all ‘creative’ characteristics, with the combined abilities offering a tool far more powerful than the sum of its parts.

As Autodesk’s Mike Haley commented: “By giving our tools computer vision, we give them the ability to see, so no longer are they blind to our presence. By applying machine learning, we give them the ability to learn, so no longer are they blind to our processes. Rather than requiring the conformity of an assembly line, robots can now learn to accommodate the variability of the disorderly human world.”

The age of intelligent machines

Recently, Autodesk introduced its robot – ‘Bishop’ – to the very disorderly world of Lego. The machine was built to have curiosity and designed to let intelligence evolve. Thanks to these cognitive capabilities, Bishop came to understand that he has eyes to see and a hand to grasp, which in turn allowed him to evolve a basic ability to react and respond to his environment.

The next stage in Bishop’s development came via the creation of an artificial world for him to learn in. Engineers replaced the optical image he sees with a rendered one, generated from physical simulations of Lego bricks that correspond to the real world.

Feeding these rendered simulations into a neural network effectively decoupled Bishop’s learning environment from his physical environment. In this way, time was similarly decoupled meaning the robot could now learn in computer time. Millions of grabbing and assembling simulations were run overnight, immeasurably accelerating Bishop’s learning.

Impressively, the Lego bricks can be replaced with anything, allowing Bishop to learn to grasp and assemble any object. Additionally, Bishop is intelligent enough to accommodate any parts which come out of order, rather than having to halt and wait for normal service to resume (typically via human intervention).

Even more exciting, once Bishop has learnt a new skill, the power of the cloud means that all robots have learnt it.

>> Article by Jonny Williamson, found in The Manufacturer, June 28, 2017

Digital Twin Technology Reduces Need for Device Developers

IoT platforms that support digital twin technology will be crucial for application developers to deliver IoT projects, and development teams at companies undertaking IoT initiatives should make them a centerpiece of their IoT strategies, according to a newly updated report from Forrester.

Digital twin graphic of General Electric locomotives.
(Source: General Electric)

Digital twin technology removes two roadblocks for developers working on IoT projects: speed and complexity. The technology insulates back-end developers from the complexity of edge computing development, which requires knowledge of embedded programming and specialized communication protocols. And, much as in mobile application development, IoT platforms that support digital twins provide the “core plumbing” to connect, secure and manage devices and quickly stand up connected products. Using an IoT platform that supports digital twin technology reduces the need for highly specialized device developers.

“The digital twin creates a nice abstraction mechanism for these developers,” Forrester Vice President and Principal Analyst Jeffrey Hammond, who himself has two decades of development experience, said in an interview. “It allows each developer to do the work they want to do without having to worry about the entire system.”

Connected-product development is difficult because it requires the convergence of two different development spheres. Developers working on front-end systems—devices at the network’s edge—have expertise in embedded hardware and specialized communication protocols, and have traditionally been focused on operational efficiency. Back-end developers, on the other hand, focus on integration and analysis of data to render it in a consumable format that provides insight to the business.

The digital twin is an instantiation of a real, physical object in an abstracted, digital form that acts as a proxy for all communication to an actual device, according to the report, “The Digital Twin Accelerates IoT Development.” IoT platforms with frameworks that include digital twin technology provide several benefits. These include detection of intermittent device connectivity, the ability to manage over-the-air device updates and reboots, built-in support for IoT protocols that automatically generate APIs and support for emerging de facto standards, acting as an anchor for thing-specific algorithms for asset optimization and integration, and the ability to abstract automation topologies to speed the response of analysis.
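
The sketch below illustrates the kind of abstraction the report describes: back-end code reads and writes a twin object, while the platform reconciles desired and reported state with the physical device whenever it is reachable. The class and its fields are a generic pattern, not any particular vendor’s API.

```python
# Minimal sketch of the abstraction a digital twin gives back-end developers: the
# application reads and writes the twin, and the platform reconciles with the real
# device whenever it is reachable. A generic pattern, not a specific vendor's API.
import time

class DeviceTwin:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.reported = {}        # last state reported by the physical device
        self.desired = {}         # state the application wants the device to reach
        self.connected = False
        self.last_seen = None

    def report(self, **state):
        """Called by the platform when the device phones home."""
        self.reported.update(state)
        self.connected = True
        self.last_seen = time.time()

    def set_desired(self, **state):
        """Back-end code sets targets without talking to the device directly."""
        self.desired.update(state)

    def pending_changes(self):
        """What the platform still needs to push over the air."""
        return {k: v for k, v in self.desired.items() if self.reported.get(k) != v}

valve = DeviceTwin("valve-12")
valve.report(position=40, firmware="1.2.0")
valve.set_desired(position=75)      # works even if the device is currently offline
print(valve.pending_changes())      # -> {'position': 75}
```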

Researchers highlight capabilities in seven platforms that support variations of a digital twin concept: AWS IoT, C3 IoT, IBM Watson IoT, GE Predix, Microsoft’s Azure Service Fabric, PTC’s ThingWorx, and SAP IoT Application Enablement.

Hammond sees adoption of these platforms falling into two categories. For industrial IoT, customers are looking to their “incumbents,” like SAP, GE, and Microsoft. For connected devices, companies like C3 IoT and Amazon are leading the way.

As such, interoperability among these platforms will be extremely important, as the asset-intensive industries won’t buy everything from one vendor, Hammond said.

Even with the technology to empower it, bringing the two groups of developers together will be a challenge. As such, Hammond is seeing digital specialists and design agencies that spearheaded mobile application development efforts taking similar roles in IoT development.

“In organizations that do it well, most of the time it’s [led by] the digital organization, who might be reporting into the marketing organization or operations,” he said. “And as business cases get more complex, the digital group and operations organization will need to work together.”

>> Read more by Courtney Bjorlin, Internet of Things Institute, Jun 21, 2017