In a marketplace plagued by customer frustration, one product lifecycle management (PLM) developer takes an unusual approach — starting with open software.
Aras, developer of the Innovator PLM solution, recently released results from its PLM Benchmark Survey for Enterprise Organizations. The survey, which ran from 2015 to 2017 and was conducted by Gatepoint Research, invited participation from select executives from “a wide variety of industries,” both within and outside the manufacturing arena.
The results weren’t pretty: Aras found that more than two-thirds of the 300 respondents are unsatisfied with the PLM software they have in place. Most of the surveyed companies “have more of what we would describe as a PDM [product data management] implementation, around MCAD,” explained Marc Lind, senior vice-president of strategy at Aras. Only one-quarter of those surveyed reported being able to make changes quickly and easily — “they’re really being handcuffed by the PDM system,” he said.
Aras realized that the complexity of the modern design and development environment demands greater customizability and upgradeability, Lind explained. Aras PLM is gaining traction, he said, because “people are recognizing that just using mechanical CAD is not going to get them to the smart, connected [place] their products need to be for tomorrow’s world.” With the rise of smart devices and vehicles, “everything is moving in the direction of increasingly sophisticated mechanical design” — complicated by added electronics, sensors, and software. According to Lind, Aras brings all that design information together in a cross-disciplinary way, providing a unified view and bill of materials (BOM) for an entire aircraft, for example.
Tackling PLM Pain Points
Aras is out to make PLM a less painful proposition, starting by eliminating something that no one enjoys: paying for software.
Changing the cost equation. “Anybody can download and use [Aras Innovator] forever with no cost,” said Lind. Aras has brought a “Red Hat business model” to the PLM marketplace, Lind explained, referencing a company known for providing Linux platforms and other open-source software products.
An optional subscription plan, which includes security updates, user training, and a help hotline, is available for a fee (which varies based on the number of users). Lind pointed out that “if people stop subscribing, they still can use the software,” but most don’t stop; he reported a subscription renewal rate of more than 97%.
Accommodating a custom fit. Customers on subscription also receive software release upgrades, regardless of how much they have customized the system. That’s a benefit that “nobody else includes,” said Lind, because of the way his competitors’ solutions are structured.
According to Lind, “all other PLM systems” — including Windchill from PTC, Teamcenter from Siemens PLM Software, and ENOVIA from Dassault Systèmes — are built from hard-coded data modules with hard-coded business logic. Making a change to the system, such as expanding part numbers from seven digits to eleven, requires breaking the data model. And by doing so, “you have effectively orphaned yourself from future updates,” Lind explained.
Aras takes a different approach, and hard-codes services instead. Services for check-in/check-out, revision and version, etc., “are hard-coded, but they have no comprehension of a data model,” said Lind. Updates to those services don’t impact users’ customizations, so Aras is able to guarantee upgradeability.
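The distinction Lind draws — services with hard-coded behavior but no compiled-in knowledge of the data model — can be illustrated with a minimal sketch. This is hypothetical code, not Aras's implementation: the model lives in editable configuration, so a change such as widening part numbers is a configuration edit rather than a code change that breaks upgrades.

```python
# Hypothetical sketch (not Aras code): a generic versioning service that
# reads the data model from configuration at runtime. The service's logic
# is fixed, but it has no comprehension of any specific data model.

MODEL = {  # editable configuration, not compiled-in structure
    "Part": {"part_number": {"type": "string", "max_len": 7}},
}

class VersioningService:
    """Hard-coded behavior, zero knowledge of a particular data model."""
    def __init__(self, model):
        self.model = model
        self.store = {}

    def check_in(self, item_type, item_id, fields):
        schema = self.model[item_type]
        for name, value in fields.items():
            rule = schema[name]
            if rule["type"] == "string" and len(value) > rule["max_len"]:
                raise ValueError(f"{name} exceeds {rule['max_len']} chars")
        versions = self.store.setdefault((item_type, item_id), [])
        versions.append(dict(fields))
        return len(versions)  # new revision number

svc = VersioningService(MODEL)
rev = svc.check_in("Part", "P-1", {"part_number": "1234567"})

# Widening part numbers is a one-line model edit, not a code change:
MODEL["Part"]["part_number"]["max_len"] = 11
rev2 = svc.check_in("Part", "P-1", {"part_number": "12345678901"})
print(rev, rev2)  # 1 2
```

Because the service only interprets the model at runtime, upgrading the service code leaves customizations like the widened part number untouched — which is the upgradeability guarantee Lind describes.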
Overcoming institutional inertia. “These are long-life systems,” Lind observed; in many cases, they have been in place more than 10 years, so adopting a new solution is challenging. And particularly in larger organizations, there are many different processes that are fully or partially automated. “Companies are customizing their systems, even if it is just MCAD management, and they’ve had them in place a long time,” he noted. “[There’s] a lot of organizational status quo … it’s not an easy proposition to say, ‘We need a new one, let’s go for it.’”
To reduce these obstacles to a PLM changeover, Aras offers customers the option to layer the new solution over the old. “With Aras, it’s not an all-or-nothing proposition,” said Lind. He shared the example of car maker GM, which uses Teamcenter with NX — both Siemens PLM Software solutions — but needed enterprise change management, so the company layered Aras PLM over Teamcenter.
It’s less disruptive, Lind explained, to leave a legacy MCAD system in place for a time, rather than “rip and replace.” Digital transformation and the current pace of disruption are such that “you have bigger fish to fry [than] simply ripping out your old CAD management system … you really need to get control of the BOM and variance of your product.”
PLM: Perceived Like Mandate?
The level of dissatisfaction revealed in Aras’s survey is especially concerning since PLM is increasingly seen as a non-optional tool. As the survey report states, “Product lifecycle management (PLM) is a core requirement for modern product development, particularly as organizations face increasing product complexity and shorter product lifecycles.”
Lind pointed to technological advances that are making PLM ever more necessary: For the industrial Internet of Things (IoT), PLM is a critical enabler to get data in and out of the system more easily. “The digital twin is so important right now,” Lind said. “People are realizing that the virtual model from the design is not a digital twin”; rather, a digital twin fully replicates one specific real-world item, not just in its mechanical, electrical, software, and firmware aspects but in its maintenance schedule, wear, environmental conditions, etc. — all of which translates to tremendous masses of data streaming in. “If machine learning and AI [artificial intelligence] can’t get to the data, it’s either going to not work well or not work at all,” he said.
Lind is seeing “real movement across the board,” as organizations of all sizes struggle to embrace these changes. “It’s not just consumer companies — even the largest companies are realizing that they’ve got to do something … [we’re seeing] an uncharacteristically high level of motivation to change and take this seriously.” So while companies in aerospace, automotive, industrial equipment, and medical devices may be at the front of the pack, product developers in every industry would do well to pay close attention to the changes afoot — and to examine their strategy for managing product data.
>> This article by Cyrena Respini-Irwin appeared in Cadalyst, January 27, 2018
It wasn’t that long ago that the widespread use of digital twins — simply put, digital replicas of physical counterparts — seemed like an out-of-reach goal. Providing the real-time analytics and machine learning necessary to track and monitor equipment simply was not cost-effective. But the internet of things changed all that.
Today, various companies are using artificial intelligence, data analytics, simulations and the combined knowledge of their teams of scientists and engineers to create interactive 3D models of their end products, such as plane, train and turbine engines. These models monitor machine usage, forecast life expectancy and optimize efficiency.
As more companies start implementing this technology, it’s only a matter of time before digital twins become more intricate and elaborate. Instead of simple lines of code, digital twins are set to be the next big thing in mixed reality.
Augmenting physical with virtual
Of course, digital twins are not an entirely new concept — they’re already playing a role in virtual reality (VR) and augmented reality (AR). Since the consumer release of the Oculus Rift and HTC VIVE, VR has become more commonplace around the world. Meanwhile, AR is quickly gaining steam in some sectors. As workers get more familiar with the technology at home and work, data reports, spreadsheets and other analytics are being visually upgraded to adapt to these platforms.
Rockwell Automation, for example, used its Studio 5000 development platform with Microsoft’s HoloLens mixed reality headset to create a next-generation mixed reality experience for designers. Siemens’ COMOS is a single data platform designed to map out an entire manufacturing plant’s lifecycle by monitoring and modeling instrument data, logic diagrams, piping and more. It’s doing all this in an immersive VR environment developed for the Oculus Rift VR headset.
These data models allow workers to view and interact with data in a whole new way. Instead of being limited to two-dimensional flat screens, they’re surrounded by data and reporting. Whether as an overlay to the real world or a fully virtual one, the amount of data that can be consumed at one time increases exponentially.
But this is just the beginning of how digital twins and mixed reality are transforming the workforce.
Digital twins are taking over
For enterprises, this new accompaniment to the greater IoT ecosystem is the ultimate realization of mixed reality combined with data, algorithms, computer-aided design and simulation. The entire lifecycle of equipment — design, create, maintain and troubleshoot — gets a major productivity boost with digital twins. Countless iterations can be pushed through an endless loop of design and development testing to compare results in a virtual world much faster than the real one. Maserati recently proved this by cutting production time of its Ghibli sports saloon almost in half.
The goods being produced don’t have to be as complex as cars — even food, drugs and other mass-produced items can have useful digital twins. It can be as simple as tracking production volume: A plant owner can easily see where physical goods, such as pill tablets, are in the process by viewing the data model. Digital twins can also make a process much more efficient, or provide information about items and their provenance. Through digital twins, that information is created along with the original item and updated in the cloud.
Imagine the possibilities for training and collaboration, especially in extreme situations — for instance, a technician working in the field may not have all the necessary data or skills for a crucial repair on a company turbine. Using a digital twin, an engineer from the manufacturer can diagnose the issue and feed instructions to an AR display worn by the field technician. It’s teamwork that wasn’t possible before.
And this is just the beginning. The continued evolution of AI, machine learning systems (along with constantly improving processing power), and IoT in general is only going to make digital twins more prominent and powerful.
Building a new virtual ecosystem
Whether digital twins are the final frontier remains to be seen, but they’re certainly the next one. The IoT ecosystem we’re building has the necessary data, and 5G wireless networks allow us to transfer that data faster than ever before. This combination of AI and 3D data modeling is the ideal use case for enterprise mixed reality.
As we move toward 2020, digital twins are going to become exact proxies of their real-world counterparts. And it’s not just happening on the manufacturing end — consumers are willingly installing fitness trackers and other IoT sensors in their homes and on their bodies. This data can be invaluable in assessing what consumers need and enhancing products to fit those needs.
From concept to production to even beyond the sale, digital twins are poised to affect nearly every industry from the ground up. And if you’re not ready, chances are you’ll be surpassed by those who are.
Why is cloud becoming increasingly popular with manufacturers? Perhaps it has something to do with the fact that manufacturers are looking for ways to improve the efficiency of new product introductions (NPIs). Getting to market faster, easier, smarter, and more affordably is key to success, especially in a competitive market where opportunities appear and disappear at the drop of a hat. If you’ve just convinced your business to invest in a cloud solution, you already know that these challenges also drive demand for the benefits of cloud solutions:
IT management of on-premise solutions and fake clouds can compromise a manufacturer’s agility, bog it down, and distract it from focusing on return on investment (ROI) in its core business.
Dissemination of critical data—including bills of materials (BOMs) and intellectual property (IP)—across an increasingly complicated network raises the risk of security breaches, product errors, and delayed time to market.
Mobile and portable devices require a solution architecture that’s deliverable 24/7 at world-class service levels.
A “flat world” (a level global playing field without geographical, balance sheet, or technology borders) increases the need for solutions that can scale efficiently to enable manufacturers to take advantage of global market opportunities.
Global distribution of a supply chain workforce challenges the proper alignment of people, processes, and overarching business goals. This can result in miscommunication and in problems coordinating and controlling product revisions.
Despite the fact that a greater number of manufacturers are embracing the cloud to maximize their IT budget and return on investment, many competing product lifecycle management (PLM) providers still see cloud as a hazy virtual concept. It’s not. On-premise cloud virtualization sacrifices cloud-based economics, shifts business risks back on the manufacturer, and weakens scalability.
As more companies realize the benefits of cloud, more fake cloud providers have appeared that are eager to exploit the naïve. The true promise of a multi-tenant cloud PLM solution includes the following: cost savings through eliminated IT expenses, simplified implementation, scalability, increased security, and a faster path to ROI.
One of the top benefits of cloud is enabling accessibility and mobility. With cloud applications, supply chain teams can access e-mail, documents, and BOMs from anywhere in the world over a secure internet connection. Because employees, partners, and contractors have remote mobile access to information outside of the traditional firewall, and have it whenever they want, processes are streamlined with greater efficiency than can be expected with an on-premise solution. Streamlined processes and tasks, in turn, allow companies to focus more on innovating and developing new products.
With cloud computing, there are no up-front capital costs for servers, hardware, and data centers, as there are with on-premise solutions. A cloud service provider also typically spends far less on energy per workload than you spend running your own data center. Consider these statistics: In general, only 15% of on-premise server capacity is ever utilized, while operations account for 30% of total data center costs; power and cooling are usually responsible for an additional 30%.
And finally, cloud offers customers a distinct security advantage over on-premise solutions and charlatan cloud solutions. Once considered a security liability, cloud—with upgrades, updates, backup, and security patches managed by experts—is now considered a superior security option compared with many on-premise solutions.
Congratulations. You’re now armed with information to convince your business to invest in a cloud solution, thus forgoing the costly infrastructure and resources needed to maintain on-premise software. You’ve just eliminated the opportunity cost of managing an on-premise solution, enabling your company’s precious human capital (namely engineers) to focus on critical core business activities like designing new products.
Last January, Marinko Lazanja, director of engineering systems at Gentherm, pulled the trigger on the first of many dress rehearsals for an important milestone for the manufacturer of climate control and thermal management systems: migrating from a 15-year-old legacy product lifecycle management (PLM) platform to PTC’s Windchill.
The $1 billion Gentherm, a key player in the automotive supply chain, had long outgrown its MatrixOne system. However, because the platform was ground zero for critical product and engineering-related material (nearly 40,000 documents comprising some 2 to 3 terabytes of data) and because its functionality had mostly held up over the years, Gentherm kept postponing an upgrade to modern-day PLM. That is, until the company’s exponential growth—nearly doubling in size over the last five years—and the complexity of its product bench made putting off a switch of PLM platforms no longer sustainable.
“Our old system served its job well, but it couldn’t support our current processes—it lacked support for a multi-CAD environment and it didn’t adequately enable reuse,” Lazanja explains. “We couldn’t keep using old tools. It was becoming more work to use the tool than to look for something else.”
Gentherm, like hundreds of early adopters of PLM, kept its first-generation PLM platform active far beyond its shelf life because of the cost and complexity associated with replacing a core system on an enterprise scale. “Lots of customers are stuck with 15- to 20-year-old PLM technology and they can’t upgrade,” says Kevin Power, business development manager for Tata Technologies. “They look at the data migration mountain and the huge effort to move into a new system and there’s not enough pain (tolerance) to go through with that.”
In addition to expensive software, protracted implementation services, and the general upheaval to engineering culture and broader core business processes, one of the biggest inhibitors to moving between PLM platforms is the laborious task of migrating legacy data over to the new system. The process hasn’t gotten much easier despite the significant technology advances in current-day PLM, in part because of the sheer size and scope of data now managed by the platform. In addition, many first-generation PLM deployments are highly customized, making it difficult for data to translate easily between systems.
“The tools are getting better at the same time the problem is getting harder and more complex,” says Tom Makoski, executive vice president, PLM & Migration for ITI Global, a consultancy specializing in product data interoperability problems. “There’s been an improvement in import and export capabilities, but there’s now much more complexity to the data going into these systems and much larger amounts of data migrating.”
PLM Data Migration Best Practices
Gentherm’s migration plan involves shifting 15 years’ worth of product-related data out of its long-time MatrixOne PLM platform and over to PTC Windchill. Rather than taking a big bang approach, the company has staged its migration over the course of two years, initially moving its quality management and documentation systems, implementing ProjectLink for project management in September, conducting a range of migration dress rehearsals and data verification tests, and targeting a full migration and retirement of MatrixOne by 2018.
“We’re building it out phase by phase—Windchill is a powerful tool, but there’s a lot of cultural change that comes with it and we’ve been using [MatrixOne] for many years,” Lazanja says. “This is something you have to plan carefully.”
One of the first steps to migration planning is to understand your data and figure out exactly what and how much data to move over to a new PLM system, according to Annalise Suzuki, director of technology and engagement for Elysium, a provider of multi-CAD interoperability tools. In the case of CAD data and 3D models, much of the complexity can be removed from the equation because it’s not always necessary to transfer 100% of existing data, depending on how that data is expected to be used, she explains.
“The percentage of legacy data expected to undergo engineering changes, which truly require feature history, can be very small,” she explains. “It’s a lot of work if you expect you need everything and really don’t.” Elysium’s tools can be tapped to repair quality issues on CAD models and to facilitate CAD-to-CAD translation.
Once the data has been identified, transformation of that data is critical to ensure a smooth migration. Over the years, data in legacy PLM systems can be corrupted for a variety of reasons, including system updates and upgrades or changes to process and business rules. Removal of relations, missing mandatory attributes, incomplete revisions and orphan data sets are common causes of data integrity failures, says Troy Banitt, director, Teamcenter Product Management, Platform and Product Engineering at Siemens PLM Software.
“When you want to move from an older to a newer PLM solution and migrate data, the big expense is cleansing and validating the legacy data,” says Tom Gill, senior consultant, PLM Enterprise Value & Integration Practice Manager for CIMdata, a PLM strategy management consultant firm. “The reality is data atrophy can set in, and data needs to be maintained to stay valid. People don’t follow standards and rules so data is never perfect.”
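The integrity failures Banitt lists — removed relations, missing mandatory attributes, orphaned data — are the kind of thing a pre-migration audit pass can surface before anything is loaded into the new system. The sketch below is purely illustrative (the record shape and attribute names are assumptions, not any vendor's tooling), but it shows the basic idea of cleansing by scanning an export for known failure patterns.

```python
# Hypothetical pre-migration audit (not vendor tooling): scan exported
# legacy records for common integrity failures before loading them
# into the new PLM system.

def audit(records, mandatory=("item_id", "revision", "owner")):
    """Return a list of (item_id, problem) pairs found in the export."""
    ids = {r.get("item_id") for r in records}
    problems = []
    for r in records:
        iid = r.get("item_id", "<missing>")
        # Missing mandatory attributes (empty strings count as missing)
        for attr in mandatory:
            if not r.get(attr):
                problems.append((iid, f"missing attribute: {attr}"))
        # Orphan data: a parent reference that points at nothing in the export
        parent = r.get("parent_id")
        if parent is not None and parent not in ids:
            problems.append((iid, f"orphaned: parent {parent} not in export"))
    return problems

records = [
    {"item_id": "A-100", "revision": "B", "owner": "jdoe"},
    {"item_id": "A-101", "revision": "A", "owner": "", "parent_id": "A-999"},
]
for iid, msg in audit(records):
    print(iid, "->", msg)
```

In practice such checks run against millions of records and the rules come from the target system's business logic, but the principle is the same: find and fix data atrophy before it fails the migration.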
PLM migrations can also be compromised by legacy data that doesn’t conform to current business rules. In addition, because companies are increasingly operating at a 24/7 pace, migration performance is critical, as there is a limited window to accommodate what is typically a lengthy schedule, Siemens’ Banitt adds. To help its customers with PLM migration, Siemens released the Deployment Center, a web-based installer designed to make it easier to install, patch, and upgrade Teamcenter software, including development and test environments. The company also has a number of loading tools to convert data from CSV format to native Teamcenter format along with frameworks designed to streamline the integration of Teamcenter with legacy applications.
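Loading tools of the kind Banitt describes generally reshape flat exports into whatever structure the target system's bulk-import interface expects. The following is an illustrative sketch only — the column names and target payload shape are invented for the example, and this is not Siemens' loader.

```python
# Illustrative only (not Siemens' tooling): convert a flat CSV export
# into per-item dictionaries shaped for a hypothetical bulk-import API.
import csv
import io
import json

csv_text = """item_id,revision,description
A-100,B,Bracket assembly
A-101,A,Sensor housing
"""

def rows_to_payload(text):
    """Map each CSV row to a nested record for the target system."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {
            "id": row["item_id"],
            "rev": row["revision"],
            "attrs": {"description": row["description"]},
        }
        for row in reader
    ]

payload = rows_to_payload(csv_text)
print(json.dumps(payload[0]))
```

Real loaders add attribute mapping tables, validation hooks, and restartable batches, but the core job — flat rows in, native records out — is what the conversion step amounts to.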
One of the ongoing questions in a PLM migration project is what and how much data to migrate. Some, like CIMdata’s Gill, advocate for migrating all relevant data because it creates too much complexity to maintain separate processes and incur infrastructure costs related to keeping siloed systems live. “If you think data has value then it should be migrated—scope what needs to be done and retire the old solutions to reduce complexity and costs,” he says.
Another school of thought is to make some data accessible in a PLM system, in PDFs for viewing, for example, but don’t migrate complex native data like CAD to the new platform. A manufacturer building a highly engineered product like a satellite might not want to retire its legacy system and instead may want to build integration between that platform and its new PLM system. On the other hand, a supplier that builds small assemblies might decide it’s easier to bite the bullet and drop its legacy platform and migrate everything over to the new PLM platform.
After the merger of car giants Chrysler and Fiat, the new Fiat Chrysler Automobiles (FCA) embarked on a full-scale integration effort to replace Dassault Systèmes’ CATIA V5 and VPM with the Siemens suite of Teamcenter, TcVis, and NX to serve as the key PLM backbone for FCA Engineering. There was considerable effort to prepare CAD data structures, and a set of dedicated tools was implemented to extract data from the previous solution for ingestion into Teamcenter, says Gilberto Ceresa, senior vice president and CIO for FCA. To streamline the effort, FCA staged the migrations based on vehicle program timing and the carry-over parts that needed to be made available from legacy solutions. “This approach allowed us to split the migration in different waves of smaller complexity than a single big bang approach and allowed the team to gain experience on the migration procedures applied along the process to improve the overall quality,” Ceresa says.
From the get-go, Ceresa says it’s important to have a clear understanding of the business benefits to be gained, starting with a clear picture of the existing processes and with a clear vision about how the new PLM solution will interact with the existing application landscape. “Internal customer involvement in the solution design, a timely communication plan for change management, and tailored training are then crucial to mitigate the risks of activities with this level of complexity,” he explains.
Honda, a long-time Dassault customer, is taking a similar tack of careful planning and moving only the data it needs as part of its ongoing transition toward increased virtualization and digital manufacturing using Dassault Systèmes’ 3DEXPERIENCE platform. For example, when implementing a new model process planning tool to help engineers create an efficient workflow for assembly, the company realized it only needed some of the CAD data in the new tool.
“We had a legacy tool that was only text-based. It was really a valuable tool for our process engineers for what we had at the time, but we couldn’t take advantage of the 3D data that was being generated by R&D,” said Ron Emerson, associate chief engineer, Honda North America, during his presentation at the Dassault Systèmes 3DEXPERIENCE Forum in October last year. “We couldn’t take advantage of all the data that came along with that. It was really hard for the process planners to visualize what they were assembling.”
The requirements for the more visual tool included fast and simple part visualization. Honda worked with Dassault to develop the new model planning structure application, which helps engineers quickly see when they’ve dragged and dropped all the required parts into an assembly sequence. The new tool provides Honda with the ability to load an entire vehicle into a session 80% faster using lightweight data, rather than all of the heavy mathematics of the full CAD data.
“Both strategies are valid and you make the decision on a customer-by-customer, application-by-application basis,” says ITI Global’s Makoski. His firm promotes a multistep process for PLM migration: exporting, transforming, aggregating, and loading.
PTC offers a range of choices to ease the migration burden, promoting multiple levels of migration and integration and the idea that data should live where it works best. For lightweight, spontaneous access to data in PLM or SAP platforms, mainstream users can tap the new ThingWorx Navigate to aggregate data, enabling them to look at a parts structure as part of a bill of materials (BOM), for example, without going through the pain of a big migration project, says Mark Taber, vice president of marketing for the PLM group.
Support for standards like Open Services for Lifecycle Collaboration (OSLC) enable a higher level of integration, allowing data to stay in an existing system, but still synchronizing traceability and compliance between platforms, Taber explains. PTC also has capabilities for integration via partner middleware programs along with a solutions and partner program to assist customers in the traditional full-scale PLM migration.
The idea, Taber says, is to offer choices. “The fact that product data lives in so many different systems of record, the idea that you have a single system that has everything isn’t practical anymore,” he says. “But you still need a single view of that information.”
For companies like Autodesk and Arena Solutions, the nature of their typical customer profile (companies new to PLM) and the fact they support cloud-based PLM solutions makes migration less of a pain point for customers, although it still remains an issue, says Steve Chalgren, Arena Solutions’ executive vice president of product management and strategy, and its chief strategy officer. Most Arena customers don’t have PLM already in place, he explains, and are instead migrating data stored in Access databases or spreadsheets to the new platform.
In fact, Chalgren contends Arena’s support for the cloud actually serves as a catalyst for many customers to bite the bullet on PLM migration. “We are seeing a regular cadence of people migrating to us from legacy systems because we are modern technology and they want to move to a more efficient enterprise software strategy in the cloud,” he says. “Enterprise software is relatively sticky—it’s expensive and disruptive to move so you need a reason to do it.”
Cloud-based PLM also provides an opportunity to approach what is typically a significant business project in digestible pieces, notes Charlie Candy, Autodesk’s senior manager, Global Business Strategy for Enterprise Cloud Platform, which includes the Fusion Lifecycle PLM platform. “Starting with quick wins, supporting processes to larger use cases like NPI (new product introduction) is a good way to get users on board early and show the product potential,” he explains. “This approach delivers incremental value and improvement.”
Standards help alleviate some of the pain associated with legacy PLM data migration, but the problem will remain a heavy lift for the foreseeable future. “The newer PLM solutions have more capabilities and support integration better and there are standards that make overall interconnection easier, but it’s still a complex task and you still need governance,” CIMdata’s Gill explains. “Otherwise, you paint yourself into a corner and make the solution unsustainable.”
On their own, the Internet of Things (IoT) and product lifecycle management (PLM) each offer opportunities for product design professionals. What insights can result when the two systems are brought together?
By enabling products to connect with the Internet (and with each other), the Internet of Things (IoT) opens up new avenues for gathering information about product function, provides insights into user behavior, and enables functionalities not possible otherwise. But to realize these benefits, product developers must embrace new workflows, rethink how familiar products should work, and determine how to handle the heavy data loads their new designs will generate.
The latter is a source of headaches, but also tremendous opportunities. Feeding product developers a stream of real-world data about how — and how frequently — a product is used, the conditions it operates in, and how well it performs can yield improvements to that product, and even entirely new products, according to Arena Solutions. The company foresees a future in which a maturing IoT will deliver “enormous” benefits to product developers, making this data more accessible and actionable by connecting cloud-based PLM systems (such as its Arena PLM solution) with the IoT.
“Now they can see much more clearly how to improve that product … all that guesswork and hunch-work starts disappearing,” said Steve Chalgren, executive vice-president of engineering and CTO at Arena Solutions.
It’s an evolution that mirrors what happened when software began migrating from on-premise implementations to the cloud a decade ago, Chalgren pointed out. Before that point, software developers had a disconnected view of their products’ use, he said, relying on surveys, customer site visits, and feedback from sales teams. Although those information sources were valuable, they left crucial questions unanswered: “How many people are adopting and using it? Is it shelfware or not? What parts [of the software product] are they using or rejecting?”
As a cloud-based company, Arena is able to collect data about the use of its software that provides a clearer view of what works for customers — and what doesn’t. “Now we can see with real data, and make real product improvements based on the data,” said Chalgren. “That’s the benefit to companies that are connecting their product with IoT.”
Connecting PLM and IoT systems is not a new concept. “I think it happens already — it starts as an informal process,” Chalgren observed. Data collected by IoT developers, such as the number of times that a user presses a button on a product, is very helpful to engineering teams, he noted. “I don’t think you’d have to ask [engineers] to look at it, they’d be so curious.”
The next step is to make that transfer of data more programmatic, defining analytic triggers so that when a particular threshold is reached, an engineering process is automatically launched — such as creating tickets in a PLM system. “It will just get more and more codified and automated,” Chalgren predicted. As for Arena, which provides an application programming interface (API) for integrating Arena with other systems, “we’re on a very clear path to connectivity,” he affirmed.
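The trigger pattern Chalgren describes can be sketched in a few lines. This is a minimal illustration, not Arena's actual API: the threshold value, field names, and ticket endpoint below are all hypothetical.

```python
import json
import urllib.request

# Hypothetical threshold: open an engineering ticket when a button's
# observed failure rate crosses 2% over the collected sample.
FAILURE_RATE_THRESHOLD = 0.02

def check_and_ticket(presses: int, failures: int, api_url: str):
    """Evaluate an IoT usage metric and, if the threshold is crossed,
    build a ticket payload for a (hypothetical) PLM REST endpoint."""
    rate = failures / presses if presses else 0.0
    if rate < FAILURE_RATE_THRESHOLD:
        return None  # below threshold: no engineering process launched
    ticket = {
        "title": "Button failure rate exceeded threshold",
        "metric": {"presses": presses, "failures": failures, "rate": round(rate, 4)},
        "severity": "review",
    }
    req = urllib.request.Request(
        api_url,
        data=json.dumps(ticket).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # In a real integration you would send the request and handle the response:
    # with urllib.request.urlopen(req) as resp: ...
    return ticket
```

The point of the sketch is the shape of the automation: field data in, a codified rule, and a PLM artifact out, with no engineer needed to notice the anomaly first.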
The Non-Optional IoT?
As Chalgren sees it, for product developers the impact of the IoT is not a matter of if, but when. “I think it’s just becoming part of every product. … It’s cheap to have a connected device and a chip,” he said. “Everyone’s going to be forced to do it.”
That growing ubiquity is reflected in Arena’s own customer base: more than 250 of them are major or startup IoT companies. That’s a huge increase over three or four years ago, Chalgren noted, when the number was no more than 50.
Improvements from one generation of a product to the next are much bigger than in the past. According to Chalgren, high-tech products continue to evolve after hitting the market almost as quickly as during their initial development. (The inclusion of electrical elements in IoT-connected devices is a contributing factor, as electrical engineering processes are much more iterative than mechanical ones.)
Chalgren also noted that many “companies you wouldn’t have thought of” are connecting. For example, he sees a huge opportunity in the automotive area, with cars and stoplights interacting, or with streetlights that can monitor and report on local weather conditions (an initiative that GE is pursuing). “I think IoT will move outside … will start going from the computer in the home to the real world.”
Boosting the Human Voice with Data
Embracing the IoT/PLM connection to gain product improvement insights will provide benefits in a variety of markets, Chalgren explained: “In a super highly competitive market, where small things make a big difference in your success, this is it.” And in a stable market where design teams are isolated from customers — meaning they don’t know about certain problem areas customers are encountering — gaining more insight into those problems can shake things up. “Someone is going to use that information to disrupt [the status quo],” he observed.
Even with the data Arena now gathers about its software via the cloud, the company still has a place for surveys and meetings with customers to observe their challenges and get their feedback. “There’s no replacement for hearing directly from someone about their frustrations, so I think that will continue, but it will be a lot more productive … before, we couldn’t even know where to look,” said Chalgren. “The improvement of communication, the quality of voice of the customer, is now able to be clear.”
>> Read more by the Cadalyst Staff, Cadalyst, November 25, 2017
Siemens PLM has created the Advanced Machine Engineering (AME) solution to provide a platform that connects mechanical, electrical, and software engineering data, allowing engineers access to a completely digital machine-build prototype. This digital twin represents an industrial machine operation that can be tested virtually throughout the development process. The goal of the engineering platform is to increase collaboration and reduce development time, while also reducing risk and allowing for the reuse of existing designs.
The AME uses modularized product development to establish common parts and processes among a family of products while defining functional modules that can be easily modified to meet specific requirements and support changes. In other words, you can build the manufacturing process like a collection of Legos (chunks of software), then customize the configuration and test it before you begin banging equipment into place.
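The "Lego" idea can be pictured as composing pre-validated modules into a machine configuration and checking their compatibility digitally before anything physical is built. The module names and the interface check below are invented for illustration; they are not AME's actual data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Module:
    """One reusable chunk of a machine: a validated unit with declared
    input and output interfaces."""
    name: str
    inputs: frozenset
    outputs: frozenset

# A small, hypothetical library of pre-validated modules.
FEEDER = Module("feeder", frozenset(), frozenset({"part_stream"}))
DRILL  = Module("drill",  frozenset({"part_stream"}),  frozenset({"drilled_part"}))
PACKER = Module("packer", frozenset({"drilled_part"}), frozenset({"boxed_part"}))

def validate_line(line) -> bool:
    """Check digitally that each module's inputs are satisfied by what
    upstream modules produce - before any equipment is installed."""
    available = set()
    for m in line:
        if not m.inputs <= available:
            return False  # a required upstream interface is missing
        available |= m.outputs
    return True

assert validate_line([FEEDER, DRILL, PACKER])  # valid configuration
assert not validate_line([DRILL, PACKER])      # drill has nothing feeding it
```

Because each module is validated once and reused, a configuration change becomes a recomposition-and-recheck exercise rather than a re-engineering project.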
By involving mechanical engineering, electrical engineering, and software development processes simultaneously, you shift away from the more time-consuming serial development process. You create a concurrent method that effectively turns the process into mechatronics.
Siemens developed the AME in order to speed the time it takes to set up plant equipment while also making the machine configurations easier to customize. “We created this for companies that are making automation control equipment, packaging machines, printing machines, anything that has a lot of mechanical systems and components, as well as sensors, and drives,” Rahul Garg, senior global director of industrial machinery and heavy equipment at Siemens PLM, told Design News. “Typically, these are the companies making products and machines that go into a plant.”
Creating the Modular Plant
One of the goals in developing AME was to make plant equipment modular, so the overall configuration of plant processes could be done more quickly and with greater flexibility. The digitized modular plant concept was also designed to reduce risk and engineering time. The process can be designed and tested digitally. “Many of these companies need to serve their end customers with increasing customization,” said Garg. “We wanted to create the ability to modularize the machine structure to deal with customization and quickly respond to engineering or systems changes.”
The modular approach to managing plant equipment also supports change, especially since much of the engineering to support the change is worked out on a digital level using existing modules that are already validated. “This improves the way the machine builders manage the end-customer requirements. Those requirements change. How do you manage that change? Get the engineering communicated to the shop floor and to those who service the products,” said Garg. “We are trying to improve the way they manage the engineering process and schedules to better control risk while working on large projects.”
Mechatronics on the Machine Level
The idea is to build new functionality into the equipment driven by automation and analytics. The intention is to turn it into an easy and rapid process. “You have to deliver the innovation in a fast process and reuse it,” said Garg. “The idea is to create a digital twin of the machine where you can simulate the entire behavior of the machine using control software and control applications. You drive the systems with the software.”
The AME contributes to the concept of the digital twin, which digitizes a product from design, through configuration, and into operation at the customer’s plant. “What we are trying to do is create manufacturing functions through the visualization process,” said Garg. “Then we want to take digitization further, by closing the loop with the physical product. Once the plant equipment is out in the field and the customers start using the equipment and machines, we want the ability to see and monitor the performance of the equipment and see how it’s performing.”
>> This article by Rob Spiegel was reposted from DesignNews.com (November 23, 2017)
The data revolution is firmly underway within today’s manufacturing industry. Those companies that capture their “big data” and leverage that analyzed data as a framework for making faster, better decisions will lead the industry in productivity and time to market.
Let’s put into perspective the scale of this data revolution. Consider the following: In just one manufacturing site, an estimated 3 billion sensor-related events occur during a 24-hour period. Each of those sensor-related transactions represents a piece of data, and most of that data can and should be used to improve operational efficiency.
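As a back-of-envelope check on that figure, 3 billion events spread over a 24-hour day works out to roughly 35,000 sensor events every second at a single site:

```python
# Rough throughput implied by the article's estimate.
events_per_day = 3_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400 seconds in a day
events_per_second = events_per_day / seconds_per_day
print(f"{events_per_second:,.0f} events/second")  # ≈ 34,722
```

That sustained rate is why the data-handling tools discussed below matter: no human team can triage a stream of that size manually.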
As greater numbers of smart field sensors and actuators deploy across manufacturing sites, these formerly “dumb” devices are now “connected” and begin to add to the data stream. Like tributaries flowing into a giant pool of data, those data elements are converted into useful information, which serves to aid in the decision-making process and ultimately, results in improved production.
But all of this does not happen automatically, and some sophisticated tools are required. The good news for manufacturers is that most operators are already familiar with the core tools that provide this data capture and analysis service. The industry calls them supervisory control and data acquisition (SCADA) systems. However, in this new era of full-fledged digitization, traditional SCADA systems have improved and their business value has taken on a new meaning.
At Siemens, the developers have recognized that SCADA is now a critical solution for connecting a plant’s distributed assets in order to generate actionable intelligence. To support this role, Siemens is evolving its SCADA applications for deployment on a smaller and more condensed scale. For instance, in traditional applications system nodes might have been physically located miles apart; in the new connected and data-dense environments, these nodes may be inches apart. Siemens’ newest SCADA platform, WinCC, has the ability to tie together data from both closely coupled and widely deployed assets. The result is a more flexible, reliable, and transparent environment with more intelligent automation and the ability to collect and analyze big data in real time for actionable information and better business decisions.
What is the digital factory big picture?
The way products are made is the same within both a traditional factory and a digital factory. Holes are drilled, parts molded, bottles filled — but the difference is information. In the case of the digital factory, the “smart” devices work together, and the control system interconnects the disparate processes, all with one objective in mind: to build competitiveness. Increased efficiency, reduced time to market, and greater manufacturing flexibility are now possible because of an underlying system that is optimized to process data.
The digitized approach of data gathering, data centralization and data analysis helps to integrate the five basic stages of the product lifecycle: design, production planning, engineering, production, and services. While the product is being designed, all the subsequent stages are already planned so that the overall process operates more efficiently. For example, manufacturing offers feedback on product design from the earliest stages to ensure smooth production. Using simulated modeling through every phase of manufacturing, it is easy to identify critical elements and potential risks and address issues as early as possible for maximum efficiency.
The SCADA contribution to the data flow
Smart devices throughout the plant become part of the SCADA network facilitating the data flow. Listed below are five areas where SCADA systems like Siemens WinCC add value to a functioning digitalized plant:
1. Data management—The enormous variety of field devices each generate their own data. To make this incoming data useful, data formats need to be consistent. That’s where the data management component comes in. The output of a good data management system is the rationalization of data so that it is both comparable and storable. A system such as WinCC presents data in real time and also archives it for subsequent analysis. The system can then identify trends or engage in troubleshooting. If a problem occurred in one section of the packaging line at 3:00 pm last Tuesday, what information was being generated by devices up- and down-stream of that problem area during that time period? The WinCC system can provide such information in a quick and straightforward manner.
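The rationalization step can be pictured as mapping each device's native record onto one comparable schema, then filtering the archive by time window (as in the 3:00 pm Tuesday example). The field names and units below are illustrative, not WinCC's actual data model.

```python
from datetime import datetime, timedelta

# Two devices reporting the same physical quantity in different native formats.
raw_records = [
    {"dev": "filler-07", "ts": "2017-11-21T14:58:00", "temp_F": 98.6},
    {"device_id": "capper-03", "time": "2017-11-21T15:01:30", "temp_c": 36.0},
]

def normalize(rec: dict) -> dict:
    """Map a device-specific record onto one consistent, storable schema
    (timestamps as datetime objects, temperature in Celsius)."""
    if "temp_F" in rec:  # imperial-reporting device
        return {
            "device": rec["dev"],
            "timestamp": datetime.fromisoformat(rec["ts"]),
            "temp_c": round((rec["temp_F"] - 32) * 5 / 9, 2),
        }
    return {  # metric-reporting device
        "device": rec["device_id"],
        "timestamp": datetime.fromisoformat(rec["time"]),
        "temp_c": rec["temp_c"],
    }

archive = [normalize(r) for r in raw_records]

def window(archive, center: datetime, minutes: int = 5):
    """Return every archived reading within +/- `minutes` of an incident."""
    span = timedelta(minutes=minutes)
    return [r for r in archive if abs(r["timestamp"] - center) <= span]

incident = datetime(2017, 11, 21, 15, 0)
nearby = window(archive, incident)  # both readings fall in the 5-minute window
```

Once every record shares one schema, the "what was happening up- and down-stream at 3:00 pm last Tuesday" query becomes a simple filter over the archive rather than a per-device forensic exercise.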
2. Information management—Data needs to be translated into production information so it can help optimize manufacturing. WinCC’s Information Server tool can create dashboards that provide real-time displays and visibility to plant operations. Managers can access the dashboards either locally, or remotely. Automated reports are generated that monitor critical process elements across any desired time interval.
3. Energy management—Energy management has emerged as both a regulatory and cost-control issue. Adhering to standards such as ISO 50001 helps to conserve resources, tackle climate change, and lower electricity, gas, and water costs. In order to reduce energy consumption, the first step is to be able to measure how much energy is being consumed. WinCC can act as a mechanism for capturing energy-consumption data from devices such as transformers, meters, circuit breakers, and motor drives—all places where power consumption can be measured. Then, understanding these energy use patterns, operations teams can avoid utility peak charges by reducing consumption during the times of day when rates are highest.
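The peak-charge logic described here amounts to comparing metered consumption against a time-of-use tariff. The rates, hours, and demand limit below are invented for illustration; actual tariffs come from the utility.

```python
# Hypothetical time-of-use tariff, in $/kWh by hour of day.
PEAK_HOURS = range(14, 19)          # 2 pm - 7 pm billed at the peak rate
PEAK_RATE, OFF_PEAK_RATE = 0.28, 0.11

def hourly_cost(hour: int, kwh: float) -> float:
    """Cost of consuming `kwh` during a given hour under the tariff."""
    rate = PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE
    return round(kwh * rate, 2)

def deferrable(hour: int, load_kwh: float, limit_kwh: float) -> float:
    """How much of this hour's load to defer so that peak-hour draw
    stays under a demand limit (nothing is deferred off-peak)."""
    if hour not in PEAK_HOURS:
        return 0.0
    return max(0.0, load_kwh - limit_kwh)

# Shifting a 50 kWh batch process from the 3 pm hour to 10 pm saves:
savings = hourly_cost(15, 50) - hourly_cost(22, 50)  # 14.00 - 5.50 = 8.50
```

The value of the SCADA layer is supplying the measured `load_kwh` per hour; without metered data, there is nothing for the tariff logic to act on.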
4. Diagnostics management— Tools within the WinCC environment allow users to view system and device diagnostic information. The easy access to this information speeds up the process of troubleshooting and repair. Everyday issues such as identifying shorts, wire breakage, missing voltage load, limit violations and other system defects can be quickly identified and addressed, avoiding long delays in both locating the problem and identifying the solution. WinCC provides alarms for immediate notification when problems emerge, and displays clear-text information pertinent to all devices, including sensors, PLCs, HMIs and servers. If a programming error exists within a PLC, the system identifies which line of code caused a trip.
A Totally Integrated Automation Portal (TIA Portal) provides a consistent look and feel as users navigate across plant functionality areas including process and component diagnostics. Simulation tools within the TIA portal allow for more proactive approaches to both diagnostics and energy management functions. In essence, the TIA portal acts as a gateway to the automation within the digital factory.
5. Open communication—Digitalization is driving a merger of automation systems with the IT world, so more systems, even those traditionally considered incompatible, are interconnected. A system such as WinCC serves as a data bridge between core operations technology (OT) and information technology (IT). To access even more operational data through the value chain, WinCC leverages MindSphere, Siemens’ cloud-based, open IoT operating system, which enables powerful industry applications and digital services to drive business success.
First steps, and a path to digitalization payback
Digitalization is a competitive manufacturing advantage that can be adopted over time. As a manufacturer works to modernize a plant, adding a SCADA system such as WinCC is a critical first step in establishing interoperability. The advantages of interoperability aid in facilitating a more competitive manufacturing environment. Some of the benefits include:
Plant and IT systems begin to communicate – A more direct exchange of information can occur as plant-level functions connect with MES, ERP and other management platforms.
Management can make decisions more quickly – More up-to-date and detailed information allows management to drive more optimized plant processes.
Energy savings – Energy use can be measured and reduced as consumption data becomes more transparent. Implementation of ISO 50001 standards becomes simpler.
Improved production uptime – Effective use of diagnostic information allows for more streamlined maintenance, and resources spend time where it’s needed most.
Synergistic improvement – Initial successes encourage wider deployment of smart devices at all levels, increasing the flow of information to support better decision-making.
As the digitalization process evolves, SCADA systems such as WinCC expand easily to accommodate and support new integration and communication phases. As digitalization intensifies, the system maintains its role as the primary facilitator of networking and information flow for a more connected and competitive plant.
>> This article by Alan Cone, Siemens, was re-posted from Automation.com, November 3, 2017
As little as a year ago, small and medium sized enterprises (SMEs) were being told about the benefits of big data, but most didn’t know where to start, nor did they have the time and expertise to produce insights that actually impacted their business. Today, SMEs are actively implementing big data into their processes—extending the use of data collected from multiple departments throughout an organization and its supply chain. Being able to tap into data retrieved from their partners, suppliers and end users is more common for SMEs today than you would think.
According to BI (Business Insider) Intelligence, IoT (Internet of Things) devices connected to the internet will more than triple by 2020, from 10 billion to 34 billion. IoT devices will account for 24 billion, whereas traditional computing devices (e.g., smartphones, tablets, smartwatches, etc.) will comprise 10 billion. The research firm also noted that businesses will be the top adopters of IoT solutions because they will use IoT to (1) lower operating costs; (2) increase productivity and (3) expand to new markets or develop new product offerings.
The Internet of Things (IoT)
New trends in data and the increasing presence of the IoT can be leveraged without sinking excessive time and investment into an uncertain payback. The balancing act is to navigate this data with a minimal amount of resources to find anomalies in the simplest way. Product Lifecycle Management (PLM) software can help SMEs do just that. PLM offers the ability to retrieve valuable information — including design data, product engineering, manufacturing, and field data — and tie this information to a central product record that can be referenced for new product iterations. Although the big PLM players have made acquisitions, tacking expensive data-analysis services onto their pricing or charging additional fees for integration with other systems, there are PLM vendors looking out for the SME by providing a simpler, more streamlined way to leverage the IoT and Big Data trends in a more significant, beneficial framework. It won’t seem like a tsunami crashing down, but rather the kind of wave you would want to ride — one that would actually bring you to the shore, on solid ground and in one piece.
For starters, PLM technology that connects processes within multiple departments and active teams from engineering, operations and manufacturing can create a closed loop, enabling manufacturers to develop products that are able to leverage IoT data to improve product development processes. The IoT data gathered from a product’s real-world performance and quality could impact all stakeholders across the company in product design and development, manufacturing, sales and marketing, customer operations and after-sales services. It can also make a difference for suppliers as well since this feedback can be circulated back to the suppliers of the parts being used in the product design. Suppliers can then make any adjustments to parts to optimize performance.
IoT data often has the ability to deliver data points in real time and close the lifecycle loop. Product planning, design and quality departments can now learn from a product’s operational behavior to improve features that customers use most. For example, IoT data managed with PLM will enable manufacturers to track and configure product design requirements based on usage patterns and allow for the redesign of parts or systems to improve quality. In addition, mobility improvements give users access to PLM data and processes from their mobile devices, allowing them to securely review, respond and react faster. It gives them the ability to record and process audit findings and any quality issues onsite or in the field.
Leveraging the Digital Age
Manufacturers have consistently focused on improving quality, performance, reliability and positive relationships with customers. However, it has become evident that the next generation of competition is digital and seems to entail change in everything from designing products to supply chain management. The product manufacturer that never had to worry about embedding electronics and software components into its product design is now finding the opposite is true. Having to buy chips and software off the shelf is changing the supply chain for manufacturers. They must work with new suppliers and determine delivery schedules, costs, end of life, etc. PLM helps manage the more complex supply chains by connecting directly to online content providers to predict issues with key factors such as availability early in the development process to save time and money.
There are still some potential challenges for SMEs regarding how to best use the data they have acquired. SMEs want to be able to harness the power of this data available to them without drowning in it, and potentially losing their progress. Useful, meaningful data retrieved can assist SMEs in gaining a competitive edge through designing better-engineered products. Data is streaming in faster and from more sources than ever as the new movement toward IoT grows and mobility brings even more connectedness, resulting in more information.
Riding the Wave
There is no doubt that this data trend is here to stay. For SMEs, this means finding a way to ride the wave to be able to reap the business benefits without drowning in the data. It is apparent that PLM technology designed for SMEs with its ability to centrally manage complex product data, connect internal and external departments and devices, and track and resolve product issues, is a natural solution. SMEs who turn to their PLM provider will be able to take advantage of Big Data and the IoT to design better products and maintain a competitive edge.
>> Read more by Chuck Cimalore, Digital Engineering, September 27, 2017
HP’s Multi Jet Fusion 3D printing technology integrated with Siemens’ flagship NX software for product development and manufacturing
New Siemens’ NX AM for HP Multi Jet Fusion software module officially certified by HP for production-scale industrial 3D printing
Reinforced partnership aligns technology roadmaps for next generation HP Multi Jet Fusion 3D printers
Building on a longstanding partnership, HP Inc. and Siemens are accelerating 3D printing for industrial production through the creation of a new HP-certified Additive Manufacturing (AM) software module from Siemens. The new software module, Siemens NX AM for HP Multi Jet Fusion, is now available from Siemens PLM Software as an extension to Siemens’ end-to-end design-to-production solution for additive manufacturing. The NX™ software module will allow customers to develop and manage parts in a single software environment for their HP 3D Printing projects, avoid costly and time-consuming data conversions and third-party tools, and improve their overall design-to-finished-part workflow efficiency. Siemens and HP are also aligning future technology roadmaps to enable designers and engineers to completely reimagine products to take advantage of HP’s 3D printing capabilities, escape the limitations of conventional manufacturing, and cost-effectively produce new products at faster speeds. This in turn will lead to greatly expanded opportunities for the industrial 3D printing of innovative designs.
Siemens’ new software module will enable NX customers to combine design, optimization, simulation, preparation of print jobs, and inspection processes for HP Multi Jet Fusion 3D printed parts in a managed environment. Users can now load multiple 3D part models into NX, and auto nest and submit them to an HP 3D printer, all in a single environment and with a minimum of steps. The NX and Multi Jet Fusion integration also eliminates the need for data conversion between software applications or process steps and, in the future, is intended to allow unprecedented control, including material characteristics down to the individual voxel-level. This will result in the ability to print parts with variable textures, density, strength and friction, as well as thermal, electrical, and conductivity characteristics.
“HP and Siemens are bringing together the best in design and manufacturing workflow software for the best in 3D printing, unleashing a wave of new product possibilities with the speed, quality, and economics required for the modern digital industrial era,” said Michelle Bockman, global head of 3D Printing Commercial Expansion and Development, HP Inc. “We look forward to collaborating with Siemens to continually raise the industry bar on what’s possible for customers with the voxel-level design capabilities of our Multi Jet Fusion 3D printing solutions.”
Siemens and HP share the objective to industrialize additive manufacturing. HP’s award-winning Multi Jet Fusion 3D printing solution is a production-ready commercial 3D printing system that delivers superior[1] quality physical parts up to 10 times faster[2] and at half the cost[3] of current 3D printing systems. With Siemens’ comprehensive offering covering product lifecycle management (PLM) and electronic design automation (EDA) software, integrated automation and manufacturing operations management, combined with HP’s 3D printing solutions, manufacturers have the tools to establish additive manufacturing as a truly industrial production process. Both companies continue to work together and with other industry leaders to create an important ecosystem of partners who can help realize the goal of additive manufacturing as a viable production alternative.
“At Siemens, we see additive manufacturing as a transformative digital force that is empowering companies to reimagine their products and factories to achieve new levels of business performance,” said Zvi Feuer, senior vice president of Manufacturing Engineering Software, Siemens PLM Software. “Deepening our partnership with HP and driving their innovative 3D printing technology is especially important as companies look to increase speed to market, differentiate on product performance, simplify production and supply chain operations, and implement new business models. As products become more complex and individualized, we look forward to the next frontier of 3D printed parts with multiple materials, tunable mechanical properties and integrated electronics.”
1. Based on dimensional accuracy of ±0.2 mm/0.008 inches, measured after sand blasting. See hp.com/go/3Dmaterials for more info on materials specifications. Based on the following mechanical properties: Tensile strength at 50, Modulus Z 1900, Modulus XY 1900. ASTM standard tests with PA-12 material.
2. Based on internal testing and simulation, HP Jet Fusion 3D printing solution average printing time is up to 10x faster than FDM & SLS printer solutions from $100,000 USD to $300,000 USD on market as of April 2016. Testing variables: Part Quantity -1 full bucket of parts from HP Jet Fusion 3D at 20% of packing density vs same number of parts on above-mentioned competitive devices; Part size: 30g; Layer thickness: 0.1mm/0.004 inches. Fast Cooling is enabled by HP Jet Fusion 3D Processing Station with Fast Cooling, available in 2017. HP Jet Fusion 3D Processing Station with Fast Cooling accelerates parts cooling time vs recommended manufacturer time of SLS printer solutions from $100,000 USD to $300,000 USD, as tested in April 2016. FDM not applicable.
3. Based on internal testing and public data, HP Jet Fusion 3D printing solution average printing cost-per-part is half the cost of comparable FDM & SLS printer solutions from $100,000 USD to $300,000 USD on market as of April 2016. Cost analysis based on: standard solution configuration price, supplies price, and maintenance costs recommended by manufacturer. Cost criteria: printing 1-2 buckets per day/ 5 days per week over 1 year of 30 grams parts at 10% packing density using the powder reusability ratio recommended by manufacturer.
>> Read more from Siemens Press Release, September 6, 2017
The terms Industry 4.0, Big Data, the Internet of Things, and the Digital Factory are being pitched around like a rugby ball, and almost always with a decided lack of clear definition. Let’s set the record straight.
After German Chancellor Angela Merkel, in conjunction with her ministers of industry and education, ordered a study about the manufacturing environment, the German Academy of Science & Engineering drafted the vision of Industrie 4.0. It was planned as a coordinated initiative among the IT world, universities, and various manufacturing associations designed to reshape industry. It would seek to combine the physical, virtual, IT, and cyber systems, thereby creating a new working environment between the worker and machine. The 4.0 part of the name, incidentally, derives from the fourth industrial revolution — the predecessors being the emergence of mechanization through steam/water power, the impact of electricity on mass production, and the invention of the computer, which led to our modern concepts of IT and automation.
Industry 4.0 (English spelling) has been adopted worldwide as a functional goal in industry — especially the manufacturing world. Industry 4.0 represents a high point of dynamic achievement, where every company — whether a large OEM, major tier supplier, or smaller job shop — can implement and benefit from the technologies and communications platforms available today.
Without question, Industry 4.0 is less a vision of the future and more a vibrant collaboration among IT, machine builders, industrial automation integrators, and especially motion control suppliers that function at the heart of the machines, simultaneously effecting motion, then gathering and transmitting the relevant data to the appropriate control link in the company’s infrastructure, all at speeds measured in nanoseconds.
To work effectively, this concept requires a standardization of platforms in both communications and languages used.
Integration in Practice
While the Big Data idea overwhelms most managers, technicians, and operators alike, the key is the manipulation of that data in a hierarchy of needs, to borrow a term from the psychology world. The mobile device, tablet, cellphone, and now the human machine interface (HMI) screen itself can all be useful tools in transmitting the most important data from the shop floor to the top floor, or just down the hall to the front office. We say that for a reason, as the small shop owner would be well advised to heed this trend and respond appropriately. That action might take the form of using an integrator to tie all the machine functions and outputs together for the day when an OEM or upper-tier customer demands it. In many industrial sectors, that day has already arrived.
Also, the cybersecurity issue cannot be overstated, as we will soon see a shift from the open to the closed cloud for data storage in a factory or shop network. The protection of intellectual property remains paramount, on a global scale, today. To overlook that reality is to compromise the stability and security of your company.
“Remaining competitive” takes on many meanings, depending on your location in the world, but here are some thoughts on how manufacturers can do it better today. By the time you finish reading this article, another entrepreneur will have figured out a way to make it happen for his or her company.
Time-to-market reduction is as critical today as ever. Shorter innovation cycles — the result of new product lifecycle management software and services available to companies both big and small — mean the savvy product companies can take their concept and make it fly in just a fraction of the time spent in the past. And by past, we mean compared to about ten years ago.
With the recent, rapid expansion of application-specific integrated circuit (ASIC) capability, much more functionality can be built into a product today, and this means the manufacturing community must be even more flexible and responsive — not merely reactive — than ever before.
With the Big Data impact that has resulted from the above scenario, both machine and component manufacturers are challenged in many ways, not least by the daunting task of separating the important or exceptional from the nominal. A quality ERP or MES system can tell you what you need to know, but the keys lie in the factors that determine the inputs to these systems and how their priorities are set.
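As a rough illustration of separating the exceptional from the nominal, the sketch below flags readings that fall outside the statistical band of a sample. The data, threshold, and function names are illustrative assumptions, not drawn from any particular ERP or MES product:

```python
# Hypothetical sketch: separating "exceptional" readings from nominal
# ones in a stream of machine-cycle times. The z-score threshold and
# sample data are illustrative assumptions.
from statistics import mean, stdev

def flag_exceptions(readings, z_threshold=3.0):
    """Return the readings that deviate from the nominal band.

    A reading is "exceptional" when it lies more than z_threshold
    standard deviations from the sample mean.
    """
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []  # no variation: nothing stands out
    return [r for r in readings if abs(r - mu) / sigma > z_threshold]

# Example: 50 nominal cycle times around 12 s, plus one slow outlier.
cycle_times = [12.0, 12.1, 11.9, 12.05, 11.95] * 10 + [19.5]
print(flag_exceptions(cycle_times))  # only the 19.5 s cycle is flagged
```

In practice the priorities the article mentions would decide which signals feed a filter like this and how tight the band should be for each one.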
From the perspective of the motion control and communication platform world — which focuses on the control, generation, or application of movement on everything from a machine tool to a packaging line, from an automotive assembly line to a turnkey book printing facility — we see a great variety of needs among OEMs as well as end-users in these various segments. All of them require flexibility and often highly customized solutions to their manufacturing or processing challenges. Meanwhile, maintaining high productivity on aging equipment is a constant concern for every company. Do you need to retrofit an existing machine or invest in a new one? Are enhanced robotics and transfer mechanisms or more personnel required on the line? Should you focus on better asset management or an entirely new business model when thinking about factories or processing facilities? Today, as the digital factory emerges in all industries and at companies of all sizes, we find ourselves providing answers to these questions, based not only on product, but also on software, communication, bus protocol, and other areas of manufacturing expertise.
Utilizing Data to Remain Competitive
It’s now a popular saying that “data drives utilization.” Using data smartly, however, requires an educated workforce that can take product design and turn it into viable and profitable production for the employer, regardless of the machine, widget, chemistry, or package being produced. In a world dictated by product lifecycle management needs, the correlation among design, production planning, output, and delivery — plus the monitoring of usage and returns in the field — has never been more important, but also never more manageable, given the new tools available from both product and service providers in the market today.
With IT as the link, today’s digital factory ties the shop floor to the top floor. A word about security: the involvement of suppliers, especially as it pertains to the cybersecurity of Big Data, is a critical factor today. While technology is key, so is the old-fashioned but highly underrated notion of trust. Companies are most productive when they can trust their suppliers, especially those who promote a “defense in depth” approach to cybersecurity.
That value can often come in unseen ways, such as the access provided to your workforce for prompt and effective answers to questions. Perhaps it’s a 24-hour hotline, perhaps it’s an onboard technical manual in the machine controller with troubleshooting capability on-screen, or perhaps it’s a supplier-provided training webinar that will expand the way your operators and maintenance personnel use their machines. Taking full advantage of these services will improve the productivity of your factory floor. You hear about total cost of ownership (TCO), and this is one of those subtle but very real factors that drives that calculation.
Another key to remaining competitive is the cost of energy. The more a machine can do with less energy, the more efficient and profitable it becomes. That’s the obvious part; getting there can take many forms. For example, the simple notion of regenerative energy — a concept in play in the electrical world since Sprague’s regenerative braking motor in 1886 — can be monitored and manipulated by today’s drives, putting power back onto the grid or using it to drive other equipment. By simply implementing “smart” motors, drives, and other equipment, manufacturers of all types can improve their productivity and their bottom line — a win-win, to be sure.
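To put a number on the regeneration idea, the back-of-the-envelope sketch below estimates the kinetic energy a drive could recover when decelerating a rotating load. The inertia, speeds, and efficiency figure are illustrative assumptions, not vendor data:

```python
# Hypothetical sketch: energy a regenerative drive could return to the
# grid when braking a rotating load. Inertia, speeds, and the 85%
# conversion efficiency are illustrative assumptions.
import math

def recoverable_energy_j(inertia_kg_m2, rpm_start, rpm_end, efficiency=0.85):
    """Kinetic energy released by slowing a rotor, times drive efficiency.

    E = 1/2 * J * (w_start^2 - w_end^2), with angular speed w in rad/s.
    """
    w_start = rpm_start * 2 * math.pi / 60
    w_end = rpm_end * 2 * math.pi / 60
    return 0.5 * inertia_kg_m2 * (w_start**2 - w_end**2) * efficiency

# Example: a 2 kg*m^2 spindle braking from 3000 rpm to rest.
energy = recoverable_energy_j(2.0, 3000, 0)
print(f"{energy / 1000:.1f} kJ recoverable per stop")
```

Multiplied across thousands of stop cycles per shift, even modest per-stop figures like this one add up to the bottom-line effect the article describes.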
Lastly, safety must be paramount, not only because it protects the workforce, but also because it contributes to overall efficiency and the profit picture. Accidents become less frequent when the mean time to repair is reduced and equipment is replaced before it malfunctions and hurts someone — which requires implementing both preventive and predictive maintenance protocols.
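A minimal sketch of the predictive half of that idea: extrapolate a wear indicator and schedule a replacement before it crosses its alarm limit. The indicator, limit, and thresholds here are illustrative assumptions, not taken from any standard or product:

```python
# Hypothetical predictive-maintenance rule: project the trend in a wear
# indicator (e.g. vibration level) and act before it reaches the alarm
# limit. All data and thresholds are illustrative assumptions.

def hours_until_limit(history, limit):
    """Linear extrapolation of hourly readings to the alarm limit.

    history: readings taken once per hour, oldest first.
    Returns projected hours remaining, or None if the trend is
    flat or falling (no predicted crossing).
    """
    if len(history) < 2:
        return None
    slope = (history[-1] - history[0]) / (len(history) - 1)  # units/hour
    if slope <= 0:
        return None
    return (limit - history[-1]) / slope

# Vibration (mm/s RMS) creeping upward toward an assumed 7.1 mm/s limit.
vibration = [2.0, 2.2, 2.5, 2.7, 3.0]
remaining = hours_until_limit(vibration, 7.1)
if remaining is not None and remaining < 24:
    print("schedule replacement at next shift change")
```

Preventive maintenance, by contrast, would simply replace the part on a fixed calendar; the predictive rule lets the data set the schedule.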
Examples from Industry Today
Confidence in digital manufacturing is higher than ever among leading companies these days, and for good reason. Industry leaders are beginning to realize benefits from their investments in digital technologies and next-generation robotics. One car maker offers a prime example of how the benefits of digitalization can accrue. In its case, everything from design to execution planning is implemented digitally. The company once required 30 months to manufacture its luxury sports sedan, from start to finish. Thanks to digitalization, production time was reduced to 16 months, and the company achieved a threefold increase in manufacturing productivity.
Another successful application of digitalization can be found at another car plant equipped with more than 1,000 robots, all of which help to weld vehicle bodies with accuracy within a tenth of a millimeter. Robots also control the first fully automated hang-on assembly line, which attaches the doors, hoods, and hatches to the vehicles — a process that previously was entirely manual. The plant also has an automated engine marriage process and a new integrated paint process that uses 30% less energy and produces 40% fewer emissions.
Digitalization, and its proper implementation, is now emerging as a critical success factor for industry. It means gathering more data and analyzing that data in a virtual context so that better decisions and, in many cases, predictive decisions can be made. It’s changing the way products are developed, built, and delivered through machine learning, additive manufacturing, and advanced robotics. And it’s changing the way products evolve through cloud technology, knowledge automation, and Big Data analytics.
Digital technologies present a billion-dollar opportunity for manufacturers to transform their production and reorient their value proposition to meet the needs of today’s digital consumers. Manufacturers become more competitive because digitalization brings even higher speed to the product development lifecycle, enabling faster response to consumer demand.
Simulation is one digitalization tool that drives shorter innovation cycles, even when highly complex products and large volumes of manufacturing data are involved. In a simulation environment, a virtual model of each component in a device or machine is generated, which allows designers and builders to explore what-if scenarios easily and quickly. These virtual models have come to be known as “digital twins.” Plant operators analyze the data gathered from the physical asset, then use it to run simulations and benchmark performance against the twin, pinpointing where gains can be made. By pairing the virtual and physical worlds (the twins), data analysis and system monitoring can avert problems before they occur — preventing downtime, uncovering new efficiency opportunities, and enabling planning for the future. Existing assets can be modeled against their digital twins, and new designs can be tested in the virtual world, saving time, money, and resources. Testing the interaction on screen can verify a modification to a car engine, for instance, before any new holes are drilled. Such scenarios are occurring at every step of the supply chain in the auto, aero, medical, off-highway, appliance, and other industries.
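The pairing of virtual and physical described above can be boiled down to a toy example: a virtual model predicts what the machine should produce, and measured output is checked against that prediction to catch drift before it becomes downtime. The model, parameters, and tolerance below are illustrative assumptions, not any vendor's twin:

```python
# Toy "digital twin" pairing: compare measured output against a virtual
# model's prediction and flag drift early. The throughput model, the
# 60 mm-per-part figure, and the 5% tolerance are illustrative assumptions.

def twin_prediction(feed_rate_mm_min, efficiency=0.92):
    """Toy virtual model: parts per hour expected at a given feed rate,
    assuming 60 mm of tool travel per part and a fixed efficiency."""
    return feed_rate_mm_min / 60 * efficiency

def drift_alert(measured_parts_hr, feed_rate_mm_min, tolerance=0.05):
    """Flag the physical asset when it underperforms its twin by more
    than the tolerance fraction."""
    expected = twin_prediction(feed_rate_mm_min)
    shortfall = (expected - measured_parts_hr) / expected
    return shortfall > tolerance

# The twin expects 46 parts/hr at 3000 mm/min.
print(drift_alert(40.0, 3000))   # True: investigate before a breakdown
print(drift_alert(45.5, 3000))   # False: within tolerance
```

A real twin would replace the one-line throughput model with a physics-based or data-driven simulation, but the monitoring loop — predict, measure, compare, act — is the same.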
A connected digital factory, and the Big Data it generates, provides manufacturers with the insight and agility required to compete. Digitalization gives manufacturers the capability to increase productivity across their entire value chain — from design and engineering to production, sales, and service — with integrated feedback throughout the process. In practical terms, this means faster time-to-market, greater flexibility, and enhanced availability of systems on the plant floor.
The integration of digitalization into operations is also a flexible process. Digitalization can be adopted at any pace that fits the needs of the organization. Some manufacturers start with retrofits or may begin by digitalizing one assembly line or even one machine at a time. By whatever means a company chooses to begin its path to digitalization, the critical challenge is to start now.
>> This article from Tech Briefs, September 1, 2017, was prepared with contributions from Arun Jain, Vice President, Siemens Industry, Motion Control Business, and Alisa Coffey, MarCom Manager of Aerospace, Automotive and OEMs, Siemens Industry, Atlanta, GA; and Bernd Heuchemer, Vice President of Marketing, Siemens, Munich, Germany. For more information, go to Siemens.com