Rising complexity is leaving conventional PLM processes behind

Designing a gearbox used to be a demanding mechanical engineering task. There was a lot to get right: optimising durability, efficiency and shifting behaviour; ensuring parts were inexpensive to manufacture and easy to assemble; meeting stringent packaging constraints.

These challenges remain, but a modern gearbox is likely to be a sophisticated electromechanical assembly, controlled by powered actuators, monitored by sensors and managed by complex software. It now requires collaboration between multiple groups, each with their own specialisms: mechanical and electrical design engineers, software developers and data analysts.

The same pattern is repeated across the world of manufactured products, but rising product complexity is only part of the story. Sub-assemblies and sub-systems are increasingly likely to be engineered by different suppliers. Carmakers now offer additional functionality during the life of their products, while manufacturers of industrial equipment use remote links to monitor the performance of their machines in the field.

The data gap

“The challenges facing today’s manufacturers are formidable, but they are compounded when companies find themselves with the wrong tools for the job,” says Graham McCall, vice president of operations at Aras. “Specifically, the systems used to manage product data are struggling to keep pace with the requirements placed on them by modern engineering and business processes. Companies strive to create continual ‘digital threads’ that connect their products from concept to end-of-life, but often, those threads are broken.”

It’s not hard to see why. The majority of product lifecycle management (PLM) systems have roots that stretch back to the early days of computer-aided design (CAD). As companies moved from paper to electronic drawings, and on to 3D models and simulations, it became clear that they needed systems to manage the growing volumes of data these tools produced.

CAD evolved into product data management (PDM) as systems gained the ability to manage bills of materials, capture and formalise engineering standards and rules, and generate multiple product configurations from a common underlying dataset. PDM then evolved into PLM, which was intended to extend the common data architecture across the boundaries between disciplines, engineering phases and organisations.

“PLM has failed to deliver its objectives for two fundamental reasons,” explains McCall. “First, most existing PLM systems haven’t broken free from their roots in mechanical CAD. They do some things very well: handling product geometry, materials data, assemblies and complex configurations, for example. But they struggle with others, notably the integration of the software development processes that are a central part of many products. This leaves a gaping hole in processes that usually gets filled with a patchwork of sub-par tools like Excel spreadsheets, shared drives, e-mail or Dropbox. None of these connect to the PLM system or to each other.

“Second, typical PLM implementations are notoriously slow, difficult and expensive to deploy. Companies can spend millions of pounds and thousands of hours, only to find they have a solution that covers just a fraction of their processes, products and business activities. And once that PLM system is up and running, changing it is just as hard. That makes it difficult for companies to seize new opportunities – evolving their internal processes and external offerings to take advantage of new technologies or shifts in customer demand.”

The birth of resilient PLM

“Organisations are learning that they can’t tackle today’s challenges with their existing data infrastructure,” McCall adds. “What they need is a resilient PLM platform to make them successful in their enterprise-wide digital transformation. One that is flexible, scalable and upgradable.”

A resilient platform is one that can accommodate the modern agile approach to IT implementation. Companies need the ability to deploy systems rapidly and adapt them quickly to meet their changing needs. That calls for a fundamentally different software architecture. Instead of hard-coding business logic, services and database structures into the platform itself, these elements need to be accessible and adaptable, implemented as models that can be changed quickly and inexpensively.
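The difference between hard-coded and model-driven business logic can be sketched in a few lines. The example below is purely illustrative (the workflow states and function names are hypothetical, not taken from Aras or any other PLM platform): the lifecycle rules live in a plain data model that a generic engine enforces, so adapting the process is a data edit rather than a software change.

```python
# Illustrative sketch of model-driven business logic (hypothetical names,
# not any specific PLM platform). The workflow definition is plain data;
# changing the review process means editing the model, not the engine.

WORKFLOW_MODEL = {
    "Draft":     ["In Review"],
    "In Review": ["Approved", "Draft"],
    "Approved":  ["Released"],
    "Released":  [],
}

def allowed_transitions(state: str) -> list[str]:
    """Return the lifecycle states an item may move to from `state`."""
    return WORKFLOW_MODEL.get(state, [])

def transition(state: str, target: str) -> str:
    """Move an item to a new lifecycle state, enforcing the model."""
    if target not in allowed_transitions(state):
        raise ValueError(f"Cannot move from {state!r} to {target!r}")
    return target

# Adapting the process -- say, inserting a "Change Pending" state -- is
# an edit to WORKFLOW_MODEL: no engine code changes, no redeployment.
```

With the rules externalised like this, the same engine can serve many processes, and each process can be revised as quickly as its model can be edited.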

“A scalable PLM solution is one that meets the needs of all current and potential users,” McCall says. “That requires a clear separation between a company’s data and the tools used to create and access it. Only a few individuals need the ability to generate the geometry of a component, for example, but many hundreds may need to access and review that model without paying thousands a year for access to the original authoring tool.”

According to McCall, the digital thread that runs throughout a product’s lifecycle won’t be spun out of legacy CAD tools. It will be woven around them, and around the other tools, data sources and business processes.