Day 15 – The Evolution of PLM: Why Old Monolithic Systems Fail SMBs

Oleg Shilovitsky
6 November, 2025 | 8 min read

Product Lifecycle Management (PLM) was born in the 1990s, in the era of big industrial IT.
Back then, large OEMs — automotive, aerospace, defense — were dealing with thousands of engineers and complex product structures that required strict control. PLM emerged as the digital backbone for these enterprises, bringing together CAD data, product structures, and change management into one monolithic system.

Those early PLM systems mirrored ERP logic — centralized, hierarchical, and file-based. They were built on relational databases, governed by workflows, and designed for a world of controlled access and tightly managed engineering processes. Each PLM was installed on-premises and guarded by a dedicated PLM IT team.

And for billion-dollar manufacturers, this made sense. The model delivered governance, compliance, and traceability. But over time, as global manufacturing became decentralized and innovation shifted toward small and mid-sized manufacturers (SMBs), the cracks in that architecture became impossible to ignore.

What once worked for massive enterprises now stands in the way of agility. Monolithic PLM systems — with their high implementation cost, rigid data models, and slow change cycles — simply can’t keep pace with how modern engineering teams design, collaborate, and manufacture products in the networked era.

As I wrote in Beyond PLM, the second wave of PLM arrived with the cloud. But most vendors only moved the old model into hosted environments. They lifted the same architecture — the same rigid data schemas — and ran them on cloud servers. The interface became web-based, but the logic remained frozen in the 1990s-2000s.

Today, PLM evolution isn’t about moving servers to the cloud. It’s about rethinking how data connects across design, production, and supply chains — and building systems that can adapt as fast as the teams who use them.

Why Monolithic PLM Systems Fail SMBs

For small and mid-sized manufacturers, traditional PLM systems create more friction than value. They were designed for control — not connection. Here are three fundamental reasons why they fail.

1. Cost and Complexity

On-premise or single-tenant PLM platforms require long implementations, customization, and consultants. The total cost of ownership is high not because the technology is expensive, but because it’s inflexible. Every installation becomes a unique snowflake — requiring special configurations, scripts, and training.

For an SMB, this is a non-starter. The value is delayed, the deployment is cumbersome, and the return on investment is unclear. Many small manufacturers try to adapt — combining Excel, SharePoint, and email to fill the gaps — but end up spending more time managing files than building products.

2. Complex, Rigid Data Models

Traditional PLM systems are built on SQL-based relational databases — powerful for storing structured data, but ill-suited for modeling relationships between components, assemblies, and revisions that constantly evolve.

Each implementation defines its own schema, its own rules. Once deployed, change becomes a project in itself. The model isn’t designed for flexibility — it’s designed for governance.

That rigidity works in environments where processes never change, but modern product development is fluid. New suppliers appear, new configurations are introduced, and digital threads cross multiple systems and organizations. A rigid data model can’t support this evolution.
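To make the rigidity concrete, here is a minimal sketch (table and column names are hypothetical, not from any specific PLM product) of what happens in a fixed relational schema: connecting parts to a new kind of entity — say, alternate suppliers — requires a schema migration and a new join table before a single row can be stored.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Original schema: parts and assembly structure only (hypothetical names).
cur.execute("CREATE TABLE part (id INTEGER PRIMARY KEY, number TEXT)")
cur.execute("""CREATE TABLE assembly_item (
    parent_id INTEGER REFERENCES part(id),
    child_id  INTEGER REFERENCES part(id),
    qty       INTEGER)""")

# A new requirement — track alternate suppliers per part — cannot be
# expressed with the existing tables; it forces a migration project:
cur.execute("CREATE TABLE supplier (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE part_supplier (
    part_id     INTEGER REFERENCES part(id),
    supplier_id INTEGER REFERENCES supplier(id))""")

cur.execute("INSERT INTO part (number) VALUES ('PN-100')")
cur.execute("INSERT INTO supplier (name) VALUES ('Acme')")
cur.execute("INSERT INTO part_supplier VALUES (1, 1)")

row = cur.execute("""SELECT p.number, s.name
                     FROM part p
                     JOIN part_supplier ps ON ps.part_id = p.id
                     JOIN supplier s ON s.id = ps.supplier_id""").fetchone()
print(row)  # ('PN-100', 'Acme')
```

In a multi-customer deployment, every such migration is repeated per instance — which is exactly why each installation becomes a project in itself.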

3. Hosting ≠ Cloud

Many traditional PLM vendors now claim to be “cloud-based.” But in reality, they’re hosted — meaning they run the same old monolithic platform on a server in someone else’s data center. It’s still single-tenant, still complex, and still expensive to maintain.

This is a critical distinction. Hosting is not cloud-native. A true cloud-native PLM must be multi-tenant, data-driven, and continuously updated — the same way Google Docs or Salesforce operate.

Most SMBs live in that SaaS world. They expect instant access, automatic updates, and flexible pricing. But monolithic PLMs can’t deliver this experience. Their architecture — built for one customer per instance — collapses under modern SaaS economics.

As I’ve written before, “Monolithic PLM belongs to the age of centralized IT. Modern manufacturing lives in networks.”

The Architectural Shift — From Monoliths to Composable and Federated PLM

The evolution of PLM mirrors the broader transformation of enterprise software — from monolithic applications to composable platforms that scale through modular, connected services. Let’s trace that journey.

1st Generation: The On-Premise Monolith

This was the age of PDM + workflow. PLM systems like Teamcenter, Enovia, and Windchill were tightly coupled ecosystems that required dedicated servers and expert administrators. They provided structure and traceability but were inaccessible to smaller teams.

2nd Generation: Cloud Replicas of the Old World

In the 2010s, the industry shifted to “cloud.” But instead of reinventing data architecture, most vendors simply hosted their legacy systems. The UX improved, but the model didn’t. Costs remained high. Flexibility remained low. The systems still treated data as static tables instead of dynamic relationships.

3rd Generation: Composable, Data-Driven, Multi-Tenant Platforms

This is where OpenBOM stands. Instead of a single, monolithic codebase, OpenBOM delivers federated, graph-based data services that connect seamlessly to design, manufacturing, and supply chain systems. Here’s how OpenBOM breaks the old paradigm:

  • Graph Data Model:
    Unlike relational tables, OpenBOM models products as networks of relationships — items, assemblies, suppliers, costs, and revisions are all connected nodes in a graph. This allows data to evolve naturally as designs change and new relationships form.
  • Multi-Tenant SaaS Infrastructure:
    Built for the cloud from day one, OpenBOM delivers the same service to every customer, continuously updated, with scalable performance and security built in.
  • Real-Time Collaboration:
    Engineers and buyers can simultaneously edit BOMs in real time, just like in Google Sheets. This capability eliminates the need for check-in/check-out workflows, bringing modern collaboration into product data management.
  • xBOM Services:
    OpenBOM’s xBOM framework supports multiple views — engineering BOM, manufacturing BOM, procurement BOM — all linked through the same underlying graph.
    Users can model data from different perspectives without duplication or synchronization pain.
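As an illustration of the graph idea (a toy sketch, not OpenBOM’s actual implementation or API), a product can be modeled as nodes plus typed edges; different BOM views are then derived by filtering edge types, and roll-ups traverse the same graph — no duplicated data, no synchronization:

```python
# Minimal property-graph sketch of a product structure.
# All node and edge names are illustrative only.
nodes = {
    "bike":  {"type": "assembly"},
    "frame": {"type": "part", "cost": 120.0},
    "wheel": {"type": "part", "cost": 45.0},
    "acme":  {"type": "supplier"},
}
edges = [
    ("bike", "frame", {"rel": "uses", "qty": 1}),
    ("bike", "wheel", {"rel": "uses", "qty": 2}),
    ("wheel", "acme", {"rel": "sourced_from", "price": 39.5}),
]

def view(rel):
    """Derive a BOM 'view' by filtering edge types — same underlying graph."""
    return [(a, b, p) for a, b, p in edges if p["rel"] == rel]

def rollup(root):
    """Recursive cost roll-up over 'uses' edges."""
    total = nodes[root].get("cost", 0.0)
    for a, b, p in view("uses"):
        if a == root:
            total += p["qty"] * rollup(b)
    return total

print(rollup("bike"))        # 210.0 — frame + 2 wheels
print(view("sourced_from"))  # the procurement perspective of the same graph
```

Adding a new relationship type here means appending an edge, not migrating a schema — which is the essential difference from the relational model described earlier.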

This architecture allows SMBs to start small and scale fast. There’s no massive deployment phase, no consultants, and no proprietary database to maintain. OpenBOM grows with the team — from a few engineers managing CAD data to full digital-thread integration with procurement, vendors, and ERP.

Composable PLM doesn’t require a “big bang.” It evolves with the company’s maturity, connecting services as needed. That’s the power of flexibility — and it’s the future of PLM.

SMB Manufacturing Reality — What They Actually Need

For SMBs, manufacturing is increasingly digital, distributed, and collaborative.
They don’t have massive IT budgets or PLM administrators — they have engineers, buyers, and project managers wearing multiple hats.

What they need is a system that adapts to them, not the other way around.

Fencequip – Lean Manufacturing Meets Digital BOMs

Fencequip is a perfect example. They manufacture advanced fence equipment and needed better visibility into materials and costs. Before OpenBOM, they managed everything manually — Excel sheets, local files, and disconnected data.

OpenBOM gave them a live view of their product structures, materials, and costs, connecting engineering to procurement. Engineers could instantly see cost impact as designs changed, and purchasing could order based on accurate, validated data.

The result: fewer errors, leaner inventory, and faster production cycles.

CANMET Design – From Spreadsheets to Connected Product Costing

CanMet Design faced a similar challenge managing complex costing workflows tied to CAD models. OpenBOM integrated directly with SolidWorks, automatically synchronizing assemblies, part data, and cost attributes. Instead of reconciling spreadsheets manually, they now have instant roll-ups and live validation, dramatically improving quotation speed and accuracy.

The Common Thread

Both stories share a core theme: speed, affordability, and collaboration.

These companies didn’t need enterprise-scale PLM with hundreds of modules. They needed a flexible, real-time system that could connect engineering and procurement — something that felt as easy as a spreadsheet, but with the power of a database.

As I often write: “The future of PLM isn’t another platform. It’s a network of connected services built on open, intelligent data.”

The Broader Trend — PLM Meets the Network Economy

The next industrial transformation is not about single companies — it’s about networks of companies. Design, sourcing, manufacturing, and logistics now happen across distributed ecosystems. Suppliers and OEMs collaborate continuously, sharing product data in real time.

Monolithic PLM systems, built for single-enterprise control, simply don’t fit this model. They assume that all data lives within one company and one database. But in the real world, products are designed by one team, sourced from another, and built by a third — all using different tools and systems.

This is where OpenBOM’s federated digital thread provides a new model. It connects multiple tenants — each representing a company, a supplier, or a contractor — into a secure, graph-native network where data is shared by reference, not by duplication.

Each participant maintains ownership of their data while collaborating across boundaries. This is the foundation of the networked PLM economy — flexible, connected, and intelligent.
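A toy illustration of “shared by reference, not by duplication” (all class and method names are hypothetical): each tenant keeps its own record store, and a partner holds only a reference that resolves to the owner’s live data, so the owner’s updates are visible immediately and no stale copies exist.

```python
# Each tenant owns its records; partners hold references, not copies.
# Structure and names are illustrative only.
class Tenant:
    def __init__(self, name):
        self.name = name
        self.records = {}

    def publish(self, key, data):
        self.records[key] = data
        return (self.name, key)          # an opaque reference to share

class Network:
    def __init__(self):
        self.tenants = {}

    def register(self, tenant):
        self.tenants[tenant.name] = tenant

    def resolve(self, ref):
        owner, key = ref
        return self.tenants[owner].records[key]  # always the owner's live data

net = Network()
oem, supplier = Tenant("oem"), Tenant("supplier")
net.register(oem)
net.register(supplier)

ref = supplier.publish("PN-100", {"price": 39.5})
print(net.resolve(ref))                      # {'price': 39.5}

supplier.records["PN-100"]["price"] = 41.0   # the owner updates in place
print(net.resolve(ref))                      # {'price': 41.0} — no stale copy
```

A real federated network adds access control and audit on top of this, but the ownership model — data lives with its owner, references cross the boundary — is the core idea.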

Conclusion — The End of Monolithic PLM

We are witnessing the end of the monolithic era in PLM. The systems that once symbolized control now represent constraint.

SMBs — the backbone of modern manufacturing — need agility, collaboration, and scalability, not rigid workflows and decade-long implementations.

Monolithic PLM was built for large companies. SMB and SME manufacturers are looking for a modern PLM built for connection.

OpenBOM represents the natural evolution of PLM — a platform that replaces rigidity with openness, complexity with simplicity, and isolation with collaboration. It’s not a hosted version of the past — it’s a new architecture for the future.

Tomorrow’s PLM systems will not be defined by how much data they control, but by how well they connect data, people, and processes across networks.

REGISTER FOR FREE to check how OpenBOM can help. 

Best, Oleg 
