
Shopify

Shopify's Modular Monolith: Avoiding Microservices Complexity

How Shopify tamed a 2M-line Rails codebase without microservices

Key outcome: Zero microservices overhead

The Situation

In 2019, Shopify's core platform was a single Ruby on Rails application — over 2 million lines of code, deployed hundreds of times per day by hundreds of engineers.

The conventional advice for a codebase at this size and age: break it into microservices. Separate deploy pipelines, autonomous teams, independent scaling.

Shopify considered it. They chose not to.

Instead, they built a modular monolith — enforced module boundaries within the same deployable unit. This decision shaped Shopify's architecture for the following five years and became an influential case study in the industry.


Why Not Microservices?

Shopify's engineering leadership ran a clear-eyed cost/benefit analysis:

The Costs of Microservices at Their Stage

Distributed systems complexity. Every inter-service call becomes a network call: latency, timeouts, retries, circuit breakers, distributed tracing. A 5ms function call becomes a 50ms+ HTTP request with failure modes that don't exist in-process.
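To make that extra machinery concrete, here is a minimal plain-Ruby circuit-breaker sketch — all class and parameter names are invented for illustration. It shows the kind of failure handling an in-process function call never needs: after a run of consecutive failures, further calls fail fast instead of hitting the network.

```ruby
# Raised when the breaker refuses to attempt the call at all.
class CircuitOpenError < StandardError; end

# Minimal circuit breaker: after `failure_threshold` consecutive
# failures, calls fail fast until the breaker is reset.
class CircuitBreaker
  def initialize(failure_threshold: 3)
    @failure_threshold = failure_threshold
    @failures = 0
  end

  def open?
    @failures >= @failure_threshold
  end

  def call
    raise CircuitOpenError, "circuit open" if open?
    result = yield          # the actual (network) call
    @failures = 0           # success resets the failure count
    result
  rescue CircuitOpenError
    raise                   # fail-fast path: don't count as a new failure
  rescue StandardError
    @failures += 1          # remember the failure, then propagate it
    raise
  end
end
```

A real implementation would also add a timeout-based "half-open" state that lets a probe request through after a cooldown; this sketch keeps only the core idea.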

Operational overhead. Each service needs its own deploy pipeline, monitoring, alerting, on-call rotation, and SLOs. At 40+ services, the operational surface dwarfs the product surface.

Data consistency. Transactions that span service boundaries require sagas or 2PC. What was a simple ActiveRecord::Base.transaction block becomes a distributed workflow with compensation logic.
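To make the contrast concrete, here is a toy plain-Ruby saga runner — all step and method names are invented for illustration. Each step pairs an action with a compensation, and a failure unwinds the already-completed steps in reverse, which is exactly the logic a single in-process transaction gives you for free:

```ruby
# Toy saga: run steps in order; on failure, run the compensations of
# already-completed steps in reverse order.
class Saga
  Step = Struct.new(:name, :action, :compensation)

  def initialize
    @steps = []
  end

  def step(name, action:, compensation:)
    @steps << Step.new(name, action, compensation)
    self
  end

  def run
    completed = []
    @steps.each do |s|
      s.action.call
      completed << s
    end
    :ok
  rescue StandardError => e
    completed.reverse_each { |s| s.compensation.call }
    [:rolled_back, e.message]
  end
end

log = []
result = Saga.new
  .step(:charge, action: -> { log << :charged },
                 compensation: -> { log << :refunded })
  .step(:ship,   action: -> { raise "carrier down" },
                 compensation: -> { log << :label_voided })
  .run
# result is [:rolled_back, "carrier down"]; log is [:charged, :refunded]
```

Note everything a production saga still needs on top of this: persistence of progress, idempotent retries, and compensations that can themselves fail.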

Deployment coupling. Despite promises of independent deployability, services that share data contracts often need coordinated deployments anyway. Teams spend more time negotiating API contracts than building features.

The real cost: Shopify was shipping features extremely fast with their monolith. Microservices would have slowed them down for at least 2-3 years while the migration was underway.

The Actual Problem

The real problem wasn't the monolith. It was lack of boundaries within the monolith. Engineers could call any code from anywhere. Any class could depend on any other class. Business logic leaked everywhere.

The result: spaghetti. Changing one feature required understanding half the codebase. A refactor in payments accidentally broke shipping calculations.

The fix wasn't microservices — it was discipline within the monolith.


The Solution: Componentized Rails

Shopify introduced a concept they called "components" — a set of conventions and tools enforcing module boundaries:

1. Explicit Package Boundaries

Each component became a separate Rails engine (Rails' mechanism for encapsulating a sub-application inside a larger app). Components had:

  • Their own app/ directory
  • Their own migrations
  • Their own test suite
  • A declared public API

RUBY
# Component structure example
components/
  payments/
    app/
      models/
      services/
      controllers/  # internal only
    lib/
      payments/
        public_api.rb  # the only callable surface from outside
    spec/
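A hypothetical sketch of what such a public_api.rb facade might contain — the module and method names here are invented, since Shopify's actual API surface is internal. The point is that this module is the only constant other components may reference:

```ruby
# components/payments/lib/payments/public_api.rb (hypothetical sketch)
module Payments
  # The only constant other components are allowed to call.
  module PublicApi
    extend self

    # Callers never see Payments' internal models or services.
    def charge(order_id:, amount_cents:)
      # A real implementation would delegate to internal service objects;
      # this stub just returns a result shaped like one.
      { order_id: order_id, amount_cents: amount_cents, status: :charged }
    end
  end
end
```

Because callers only ever see this facade, the payments team can restructure everything behind it without breaking anyone.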

2. Violation Detection

They wrote a static analysis tool that runs in CI:

RUBY
# Simplified concept of what the checker does.
# Any cross-component call not going through public_api.rb = build failure.
class BoundaryViolation < StandardError; end

module ComponentBoundaryChecker
  # Which component owns each top-level namespace; in the real tool this
  # is derived from the component manifests (names here are illustrative).
  COMPONENT_OF = { "Payments" => "payments", "Shipping" => "shipping" }

  def self.check(calling_component, callee_constant)
    callee_component = component_for(callee_constant)
    return if calling_component == callee_component
    return if public_api?(callee_constant, callee_component)

    raise BoundaryViolation,
      "#{calling_component} cannot directly call #{callee_constant}. " \
      "Use #{callee_component}::PublicApi instead."
  end

  def self.component_for(constant)
    COMPONENT_OF.fetch(constant.to_s.split("::").first)
  end

  def self.public_api?(constant, _component)
    constant.to_s.split("::")[1] == "PublicApi"
  end
end

Boundary violations fail the build. This made the rule real — not just a convention that eroded under deadline pressure.

3. Dependency Graph

With explicit boundaries, Shopify could generate a dependency graph of components. This revealed:

  • Which components were depended on by many others (high coupling risk)
  • Which components had circular dependencies (a maintenance nightmare)
  • Where the "God component" anti-patterns had emerged

The Outcome

Developer Velocity Maintained

Shopify continued shipping at high velocity. There was no 18-month "let's break apart the monolith" migration. New features were built inside components from day one.

Autonomous Teams

Teams owned specific components. Because components had declared public APIs, teams could change internal implementation freely without affecting callers. This gave teams the autonomy microservices promise — without the operational overhead.

Deployment Simplicity

One application. One deploy. One set of monitoring. Rolling deployments, canary releases, and rollbacks applied to the whole application, not 40 independent services.

When They Did Extract Services

Over time, Shopify did extract some functionality into separate services — but only where there was a compelling reason:

  • Different scaling characteristics (image processing at high parallelism)
  • Different language requirements (ML inference in Python)
  • Vendor/partner boundary (Shopify Payments running on different compliance infrastructure)

The monolith wasn't dogma. It was the default, with service extraction as a deliberate, case-by-case exception rather than a reflex.


The Architecture Pattern

Shopify Monolith
│
├── components/
│   ├── storefront/      # Customer-facing shop
│   │   └── public_api.rb
│   ├── payments/        # Payment processing
│   │   └── public_api.rb
│   ├── fulfillment/     # Order fulfillment
│   │   └── public_api.rb
│   ├── analytics/       # Reporting
│   │   └── public_api.rb
│   └── platform/        # Shared primitives
│       └── public_api.rb
│
└── (enforced: cross-component calls via public_api only)

Each component is independently testable, independently owned, and independently evolvable — while sharing one database, one deploy, and one set of infrastructure.


Comparison: Modular Monolith vs Microservices

| Dimension | Modular Monolith | Microservices |
|-----------|------------------|---------------|
| Deploy complexity | One deploy | One deploy per service |
| Cross-boundary calls | In-process (fast) | Network (slow, fallible) |
| Data consistency | ACID transactions | Eventual consistency / sagas |
| Team autonomy | High (with enforcement) | High |
| Operational overhead | Low | High |
| Independent scaling | Limited | Full |
| Migration cost | Low | High |
| Best for | Strong teams, fast shipping | Different scaling needs, different tech stacks |


The Lesson

Microservices solve an organisational problem — multiple teams needing independent deploy pipelines and ownership. But they introduce a distributed systems problem in exchange.

If your actual problem is code coupling and unclear ownership, you can solve that within a monolith through:

  1. Enforced module boundaries — tools that make violations fail CI
  2. Explicit public APIs — the only callable surface for cross-module communication
  3. Dependency graphs — visibility into coupling to catch architectural drift early
  4. Team ownership — assign each module to a team, not just a folder
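As a toy illustration of item 1 — with all component and constant names invented — a first-pass checker can be as simple as scanning source text for references to another component's internals:

```ruby
# Components in this toy example; their Ruby namespaces are the
# capitalized forms (Payments, Shipping).
COMPONENTS = %w[payments shipping].freeze

# Return [line_number, offending_constant] pairs: any reference to
# another component's namespace that is not its PublicApi facade.
def boundary_violations(component, source)
  offenders = []
  source.each_line.with_index(1) do |line, lineno|
    (COMPONENTS - [component]).each do |other|
      namespace = other.capitalize
      line.scan(/#{namespace}::(\w+)/) do |(inner)|
        offenders << [lineno, "#{namespace}::#{inner}"] unless inner == "PublicApi"
      end
    end
  end
  offenders
end

src = <<~SRC
  total = Payments::PublicApi.charge(order_id: 1, amount_cents: 500)
  Payments::ChargeService.call(order_id: 1)
SRC

boundary_violations("shipping", src)
# => [[2, "Payments::ChargeService"]]
```

A production tool would parse the code properly rather than pattern-match text, but wiring even this into CI makes the boundary rule enforceable rather than aspirational.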

You gain team autonomy, independent evolution, and clear ownership — without the operational overhead of distributed systems.

Shopify's architecture is living proof that scale (billions in GMV, hundreds of engineers) is achievable without microservices.


Further Reading

  • Shopify Engineering Blog: "Deconstructing the Monolith" (2019)
  • "The Majestic Monolith" by DHH (2016)
