Practical Architectural Tests

Automated enforcement of architectural rules - because architecture that depends on human vigilance will erode.

The Erosion Problem

You’ve defined the layers. You’ve established the dependency rules. The team understands the architecture. Everything is clean.

Then someone, under deadline pressure, adds a using Services.Audio statement inside a Core class. It’s small. It works. Code review misses it (or sees it and lets it slide - “we’ll fix it later”). The next developer sees the precedent and adds another cross-layer reference. Then another.

Within months, the layered architecture that was carefully designed exists only in documentation. The actual code has eroded into a tangle of cross-cutting dependencies.

This is not a discipline problem. It’s a systems problem. Architecture that depends on human vigilance will erode. Architecture that is verified by automated tests will hold.

What Architectural Tests Do

Architectural tests are static analysis checks that verify structural properties of your codebase: which assemblies reference which, which types inherit from what, where interfaces live.

These tests don’t run your code. They analyze the structure of your code - assemblies, namespaces, type references, inheritance hierarchies - and fail if they find violations.

How They Work

Architectural tests use reflection or static analysis to inspect compiled assemblies (or source files) and verify rules. The mechanics vary by language and engine, but the concept is universal.

Example: Layer Dependency Check

Rule: The Core assembly must not reference any type from the Use Cases,
      Controllers, Services & Views, or Engine assemblies.

Implementation:
  1. Load the Core assembly
  2. For each type in the assembly, inspect all referenced types
  3. If any referenced type belongs to a forbidden assembly, fail

Output (on violation):
  FAIL: Core.PlayerHealth references Services.AudioManager
        Core/PlayerHealth.cs, line 47
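The same three steps translate to source-level analysis in other ecosystems. As a language-agnostic illustration, here is a minimal Python sketch that parses a file's imports with the standard ast module; the layer module names (services, controllers, and so on) are assumptions made up for the example:

```python
import ast

# Layers that Core may never import (illustrative module names).
FORBIDDEN_FOR_CORE = {"use_cases", "controllers", "services", "views", "engine"}

def layer_violations(source: str, path: str = "<memory>") -> list[str]:
    """Parse one Core source file and report any import of a forbidden layer."""
    violations = []
    for node in ast.walk(ast.parse(source)):
        # Collect module names from both `import x` and `from x import y` forms.
        if isinstance(node, ast.Import):
            modules = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            modules = [node.module]
        else:
            continue
        for module in modules:
            if module.split(".")[0] in FORBIDDEN_FOR_CORE:
                violations.append(
                    f"FAIL: {path}, line {node.lineno}: Core imports {module}"
                )
    return violations
```

Run this over every file under the Core directory and fail the test suite if the combined list is non-empty.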

Example: Use Case Interface Enforcement

Rule: All public types in a feature's UseCases namespace that are
      referenced by other features must be interfaces.

Implementation:
  1. Find all types in UseCases namespaces that are used across features
  2. For each such type, check if it's an interface
  3. If any cross-feature dependency targets a concrete class, fail

Output (on violation):
  FAIL: Quests.UseCases.CompleteQuestUseCase is referenced by Inventory.
        Cross-feature dependencies must target interfaces, not implementations.

Example: Engine Isolation

Rule: No type in Core or UseCases may reference engine-specific types
      (MonoBehaviour, Actor, Node, etc.)

Implementation:
  1. Load Core and UseCases assemblies
  2. For each type, check base classes and field types
  3. If any reference an engine type, fail

Output (on violation):
  FAIL: Core.QuestSystem inherits from MonoBehaviour
        Core must not contain engine-specific types.
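In the same sketch style, the inheritance half of this rule can be checked statically. The engine type names come from the rule above; everything else is an illustrative assumption:

```python
import ast

# Engine base classes that must never appear in Core or UseCases.
ENGINE_TYPES = {"MonoBehaviour", "Actor", "Node"}

def engine_inheritance_violations(source: str, path: str = "<memory>") -> list[str]:
    """Flag any class in the given source file that inherits from an engine type."""
    violations = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.ClassDef):
            continue
        for base in node.bases:
            # Handle both bare names (MonoBehaviour) and qualified ones
            # (engine.MonoBehaviour).
            name = base.attr if isinstance(base, ast.Attribute) else getattr(base, "id", "")
            if name in ENGINE_TYPES:
                violations.append(f"FAIL: {path}: {node.name} inherits from {name}")
    return violations
```

A full implementation would also inspect field and parameter types, as step 2 describes; the inheritance check alone already catches the most common violation.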

Tools and Approaches

Reflection-Based (Runtime)

Write test methods that use your language’s reflection capabilities to inspect assemblies at test time. This runs as part of your normal test suite.

// C# / .NET example using NUnit
[Test]
public void Core_Should_Not_Reference_Outer_Layers() {
    var coreAssembly = typeof(PlayerHealth).Assembly;
    var forbiddenAssemblies = new[] {
        typeof(CompleteQuestUseCase).Assembly,  // Use Cases
        typeof(QuestLogController).Assembly,    // Controllers
        typeof(AudioService).Assembly           // Services & Views
    };

    foreach (var type in coreAssembly.GetTypes()) {
        foreach (var referencedType in GetAllReferencedTypes(type)) {
            Assert.That(
                !forbiddenAssemblies.Contains(referencedType.Assembly),
                $"{type.Name} references {referencedType.Name} from an outer layer"
            );
        }
    }
}

// Gathers the types a type depends on through its signatures: base type,
// implemented interfaces, and field, property, and method types.
private static IEnumerable<Type> GetAllReferencedTypes(Type type) {
    const BindingFlags All = BindingFlags.Public | BindingFlags.NonPublic |
                             BindingFlags.Instance | BindingFlags.Static |
                             BindingFlags.DeclaredOnly;
    if (type.BaseType != null) yield return type.BaseType;
    foreach (var i in type.GetInterfaces()) yield return i;
    foreach (var f in type.GetFields(All)) yield return f.FieldType;
    foreach (var p in type.GetProperties(All)) yield return p.PropertyType;
    foreach (var m in type.GetMethods(All)) {
        yield return m.ReturnType;
        foreach (var param in m.GetParameters()) yield return param.ParameterType;
    }
}

Advantages: uses your existing test framework; no additional tooling.
Disadvantages: limited to what reflection can see - a dependency that exists only inside a method body (a local new AudioService(), say) lives in IL rather than in type signatures, so signature-level reflection misses it.

Dedicated Architecture Testing Libraries

Some ecosystems have libraries purpose-built for architectural testing: ArchUnit for Java and Kotlin, ArchUnitNET and NetArchTest for .NET, and dependency-cruiser for JavaScript and TypeScript, among others.

These provide richer rule definitions and better error messages than hand-rolled reflection.

// ArchUnitNET example
[Test]
public void Core_Layer_Rules() {
    IArchRule rule = Types()
        .That().ResideInNamespace("Core")
        .Should().NotDependOnAny(
            Types().That().ResideInNamespace("UseCases")
                .Or().ResideInNamespace("Controllers")
                .Or().ResideInNamespace("Services")
                .Or().ResideInNamespace("Views")
        );

    rule.Check(Architecture);
}

Build-Time Analysis

Some languages support build-time dependency checking: separate compilation units with explicit dependency declarations (project references in .NET, modules in Java's JPMS, crates in Rust) turn a forbidden reference into a compile error rather than a test failure.

This is the strongest form of enforcement - violations are impossible, not just detected.
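In .NET, for instance, this is just project structure: if the Core project declares no references to outer-layer projects, a cross-layer using in Core will not compile. A sketch of the idea (project name and layout are illustrative):

```xml
<!-- Core.csproj - deliberately references nothing, so Core cannot
     compile a dependency on any outer layer. -->
<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <!-- intentionally empty: Core depends only on itself -->
  </ItemGroup>
</Project>
```

Controllers.csproj would then list ProjectReference entries for Core and UseCases only, and the compiler enforces the rest.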

Custom Linting Rules

For projects with specific conventions, custom lint rules can enforce patterns: Roslyn analyzers in C#, ESLint plugins in JavaScript and TypeScript, or custom detekt rules in Kotlin can flag violations in the editor, before the code even compiles.

What Rules to Enforce

Start with the critical rules that, if violated, undermine the entire architecture:

Essential Rules

  1. Core references nothing outside Core - no Use Cases, no Controllers, no Services, no engine, no UI framework
  2. Use case interfaces reference only Core - no Controllers, no Services, no engine
  3. Use case implementations reference only Core and use case interfaces - no Controllers, no Services & Views, no engine
  4. Controllers reference only Core and Use Cases - no Services & Views, no engine
  5. No engine types in Core, Use Cases, or Controllers - no MonoBehaviour, no Actor, no Node
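Because these five rules amount to an allowed-dependency table, they can be encoded as data and checked mechanically against whatever dependency graph your tooling extracts. A minimal Python sketch (layer names are illustrative):

```python
# Allowed dependencies per layer, transcribed from the essential rules above.
ALLOWED = {
    "core": set(),
    "use_case_interfaces": {"core"},
    "use_cases": {"core", "use_case_interfaces"},
    "controllers": {"core", "use_case_interfaces", "use_cases"},
}

def check_edges(edges):
    """edges: (from_layer, to_layer) pairs extracted from the codebase."""
    return [
        f"FAIL: {src} may not depend on {dst}"
        for src, dst in edges
        if src in ALLOWED and dst != src and dst not in ALLOWED[src]
    ]
```

Adding a rule later becomes a one-line change to the table rather than a new hand-written test.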

Valuable Rules

  1. Controller interfaces live in Use Cases, not Controllers - maintaining the dependency inversion
  2. No cross-feature references except through use case interfaces - features interact through public contracts only
  3. No circular dependencies between features - if feature A uses feature B’s use case interfaces, feature B must not also depend on feature A
  4. Event properties are initialized inline - not in constructors (prevents emission during construction)

Project-Specific Rules

As your project matures, you’ll identify patterns worth enforcing. Any convention that keeps surfacing in code review - a naming scheme, a required attribute, a forbidden API - is a candidate for a new automated rule.

Integrating with CI/CD

Architectural tests should run on every commit, alongside unit tests:

Build Pipeline:
1. Compile
2. Run unit tests
3. Run architectural tests <- catches layer violations
4. Run integration tests
5. Deploy (if all pass)
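Concretely, in a .NET pipeline the architectural step can be an ordinary filtered test run. This sketch assumes the tests are tagged with NUnit Category attributes named Unit, Architecture, and Integration:

```shell
# CI fragment (sketch): each step must succeed before the next runs.
dotnet build --configuration Release
dotnet test --filter "TestCategory=Unit"
dotnet test --filter "TestCategory=Architecture"   # layer rules block the build here
dotnet test --filter "TestCategory=Integration"
```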

A failed architectural test should block the build just like a failed unit test. This makes the architecture a living, enforced property of the codebase rather than an aspiration.

The Cultural Effect

Beyond their direct value, architectural tests have a powerful cultural effect. They communicate that architecture matters - that it’s not optional, not aspirational, and not subject to “we’ll fix it later.”

When a developer writes code that violates a layer rule, the test fails immediately. The developer must understand and fix the violation before their code can be merged. This creates a constant, gentle pressure toward architectural compliance that no amount of documentation or code review can match.

Over time, the team internalizes the rules. The architectural tests catch fewer violations - not because they’re less useful, but because the team has learned the patterns. The tests become a safety net rather than a daily obstacle.

Getting Started

  1. Start with one rule - “Core must not reference any outer layer.” Get it running in your test suite.
  2. Fix existing violations - there will be some. Fix them before enabling the test.
  3. Add rules incrementally - one new rule per sprint, fixing violations as you go.
  4. Run in CI - make violations block the build.
  5. Document why - when a test fails, the error message should explain not just what is wrong but why the rule exists.

The Architecture Model defines the rules. Architectural tests make them real.