Waqas Ahmad — Software Architect & Technical Consultant - Available USA, Europe, Global


Specializing in: Distributed Systems, .NET Architecture, Cloud-Native Architecture, Azure Cloud Engineering, API Architecture, Microservices Architecture, Event-Driven Architecture, Database Design & Optimization

👋 Hi, I'm Waqas — a Software Architect and Technical Consultant specializing in .NET, Azure, microservices, and API-first system design.
I help companies build reliable, maintainable, and high-performance backend platforms that scale.

Experienced across engineering ecosystems shaped by Microsoft, the Cloud Native Computing Foundation, and the Apache Software Foundation.

Available for remote consulting (USA, Europe, Global) — flexible across EST, PST, GMT & CET.


Testing Strategies: Unit, Integration, and E2E In-Depth

Testing strategies: unit, integration, e2e; when to use which in .NET.


Introduction

This guidance applies when you are defining or refining a testing strategy (unit, integration, E2E) for a .NET system; it breaks down when your constraints differ, for example a legacy codebase with no seams for isolated testing. I've applied it in real projects and refined the takeaways over time (as of 2026).

Teams often end up with too many slow E2E tests or over-mocked unit tests, and lose either fast feedback or real integration confidence. This article covers the three main testing strategies in .NET: unit tests (fast, isolated), integration tests (components together), and end-to-end tests (full user flows), with xUnit examples and guidance on when to use each. For architects and tech leads, following the pyramid (many unit, some integration, few E2E) keeps feedback fast and catches the integration bugs that mocks miss; the article explains what to test at each level and how to fix flakiness.

See the .NET Architecture Guide for broader .NET patterns.

Decision Context

  • System scale: Any application that needs reliable behaviour; from a single service to many. Applies when you’re defining or refining a testing strategy (unit, integration, E2E).
  • Team size: Developers (and sometimes QA); someone must own test coverage, CI runs, and flakiness. Works when the team can run unit tests on every commit and integration/E2E on PR or main.
  • Time / budget pressure: Fits when you can invest in fast unit tests and selective integration/E2E; breaks down when everything is E2E and the suite takes hours—then rebalance toward the pyramid.
  • Technical constraints: .NET (xUnit, NUnit); in-memory or real DB for integration; real or mocked external services. Assumes you can run tests in CI and isolate flaky tests.
  • Non-goals: This article does not optimise for a specific framework only; it focuses on the pyramid and when to use unit vs integration vs E2E.

The testing pyramid

Level       | Speed                     | Scope                | Count
Unit        | Fast (milliseconds)       | Single class/method  | Many
Integration | Medium (seconds)          | Multiple components  | Some
E2E         | Slow (seconds to minutes) | Full system          | Few

Pyramid shape: Many unit tests at the base, fewer integration in the middle, fewest E2E at the top. Fast feedback from unit; confidence from integration and E2E.
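One way to make the pyramid operational in CI is to tag tests by level and filter them into separate pipeline stages. A minimal sketch using xUnit's `[Trait]` attribute (class and test names here are illustrative):

```csharp
using System.Threading.Tasks;
using Xunit;

public class DiscountTests
{
    // Unit level: milliseconds, no I/O; run on every commit via
    //   dotnet test --filter "Category=Unit"
    [Fact]
    [Trait("Category", "Unit")]
    public void TenPercentOff_OneHundred_IsNinety()
        => Assert.Equal(90m, 100m * 0.9m);
}

public class OrdersApiTests
{
    // Integration level: filtered into a later CI stage via
    //   dotnet test --filter "Category=Integration"
    [Fact]
    [Trait("Category", "Integration")]
    public async Task Health_Endpoint_Responds()
    {
        // ... spin up the app and call it, as in the integration examples below
        await Task.CompletedTask;
    }
}
```

Keeping the filter expressions in the pipeline definition (rather than in developers' heads) is what lets unit tests stay on every commit while slower levels run on PR or main.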

Unit tests

Unit tests verify a single unit (class, method) in isolation. Dependencies are mocked.

Characteristics:

  • Fast (milliseconds)
  • No external dependencies (DB, network, file system)
  • Test business logic and edge cases
  • Run on every commit

Example: testing a discount calculator

public class PercentageDiscount : IDiscountStrategy
{
    private readonly decimal _percentage;
    public PercentageDiscount(decimal percentage) => _percentage = percentage;
    public decimal Apply(decimal amount) => amount * (1 - _percentage / 100);
}

// Unit test
public class PercentageDiscountTests
{
    [Fact]
    public void Apply_TenPercent_ReturnsNinetyPercentOfAmount()
    {
        var discount = new PercentageDiscount(10);
        var result = discount.Apply(100m);
        Assert.Equal(90m, result);
    }

    [Theory]
    [InlineData(0, 100, 100)]
    [InlineData(50, 100, 50)]
    [InlineData(100, 100, 0)]
    public void Apply_VariousPercentages_ReturnsExpected(decimal pct, decimal amount, decimal expected)
    {
        var discount = new PercentageDiscount(pct);
        Assert.Equal(expected, discount.Apply(amount));
    }
}

Integration tests

Integration tests verify multiple components together: API + database, service + repository, etc.

Characteristics:

  • Slower (seconds)
  • Use real or test database (in-memory, container)
  • Test API endpoints, repositories, message handlers
  • Run in CI; may run on PR or main branch

Example: testing an API endpoint

public class OrdersControllerTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly HttpClient _client;

    public OrdersControllerTests(WebApplicationFactory<Program> factory)
    {
        _client = factory.CreateClient();
    }

    [Fact]
    public async Task GetOrder_ExistingId_ReturnsOk()
    {
        // Arrange: seed data in test setup
        
        // Act
        var response = await _client.GetAsync("/api/orders/1");
        
        // Assert
        response.EnsureSuccessStatusCode();
        var order = await response.Content.ReadFromJsonAsync<OrderDto>();
        Assert.NotNull(order);
        Assert.Equal("1", order.Id);
    }

    [Fact]
    public async Task GetOrder_NonExistingId_ReturnsNotFound()
    {
        var response = await _client.GetAsync("/api/orders/999999");
        Assert.Equal(HttpStatusCode.NotFound, response.StatusCode);
    }
}

Using Testcontainers for real DB:

public class DatabaseFixture : IAsyncLifetime
{
    private readonly MsSqlContainer _container = new MsSqlBuilder().Build();
    
    public string ConnectionString => _container.GetConnectionString();

    public async Task InitializeAsync() => await _container.StartAsync();
    public async Task DisposeAsync() => await _container.DisposeAsync();
}
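A test class might consume this fixture via `IClassFixture<DatabaseFixture>`, so the container starts once per test class rather than once per test. A sketch, where `OrderRepository`, `Order`, and their members are hypothetical stand-ins for your data-access code:

```csharp
public class OrderRepositoryTests : IClassFixture<DatabaseFixture>
{
    private readonly DatabaseFixture _db;

    // xUnit injects the shared fixture; the container is already started.
    public OrderRepositoryTests(DatabaseFixture db) => _db = db;

    [Fact]
    public async Task AddAsync_ThenGetById_RoundTrips()
    {
        // Hypothetical repository constructed against the containerised DB.
        var repo = new OrderRepository(_db.ConnectionString);

        await repo.AddAsync(new Order("order-1"));
        var loaded = await repo.GetByIdAsync("order-1");

        Assert.NotNull(loaded);
    }
}
```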

End-to-end tests

E2E tests verify full user flows: browser → API → database.

Characteristics:

  • Slowest (seconds to minutes)
  • Full system running
  • Test critical user journeys
  • Keep few and stable; run nightly or on main

Example: Playwright test

public class CheckoutFlowTests : PageTest
{
    [Test]
    public async Task User_CanCompleteCheckout()
    {
        await Page.GotoAsync("https://myapp.local/");
        await Page.ClickAsync("text=Add to Cart");
        await Page.ClickAsync("text=Checkout");
        await Page.FillAsync("#email", "test@example.com");
        await Page.ClickAsync("text=Place Order");
        await Expect(Page.Locator(".order-confirmation")).ToBeVisibleAsync();
    }
}

What to test at each level

Level       | What to test
Unit        | Business logic, calculations, edge cases, validation
Integration | API contracts, DB queries, message handlers
E2E         | Critical user journeys (login, checkout, signup)

Do not test everything at every level. Unit tests for logic; integration for wiring; E2E for flows.

Test organization

Folder structure:

tests/
  MyApp.UnitTests/
    Services/
      DiscountCalculatorTests.cs
    Domain/
      OrderTests.cs
  MyApp.IntegrationTests/
    Api/
      OrdersControllerTests.cs
    Repositories/
      OrderRepositoryTests.cs
  MyApp.E2ETests/
    Flows/
      CheckoutFlowTests.cs

Naming:

  • MethodName_Scenario_ExpectedResult
  • Should_DoSomething_When_Condition
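The two conventions side by side in xUnit form (test bodies elided; names are illustrative):

```csharp
public class OrderValidatorTests
{
    // MethodName_Scenario_ExpectedResult
    [Fact]
    public void Validate_EmptyCart_ReturnsError() { /* ... */ }

    // Should_DoSomething_When_Condition
    [Fact]
    public void Should_RejectOrder_When_CustomerIsBlocked() { /* ... */ }
}
```

Either works; pick one per codebase and enforce it in review so a failing test name reads as a sentence in the CI log.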

Mocking and test doubles

Type | Description                        | Use
Mock | Verifies calls were made           | Check interactions
Stub | Returns canned responses           | Provide data
Fake | Working implementation (in-memory) | Simplify tests

Example with Moq:

[Fact]
public async Task PlaceOrder_ValidOrder_SavesAndPublishes()
{
    // Arrange
    var mockRepo = new Mock<IOrderRepository>();
    var mockPublisher = new Mock<IEventPublisher>();
    var service = new OrderService(mockRepo.Object, mockPublisher.Object);

    // Act
    await service.PlaceOrderAsync(new PlaceOrderCommand("cust-1", new List<OrderLine>()));

    // Assert
    mockRepo.Verify(r => r.AddAsync(It.IsAny<Order>(), default), Times.Once);
    mockPublisher.Verify(p => p.PublishAsync(It.IsAny<OrderPlaced>()), Times.Once);
}
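For contrast with the mocks above, a fake is a small working implementation rather than a recorder of calls. A sketch of an in-memory repository, assuming `IOrderRepository` exposes the `AddAsync` seen in the Moq example plus a hypothetical `GetByIdAsync`:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// A fake: real behaviour in memory. Useful when many tests need a
// repository and verifying individual calls is not the point.
public class InMemoryOrderRepository : IOrderRepository
{
    private readonly Dictionary<string, Order> _orders = new();

    public Task AddAsync(Order order, CancellationToken ct = default)
    {
        _orders[order.Id] = order;
        return Task.CompletedTask;
    }

    public Task<Order?> GetByIdAsync(string id, CancellationToken ct = default)
        => Task.FromResult(_orders.TryGetValue(id, out var order) ? order : null);
}
```

A fake keeps tests focused on outcomes ("the order is retrievable after placing") instead of interactions ("AddAsync was called once"), which makes them less brittle when the service's internals change.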

Enterprise best practices

1. Follow the pyramid. Many unit, some integration, few E2E.

2. Run unit tests on every commit. Fast feedback.

3. Run integration tests on PR. Catch wiring issues before merge.

4. Run E2E tests nightly or on main. Slower but catch regressions.

5. Use test containers for real DB. More realistic than in-memory.

6. Keep tests independent. No shared state between tests.

7. Name tests clearly. Intent should be obvious from name.

8. Test behavior, not implementation. Do not over-mock; test outcomes.

Common issues

Issue                    | Cause                    | Fix
Slow tests               | Too many E2E, slow setup | More unit tests; parallelize
Flaky tests              | Timing, shared state     | Isolate; use deterministic data
Over-mocking             | Testing mocks, not code  | Test behavior; fewer mocks
No tests                 | Not prioritized          | Start with critical paths
Tests pass, bugs in prod | Wrong level of testing   | Add integration tests
Hard to maintain         | Brittle assertions       | Test outcomes, not implementation

Summary

Unit tests are fast and many for business logic; integration tests verify components together (API + DB, contracts); E2E tests are slow and few for critical user journeys—follow the pyramid. Relying only on unit tests with heavy mocks or putting everything in E2E leads to missed integration bugs or a slow, flaky suite; balancing the three gives fast feedback and real confidence. Next, add or rebalance: more unit tests for logic, integration tests for main API/DB paths, and a small stable set of E2E for critical flows; isolate and fix flakiness with deterministic data.

Position & Rationale

I use unit tests for business logic and pure functions—fast, many, run on every commit; I mock external dependencies so we’re testing one unit. I use integration tests for API + DB, service + repository, or any combination that must work together; I run them on PR or main, not on every commit, and I use a real or in-memory DB so we catch contract and query issues. I use E2E tests sparingly for the few critical user journeys (e.g. login, checkout); they’re slow and brittle, so I keep the set small and stable. I avoid over-mocking in unit tests—if the test only verifies that a mock was called, we’re not testing behaviour. I don’t put everything in E2E; the pyramid exists so we get fast feedback from unit and integration and use E2E for confidence, not coverage.

Trade-Offs & Failure Modes

Unit tests are fast but can miss integration bugs; integration tests catch more but are slower and need DB or services; E2E tests give confidence but are slow and flaky. Too many E2E and the suite takes too long; too many mocks and we test the wrong thing. Failure modes: flaky tests (timing, shared state)—isolate and use deterministic data; tests that pass but production fails (wrong level of testing)—add integration tests for the path that broke; brittle tests (implementation-coupled)—test outcomes, not internals.

What Most Guides Miss

Most guides describe the pyramid but don’t stress that integration tests are where many bugs are caught—unit tests with mocks can pass while the real DB or API contract fails. Another gap: what to test at each level—unit = business rules and pure logic; integration = “does this API + DB work?”; E2E = “can the user complete this flow?” Flakiness is often attributed to E2E only, but shared state or timing in integration tests can cause flakiness too; isolate tests and avoid order-dependent state.

Decision Framework

  • If testing business logic → Unit test with mocks; keep dependencies minimal so the unit is real logic.
  • If testing that components work together (API + DB, service + repo) → Integration test with real or in-memory dependencies.
  • If testing a critical user journey → E2E, but keep the set small; run on PR or nightly.
  • For flakiness → Isolate tests (no shared state); use deterministic data; fix or quarantine flaky tests quickly.
  • For coverage → Prefer many unit tests, enough integration to cover main paths, few E2E for critical flows.

Key Takeaways

  • Unit = fast, many, business logic; integration = API + DB, contracts; E2E = few, critical journeys.
  • Follow the pyramid: more unit, some integration, few E2E.
  • Avoid over-mocking (testing mocks, not behaviour); add integration tests for paths that matter.
  • Isolate tests and fix flakiness; test outcomes, not implementation.

When I Would Use This Again — and When I Wouldn’t

I’d use unit tests again for all non-trivial business logic—fast feedback on every commit. I’d use integration tests again for API and DB and for any boundary where the contract matters. I’d use E2E again only for the few critical flows that must not break. I wouldn’t rely only on unit tests with heavy mocks and no integration tests; real integration bugs will slip through. I also wouldn’t grow the E2E suite without bound; keep it small and stable so it stays runnable and meaningful.

Frequently Asked Questions

What is a unit test?

A unit test verifies a single class or method in isolation with mocked dependencies. Fast; runs on every commit.

What is an integration test?

An integration test verifies multiple components together (API + DB, service + repository). Slower; runs on PR or main.

What is an E2E test?

An E2E test verifies a full user flow (browser → API → DB). Slowest; covers critical journeys and typically runs nightly.

How many tests at each level?

Pyramid: Many unit, some integration, few E2E. Fast feedback from unit; confidence from higher levels.

What should I test with unit tests?

Business logic, calculations, validation, edge cases. Anything that does not need external systems.

What should I test with integration tests?

API endpoints, database queries, message handlers. Components working together.

What should I test with E2E?

Critical user journeys: login, checkout, signup. The paths users actually take.

How do I mock dependencies?

Use Moq, NSubstitute, or similar. Create fake implementations for complex dependencies.

Should I use in-memory DB or real DB?

Real DB (via containers) is more realistic. In-memory is faster but may miss issues.

How do I organize tests?

Separate projects: UnitTests, IntegrationTests, E2ETests. Mirror source structure.

What is a flaky test?

A flaky test passes sometimes and fails sometimes. Common causes are timing, shared state, or external dependencies.

How do I fix flaky tests?

Isolate tests, use deterministic data, avoid timing dependencies, parallelize carefully.

Should I mock the database?

For unit tests, mock repository interface. For integration tests, use real or test DB.

What is test coverage?

Percentage of code executed by tests. Useful metric but do not chase 100%; focus on critical paths.

How do I test async code?

Use async Task test methods. Await the code under test. xUnit and NUnit support async.
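A minimal shape in xUnit (`OrderService` and `GetOrderAsync` are illustrative names):

```csharp
public class OrderServiceAsyncTests
{
    [Fact]
    public async Task GetOrderAsync_ExistingId_ReturnsOrder()
    {
        var service = new OrderService(/* dependencies */);

        // Await the async method directly; avoid .Result or .Wait(),
        // which can deadlock and wrap failures in AggregateException.
        var order = await service.GetOrderAsync("order-1");

        Assert.NotNull(order);
    }
}
```

The test method returns `async Task`, not `async void`; the framework awaits the task, so exceptions and assertion failures are reported correctly.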
