
Revision 88f869e3cb11807146fe9ad35f2f209dfaab7a2b

README.md

How to use

🍎 Linux / macOS

Run the following command in your terminal. Replace MyProjectName with your desired solution name.

# Syntax: curl [url] | bash -s -- [ProjectName]
curl -fsSL https://opengist.rmrf.online/weehong/7548f80484b4419b9ba93ca45e784638/raw/HEAD/scaffold.sh | bash -s -- MyCleanApp

# Alternative script (takes no argument)
bash -c "$(curl -fsSL https://opengist.rmrf.online/weehong/7548f80484b4419b9ba93ca45e784638/raw/HEAD/scaffold_project.sh)"

🪟 Windows (PowerShell)

Run this command to scaffold the solution.

# Syntax: irm [url] | % { & ([scriptblock]::Create($_)) [ProjectName] }
irm "https://opengist.rmrf.online/weehong/7548f80484b4419b9ba93ca45e784638/raw/HEAD/scaffold.ps1" | % { & ([scriptblock]::Create($_)) Tidverse }
dotnet-10-clean-architecture-boilerplate-guide.md

The Ultimate .NET 10 Clean Architecture Boilerplate: Step-by-Step Implementation Guide

This guide is an exhaustive, detail-oriented manual for recreating this .NET 10 Clean Architecture and CQRS boilerplate from scratch. It documents every file in the repository — from root-level configuration to Docker setups, middleware, and test scaffolding — so you can understand not just what each file does, but why it exists.

How to Use This Guide

  • Prose uses {ProjectName} as a placeholder. Replace it with your actual project name (e.g., {Projects}, Inventory, Payments).
  • Code blocks use {Projects} as the concrete example — the actual project this guide was written for.
  • Sections are ordered by dependency: set up root config first, then work inward from Domain → Application → Infrastructure → API.

Table of Contents

  1. Architecture Overview
  2. Design Decisions & Trade-offs
  3. Prerequisites
  4. Solution & Project Scaffolding
  5. Root Configuration Files
  6. Project Files (.csproj)
  7. Docker & Dev Environment
  8. Layer 1: Domain
  9. Layer 2: Application
  10. Layer 3: Infrastructure
  11. Layer 4: API / Presentation
  12. Test Projects
  13. CI/CD Pipeline
  14. AI-Assisted Development
  15. Running the Project

1. Architecture Overview

Clean Architecture

Clean Architecture organizes code into concentric layers whose dependencies always point inward. The innermost layer (Domain) has zero external dependencies; each outer layer may reference only layers closer to the core.

┌──────────────────────────────────────────────────────────────┐
│                      API / Presentation                      │
│  Controllers · Middleware · Serilog · OpenAPI · Program.cs   │
│                                                              │
│  ┌──────────────────────────────────────────────────────┐    │
│  │                    Infrastructure                    │    │
│  │  EF Core DbContext · Repositories · Interceptors     │    │
│  │                                                      │    │
│  │  ┌──────────────────────────────────────────────┐    │    │
│  │  │                 Application                  │    │    │
│  │  │  Commands · Queries · Handlers · Behaviors   │    │    │
│  │  │  Validators · Mapping · DI Registration      │    │    │
│  │  │                                              │    │    │
│  │  │  ┌──────────────────────────────────────┐    │    │    │
│  │  │  │                Domain                │    │    │    │
│  │  │  │  Entities · Value Objects · Events   │    │    │    │
│  │  │  │  Result · Error · Abstractions       │    │    │    │
│  │  │  │  (ZERO external dependencies)        │    │    │    │
│  │  │  └──────────────────────────────────────┘    │    │    │
│  │  └──────────────────────────────────────────────┘    │    │
│  └──────────────────────────────────────────────────────┘    │
└──────────────────────────────────────────────────────────────┘

The Dependency Rule: Source-code dependencies must point inward. Nothing in an inner circle can reference anything in an outer circle. Concretely:

  • Domain references nothing.
  • Application references Domain only.
  • Infrastructure references Application (and transitively, Domain).
  • API references Application and Infrastructure.

This means the Domain and Application layers are fully testable without a database, web server, or any framework.

Project Dependency Graph

{ProjectName}.Api
├── {ProjectName}.Application
│   └── {ProjectName}.Domain        (zero NuGet deps)
└── {ProjectName}.Infrastructure
    └── {ProjectName}.Application
        └── {ProjectName}.Domain

{ProjectName}.Domain.Tests
└── {ProjectName}.Domain

{ProjectName}.Application.Tests
└── {ProjectName}.Application

{ProjectName}.IntegrationTests
└── {ProjectName}.Api

CQRS (Command Query Responsibility Segregation)

CQRS separates read operations (Queries) from write operations (Commands). Each operation is a standalone class that carries all the data it needs, and each has a dedicated handler. This gives you:

  • Single Responsibility — every handler does exactly one thing.
  • Explicit contracts — the request shape is the documentation.
  • Pipeline behaviors — cross-cutting concerns (logging, validation) are applied uniformly via MediatR's pipeline, not scattered through service classes.

In this boilerplate, CQRS is implemented through MediatR:

Controller                                    Handler
    │                                            ▲
    │  IMediator.Send(command)                   │
    ▼                                            │
┌──────────────────────────────────────────────────┐
│                 MediatR Pipeline                 │
│                                                  │
│  ┌─────────────────┐    ┌────────────────────┐   │
│  │ LoggingBehavior │───▶│ ValidationBehavior │───┼──▶ Handler
│  └─────────────────┘    └────────────────────┘   │
└──────────────────────────────────────────────────┘

Controllers never call services directly. They send a command or query to IMediator, which routes it through the pipeline behaviors and into the correct handler.
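As a sketch, a controller in this style reduces to a thin dispatch layer. The controller and route below are illustrative, and the Result members (IsSuccess, Value, Error) are assumed from the pattern described in this guide rather than copied from the repository:

```csharp
[ApiController]
[ApiVersion("1.0")]
[Route("api/v{version:apiVersion}/[controller]")]
public sealed class MatchesController(IMediator mediator) : ControllerBase
{
    [HttpPost]
    public async Task<IActionResult> Create(
        CreateMatchCommand command, CancellationToken cancellationToken)
    {
        // No service calls here: the command travels through the MediatR
        // pipeline (logging, validation) before reaching its handler.
        Result<Guid> result = await mediator.Send(command, cancellationToken);

        return result.IsSuccess
            ? CreatedAtAction(nameof(Create), new { id = result.Value }, result.Value)
            : BadRequest(result.Error);
    }
}
```

Because the controller only forwards requests, it needs no knowledge of repositories, the DbContext, or validators; swapping a handler's implementation never touches the presentation layer.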

The Result Pattern

Instead of throwing exceptions for expected failures (validation errors, not-found, conflicts), every handler returns a Result or Result<T>. This makes the success/failure path explicit in the type system:

// Handler returns Result<Guid>, not just Guid
public async Task<Result<Guid>> Handle(CreateMatchCommand request, CancellationToken ct)
{
    // Failure path — no exception thrown
    if (exists) return Result.Failure<Guid>(Error.Conflict);

    // Success path
    return Result.Success(match.Id);
}

The IValidationResult interface with a static abstract method enables the ValidationBehavior to create typed failure results without reflection — a zero-reflection, high-performance validation pipeline.
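A minimal sketch of the types involved (member and method names are plausible assumptions based on the pattern, not the repository's exact code):

```csharp
public sealed record Error(string Code, string Message)
{
    public static readonly Error None = new(string.Empty, string.Empty);
}

public class Result
{
    protected Result(bool isSuccess, Error error)
        => (IsSuccess, Error) = (isSuccess, error);

    public bool IsSuccess { get; }
    public bool IsFailure => !IsSuccess;
    public Error Error { get; }

    public static Result Success() => new(true, Error.None);
    public static Result<T> Success<T>(T value) => new(value, true, Error.None);
    public static Result<T> Failure<T>(Error error) => new(default!, false, error);
}

public sealed class Result<T> : Result
{
    private readonly T _value;

    internal Result(T value, bool isSuccess, Error error)
        : base(isSuccess, error) => _value = value;

    public T Value => IsSuccess
        ? _value
        : throw new InvalidOperationException("Cannot read the value of a failure result.");
}

// The static abstract member (C# 11+) is what makes the validation
// pipeline reflection-free: a behavior constrained to
// TResponse : IValidationResult<TResponse> can call
// TResponse.ValidationFailure(errors) to build a typed failure directly.
public interface IValidationResult<TSelf> where TSelf : IValidationResult<TSelf>
{
    static abstract TSelf ValidationFailure(Error[] errors);
}
```

The alternative, `Activator.CreateInstance` or a reflection-based factory, would allocate and resolve members at runtime on every validation failure; the static abstract member moves that resolution to compile time.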

MediatR Pipeline Behaviors

Pipeline behaviors are middleware that wrap every MediatR request. They execute in registration order, forming a chain:

  1. LoggingBehavior — logs the request name and execution time.
  2. ValidationBehavior — runs all FluentValidation validators for the request. If any fail, it short-circuits and returns a Result.Failure without ever reaching the handler.

You can add more behaviors (e.g., authorization, caching, transaction management) by registering additional IPipelineBehavior<,> implementations.
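As an example, a logging behavior can be sketched as follows. This is a plausible shape assuming MediatR's IPipelineBehavior API; the repository's implementation may differ in naming and detail:

```csharp
public sealed class LoggingBehavior<TRequest, TResponse>(
    ILogger<LoggingBehavior<TRequest, TResponse>> logger)
    : IPipelineBehavior<TRequest, TResponse>
    where TRequest : notnull
{
    public async Task<TResponse> Handle(
        TRequest request,
        RequestHandlerDelegate<TResponse> next,
        CancellationToken cancellationToken)
    {
        var requestName = typeof(TRequest).Name;
        var stopwatch = Stopwatch.StartNew();

        logger.LogInformation("Handling {RequestName}", requestName);

        // next() invokes the rest of the chain: any remaining behaviors,
        // then the handler itself.
        var response = await next();

        logger.LogInformation(
            "Handled {RequestName} in {ElapsedMs} ms",
            requestName, stopwatch.ElapsedMilliseconds);

        return response;
    }
}
```

Behaviors are registered as open generics inside the AddMediatR configuration (e.g. cfg.AddOpenBehavior(typeof(LoggingBehavior<,>))), and registration order determines execution order.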


2. Design Decisions & Trade-offs

| Decision | Why | Alternatives Considered |
| --- | --- | --- |
| Central Package Management (CPM) | A single Directory.Packages.props controls all NuGet versions; no version drift between projects | Per-project Version attributes on PackageReference items |
| Result pattern over exceptions | Makes success/failure explicit in return types; eliminates try/catch ceremony in callers; no stack-trace overhead for expected failures | Throwing domain exceptions; OneOf<T> discriminated unions |
| MediatR for CQRS | Decouples controllers from handlers; pipeline behaviors give free cross-cutting concerns; widely adopted in the .NET ecosystem | Hand-rolled mediator; direct service injection; Wolverine |
| static abstract on IValidationResult | Enables ValidationBehavior to create typed Result.Failure without reflection or Activator.CreateInstance | Reflection-based factory; generic constraints with new() |
| Manual mapping (extension methods) | Zero magic, fully debuggable, no hidden runtime behavior; keeps mapping close to the feature that uses it | AutoMapper; Mapster |
| FluentValidation | Declarative, composable rules; integrates cleanly with the MediatR pipeline | Data Annotations; hand-rolled validation |
| Serilog | Structured logging with a rich sink ecosystem; configuration-driven via appsettings.json | Built-in ILogger with console provider; NLog; log4net |
| EF Core interceptors | Keeps auditing (CreatedAt/UpdatedAt) and domain event dispatch out of the DbContext, making them composable and testable | Overriding SaveChangesAsync directly; domain event outbox pattern |
| API versioning (URL path segment) | Non-breaking evolution of APIs; the URL path (/api/v1/) is the most explicit and cache-friendly strategy; Asp.Versioning.Mvc is the official Microsoft-maintained library | Query string versioning; header versioning; no versioning |
| C# 14 (via .NET 10) | Latest language version: collection expressions, static abstract interface members, primary constructors, file-scoped namespaces, etc. | Pinning an older LangVersion |
| .slnx (XML solution file) | New lightweight format; cleaner diffs than .sln; created directly via dotnet new slnx | Traditional .sln |
| PostgreSQL | Open-source, production-grade RDBMS; excellent JSON support; strong EF Core provider | SQL Server; SQLite (dev-only); MySQL |
| Quartz.NET | Mature, cron-capable job scheduler for background tasks (email, cleanup, etc.) | Hangfire; IHostedService with Timer; custom BackgroundService |
| compose.yml (not docker-compose.yml) | Docker Compose V2 standard; shorter name; matches the docker compose CLI (no hyphen) | Legacy docker-compose.yml |
| Multi-stage Dockerfile | Separates build and runtime images; the final image contains only published output (~200 MB vs ~1.5 GB) | Single-stage build; publishing locally and copying artifacts |
| Correlation ID middleware | Traces a request across services and log entries; accepts client-supplied IDs or generates new ones | W3C Trace Context; OpenTelemetry baggage (heavier) |
| Sensitive data redaction | Prevents passwords, tokens, and PII from appearing in logs; JSON DOM + regex fallback handles truncated bodies | Manual redaction per log call; not logging bodies at all |

3. Prerequisites

Ensure you have the following installed:

  • .NET 10 SDK (Version 10.0.103 or higher)
  • Docker & Docker Compose (for PostgreSQL and containerization)
  • IDE: Visual Studio, Rider, or VS Code

4. Solution & Project Scaffolding

Create the root directory and initialize the solution and projects:

mkdir {projectname}-api
cd {projectname}-api

# Create the solution (slnx = lightweight XML format, cleaner diffs than .sln)
dotnet new slnx -n {Projects}

# Create the source projects
dotnet new webapi -n {Projects}.Api -o src/{Projects}.Api
dotnet new classlib -n {Projects}.Application -o src/{Projects}.Application
dotnet new classlib -n {Projects}.Domain -o src/{Projects}.Domain
dotnet new classlib -n {Projects}.Infrastructure -o src/{Projects}.Infrastructure

# Create the test projects
dotnet new xunit -n {Projects}.Application.Tests -o tests/{Projects}.Application.Tests
dotnet new xunit -n {Projects}.Domain.Tests -o tests/{Projects}.Domain.Tests
dotnet new xunit -n {Projects}.IntegrationTests -o tests/{Projects}.IntegrationTests

# Add source projects to the solution
dotnet sln {Projects}.slnx add src/{Projects}.Api/{Projects}.Api.csproj --solution-folder src
dotnet sln {Projects}.slnx add src/{Projects}.Application/{Projects}.Application.csproj --solution-folder src
dotnet sln {Projects}.slnx add src/{Projects}.Domain/{Projects}.Domain.csproj --solution-folder src
dotnet sln {Projects}.slnx add src/{Projects}.Infrastructure/{Projects}.Infrastructure.csproj --solution-folder src

# Add test projects to the solution
dotnet sln {Projects}.slnx add tests/{Projects}.Application.Tests/{Projects}.Application.Tests.csproj --solution-folder tests
dotnet sln {Projects}.slnx add tests/{Projects}.Domain.Tests/{Projects}.Domain.Tests.csproj --solution-folder tests
dotnet sln {Projects}.slnx add tests/{Projects}.IntegrationTests/{Projects}.IntegrationTests.csproj --solution-folder tests

# Configure Clean Architecture Dependencies
dotnet add src/{Projects}.Application/{Projects}.Application.csproj reference src/{Projects}.Domain/{Projects}.Domain.csproj
dotnet add src/{Projects}.Infrastructure/{Projects}.Infrastructure.csproj reference src/{Projects}.Application/{Projects}.Application.csproj
dotnet add src/{Projects}.Api/{Projects}.Api.csproj reference src/{Projects}.Application/{Projects}.Application.csproj
dotnet add src/{Projects}.Api/{Projects}.Api.csproj reference src/{Projects}.Infrastructure/{Projects}.Infrastructure.csproj

Resulting {ProjectName}.slnx

The dotnet new slnx command creates the lightweight XML-based solution format directly — no migration from .sln needed. The resulting file is concise:

<Solution>
    <Folder Name="/src/">
        <Project Path="src/{Projects}.Api/{Projects}.Api.csproj"/>
        <Project Path="src/{Projects}.Application/{Projects}.Application.csproj"/>
        <Project Path="src/{Projects}.Domain/{Projects}.Domain.csproj"/>
        <Project Path="src/{Projects}.Infrastructure/{Projects}.Infrastructure.csproj"/>
    </Folder>
    <Folder Name="/tests/">
        <Project Path="tests/{Projects}.Application.Tests/{Projects}.Application.Tests.csproj"/>
        <Project Path="tests/{Projects}.Domain.Tests/{Projects}.Domain.Tests.csproj"/>
        <Project Path="tests/{Projects}.IntegrationTests/{Projects}.IntegrationTests.csproj"/>
    </Folder>
</Solution>

5. Root Configuration Files

These files enforce consistency, lock SDK versions, and centrally manage NuGet packages across all projects. Create them at the repository root (next to the .slnx file).

global.json

Locks the .NET SDK version so every developer and CI runner uses the same toolchain. The rollForward: latestFeature policy lets the SDK roll forward to the latest installed feature band and patch within .NET 10 (e.g., 10.0.1xx to 10.0.2xx) while preventing major/minor surprises.

{
  "sdk": {
    "rollForward": "latestFeature",
    "version": "10.0.103"
  }
}

Directory.Build.props

MSBuild imports this file automatically into every .csproj in the repo tree. It sets the target framework, enables nullable reference types and implicit usings, and treats warnings as errors, so no project can accidentally diverge from these defaults.

<Project>
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
</Project>

Directory.Packages.props

Enables Central Package Management (CPM). Every NuGet package version is declared once here. Individual .csproj files reference packages by name only (no Version attribute). This eliminates version drift across projects and makes upgrades a single-file change.

<Project>

  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>

  <ItemGroup>
    <!-- Web & API -->
    <PackageVersion Include="Asp.Versioning.Mvc" Version="10.0.0-preview.2"/>
    <PackageVersion Include="Asp.Versioning.Mvc.ApiExplorer" Version="10.0.0-preview.2"/>
    <PackageVersion Include="AspNetCore.HealthChecks.NpgSql" Version="9.0.0"/>
    <PackageVersion Include="Microsoft.AspNetCore.OpenApi" Version="10.0.5"/>

    <!-- Logging -->
    <PackageVersion Include="Serilog.AspNetCore" Version="10.0.0"/>
    <PackageVersion Include="Serilog.Enrichers.Environment" Version="3.0.1"/>
    <PackageVersion Include="Serilog.Enrichers.Thread" Version="4.0.0"/>

    <!-- Application -->
    <PackageVersion Include="FluentValidation" Version="12.1.1"/>
    <PackageVersion Include="FluentValidation.DependencyInjectionExtensions" Version="12.1.1"/>
    <PackageVersion Include="MediatR" Version="14.1.0"/>
    <PackageVersion Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.5"/>

    <!-- Infrastructure -->
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.4"/>
    <PackageVersion Include="Quartz.Extensions.Hosting" Version="3.16.1"/>

    <!-- Entity Framework Core -->
    <PackageVersion Include="Microsoft.EntityFrameworkCore" Version="10.0.5"/>
    <PackageVersion Include="Microsoft.EntityFrameworkCore.Design" Version="10.0.5"/>
    <PackageVersion Include="Microsoft.EntityFrameworkCore.Tools" Version="10.0.5"/>
    <PackageVersion Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="10.0.1"/>

    <!-- Testing -->
    <PackageVersion Include="coverlet.collector" Version="8.0.0"/>
    <PackageVersion Include="FluentAssertions" Version="8.8.0"/>
    <PackageVersion Include="Microsoft.AspNetCore.Mvc.Testing" Version="10.0.5"/>
    <PackageVersion Include="Microsoft.NET.Test.Sdk" Version="18.3.0"/>
    <PackageVersion Include="Moq" Version="4.20.72"/>
    <PackageVersion Include="xunit" Version="2.9.3"/>
    <PackageVersion Include="xunit.runner.visualstudio" Version="3.1.5"/>
  </ItemGroup>

</Project>

nuget.config

Explicitly clears any inherited package sources and sets the official NuGet feed as the only source. This prevents builds from silently pulling packages from unexpected feeds (e.g., a corporate proxy or local cache).

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="nuget" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>

.editorconfig

Enforces consistent code style across all editors and IDEs. The C# section is particularly important — it mandates file-scoped namespaces, var usage, and naming conventions (e.g., _camelCase for private fields, I prefix for interfaces). These rules integrate with Roslyn analyzers, so violations appear as warnings during build.

root = true

# All files
[*]
indent_style = space
indent_size = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

# XML project files
[*.{csproj,vbproj,vcxproj,vcxproj.filters,proj,projitems,shproj,props,targets}]
indent_size = 2

# XML files
[*.{xml,config,nuspec,resx}]
indent_size = 2

# JSON files
[*.json]
indent_size = 2

# YAML files
[*.{yml,yaml}]
indent_size = 2

# Markdown files
[*.md]
trim_trailing_whitespace = false

# C# files
[*.cs]

# Organize usings
dotnet_sort_system_directives_first = true
dotnet_separate_import_directive_groups = false

# Namespace settings
csharp_style_namespace_declarations = file_scoped:warning

# var preferences
csharp_style_var_for_built_in_types = true:suggestion
csharp_style_var_when_type_is_apparent = true:suggestion
csharp_style_var_elsewhere = true:suggestion

# Expression-level preferences
csharp_prefer_simple_using_statement = true:warning
csharp_style_prefer_switch_expression = true:suggestion
csharp_style_prefer_pattern_matching = true:suggestion

# Null-checking preferences
csharp_style_throw_expression = true:suggestion
csharp_style_conditional_delegate_call = true:suggestion

# New line preferences
csharp_new_line_before_open_brace = all
csharp_new_line_before_else = true
csharp_new_line_before_catch = true
csharp_new_line_before_finally = true

# Indentation preferences
csharp_indent_case_contents = true
csharp_indent_switch_labels = true

# Naming conventions
dotnet_naming_rule.interface_should_be_begins_with_i.severity = warning
dotnet_naming_rule.interface_should_be_begins_with_i.symbols = interface
dotnet_naming_rule.interface_should_be_begins_with_i.style = begins_with_i

dotnet_naming_rule.private_field_should_be_camel_case_with_underscore.severity = warning
dotnet_naming_rule.private_field_should_be_camel_case_with_underscore.symbols = private_field
dotnet_naming_rule.private_field_should_be_camel_case_with_underscore.style = camel_case_with_underscore

dotnet_naming_symbols.interface.applicable_kinds = interface
dotnet_naming_symbols.interface.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected

dotnet_naming_symbols.private_field.applicable_kinds = field
dotnet_naming_symbols.private_field.applicable_accessibilities = private, private_protected

dotnet_naming_style.begins_with_i.required_prefix = I
dotnet_naming_style.begins_with_i.capitalization = pascal_case

dotnet_naming_style.camel_case_with_underscore.required_prefix = _
dotnet_naming_style.camel_case_with_underscore.capitalization = camel_case

.dockerignore

Tells Docker which files to exclude from the build context. Keeping the context small speeds up docker build significantly. Test projects, CI config, docs, and IDE metadata are not needed in the production image.

**/.git
**/.vs
**/.vscode
**/.idea
**/bin
**/obj
**/logs
**/.DS_Store
**/node_modules
tests/
.github/
*.md
.editorconfig
.gitignore
.env
.env.example
qodana.yaml
compose.yml

.gitignore

The .gitignore file is a standard .NET template that excludes build output (bin/, obj/), IDE-specific directories (.vs/, .idea/, .vscode/), user-specific files (*.user, launchSettings.json overrides), and environment files (.env). It is ~300 lines and is generated via dotnet new gitignore — not reproduced here for brevity.

.env.example

Template for environment variables consumed by compose.yml. Developers copy this to .env and customize. The .env file is gitignored; this .example file is committed so new developers know what variables are needed.

# ──────────────────────────────────────────────
# Docker image
# ──────────────────────────────────────────────
DOCKER_IMAGE=your-dockerhub-username/{projects}-api
IMAGE_TAG=latest

# ──────────────────────────────────────────────
# API configuration
# ──────────────────────────────────────────────
ASPNETCORE_ENVIRONMENT=Production
API_PORT=5212

# ──────────────────────────────────────────────
# Database configuration
# ──────────────────────────────────────────────
POSTGRES_DB={projects}
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me-to-a-strong-password
DB_PORT=5432

# ──────────────────────────────────────────────
# Connection string (must match DB settings above)
# ──────────────────────────────────────────────
CONNECTION_STRING=Host=db;Port=5432;Database={projects};Username=postgres;Password=change-me-to-a-strong-password

Install Packages into Projects

With CPM, running dotnet add package registers the package in the .csproj (without a version). The version is resolved from Directory.Packages.props.

# Application Layer
dotnet add src/{Projects}.Application package MediatR
dotnet add src/{Projects}.Application package FluentValidation
dotnet add src/{Projects}.Application package FluentValidation.DependencyInjectionExtensions
dotnet add src/{Projects}.Application package Microsoft.Extensions.Logging.Abstractions

# Infrastructure Layer
dotnet add src/{Projects}.Infrastructure package Microsoft.EntityFrameworkCore
dotnet add src/{Projects}.Infrastructure package Microsoft.EntityFrameworkCore.Design
dotnet add src/{Projects}.Infrastructure package Npgsql.EntityFrameworkCore.PostgreSQL
dotnet add src/{Projects}.Infrastructure package Newtonsoft.Json
dotnet add src/{Projects}.Infrastructure package Quartz.Extensions.Hosting

# API Layer
dotnet add src/{Projects}.Api package Serilog.AspNetCore
dotnet add src/{Projects}.Api package Serilog.Enrichers.Environment
dotnet add src/{Projects}.Api package Serilog.Enrichers.Thread
dotnet add src/{Projects}.Api package AspNetCore.HealthChecks.NpgSql
dotnet add src/{Projects}.Api package Microsoft.AspNetCore.OpenApi
dotnet add src/{Projects}.Api package Microsoft.EntityFrameworkCore.Tools
dotnet add src/{Projects}.Api package Asp.Versioning.Mvc
dotnet add src/{Projects}.Api package Asp.Versioning.Mvc.ApiExplorer

# Test Projects (Domain)
dotnet add tests/{Projects}.Domain.Tests package Microsoft.NET.Test.Sdk
dotnet add tests/{Projects}.Domain.Tests package xunit
dotnet add tests/{Projects}.Domain.Tests package xunit.runner.visualstudio
dotnet add tests/{Projects}.Domain.Tests package coverlet.collector
dotnet add tests/{Projects}.Domain.Tests package FluentAssertions
dotnet add tests/{Projects}.Domain.Tests package Moq

# Test Projects (Application)
dotnet add tests/{Projects}.Application.Tests package Microsoft.NET.Test.Sdk
dotnet add tests/{Projects}.Application.Tests package xunit
dotnet add tests/{Projects}.Application.Tests package xunit.runner.visualstudio
dotnet add tests/{Projects}.Application.Tests package coverlet.collector
dotnet add tests/{Projects}.Application.Tests package FluentAssertions
dotnet add tests/{Projects}.Application.Tests package Moq

# Test Projects (Integration)
dotnet add tests/{Projects}.IntegrationTests package Microsoft.NET.Test.Sdk
dotnet add tests/{Projects}.IntegrationTests package xunit
dotnet add tests/{Projects}.IntegrationTests package xunit.runner.visualstudio
dotnet add tests/{Projects}.IntegrationTests package coverlet.collector
dotnet add tests/{Projects}.IntegrationTests package FluentAssertions
dotnet add tests/{Projects}.IntegrationTests package Moq
dotnet add tests/{Projects}.IntegrationTests package Microsoft.AspNetCore.Mvc.Testing

6. Project Files (.csproj)

With CPM and Directory.Build.props, individual .csproj files are minimal. They declare only package references (no versions) and project references (enforcing the dependency rule).

src/{ProjectName}.Domain/{ProjectName}.Domain.csproj

The Domain project has no NuGet dependencies at all. This is intentional — the Domain layer must be pure C# with zero framework coupling.

<Project Sdk="Microsoft.NET.Sdk">

</Project>

Note: TargetFramework, Nullable, ImplicitUsings, and TreatWarningsAsErrors are inherited from Directory.Build.props — no need to repeat them.

src/{ProjectName}.Application/{ProjectName}.Application.csproj

References Domain and adds MediatR, FluentValidation, and logging abstractions.

<Project Sdk="Microsoft.NET.Sdk">

  <ItemGroup>
    <ProjectReference Include="..\{Projects}.Domain\{Projects}.Domain.csproj"/>
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="FluentValidation"/>
    <PackageReference Include="FluentValidation.DependencyInjectionExtensions"/>
    <PackageReference Include="MediatR"/>
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions"/>
  </ItemGroup>

</Project>

src/{ProjectName}.Infrastructure/{ProjectName}.Infrastructure.csproj

References Application and adds EF Core with PostgreSQL, Newtonsoft.Json, and Quartz for background jobs.

<Project Sdk="Microsoft.NET.Sdk">

  <ItemGroup>
    <ProjectReference Include="..\{Projects}.Application\{Projects}.Application.csproj"/>
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.EntityFrameworkCore"/>
    <PackageReference Include="Microsoft.EntityFrameworkCore.Design">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="Newtonsoft.Json"/>
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL"/>
    <PackageReference Include="Quartz.Extensions.Hosting"/>
  </ItemGroup>

</Project>

EF Core Design is marked as a development-only dependency (PrivateAssets: all) — it's used by dotnet ef tooling at design time, not at runtime.

src/{ProjectName}.Api/{ProjectName}.Api.csproj

The web application project. Uses Microsoft.NET.Sdk.Web (not Microsoft.NET.Sdk). References both Application and Infrastructure to wire everything together at the composition root.

<Project Sdk="Microsoft.NET.Sdk.Web">

  <ItemGroup>
    <PackageReference Include="Asp.Versioning.Mvc"/>
    <PackageReference Include="Asp.Versioning.Mvc.ApiExplorer"/>
    <PackageReference Include="AspNetCore.HealthChecks.NpgSql"/>
    <PackageReference Include="Microsoft.AspNetCore.OpenApi"/>
    <PackageReference Include="Microsoft.EntityFrameworkCore.Tools">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="Serilog.AspNetCore"/>
    <PackageReference Include="Serilog.Enrichers.Environment"/>
    <PackageReference Include="Serilog.Enrichers.Thread"/>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\{Projects}.Application\{Projects}.Application.csproj"/>
    <ProjectReference Include="..\{Projects}.Infrastructure\{Projects}.Infrastructure.csproj"/>
  </ItemGroup>

</Project>

Test .csproj Files

All test projects share the same structure: IsPackable set to false (prevents accidentally publishing test assemblies as NuGet packages), common test packages, a global Using for xUnit, and a single project reference to the layer under test.

tests/{ProjectName}.Domain.Tests/{ProjectName}.Domain.Tests.csproj

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="coverlet.collector"/>
    <PackageReference Include="FluentAssertions"/>
    <PackageReference Include="Microsoft.NET.Test.Sdk"/>
    <PackageReference Include="Moq"/>
    <PackageReference Include="xunit"/>
    <PackageReference Include="xunit.runner.visualstudio"/>
  </ItemGroup>

  <ItemGroup>
    <Using Include="Xunit"/>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\src\{Projects}.Domain\{Projects}.Domain.csproj"/>
  </ItemGroup>

</Project>

tests/{ProjectName}.Application.Tests/{ProjectName}.Application.Tests.csproj

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="coverlet.collector"/>
    <PackageReference Include="FluentAssertions"/>
    <PackageReference Include="Microsoft.NET.Test.Sdk"/>
    <PackageReference Include="Moq"/>
    <PackageReference Include="xunit"/>
    <PackageReference Include="xunit.runner.visualstudio"/>
  </ItemGroup>

  <ItemGroup>
    <Using Include="Xunit"/>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\src\{Projects}.Application\{Projects}.Application.csproj"/>
  </ItemGroup>

</Project>

tests/{ProjectName}.IntegrationTests/{ProjectName}.IntegrationTests.csproj

Integration tests reference the API project (which transitively brings in everything) and add Microsoft.AspNetCore.Mvc.Testing for WebApplicationFactory<Program> support.

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <IsPackable>false</IsPackable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="coverlet.collector"/>
    <PackageReference Include="FluentAssertions"/>
    <PackageReference Include="Microsoft.AspNetCore.Mvc.Testing"/>
    <PackageReference Include="Microsoft.NET.Test.Sdk"/>
    <PackageReference Include="Moq"/>
    <PackageReference Include="xunit"/>
    <PackageReference Include="xunit.runner.visualstudio"/>
  </ItemGroup>

  <ItemGroup>
    <Using Include="Xunit"/>
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\src\{Projects}.Api\{Projects}.Api.csproj"/>
  </ItemGroup>

</Project>
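With that in place, a minimal smoke test can use WebApplicationFactory<Program>. This sketch assumes Program is visible to the test project (typically via a public partial class Program { } declaration at the end of Program.cs), and the /health endpoint path is an assumption for illustration:

```csharp
public class HealthEndpointTests(WebApplicationFactory<Program> factory)
    : IClassFixture<WebApplicationFactory<Program>>
{
    [Fact]
    public async Task Health_endpoint_returns_success()
    {
        // CreateClient boots the API in-memory: no Kestrel, no network.
        using HttpClient client = factory.CreateClient();

        HttpResponseMessage response = await client.GetAsync("/health");

        response.IsSuccessStatusCode.Should().BeTrue();
    }
}
```

Because the factory hosts the full composition root, tests exercise routing, middleware, and DI exactly as production does; only the transport is replaced.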

7. Docker & Dev Environment

compose.yml (Root)

Sets up the API and a PostgreSQL database. All values are configurable via .env with sensible defaults for local development.

services:
  api:
    container_name: {projects}-api
    image: ${DOCKER_IMAGE:-{projects}-api}:${IMAGE_TAG:-latest}
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "${API_PORT:-5212}:8080"
    environment:
      - ASPNETCORE_ENVIRONMENT=${ASPNETCORE_ENVIRONMENT:-Development}
      - ConnectionStrings__DefaultConnection=${CONNECTION_STRING:-Host=db;Port=5432;Database={projects}_dev;Username=postgres;Password=postgres}
    depends_on:
      db:
        condition: service_healthy

  db:
    container_name: {projects}-db
    image: postgres:17-alpine
    ports:
      - "${DB_PORT:-5432}:5432"
    environment:
      POSTGRES_DB: ${POSTGRES_DB:-{projects}_dev}
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-postgres}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  postgres_data:
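
For reference, a minimal `.env` might look like this — every key is optional because the compose file supplies defaults, and CONNECTION_STRING or the image variables can be added the same way:

```
API_PORT=5212
DB_PORT=5432
POSTGRES_DB={projects}_dev
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
ASPNETCORE_ENVIRONMENT=Development
```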

Dockerfile (Root)

Optimized multi-stage build. The key optimization is copying .csproj files first and running dotnet restore before copying source code — this means the NuGet restore layer is cached and only invalidated when dependencies change, not when code changes.

FROM mcr.microsoft.com/dotnet/aspnet:10.0 AS base
WORKDIR /app
EXPOSE 8080

FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
ARG BUILD_CONFIGURATION=Release
WORKDIR /src

COPY global.json .
COPY nuget.config .
COPY Directory.Build.props .
COPY Directory.Packages.props .
COPY src/{Projects}.Api/{Projects}.Api.csproj src/{Projects}.Api/
COPY src/{Projects}.Application/{Projects}.Application.csproj src/{Projects}.Application/
COPY src/{Projects}.Domain/{Projects}.Domain.csproj src/{Projects}.Domain/
COPY src/{Projects}.Infrastructure/{Projects}.Infrastructure.csproj src/{Projects}.Infrastructure/

RUN dotnet restore src/{Projects}.Api/{Projects}.Api.csproj

COPY . .
RUN dotnet build src/{Projects}.Api -c $BUILD_CONFIGURATION --no-restore

FROM build AS publish
ARG BUILD_CONFIGURATION=Release
RUN dotnet publish src/{Projects}.Api -c $BUILD_CONFIGURATION --no-build -o /app/publish /p:UseAppHost=false

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "{Projects}.Api.dll"]

src/{ProjectName}.Api/Properties/launchSettings.json

Configures how dotnet run launches the API locally. Two profiles are defined: HTTP-only (port 5212) and HTTPS (ports 7031 + 5212). launchBrowser is disabled — APIs don't need a browser window.

{
  "$schema": "https://json.schemastore.org/launchsettings.json",
  "profiles": {
    "http": {
      "commandName": "Project",
      "dotnetRunMessages": true,
      "launchBrowser": false,
      "applicationUrl": "http://localhost:5212",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    },
    "https": {
      "commandName": "Project",
      "dotnetRunMessages": true,
      "launchBrowser": false,
      "applicationUrl": "https://localhost:7031;http://localhost:5212",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}

8. Layer 1: Domain

The innermost layer. Contains entities, value objects, domain events, the Result pattern, and repository abstractions. It has zero NuGet dependencies — pure C# only.

Why a dependency-free Domain? The Domain layer encodes business rules. By keeping it free of frameworks (no EF Core attributes, no MediatR, no JSON serializers), it remains:

  • Testable with plain unit tests (no mocking infrastructure).
  • Portable — you can swap EF Core for Dapper or PostgreSQL for MongoDB without touching a single Domain file.
  • Focused — developers reading Domain code see only business logic, not framework ceremony.

src/{ProjectName}.Domain/Common/BaseEntity.cs

All entities inherit from this. It provides a GUID primary key and a domain events collection. Domain events are raised by entities during business operations and dispatched after SaveChanges by the DomainEventInterceptor in the Infrastructure layer.

namespace {Projects}.Domain.Common;

public abstract class BaseEntity
{
    private readonly List<IDomainEvent> _domainEvents = [];
    public Guid Id { get; private init; } = Guid.NewGuid();
    public IReadOnlyCollection<IDomainEvent> DomainEvents => _domainEvents.AsReadOnly();

    public void AddDomainEvent(IDomainEvent domainEvent) => _domainEvents.Add(domainEvent);
    public void RemoveDomainEvent(IDomainEvent domainEvent) => _domainEvents.Remove(domainEvent);
    public void ClearDomainEvents() => _domainEvents.Clear();
}
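
As a sketch of how an entity raises an event (the Item entity and ItemCreatedEvent below are illustrative, not files in the boilerplate):

```csharp
using {Projects}.Domain.Common;

namespace {Projects}.Domain.Entities;

// Illustrative domain event carrying the required timestamp.
public sealed record ItemCreatedEvent(Guid ItemId) : IDomainEvent
{
    public DateTime OccurredOn { get; } = DateTime.UtcNow;
}

public sealed class Item : BaseEntity
{
    private Item(string name) => Name = name;

    public string Name { get; private set; }

    public static Item Create(string name)
    {
        var item = new Item(name);

        // Queued here; dispatched after SaveChanges by the
        // DomainEventInterceptor in the Infrastructure layer.
        item.AddDomainEvent(new ItemCreatedEvent(item.Id));
        return item;
    }
}
```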

src/{ProjectName}.Domain/Common/AuditableEntity.cs

Extends BaseEntity with CreatedAt and UpdatedAt timestamps. The SetCreatedAt and SetUpdatedAt methods are called by the AuditableEntityInterceptor — entities themselves never need to set timestamps.

namespace {Projects}.Domain.Common;

public abstract class AuditableEntity : BaseEntity
{
    public DateTime CreatedAt { get; private set; }
    public DateTime? UpdatedAt { get; private set; }

    public void SetCreatedAt(DateTime createdAt) => CreatedAt = createdAt;
    public void SetUpdatedAt(DateTime updatedAt) => UpdatedAt = updatedAt;
}

src/{ProjectName}.Domain/Common/IDomainEvent.cs

Base interface for domain events. Events carry a timestamp so consumers know when the event occurred.

namespace {Projects}.Domain.Common;

public interface IDomainEvent
{
    DateTime OccurredOn { get; }
}

src/{ProjectName}.Domain/Common/ValueObject.cs

Base class for value objects (DDD concept). Value objects are compared by their component values, not by identity. Two Money(100, "USD") instances are equal regardless of reference identity.

namespace {Projects}.Domain.Common;

public abstract class ValueObject : IEquatable<ValueObject>
{
    protected abstract IEnumerable<object?> GetEqualityComponents();

    public override bool Equals(object? obj)
    {
        if (obj is null || obj.GetType() != GetType()) return false;
        return Equals((ValueObject)obj);
    }

    public bool Equals(ValueObject? other)
    {
        if (other is null || other.GetType() != GetType()) return false;
        return GetEqualityComponents().SequenceEqual(other.GetEqualityComponents());
    }

    public override int GetHashCode() => GetEqualityComponents().Aggregate(0, (current, obj) => HashCode.Combine(current, obj?.GetHashCode() ?? 0));
    public static bool operator ==(ValueObject? left, ValueObject? right) => left is null && right is null || (left is not null && right is not null && left.Equals(right));
    public static bool operator !=(ValueObject? left, ValueObject? right) => !(left == right);
}
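
The Money example mentioned above, sketched as a concrete value object (Money is illustrative, not part of the boilerplate):

```csharp
using {Projects}.Domain.Common;

namespace {Projects}.Domain.ValueObjects;

public sealed class Money(decimal amount, string currency) : ValueObject
{
    public decimal Amount { get; } = amount;
    public string Currency { get; } = currency;

    protected override IEnumerable<object?> GetEqualityComponents()
    {
        // Equality is defined by these components, compared in order.
        yield return Amount;
        yield return Currency;
    }
}

// new Money(100, "USD") == new Money(100, "USD")  → true
// new Money(100, "USD") == new Money(100, "EUR")  → false
```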

src/{ProjectName}.Domain/Common/Error.cs

Defines the Error record and ErrorType enum used throughout the Result pattern. Pre-defined static errors cover the most common failure cases. Domain-specific errors are defined alongside their entities (e.g., User.Errors.EmailTaken).

namespace {Projects}.Domain.Common;

public enum ErrorType { None = 0, Failure = 1, Validation = 2, NotFound = 3, Conflict = 4 }

public sealed record Error(string Code, string Description, ErrorType Type)
{
    public static readonly Error None = new(string.Empty, string.Empty, ErrorType.None);
    public static readonly Error NullValue = new("Error.NullValue", "A null value was provided.", ErrorType.Failure);
    public static readonly Error NotFound = new("Error.NotFound", "The requested resource was not found.", ErrorType.NotFound);
    public static readonly Error Conflict = new("Error.Conflict", "A conflict occurred with the current state.", ErrorType.Conflict);
    public static readonly Error Validation = new("Error.Validation", "A validation error occurred.", ErrorType.Validation);
}

src/{ProjectName}.Domain/Common/Result.cs

The Result pattern implementation. Key design points:

  • IValidationResult uses static abstract interface members (C# 11) — this enables ValidationBehavior to call TResponse.Failure(error) without reflection or Activator.CreateInstance.
  • Constructor guards prevent invalid states (success with error, failure without error).
  • Result<T> provides an implicit conversion from T for ergonomic returns.

namespace {Projects}.Domain.Common;

/// <summary>
/// Defines a contract for creating a failure result of a specific type.
/// Used for type-safe, high-performance validation in CQRS pipelines.
/// </summary>
public interface IValidationResult
{
    static abstract Result Failure(Error error);
}

public class Result : IValidationResult
{
    protected Result(bool isSuccess, Error error)
    {
        if (isSuccess && error != Error.None)
            throw new InvalidOperationException("A successful result cannot have an error.");

        if (!isSuccess && error == Error.None)
            throw new InvalidOperationException("A failed result must have an error.");

        IsSuccess = isSuccess;
        Error = error;
    }

    public bool IsSuccess { get; }
    public bool IsFailure => !IsSuccess;
    public Error Error { get; }

    public static Result Success()
    {
        return new Result(true, Error.None);
    }

    public static Result<T> Success<T>(T value)
    {
        return Result<T>.Success(value);
    }

    public static Result Failure(Error error)
    {
        return new Result(false, error);
    }

    public static Result<T> Failure<T>(Error error)
    {
        return Result<T>.Failure(error);
    }
}

public class Result<T> : Result, IValidationResult
{
    private readonly T? _value;

    private Result(T? value, bool isSuccess, Error error)
        : base(isSuccess, error)
    {
        _value = value;
    }

    public T Value => IsSuccess
        ? _value!
        : throw new InvalidOperationException("Cannot access the value of a failed result.");

    public static Result<T> Success(T value)
    {
        return new Result<T>(value, true, Error.None);
    }

    public new static Result<T> Failure(Error error)
    {
        return new Result<T>(default, false, error);
    }

    // Explicit implementation of IValidationResult. The covariant return type of
    // the Failure method above cannot implicitly implement the static abstract
    // member; without this, ValidationBehavior's cast to TResponse would not
    // receive a Result<T> for generic responses.
    static Result IValidationResult.Failure(Error error)
    {
        return Failure(error);
    }

    public static implicit operator Result<T>(T value)
    {
        return Success(value);
    }
}
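
A quick sketch of both return paths — the error and method names here are illustrative:

```csharp
using {Projects}.Domain.Common;

public static class ResultUsageExample
{
    private static readonly Error NameRequired =
        new("Item.NameRequired", "Name must not be empty.", ErrorType.Validation);

    public static Result<string> NormalizeName(string? name)
    {
        if (string.IsNullOrWhiteSpace(name))
            return Result.Failure<string>(NameRequired);

        // Implicit conversion: string -> Result<string>.
        return name.Trim();
    }
}
```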

src/{ProjectName}.Domain/Abstractions/IUnitOfWork.cs

Abstracts the "save all pending changes" operation. In the Infrastructure layer, ApplicationDbContext implements this interface directly.

namespace {Projects}.Domain.Abstractions;

public interface IUnitOfWork
{
    Task<int> SaveChangesAsync(CancellationToken cancellationToken = default);
}

9. Layer 2: Application

Contains use cases (commands, queries, handlers), validation, mapping, and MediatR pipeline behaviors. References Domain only.

Why a separate Application layer? The Application layer orchestrates business operations without knowing how data is persisted or how HTTP requests arrive. This means:

  • Handlers are testable by mocking ApplicationDbContext and IUnitOfWork — no database needed.
  • The same handlers can serve a REST API, gRPC service, or message queue consumer.
  • Validation is co-located with the command/query it validates.

Messaging Interfaces (src/{ProjectName}.Application/Abstractions/Messaging/)

These interfaces wrap MediatR's IRequest and IRequestHandler to enforce that all commands and queries return Result or Result<T>. This guarantees the Result pattern is used consistently throughout the application.

ICommand.cs

using MediatR;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Abstractions.Messaging;

/// <summary>
/// Marker interface for commands that do not return a value.
/// </summary>
public interface ICommand : IRequest<Result>;

/// <summary>
/// Marker interface for commands that return a value wrapped in Result.
/// </summary>
public interface ICommand<TResponse> : IRequest<Result<TResponse>>;

ICommandHandler.cs

using MediatR;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Abstractions.Messaging;

/// <summary>
/// Handler for commands that do not return a value.
/// </summary>
public interface ICommandHandler<in TCommand> : IRequestHandler<TCommand, Result>
    where TCommand : ICommand;

/// <summary>
/// Handler for commands that return a value wrapped in Result.
/// </summary>
public interface ICommandHandler<in TCommand, TResponse> : IRequestHandler<TCommand, Result<TResponse>>
    where TCommand : ICommand<TResponse>;

IQuery.cs

using MediatR;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Abstractions.Messaging;

/// <summary>
/// Marker interface for queries that return a value wrapped in Result.
/// </summary>
public interface IQuery<TResponse> : IRequest<Result<TResponse>>;

IQueryHandler.cs

using MediatR;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Abstractions.Messaging;

/// <summary>
/// Handler for queries that return a value wrapped in Result.
/// </summary>
public interface IQueryHandler<in TQuery, TResponse> : IRequestHandler<TQuery, Result<TResponse>>
    where TQuery : IQuery<TResponse>;
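
Putting the interfaces together, a minimal (hypothetical) command and handler pair looks like this; persistence is omitted for brevity:

```csharp
using {Projects}.Application.Abstractions.Messaging;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Features.Items.CreateItem;

public sealed record CreateItemCommand(string Name) : ICommand<Guid>;

public sealed class CreateItemCommandHandler : ICommandHandler<CreateItemCommand, Guid>
{
    public Task<Result<Guid>> Handle(CreateItemCommand request, CancellationToken cancellationToken)
    {
        // Create and persist the entity here; return the new id in a successful Result.
        return Task.FromResult(Result.Success(Guid.NewGuid()));
    }
}
```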

Domain Event Handler (src/{ProjectName}.Application/Abstractions/IDomainEventHandler.cs)

Bridges domain events to MediatR's notification pipeline. DomainEventNotification<T> wraps any IDomainEvent as an INotification, keeping the Domain layer free of MediatR references.

using MediatR;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Abstractions;

/// <summary>
/// Wraps a domain event as a MediatR notification so it can be published
/// through the MediatR pipeline without coupling the Domain layer to MediatR.
/// </summary>
public sealed class DomainEventNotification<TDomainEvent>(TDomainEvent domainEvent)
    : INotification where TDomainEvent : IDomainEvent
{
    public TDomainEvent DomainEvent { get; } = domainEvent;
}

/// <summary>
/// Convenience interface for handling domain events via MediatR.
/// Implement this instead of INotificationHandler&lt;DomainEventNotification&lt;T&gt;&gt; directly.
/// </summary>
public interface IDomainEventHandler<TDomainEvent>
    : INotificationHandler<DomainEventNotification<TDomainEvent>>
    where TDomainEvent : IDomainEvent;

Behaviors (src/{ProjectName}.Application/Behaviors/)

Pipeline behaviors are MediatR middleware. They wrap every request and can inspect, modify, or short-circuit the pipeline.

LoggingBehavior.cs

Logs the request name before handling and the elapsed time after. Uses Stopwatch for high-resolution timing.

using System.Diagnostics;
using MediatR;
using Microsoft.Extensions.Logging;

namespace {Projects}.Application.Behaviors;

public sealed class LoggingBehavior<TRequest, TResponse>(ILogger<LoggingBehavior<TRequest, TResponse>> logger) : IPipelineBehavior<TRequest, TResponse> where TRequest : IRequest<TResponse>
{
    public async Task<TResponse> Handle(TRequest request, RequestHandlerDelegate<TResponse> next, CancellationToken cancellationToken)
    {
        var requestName = typeof(TRequest).Name;
        logger.LogInformation("Handling {RequestName}", requestName);
        var stopwatch = Stopwatch.StartNew();
        var response = await next(cancellationToken);
        stopwatch.Stop();
        logger.LogInformation("Handled {RequestName} in {ElapsedMilliseconds}ms", requestName, stopwatch.ElapsedMilliseconds);
        return response;
    }
}

ValidationBehavior.cs (Zero Reflection / High-Performance)

Runs all registered IValidator<TRequest> validators before the handler executes. If any validation fails, it short-circuits the pipeline and returns a Result.Failure — the handler is never invoked.

The key innovation is the where TResponse : Result, IValidationResult constraint combined with the static abstract method on IValidationResult. This allows TResponse.Failure(error) to be called directly — no reflection, no Activator.CreateInstance, fully AOT-compatible.

using FluentValidation;
using MediatR;
using Microsoft.Extensions.Logging;
using {Projects}.Domain.Common;

namespace {Projects}.Application.Behaviors;

public sealed class ValidationBehavior<TRequest, TResponse>(
    IEnumerable<IValidator<TRequest>> validators,
    ILogger<ValidationBehavior<TRequest, TResponse>> logger)
    : IPipelineBehavior<TRequest, TResponse>
    where TRequest : IRequest<TResponse>
    where TResponse : Result, IValidationResult
{
    public async Task<TResponse> Handle(
        TRequest request,
        RequestHandlerDelegate<TResponse> next,
        CancellationToken cancellationToken)
    {
        var validatorList = validators as IReadOnlyList<IValidator<TRequest>> ?? [.. validators];

        if (validatorList.Count == 0)
            return await next(cancellationToken);

        var context = new ValidationContext<TRequest>(request);

        var validationResults = await Task.WhenAll(
            validatorList.Select(v => v.ValidateAsync(context, cancellationToken)));

        var failures = validationResults
            .SelectMany(r => r.Errors)
            .Where(f => f is not null)
            .ToList();

        if (failures.Count != 0)
        {
            var errorMessage = string.Join("; ", failures.Select(f => f.ErrorMessage));
            var error = new Error("Validation", errorMessage, ErrorType.Validation);

            logger.LogWarning(
                "Validation failed for {RequestName}: {ErrorMessage}",
                typeof(TRequest).Name,
                errorMessage);

            // Directly call the static abstract Failure method.
            // No reflection, 100% type-safe and high performance.
            return (TResponse)TResponse.Failure(error);
        }

        return await next(cancellationToken);
    }
}
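
A minimal validator sketch (CreateItemCommand is a hypothetical command record with a Name property); it is auto-discovered by AddValidatorsFromAssembly and executed by this behavior before the handler runs:

```csharp
using FluentValidation;

namespace {Projects}.Application.Features.Items.CreateItem;

public sealed class CreateItemCommandValidator : AbstractValidator<CreateItemCommand>
{
    public CreateItemCommandValidator()
    {
        // Failing rules short-circuit the pipeline with a Validation error.
        RuleFor(x => x.Name)
            .NotEmpty()
            .MaximumLength(200);
    }
}
```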

Dependency Injection (src/{ProjectName}.Application/DependencyInjection.cs)

Registers MediatR (with pipeline behaviors in order) and FluentValidation validators (auto-discovered from the assembly).

using FluentValidation;
using MediatR;
using Microsoft.Extensions.DependencyInjection;
using {Projects}.Application.Behaviors;

namespace {Projects}.Application;

public static class DependencyInjection
{
    public static IServiceCollection AddApplication(this IServiceCollection services)
    {
        var assembly = typeof(DependencyInjection).Assembly;

        services.AddMediatR(cfg =>
        {
            cfg.RegisterServicesFromAssembly(assembly);
            cfg.AddBehavior(typeof(IPipelineBehavior<,>), typeof(LoggingBehavior<,>));
            cfg.AddBehavior(typeof(IPipelineBehavior<,>), typeof(ValidationBehavior<,>));
        });

        services.AddValidatorsFromAssembly(assembly);

        return services;
    }
}

Vertical Slice Convention

Customize: The folder structure below is a convention, not enforced by tooling. Adapt the nesting depth to your project's complexity.

When adding features, organize the Application layer by vertical slices rather than by technical concern (no Commands/, Queries/, Validators/ top-level folders). Each feature is a self-contained folder:

src/{ProjectName}.Application/
└── Features/
    └── {FeatureName}/
        ├── {Operation}/
        │   ├── {Operation}Command.cs        (or {Operation}Query.cs)
        │   ├── {Operation}CommandHandler.cs  (or {Operation}QueryHandler.cs)
        │   ├── {Operation}CommandValidator.cs (optional)
        │   └── {Operation}Response.cs        (optional — only if returning data)
        └── Mappings/
            └── {FeatureName}Mappings.cs      (static extension methods)

Example for an "Items" feature:

Features/
└── Items/
    ├── CreateItem/
    │   ├── CreateItemCommand.cs
    │   ├── CreateItemCommandHandler.cs
    │   └── CreateItemCommandValidator.cs
    ├── GetItem/
    │   ├── GetItemQuery.cs
    │   ├── GetItemQueryHandler.cs
    │   └── ItemResponse.cs
    └── Mappings/
        └── ItemMappings.cs

Why vertical slices?

  • Cohesion — everything needed for a use case is in one folder; no jumping between Commands/, Validators/, Handlers/.
  • Discoverability — new developers find related code immediately.
  • Safe deletion — removing a feature means deleting one folder.
  • Reduced merge conflicts — different developers working on different features rarely touch the same files.

MediatR auto-discovers all IRequestHandler<,> and IValidator<> implementations from the assembly scan (configured in DependencyInjection.cs), so no manual registration is needed when adding new slices.

Mapping Convention

Customize: If your project grows large enough to benefit from auto-mapping, you can introduce Mapster or AutoMapper later. Start with manual mapping.

Use static extension methods for mapping between domain entities and response DTOs. Keep mapping logic close to the feature that uses it:

// Features/Items/Mappings/ItemMappings.cs
namespace {Projects}.Application.Features.Items.Mappings;

public static class ItemMappings
{
    public static ItemResponse ToResponse(this Item entity) => new(
        entity.Id,
        entity.Name,
        entity.CreatedAt);
}

Why manual mapping over AutoMapper/Mapster?

  • Zero magic — mappings are plain C# code, fully debuggable with F12 / Go to Definition.
  • Compile-time safety — missing properties cause build errors, not runtime surprises.
  • No hidden performance costs — no reflection, no expression compilation, no global configuration scanning.
  • Co-located — the mapping lives next to the feature that uses it.

API Versioning Convention

Customize: The versioning strategy (URL path) is baked in. The version numbers and deprecation schedule are project-specific.

Controllers use URL path versioning via Asp.Versioning.Mvc. Decorate controllers with [ApiVersion] and use [Route("api/v{version:apiVersion}/[controller]")]:

using Asp.Versioning;

[ApiVersion(1.0)]
[ApiController]
[Route("api/v{version:apiVersion}/[controller]")]
public class ItemsController : ControllerBase
{
    // All endpoints in this controller are v1
    // URL: /api/v1/items
}

When introducing breaking changes, add a new version:

[ApiVersion(2.0)]
[ApiController]
[Route("api/v{version:apiVersion}/[controller]")]
public class ItemsV2Controller : ControllerBase
{
    // URL: /api/v2/items
}

To deprecate an older version: [ApiVersion(1.0, Deprecated = true)].


10. Layer 3: Infrastructure

Implements the abstractions defined in Domain and Application. Contains the EF Core DbContext and interceptors.

Why Infrastructure is separate from Application: The Application layer defines what operations are needed (via IUnitOfWork and feature-specific repository interfaces). Infrastructure provides how they're implemented (via EF Core, PostgreSQL, etc.). If you ever need to swap databases or add a caching layer, you change Infrastructure — Application and Domain remain untouched.

Interceptors (src/{ProjectName}.Infrastructure/Persistence/Interceptors/)

EF Core interceptors hook into the SaveChanges pipeline. They keep cross-cutting concerns out of the DbContext itself, making them individually testable and composable.

AuditableEntityInterceptor.cs

Automatically sets CreatedAt (on insert) and UpdatedAt (on update) for any entity that extends AuditableEntity. This means domain code never needs to manually set timestamps.

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;
using {Projects}.Domain.Common;

namespace {Projects}.Infrastructure.Persistence.Interceptors;

public sealed class AuditableEntityInterceptor : SaveChangesInterceptor
{
    public override ValueTask<InterceptionResult<int>> SavingChangesAsync(DbContextEventData eventData, InterceptionResult<int> result, CancellationToken cancellationToken = default)
    {
        UpdateAuditableEntities(eventData.Context);
        return base.SavingChangesAsync(eventData, result, cancellationToken);
    }

    private static void UpdateAuditableEntities(DbContext? context)
    {
        if (context is null) return;
        var utcNow = DateTime.UtcNow;
        foreach (var entry in context.ChangeTracker.Entries<AuditableEntity>())
        {
            if (entry.State == EntityState.Added) entry.Entity.SetCreatedAt(utcNow);
            if (entry.State == EntityState.Modified) entry.Entity.SetUpdatedAt(utcNow);
        }
    }
}

DomainEventInterceptor.cs

Dispatches domain events after SaveChanges completes successfully, so events are only published once the changes have been persisted. (If SaveChanges runs inside an explicit outer transaction, events are still dispatched before that transaction commits.) Events are collected from all tracked entities, the entity event lists are cleared, and each event is published through MediatR as a DomainEventNotification<T>.

using MediatR;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;
using {Projects}.Application.Abstractions;
using {Projects}.Domain.Common;

namespace {Projects}.Infrastructure.Persistence.Interceptors;

public sealed class DomainEventInterceptor(IPublisher publisher) : SaveChangesInterceptor
{
    public override async ValueTask<int> SavedChangesAsync(SaveChangesCompletedEventData eventData, int result, CancellationToken cancellationToken = default)
    {
        if (eventData.Context is not null)
            await PublishDomainEventsAsync(eventData.Context, cancellationToken);
        return await base.SavedChangesAsync(eventData, result, cancellationToken);
    }

    private async Task PublishDomainEventsAsync(DbContext context, CancellationToken cancellationToken)
    {
        var entities = context.ChangeTracker.Entries<BaseEntity>().Where(e => e.Entity.DomainEvents.Count != 0).Select(e => e.Entity).ToList();
        var domainEvents = entities.SelectMany(e => e.DomainEvents).ToList();
        entities.ForEach(e => e.ClearDomainEvents());

        foreach (var domainEvent in domainEvents)
        {
            var notificationType = typeof(DomainEventNotification<>).MakeGenericType(domainEvent.GetType());
            var notification = Activator.CreateInstance(notificationType, domainEvent)!;
            await publisher.Publish(notification, cancellationToken);
        }
    }
}
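
On the consuming side, a handler for a hypothetical ItemCreatedEvent (implementing the IDomainEventHandler<T> interface from the Application layer) might look like this; MediatR discovers it via the assembly scan in AddApplication:

```csharp
using Microsoft.Extensions.Logging;
using {Projects}.Application.Abstractions;

namespace {Projects}.Application.Features.Items.CreateItem;

public sealed class ItemCreatedEventHandler(ILogger<ItemCreatedEventHandler> logger)
    : IDomainEventHandler<ItemCreatedEvent>
{
    public Task Handle(
        DomainEventNotification<ItemCreatedEvent> notification,
        CancellationToken cancellationToken)
    {
        // Side effects (emails, projections, outbox writes) go here.
        logger.LogInformation(
            "Item {ItemId} created at {OccurredOn}",
            notification.DomainEvent.ItemId,
            notification.DomainEvent.OccurredOn);

        return Task.CompletedTask;
    }
}
```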

Database Context (src/{ProjectName}.Infrastructure/Persistence/ApplicationDbContext.cs)

The EF Core DbContext. It also implements IUnitOfWork — calling SaveChangesAsync on the context fulfills the unit-of-work contract. Entity configurations are auto-discovered from the Infrastructure assembly via ApplyConfigurationsFromAssembly.

using Microsoft.EntityFrameworkCore;
using {Projects}.Domain.Abstractions;

namespace {Projects}.Infrastructure.Persistence;

public sealed class ApplicationDbContext(DbContextOptions<ApplicationDbContext> options) : DbContext(options), IUnitOfWork
{
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.ApplyConfigurationsFromAssembly(typeof(ApplicationDbContext).Assembly);
        base.OnModelCreating(modelBuilder);
    }
}

Dependency Injection (src/{ProjectName}.Infrastructure/DependencyInjection.cs)

Registers the interceptors, DbContext (with PostgreSQL and interceptors wired in), and IUnitOfWork. Handlers access data directly through ApplicationDbContext — no generic repository abstraction. For complex data access patterns, create feature-specific repository interfaces in the Domain layer (e.g., IMatchRepository).

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using {Projects}.Domain.Abstractions;
using {Projects}.Infrastructure.Persistence;
using {Projects}.Infrastructure.Persistence.Interceptors;

namespace {Projects}.Infrastructure;

public static class DependencyInjection
{
    public static IServiceCollection AddInfrastructure(this IServiceCollection services, IConfiguration configuration)
    {
        services.AddSingleton<AuditableEntityInterceptor>();
        services.AddScoped<DomainEventInterceptor>();

        services.AddDbContext<ApplicationDbContext>((sp, options) =>
        {
            var auditableInterceptor = sp.GetRequiredService<AuditableEntityInterceptor>();
            var domainEventInterceptor = sp.GetRequiredService<DomainEventInterceptor>();

            options.UseNpgsql(configuration.GetConnectionString("DefaultConnection"))
                .AddInterceptors(auditableInterceptor, domainEventInterceptor);
        });

        services.AddScoped<IUnitOfWork>(sp => sp.GetRequiredService<ApplicationDbContext>());

        return services;
    }
}

Why AuditableEntityInterceptor is Singleton: It has no mutable state and doesn't depend on scoped services — a single instance is reused across all requests, avoiding unnecessary allocations.

Why DomainEventInterceptor is Scoped: It depends on IPublisher (MediatR), which is scoped to the HTTP request. Using a scoped lifetime ensures events are published through the correct scope.


11. Layer 4: API / Presentation

The outermost layer. Contains the ASP.NET Core web host, middleware, configuration, and the composition root where all layers are wired together.

Why the API layer exists separately: It is the composition root — the only place where all layers meet. Controllers receive HTTP requests, translate them into MediatR commands/queries, and map results back to HTTP responses. By keeping this layer thin, you ensure that business logic stays in Application and domain rules stay in Domain.

src/{ProjectName}.Api/appsettings.json

Main configuration file. Defines the connection string placeholder, Serilog configuration (structured console and file output with correlation ID, daily rolling log files), request logging options, and allowed hosts.

{
  "ConnectionStrings": {
    "DefaultConnection": ""
  },
  "Serilog": {
    "Using": ["Serilog.Sinks.Console", "Serilog.Sinks.File"],
    "MinimumLevel": {
      "Default": "Information",
      "Override": { "Microsoft.AspNetCore": "Warning", "Microsoft.EntityFrameworkCore": "Warning" }
    },
    "WriteTo": [
      { "Name": "Console", "Args": { "outputTemplate": "[{Timestamp:HH:mm:ss} {Level:u3}] {SourceContext}{CorrelationId: [{CorrelationId}]} {Message:lj}{NewLine}{Exception}" } },
      { "Name": "File", "Args": { "path": "logs/log-.txt", "rollingInterval": "Day", "retainedFileCountLimit": 7, "outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {Level:u3}] {SourceContext}{CorrelationId: [{CorrelationId}]} {Message:lj}{NewLine}{Exception}" } }
    ],
    "Enrich": ["FromLogContext", "WithMachineName", "WithThreadId"]
  },
  "RequestLogging": {
    "MaxBodySizeBytes": 65536,
    "SlowRequestThresholdMs": 500,
    "CorrelationIdHeader": "X-Correlation-Id",
    "SensitiveFields": ["password", "token", "secret", "authorization", "creditCard", "ssn", "accessToken", "refreshToken"],
    "LoggableContentTypes": ["application/json"],
    "EnableRequestBodyLogging": true,
    "EnableResponseBodyLogging": true,
    "ExcludedPaths": ["/health"]
  },
  "AllowedHosts": "*"
}

src/{ProjectName}.Api/appsettings.Development.json

Development overrides. Lowers the minimum log level to Debug for richer output during development, raises the slow request threshold to avoid noise, and provides a local PostgreSQL connection string.

{
  "ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Port=5432;Database={projects}_dev;Username=postgres;Password=postgres"
  },
  "Serilog": {
    "MinimumLevel": {
      "Default": "Debug",
      "Override": {
        "Microsoft.AspNetCore": "Information",
        "Microsoft.EntityFrameworkCore": "Information"
      }
    }
  },
  "RequestLogging": {
    "SlowRequestThresholdMs": 1000
  }
}

src/{ProjectName}.Api/Configuration/RequestLoggingOptions.cs

Strongly-typed options class for the request logging middleware. Bound from the RequestLogging section of appsettings.json via IOptions<RequestLoggingOptions>. All values have sensible defaults so the middleware works out of the box even without explicit configuration.

namespace {Projects}.Api.Configuration;

public sealed class RequestLoggingOptions
{
    public const string SectionName = "RequestLogging";

    /// <summary>
    /// Maximum request/response body size (in bytes) to capture in logs.
    /// Bodies exceeding this limit are truncated. Default: 65,536 (64 KB).
    /// </summary>
    public int MaxBodySizeBytes { get; set; } = 65_536;

    /// <summary>
    /// Requests exceeding this duration (in milliseconds) are logged at Warning level.
    /// Default: 500ms.
    /// </summary>
    public int SlowRequestThresholdMs { get; set; } = 500;

    /// <summary>
    /// The HTTP header name used for correlation ID propagation.
    /// If the header is present on the incoming request, its value is reused;
    /// otherwise a new GUID is generated. The correlation ID is always returned
    /// in the response headers.
    /// </summary>
    public string CorrelationIdHeader { get; set; } = "X-Correlation-Id";

    /// <summary>
    /// JSON field names whose values should be replaced with a redaction placeholder
    /// before logging request/response bodies. Matching is case-insensitive.
    /// </summary>
    public List<string> SensitiveFields { get; set; } =
    [
        "password",
        "token",
        "secret",
        "authorization",
        "creditCard",
        "ssn",
        "accessToken",
        "refreshToken"
    ];

    /// <summary>
    /// Content types for which request/response body logging is enabled.
    /// Only JSON content types are included by default because
    /// <see cref="SensitiveDataRedactor"/> only redacts JSON payloads.
    /// Adding non-JSON types (XML, plain text) will cause sensitive data in
    /// those formats to be logged unredacted.
    /// </summary>
    public List<string> LoggableContentTypes { get; set; } =
    [
        "application/json"
    ];

    /// <summary>
    /// When true, request bodies are captured and logged.
    /// </summary>
    public bool EnableRequestBodyLogging { get; set; } = true;

    /// <summary>
    /// When true, response bodies are captured and logged.
    /// </summary>
    public bool EnableResponseBodyLogging { get; set; } = true;

    /// <summary>
    /// Request paths that should be excluded from logging entirely.
    /// Useful for high-frequency endpoints like health checks and readiness probes
    /// that would otherwise generate excessive log noise.
    /// </summary>
    public List<string> ExcludedPaths { get; set; } =
    [
        "/health"
    ];
}

src/{ProjectName}.Api/Middleware/SensitiveDataRedactor.cs

Redacts sensitive field values from JSON content before it reaches the logs. This prevents passwords, tokens, and PII from being stored in log aggregation systems.

Dual-strategy approach:

  1. JSON DOM path — for valid JSON, parses the content into a JsonNode tree and recursively replaces sensitive field values with ***REDACTED***. This handles nested objects and arrays reliably.
  2. Regex fallback — for invalid/truncated JSON (e.g., bodies that exceeded MaxBodySizeBytes), uses a pre-compiled regex that matches "sensitiveField": "value" patterns. The regex has a 100ms timeout to prevent ReDoS attacks.

using System.Text.Json;
using System.Text.Json.Nodes;
using System.Text.RegularExpressions;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using {Projects}.Api.Configuration;

namespace {Projects}.Api.Middleware;

/// <summary>
/// Redacts sensitive field values from content before it is written to logs.
/// Field names to redact are configured via <see cref="RequestLoggingOptions.SensitiveFields"/>.
/// For valid JSON, uses a DOM-based approach for reliable recursive redaction.
/// For invalid/truncated JSON, falls back to regex-based pattern matching.
/// </summary>
public sealed class SensitiveDataRedactor
{
    private const string RedactedPlaceholder = "***REDACTED***";

    private static readonly JsonSerializerOptions SerializerOptions = new() { WriteIndented = false };

    private readonly ILogger<SensitiveDataRedactor> _logger;
    private readonly HashSet<string> _sensitiveFields;
    private readonly Regex _fallbackRegex;

    public SensitiveDataRedactor(
        ILogger<SensitiveDataRedactor> logger,
        IOptions<RequestLoggingOptions> options)
    {
        _logger = logger;
        _sensitiveFields = new HashSet<string>(
            options.Value.SensitiveFields,
            StringComparer.OrdinalIgnoreCase);

        // Build a regex that matches JSON key-value pairs for any sensitive field name.
        // Pattern: "fieldName" : "..." — matches quoted string values only.
        // Non-string values (numbers, booleans, null) are not matched by this regex;
        // those are handled by the JSON DOM path for valid JSON.
        if (_sensitiveFields.Count > 0)
        {
            var escapedFields = _sensitiveFields.Select(Regex.Escape);
            var alternation = string.Join("|", escapedFields);

            var pattern = $"""
                (?<="(?:{alternation})"\s*:\s*)"(?:[^"\\]|\\.)*"
                """;

            _fallbackRegex = new Regex(
                pattern.Trim(),
                RegexOptions.IgnoreCase | RegexOptions.Compiled,
                matchTimeout: TimeSpan.FromMilliseconds(100));
        }
        else
        {
            // No sensitive fields configured — use a regex that never matches.
            _fallbackRegex = new Regex(
                "(?!)",
                RegexOptions.Compiled,
                matchTimeout: TimeSpan.FromMilliseconds(100));
        }
    }

    /// <summary>
    /// Redacts values of sensitive fields in the provided content.
    /// Attempts JSON DOM-based redaction first. If the content is not valid JSON
    /// (e.g. truncated bodies), falls back to regex-based pattern matching.
    /// </summary>
    public string Redact(string content)
    {
        if (string.IsNullOrWhiteSpace(content))
            return content;

        try
        {
            var node = JsonNode.Parse(content);

            if (node is null)
                return content;

            RedactNode(node);

            return node.ToJsonString(SerializerOptions);
        }
        catch (JsonException)
        {
            // Content is not valid JSON (e.g. truncated). Fall back to regex redaction.
            return RedactWithRegex(content);
        }
    }

    /// <summary>
    /// Regex-based fallback for redacting sensitive values in content that is not
    /// parseable as JSON (e.g. truncated bodies). Replaces quoted string values
    /// following sensitive field names with the redaction placeholder.
    /// </summary>
    private string RedactWithRegex(string content)
    {
        try
        {
            return _fallbackRegex.Replace(content, $"\"{RedactedPlaceholder}\"");
        }
        catch (RegexMatchTimeoutException)
        {
            _logger.LogWarning(
                "Sensitive data redaction regex timed out — body redacted entirely to prevent sensitive data exposure");
            return "[REDACTION FAILED — BODY SUPPRESSED]";
        }
    }

    private void RedactNode(JsonNode node)
    {
        switch (node)
        {
            case JsonObject jsonObject:
                RedactObject(jsonObject);
                break;

            case JsonArray jsonArray:
                RedactArray(jsonArray);
                break;
        }
    }

    private void RedactObject(JsonObject jsonObject)
    {
        var propertyNames = jsonObject.Select(p => p.Key).ToList();

        foreach (var name in propertyNames)
        {
            if (_sensitiveFields.Contains(name))
            {
                jsonObject[name] = RedactedPlaceholder;
                continue;
            }

            var child = jsonObject[name];

            if (child is not null)
                RedactNode(child);
        }
    }

    private void RedactArray(JsonArray jsonArray)
    {
        foreach (var element in jsonArray)
        {
            if (element is not null)
                RedactNode(element);
        }
    }
}
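
The redactor has no hard dependency on the ASP.NET Core pipeline, so it can be exercised in isolation. The following is an illustrative sketch (not part of the boilerplate) that assumes only framework types: NullLogger from Microsoft.Extensions.Logging.Abstractions and Options.Create, both of which ship with ASP.NET Core.

```csharp
// Illustrative sketch: exercising SensitiveDataRedactor without a DI container.
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using {Projects}.Api.Configuration;
using {Projects}.Api.Middleware;

var redactor = new SensitiveDataRedactor(
    NullLogger<SensitiveDataRedactor>.Instance,
    Options.Create(new RequestLoggingOptions()));

// Valid JSON takes the DOM path; "password" is in the default SensitiveFields list.
var redacted = redactor.Redact("""{"username":"alice","password":"hunter2"}""");
// redacted == {"username":"alice","password":"***REDACTED***"}

// Truncated JSON throws in JsonNode.Parse and falls back to the regex path;
// the quoted value following the sensitive field name is still redacted.
var partial = redactor.Redact("""{"username":"alice","password":"hunter2","other""");
```

The same construction pattern is handy in unit tests for the redactor, since neither dependency requires mocking.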

src/{ProjectName}.Api/Middleware/RequestLoggingMiddleware.cs

Comprehensive HTTP request/response logging middleware. This is the largest single file in the boilerplate (~450 lines) because it handles many concerns carefully:

  • Correlation ID tracking — accepts a client-supplied correlation ID (validated for safety) or generates a new GUID. The ID is pushed into Serilog's LogContext so all downstream log entries include it.
  • Request/response body capture — uses EnableBuffering() for the request stream and a MemoryStream swap for the response stream. Both are size-limited to MaxBodySizeBytes.
  • Sensitive data redaction — bodies are passed through SensitiveDataRedactor before logging.
  • Slow request warnings — requests exceeding SlowRequestThresholdMs trigger a warning-level log.
  • Path exclusion — health checks and other high-frequency endpoints can be excluded to reduce log noise.
  • Security hardening — correlation IDs are validated against a safe character set, client IPs are sanitized to prevent log injection, and UTF-8 truncation respects character boundaries.

using System.Buffers;
using System.Diagnostics;
using System.Text;
using System.Text.RegularExpressions;
using Microsoft.Extensions.Options;
using Serilog.Context;
using {Projects}.Api.Configuration;

namespace {Projects}.Api.Middleware;

/// <summary>
/// Middleware that provides comprehensive HTTP request/response logging including:
/// <list type="bullet">
///   <item>Correlation ID tracking (accept from header or generate)</item>
///   <item>Request and response body capture (with configurable size limits)</item>
///   <item>Sensitive data redaction in logged bodies</item>
///   <item>Slow-request performance warnings</item>
///   <item>Enriched Serilog LogContext (client IP, user agent, user identity, etc.)</item>
/// </list>
/// </summary>
public sealed partial class RequestLoggingMiddleware
{
    private const int MaxCorrelationIdLength = 128;

    private readonly RequestDelegate _next;
    private readonly ILogger<RequestLoggingMiddleware> _logger;
    private readonly RequestLoggingOptions _options;
    private readonly SensitiveDataRedactor _redactor;
    private readonly List<string> _excludedPathPrefixes;

    public RequestLoggingMiddleware(
        RequestDelegate next,
        ILogger<RequestLoggingMiddleware> logger,
        IOptions<RequestLoggingOptions> options,
        SensitiveDataRedactor redactor)
    {
        _next = next;
        _logger = logger;
        _options = options.Value;
        _redactor = redactor;
        _excludedPathPrefixes = _options.ExcludedPaths
            .Select(p => p.TrimEnd('/'))
            .ToList();
    }

    public async Task InvokeAsync(HttpContext context)
    {
        // Skip logging for excluded paths (prefix match, e.g. /health also covers /health/ready).
        var requestPath = context.Request.Path.ToString();

        if (IsExcludedPath(requestPath))
        {
            await _next(context);
            return;
        }

        var correlationId = GetOrCreateCorrelationId(context);
        context.Items["CorrelationId"] = correlationId;
        context.Response.OnStarting(() =>
        {
            context.Response.Headers[_options.CorrelationIdHeader] = correlationId;
            return Task.CompletedTask;
        });

        var clientIp = GetClientIp(context);
        var userAgent = context.Request.Headers.UserAgent.ToString();

        // Push enrichment properties into Serilog's LogContext so that all
        // downstream log entries within this request scope include them automatically.
        using (LogContext.PushProperty("CorrelationId", correlationId))
        using (LogContext.PushProperty("ClientIp", clientIp))
        using (LogContext.PushProperty("UserAgent", userAgent))
        using (LogContext.PushProperty("RequestMethod", context.Request.Method))
        using (LogContext.PushProperty("RequestPath", requestPath))
        using (LogContext.PushProperty("QueryString", context.Request.QueryString.ToString()))
        using (LogContext.PushProperty("UserIdentity", context.User.Identity?.Name ?? "anonymous"))
        {
            var requestBody = await CaptureRequestBodyAsync(context);

            _logger.LogDebug(
                "HTTP {RequestMethod} {RequestPath}{QueryString} started",
                context.Request.Method,
                context.Request.Path,
                context.Request.QueryString);

            if (requestBody.Content.Length > 0)
            {
                _logger.LogDebug(
                    "Request body: {RequestBody}",
                    FormatBodyForLog(requestBody));
            }

            // Only allocate and swap the response body stream when response body logging is enabled.
            Stream? originalBodyStream = null;
            MemoryStream? responseBodyStream = null;

            if (_options.EnableResponseBodyLogging)
            {
                originalBodyStream = context.Response.Body;
                responseBodyStream = new MemoryStream();
                context.Response.Body = responseBodyStream;
            }

            // Start timing just before calling the next middleware so the elapsed time
            // reflects actual pipeline processing, not request body capture overhead.
            var stopwatch = Stopwatch.StartNew();

            try
            {
                await _next(context);
            }
            finally
            {
                stopwatch.Stop();
                var elapsedMs = stopwatch.ElapsedMilliseconds;

                var responseBody = CapturedBody.Empty;

                if (_options.EnableResponseBodyLogging
                    && responseBodyStream is not null
                    && originalBodyStream is not null)
                {
                    responseBody = await CaptureResponseBodySafeAsync(
                        context, responseBodyStream, originalBodyStream);
                }

                _logger.LogInformation(
                    "HTTP {RequestMethod} {RequestPath} completed {StatusCode} in {ElapsedMs}ms",
                    context.Request.Method,
                    context.Request.Path,
                    context.Response.StatusCode,
                    elapsedMs);

                if (responseBody.Content.Length > 0)
                {
                    _logger.LogDebug(
                        "Response body: {ResponseBody}",
                        FormatBodyForLog(responseBody));
                }

                if (elapsedMs > _options.SlowRequestThresholdMs)
                {
                    _logger.LogWarning(
                        "Slow request detected: HTTP {RequestMethod} {RequestPath} took {ElapsedMs}ms (threshold: {ThresholdMs}ms)",
                        context.Request.Method,
                        context.Request.Path,
                        elapsedMs,
                        _options.SlowRequestThresholdMs);
                }

                if (responseBodyStream is not null)
                    await responseBodyStream.DisposeAsync();
            }
        }
    }

    /// <summary>
    /// Redacts the body content first, then appends the truncation marker if needed.
    /// This ensures sensitive fields are redacted even in truncated bodies.
    /// </summary>
    private string FormatBodyForLog(CapturedBody body)
    {
        var redacted = _redactor.Redact(body.Content);

        return body.IsTruncated
            ? $"{redacted} ... [TRUNCATED - body exceeds {_options.MaxBodySizeBytes} bytes]"
            : redacted;
    }

    private bool IsExcludedPath(string path)
    {
        foreach (var prefix in _excludedPathPrefixes)
        {
            if (path.Equals(prefix, StringComparison.OrdinalIgnoreCase)
                || path.StartsWith(prefix + "/", StringComparison.OrdinalIgnoreCase))
            {
                return true;
            }
        }

        return false;
    }

    private string GetOrCreateCorrelationId(HttpContext context)
    {
        if (context.Request.Headers.TryGetValue(_options.CorrelationIdHeader, out var existingId)
            && !string.IsNullOrWhiteSpace(existingId))
        {
            var candidate = existingId.ToString();

            if (candidate.Length <= MaxCorrelationIdLength && SafeCorrelationIdRegex().IsMatch(candidate))
                return candidate;

            // Invalid or oversized correlation ID from client; generate a new one.
        }

        return Guid.NewGuid().ToString();
    }

    private async Task<CapturedBody> CaptureRequestBodyAsync(HttpContext context)
    {
        if (!_options.EnableRequestBodyLogging)
            return CapturedBody.Empty;

        if (!IsLoggableContentType(context.Request.ContentType))
            return CapturedBody.Empty;

        context.Request.EnableBuffering();

        // Read the request body at the byte level. EnableBuffering() wraps the stream
        // so it supports seeking, allowing us to reset the position after reading.
        var body = await ReadStreamBytesAsync(context.Request.Body, _options.MaxBodySizeBytes);

        context.Request.Body.Position = 0;

        return body;
    }

    /// <summary>
    /// Captures the response body from the memory stream and copies it to the original
    /// response stream. Wrapped in a try/catch so that a failure here (e.g. the response
    /// has already started on the original stream) does not mask the original exception.
    /// </summary>
    private async Task<CapturedBody> CaptureResponseBodySafeAsync(
        HttpContext context,
        MemoryStream responseBodyStream,
        Stream originalBodyStream)
    {
        try
        {
            responseBodyStream.Position = 0;

            var body = CapturedBody.Empty;

            if (IsLoggableContentType(context.Response.ContentType))
            {
                body = await ReadStreamFromMemoryAsync(responseBodyStream, _options.MaxBodySizeBytes);
            }

            responseBodyStream.Position = 0;
            await responseBodyStream.CopyToAsync(originalBodyStream);
            context.Response.Body = originalBodyStream;

            return body;
        }
        catch (Exception ex)
        {
            _logger.LogDebug(ex, "Failed to capture response body for logging");

            // Best-effort: try to restore the original body stream so the client gets a response.
            try
            {
                context.Response.Body = originalBodyStream;
            }
            catch
            {
                // Nothing more we can do.
            }

            return CapturedBody.Empty;
        }
    }

    /// <summary>
    /// Reads a stream at the byte level, suitable for forward-only streams (e.g. request body)
    /// where <c>stream.Length</c> may not be available before the stream is consumed.
    /// Uses <see cref="ArrayPool{T}"/> to avoid large object heap allocations.
    /// The limit is enforced in bytes to match <c>MaxBodySizeBytes</c>.
    /// Truncation respects UTF-8 character boundaries.
    /// </summary>
    private static async Task<CapturedBody> ReadStreamBytesAsync(Stream stream, int maxBytes)
    {
        stream.Position = 0;

        // Read one extra byte to detect whether the stream has more data beyond the limit.
        var readLimit = maxBytes + 1;
        var buffer = ArrayPool<byte>.Shared.Rent(readLimit);

        try
        {
            var totalRead = 0;

            while (totalRead < readLimit)
            {
                var bytesRead = await stream.ReadAsync(buffer.AsMemory(totalRead, readLimit - totalRead));

                if (bytesRead == 0)
                    break;

                totalRead += bytesRead;
            }

            var isTruncated = totalRead > maxBytes;
            var usableBytes = Math.Min(totalRead, maxBytes);

            // Adjust the truncation point to avoid splitting a multi-byte UTF-8 sequence.
            if (isTruncated)
                usableBytes = FindUtf8SafeTruncationPoint(buffer, usableBytes);

            var content = Encoding.UTF8.GetString(buffer, 0, usableBytes);

            return new CapturedBody(content, isTruncated);
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }

    /// <summary>
    /// Reads a MemoryStream where <c>stream.Length</c> is reliable.
    /// Used for response body capture. Uses <see cref="ArrayPool{T}"/> to avoid
    /// per-request heap allocations.
    /// Truncation respects UTF-8 character boundaries.
    /// </summary>
    private static async Task<CapturedBody> ReadStreamFromMemoryAsync(MemoryStream stream, int maxBytes)
    {
        stream.Position = 0;

        var length = stream.Length;
        var bytesToRead = (int)Math.Min(maxBytes, length);
        var buffer = ArrayPool<byte>.Shared.Rent(bytesToRead);

        try
        {
            var bytesRead = await stream.ReadAsync(buffer.AsMemory(0, bytesToRead));

            var isTruncated = length > maxBytes;

            var usableBytes = bytesRead;

            // Adjust the truncation point to avoid splitting a multi-byte UTF-8 sequence.
            if (isTruncated)
                usableBytes = FindUtf8SafeTruncationPoint(buffer, bytesRead);

            var content = Encoding.UTF8.GetString(buffer, 0, usableBytes);

            return new CapturedBody(content, isTruncated);
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }

    /// <summary>
    /// Walks backwards from <paramref name="length"/> to find a byte position that
    /// does not split a multi-byte UTF-8 character. UTF-8 continuation bytes have the
    /// bit pattern <c>10xxxxxx</c> (0x80..0xBF). If the byte at the truncation point
    /// is a continuation byte, we step back until we reach the leading byte of that
    /// character and exclude the incomplete sequence.
    /// </summary>
    private static int FindUtf8SafeTruncationPoint(byte[] buffer, int length)
    {
        if (length == 0)
            return 0;

        // Walk backwards over any continuation bytes (10xxxxxx).
        var i = length - 1;
        while (i > 0 && (buffer[i] & 0xC0) == 0x80)
            i--;

        // i now points at a leading byte (or byte 0). Determine the expected
        // character length from the leading byte.
        var leadByte = buffer[i];
        int expectedCharBytes;

        if ((leadByte & 0x80) == 0)
            expectedCharBytes = 1;        // 0xxxxxxx — ASCII
        else if ((leadByte & 0xE0) == 0xC0)
            expectedCharBytes = 2;        // 110xxxxx
        else if ((leadByte & 0xF0) == 0xE0)
            expectedCharBytes = 3;        // 1110xxxx
        else if ((leadByte & 0xF8) == 0xF0)
            expectedCharBytes = 4;        // 11110xxx
        else
            return i;                     // Invalid leading byte — truncate before it.

        // If the full character fits within the buffer, keep it; otherwise drop it.
        return i + expectedCharBytes <= length ? length : i;
    }

    private bool IsLoggableContentType(string? contentType)
    {
        if (string.IsNullOrWhiteSpace(contentType))
            return false;

        return _options.LoggableContentTypes.Exists(
            ct => contentType.Contains(ct, StringComparison.OrdinalIgnoreCase));
    }

    private static string GetClientIp(HttpContext context)
    {
        // Check for forwarded headers first (reverse proxy scenarios).
        var forwardedFor = context.Request.Headers["X-Forwarded-For"].FirstOrDefault();

        if (!string.IsNullOrWhiteSpace(forwardedFor))
        {
            // X-Forwarded-For may contain multiple IPs; the first is the original client.
            var ip = forwardedFor.Split(',', StringSplitOptions.TrimEntries)[0];
            return SanitizeForLog(ip);
        }

        return context.Connection.RemoteIpAddress?.ToString() ?? "unknown";
    }

    /// <summary>
    /// Strips control characters and newlines from a string to prevent log injection.
    /// Limits length to avoid unbounded values in log output.
    /// </summary>
    private static string SanitizeForLog(string value)
    {
        const int maxLength = 45; // Max textual length of an IPv6 address (IPv4-mapped form)

        if (value.Length > maxLength)
            value = value[..maxLength];

        return LogSanitizeRegex().Replace(value, string.Empty);
    }

    /// <summary>
    /// Matches control characters, newlines, and other non-printable characters
    /// that could be used for log injection.
    /// </summary>
    [GeneratedRegex(@"[\x00-\x1F\x7F]")]
    private static partial Regex LogSanitizeRegex();

    /// <summary>
    /// Matches safe correlation ID values: alphanumeric characters, hyphens, underscores,
    /// periods, and colons. Rejects control characters, braces (Serilog template injection),
    /// and other unsafe characters.
    /// </summary>
    [GeneratedRegex(@"^[\w\-.:]+$")]
    private static partial Regex SafeCorrelationIdRegex();

    /// <summary>
    /// Represents a captured request or response body along with a flag indicating
    /// whether the body was truncated to fit within the configured size limit.
    /// Separating the content from the truncation flag allows the redactor to operate
    /// on valid (non-truncated) content before the truncation marker is appended.
    /// </summary>
    private readonly record struct CapturedBody(string Content, bool IsTruncated)
    {
        public static readonly CapturedBody Empty = new(string.Empty, false);
    }
}
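
Why the truncation-point search matters can be shown with a two-byte character. This is an illustrative sketch (not part of the boilerplate) using only System.Text:

```csharp
using System.Text;

var bytes = Encoding.UTF8.GetBytes("café");   // 5 bytes: 'c' 'a' 'f' 0xC3 0xA9 ('é' is two bytes)

// Truncating naively at 4 bytes splits the 'é' sequence; the dangling lead byte 0xC3
// is decoded as the U+FFFD replacement character.
var naive = Encoding.UTF8.GetString(bytes, 0, 4);   // "caf\uFFFD"

// FindUtf8SafeTruncationPoint would step back past the incomplete sequence
// (0xA9 has the continuation pattern 10xxxxxx), keeping only complete characters.
var safe = Encoding.UTF8.GetString(bytes, 0, 3);    // "caf"
```

This is why truncated request bodies in the logs never end in a mojibake character, even though the limit is enforced in bytes.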

src/{ProjectName}.Api/Middleware/GlobalExceptionHandler.cs

Catches all unhandled exceptions and returns a standardized ProblemDetails response. In development, the exception message is included for debugging; in production, a generic message is returned to avoid leaking internals.

using System.Net;
using Microsoft.AspNetCore.Diagnostics;
using Microsoft.AspNetCore.Mvc;

namespace {Projects}.Api.Middleware;

public sealed class GlobalExceptionHandler(ILogger<GlobalExceptionHandler> logger) : IExceptionHandler
{
    public async ValueTask<bool> TryHandleAsync(HttpContext httpContext, Exception exception, CancellationToken cancellationToken)
    {
        logger.LogError(exception, "An unhandled exception occurred: {Message}", exception.Message);

        var problemDetails = new ProblemDetails
        {
            Status = (int)HttpStatusCode.InternalServerError,
            Title = "An unexpected error occurred",
            Detail = httpContext.RequestServices.GetRequiredService<IHostEnvironment>().IsDevelopment() ? exception.Message : "An internal server error has occurred.",
            Instance = httpContext.Request.Path
        };

        httpContext.Response.StatusCode = problemDetails.Status.Value;
        httpContext.Response.ContentType = "application/problem+json";
        await httpContext.Response.WriteAsJsonAsync(problemDetails, cancellationToken: cancellationToken);
        return true;
    }
}
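
In production, the handler above would produce a response body along these lines (illustrative values; the instance path is hypothetical, and null ProblemDetails properties such as type are omitted from the JSON):

```json
{
  "status": 500,
  "title": "An unexpected error occurred",
  "detail": "An internal server error has occurred.",
  "instance": "/api/v1/orders"
}
```

The response is served with Content-Type: application/problem+json, as set in TryHandleAsync.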

src/{ProjectName}.Api/Extensions/ServiceCollectionExtensions.cs

The composition root's service registration. Wires together:

  • Serilog (reads config from appsettings.json)
  • Request logging options and redactor
  • Controllers, OpenAPI, exception handler, problem details
  • API versioning (URL path segment: /api/v1/)
  • Application layer (MediatR + behaviors + validators)
  • Infrastructure layer (EF Core + interceptors)
  • Health checks (PostgreSQL connectivity)

using Asp.Versioning;
using Serilog;
using {Projects}.Api.Configuration;
using {Projects}.Api.Middleware;
using {Projects}.Application;
using {Projects}.Infrastructure;

namespace {Projects}.Api.Extensions;

public static class ServiceCollectionExtensions
{
    public static WebApplicationBuilder AddServices(this WebApplicationBuilder builder)
    {
        builder.Host.UseSerilog((context, loggerConfiguration) =>
            loggerConfiguration.ReadFrom.Configuration(context.Configuration));

        builder.Services.Configure<RequestLoggingOptions>(
            builder.Configuration.GetSection(RequestLoggingOptions.SectionName));
        builder.Services.AddSingleton<SensitiveDataRedactor>();

        builder.Services.AddControllers();
        builder.Services.AddOpenApi();
        builder.Services.AddExceptionHandler<GlobalExceptionHandler>();
        builder.Services.AddProblemDetails();

        builder.Services.AddApiVersioning(options =>
        {
            options.DefaultApiVersion = new ApiVersion(1, 0);
            options.AssumeDefaultVersionWhenUnspecified = true;
            options.ReportApiVersions = true;
            options.ApiVersionReader = new UrlSegmentApiVersionReader();
        })
        .AddApiExplorer(options =>
        {
            options.GroupNameFormat = "'v'VVV";
            options.SubstituteApiVersionInUrl = true;
        });

        builder.Services.AddApplication();
        builder.Services.AddInfrastructure(builder.Configuration);

        var connectionString = builder.Configuration.GetConnectionString("DefaultConnection")
            ?? throw new InvalidOperationException(
                "Connection string 'DefaultConnection' is not configured.");

        builder.Services.AddHealthChecks()
            .AddNpgSql(connectionString);

        return builder;
    }
}

src/{ProjectName}.Api/Extensions/WebApplicationExtensions.cs

Configures the HTTP request pipeline (middleware order matters):

  1. OpenAPI endpoint (development only)
  2. RequestLoggingMiddleware — must come early to capture the full request lifecycle
  3. Exception handler — catches exceptions from downstream middleware
  4. HTTPS redirection
  5. Authorization
  6. Controller mapping
  7. Health check endpoint
using {Projects}.Api.Middleware;

namespace {Projects}.Api.Extensions;

public static class WebApplicationExtensions
{
    public static WebApplication ConfigurePipeline(this WebApplication app)
    {
        if (app.Environment.IsDevelopment())
            app.MapOpenApi();

        app.UseMiddleware<RequestLoggingMiddleware>();
        app.UseExceptionHandler();
        app.UseHttpsRedirection();
        app.UseAuthorization();
        app.MapControllers();
        app.MapHealthChecks("/health");

        return app;
    }
}

src/{ProjectName}.Api/Program.cs

The application entry point. Deliberately minimal — all setup is delegated to extension methods. The try/catch/finally ensures Serilog captures fatal startup errors and flushes all buffered log events on shutdown.

The public partial class Program; declaration at the end enables WebApplicationFactory<Program> in integration tests.

using Serilog;
using {Projects}.Api.Extensions;

var builder = WebApplication.CreateBuilder(args);

builder.AddServices();

var app = builder.Build();

app.ConfigurePipeline();

try
{
    Log.Information("Starting {Projects} API in {Environment} environment", app.Environment.EnvironmentName);
    app.Run();
}
catch (Exception ex)
{
    Log.Fatal(ex, "Application terminated unexpectedly");
}
finally
{
    Log.CloseAndFlush();
}

public partial class Program;

12. Test Projects

Test Strategy

The boilerplate scaffolds three test projects, each targeting a different layer and testing style:

| Project | Tests | Style |
| --- | --- | --- |
| {ProjectName}.Domain.Tests | Entities, value objects, Result pattern, domain logic | Pure unit tests — no mocks needed (Domain has zero dependencies) |
| {ProjectName}.Application.Tests | Command/query handlers, validators, pipeline behaviors | Unit tests with mocked ApplicationDbContext and IUnitOfWork |
| {ProjectName}.IntegrationTests | Full HTTP request/response cycle through the API | Integration tests using WebApplicationFactory<Program> with a real PostgreSQL instance |

Shared test tooling across all projects:

  • xUnit — test framework ([Fact], [Theory])
  • FluentAssertions — expressive assertions (result.Should().Be(...))
  • Moq — mocking framework for isolating dependencies
  • coverlet.collector — code coverage collection (used by CI pipeline)

Integration test additions:

  • Microsoft.AspNetCore.Mvc.Testing — provides WebApplicationFactory<Program> for in-process HTTP testing
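
A minimal integration test might look like the following. This is a hedged sketch, assuming the stock WebApplicationFactory setup; in practice the factory is usually subclassed to point the DbContext at a test database, and the test class name is hypothetical.

```csharp
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

// Hypothetical example: spins up the API in-process and hits the health endpoint
// mapped in WebApplicationExtensions (app.MapHealthChecks("/health")).
public sealed class HealthCheckTests(WebApplicationFactory<Program> factory)
    : IClassFixture<WebApplicationFactory<Program>>
{
    [Fact]
    public async Task HealthEndpoint_ReturnsSuccess()
    {
        var client = factory.CreateClient();

        var response = await client.GetAsync("/health");

        response.EnsureSuccessStatusCode();
    }
}
```

Note that the health check probes PostgreSQL connectivity, so this test requires a reachable database, which is exactly why the integration suite runs against a real PostgreSQL instance.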

Test File Naming Convention

Test files follow the pattern {ClassUnderTest}Tests.cs. For example:

  • ResultTests.cs tests Result.cs
  • ValidationBehaviorTests.cs tests ValidationBehavior.cs
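
For example, a ResultTests.cs case in the Domain test project might read as follows. This is a hypothetical sketch: the factory and property names (Success(), IsSuccess, IsFailure) are assumptions that must match your actual Result implementation.

```csharp
using FluentAssertions;
using Xunit;

public sealed class ResultTests
{
    [Fact]
    public void Success_ShouldBeSuccessful()
    {
        // Assumes Result exposes a static Success() factory and
        // complementary IsSuccess / IsFailure flags.
        var result = Result.Success();

        result.IsSuccess.Should().BeTrue();
        result.IsFailure.Should().BeFalse();
    }
}
```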

Project References

Each test project references only the layer it tests:

  • Domain.Tests → Domain
  • Application.Tests → Application (which transitively includes Domain)
  • IntegrationTests → Api (which transitively includes everything)

This mirrors the Clean Architecture dependency rule — test projects never reach across layers.


13. CI/CD Pipeline

The CI/CD pipeline is split into two conceptual sections:

  1. Standard CI (test, SonarCloud, Qodana) — reusable as-is across projects. Just update repository variables.
  2. Deployment (Docker build/push, SSH deploy) — project-specific. Customize the deployment target, Tailscale config, and SSH details.

qodana.yaml

Configuration for JetBrains Qodana static analysis. Points to the .slnx solution file and uses the starter inspection profile.

#-------------------------------------------------------------------------------#
#               Qodana analysis is configured by qodana.yaml file               #
#             https://www.jetbrains.com/help/qodana/qodana-yaml.html            #
#-------------------------------------------------------------------------------#

#################################################################################
#              WARNING: Do not store sensitive information in this file,        #
#               as its contents will be included in the Qodana report.          #
#################################################################################
version: "1.0"

#Specify IDE code to run analysis without container (Applied in CI/CD pipeline)
ide: QDNET

#Specify the .NET solution to analyze
dotnet:
  solution: {Projects}.slnx

#Specify inspection profile for code analysis
profile:
  name: qodana.starter

#Enable inspections
#include:
#  - name: <SomeEnabledInspectionId>

#Disable inspections
#exclude:
#  - name: <SomeDisabledInspectionId>
#    paths:
#      - <path/where/not/run/inspection>

#Execute shell command before Qodana execution (Applied in CI/CD pipeline)
#bootstrap: sh ./prepare-qodana.sh

#Install IDE plugins before Qodana execution (Applied in CI/CD pipeline)
#plugins:
#  - id: <plugin.id> #(plugin id can be found at https://plugins.jetbrains.com)

# Quality gate. Will fail the CI/CD pipeline if any condition is not met
# severityThresholds - configures maximum thresholds for different problem severities
# testCoverageThresholds - configures minimum code coverage on a whole project and newly added code
# Code Coverage is available in Ultimate and Ultimate Plus plans
#failureConditions:
#  severityThresholds:
#    any: 15
#    critical: 5
#  testCoverageThresholds:
#    fresh: 70
#    total: 50

.github/workflows/ci.yml

Full CI/CD pipeline with four jobs:

name: CI/CD Pipeline

on:
  push:
    branches: [main, dev]
  pull_request:
    branches: [main, dev]

env:
  DOTNET_VERSION: "10.0.x"
  JAVA_VERSION: "17"
  DOCKER_IMAGE: ${{ vars.DOCKERHUB_USERNAME }}/{projects}-api

Job 1: Build & Test

Runs on every push and PR. Spins up a PostgreSQL service container, restores, builds, and runs all tests. Test results and coverage reports are uploaded as artifacts.

jobs:
  # ──────────────────────────────────────────────
  # Job 1: Build, test, and collect coverage
  # ──────────────────────────────────────────────
  test:
    name: Build & Test
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:17-alpine
        env:
          POSTGRES_DB: {projects}_test
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U postgres"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}

      - name: Restore dependencies
        run: dotnet restore

      - name: Build solution
        run: dotnet build --no-restore --configuration Release

      - name: Run tests
        run: >-
          dotnet test
          --no-build
          --configuration Release
          --logger "trx;LogFileName=test-results.trx"
          --collect:"XPlat Code Coverage"
          --results-directory ./TestResults
        env:
          ConnectionStrings__DefaultConnection: "Host=localhost;Port=5432;Database={projects}_test;Username=postgres;Password=postgres"

      - name: Upload test results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: test-results
          path: ./TestResults
          retention-days: 7

Job 2: SonarCloud Analysis

Runs after tests pass. Performs static analysis and code coverage reporting via SonarCloud. Requires SONAR_TOKEN secret and SONAR_PROJECT_KEY / SONAR_ORGANIZATION_KEY variables.

  # ──────────────────────────────────────────────
  # Job 2: SonarCloud analysis
  # ──────────────────────────────────────────────
  sonar:
    name: SonarCloud Analysis
    runs-on: ubuntu-latest
    needs: test

    services:
      postgres:
        image: postgres:17-alpine
        env:
          POSTGRES_DB: {projects}_test
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U postgres"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}

      - name: Setup Java
        uses: actions/setup-java@v4
        with:
          distribution: "temurin"
          java-version: ${{ env.JAVA_VERSION }}

      - name: Install SonarScanner
        run: dotnet tool install --global dotnet-sonarscanner

      - name: Begin SonarCloud analysis
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        run: >-
          dotnet sonarscanner begin
          /k:"${{ vars.SONAR_PROJECT_KEY }}"
          /o:"${{ vars.SONAR_ORGANIZATION_KEY }}"
          /d:sonar.token="${{ secrets.SONAR_TOKEN }}"
          /d:sonar.host.url="https://sonarcloud.io"
          /d:sonar.cs.opencover.reportsPaths="**/TestResults/**/coverage.opencover.xml"
          /d:sonar.exclusions="**/obj/**,**/bin/**"
          /d:sonar.coverage.exclusions="**/obj/**,**/bin/**,**/Migrations/**"

      - name: Build solution
        run: dotnet build --configuration Release

      - name: Run tests with coverage
        run: >-
          dotnet test
          --no-build
          --configuration Release
          --collect:"XPlat Code Coverage;Format=opencover"
          --results-directory ./TestResults
        env:
          ConnectionStrings__DefaultConnection: "Host=localhost;Port=5432;Database={projects}_test;Username=postgres;Password=postgres"

      - name: End SonarCloud analysis
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        run: dotnet sonarscanner end /d:sonar.token="${{ secrets.SONAR_TOKEN }}"

      - name: Check SonarCloud Quality Gate
        uses: sonarsource/sonarqube-quality-gate-action@v1.2.0
        timeout-minutes: 5
        continue-on-error: true
        with:
          scanMetadataReportFile: .sonarqube/out/.sonar/report-task.txt
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

Job 3: Qodana Analysis

Runs in parallel with SonarCloud (both depend on test). Uses JetBrains Qodana for .NET static analysis. Requires the QODANA_TOKEN secret and write permissions (contents, pull-requests, checks) for PR annotations.

  # ──────────────────────────────────────────────
  # Job 3: Qodana analysis (parallel with SonarCloud)
  # ──────────────────────────────────────────────
  qodana:
    name: Qodana Analysis
    runs-on: ubuntu-latest
    needs: test

    permissions:
      contents: write
      pull-requests: write
      checks: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Qodana Scan
        uses: JetBrains/qodana-action@v2025.1
        env:
          QODANA_TOKEN: ${{ secrets.QODANA_TOKEN }}

Job 4: Deploy to Server

Only runs on pushes to dev (not PRs, not main). Builds a Docker image, pushes it to Docker Hub, then deploys to a remote server via SSH over Tailscale. This job is project-specific — customize the deployment target, Tailscale OAuth credentials, and SSH details.

  # ──────────────────────────────────────────────
  # Job 4: Build, push, and deploy
  # ──────────────────────────────────────────────
  deploy:
    name: Deploy to Server
    runs-on: ubuntu-latest
    needs: [sonar, qodana]
    if: github.ref == 'refs/heads/dev' && github.event_name == 'push'
    environment: Development

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ vars.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build and push Docker image
        uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: |
            ${{ env.DOCKER_IMAGE }}:latest
            ${{ env.DOCKER_IMAGE }}:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Connect to Tailscale
        uses: tailscale/github-action@v4
        with:
          oauth-client-id: ${{ vars.TS_OAUTH_CLIENT_ID }}
          oauth-secret: ${{ secrets.TS_OAUTH_SECRET }}
          tags: tag:ci

      - name: Copy compose file to server
        uses: appleboy/scp-action@v1.0.0
        with:
          host: ${{ vars.SSH_HOST }}
          username: ${{ vars.SSH_USERNAME }}
          key: ${{ secrets.SSH_KEY }}
          port: ${{ vars.SSH_PORT }}
          source: "compose.yml,.env.example"
          target: ${{ vars.DEPLOY_PATH }}

      - name: Deploy via SSH
        uses: appleboy/ssh-action@v1.2.0
        with:
          host: ${{ vars.SSH_HOST }}
          username: ${{ vars.SSH_USERNAME }}
          key: ${{ secrets.SSH_KEY }}
          port: ${{ vars.SSH_PORT }}
          envs: DOCKER_IMAGE,IMAGE_TAG
          script: |
            cd ${{ vars.DEPLOY_PATH }}
            if [ ! -f .env ]; then
              cp .env.example .env
              chmod 600 .env
            fi

            # Update DOCKER_IMAGE and IMAGE_TAG in .env with values from CI
            sed -i "s|^DOCKER_IMAGE=.*|DOCKER_IMAGE=${DOCKER_IMAGE}|" .env
            sed -i "s|^IMAGE_TAG=.*|IMAGE_TAG=${IMAGE_TAG}|" .env

            docker compose pull
            docker compose up -d
            sleep 10

            # Verify all expected services are running
            EXPECTED=2
            RUNNING=$(docker compose ps --status running --quiet | wc -l)
            if [ "$RUNNING" -lt "$EXPECTED" ]; then
              echo "Expected $EXPECTED services, found $RUNNING running"
              docker compose logs --tail=50
              exit 1
            fi
        env:
          DOCKER_IMAGE: ${{ env.DOCKER_IMAGE }}
          IMAGE_TAG: ${{ github.sha }}
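The sed substitutions in the deploy step can be dry-run locally against a throwaway file before trusting them on a server. A standalone sketch (the file path and image values are illustrative, not from the boilerplate):

```shell
#!/bin/sh
# Build a sample .env resembling the one copied from .env.example
TMP_ENV=$(mktemp)
cat > "$TMP_ENV" <<'EOF'
DOCKER_IMAGE=placeholder/image
IMAGE_TAG=latest
POSTGRES_PASSWORD=change-me
EOF

# The same in-place substitutions the CI deploy script performs
DOCKER_IMAGE="acme/demo-api"
IMAGE_TAG="3f2c1ab"
sed -i "s|^DOCKER_IMAGE=.*|DOCKER_IMAGE=${DOCKER_IMAGE}|" "$TMP_ENV"
sed -i "s|^IMAGE_TAG=.*|IMAGE_TAG=${IMAGE_TAG}|" "$TMP_ENV"

# Only the two targeted keys change; unrelated keys are untouched
cat "$TMP_ENV"
```

Note the `|` delimiter in the sed expressions: image names contain `/`, so the usual `s/…/…/` form would break. GNU sed's `-i` is assumed here; on BSD/macOS sed, write `sed -i ''` instead.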

Required GitHub Secrets and Variables

Secrets:

  • SONAR_TOKEN — SonarCloud authentication token
  • QODANA_TOKEN — Qodana Cloud authentication token
  • DOCKERHUB_TOKEN — Docker Hub access token
  • SSH_KEY — Private SSH key for the deployment server
  • TS_OAUTH_SECRET — Tailscale OAuth secret

Variables:

  • DOCKERHUB_USERNAME — Docker Hub username
  • SONAR_PROJECT_KEY — SonarCloud project key
  • SONAR_ORGANIZATION_KEY — SonarCloud organization key
  • SSH_HOST — Deployment server hostname (Tailscale IP)
  • SSH_USERNAME — SSH username on the deployment server
  • SSH_PORT — SSH port on the deployment server
  • TS_OAUTH_CLIENT_ID — Tailscale OAuth client ID
  • DEPLOY_PATH — Path on the server where the app is deployed

14. AI-Assisted Development

.github/copilot-instructions.md

This file provides project-specific context to GitHub Copilot (and other AI assistants that read it). It documents:

  • Git commit conventions — execute commits directly, no Co-authored-by trailers
  • Build & run commands — dotnet build, dotnet run, dotnet test, EF Core migrations
  • Architecture — Clean Architecture dependency flow, layer responsibilities
  • Key conventions — CQRS with MediatR, Result pattern, Domain layer rules
  • Testing conventions — xUnit, Moq, FluentAssertions, WebApplicationFactory<Program>
  • Tech stack — .NET 10, PostgreSQL, MediatR 14, FluentValidation 12, Serilog

This file is automatically picked up by Copilot in VS Code and GitHub.com, providing contextual suggestions that align with the project's architecture and conventions.
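A skeleton of such a file, matching the bullets above (the content shown is illustrative — tailor each section to your project):

```markdown
# Copilot Instructions

## Git
- Execute commits directly; do not add Co-authored-by trailers.

## Build & Run
- Build: `dotnet build`
- Run API: `dotnet run --project src/{Projects}.Api`
- Test: `dotnet test`

## Architecture
- Clean Architecture: Api -> Infrastructure -> Application -> Domain.
- CQRS via MediatR; handlers return the Result pattern instead of throwing for control flow.

## Testing
- xUnit + Moq + FluentAssertions; integration tests via WebApplicationFactory<Program>.
```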


15. Running the Project

Local Development (without Docker)

# Start PostgreSQL (via Docker or locally)
docker compose up db -d

# Run the API
dotnet run --project src/{Projects}.Api

# API available at http://localhost:5212
# Health check: http://localhost:5212/health
# OpenAPI spec: http://localhost:5212/openapi/v1.json (dev only)

Docker Compose (full stack)

# Build and start everything
docker compose up --build -d

# API available at http://localhost:5212
# Health check: http://localhost:5212/health

Running Tests

# All tests
dotnet test

# Specific test project
dotnet test tests/{Projects}.Domain.Tests
dotnet test tests/{Projects}.Application.Tests
dotnet test tests/{Projects}.IntegrationTests

# Single test by name
dotnet test --filter "FullyQualifiedName~MyTestMethod"

EF Core Migrations

# Add a new migration
dotnet ef migrations add <MigrationName> \
  --project src/{Projects}.Infrastructure \
  --startup-project src/{Projects}.Api

# Apply migrations
dotnet ef database update \
  --project src/{Projects}.Infrastructure \
  --startup-project src/{Projects}.Api

Appendix A: Outbox Pattern (Optional)

This section is an optional enhancement. The boilerplate uses EF Core's DomainEventInterceptor to dispatch domain events in-process during SaveChanges. The outbox pattern is the next evolution — use it when you need guaranteed delivery of domain events to external systems (message brokers, other microservices) with at-least-once semantics.

The Problem

The current DomainEventInterceptor dispatches events via MediatR inside the SaveChanges pipeline. This works well for in-process handlers (updating read models, sending notifications, etc.), but has two limitations:

  1. No guarantee of delivery to external systems — if the application crashes after SaveChanges but before an external message is sent, the event is lost.
  2. Coupling to the transaction boundary — if an event handler calls an external API or publishes to a message broker, you're mixing I/O with the database transaction.

The Outbox Pattern Solution

Instead of dispatching events immediately, write them as rows in an OutboxMessages table within the same database transaction as the business data. A background job (Quartz.NET, which is already included in this boilerplate) polls the table and publishes events to external consumers.

Flow:

  1. Handler modifies entity + raises domain event
  2. SaveChanges interceptor serializes domain events into OutboxMessages table (same transaction)
  3. Transaction commits — business data and outbox messages are atomically consistent
  4. Quartz.NET background job polls OutboxMessages, publishes to message broker, marks as processed
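In PostgreSQL terms, the table backing this flow looks roughly like the DDL below. This is a hypothetical sketch for orientation — in the boilerplate you would create it through an EF Core migration rather than raw SQL:

```sql
CREATE TABLE "OutboxMessages" (
    "Id"             uuid        NOT NULL PRIMARY KEY,
    "Type"           text        NOT NULL,  -- event CLR type name
    "Content"        text        NOT NULL,  -- serialized event payload (JSON)
    "OccurredOnUtc"  timestamptz NOT NULL,
    "ProcessedOnUtc" timestamptz NULL,
    "Error"          text        NULL
);

-- Partial index so the polling job scans only unprocessed rows
CREATE INDEX "IX_OutboxMessages_Unprocessed"
    ON "OutboxMessages" ("OccurredOnUtc")
    WHERE "ProcessedOnUtc" IS NULL;
```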

Implementation Sketch

1. Outbox entity (Domain or Infrastructure layer):

public sealed class OutboxMessage
{
    public Guid Id { get; init; }
    public string Type { get; init; } = string.Empty;     // Event CLR type name
    public string Content { get; init; } = string.Empty;   // Serialized event payload (JSON)
    public DateTime OccurredOnUtc { get; init; }
    public DateTime? ProcessedOnUtc { get; set; }
    public string? Error { get; set; }
}

2. Modified interceptor (writes to outbox instead of dispatching):

// In SaveChangesInterceptor, replace MediatR Publish with:
var outboxMessages = domainEvents.Select(e => new OutboxMessage
{
    Id = Guid.NewGuid(),
    Type = e.GetType().Name,
    Content = JsonConvert.SerializeObject(e, new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.All
    }),
    OccurredOnUtc = DateTime.UtcNow
});

dbContext.Set<OutboxMessage>().AddRange(outboxMessages);
// Events are persisted in the same SaveChanges transaction

3. Quartz.NET background job:

[DisallowConcurrentExecution]
public sealed class ProcessOutboxMessagesJob(
    ApplicationDbContext dbContext,
    IPublisher publisher) : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        var messages = await dbContext.Set<OutboxMessage>()
            .Where(m => m.ProcessedOnUtc == null)
            .OrderBy(m => m.OccurredOnUtc)
            .Take(20)
            .ToListAsync(context.CancellationToken);

        foreach (var message in messages)
        {
            try
            {
                var domainEvent = JsonConvert.DeserializeObject<IDomainEvent>(
                    message.Content,
                    new JsonSerializerSettings { TypeNameHandling = TypeNameHandling.All });

                if (domainEvent is not null)
                    await publisher.Publish(domainEvent, context.CancellationToken);

                message.ProcessedOnUtc = DateTime.UtcNow;
            }
            catch (Exception ex)
            {
                message.Error = ex.ToString();
            }
        }

        await dbContext.SaveChangesAsync(context.CancellationToken);
    }
}
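The job above still has to be scheduled. With Quartz's dependency-injection integration, registration might look like the following sketch inside the Infrastructure layer's service setup — the 10-second polling interval is an arbitrary choice, not a value from the boilerplate:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Quartz;

services.AddQuartz(configurator =>
{
    var jobKey = new JobKey(nameof(ProcessOutboxMessagesJob));

    configurator.AddJob<ProcessOutboxMessagesJob>(job => job.WithIdentity(jobKey));

    configurator.AddTrigger(trigger => trigger
        .ForJob(jobKey)
        .WithSimpleSchedule(schedule => schedule
            .WithIntervalInSeconds(10)   // polling interval — tune for latency vs. load
            .RepeatForever()));
});

// Hosts the scheduler as a background service
services.AddQuartzHostedService();
```

Together with [DisallowConcurrentExecution] on the job, this guarantees only one poll runs at a time, so the Take(20) batch cannot be double-published within a single instance.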

When to Adopt This

  • You're publishing events to a message broker (RabbitMQ, Azure Service Bus, Kafka)
  • You need at-least-once delivery guarantees
  • You're in a microservices architecture where services communicate via events

For purely in-process event handling (the common case in a monolith), the existing DomainEventInterceptor approach is simpler and sufficient.

scaffold.ps1
param(
    [string]$ProjectName,
    [ValidateSet("slnx", "sln")]
    [string]$Format = "slnx",
    [switch]$SkipPackages = $false
)

# --- 1. Setup & Validation ---
if ([string]::IsNullOrWhiteSpace($ProjectName)) {
    $ProjectName = Read-Host "Please enter a project name"
}

if ([string]::IsNullOrWhiteSpace($ProjectName)) {
    Write-Host "❌ Name required." -ForegroundColor Red; exit 1
}

if (Test-Path $ProjectName) {
    Write-Host "❌ Directory '$ProjectName' already exists. Aborting to prevent overwrite." -ForegroundColor Red; exit 1
}

$SlnFile = "$ProjectName.$Format"
Write-Host "🚀 Scaffolding Clean Architecture (DDD) for: $ProjectName" -ForegroundColor Cyan

# Create Root Directory
New-Item -ItemType Directory -Path $ProjectName | Out-Null
Set-Location $ProjectName

# --- 2. Smart SDK Detection ---
$LatestSdk = dotnet --list-sdks | Select-Object -Last 1 | ForEach-Object { $_.Split(' ')[0] }

if ($LatestSdk) {
    Write-Host "ℹ️ Detected SDK: $LatestSdk. Pinning global.json..." -ForegroundColor Gray
    dotnet new globaljson --sdk-version $LatestSdk --roll-forward latestFeature
}

dotnet new gitignore

# --- 3. Create Solution & Fix NuGet ---
if ($Format -eq "slnx") { dotnet new sln -n $ProjectName --format slnx }
else { dotnet new sln -n $ProjectName }

Write-Host "📦 Configuring NuGet sources..." -ForegroundColor Cyan
dotnet new nugetconfig --force
$CurrentSources = dotnet nuget list source --configfile "nuget.config"
if ($CurrentSources -notmatch "nuget.org") {
    dotnet nuget add source "https://api.nuget.org/v3/index.json" -n "nuget.org" --configfile "nuget.config"
}

# --- 4. Create Projects ---
Write-Host "🔨 Creating projects..." -ForegroundColor Cyan

# Source Projects
dotnet new classlib -n "$ProjectName.Domain" -o "src/$ProjectName.Domain"
dotnet new classlib -n "$ProjectName.Application" -o "src/$ProjectName.Application"
dotnet new classlib -n "$ProjectName.Infrastructure" -o "src/$ProjectName.Infrastructure"
dotnet new webapi -n "$ProjectName.Api" -o "src/$ProjectName.Api" --use-controllers

# Test Projects
dotnet new xunit -n "$ProjectName.Domain.Tests" -o "tests/$ProjectName.Domain.Tests"
dotnet new xunit -n "$ProjectName.Application.Tests" -o "tests/$ProjectName.Application.Tests"
dotnet new xunit -n "$ProjectName.IntegrationTests" -o "tests/$ProjectName.IntegrationTests"

# --- 4.1 CLEANUP BOILERPLATE ---
Write-Host "🧹 Removing default template files..." -ForegroundColor Cyan

$FilesToRemove = @(
    "src/$ProjectName.Domain/Class1.cs",
    "src/$ProjectName.Application/Class1.cs",
    "src/$ProjectName.Infrastructure/Class1.cs",
    "tests/$ProjectName.Domain.Tests/UnitTest1.cs",
    "tests/$ProjectName.Application.Tests/UnitTest1.cs",
    "tests/$ProjectName.IntegrationTests/UnitTest1.cs",
    "src/$ProjectName.Api/WeatherForecast.cs",
    "src/$ProjectName.Api/Controllers/WeatherForecastController.cs",
    "src/$ProjectName.Api/$ProjectName.Api.http"
)

foreach ($File in $FilesToRemove) {
    if (Test-Path $File) {
        Remove-Item $File -Force
    }
}

# --- 5. Add to Solution ---
Write-Host "📂 Organizing solution structure..." -ForegroundColor Cyan

# Add Src
dotnet sln $SlnFile add "src/$ProjectName.Domain/$ProjectName.Domain.csproj" -s "src"
dotnet sln $SlnFile add "src/$ProjectName.Application/$ProjectName.Application.csproj" -s "src"
dotnet sln $SlnFile add "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" -s "src"
dotnet sln $SlnFile add "src/$ProjectName.Api/$ProjectName.Api.csproj" -s "src"

# Add Tests
dotnet sln $SlnFile add "tests/$ProjectName.Domain.Tests/$ProjectName.Domain.Tests.csproj" -s "tests"
dotnet sln $SlnFile add "tests/$ProjectName.Application.Tests/$ProjectName.Application.Tests.csproj" -s "tests"
dotnet sln $SlnFile add "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" -s "tests"

# --- 6. Add References ---
Write-Host "🔗 Wiring up dependencies..." -ForegroundColor Cyan

# Application -> Domain
dotnet add "src/$ProjectName.Application/$ProjectName.Application.csproj" reference "src/$ProjectName.Domain/$ProjectName.Domain.csproj"

# Infrastructure -> Application & Domain
dotnet add "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" reference "src/$ProjectName.Application/$ProjectName.Application.csproj"
dotnet add "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" reference "src/$ProjectName.Domain/$ProjectName.Domain.csproj"

# API -> Application & Infrastructure
dotnet add "src/$ProjectName.Api/$ProjectName.Api.csproj" reference "src/$ProjectName.Application/$ProjectName.Application.csproj"
dotnet add "src/$ProjectName.Api/$ProjectName.Api.csproj" reference "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj"

# Tests
dotnet add "tests/$ProjectName.Domain.Tests/$ProjectName.Domain.Tests.csproj" reference "src/$ProjectName.Domain/$ProjectName.Domain.csproj"

dotnet add "tests/$ProjectName.Application.Tests/$ProjectName.Application.Tests.csproj" reference "src/$ProjectName.Application/$ProjectName.Application.csproj"
# NOTE: Application tests usually need Domain too for entities
dotnet add "tests/$ProjectName.Application.Tests/$ProjectName.Application.Tests.csproj" reference "src/$ProjectName.Domain/$ProjectName.Domain.csproj"

dotnet add "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" reference "src/$ProjectName.Api/$ProjectName.Api.csproj"
dotnet add "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" reference "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj"
dotnet add "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" reference "src/$ProjectName.Application/$ProjectName.Application.csproj"
dotnet add "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" reference "src/$ProjectName.Domain/$ProjectName.Domain.csproj"

# --- 7. Install Nuget Packages (Optional) ---
if (-not $SkipPackages) {
    Write-Host "📦 Installing standard Clean Architecture packages..." -ForegroundColor Cyan

    function Add-Package {
        param ($Project, $Package)
        Write-Host " + Adding $Package..." -ForegroundColor Gray
        dotnet add $Project package $Package
    }

    # Application Layer
    Add-Package "src/$ProjectName.Application/$ProjectName.Application.csproj" "MediatR"
    Add-Package "src/$ProjectName.Application/$ProjectName.Application.csproj" "FluentValidation"
    Add-Package "src/$ProjectName.Application/$ProjectName.Application.csproj" "FluentValidation.DependencyInjectionExtensions"
    Add-Package "src/$ProjectName.Application/$ProjectName.Application.csproj" "Microsoft.Extensions.Logging.Abstractions"

    # Infrastructure Layer
    Add-Package "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" "Microsoft.EntityFrameworkCore"
    Add-Package "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" "Microsoft.EntityFrameworkCore.SqlServer"
    Add-Package "src/$ProjectName.Infrastructure/$ProjectName.Infrastructure.csproj" "Microsoft.EntityFrameworkCore.Design"

    # API Layer
    Add-Package "src/$ProjectName.Api/$ProjectName.Api.csproj" "Microsoft.EntityFrameworkCore.Tools"

    # Test Projects
    $TestProjects = @(
        "tests/$ProjectName.Domain.Tests/$ProjectName.Domain.Tests.csproj",
        "tests/$ProjectName.Application.Tests/$ProjectName.Application.Tests.csproj",
        "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj"
    )
    foreach ($proj in $TestProjects) {
        Add-Package $proj "FluentAssertions"
        Add-Package $proj "Moq"
    }

    Add-Package "tests/$ProjectName.IntegrationTests/$ProjectName.IntegrationTests.csproj" "Microsoft.AspNetCore.Mvc.Testing"
}

# --- 8. Final Verification ---
Write-Host "🏗️ Verifying build..." -ForegroundColor Cyan
dotnet build
if ($LASTEXITCODE -eq 0) {
    Write-Host "$ProjectName scaffolded successfully!" -ForegroundColor Green
    Write-Host " 👉 cd $ProjectName" -ForegroundColor Gray
}
scaffold.sh
1#!/bin/bash
2
3# Usage: ./scaffold.sh MyProjectName [slnx|sln] [true|false for packages]
4# Example: ./scaffold.sh MyCleanApp slnx
5
6# --- 1. Setup & Validation ---
7if [ -z "$1" ]; then
8 echo "❌ Please provide a project name."
9 echo "Usage: ./scaffold.sh MyProjectName [slnx|sln]"
10 exit 1
11fi
12
13PROJECT_NAME=$1
14FORMAT=${2:-slnx}
15INSTALL_PACKAGES=${3:-true} # Default to true
16SLN_FILE="$PROJECT_NAME.$FORMAT"
17
18# Check if directory exists to avoid overwriting
19if [ -d "$PROJECT_NAME" ]; then
20 echo "❌ Directory '$PROJECT_NAME' already exists. Aborting."
21 exit 1
22fi
23
24echo "🚀 Scaffolding $FORMAT solution for: $PROJECT_NAME (Clean Architecture + DDD)"
25
26# Create Root Directory and enter it
27mkdir "$PROJECT_NAME"
28cd "$PROJECT_NAME" || exit
29
30# --- 2. Smart SDK Detection ---
31# Get the latest installed SDK version (e.g. 10.0.102)
32LATEST_SDK=$(dotnet --list-sdks | tail -n 1 | awk '{print $1}')
33
34if [ -n "$LATEST_SDK" ]; then
35 echo "ℹ️ Detected SDK: $LATEST_SDK. Pinning global.json..."
36 dotnet new globaljson --sdk-version "$LATEST_SDK" --roll-forward latestFeature
37else
38 echo "⚠️ No SDK detected. Skipping global.json."
39fi
40
41dotnet new gitignore
42
43# --- 3. Create Solution & Fix NuGet ---
44if [ "$FORMAT" == "slnx" ]; then
45 dotnet new sln -n "$PROJECT_NAME" --format slnx
46else
47 dotnet new sln -n "$PROJECT_NAME"
48fi
49
50# [FIX] Create a local nuget.config to ensure we can find packages
51echo "📦 Configuring NuGet sources..."
52dotnet new nugetconfig --force
53
54# Check if nuget.org is already there
55if ! dotnet nuget list source --configfile "nuget.config" | grep -q "nuget.org"; then
56 dotnet nuget add source "https://api.nuget.org/v3/index.json" -n "nuget.org" --configfile "nuget.config"
57fi
58
59# --- 4. Create Projects ---
60echo "🔨 Creating projects..."
61dotnet new classlib -n "$PROJECT_NAME.Domain" -o "src/$PROJECT_NAME.Domain"
62dotnet new classlib -n "$PROJECT_NAME.Application" -o "src/$PROJECT_NAME.Application"
63dotnet new classlib -n "$PROJECT_NAME.Infrastructure" -o "src/$PROJECT_NAME.Infrastructure"
64dotnet new webapi -n "$PROJECT_NAME.Api" -o "src/$PROJECT_NAME.Api" --use-controllers
65
66mkdir -p tests
67dotnet new xunit -n "$PROJECT_NAME.Domain.Tests" -o "tests/$PROJECT_NAME.Domain.Tests"
68dotnet new xunit -n "$PROJECT_NAME.Application.Tests" -o "tests/$PROJECT_NAME.Application.Tests"
69dotnet new xunit -n "$PROJECT_NAME.IntegrationTests" -o "tests/$PROJECT_NAME.IntegrationTests"
70
71# --- 4.1 CLEANUP BOILERPLATE ---
72echo "🧹 Removing default template files..."
73
74# Define array of files to remove relative to solution root
75FILES_TO_REMOVE=(
76 "src/$PROJECT_NAME.Domain/Class1.cs"
77 "src/$PROJECT_NAME.Application/Class1.cs"
78 "src/$PROJECT_NAME.Infrastructure/Class1.cs"
79 "tests/$PROJECT_NAME.Domain.Tests/UnitTest1.cs"
80 "tests/$PROJECT_NAME.Application.Tests/UnitTest1.cs"
81 "tests/$PROJECT_NAME.IntegrationTests/UnitTest1.cs"
82 "src/$PROJECT_NAME.Api/WeatherForecast.cs"
83 "src/$PROJECT_NAME.Api/Controllers/WeatherForecastController.cs"
84 "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.http"
85)
86
87for file in "${FILES_TO_REMOVE[@]}"; do
88 if [ -f "$file" ]; then
89 rm "$file"
90 echo " - Deleted $file"
91 fi
92done
93
94# --- 5. Add to Solution (With Visual Folders) ---
95echo "📂 Organizing solution structure..."
96# Source
97dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj" -s "src"
98dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" -s "src"
99dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" -s "src"
100dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" -s "src"
101
102# Tests
dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj" -s "tests"
dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" -s "tests"
dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" -s "tests"

# --- 6. Add References ---
echo "🔗 Wiring up dependencies..."

# Application -> Domain
dotnet add "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

# Infrastructure -> Application AND Domain
dotnet add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

# API -> Application AND Infrastructure
dotnet add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" reference "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj"

# Tests
dotnet add "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

# --- 7. Install NuGet Packages (Optional) ---
if [ "$INSTALL_PACKAGES" = true ]; then
    echo "📦 Installing standard Clean Architecture packages..."

    # Helper function to add packages safely
    add_package() {
        local proj=$1
        local pkg=$2
        echo "  + Adding $pkg..."
        dotnet add "$proj" package "$pkg" > /dev/null 2>&1
        if [ $? -ne 0 ]; then
            echo "  ⚠️ Failed to add $pkg. Check your network connection."
        fi
    }

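The `add_package` helper above silences `dotnet`'s output and then branches on the exit status left in `$?`. The same pattern can be sketched with a stand-in command so it runs without the .NET SDK (`try_cmd` is a hypothetical name, not part of the script):

```shell
# Sketch of add_package's error handling: run a command quietly,
# then warn if its exit status is non-zero.
try_cmd() {
    "$@" > /dev/null 2>&1
    if [ $? -ne 0 ]; then
        echo "warn: '$*' failed"
    fi
}

try_cmd true    # succeeds: prints nothing
try_cmd false   # fails: prints the warning
```

Swallowing stderr keeps the scaffold log readable, at the cost of hiding the real NuGet error; dropping the `2>&1` is a reasonable tweak while debugging.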
    # Application Layer
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "MediatR"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "FluentValidation"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "FluentValidation.DependencyInjectionExtensions"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "Microsoft.Extensions.Logging.Abstractions"

    # Infrastructure Layer
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "Microsoft.EntityFrameworkCore"
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "Microsoft.EntityFrameworkCore.SqlServer"
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "Microsoft.EntityFrameworkCore.Design"

    # API Layer
    add_package "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" "Microsoft.EntityFrameworkCore.Tools"

    # Tests
    TEST_PROJECTS=(
        "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj"
        "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj"
        "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj"
    )

    for proj in "${TEST_PROJECTS[@]}"; do
        add_package "$proj" "FluentAssertions"
        add_package "$proj" "Moq"
    done

    add_package "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" "Microsoft.AspNetCore.Mvc.Testing"
fi

# --- 8. Final Verification ---
echo "🏗️ Verifying build..."
dotnet build
if [ $? -eq 0 ]; then
    echo "$PROJECT_NAME scaffolded successfully!"
    echo "👉 cd $PROJECT_NAME"
else
    echo "⚠️ Scaffolding finished, but build failed. Run 'dotnet restore' manually."
fi
scaffold_project.sh
#!/bin/bash

# --- 0. Determine Project Name ---
# Grab the name of the current directory
PROJECT_NAME=$(basename "$PWD")

echo "📂 Using current directory name as project name: $PROJECT_NAME"

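The script uses the directory name verbatim, but directory names with spaces or hyphens make awkward .NET project and namespace names. A hypothetical hardening step (not in the script; `sanitize_name` is an illustrative helper) could normalize them:

```shell
# Hypothetical hardening: derive a project name from a path and replace
# spaces and hyphens, which .NET tooling handles poorly in project names.
sanitize_name() {
    basename "$1" | tr ' -' '__'
}

sanitize_name "/home/dev/my cool-app"   # my_cool_app
```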
# --- 1. Configuration ---
FORMAT="slnx"          # Options: "sln" or "slnx"
INSTALL_PACKAGES=true  # Set to false to skip NuGet package installation

echo "🚀 Initializing $PROJECT_NAME (Format: $FORMAT)..."

# --- 1.5 Database Selection ---
echo ""
echo "🗄️ Select your Database Provider for $PROJECT_NAME:"
echo "   1) PostgreSQL (Npgsql) [Default]"
echo "   2) SQL Server"
echo "   3) SQLite"
# Reading from /dev/tty keeps the prompt interactive even when the script
# itself is piped into bash via curl (stdin carries the script, not keystrokes)
read -p "Enter choice [1-3] (Default: 1): " DB_CHOICE < /dev/tty

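The `< /dev/tty` redirection matters because under `curl ... | bash` the script itself arrives on stdin, so a plain `read` would consume script text rather than the user's answer. The sketch below simulates the prompt non-interactively with a Bash here-string standing in for the terminal:

```shell
# Simulate the DB prompt: feed the answer on stdin instead of /dev/tty.
# With a non-terminal stdin, bash suppresses the -p prompt text itself.
read -r -p "Enter choice [1-3] (Default: 1): " DB_CHOICE <<< "2"
echo "Selected option: $DB_CHOICE"   # Selected option: 2
```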
case $DB_CHOICE in
    2)
        DB_PACKAGE="Microsoft.EntityFrameworkCore.SqlServer"
        echo "   Selected: SQL Server"
        ;;
    3)
        DB_PACKAGE="Microsoft.EntityFrameworkCore.Sqlite"
        echo "   Selected: SQLite"
        ;;
    *)
        DB_PACKAGE="Npgsql.EntityFrameworkCore.PostgreSQL"
        echo "   Selected: PostgreSQL"
        ;;
esac
echo ""
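Note that the `*)` arm is what makes option 1 the default: any input other than `2` or `3`, including just pressing Enter, falls through to PostgreSQL. Wrapped in a function, the mapping is easy to exercise:

```shell
# The provider mapping from the case statement above, as a testable function.
choose_db() {
    case "$1" in
        2) echo "Microsoft.EntityFrameworkCore.SqlServer" ;;
        3) echo "Microsoft.EntityFrameworkCore.Sqlite" ;;
        *) echo "Npgsql.EntityFrameworkCore.PostgreSQL" ;;  # default branch
    esac
}

choose_db 3    # Microsoft.EntityFrameworkCore.Sqlite
choose_db ""   # empty input falls through to the PostgreSQL default
```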

# --- 2. Smart SDK Detection ---
# Get the latest installed SDK version
LATEST_SDK=$(dotnet --list-sdks | tail -n 1 | awk '{print $1}')

if [ -n "$LATEST_SDK" ]; then
    echo "ℹ️ Detected SDK: $LATEST_SDK. Pinning global.json..."
    dotnet new globaljson --sdk-version "$LATEST_SDK" --roll-forward latestFeature
else
    echo "⚠️ No SDK detected. Skipping global.json."
fi

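The `tail -n 1 | awk '{print $1}'` pipeline leans on `dotnet --list-sdks` printing one `version [path]` pair per line in ascending version order, so the last line is the newest SDK. Run against sample output, so the sketch needs no SDK installed (the paths and versions are illustrative):

```shell
# Parse the newest SDK version out of sample `dotnet --list-sdks` output.
sample_sdks='8.0.100 [/usr/share/dotnet/sdk]
10.0.100 [/usr/share/dotnet/sdk]'
LATEST_SDK=$(printf '%s\n' "$sample_sdks" | tail -n 1 | awk '{print $1}')
echo "$LATEST_SDK"   # 10.0.100
```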
dotnet new gitignore

# --- 3. Create Solution & Fix NuGet ---
if [ "$FORMAT" = "slnx" ]; then
    echo "📄 Creating .slnx solution..."
    dotnet new sln -n "$PROJECT_NAME" --format slnx
    SLN_FILE="$PROJECT_NAME.slnx"
else
    echo "📄 Creating standard .sln solution..."
    dotnet new sln -n "$PROJECT_NAME"
    SLN_FILE="$PROJECT_NAME.sln"
fi

# Create a local nuget.config so package restore has a known-good source
echo "📦 Configuring NuGet sources..."
dotnet new nugetconfig --force

# If nuget.org is not already listed as a source, add it
if ! dotnet nuget list source --configfile "nuget.config" | grep -q "nuget.org"; then
    dotnet nuget add source "https://api.nuget.org/v3/index.json" -n "nuget.org" --configfile "nuget.config"
fi

# --- 4. Create Projects ---
echo "🔨 Creating projects..."
dotnet new classlib -n "$PROJECT_NAME.Domain" -o "src/$PROJECT_NAME.Domain"
dotnet new classlib -n "$PROJECT_NAME.Application" -o "src/$PROJECT_NAME.Application"
dotnet new classlib -n "$PROJECT_NAME.Infrastructure" -o "src/$PROJECT_NAME.Infrastructure"
dotnet new webapi -n "$PROJECT_NAME.Api" -o "src/$PROJECT_NAME.Api" --use-controllers

mkdir -p tests
dotnet new xunit -n "$PROJECT_NAME.Domain.Tests" -o "tests/$PROJECT_NAME.Domain.Tests"
dotnet new xunit -n "$PROJECT_NAME.Application.Tests" -o "tests/$PROJECT_NAME.Application.Tests"
dotnet new xunit -n "$PROJECT_NAME.IntegrationTests" -o "tests/$PROJECT_NAME.IntegrationTests"

# --- 4.1 CLEANUP BOILERPLATE ---
echo "🧹 Removing default template files..."

FILES_TO_REMOVE=(
    "src/$PROJECT_NAME.Domain/Class1.cs"
    "src/$PROJECT_NAME.Application/Class1.cs"
    "src/$PROJECT_NAME.Infrastructure/Class1.cs"
    "tests/$PROJECT_NAME.Domain.Tests/UnitTest1.cs"
    "tests/$PROJECT_NAME.Application.Tests/UnitTest1.cs"
    "tests/$PROJECT_NAME.IntegrationTests/UnitTest1.cs"
    "src/$PROJECT_NAME.Api/WeatherForecast.cs"
    "src/$PROJECT_NAME.Api/Controllers/WeatherForecastController.cs"
    "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.http"
)

for file in "${FILES_TO_REMOVE[@]}"; do
    if [ -f "$file" ]; then
        rm "$file"
        echo "  - Deleted $file"
    fi
done

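The `[ -f "$file" ]` guard means a file that a future template version no longer generates is skipped silently instead of tripping an `rm` error. The same loop, exercised against throwaway files in a temp directory (paths here are illustrative):

```shell
# Delete-if-exists loop from the cleanup step, run against temporary files.
cleanup_files() {
    for f in "$@"; do
        if [ -f "$f" ]; then
            rm "$f"
            echo "deleted $(basename "$f")"
        fi
    done
}

tmpdir=$(mktemp -d)
touch "$tmpdir/Class1.cs"
cleanup_files "$tmpdir/Class1.cs" "$tmpdir/Missing.cs"   # deleted Class1.cs
rmdir "$tmpdir"
```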
# --- 5. Add to Solution (With Visual Folders) ---
echo "📂 Organizing solution structure in $SLN_FILE..."

dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj" -s "src"
dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" -s "src"
dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" -s "src"
dotnet sln "$SLN_FILE" add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" -s "src"

dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj" -s "tests"
dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" -s "tests"
dotnet sln "$SLN_FILE" add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" -s "tests"

# --- 6. Add References ---
echo "🔗 Wiring up dependencies..."

# Clean Architecture flow: each layer references inward only
dotnet add "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" reference "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj"

# Test References
dotnet add "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj"
dotnet add "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" reference "src/$PROJECT_NAME.Domain/$PROJECT_NAME.Domain.csproj"

# --- 7. Install NuGet Packages (Optional) ---
if [ "$INSTALL_PACKAGES" = true ]; then
    echo "📦 Installing standard Clean Architecture packages..."

    # Helper function to add packages safely
    add_package() {
        local proj=$1
        local pkg=$2
        echo "  + Adding $pkg..."
        dotnet add "$proj" package "$pkg" > /dev/null 2>&1
        if [ $? -ne 0 ]; then
            echo "  ⚠️ Failed to add $pkg. Check your network connection."
        fi
    }

    # Application Layer
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "MediatR"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "FluentValidation"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "FluentValidation.DependencyInjectionExtensions"
    add_package "src/$PROJECT_NAME.Application/$PROJECT_NAME.Application.csproj" "Microsoft.Extensions.Logging.Abstractions"

    # Infrastructure Layer
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "Microsoft.EntityFrameworkCore"
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "$DB_PACKAGE"
    add_package "src/$PROJECT_NAME.Infrastructure/$PROJECT_NAME.Infrastructure.csproj" "Microsoft.EntityFrameworkCore.Design"

    # API Layer
    add_package "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" "Microsoft.EntityFrameworkCore.Tools"
    add_package "src/$PROJECT_NAME.Api/$PROJECT_NAME.Api.csproj" "Serilog.AspNetCore"

    # Test Projects
    TEST_PROJECTS=(
        "tests/$PROJECT_NAME.Domain.Tests/$PROJECT_NAME.Domain.Tests.csproj"
        "tests/$PROJECT_NAME.Application.Tests/$PROJECT_NAME.Application.Tests.csproj"
        "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj"
    )

    for proj in "${TEST_PROJECTS[@]}"; do
        add_package "$proj" "FluentAssertions"
        add_package "$proj" "Moq"
    done

    add_package "tests/$PROJECT_NAME.IntegrationTests/$PROJECT_NAME.IntegrationTests.csproj" "Microsoft.AspNetCore.Mvc.Testing"
fi

# --- 8. Final Verification ---
echo "🏗️ Verifying build..."
dotnet build
if [ $? -eq 0 ]; then
    echo "$PROJECT_NAME scaffolded successfully!"
else
    echo "⚠️ Scaffolding finished, but build failed. Run 'dotnet restore' manually."
fi