Production-ready API template implementing Clean Architecture, DDD, and CQRS with Change Data Capture
Built with modern .NET 9 technologies and enterprise-grade data synchronization
- Hexagonal Architecture: Clean separation with Ports & Adapters pattern
- CQRS Implementation: Separate Command (PostgreSQL) and Query (MongoDB) stores
- Change Data Capture: Real-time synchronization using Debezium + Kafka
- Domain-Driven Design: Rich domain models with business rules enforcement
- JWT Authentication: Secure API with password hashing and validation
- Dual Database Strategy: PostgreSQL for writes, MongoDB for optimized reads
- Complete Stack: PostgreSQL, MongoDB, Kafka, Debezium fully containerized
- API Documentation: Interactive Swagger/OpenAPI with detailed schemas
- Event Streaming: Kafka-based CDC for instant data synchronization
- Clean Code Principles: DRY, KISS, YAGNI compliant with centralized utilities
- Testing: 60+ test files with unit and integration coverage
| Category | Technology |
|---|---|
| Framework | .NET 9 |
| Command DB | PostgreSQL |
| Query DB | MongoDB |
| Message Stream | Apache Kafka |
| CDC Platform | Debezium |
| ORM | Entity Framework Core |
| ODM | MongoDB Driver |
| Mediator | MediatR |
| Validation | FluentValidation |
| Mapping | AutoMapper |
| Authentication | JWT Bearer |
| Logging | Serilog |
| Testing | xUnit + Testcontainers |
- Real-time sync: PostgreSQL changes stream directly to MongoDB via Kafka
- Zero downtime: Schema changes handled automatically
- Reliable: Built-in failure recovery with offset tracking
How it works: Write to PostgreSQL → Debezium captures WAL changes → Kafka streams → MongoDB gets updated
graph LR
A[API Command] --> B[PostgreSQL]
B --> C[Debezium CDC]
C --> D[Kafka Topic]
D --> E[CDC Consumer]
E --> F[MongoDB]
F --> G[Optimized Queries]
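Concretely, the Debezium connector that captures the users table is registered by POSTing a configuration like the following to Kafka Connect on localhost:8083. The connector name matches the status check used later in this README; hostnames, credentials, database name, and topic prefix are illustrative placeholders:

```json
{
  "name": "postgres-users-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "***",
    "database.dbname": "hexagonalskeleton",
    "topic.prefix": "pg",
    "table.include.list": "public.users"
  }
}
```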
Key CDC Events:
Key CDC Events:
- `user.created` → Syncs new user to read model
- `user.updated` → Maintains profile consistency
- `user.deleted` → Handles logical deletion synchronization
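For illustration, a `user.created` change arrives at the consumer wrapped in a Debezium envelope roughly like this (the `schema` section is omitted, and the column names and values are made up for the example):

```json
{
  "payload": {
    "op": "c",
    "before": null,
    "after": { "id": 42, "email": "test@example.com", "first_name": "John" },
    "source": { "table": "users", "lsn": 24023128 },
    "ts_ms": 1700000000000
  }
}
```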
CDC vs Domain Events: No application code changes needed; captures direct SQL modifications
CDC vs API Polling: Real-time latency without rate-limiting headaches
CDC vs ETL/Batch Jobs: Sub-second sync instead of scheduled batch processes
Used by Netflix, Uber, LinkedIn, and many others for real-time data synchronization.
The setup.ps1 script handles all initialization tasks:
# What the script does:
# 1. Runs EF Core migrations against PostgreSQL
# 2. Configures Debezium CDC connector for the users table
# 3. Sets up Kafka topic routing for real-time synchronization
# 4. Validates all services are running correctly
./setup.ps1 # One command to rule them all

# Create new migration for PostgreSQL (Command store)
dotnet ef migrations add MigrationName --project HexagonalSkeleton.MigrationDb
# Update PostgreSQL database
dotnet ef database update --project HexagonalSkeleton.MigrationDb
# MongoDB collections are automatically created by the CDC consumers

# Monitor Kafka topics and CDC events
# Access Confluent Control Center or use Kafka CLI tools
docker exec -it hexagonal-kafka kafka-topics --list --bootstrap-server localhost:9092
# Check Debezium connector status
curl -H "Accept:application/json" localhost:8083/connectors/postgres-users-connector/status

# 1. Clone the repository
git clone https://github.com/asanabrialopez/.net-api-hexagonal-skeleton.git
cd .net-api-hexagonal-skeleton
# 2. Start the complete infrastructure stack
docker-compose up -d --wait
# 3. Run the setup script (migrations + CDC configuration)
./setup.ps1
# 4. Start the API
dotnet run --project HexagonalSkeleton.API

- API Documentation: http://localhost:5000/swagger
- PostgreSQL: localhost:5432 (Commands/Writes)
- MongoDB: localhost:27017 (Queries/Reads)
- Kafka: localhost:9092 (Event Streaming)
- Debezium Connect: localhost:8083 (CDC Management)
# Register a new user (writes to PostgreSQL)
curl -X POST http://localhost:5000/api/registration \
-H "Content-Type: application/json" \
-d '{"email": "test@example.com", "password": "Test123!", "firstName": "John", "lastName": "Doe"}'
# Query users (reads from MongoDB - synced via CDC)
curl http://localhost:5000/api/users

graph TB
subgraph "API Layer"
Controllers[Controllers]
DTOs[Request/Response DTOs]
Auth[JWT Authentication]
end
subgraph "Application Layer"
Commands[Commands]
Queries[Queries]
Handlers[MediatR Handlers]
CdcEvents[CDC Events]
end
subgraph "Domain Layer"
Entities[Domain Entities]
DomainServices[Domain Services]
Ports[Ports/Interfaces]
ValueObjects[Value Objects]
end
subgraph "Infrastructure Layer"
WriteRepo[Command Repository]
ReadRepo[Query Repository]
CdcProcessor[CDC Event Processor]
Kafka[Kafka + Debezium]
end
subgraph "Data Stores"
PostgreSQL[(PostgreSQL<br/>Write Operations)]
MongoDB[(MongoDB<br/>Read Operations)]
end
Controllers --> Handlers
Handlers --> Commands
Handlers --> Queries
Commands --> WriteRepo
Queries --> ReadRepo
WriteRepo --> PostgreSQL
ReadRepo --> MongoDB
PostgreSQL --> Kafka
Kafka --> CdcProcessor
CdcProcessor --> MongoDB
- Hexagonal Architecture: Ports & Adapters with clean dependency inversion
- CQRS Pattern: Separate optimized stores for commands and queries
- Change Data Capture: Real-time synchronization using Debezium + Kafka
- Eventual Consistency: Automated synchronization between data stores
- Repository Pattern: Clean data access abstraction layer
- Specification Pattern: Reusable and composable business rules
- Domain Events: Decoupled business logic with integration events
- Exception Handling: Global error management with custom exceptions
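As a sketch of the Specification Pattern listed above — interface and rule names here are illustrative, not taken from the repository — a business rule such as the registration age range (13–120 years, per the endpoint table below) can be expressed as a reusable, composable object:

```csharp
// Hedged sketch of the Specification Pattern; names are illustrative.
public interface ISpecification<T>
{
    bool IsSatisfiedBy(T candidate);
}

// Example rule: registration requires an age between 13 and 120 years.
public sealed class ValidAgeSpecification : ISpecification<int>
{
    public bool IsSatisfiedBy(int age) => age >= 13 && age <= 120;
}

// Specifications stay composable via combinators such as And:
public sealed class AndSpecification<T> : ISpecification<T>
{
    private readonly ISpecification<T> _left;
    private readonly ISpecification<T> _right;

    public AndSpecification(ISpecification<T> left, ISpecification<T> right)
    {
        _left = left;
        _right = right;
    }

    public bool IsSatisfiedBy(T candidate) =>
        _left.IsSatisfiedBy(candidate) && _right.IsSatisfiedBy(candidate);
}
```

Because each rule is a plain object, the same specification can validate a command, filter a query, and be unit-tested in isolation.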
├── HexagonalSkeleton.API/ # API Layer
│ ├── Controllers/ # REST API endpoints
│ ├── Models/ # API request/response models
│ └── Config/ # DI container configuration
├── HexagonalSkeleton.Application/ # Application Layer
│ ├── Features/ # CQRS commands & queries
│ ├── Services/ # Application services
│ └── Events/ # Domain event handlers
├── HexagonalSkeleton.Domain/ # Domain Layer
│ ├── Entities/ # Domain entities (User.cs)
│ ├── Services/ # Domain services
│ ├── Specifications/ # Business rules
│ ├── Common/ # Shared utilities (AgeCalculator)
│ └── Ports/ # Interface contracts
├── HexagonalSkeleton.Infrastructure/ # Infrastructure Layer
│ ├── Persistence/ # Database context & repositories
│ ├── Auth/ # JWT implementation
│ └── Services/ # External service adapters
└── HexagonalSkeleton.Test/ # Testing
├── Unit/ # Unit tests (60+ test files)
├── Integration/ # Integration tests
└── TestInfrastructure/ # Testing utilities
60+ tests covering the full stack. Integration tests use Testcontainers for real PostgreSQL, MongoDB, and Kafka instances.
# Run all tests
dotnet test
# Run with coverage
dotnet test --collect:"XPlat Code Coverage"
# Run specific test category
dotnet test --filter "Category=Integration"

- Unit Tests: Domain logic, business rules, and value objects (including AgeCalculator)
- Integration Tests: End-to-end API workflows with Testcontainers (PostgreSQL + MongoDB + Kafka)
- CDC Integration Tests: Complete Change Data Capture flow validation with real containers
- CQRS Tests: Command and query handler validation with containerized databases
- Repository Tests: Data access layer with real database containers
- Authentication Tests: JWT token generation and validation
- Testcontainers: Real PostgreSQL, MongoDB, and Kafka containers for integration tests
- TestWebApplicationFactory: Custom factory replacing production dependencies with containerized services
- Isolated Test Environments: Each test class gets its own Docker container instances
- CDC Testing: Full end-to-end validation of Change Data Capture flows
- AutoFixture: Automated test data generation for comprehensive scenarios
- Container Orchestration: Automatic setup/teardown of complete infrastructure stack
- Centralized Test Utilities: DRY-compliant test data creation with TestHelper.cs
- Domain Utility Testing: Comprehensive coverage of AgeCalculator with edge cases
- Postman Collection: 66 automated API tests covering all endpoints and business rules
- JWT Authentication with configurable expiration and secure token generation
- Password Hashing with salt generation using industry-standard algorithms
- Input Validation with FluentValidation for comprehensive request validation
- Global Exception Handling without sensitive data exposure in responses
- CORS configuration for secure cross-origin requests
- Authorization Attributes for role-based endpoint protection
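The salted password hashing bullet can be sketched with .NET's built-in PBKDF2 helper. This is a minimal illustration, not the template's actual implementation; the class name, iteration count, and output sizes are assumptions:

```csharp
using System.Security.Cryptography;

// Illustrative salted password hashing via PBKDF2 (parameters are assumed
// defaults, not necessarily what the template configures).
public static class PasswordHasher
{
    private const int Iterations = 100_000;

    public static (byte[] Hash, byte[] Salt) HashPassword(string password)
    {
        byte[] salt = RandomNumberGenerator.GetBytes(16); // unique salt per user
        byte[] hash = Rfc2898DeriveBytes.Pbkdf2(
            password, salt, Iterations, HashAlgorithmName.SHA256, 32);
        return (hash, salt);
    }

    public static bool Verify(string password, byte[] hash, byte[] salt)
    {
        byte[] candidate = Rfc2898DeriveBytes.Pbkdf2(
            password, salt, Iterations, HashAlgorithmName.SHA256, 32);
        // Constant-time comparison avoids leaking timing information.
        return CryptographicOperations.FixedTimeEquals(candidate, hash);
    }
}
```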
The codebase follows standard Clean Architecture patterns:
- Domain logic stays in the domain layer (business rules, validations)
- Shared utilities like AgeCalculator prevent duplication across layers
- Simple, descriptive naming: methods do what their names say
- Minimal dependencies: no unnecessary abstractions or complexity
| Endpoint | Method | Description | Response | Business Rules |
|---|---|---|---|---|
| /api/auth/login | POST | User authentication with JWT | AuthenticationToken + User | Password validation, user exists |
| /api/registration | POST | User registration + authentication | AuthenticationToken + User | Age 13-120 years, email unique |
| Endpoint | Method | Description | Database Used |
|---|---|---|---|
| /api/users | GET | Paginated users with filtering | MongoDB |
| /api/users/{id} | GET | Get user by ID | MongoDB |
| /api/users | PUT | Update user (admin) | PostgreSQL |
| /api/users/{id} | DELETE | Hard delete user | PostgreSQL |
| /api/users/{id}/deactivate | POST | Soft delete (deactivate) | PostgreSQL |
| Endpoint | Method | Description | Database Used |
|---|---|---|---|
| /api/profile | GET | Get own profile | MongoDB |
| /api/profile/personal-info | PATCH | Update personal information | PostgreSQL |
How it works:
- Write Operations → PostgreSQL (Commands)
- CDC Capture → Debezium streams changes to Kafka
- Event Processing → CDC consumers sync to MongoDB (Queries)
- Read Operations → MongoDB (Optimized for queries)
Full API documentation with request/response schemas at /swagger.
This template demonstrates production-ready enterprise software development with a modern approach to data synchronization. The architecture highlights advanced concepts valued in enterprise systems:
- Scalability: CQRS enables independent scaling of read/write operations
- Performance: Dual databases optimized for specific access patterns
- Reliability: CDC provides guaranteed data consistency without application-level event handling
- Real-time: Debezium streams database changes instantly, no polling or delays
- Maintainability: Clean Architecture with clear separation of concerns
- Testability: Comprehensive test coverage with dependency injection and Testcontainers
- Business Logic: Domain-driven design with rich business rules enforcement
Perfect for demonstrating expertise in:
- Modern .NET Development (.NET 9 with latest C# 13 features)
- Distributed Systems (CQRS + Change Data Capture patterns)
- Event Streaming Architecture (Kafka + Debezium for real-time sync)
- Database Design (PostgreSQL + MongoDB optimization strategies)
- Enterprise Patterns (Hexagonal Architecture, DDD, SOLID principles)
- Advanced Testing (Unit, integration, CDC testing with 60+ test files)
- DevOps & Containers (Docker Compose, automated setup scripts)
sequenceDiagram
participant API as API Controller
participant CMD as Command Handler
participant PG as PostgreSQL
participant DBZ as Debezium
participant KAFKA as Kafka
participant CONS as CDC Consumer
participant MONGO as MongoDB
API->>CMD: RegisterUserCommand
CMD->>PG: Save User Entity
PG->>DBZ: WAL Change Detected
DBZ->>KAFKA: Publish CDC Event
KAFKA->>CONS: Stream Event
CONS->>MONGO: Sync to Read Model
Note over API,MONGO: Real-time Consistency Achieved
- Command Execution: Write operations go to PostgreSQL with full ACID compliance
- Change Capture: Debezium monitors PostgreSQL Write-Ahead Log (WAL) for changes
- Event Streaming: Changes are streamed to Kafka topics in real-time
- Event Processing: Dedicated CDC consumers process events and update MongoDB
- Query Optimization: Read operations use optimized MongoDB collections with indexes
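The event-processing step above can be sketched as a pure function that applies a Debezium envelope to the read model. Here an in-memory dictionary stands in for the MongoDB collection, and the string ids and field names are illustrative, not the template's actual schema:

```csharp
using System.Collections.Generic;
using System.Text.Json;

// Sketch of the CDC consumer's core step: a Debezium envelope carries an
// "op" code — "c" create, "u" update, "d" delete — plus "before"/"after"
// row images. A dictionary stands in for the MongoDB collection here.
public static class ReadModelSync
{
    public static void Apply(string envelopeJson, IDictionary<string, string> readModel)
    {
        using JsonDocument doc = JsonDocument.Parse(envelopeJson);
        JsonElement payload = doc.RootElement.GetProperty("payload");
        string op = payload.GetProperty("op").GetString()!;

        if (op is "c" or "u")
        {
            // Upsert the full "after" image into the read model.
            JsonElement after = payload.GetProperty("after");
            readModel[after.GetProperty("id").GetString()!] = after.GetRawText();
        }
        else if (op == "d")
        {
            // Remove (or logically delete) using the "before" image's key.
            JsonElement before = payload.GetProperty("before");
            readModel.Remove(before.GetProperty("id").GetString()!);
        }
    }
}
```

Keeping this step a pure function of (envelope, read model) is what makes the CDC flow easy to cover with the integration tests described earlier.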
- Domain First: Create entities, value objects, business rules
- Command/Query: Add MediatR handlers for CQRS operations
- Events: Define integration events for cross-bounded context communication
- API Layer: Create controllers and DTOs
- Sync Logic: Update consumers for read model consistency
- Tests: Write comprehensive unit and integration tests
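The "Command/Query" step can be walked through with a small sketch. The two interfaces below are minimal stand-ins with MediatR's shape (`IRequest<T>` / `IRequestHandler<,>`) so the example compiles on its own; in the template you would use MediatR itself, and the command and handler names here are hypothetical:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Minimal stand-ins mirroring MediatR's request/handler contracts.
public interface IRequest<TResponse> { }

public interface IRequestHandler<TRequest, TResponse> where TRequest : IRequest<TResponse>
{
    Task<TResponse> Handle(TRequest request, CancellationToken cancellationToken);
}

// Hypothetical command for the soft-delete endpoint.
public sealed record DeactivateUserCommand(Guid UserId) : IRequest<bool>;

public sealed class DeactivateUserHandler : IRequestHandler<DeactivateUserCommand, bool>
{
    public Task<bool> Handle(DeactivateUserCommand request, CancellationToken cancellationToken)
    {
        // A real handler loads the aggregate via the command repository,
        // calls a domain method (e.g. user.Deactivate()), and persists to
        // PostgreSQL; CDC then syncs the change into the MongoDB read model.
        return Task.FromResult(request.UserId != Guid.Empty);
    }
}
```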
# Environment-specific settings
cp appsettings.json appsettings.Production.json
# Modify connection strings, logging levels, etc.
// Swap PostgreSQL for SQL Server (Command store)
// In CqrsDatabaseExtension.cs:
services.AddDbContextPool<CommandDbContext>(options =>
    options.UseSqlServer(connectionString));

# Configure database connections and CDC settings
# In appsettings.json or environment variables

# docker-compose.prod.yml
services:
api:
build: .
environment:
- ASPNETCORE_ENVIRONMENT=Production
- ConnectionStrings__HexagonalSkeleton=${PG_CONNECTION}
- ConnectionStrings__HexagonalSkeletonRead=${MONGO_CONNECTION}
# ... database services with volumes and health checks

- Fork/Clone this repository
- Rename namespaces to match your project (e.g., YourCompany.YourDomain)
- Customize domain entities and business rules for your use case
- Extend with additional bounded contexts and features
- Deploy with confidence using the provided Docker configuration
- Multi-tenant: Add tenant isolation to both command and query stores
- Event Sourcing: Implement full event sourcing with Kafka event store
- API Versioning: Add versioned endpoints with backward compatibility
- GraphQL: Replace REST controllers with GraphQL endpoints
- Real-time: Add SignalR for real-time notifications
- Hexagonal Architecture implementation with ports & adapters in .NET
- CQRS Pattern with separate optimized data stores (PostgreSQL + MongoDB)
- Change Data Capture with reliable streaming using Kafka + Debezium
- Domain-Driven Design with rich domain models and business logic encapsulation
- Enterprise Testing strategies including unit, integration, and end-to-end tests
- Modern .NET development practices with dependency injection and clean code
- Microservices Patterns ready for distributed system development
- Database Optimization for both transactional and analytical workloads
- Clean Code Principles practical implementation of DRY, KISS, YAGNI
- Business Rules Engineering centralized utilities for domain calculations
⭐ Star this repo if it demonstrates valuable patterns for your projects!
This template showcases enterprise-grade .NET development with modern architectural patterns.
Perfect for production-ready applications.
