A full-stack AI chat application built with .NET 10 Web API backend and Angular 21 frontend. The application supports multiple AI service providers including Ollama, OpenAI, Azure AI Foundry, and Anthropic, with document management and vector search capabilities.
- Features
- Architecture
- Tech Stack
- Prerequisites
- Quick Start
- Configuration
- Database Setup
- API Endpoints
- Examples
- Development
- Testing
- Troubleshooting
- Contributing
- License
- Authors
- Acknowledgments
- Multi-AI Provider Support: Integrate with Ollama, OpenAI, Azure AI Foundry, and Anthropic
- Real-time Chat: Server-sent events for streaming responses
- Document Management: Upload and search documents with vector embeddings
- Session Management: Persistent chat sessions with history
- Modern UI: Responsive Angular frontend with Bootstrap 5
- Vector Search: AI-powered document search using SQL Server Vector Search
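Streaming replies arrive as server-sent events: the model's answer is delivered in `data:` frames that the client appends to the visible message as they arrive. A minimal standalone sketch of that accumulation step (illustrative TypeScript, not the project's actual service code; it assumes plain-text `data:` payloads):

```typescript
// Accumulate the data payloads of a raw SSE buffer into one string,
// the way a streamed chat reply grows on the client.
// Events are separated by a blank line; each carries a "data:" field,
// and the SSE spec strips one leading space after the colon.
function accumulateSseData(raw: string): string {
  return raw
    .split(/\r?\n\r?\n/)                      // one entry per SSE event
    .flatMap((event) => event.split(/\r?\n/)) // individual field lines
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).replace(/^ /, ""))
    .join("");
}

// Three streamed fragments form one sentence.
const chunks = "data: Hello\n\ndata:  from\n\ndata:  AI\n\n";
console.log(accumulateSseData(chunks)); // "Hello from AI"
```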
```
ai-chat/
├── RR.AI-Chat/                  # .NET 10 Web API Backend
│   ├── RR.AI-Chat.Api/          # API Controllers & Program.cs
│   ├── RR.AI-Chat.Service/      # Business Logic Services
│   ├── RR.AI-Chat.Repository/   # Data Access Layer
│   ├── RR.AI-Chat.Entity/       # Entity Framework Models
│   └── RR.AI-Chat.Dto/          # Data Transfer Objects
└── ai-chat-ui/                  # Angular 21 Frontend
    ├── src/app/services/        # HTTP Services
    ├── src/app/dtos/            # TypeScript DTOs
    └── src/environments/        # Environment Configuration
```
- .NET 10.0 - Web API Framework
- Entity Framework Core 10.0 - ORM with SQL Server
- SQL Server Vector Search - Vector embeddings storage
- Microsoft.Extensions.AI - AI service abstractions
- Swagger/OpenAPI - API Documentation
- Angular 21 - Frontend Framework
- TypeScript 5.9 - Programming Language
- Bootstrap 5.3 - CSS Framework
- RxJS - Reactive Programming
- Highlight.js - Code Syntax Highlighting
- Markdown-it - Markdown Rendering
- Ollama - Local AI models
- OpenAI - GPT models
- Azure AI Foundry - Azure OpenAI Service
- Anthropic - Claude models
- .NET 10.0 SDK - Download here
- Node.js 18+ - Download here
- SQL Server - Express, Developer, or Full edition
- Angular CLI 21+ - Install via `npm install -g @angular/cli`
- Ollama - Install here for local AI models
- OpenAI API Key - For GPT models
- Azure AI Foundry - Endpoint URL and API Key
- Anthropic API Key - For Claude models
New to the project? Check out our Quick Start Guide for the fastest way to get running!
```bash
git clone https://github.com/RorroRojas3/ai-chat.git
cd ai-chat
```

Create the database (replace the connection string as needed). Default connection string:

```
Server=localhost;Database=aichat;Integrated Security=true;Encrypt=true;TrustServerCertificate=true;
```

Create user secrets for the API project:

```bash
cd RR.AI-Chat/RR.AI-Chat.Api
dotnet user-secrets init
```

Add your AI service configurations:

```bash
# For OpenAI
dotnet user-secrets set "OpenAI:ApiKey" "your-openai-api-key"

# For Azure AI Foundry
dotnet user-secrets set "AzureAIFoundry:Url" "https://your-endpoint.openai.azure.com/"
dotnet user-secrets set "AzureAIFoundry:ApiKey" "your-azure-api-key"
dotnet user-secrets set "AzureAIFoundry:EmbeddingModel" "text-embedding-ada-002"

# For Anthropic
dotnet user-secrets set "Anthropic:ApiKey" "your-anthropic-api-key"

# For a custom Ollama URL (optional, defaults to http://localhost:11434/)
dotnet user-secrets set "OllamaUrl" "http://localhost:11434/"
```

Build and run the backend:

```bash
cd RR.AI-Chat
dotnet restore
dotnet build
dotnet run --project RR.AI-Chat.Api
```

The API will start at https://localhost:7045 (HTTPS) and http://localhost:5045 (HTTP).

Install and run the frontend:

```bash
cd ai-chat-ui
npm install
npm start
```

The frontend will start at http://localhost:4200.
Use the Angular CLI MCP server to supercharge AI-assisted Angular workflows.
Create `.vscode/mcp.json` in the repo root (an example `npx`-based configuration appears at the end of this README; a workspace-local variant is shown in the Troubleshooting section).
- Frontend: http://localhost:4200
- API Documentation: https://localhost:7045/swagger
Backend (`appsettings.json`):

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "CorsOrigins": ["http://localhost:4200"],
  "OllamaUrl": "http://localhost:11434/",
  "ConnectionStrings": {
    "DefaultConnection": "Server=localhost;Database=aichat;Integrated Security=true;Encrypt=true;TrustServerCertificate=true;"
  }
}
```

Frontend (`src/environments/environment.ts`):

```typescript
export const environment = {
  production: false,
  apiUrl: "https://localhost:7045/api/",
};
```

| Variable | Description | Required | Default |
|---|---|---|---|
| `OpenAI:ApiKey` | OpenAI API key for GPT models | No\* | - |
| `AzureAIFoundry:Url` | Azure OpenAI endpoint URL | No\* | - |
| `AzureAIFoundry:ApiKey` | Azure OpenAI API key | No\* | - |
| `AzureAIFoundry:EmbeddingModel` | Embedding model name | No | `text-embedding-ada-002` |
| `Anthropic:ApiKey` | Anthropic API key for Claude models | No\* | - |
| `OllamaUrl` | Ollama server URL | No | `http://localhost:11434/` |
| `ConnectionStrings:DefaultConnection` | SQL Server connection string | Yes | See above |

\*At least one AI service must be configured.
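The footnote above is the key constraint in this table. Purely as an illustration (this helper is hypothetical, not part of the repo), the "at least one AI service" rule can be sketched as:

```typescript
// Hypothetical sketch: check that at least one AI provider setting
// from the configuration table is present and non-empty.
type AiSettings = Partial<Record<
  "OpenAI:ApiKey" | "AzureAIFoundry:ApiKey" | "Anthropic:ApiKey" | "OllamaUrl",
  string
>>;

const providerKeys = [
  "OpenAI:ApiKey",
  "AzureAIFoundry:ApiKey",
  "Anthropic:ApiKey",
  "OllamaUrl",
] as const;

function hasAiProvider(settings: AiSettings): boolean {
  // A key counts only if it is set to a non-blank value.
  return providerKeys.some((key) => Boolean(settings[key]?.trim()));
}

console.log(hasAiProvider({ OllamaUrl: "http://localhost:11434/" })); // true
console.log(hasAiProvider({})); // false
```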
The application uses Entity Framework migrations. To set up the database:
- Update Connection String: Modify the connection string in `appsettings.json` or set it via user secrets:

  ```bash
  dotnet user-secrets set "ConnectionStrings:DefaultConnection" "your-connection-string"
  ```

- Run Migrations (when available):

  ```bash
  cd RR.AI-Chat
  dotnet ef database update --project RR.AI-Chat.Api
  ```
- `POST /api/chats/sessions/{sessionId}/stream` - Stream chat responses
- `POST /api/chats/sessions/{sessionId}/completion` - Get chat completion
- `GET /api/sessions` - Get all sessions
- `POST /api/sessions` - Create a new session
- `GET /api/sessions/{id}` - Get session by ID
- `DELETE /api/sessions/{id}` - Delete a session
- `POST /api/documents` - Upload a document
- `GET /api/documents` - Get all documents
- `POST /api/documents/search` - Search documents
- `GET /api/models` - Get available AI models
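The `{sessionId}` and `{id}` segments are filled in by the caller. A standalone sketch of building these routes on top of the frontend's `apiUrl` (helper names are illustrative, not the actual Angular service API):

```typescript
// Illustrative route builders for the endpoints listed above.
// apiUrl mirrors the value in src/environments/environment.ts.
const apiUrl = "https://localhost:7045/api/";

const sessionStreamUrl = (sessionId: string): string =>
  `${apiUrl}chats/sessions/${encodeURIComponent(sessionId)}/stream`;

const sessionUrl = (id: string): string =>
  `${apiUrl}sessions/${encodeURIComponent(id)}`;

console.log(sessionUrl("42"));
// https://localhost:7045/api/sessions/42
console.log(sessionStreamUrl("123e4567-e89b-12d3-a456-426614174000"));
// https://localhost:7045/api/chats/sessions/123e4567-e89b-12d3-a456-426614174000/stream
```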
Backend (C# API Call):
```csharp
// Create a new chat session
var session = new SessionDto
{
    Name = "My AI Conversation",
    CreatedAt = DateTime.UtcNow
};

// POST to /api/sessions
var response = await httpClient.PostAsJsonAsync("api/sessions", session);
var createdSession = await response.Content.ReadFromJsonAsync<SessionDto>();
```

Frontend (TypeScript/Angular):
```typescript
// Using the SessionService
this.sessionService.createSession("My AI Conversation").subscribe((session) => {
  console.log("Session created:", session.id);
  this.currentSessionId = session.id;
});
```

Backend (C# Controller):
```csharp
[HttpPost("sessions/{sessionId}/stream")]
public async IAsyncEnumerable<string> StreamChatCompletion(
    Guid sessionId,
    [FromBody] ChatCompletionDto request,
    [EnumeratorCancellation] CancellationToken cancellationToken)
{
    await foreach (var chunk in _chatService.StreamCompletionAsync(
        sessionId,
        request,
        cancellationToken))
    {
        yield return chunk;
    }
}
```

Frontend (TypeScript/Angular with SSE):
```typescript
// Stream chat response
sendMessage(sessionId: string, message: string, model: string) {
  const request = {
    prompt: message,
    model: model,
    systemPrompt: 'You are a helpful assistant.'
  };

  this.chatService.streamCompletion(sessionId, request).subscribe({
    next: (chunk) => {
      // Append chunk to the message display
      this.currentMessage += chunk;
    },
    complete: () => {
      console.log('Streaming completed');
    }
  });
}
```

Upload a Document:
```csharp
// C# example
var formData = new MultipartFormDataContent();
formData.Add(new StreamContent(fileStream), "file", fileName);

var response = await httpClient.PostAsync("api/documents", formData);
var document = await response.Content.ReadFromJsonAsync<DocumentDto>();
```

Search Documents:
```csharp
// C# example - vector search with AI embeddings
var searchRequest = new DocumentSearchDto
{
    Query = "What are the system requirements?",
    TopK = 5
};

var response = await httpClient.PostAsJsonAsync("api/documents/search", searchRequest);
var results = await response.Content.ReadFromJsonAsync<List<DocumentDto>>();
```

OpenAI (GPT-4):
```typescript
const request = {
  prompt: "Explain quantum computing",
  model: "gpt-4",
  systemPrompt: "You are a physics expert.",
};

this.chatService.getCompletion(sessionId, request).subscribe((response) => {
  console.log(response.content);
});
```

Ollama (Local Model):

```typescript
const request = {
  prompt: "Write a haiku about coding",
  model: "llama3.2:latest",
  systemPrompt: "You are a creative poet.",
};

this.chatService.getCompletion(sessionId, request).subscribe((response) => {
  console.log(response.content);
});
```

Anthropic (Claude):

```typescript
const request = {
  prompt: "Help me debug this code",
  model: "claude-3-5-sonnet-20241022",
  systemPrompt: "You are an expert programmer.",
};

this.chatService.getCompletion(sessionId, request).subscribe((response) => {
  console.log(response.content);
});
```

List All Sessions:
```typescript
// Get all chat sessions
this.sessionService.getSessions().subscribe((sessions) => {
  sessions.forEach((session) => {
    console.log(`${session.name} - Created: ${session.createdAt}`);
  });
});
```

Delete a Session:

```typescript
// Delete a specific session
this.sessionService.deleteSession(sessionId).subscribe(() => {
  console.log("Session deleted successfully");
});
```

Setting up OpenAI:
```bash
cd RR.AI-Chat/RR.AI-Chat.Api
dotnet user-secrets set "OpenAI:ApiKey" "sk-proj-xxxxxxxxxxxxx"
```

Setting up Azure AI Foundry:

```bash
dotnet user-secrets set "AzureAIFoundry:Url" "https://my-resource.openai.azure.com/"
dotnet user-secrets set "AzureAIFoundry:ApiKey" "your-azure-key"
dotnet user-secrets set "AzureAIFoundry:EmbeddingModel" "text-embedding-ada-002"
```

Setting up Anthropic:

```bash
dotnet user-secrets set "Anthropic:ApiKey" "sk-ant-xxxxxxxxxxxxx"
```

Contributing? See our Development Guide for detailed development tips and best practices.
Backend Tests:

```bash
cd RR.AI-Chat
dotnet test
```

Frontend Tests:

```bash
cd ai-chat-ui
npm test
```

Backend:

```bash
cd RR.AI-Chat
dotnet publish -c Release -o ./publish
```

Frontend:

```bash
cd ai-chat-ui
npm run build
```

- Backend: The project uses .NET testing frameworks. Test projects can be added following the pattern `[ProjectName].Tests`
- Frontend: Angular uses Jasmine and Karma for unit testing

```bash
cd RR.AI-Chat
dotnet test --verbosity normal

cd ai-chat-ui
npm test
```

For continuous test watching during development:

```bash
npm test -- --watch
```

To generate code coverage reports:

Backend (using dotnet-coverage):

```bash
dotnet tool install -g dotnet-coverage
cd RR.AI-Chat
dotnet-coverage collect -f cobertura -o coverage.cobertura.xml dotnet test
```

Frontend:

```bash
cd ai-chat-ui
npm test -- --code-coverage
```

Coverage reports will be generated in the `coverage/` directory.
Error: The current .NET SDK does not support targeting .NET 10.0
Solution: Install .NET 10.0 SDK from Microsoft's download page.
Error: Cannot connect to SQL Server
Solutions:
- Ensure SQL Server is running
- Verify the connection string in `appsettings.json`
- Check if Windows Authentication is enabled (for Integrated Security)
- For Docker SQL Server, ensure proper port mapping
Error: CORS policy blocking requests from frontend
Solutions:
- Verify `CorsOrigins` in `appsettings.json` includes your frontend URL
- Ensure the API is running on the expected port
- Check if HTTPS redirects are causing issues
Error: API key authentication failed
Solutions:
- Verify API keys are correctly set in user secrets
- Check if the AI service endpoint URLs are correct
- Ensure at least one AI service is properly configured
Error: Vector search operations failing
Solutions:
- Ensure SQL Server supports Vector Search (SQL Server 2022+)
- Verify EFCore.SqlServer.VectorSearch package is installed
- Check if embedding model is properly configured
Error: Node.js version compatibility
Solutions:
- Use Node.js 18+ (recommended: LTS version)
- Clear the npm cache: `npm cache clean --force`
- Delete `node_modules` and run `npm install` again
Error: `Error: Unknown arguments: read-only, mcp`

Cause: An older Angular CLI (e.g., v19) is being resolved by `npx`.

Solutions:

- Configure VS Code MCP to use the workspace-local CLI binary in `.vscode/mcp.json` (Windows example):

  ```json
  {
    "servers": {
      "angular-cli": {
        "type": "stdio",
        "command": "ai-chat-ui/node_modules/.bin/ng.cmd",
        "args": ["mcp", "--read-only"]
      }
    }
  }
  ```

- Or pin CLI v21 when using `npx`:

  ```bash
  npx -y @angular/cli@21 mcp --read-only
  ```
Docs: https://angular.dev/ai/mcp
Error: Port already in use
Solutions:
- API: Modify `launchSettings.json` to use different ports
- Frontend: Use `ng serve --port 4201` to specify a different port
- API Logs: Check console output when running `dotnet run`
- Frontend Logs: Open browser developer tools (F12)
- Database: Use SQL Server Management Studio or Azure Data Studio
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Backend (.NET): Follow standard C# conventions and SOLID principles
- Frontend (Angular): Follow Angular style guide and use TypeScript strict mode
- Ensure all tests pass before submitting PR
- Add tests for new features
- Update documentation as needed
This project is licensed under the MIT License - see the LICENSE file for details.
Copyright (c) 2025 Rodrigo Ignacio Rojas Garcia
Rodrigo Ignacio Rojas Garcia - Creator and Maintainer
- GitHub: @RorroRojas3
- OpenAI - GPT models and embeddings
- Anthropic - Claude AI models
- Microsoft Azure AI - Azure OpenAI Service
- Ollama - Local AI model runtime
Backend (.NET)
- ASP.NET Core - Web framework
- Entity Framework Core - ORM and database access
- Microsoft.Extensions.AI - AI service abstractions
- OllamaSharp - Ollama .NET client
- Anthropic.SDK - Anthropic .NET SDK
- Swashbuckle - OpenAPI/Swagger documentation
- Hangfire - Background job processing
Frontend (Angular)
- Angular - Frontend framework
- Bootstrap - UI component library
- Bootstrap Icons - Icon library
- highlight.js - Syntax highlighting
- markdown-it - Markdown parser and renderer
- RxJS - Reactive programming library
- MSAL Angular - Microsoft Authentication Library
Database & Search
- SQL Server - Database engine
- EFCore.SqlServer.VectorSearch - Vector search capabilities
Development Tools
- Visual Studio Code - Code editor
- .NET SDK - Development framework
- Node.js - JavaScript runtime
- Angular CLI - Angular development tools
This project combines modern AI capabilities with traditional web development practices to create a flexible, multi-provider chat interface suitable for various AI use cases.
If you encounter any issues or have questions:
- Check the Troubleshooting section
- Review the API documentation when the API is running
- Create an issue in the GitHub repository
Example `.vscode/mcp.json` using `npx`:

```json
{
  "servers": {
    "angular-cli": {
      "command": "npx",
      "args": ["-y", "@angular/cli", "mcp"]
    }
  }
}
```