Commit c025eab

Add comprehensive documentation for DeepSeek Wrapper

- Created multiple new documentation files: API Reference, Deployment Guide, FAQ, Features, Getting Started, and Web UI Guide.
- Added README.md giving an overview of the documentation structure.
- Included an images directory with guidelines for the images needed to support the documentation.
- Provided detailed instructions for local and cloud deployment options, environment variable configuration, and security recommendations.
- Documented core features, advanced features, and usage instructions to help developers and users make effective use of the DeepSeek Wrapper.

1 parent f5c679f commit c025eab

File tree

8 files changed: +727 −0 lines changed


docs/README.md

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
# DeepSeek Wrapper Documentation

This directory contains comprehensive documentation for the DeepSeek Wrapper project.

## Contents

- [Getting Started](getting-started.md) - Quick setup guide
- [Web UI Guide](web-ui-guide.md) - Guide to using the web interface
- [Features](features.md) - Overview of all features
- [API Reference](api-reference.md) - API documentation for developers
- [Deployment](deployment.md) - Deployment options and configurations
- [Contributing](CONTRIBUTING.md) - How to contribute to this project
- [FAQ](faq.md) - Frequently asked questions

For a detailed overview of the DeepSeek AI capabilities, see [deepseek-docs.md](deepseek-docs.md).

docs/api-reference.md

Lines changed: 192 additions & 0 deletions
@@ -0,0 +1,192 @@
# API Reference

This document describes the DeepSeek Wrapper API for developers who want to integrate with or extend the wrapper.

## REST API Endpoints

The DeepSeek Wrapper exposes a FastAPI-based REST API with the following endpoints:

### Chat Endpoints

#### `POST /api/chat`

Create a new chat message and get a response from the DeepSeek AI.

**Request Body:**
```json
{
  "message": "Your message here",
  "conversation_id": "optional-conversation-id",
  "system_prompt": "Optional system prompt to guide the AI's behavior"
}
```

**Response:**
```json
{
  "id": "response-id",
  "content": "AI response content",
  "conversation_id": "conversation-id"
}
```
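As a minimal sketch, the request body above can be assembled and prepared with Python's standard library. The `build_chat_request` helper and the `http://localhost:8000` base URL are illustrative assumptions, not part of the wrapper itself:

```python
import json
import urllib.request

def build_chat_request(message, conversation_id=None, system_prompt=None):
    """Assemble the JSON body for POST /api/chat; optional fields are omitted when unset."""
    body = {"message": message}
    if conversation_id is not None:
        body["conversation_id"] = conversation_id
    if system_prompt is not None:
        body["system_prompt"] = system_prompt
    return body

payload = build_chat_request("Hello!", system_prompt="Be concise.")
req = urllib.request.Request(
    "http://localhost:8000/api/chat",  # assumed local dev server address
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send the request and return the JSON response.
```

Unset optional fields are left out of the body entirely rather than sent as `null`, which keeps the payload compatible with either handling on the server side.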
#### `GET /api/chat/stream`

Stream a chat response using Server-Sent Events (SSE).

**Query Parameters:**
- `message` (string): The user's message
- `conversation_id` (string, optional): Conversation identifier
- `system_prompt` (string, optional): System prompt

**Response:**
Server-sent events with the following data format:
```json
{
  "id": "chunk-id",
  "content": "Partial content chunk",
  "is_complete": false
}
```

The final chunk will have `is_complete` set to `true`.
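A client consuming this stream concatenates the `content` of each chunk until it sees `is_complete`. The sketch below assumes each SSE `data:` line carries the JSON shown above; `assemble_sse_chunks` is a hypothetical helper, not part of the wrapper:

```python
import json

def assemble_sse_chunks(sse_lines):
    """Concatenate `content` fields from SSE `data:` lines until is_complete is true."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data:"):
            continue  # skip SSE comments and keep-alive lines
        chunk = json.loads(line[len("data:"):].strip())
        parts.append(chunk["content"])
        if chunk.get("is_complete"):
            break
    return "".join(parts)

sample = [
    'data: {"id": "1", "content": "Hel", "is_complete": false}',
    'data: {"id": "2", "content": "lo!", "is_complete": true}',
]
print(assemble_sse_chunks(sample))  # -> Hello!
```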
### Conversation Endpoints

#### `GET /api/conversations`

Get a list of all conversations.

**Response:**
```json
[
  {
    "id": "conversation-id",
    "title": "Conversation title",
    "created_at": "2023-06-15T14:30:00Z",
    "updated_at": "2023-06-15T14:35:00Z"
  }
]
```

#### `GET /api/conversations/{conversation_id}`

Get details and messages for a specific conversation.

**Response:**
```json
{
  "id": "conversation-id",
  "title": "Conversation title",
  "created_at": "2023-06-15T14:30:00Z",
  "updated_at": "2023-06-15T14:35:00Z",
  "messages": [
    {
      "id": "message-id",
      "role": "user",
      "content": "User message",
      "created_at": "2023-06-15T14:30:00Z"
    },
    {
      "id": "response-id",
      "role": "assistant",
      "content": "AI response",
      "created_at": "2023-06-15T14:31:00Z"
    }
  ]
}
```

#### `DELETE /api/conversations/{conversation_id}`

Delete a specific conversation.

**Response:** Status 204 No Content

### Document Endpoints

#### `POST /api/documents/upload`

Upload a document for processing.

**Request:** Multipart form data with a `file` field containing the document.

**Response:**
```json
{
  "id": "document-id",
  "filename": "document.pdf",
  "content": "Extracted text from the document",
  "content_type": "application/pdf",
  "size": 1024
}
```
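Since the server enforces an upload size limit (see the `MAX_UPLOAD_SIZE` environment variable in the Deployment Guide), a client may want to check file size before uploading. This is an illustrative sketch; `within_upload_limit` is a hypothetical helper and the 10 MB default is the documented server default:

```python
import os
import tempfile

MAX_UPLOAD_SIZE_MB = int(os.getenv("MAX_UPLOAD_SIZE", "10"))  # mirrors the documented default of 10 MB

def within_upload_limit(path, limit_mb=MAX_UPLOAD_SIZE_MB):
    """Client-side guard: True when the file fits under the configured upload limit."""
    return os.path.getsize(path) <= limit_mb * 1024 * 1024

# Demonstrate with a small throwaway file standing in for a document
with tempfile.NamedTemporaryFile(delete=False, suffix=".pdf") as f:
    f.write(b"x" * 1024)  # 1 KiB sample payload
    sample_path = f.name

ok = within_upload_limit(sample_path)  # 1 KiB is well under 10 MB
os.unlink(sample_path)
```

Checking locally avoids a round trip that would end in a `400 Bad Request` for oversized files.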
## Python Client API

The DeepSeek Wrapper provides a Python client for programmatic access to the DeepSeek API.

### Basic Usage

```python
from deepseek_wrapper.client import DeepSeekClient

# Initialize the client
client = DeepSeekClient(api_key="your-api-key")

# Simple chat completion
response = client.chat_completion(
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)
print(response.content)

# Streaming response
for chunk in client.chat_completion_stream(
    messages=[
        {"role": "user", "content": "Write a short story about a robot."}
    ]
):
    print(chunk.content, end="", flush=True)
```

### Client Methods

#### `chat_completion(messages, system_prompt=None, **kwargs)`

Get a complete response for the given messages.

**Parameters:**
- `messages` (List[Dict]): List of message objects with `role` and `content`
- `system_prompt` (str, optional): System prompt to guide the AI
- `**kwargs`: Additional parameters to pass to the API

**Returns:** Response object with attributes such as `id` and `content`

#### `chat_completion_stream(messages, system_prompt=None, **kwargs)`

Stream the response for the given messages.

**Parameters:**
- `messages` (List[Dict]): List of message objects with `role` and `content`
- `system_prompt` (str, optional): System prompt to guide the AI
- `**kwargs`: Additional parameters to pass to the API

**Returns:** Generator yielding response chunks

#### `clean_up()`

Clean up resources used by the client, including closing the HTTP session.

## Error Handling

The API returns standard HTTP status codes:

- `200 OK`: Successful operation
- `400 Bad Request`: Invalid input
- `401 Unauthorized`: Missing or invalid API key
- `404 Not Found`: Resource not found
- `500 Internal Server Error`: Server error

Error responses include a JSON body with `error` and `message` fields.
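One common pattern for consuming these status codes is to convert non-2xx responses into a typed exception that preserves the `error` and `message` fields. The class and helper below are a hypothetical sketch, not part of the wrapper's public API:

```python
class DeepSeekAPIError(Exception):
    """Illustrative exception carrying the API's `error` and `message` fields."""
    def __init__(self, status, error, message):
        super().__init__(f"{status} {error}: {message}")
        self.status = status
        self.error = error
        self.message = message

def raise_for_api_status(status, body):
    """Pass 2xx responses through; raise DeepSeekAPIError for anything else."""
    if 200 <= status < 300:
        return body
    raise DeepSeekAPIError(status, body.get("error", "unknown"), body.get("message", ""))
```

Raising a single exception type with structured fields lets callers branch on `status` (for example, retrying on `500` but not on `401`) without string-parsing error messages.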

docs/deployment.md

Lines changed: 161 additions & 0 deletions
@@ -0,0 +1,161 @@
# Deployment Guide

This guide covers options for deploying the DeepSeek Wrapper application in various environments.

## Local Deployment

### Running for Development

For local development, run the application with:

```bash
python -m src.deepseek_wrapper.main
```

This starts the application with auto-reload enabled for development.

### Running for Production

For a more production-ready local deployment:

```bash
uvicorn src.deepseek_wrapper.main:app --host 0.0.0.0 --port 8000 --workers 4
```

This uses the Uvicorn ASGI server with multiple worker processes for better performance.
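The `--workers 4` value above is a reasonable starting point. A widely used sizing heuristic for Gunicorn/Uvicorn-style servers (a general rule of thumb, not something this project prescribes) is `2 * CPUs + 1`:

```python
import os

def suggested_workers(cpus=None):
    """Common heuristic (2 * CPUs + 1) for sizing ASGI/WSGI worker pools."""
    cpus = cpus or os.cpu_count() or 1
    return 2 * cpus + 1

# e.g. on a 2-core host: uvicorn ... --workers 5
print(suggested_workers())
```

Treat the result as a starting point and tune it against observed latency and memory usage; workloads that are mostly waiting on the DeepSeek API may tolerate more workers than CPU-bound ones.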
## Docker Deployment

### Building the Docker Image

1. Ensure Docker is installed on your system.
2. Build the image:
```bash
docker build -t deepseek-wrapper .
```

### Running with Docker

```bash
docker run -d -p 8000:8000 --env-file .env --name deepseek-wrapper deepseek-wrapper
```

This will:
- Run the container in detached mode (`-d`)
- Map port 8000 from the container to your host
- Use your local `.env` file for environment variables
- Name the container "deepseek-wrapper"

### Docker Compose

For easier management, you can use Docker Compose:

```yaml
# docker-compose.yml
version: '3'
services:
  deepseek-wrapper:
    build: .
    ports:
      - "8000:8000"
    environment:
      - DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY}
    volumes:
      - ./uploads:/app/uploads
    restart: unless-stopped
```

Run with:
```bash
docker-compose up -d
```

## Cloud Deployment

### Deploying to Heroku

1. Install the Heroku CLI and log in.
2. Initialize a Git repository if you have not already:
```bash
git init
git add .
git commit -m "Initial commit"
```
3. Create a Heroku app and deploy:
```bash
heroku create deepseek-wrapper
heroku config:set DEEPSEEK_API_KEY=your_api_key_here
git push heroku main
```

### Deploying to AWS Elastic Beanstalk

1. Install the EB CLI and initialize your project:
```bash
pip install awsebcli
eb init
```
2. Create and deploy to an environment:
```bash
eb create deepseek-wrapper-env
```
3. Set environment variables:
```bash
eb setenv DEEPSEEK_API_KEY=your_api_key_here
```

### Deploying to Azure App Service

1. Install the Azure CLI and log in:
```bash
az login
```
2. Create an App Service and deploy:
```bash
az webapp up --name deepseek-wrapper --resource-group your-resource-group --runtime "PYTHON:3.9"
```
3. Set environment variables:
```bash
az webapp config appsettings set --name deepseek-wrapper --resource-group your-resource-group --settings DEEPSEEK_API_KEY=your_api_key_here
```

## Environment Variables

The following environment variables can be configured for deployment:

| Variable | Description | Default |
|----------|-------------|---------|
| `DEEPSEEK_API_KEY` | Your DeepSeek API key (required) | None |
| `HOST` | Host to bind the server to | `0.0.0.0` |
| `PORT` | Port to run the server on | `8000` |
| `LOG_LEVEL` | Logging level (DEBUG, INFO, WARNING, ERROR) | `INFO` |
| `UPLOAD_DIR` | Directory for file uploads | `./uploads` |
| `MAX_UPLOAD_SIZE` | Maximum file upload size in MB | `10` |
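The settings in the table can be read in one place at startup. The `load_config` helper below is an illustrative sketch (not the wrapper's actual configuration code); its defaults mirror the table above:

```python
import os

def load_config(env=None):
    """Read the deployment settings, falling back to the documented defaults."""
    env = os.environ if env is None else env
    return {
        "api_key": env.get("DEEPSEEK_API_KEY"),  # required; intentionally no default
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8000")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        "upload_dir": env.get("UPLOAD_DIR", "./uploads"),
        "max_upload_mb": int(env.get("MAX_UPLOAD_SIZE", "10")),
    }
```

Accepting the environment as a parameter (defaulting to `os.environ`) keeps the loader easy to unit-test with a plain dict.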
## Scaling Considerations

For high-traffic deployments, consider:

1. Using a reverse proxy such as Nginx in front of the application
2. Implementing Redis for session storage
3. Deploying with multiple workers/instances
4. Setting up a load balancer for horizontal scaling

## Security Recommendations

1. Always use HTTPS in production.
2. Set up proper authentication if the application is exposed to the public internet.
3. Keep your API key secret and supply it via environment variables.
4. Regularly update dependencies.
5. Consider using a WAF (Web Application Firewall).

## Monitoring

For production deployments, set up monitoring for:

1. Application logs
2. Server metrics (CPU, memory, disk usage)
3. Request/response times
4. Error rates

Popular monitoring tools include Prometheus, Grafana, and cloud provider monitoring services.
