initial commit

2025-10-19 22:09:35 +03:00
commit 6d593b4554
114 changed files with 23622 additions and 0 deletions

CODE_OF_CONDUCT.md (new file)
@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
igor.magalhaes.r@gmail.com.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.

CONTRIBUTING.md (new file)
@@ -0,0 +1,99 @@
# Contributing to FastAPI-boilerplate
Thank you for your interest in contributing to FastAPI-boilerplate! This guide is meant to make it easy for you to get started.
Contributions are appreciated, even if you are just reporting bugs, improving documentation, or answering questions. To contribute a feature:
## Setting Up Your Development Environment
### Cloning the Repository
Start by forking and cloning the FastAPI-boilerplate repository:
1. **Fork the Repository**: Begin by forking the project repository. You can do this by visiting https://github.com/igormagalhaesr/FastAPI-boilerplate and clicking the "Fork" button.
1. **Create a Feature Branch**: Once you've forked the repo, create a branch for your feature by running `git checkout -b feature/fooBar`.
1. **Testing Changes**: Ensure that your changes do not break existing functionality by running the test suite: from the root folder, execute `uv run pytest`.
### Using uv for Dependency Management
FastAPI-boilerplate uses uv for managing dependencies. If you don't have uv installed, follow the instructions on the [official uv website](https://docs.astral.sh/uv/).
Once uv is installed, navigate to the cloned repository and install the dependencies:
```sh
cd FastAPI-boilerplate
uv sync
```
### Activating the Virtual Environment
uv creates a virtual environment for your project. Activate it using:
```sh
source .venv/bin/activate
```
Alternatively, you can run commands directly with `uv run` without activating the environment:
```sh
uv run python your_script.py
```
## Making Contributions
### Coding Standards
- Follow PEP 8 guidelines.
- Write meaningful tests for new features or bug fixes.
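A meaningful test exercises observable behavior rather than implementation details. For instance (the function and file name here are hypothetical, not part of the boilerplate):

```python
# tests/test_slugify.py -- hypothetical example of a small, focused pytest test
def slugify(title: str) -> str:
    """Turn a post title into a URL-friendly slug."""
    return title.lower().strip().replace(" ", "-")

def test_slugify_lowercases_and_hyphenates():
    assert slugify("My First Post") == "my-first-post"

def test_slugify_strips_surrounding_whitespace():
    assert slugify("  Hello World  ") == "hello-world"
```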
### Testing with Pytest
FastAPI-boilerplate uses pytest for testing. Run tests using:
```sh
uv run pytest
```
### Linting
Use mypy for type checking:
```sh
mypy src
```
Use ruff for style:
```sh
ruff check --fix
ruff format
```
Ensure your code passes linting before submitting.
### Using pre-commit for Better Code Quality
pre-commit helps identify simple issues before code review. By running automated checks, it ensures code quality and consistency.
1. **Install Pre-commit**:
- **Installation**: Install pre-commit in your development environment. Use the command `uv add --dev pre-commit` or `pip install pre-commit`.
- **Setting Up Hooks**: After installing pre-commit, set up the hooks with `pre-commit install`. This command will install hooks into your .git/ directory which will automatically check your commits for issues.
1. **Committing Your Changes**:
After making your changes, use `git commit -am 'Add some fooBar'` to commit them. Pre-commit will run automatically on your files when you commit, ensuring that they meet the required standards.
Note: If pre-commit identifies issues, it may block your commit. Fix these issues and commit again. This ensures that all contributions are of high quality.
1. **Pushing Changes and Creating Pull Request**:
Push your changes to the branch using `git push origin feature/fooBar`.
Visit your fork on GitHub and create a new Pull Request to the main repository.
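pre-commit reads its hooks from a `.pre-commit-config.yaml` at the repository root. A minimal example wired to ruff (the hook revision below is illustrative; check the project's actual config before copying):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.9  # illustrative pin; use the current release
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
```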
### Additional Notes
**Stay Updated**: Keep your fork updated with the main repository to avoid merge conflicts. Regularly fetch and merge changes from the upstream repository.
**Adhere to Project Conventions**: Follow the coding style, conventions, and commit message guidelines of the project.
**Open Communication**: Feel free to ask questions or discuss your ideas by opening an issue or in discussions.
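Keeping your fork updated boils down to a fetch-and-merge loop against an `upstream` remote. The snippet below simulates the upstream repo in a throwaway directory so it runs anywhere; in your real fork, `upstream` would point at `https://github.com/igormagalhaesr/FastAPI-boilerplate.git`:

```shell
set -eu
# Demo in a temp directory: create a stand-in "upstream" repo, clone it as
# your "fork", then run the keep-up-to-date workflow.
tmp=$(mktemp -d)
git init -q -b main "$tmp/upstream"
git -C "$tmp/upstream" -c user.email=demo@example.com -c user.name=demo \
    commit --allow-empty -qm "upstream work"
git clone -q "$tmp/upstream" "$tmp/fork"
cd "$tmp/fork"
# In a real fork this URL would be the GitHub repository, not a local path
git remote add upstream "$tmp/upstream"
git fetch -q upstream
git merge -q upstream/main   # bring upstream changes into your branch
git log --oneline | head -n 1
```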
## Submitting Your Contributions
### Creating a Pull Request
After making your changes:
- Push your changes to your fork.
- Open a pull request with a clear description of your changes.
- Update the README.md if necessary.
### Code Reviews
- Address any feedback from code reviews.
- Once approved, your contributions will be merged into the main branch.
## Code of Conduct
Please adhere to our [Code of Conduct](CODE_OF_CONDUCT.md) to maintain a welcoming and inclusive environment.
Thank you for contributing to FastAPI-boilerplate! 🚀

Dockerfile (new file)
@@ -0,0 +1,44 @@
# --------- Builder Stage ---------
FROM ghcr.io/astral-sh/uv:python3.11-bookworm-slim AS builder
# Set environment variables for uv
ENV UV_COMPILE_BYTECODE=1
ENV UV_LINK_MODE=copy
WORKDIR /app
# Install dependencies first (for better layer caching)
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --locked --no-install-project
# Copy the project source code
COPY . /app
# Install the project in non-editable mode
RUN --mount=type=cache,target=/root/.cache/uv \
uv sync --locked --no-editable
# --------- Final Stage ---------
FROM python:3.11-slim-bookworm
# Create a non-root user for security
RUN groupadd --gid 1000 app \
&& useradd --uid 1000 --gid app --shell /bin/bash --create-home app
# Copy the virtual environment from the builder stage
COPY --from=builder --chown=app:app /app/.venv /app/.venv
# Ensure the virtual environment is in the PATH
ENV PATH="/app/.venv/bin:$PATH"
# Switch to the non-root user
USER app
# Set the working directory
WORKDIR /code
# -------- comment this line and uncomment the next to run with gunicorn --------
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
# CMD ["gunicorn", "app.main:app", "-w", "4", "-k", "uvicorn.workers.UvicornWorker", "-b", "0.0.0.0:8000"]

LICENSE.md (new file)
@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2023 Igor Magalhães
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md (new file, 2225 lines; diff suppressed because it is too large)

build-docker.sh (new executable file)
@@ -0,0 +1,2 @@
#!/bin/sh
docker build -t git.logidex.ru/fakz9/tbank-api-logidex:latest .
docker push git.logidex.ru/fakz9/tbank-api-logidex:latest

default.conf (new file)
@@ -0,0 +1,32 @@
# ---------------- Running With One Server ----------------
server {
listen 80;
location / {
proxy_pass http://web:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
# # ---------------- To Run with Multiple Servers, Uncomment below ----------------
# upstream fastapi_app {
# server fastapi1:8000; # Replace with actual server names or IP addresses
# server fastapi2:8000;
# # Add more servers as needed
# }
# server {
# listen 80;
# location / {
# proxy_pass http://fastapi_app;
# proxy_set_header Host $host;
# proxy_set_header X-Real-IP $remote_addr;
# proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
# proxy_set_header X-Forwarded-Proto $scheme;
# }
# }

docker-compose.prod.yml (new file)
@@ -0,0 +1,57 @@
services:
web:
image: git.logidex.ru/fakz9/tbank-api-logidex:latest
build:
context: .
dockerfile: Dockerfile
command: gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
env_file:
- ./src/.env
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
networks:
- appnet
- proxy
labels:
- "traefik.enable=true"
worker:
build:
context: .
dockerfile: Dockerfile
command: arq app.core.worker.settings.WorkerSettings
env_file:
- ./src/.env
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
networks:
- appnet
db:
image: postgres:17
env_file:
- ./src/.env
volumes:
- postgres-data:/var/lib/postgresql/data
networks:
- appnet
redis:
image: redis:alpine
volumes:
- redis-data:/data
networks:
- appnet
volumes:
postgres-data:
redis-data:
networks:
appnet:
external: false
proxy:
external: true

docker-compose.test.yml (new file)
@@ -0,0 +1,8 @@
services:
web:
user: root # Run as root for tests to allow global package installation
environment:
- PYTHONPATH=/usr/local/lib/python3.11/site-packages
command: bash -c "pip install faker pytest-asyncio pytest-mock && pytest tests/ -v"
volumes:
- ./tests:/code/tests

docker-compose.yml (new file)
@@ -0,0 +1,81 @@
services:
web:
image: git.logidex.ru/fakz9/tbank-api-logidex:latest
build:
context: .
dockerfile: Dockerfile
    # -------- comment this line and uncomment the next to run with gunicorn --------
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
# command: gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
env_file:
- ./src/.env
    # -------- if you are using nginx, comment out `ports` and uncomment `expose` --------
ports:
- "8000:8000"
# expose:
# - "8000"
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
worker:
build:
context: .
dockerfile: Dockerfile
command: arq app.core.worker.settings.WorkerSettings
env_file:
- ./src/.env
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
db:
image: postgres:13
env_file:
- ./src/.env
volumes:
- postgres-data:/var/lib/postgresql/data
    # -------- comment out `ports` below if you run migrations inside docker --------
expose:
- "5432"
ports:
- "5432:5432"
redis:
image: redis:alpine
volumes:
- redis-data:/data
expose:
- "6379"
pgadmin:
container_name: pgadmin4
image: dpage/pgadmin4:latest
restart: always
ports:
- "5050:80"
volumes:
- pgadmin-data:/var/lib/pgadmin
env_file:
- ./src/.env
depends_on:
- db
nginx:
image: nginx:latest
ports:
- "80:80"
volumes:
- ./default.conf:/etc/nginx/conf.d/default.conf
depends_on:
- web
volumes:
postgres-data:
redis-data:
pgadmin-data:

(binary image file added; 390 KiB, not shown)

docs/community.md (new file)
@@ -0,0 +1,96 @@
# Community
Join our vibrant Discord community to connect with other developers, get help, share your projects, and stay updated with the latest developments!
## 🚀 Discord Server
**[Join our Discord community](https://discord.gg/jhhbkxBmhj)**
Welcome to the **Benav Labs** community! Our Discord server is the central hub where developers using our FastAPI boilerplate and other products can connect, collaborate, and grow together.
## 🏠 What to Expect
When you join our Discord server, you'll find an organized space designed for productive discussions and community building. Here's what you can expect:
### Community Guidelines
Our community is built on respect and collaboration. We maintain:
- A welcoming environment for developers of all skill levels
- Clear guidelines to keep discussions productive and on-topic
- Direct feedback channels to the Benav Labs team
- A safe space for sharing projects and asking questions
---
### 📚 Channel Overview
- **🤝 Networking**
Connect with fellow developers, share your background, and explore collaboration opportunities. Perfect for introductions and finding project partners.
- **📚 Products**
Learn about the Benav Labs ecosystem, including FastroAI and other tools. Get updates on new features and product roadmaps.
- **📸 Showcase**
Share what you've built using our tools! From quick prototypes to full production applications, the community loves seeing your creations. Projects may be featured in our blog or community highlights.
- **🗒️ Blog**
Stay updated with our latest technical blog posts, tutorials, and insights about FastAPI, AI development, and best practices.
- **📣 Announcements**
Official updates from the Benav Labs team about new releases, important changes, and community events.
- **💬 General**
Open discussions about development, troubleshooting, and general chat. Great for quick questions and casual conversations.
- **🎤 Community Voice**
Join live voice conversations, community calls, and interactive discussions with other members and the team.
---
## 🌟 Community Benefits
By joining our Discord community, you get:
- **Direct support** from the Benav Labs team and experienced community members
- **Early insights** into new features and product developments
- **Networking opportunities** with developers building similar projects
- **Project showcase** opportunities for visibility and feedback
- **Real-time help** with FastAPI boilerplate and development questions
## 🎯 Getting Started
1. **Join the server** using the invite link above
2. **Read the welcome message** and community guidelines
3. **Introduce yourself** in the networking channel
4. **Explore the channels** to see ongoing discussions
5. **Ask questions** - the community is here to help!
## 💬 Feedback & Support
We actively encourage feedback and suggestions! The community provides multiple ways to share your thoughts:
- Direct messages to team members for product feedback
- Public discussions for feature requests and improvements
- Bug reports and technical issues
- General suggestions for the community and products
---
## 🔗 Quick Links
- **Discord Server:** [discord.gg/jhhbkxBmhj](https://discord.gg/jhhbkxBmhj)
- **FastroAI:** [benav.io/fastroai](https://benav.io/fastroai)
- **Blog:** [fastro.ai/blog](https://fastro.ai/blog)
- **Benav Labs:** [benav.io](https://benav.io)
---
We're excited to have you as part of our community! 🚀

@@ -0,0 +1,163 @@
# Configuration
This guide covers the essential configuration steps to get your FastAPI application running quickly.
## Quick Setup
The fastest way to get started is to copy the example environment file and modify just a few values:
```bash
cp src/.env.example src/.env
```
## Essential Configuration
Open `src/.env` and set these required values:
### Application Settings
```env
# App Settings
APP_NAME="Your app name here"
APP_DESCRIPTION="Your app description here"
APP_VERSION="0.1"
CONTACT_NAME="Your name"
CONTACT_EMAIL="Your email"
LICENSE_NAME="The license you picked"
```
### Database Connection
```env
# Database
POSTGRES_USER="your_postgres_user"
POSTGRES_PASSWORD="your_password"
POSTGRES_SERVER="localhost" # Use "db" for Docker Compose
POSTGRES_PORT=5432 # Use 5432 for Docker Compose
POSTGRES_DB="your_database_name"
```
### PGAdmin (Optional)
For database administration:
```env
# PGAdmin
PGADMIN_DEFAULT_EMAIL="your_email_address"
PGADMIN_DEFAULT_PASSWORD="your_password"
PGADMIN_LISTEN_PORT=80
```
**To connect to the database in PGAdmin:**
1. Login with `PGADMIN_DEFAULT_EMAIL` and `PGADMIN_DEFAULT_PASSWORD`
2. Click "Add Server"
3. Use these connection settings:
- **Hostname/address**: `db` (if using containers) or `localhost`
- **Port**: Value from `POSTGRES_PORT`
- **Database**: `postgres` (leave as default)
- **Username**: Value from `POSTGRES_USER`
- **Password**: Value from `POSTGRES_PASSWORD`
### Security
Generate a secret key and set it:
```bash
# Generate a secure secret key
openssl rand -hex 32
```
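If `openssl` is not available, Python's `secrets` module produces an equivalent 64-character hex key:

```shell
python3 -c "import secrets; print(secrets.token_hex(32))"
```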
```env
# Cryptography
SECRET_KEY="your-generated-secret-key-here" # Result of openssl rand -hex 32
ALGORITHM="HS256" # Default: HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30 # Default: 30
REFRESH_TOKEN_EXPIRE_DAYS=7 # Default: 7
```
### First Admin User
```env
# Admin User
ADMIN_NAME="your_name"
ADMIN_EMAIL="your_email"
ADMIN_USERNAME="your_username"
ADMIN_PASSWORD="your_password"
```
### Redis Configuration
```env
# Redis Cache
REDIS_CACHE_HOST="localhost" # Use "redis" for Docker Compose
REDIS_CACHE_PORT=6379
# Client-side Cache
CLIENT_CACHE_MAX_AGE=30 # Default: 30 seconds
# Redis Job Queue
REDIS_QUEUE_HOST="localhost" # Use "redis" for Docker Compose
REDIS_QUEUE_PORT=6379
# Redis Rate Limiting
REDIS_RATE_LIMIT_HOST="localhost" # Use "redis" for Docker Compose
REDIS_RATE_LIMIT_PORT=6379
```
!!! warning "Redis in Production"
You may use the same Redis instance for caching and queues while developing, but use separate containers in production.
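In production, that separation translates into distinct Redis services in your compose file, each with its own host name in `.env`. A sketch under assumed service names (not the shipped configuration):

```yaml
services:
  redis-cache:
    image: redis:alpine
    networks:
      - appnet
  redis-queue:
    image: redis:alpine
    networks:
      - appnet
```

With this layout you would set `REDIS_CACHE_HOST="redis-cache"` and `REDIS_QUEUE_HOST="redis-queue"`.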
### Rate Limiting Defaults
```env
# Default Rate Limits
DEFAULT_RATE_LIMIT_LIMIT=10 # Default: 10 requests
DEFAULT_RATE_LIMIT_PERIOD=3600 # Default: 3600 seconds (1 hour)
```
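These two values mean: at most `DEFAULT_RATE_LIMIT_LIMIT` requests per client per `DEFAULT_RATE_LIMIT_PERIOD` seconds. A minimal in-memory sketch of that fixed-window check (illustrative only; the boilerplate enforces this via Redis):

```python
import time

def allow_request(counters: dict, client_id: str, limit: int = 10, period: int = 3600) -> bool:
    """Fixed-window limiter: permit at most `limit` requests per `period` seconds."""
    window = int(time.time() // period)        # index of the current time window
    key = (client_id, window)
    if counters.get(key, 0) >= limit:
        return False                           # budget for this window exhausted
    counters[key] = counters.get(key, 0) + 1
    return True
```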
### First Tier
```env
# Default Tier
TIER_NAME="free"
```
## Environment Types
Set your environment type:
```env
ENVIRONMENT="local" # local, staging, or production
```
- **local**: API docs available at `/docs`, `/redoc`, and `/openapi.json`
- **staging**: API docs available to superusers only
- **production**: API docs completely disabled
## Docker Compose Settings
If using Docker Compose, use these values instead:
```env
# Docker Compose values
POSTGRES_SERVER="db"
REDIS_CACHE_HOST="redis"
REDIS_QUEUE_HOST="redis"
REDIS_RATE_LIMIT_HOST="redis"
```
## Optional Services
The boilerplate includes Redis for caching, job queues, and rate limiting. If running locally without Docker, either:
1. **Install Redis** and keep the default settings
2. **Disable Redis services** (see [User Guide - Configuration](../user-guide/configuration/index.md) for details)
## That's It!
With these basic settings configured, you can start the application:
- **Docker Compose**: `docker compose up`
- **Manual**: `uv run uvicorn src.app.main:app --reload`
For detailed configuration options, advanced settings, and production deployment, see the [User Guide - Configuration](../user-guide/configuration/index.md).

@@ -0,0 +1,594 @@
# First Run Guide
Congratulations on setting up the FastAPI Boilerplate! This guide will walk you through testing your installation, understanding the basics, and making your first customizations.
## Verification Checklist
Before diving deeper, let's verify everything is working correctly.
### 1. Check All Services
Ensure all services are running:
```bash
# For Docker Compose users
docker compose ps
# Expected output:
# NAME COMMAND SERVICE STATUS
# fastapi-boilerplate-web-1 "uvicorn app.main:app…" web running
# fastapi-boilerplate-db-1 "docker-entrypoint.s…" db running
# fastapi-boilerplate-redis-1 "docker-entrypoint.s…" redis running
# fastapi-boilerplate-worker-1 "arq src.app.core.wo…" worker running
```
### 2. Test API Endpoints
Visit these URLs to confirm your API is working:
**API Documentation:**
- **Swagger UI**: [http://localhost:8000/docs](http://localhost:8000/docs)
- **ReDoc**: [http://localhost:8000/redoc](http://localhost:8000/redoc)
**Health Check:**
```bash
curl http://localhost:8000/api/v1/health
```
Expected response:
```json
{
"status": "healthy",
"timestamp": "2024-01-01T12:00:00Z"
}
```
### 3. Database Connection
Check if the database tables were created:
```bash
# For Docker Compose
docker compose exec db psql -U postgres -d myapp -c "\dt"
# You should see tables like:
# public | users | table | postgres
# public | posts | table | postgres
# public | tiers | table | postgres
# public | rate_limits | table | postgres
```
### 4. Redis Connection
Test Redis connectivity:
```bash
# For Docker Compose
docker compose exec redis redis-cli ping
# Expected response: PONG
```
## Initial Setup
Before testing features, you need to create the first superuser and tier.
### Creating the First Superuser
!!! warning "Prerequisites"
Make sure the database and tables are created before running create_superuser. The database should be running and the API should have started at least once.
#### Using Docker Compose
If using Docker Compose, uncomment this section in your `docker-compose.yml`:
```yaml
#-------- uncomment to create first superuser --------
create_superuser:
build:
context: .
dockerfile: Dockerfile
env_file:
- ./src/.env
depends_on:
- db
command: python -m src.scripts.create_first_superuser
volumes:
- ./src:/code/src
```
Then run:
```bash
# Start services and run create_superuser automatically
docker compose up -d
# Or run it manually
docker compose run --rm create_superuser
# Stop the create_superuser service when done
docker compose stop create_superuser
```
#### From Scratch
If running manually, use:
```bash
# Make sure you're in the root folder
uv run python -m src.scripts.create_first_superuser
```
### Creating the First Tier
!!! warning "Prerequisites"
Make sure the database and tables are created before running create_tier.
#### Using Docker Compose
Uncomment the `create_tier` service in `docker-compose.yml` and run:
```bash
docker compose run --rm create_tier
```
#### From Scratch
```bash
# Make sure you're in the root folder
uv run python -m src.scripts.create_first_tier
```
## Testing Core Features
Let's test the main features of your API.
### Authentication Flow
#### 1. Login with Admin User
Use the admin credentials you set in your `.env` file:
```bash
curl -X POST "http://localhost:8000/api/v1/login" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "username=admin&password=your_admin_password"
```
You should receive a response like:
```json
{
"access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9...",
"token_type": "bearer",
"refresh_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9..."
}
```
#### 2. Create a New User
```bash
curl -X POST "http://localhost:8000/api/v1/users" \
-H "Content-Type: application/json" \
-d '{
"name": "John Doe",
"username": "johndoe",
"email": "john@example.com",
"password": "securepassword123"
}'
```
#### 3. Test Protected Endpoint
Use the access token from step 1:
```bash
curl -X GET "http://localhost:8000/api/v1/users/me" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
### CRUD Operations
#### 1. Create a Post
```bash
curl -X POST "http://localhost:8000/api/v1/posts" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE" \
-d '{
"title": "My First Post",
"content": "This is the content of my first post!"
}'
```
#### 2. Get All Posts
```bash
curl -X GET "http://localhost:8000/api/v1/posts" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
#### 3. Get Posts with Pagination
```bash
curl -X GET "http://localhost:8000/api/v1/posts?page=1&items_per_page=5" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
### Background Tasks
Test the job queue system:
#### 1. Submit a Background Task
```bash
curl -X POST "http://localhost:8000/api/v1/tasks/task?message=hello" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
Response:
```json
{
"id": "550e8400-e29b-41d4-a716-446655440000"
}
```
#### 2. Check Task Status
```bash
curl -X GET "http://localhost:8000/api/v1/tasks/task/550e8400-e29b-41d4-a716-446655440000" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
### Caching
Test the caching system:
#### 1. Make a Cached Request
```bash
# First request (cache miss)
curl -X GET "http://localhost:8000/api/v1/users/johndoe" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE" \
-w "Time: %{time_total}s\n"
# Second request (cache hit - should be faster)
curl -X GET "http://localhost:8000/api/v1/users/johndoe" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE" \
-w "Time: %{time_total}s\n"
```
## Your First Customization
Let's create a simple custom endpoint to see how easy it is to extend the boilerplate.
### 1. Create a Simple Model
Create `src/app/models/item.py`:
```python
from sqlalchemy import String
from sqlalchemy.orm import Mapped, mapped_column
from app.core.db.database import Base
class Item(Base):
__tablename__ = "items"
id: Mapped[int] = mapped_column("id", autoincrement=True, nullable=False, unique=True, primary_key=True, init=False)
name: Mapped[str] = mapped_column(String(100))
description: Mapped[str] = mapped_column(String(500), default="")
```
### 2. Create Pydantic Schemas
Create `src/app/schemas/item.py`:
```python
from pydantic import BaseModel, Field
class ItemBase(BaseModel):
name: str = Field(..., min_length=1, max_length=100)
description: str = Field("", max_length=500)
class ItemCreate(ItemBase):
pass
class ItemCreateInternal(ItemCreate):
pass
class ItemRead(ItemBase):
id: int
class ItemUpdate(BaseModel):
name: str | None = None
description: str | None = None
class ItemUpdateInternal(ItemUpdate):
pass
class ItemDelete(BaseModel):
is_deleted: bool = True
```
### 3. Create CRUD Operations
Create `src/app/crud/crud_items.py`:
```python
from fastcrud import FastCRUD
from app.models.item import Item
from app.schemas.item import ItemCreateInternal, ItemUpdate, ItemUpdateInternal, ItemDelete
CRUDItem = FastCRUD[Item, ItemCreateInternal, ItemUpdate, ItemUpdateInternal, ItemDelete]
crud_items = CRUDItem(Item)
```
### 4. Create API Endpoints
Create `src/app/api/v1/items.py`:
```python
from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.dependencies import get_current_user
from app.core.db.database import async_get_db
from app.crud.crud_items import crud_items
from app.schemas.item import ItemCreate, ItemRead, ItemUpdate
from app.schemas.user import UserRead
router = APIRouter(tags=["items"])
@router.post("/", response_model=ItemRead, status_code=201)
async def create_item(
item: ItemCreate,
db: Annotated[AsyncSession, Depends(async_get_db)],
current_user: Annotated[UserRead, Depends(get_current_user)]
):
"""Create a new item."""
db_item = await crud_items.create(db=db, object=item)
return db_item
@router.get("/{item_id}", response_model=ItemRead)
async def get_item(
item_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
"""Get an item by ID."""
db_item = await crud_items.get(db=db, id=item_id)
if not db_item:
raise HTTPException(status_code=404, detail="Item not found")
return db_item
@router.get("/", response_model=list[ItemRead])
async def get_items(
db: Annotated[AsyncSession, Depends(async_get_db)],
skip: int = 0,
limit: int = 100
):
"""Get all items."""
items = await crud_items.get_multi(db=db, offset=skip, limit=limit)
return items["data"]
@router.patch("/{item_id}", response_model=ItemRead)
async def update_item(
item_id: int,
item_update: ItemUpdate,
db: Annotated[AsyncSession, Depends(async_get_db)],
current_user: Annotated[UserRead, Depends(get_current_user)]
):
"""Update an item."""
db_item = await crud_items.get(db=db, id=item_id)
if not db_item:
raise HTTPException(status_code=404, detail="Item not found")
updated_item = await crud_items.update(db=db, object=item_update, id=item_id)
return updated_item
@router.delete("/{item_id}")
async def delete_item(
item_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)],
current_user: Annotated[UserRead, Depends(get_current_user)]
):
"""Delete an item."""
db_item = await crud_items.get(db=db, id=item_id)
if not db_item:
raise HTTPException(status_code=404, detail="Item not found")
await crud_items.delete(db=db, id=item_id)
return {"message": "Item deleted successfully"}
```
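The `skip`/`limit` parameters in `get_items` implement offset-based pagination: FastCRUD's `get_multi` translates them into SQL `OFFSET`/`LIMIT` and returns a dict whose `"data"` key holds the page. As a rough sketch of the semantics in plain Python (the `"total_count"` key name is illustrative, not a guarantee of FastCRUD's exact response shape):

```python
def paginate(rows: list, skip: int = 0, limit: int = 100) -> dict:
    """Mimic the OFFSET/LIMIT slice that get_multi performs in SQL."""
    page = rows[skip : skip + limit]
    return {"data": page, "total_count": len(rows)}


# Ten fake items with ids 1..10
items = [{"id": i} for i in range(1, 11)]

# skip=5, limit=5 -> the second page: ids 6..10
second_page = paginate(items, skip=5, limit=5)
```

With `skip=0, limit=100` (the endpoint's defaults) a small table comes back in one page; clients walk larger tables by incrementing `skip` by `limit`.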
### 5. Register the Router
Add your new router to `src/app/api/v1/__init__.py`:
```python
from fastapi import APIRouter
from app.api.v1.login import router as login_router
from app.api.v1.logout import router as logout_router
from app.api.v1.posts import router as posts_router
from app.api.v1.rate_limits import router as rate_limits_router
from app.api.v1.tasks import router as tasks_router
from app.api.v1.tiers import router as tiers_router
from app.api.v1.users import router as users_router
from app.api.v1.items import router as items_router # Add this line
router = APIRouter(prefix="/v1")
router.include_router(login_router, prefix="/login")
router.include_router(logout_router, prefix="/logout")
router.include_router(users_router, prefix="/users")
router.include_router(posts_router, prefix="/posts")
router.include_router(tasks_router, prefix="/tasks")
router.include_router(tiers_router, prefix="/tiers")
router.include_router(rate_limits_router, prefix="/rate_limits")
router.include_router(items_router, prefix="/items") # Add this line
```
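Prefixes nest: the boilerplate's main application mounts this v1 router under `/api`, so the final URL is `/api` + `/v1` + `/items` — which is why the curl examples below hit `http://localhost:8000/api/v1/items/`. A toy illustration of how the prefixes compose:

```python
def compose(*prefixes: str) -> str:
    """Join router prefixes the way nested include_router calls do."""
    return "".join(p.rstrip("/") for p in prefixes) or "/"


items_path = compose("/api", "/v1", "/items")
```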
### 6. Create and Run Migration
Import your new model in `src/app/models/__init__.py`:
```python
from .user import User
from .post import Post
from .tier import Tier
from .rate_limit import RateLimit
from .item import Item # Add this line
```
Create and run the migration:
```bash
# For Docker Compose
docker compose exec web alembic revision --autogenerate -m "Add items table"
docker compose exec web alembic upgrade head
# For manual installation
cd src
uv run alembic revision --autogenerate -m "Add items table"
uv run alembic upgrade head
```
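Alembic's `--autogenerate` works by comparing the metadata of every model imported in `src/app/models/__init__.py` against the live database — which is why the import above is required. A quick way to sanity-check that your model's table is part of the metadata (a sketch; it assumes your models package imports cleanly outside the app):

```python
# Minimal stand-in for how SQLAlchemy metadata collects tables.
# In the real project you would inspect Base.metadata.tables instead.
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata = MetaData()

# Declaring a table registers it in metadata.tables, just like
# importing a declarative model registers it on Base.metadata.
items = Table(
    "items",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)

registered = sorted(metadata.tables)
```

If your new table name is missing from `Base.metadata.tables` at migration time, the autogenerated revision will be empty.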
### 7. Test Your New Endpoint
Restart your application and test the new endpoints:
```bash
# Create an item
curl -X POST "http://localhost:8000/api/v1/items/" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE" \
-d '{
"name": "My First Item",
"description": "This is a test item"
}'
# Get all items
curl -X GET "http://localhost:8000/api/v1/items/" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE"
```
## Debugging Common Issues
### Logs and Monitoring
#### Check Application Logs
```bash
# For Docker Compose
docker compose logs web
# For manual installation
tail -f src/app/logs/app.log
```
#### Check Database Logs
```bash
# For Docker Compose
docker compose logs db
```
#### Check Worker Logs
```bash
# For Docker Compose
docker compose logs worker
```
### Performance Testing
#### Test API Response Times
```bash
# Test endpoint performance
curl -w "Time: %{time_total}s\n" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN_HERE" \
http://localhost:8000/api/v1/users/me
```
#### Test Database Performance
```bash
# Check active connections
docker compose exec db psql -U postgres -d myapp -c "SELECT count(*) FROM pg_stat_activity;"
```
## Monitoring Dashboard
### Redis Monitor
```bash
# Monitor Redis operations
docker compose exec redis redis-cli monitor
```
### Database Activity
```bash
# Check database activity
docker compose exec db psql -U postgres -d myapp -c "SELECT * FROM pg_stat_activity;"
```
## Next Steps
Now that you've verified everything works and created your first custom endpoint, you're ready to dive deeper:
### Essential Learning
1. **[Project Structure](../user-guide/project-structure.md)** - Understand how the code is organized
2. **[Database Guide](../user-guide/database/index.md)** - Learn about models, schemas, and CRUD operations
3. **[Authentication](../user-guide/authentication/index.md)** - Deep dive into JWT and user management
### Advanced Features
1. **[Caching](../user-guide/caching/index.md)** - Speed up your API with Redis caching
2. **[Background Tasks](../user-guide/background-tasks/index.md)** - Process long-running tasks asynchronously
3. **[Rate Limiting](../user-guide/rate-limiting/index.md)** - Protect your API from abuse
### Development Workflow
1. **[Development Guide](../user-guide/development.md)** - Best practices for extending the boilerplate
2. **[Testing](../user-guide/testing.md)** - Write tests for your new features
3. **[Production](../user-guide/production.md)** - Deploy your API to production
## Getting Help
If you encounter any issues:
1. **Check the logs** for error messages
2. **Verify your configuration** in the `.env` file
3. **Search [existing issues](https://github.com/benavlabs/fastapi-boilerplate/issues)** on GitHub for common solutions
4. **Create a [new issue](https://github.com/benavlabs/fastapi-boilerplate/issues/new)** with detailed information
## Congratulations!
You've successfully:
- Verified your FastAPI Boilerplate installation
- Tested core API functionality
- Created your first custom endpoint
- Run database migrations
- Tested authentication and CRUD operations
You're now ready to build amazing APIs with FastAPI!

# Getting Started
Welcome to the FastAPI Boilerplate! This guide will have you up and running with a production-ready API in just a few minutes.
## Quick Start (5 minutes)
The fastest way to get started is using Docker Compose. This will set up everything you need including PostgreSQL, Redis, and the API server.
### Prerequisites
Make sure you have installed:
- [Docker](https://docs.docker.com/get-docker/) (20.10+)
- [Docker Compose](https://docs.docker.com/compose/install/) (v2+, the `docker compose` plugin)
### 1. Get the Template
Start by using this template for your new project:
1. Click **"Use this template"** on the [GitHub repository](https://github.com/benavlabs/fastapi-boilerplate)
2. Create a new repository with your project name
3. Clone your new repository:
```bash
git clone https://github.com/yourusername/your-project-name
cd your-project-name
```
### 2. Environment Setup
Create your environment configuration:
```bash
# Create the environment file
touch src/.env
```
Add the following basic configuration to `src/.env`:
```env
# Application
APP_NAME="My FastAPI App"
APP_DESCRIPTION="My awesome API"
APP_VERSION="0.1.0"
# Database
POSTGRES_USER="postgres"
POSTGRES_PASSWORD="changethis"
POSTGRES_SERVER="db"
POSTGRES_PORT=5432
POSTGRES_DB="myapp"
# Security
SECRET_KEY="your-secret-key-here"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
# Redis
REDIS_CACHE_HOST="redis"
REDIS_CACHE_PORT=6379
REDIS_QUEUE_HOST="redis"
REDIS_QUEUE_PORT=6379
# Admin User
ADMIN_NAME="Admin"
ADMIN_EMAIL="admin@example.com"
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="changethis"
# Environment
ENVIRONMENT="local"
```
!!! warning "Security Note"
    Generate a secure secret key using: `openssl rand -hex 32`
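If you prefer to stay in Python, the standard library's `secrets` module produces an equivalent key:

```python
import secrets

# 32 random bytes, hex-encoded -> a 64-character key,
# same strength as `openssl rand -hex 32`
secret_key = secrets.token_hex(32)
print(secret_key)
```

Paste the printed value into `SECRET_KEY` in `src/.env`.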
### 3. Start the Application
Launch all services with a single command:
```bash
docker compose up
```
This will start:
- **FastAPI server** on port 8000
- **PostgreSQL database**
- **Redis** for caching and job queues
- **Worker** for background tasks
### 4. Verify Installation
Once the containers are running, you should see output like:
```
fastapi-boilerplate-web-1 | INFO: Application startup complete.
fastapi-boilerplate-db-1 | database system is ready to accept connections
fastapi-boilerplate-worker-1 | redis_version=7.x.x mem_usage=1MB clients_connected=1
```
Visit these URLs to confirm everything is working:
- **API Documentation**: [http://localhost:8000/docs](http://localhost:8000/docs)
- **Alternative Docs**: [http://localhost:8000/redoc](http://localhost:8000/redoc)
- **Health Check**: [http://localhost:8000/api/v1/health](http://localhost:8000/api/v1/health)
## You're Ready!
Congratulations! You now have a fully functional FastAPI application with:
- REST API with automatic documentation
- PostgreSQL database with migrations
- Redis caching and job queues
- JWT authentication system
- Background task processing
- Rate limiting
- Admin user created
## Test Your API
Try these quick tests to see your API in action:
### 1. Health Check
```bash
curl http://localhost:8000/api/v1/health
```
### 2. Create a User
```bash
curl -X POST "http://localhost:8000/api/v1/users" \
-H "Content-Type: application/json" \
-d '{
"name": "John Doe",
"username": "johndoe",
"email": "john@example.com",
"password": "securepassword"
}'
```
### 3. Login
```bash
curl -X POST "http://localhost:8000/api/v1/login" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "username=johndoe&password=securepassword"
```
## Next Steps
Now that you have the basics running, explore these guides to learn more:
### Essential Reading
- **[Configuration Guide](configuration.md)** - Understand all configuration options
- **[Project Structure](../user-guide/project-structure.md)** - Learn how the code is organized
- **[Authentication](../user-guide/authentication/index.md)** - Set up user management
### Popular Features
- **[Database Operations](../user-guide/database/index.md)** - Working with models and CRUD
- **[Caching](../user-guide/caching/index.md)** - Speed up your API with Redis caching
- **[Background Tasks](../user-guide/background-tasks/index.md)** - Process jobs asynchronously
- **[Rate Limiting](../user-guide/rate-limiting/index.md)** - Protect your API from abuse
### Development & Deployment
- **[Development Guide](../user-guide/development.md)** - Extend and customize the boilerplate
- **[Testing](../user-guide/testing.md)** - Write tests for your API
- **[Production Deployment](../user-guide/production.md)** - Deploy to production
## Alternative Setup Methods
Not using Docker? No problem!
- **[Manual Installation](installation.md)** - Install dependencies manually
## Need Help?
- Join our **[Discord Community](../community.md)** - Get help from other developers
- Report issues on **[GitHub](https://github.com/benavlabs/fastapi-boilerplate/issues)**
---
**Ready to dive deeper?** Continue with the [detailed installation guide](installation.md) or explore the [user guide](../user-guide/index.md).

# Installation Guide
This guide covers different ways to install and set up the FastAPI Boilerplate depending on your needs and environment.
## System Requirements
Before you begin, ensure your system meets these requirements:
- **Python**: 3.11 or higher
- **Operating System**: Linux, macOS, or Windows (WSL2 recommended)
- **Memory**: Minimum 4GB RAM (8GB recommended)
- **Disk Space**: At least 2GB free space
## Method 1: Docker Compose (Recommended)
Docker Compose is the easiest way to get started. It handles all dependencies and services automatically.
### Prerequisites
Install these tools on your system:
- [Docker](https://docs.docker.com/get-docker/) (version 20.10+)
- [Docker Compose](https://docs.docker.com/compose/install/) (v2+, the `docker compose` plugin)
### Installation Steps
1. **Get the template**:
```bash
git clone https://github.com/benavlabs/fastapi-boilerplate
cd fastapi-boilerplate
```
2. **Set up environment**:
```bash
cp src/.env.example src/.env
# Edit src/.env with your configuration
```
3. **Start services**:
```bash
docker compose up -d
```
4. **Verify installation**:
```bash
curl http://localhost:8000/docs
```
### What Gets Installed
Docker Compose sets up these services:
- **Web server** (FastAPI + Uvicorn) on port 8000
- **PostgreSQL** database on port 5432 (internal)
- **Redis** server on port 6379 (internal)
- **ARQ Worker** for background tasks
- **NGINX** (optional, for production)
## Method 2: Manual Installation
For more control or development purposes, you can install everything manually.
### Prerequisites
1. **Install Python 3.11+**:
```bash
# On Ubuntu/Debian
sudo apt update
sudo apt install python3.11 python3.11-venv python3-pip
# On macOS (with Homebrew)
brew install python@3.11
# On Windows
# Download from python.org
```
2. **Install uv** (Python package manager):
```bash
pip install uv
```
3. **Install PostgreSQL**:
```bash
# On Ubuntu/Debian
sudo apt install postgresql postgresql-contrib
# On macOS
brew install postgresql
# On Windows
# Download from postgresql.org
```
4. **Install Redis**:
```bash
# On Ubuntu/Debian
sudo apt install redis-server
# On macOS
brew install redis
# On Windows
# Download from redis.io
```
### Installation Steps
1. **Clone the repository**:
```bash
git clone https://github.com/benavlabs/fastapi-boilerplate
cd fastapi-boilerplate
```
2. **Install Python dependencies**:
```bash
uv sync
```
3. **Set up environment variables**:
```bash
cp src/.env.example src/.env
# Edit src/.env with your local database/Redis settings
```
4. **Set up PostgreSQL**:
```bash
# Create database and user
sudo -u postgres psql
CREATE DATABASE myapp;
CREATE USER myuser WITH PASSWORD 'mypassword';
GRANT ALL PRIVILEGES ON DATABASE myapp TO myuser;
\q
```
5. **Run database migrations**:
```bash
cd src
uv run alembic upgrade head
```
6. **Create admin user**:
```bash
uv run python -m src.scripts.create_first_superuser
```
7. **Start the application**:
```bash
uv run uvicorn src.app.main:app --reload --host 0.0.0.0 --port 8000
```
8. **Start the worker** (in another terminal):
```bash
uv run arq src.app.core.worker.settings.WorkerSettings
```
## Method 3: Development Setup
For contributors and advanced users who want to modify the boilerplate.
### Additional Prerequisites
- **Git** for version control
- **Pre-commit** for code quality
### Installation Steps
1. **Fork and clone**:
```bash
# Fork the repository on GitHub first
git clone https://github.com/yourusername/fastapi-boilerplate
cd fastapi-boilerplate
```
2. **Install development dependencies**:
```bash
uv sync --group dev
```
3. **Set up pre-commit hooks**:
```bash
uv run pre-commit install
```
4. **Set up development environment**:
```bash
cp src/.env.example src/.env
# Configure for development
```
5. **Run tests to verify setup**:
```bash
uv run pytest
```
## Docker Services Breakdown
Understanding what each Docker service does:
### Web Service
```yaml
web:
build: .
ports:
- "8000:8000"
depends_on:
- db
- redis
```
- Runs the FastAPI application
- Handles HTTP requests
- Auto-reloads on code changes (development)
### Database Service
```yaml
db:
image: postgres:13
environment:
POSTGRES_DB: myapp
POSTGRES_USER: postgres
POSTGRES_PASSWORD: changethis
```
- PostgreSQL database server
- Persistent data storage
- Automatic initialization
### Redis Service
```yaml
redis:
image: redis:alpine
command: redis-server --appendonly yes
```
- In-memory data store
- Used for caching and job queues
- Persistent storage with AOF
### Worker Service
```yaml
worker:
build: .
command: arq src.app.core.worker.settings.WorkerSettings
depends_on:
- redis
```
- Background task processor
- Handles async jobs
- Scales independently
## Configuration
### Environment Variables
The application uses environment variables for configuration. Key variables:
```env
# Database
POSTGRES_USER=postgres
POSTGRES_PASSWORD=changethis
POSTGRES_SERVER=localhost # or "db" for Docker
POSTGRES_PORT=5432
POSTGRES_DB=myapp
# Redis
REDIS_CACHE_HOST=localhost # or "redis" for Docker
REDIS_CACHE_PORT=6379
# Security
SECRET_KEY=your-secret-key-here
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
```
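The security variables drive JWT creation: tokens are signed with `SECRET_KEY` using HS256 and carry an `exp` claim derived from `ACCESS_TOKEN_EXPIRE_MINUTES`. The real code uses a JWT library, but conceptually the signing step looks like this standard-library sketch (claim names `sub`/`exp` are standard JWT claims; everything else here is illustrative):

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timedelta, timezone

SECRET_KEY = "your-secret-key-here"
ACCESS_TOKEN_EXPIRE_MINUTES = 30


def _b64url(data: bytes) -> str:
    # JWT uses base64url without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_access_token(sub: str) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    exp = datetime.now(timezone.utc) + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
    payload = {"sub": sub, "exp": int(exp.timestamp())}
    signing_input = f"{_b64url(json.dumps(header).encode())}.{_b64url(json.dumps(payload).encode())}"
    signature = hmac.new(SECRET_KEY.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(signature)}"


token = create_access_token("johndoe")
```

Changing `SECRET_KEY` invalidates every outstanding token, since signatures no longer verify.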
### Database Connection
For manual installation, update your database settings:
```env
# Local PostgreSQL
POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
# Docker PostgreSQL
POSTGRES_SERVER=db
POSTGRES_PORT=5432
```
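Internally these variables are assembled into a single SQLAlchemy connection URL (the boilerplate's async setup uses the `postgresql+asyncpg` driver string). A sketch of the composition, using the example `.env` values:

```python
# Values mirror the .env examples above
settings = {
    "POSTGRES_USER": "postgres",
    "POSTGRES_PASSWORD": "changethis",
    "POSTGRES_SERVER": "localhost",  # or "db" for Docker
    "POSTGRES_PORT": "5432",
    "POSTGRES_DB": "myapp",
}


def database_url(cfg: dict) -> str:
    """Build the async SQLAlchemy DSN from the POSTGRES_* settings."""
    return (
        "postgresql+asyncpg://"
        f"{cfg['POSTGRES_USER']}:{cfg['POSTGRES_PASSWORD']}"
        f"@{cfg['POSTGRES_SERVER']}:{cfg['POSTGRES_PORT']}/{cfg['POSTGRES_DB']}"
    )


url = database_url(settings)
```

Swapping `POSTGRES_SERVER` between `localhost` and `db` is the only change needed when moving between manual and Docker setups.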
## Verification
After installation, verify everything works:
1. **API Documentation**: http://localhost:8000/docs
2. **Health Check**: http://localhost:8000/api/v1/health
3. **Database Connection**: Check logs for successful connection
4. **Redis Connection**: Test caching functionality
5. **Background Tasks**: Submit a test job
## Troubleshooting
### Common Issues
**Port Already in Use**:
```bash
# Check what's using port 8000
lsof -i :8000
# Kill the process
kill -9 <PID>
```
**Database Connection Error**:
```bash
# Check PostgreSQL status
sudo systemctl status postgresql
# Restart PostgreSQL
sudo systemctl restart postgresql
```
**Redis Connection Error**:
```bash
# Check Redis status
redis-cli ping
# Start Redis
redis-server
```
**Permission Errors**:
```bash
# Fix Docker permissions
sudo usermod -aG docker $USER
# Log out and back in
```
### Docker Issues
**Clean Reset**:
```bash
# Stop all containers
docker compose down
# Remove volumes (⚠️ deletes data)
docker compose down -v
# Rebuild images
docker compose build --no-cache
# Start fresh
docker compose up
```
## Next Steps
After successful installation:
1. **[Configuration Guide](configuration.md)** - Set up your environment
2. **[First Run](first-run.md)** - Test your installation
3. **[Project Structure](../user-guide/project-structure.md)** - Understand the codebase
## Need Help?
If you encounter issues:
- Search [existing issues](https://github.com/benavlabs/fastapi-boilerplate/issues) for common problems
- Create a [new issue](https://github.com/benavlabs/fastapi-boilerplate/issues/new) with details

# Benav Labs FastAPI Boilerplate
<p align="center">
<img src="assets/FastAPI-boilerplate.png" alt="Purple Rocket with FastAPI Logo as its window." width="35%" height="auto">
</p>
<p align="center">
<i>A production-ready FastAPI boilerplate to speed up your development.</i>
</p>
!!! warning "Documentation Status"
    This is our first version of the documentation. While functional, we acknowledge it's rough around the edges - there's a huge amount to document and we needed to start somewhere! We built this foundation (with a lot of AI assistance) so we can improve upon it.

    Better documentation, examples, and guides are actively being developed. Contributions and feedback are greatly appreciated!
<p align="center">
<a href="https://fastapi.tiangolo.com">
<img src="https://img.shields.io/badge/FastAPI-005571?style=for-the-badge&logo=fastapi" alt="FastAPI">
</a>
<a href="https://docs.pydantic.dev/2.4/">
<img src="https://img.shields.io/badge/Pydantic-E92063?logo=pydantic&logoColor=fff&style=for-the-badge" alt="Pydantic">
</a>
<a href="https://www.postgresql.org">
<img src="https://img.shields.io/badge/PostgreSQL-316192?style=for-the-badge&logo=postgresql&logoColor=white" alt="PostgreSQL">
</a>
<a href="https://redis.io">
<img src="https://img.shields.io/badge/Redis-DC382D?logo=redis&logoColor=fff&style=for-the-badge" alt="Redis">
</a>
<a href="https://docs.docker.com/compose/">
<img src="https://img.shields.io/badge/Docker-2496ED?logo=docker&logoColor=fff&style=for-the-badge" alt="Docker">
</a>
</p>
## What is FastAPI Boilerplate?
FastAPI Boilerplate is a comprehensive, production-ready template that provides everything you need to build scalable, async APIs using modern Python technologies. It combines the power of FastAPI with industry best practices to give you a solid foundation for your next project.
## Core Technologies
This boilerplate leverages cutting-edge Python technologies:
- **[FastAPI](https://fastapi.tiangolo.com)** - Modern, fast web framework for building APIs based on standard Python type hints
- **[Pydantic V2](https://docs.pydantic.dev/2.4/)** - Data validation library rewritten in Rust (5x-50x faster)
- **[SQLAlchemy 2.0](https://docs.sqlalchemy.org/en/20/)** - Python SQL toolkit and Object Relational Mapper
- **[PostgreSQL](https://www.postgresql.org)** - Advanced open source relational database
- **[Redis](https://redis.io)** - In-memory data store for caching and message brokering
- **[ARQ](https://arq-docs.helpmanual.io)** - Job queues and RPC with asyncio and Redis
- **[Docker](https://docs.docker.com/compose/)** - Containerization for easy deployment
- **[NGINX](https://nginx.org/en/)** - High-performance web server for reverse proxy and load balancing
## Key Features
### Performance & Scalability
- Fully async architecture
- Pydantic V2 for ultra-fast data validation
- SQLAlchemy 2.0 with efficient query patterns
- Built-in caching with Redis
- Horizontal scaling with NGINX load balancing
### Security & Authentication
- JWT-based authentication with refresh tokens
- Cookie-based secure token storage
- Role-based access control with user tiers
- Rate limiting to prevent abuse
- Production-ready security configurations
### Developer Experience
- Comprehensive CRUD operations with [FastCRUD](https://github.com/igorbenav/fastcrud)
- Automatic API documentation
- Database migrations with Alembic
- Background task processing
- Extensive test coverage
- Docker Compose for easy development
### Production Ready
- Environment-based configuration
- Structured logging
- Health checks and monitoring
- NGINX reverse proxy setup
- Gunicorn with Uvicorn workers
- Database connection pooling
## Quick Start
Get up and running in less than 5 minutes:
```bash
# Clone the repository
git clone https://github.com/benavlabs/fastapi-boilerplate
cd fastapi-boilerplate
# Start with Docker Compose
docker compose up
```
That's it! Your API will be available at `http://localhost:8000/docs`
**[Continue with the Getting Started Guide →](getting-started/index.md)**
## Documentation Structure
### For New Users
- **[Getting Started](getting-started/index.md)** - Quick setup and first steps
- **[User Guide](user-guide/index.md)** - Comprehensive feature documentation
### For Developers
- **[Development](user-guide/development.md)** - Extending and customizing the boilerplate
- **[Testing](user-guide/testing.md)** - Testing strategies and best practices
- **[Production](user-guide/production.md)** - Production deployment guides
## Perfect For
- **REST APIs** - Build robust, scalable REST APIs
- **Microservices** - Create microservice architectures
- **SaaS Applications** - Multi-tenant applications with user tiers
- **Data APIs** - APIs for data processing and analytics
## Community & Support
- **[Discord Community](community.md)** - Join our Discord server to connect with other developers
- **[GitHub Issues](https://github.com/benavlabs/fastapi-boilerplate/issues)** - Bug reports and feature requests
<hr>
<a href="https://benav.io">
<img src="https://github.com/benavlabs/fastcrud/raw/main/docs/assets/benav_labs_banner.png" alt="Powered by Benav Labs - benav.io"/>
</a>

/* Make only the header/favicon logo white, keep other instances purple */
.md-header__button.md-logo img,
.md-nav__button.md-logo img {
filter: brightness(0) invert(1);
}
/* Ensure header logo is white in both light and dark modes */
[data-md-color-scheme="default"] .md-header__button.md-logo img,
[data-md-color-scheme="default"] .md-nav__button.md-logo img {
filter: brightness(0) invert(1);
}
[data-md-color-scheme="slate"] .md-header__button.md-logo img,
[data-md-color-scheme="slate"] .md-nav__button.md-logo img {
filter: brightness(0) invert(1);
}
:root {
--md-primary-fg-color: #cd4bfb;
}

# Adding Models
Learn how to extend the admin interface with your new models by following the patterns established in the FastAPI boilerplate. The boilerplate already includes User, Tier, and Post models - we'll show you how to add your own models using these working examples.
> **CRUDAdmin Features**: This guide shows boilerplate-specific patterns. For advanced model configuration options and features, see the [CRUDAdmin documentation](https://benavlabs.github.io/crudadmin/).
## Understanding the Existing Setup
The boilerplate comes with three models already registered in the admin interface. Understanding how they're implemented will help you add your own models successfully.
### Current Model Registration
The admin interface is configured in `src/app/admin/views.py`:
```python
def register_admin_views(admin: CRUDAdmin) -> None:
"""Register all models and their schemas with the admin interface."""
# User model with password handling
password_transformer = PasswordTransformer(
password_field="password",
hashed_field="hashed_password",
hash_function=get_password_hash,
required_fields=["name", "username", "email"],
)
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
allowed_actions={"view", "create", "update"},
password_transformer=password_transformer,
)
admin.add_view(
model=Tier,
create_schema=TierCreate,
update_schema=TierUpdate,
allowed_actions={"view", "create", "update", "delete"}
)
admin.add_view(
model=Post,
create_schema=PostCreateAdmin, # Special admin-only schema
update_schema=PostUpdate,
allowed_actions={"view", "create", "update", "delete"}
)
```
Each model registration follows the same pattern: specify the SQLAlchemy model, appropriate Pydantic schemas for create/update operations, and define which actions are allowed.
## Step-by-Step Model Addition
Let's walk through adding a new model to your admin interface using a product catalog example.
### Step 1: Create Your Model
First, create your SQLAlchemy model following the boilerplate's patterns:
```python
# src/app/models/product.py
from datetime import UTC, datetime
from decimal import Decimal
from sqlalchemy import Boolean, DateTime, ForeignKey, Numeric, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from ..core.db.database import Base
class Product(Base):
    __tablename__ = "products"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100), nullable=False)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    price: Mapped[Decimal] = mapped_column(Numeric(10, 2), nullable=False)
    is_active: Mapped[bool] = mapped_column(Boolean, default=True)
    # Timezone-aware default; datetime.utcnow is deprecated since Python 3.12
    created_at: Mapped[datetime] = mapped_column(DateTime, default=lambda: datetime.now(UTC))
    # Foreign key relationship (similar to Post.created_by_user_id)
    category_id: Mapped[int] = mapped_column(ForeignKey("categories.id"))
```
### Step 2: Create Pydantic Schemas
Create schemas for the admin interface following the boilerplate's pattern:
```python
# src/app/schemas/product.py
from decimal import Decimal
from pydantic import BaseModel, Field
from typing import Annotated
class ProductCreate(BaseModel):
name: Annotated[str, Field(min_length=2, max_length=100)]
description: Annotated[str | None, Field(max_length=1000, default=None)]
price: Annotated[Decimal, Field(gt=0, le=999999.99)]
is_active: Annotated[bool, Field(default=True)]
category_id: Annotated[int, Field(gt=0)]
class ProductUpdate(BaseModel):
name: Annotated[str | None, Field(min_length=2, max_length=100, default=None)]
description: Annotated[str | None, Field(max_length=1000, default=None)]
price: Annotated[Decimal | None, Field(gt=0, le=999999.99, default=None)]
is_active: Annotated[bool | None, Field(default=None)]
category_id: Annotated[int | None, Field(gt=0, default=None)]
```
### Step 3: Register with Admin Interface
Add your model to `src/app/admin/views.py`:
```python
# Add import at the top
from ..models.product import Product
from ..schemas.product import ProductCreate, ProductUpdate
def register_admin_views(admin: CRUDAdmin) -> None:
"""Register all models and their schemas with the admin interface."""
# ... existing model registrations ...
# Add your new model
admin.add_view(
model=Product,
create_schema=ProductCreate,
update_schema=ProductUpdate,
allowed_actions={"view", "create", "update", "delete"}
)
```
### Step 4: Create and Run Migration
Generate the database migration for your new model:
```bash
# Generate migration
uv run alembic revision --autogenerate -m "Add product model"
# Apply migration
uv run alembic upgrade head
```
### Step 5: Test Your New Model
Start your application and test the new model in the admin interface:
```bash
# Start the application
uv run fastapi dev
# Visit http://localhost:8000/admin
# Login with your admin credentials
# You should see "Products" in the admin navigation
```
## Learning from Existing Models
Each model in the boilerplate demonstrates different admin interface patterns you can follow.
### User Model - Password Handling
The User model shows how to handle sensitive fields like passwords:
```python
# Password transformer for secure password handling
password_transformer = PasswordTransformer(
password_field="password", # Field in the schema
hashed_field="hashed_password", # Field in the database model
hash_function=get_password_hash, # Your app's hash function
required_fields=["name", "username", "email"], # Fields required for user creation
)
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
allowed_actions={"view", "create", "update"}, # No delete for users
password_transformer=password_transformer,
)
```
**When to use this pattern:**
- Models with password fields
- Any field that needs transformation before storage
- Fields requiring special security handling
### Tier Model - Simple CRUD
The Tier model demonstrates straightforward CRUD operations:
```python
admin.add_view(
model=Tier,
create_schema=TierCreate,
update_schema=TierUpdate,
allowed_actions={"view", "create", "update", "delete"} # Full CRUD
)
```
**When to use this pattern:**
- Reference data (categories, types, statuses)
- Configuration models
- Simple data without complex relationships
### Post Model - Admin-Specific Schemas
The Post model shows how to create admin-specific schemas when the regular API schemas don't work for admin purposes:
```python
# Special admin schema (different from regular PostCreate)
class PostCreateAdmin(BaseModel):
title: Annotated[str, Field(min_length=2, max_length=30)]
text: Annotated[str, Field(min_length=1, max_length=63206)]
created_by_user_id: int # Required in admin, but not in API
media_url: Annotated[str | None, Field(pattern=r"^(https?|ftp)://[^\s/$.?#].[^\s]*$", default=None)]
admin.add_view(
model=Post,
create_schema=PostCreateAdmin, # Admin-specific schema
update_schema=PostUpdate, # Regular update schema works fine
allowed_actions={"view", "create", "update", "delete"}
)
```
**When to use this pattern:**
- Models where admins need to set fields that users can't
- Models requiring additional validation for admin operations
- Cases where API schemas are too restrictive or too permissive for admin use
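The `media_url` pattern in `PostCreateAdmin` can be exercised directly with the standard-library `re` module to confirm what it accepts:

```python
import re

# Same pattern as the media_url Field above
MEDIA_URL_PATTERN = r"^(https?|ftp)://[^\s/$.?#].[^\s]*$"


def is_valid_media_url(url: str) -> bool:
    return re.match(MEDIA_URL_PATTERN, url) is not None
```

Pydantic applies `pattern=` the same way at validation time, rejecting the request with a 422 before your endpoint code runs.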
## Advanced Model Configuration
### Customizing Field Display
You can control how fields appear in the admin interface by modifying your schemas:
```python
class ProductCreateAdmin(BaseModel):
name: Annotated[str, Field(
min_length=2,
max_length=100,
description="Product name as shown to customers"
)]
description: Annotated[str | None, Field(
max_length=1000,
description="Detailed product description (supports HTML)"
)]
price: Annotated[Decimal, Field(
gt=0,
le=999999.99,
description="Price in USD (up to 2 decimal places)"
)]
category_id: Annotated[int, Field(
gt=0,
description="Product category (creates dropdown automatically)"
)]
```
### Restricting Actions
Control what operations are available for each model:
```python
# Read-only model (reports, logs, etc.)
admin.add_view(
model=AuditLog,
create_schema=None, # No creation allowed
update_schema=None, # No updates allowed
allowed_actions={"view"} # Only viewing
)
# No deletion allowed (users, critical data)
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
allowed_actions={"view", "create", "update"} # No delete
)
```
### Handling Complex Fields
Some models may have fields that don't work well in the admin interface. Use select schemas to exclude problematic fields:
```python
from pydantic import BaseModel
# Create a simplified view schema
class ProductAdminView(BaseModel):
id: int
name: str
price: Decimal
is_active: bool
# Exclude complex fields like large text or binary data
admin.add_view(
model=Product,
create_schema=ProductCreate,
update_schema=ProductUpdate,
select_schema=ProductAdminView, # Controls what's shown in lists
allowed_actions={"view", "create", "update", "delete"}
)
```
## Common Model Patterns
### Reference Data Models
For categories, types, and other reference data:
```python
# Simple reference model
class Category(Base):
__tablename__ = "categories"
id: Mapped[int] = mapped_column(primary_key=True)
name: Mapped[str] = mapped_column(String(50), unique=True)
description: Mapped[str | None] = mapped_column(Text)
# Simple schemas
class CategoryCreate(BaseModel):
name: str = Field(..., min_length=2, max_length=50)
description: str | None = None
# Registration
admin.add_view(
model=Category,
create_schema=CategoryCreate,
update_schema=CategoryCreate, # Same schema for create and update
allowed_actions={"view", "create", "update", "delete"}
)
```
### User-Generated Content
For content models with user associations:
```python
class BlogPost(Base):
__tablename__ = "blog_posts"
id: Mapped[int] = mapped_column(primary_key=True)
title: Mapped[str] = mapped_column(String(200))
content: Mapped[str] = mapped_column(Text)
author_id: Mapped[int] = mapped_column(ForeignKey("users.id"))
published_at: Mapped[datetime | None] = mapped_column(DateTime)
# Admin schema with required author
class BlogPostCreateAdmin(BaseModel):
title: str = Field(..., min_length=5, max_length=200)
content: str = Field(..., min_length=10)
author_id: int = Field(..., gt=0) # Admin must specify author
published_at: datetime | None = None
admin.add_view(
model=BlogPost,
create_schema=BlogPostCreateAdmin,
    update_schema=BlogPostUpdate,  # assumes a BlogPostUpdate schema defined elsewhere
allowed_actions={"view", "create", "update", "delete"}
)
```
### Configuration Models
For application settings and configuration:
```python
class SystemSetting(Base):
__tablename__ = "system_settings"
id: Mapped[int] = mapped_column(primary_key=True)
key: Mapped[str] = mapped_column(String(100), unique=True)
value: Mapped[str] = mapped_column(Text)
description: Mapped[str | None] = mapped_column(Text)
# Restricted actions - settings shouldn't be deleted
admin.add_view(
model=SystemSetting,
create_schema=SystemSettingCreate,
update_schema=SystemSettingUpdate,
allowed_actions={"view", "create", "update"} # No delete
)
```
## Testing Your Models
After adding models to the admin interface, test them thoroughly:
### Manual Testing
1. **Access**: Navigate to `/admin` and log in
2. **Create**: Try creating new records with valid and invalid data
3. **Edit**: Test updating existing records
4. **Validation**: Verify that your schema validation works correctly
5. **Relationships**: Test foreign key relationships (dropdowns should populate)
### Development Testing
```python
# Test your admin configuration
# src/scripts/test_admin.py
from app.admin.initialize import create_admin_interface
def test_admin_setup():
admin = create_admin_interface()
if admin:
print("Admin interface created successfully")
        print(f"Models registered: {len(admin._views)}")  # _views is internal to CRUDAdmin
for model_name in admin._views:
print(f" - {model_name}")
else:
print("Admin interface disabled")
if __name__ == "__main__":
test_admin_setup()
```
```bash
# Run the test
uv run python src/scripts/test_admin.py
```
## Updating Model Registration
When you need to modify how existing models appear in the admin interface:
### Adding Actions
```python
# Enable deletion for a model that previously didn't allow it
admin.add_view(
model=Product,
create_schema=ProductCreate,
update_schema=ProductUpdate,
allowed_actions={"view", "create", "update", "delete"} # Added delete
)
```
### Changing Schemas
```python
# Switch to admin-specific schemas
admin.add_view(
model=User,
create_schema=UserCreateAdmin, # New admin schema
update_schema=UserUpdateAdmin, # New admin schema
allowed_actions={"view", "create", "update"},
password_transformer=password_transformer,
)
```
### Performance Optimization
For models with many records, consider using select schemas to limit data:
```python
# Only show essential fields in lists
class UserListView(BaseModel):
id: int
username: str
email: str
is_active: bool
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
select_schema=UserListView, # Faster list loading
allowed_actions={"view", "create", "update"},
password_transformer=password_transformer,
)
```
## What's Next
With your models successfully added to the admin interface, you're ready to:
1. **[User Management](user-management.md)** - Learn how to manage admin users and implement security best practices
Your models are now fully integrated into the admin interface and ready for production use. The admin panel will automatically handle form generation, validation, and database operations based on your model and schema definitions.
# Configuration
Learn how to configure the admin panel (powered by [CRUDAdmin](https://github.com/benavlabs/crudadmin)) using the FastAPI boilerplate's built-in environment variable system. The admin panel is fully integrated with your application's configuration and requires no additional setup files or complex initialization.
> **About CRUDAdmin**: For complete configuration options and advanced features, see the [CRUDAdmin documentation](https://benavlabs.github.io/crudadmin/).
## Environment-Based Configuration
The FastAPI boilerplate handles all admin panel configuration through environment variables defined in your `.env` file. This approach provides consistent configuration across development, staging, and production environments.
```bash
# Basic admin panel configuration in .env
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="SecurePassword123!"
CRUD_ADMIN_MOUNT_PATH="/admin"
```
The configuration system automatically:
- Validates all environment variables at startup
- Provides sensible defaults for optional settings
- Adapts security settings based on your environment (local/staging/production)
- Integrates with your application's existing security and database systems
## Core Configuration Settings
### Enable/Disable Admin Panel
Control whether the admin panel is available:
```bash
# Enable admin panel (default: true)
CRUD_ADMIN_ENABLED=true
# Disable admin panel completely
CRUD_ADMIN_ENABLED=false
```
When disabled, the admin interface is not mounted and consumes no resources.
### Admin Access Credentials
Configure the initial admin user that's created automatically:
```bash
# Required: Admin user credentials
ADMIN_USERNAME="your-admin-username" # Admin login username
ADMIN_PASSWORD="YourSecurePassword123!" # Admin login password
# Optional: Additional admin user details (uses existing settings)
ADMIN_NAME="Administrator" # Display name (from FirstUserSettings)
ADMIN_EMAIL="admin@yourcompany.com" # Admin email (from FirstUserSettings)
```
**How this works:**
- The admin user is created automatically when the application starts
- Only created if no admin users exist (safe for restarts)
- Uses your application's existing password hashing system
- Credentials are validated according to CRUDAdmin requirements
### Interface Configuration
Customize where and how the admin panel appears:
```bash
# Admin panel URL path (default: "/admin")
CRUD_ADMIN_MOUNT_PATH="/admin" # Access at http://localhost:8000/admin
CRUD_ADMIN_MOUNT_PATH="/management" # Access at http://localhost:8000/management
CRUD_ADMIN_MOUNT_PATH="/internal" # Access at http://localhost:8000/internal
```
The admin panel is mounted as a sub-application at your specified path.
## Session Management Configuration
Control how admin users stay logged in and how sessions are managed.
### Basic Session Settings
```bash
# Session limits and timeouts
CRUD_ADMIN_MAX_SESSIONS=10 # Max concurrent sessions per user
CRUD_ADMIN_SESSION_TIMEOUT=1440 # Session timeout in minutes (24 hours)
# Cookie security
SESSION_SECURE_COOKIES=true # Require HTTPS for cookies (production)
```
**Session behavior:**
- Each admin login creates a new session
- Sessions expire after the timeout period of inactivity
- When max sessions are exceeded, oldest sessions are removed
- Session cookies are HTTP-only and secure (when HTTPS is enabled)
### Memory Sessions (Development)
For local development, sessions are stored in memory by default:
```bash
# Development configuration
ENVIRONMENT="local" # Enables memory sessions
CRUD_ADMIN_REDIS_ENABLED=false # Explicitly disable Redis (default)
```
**Memory session characteristics:**
- Fast performance with no external dependencies
- Sessions lost when application restarts
- Suitable for single-developer environments
- Not suitable for load-balanced deployments
### Redis Sessions (Production)
For production deployments, enable Redis session storage:
```bash
# Enable Redis sessions
CRUD_ADMIN_REDIS_ENABLED=true
# Redis connection settings
CRUD_ADMIN_REDIS_HOST="localhost" # Redis server hostname
CRUD_ADMIN_REDIS_PORT=6379 # Redis server port
CRUD_ADMIN_REDIS_DB=0 # Redis database number
CRUD_ADMIN_REDIS_PASSWORD="secure-pass" # Redis authentication
CRUD_ADMIN_REDIS_SSL=false # Enable SSL/TLS connection
```
**Redis session benefits:**
- Sessions persist across application restarts
- Supports multiple application instances (load balancing)
- Configurable expiration and cleanup
- Production-ready scalability
**Redis URL construction:**
The boilerplate automatically constructs the Redis URL from your environment variables:
```python
# Automatic URL generation in src/app/admin/initialize.py
redis_url = f"redis{'s' if settings.CRUD_ADMIN_REDIS_SSL else ''}://"
if settings.CRUD_ADMIN_REDIS_PASSWORD:
redis_url += f":{settings.CRUD_ADMIN_REDIS_PASSWORD}@"
redis_url += f"{settings.CRUD_ADMIN_REDIS_HOST}:{settings.CRUD_ADMIN_REDIS_PORT}/{settings.CRUD_ADMIN_REDIS_DB}"
```
## Security Configuration
The admin panel automatically adapts its security settings based on your deployment environment.
### Environment-Based Security
```bash
# Environment setting affects security behavior
ENVIRONMENT="local" # Development mode
ENVIRONMENT="staging" # Staging mode
ENVIRONMENT="production" # Production mode with enhanced security
```
**Security changes by environment:**
| Setting | Local | Staging | Production |
|---------|-------|---------|------------|
| **HTTPS Enforcement** | Disabled | Optional | Enabled |
| **Secure Cookies** | Optional | Recommended | Required |
| **Session Tracking** | Optional | Recommended | Enabled |
| **Event Logging** | Optional | Recommended | Enabled |
### Audit and Tracking
Enable comprehensive logging for compliance and security monitoring:
```bash
# Event and session tracking
CRUD_ADMIN_TRACK_EVENTS=true # Log all admin actions
CRUD_ADMIN_TRACK_SESSIONS=true # Track session lifecycle
# Available in admin interface
# - View all admin actions with timestamps
# - Monitor active sessions
# - Track user activity patterns
```
### Access Restrictions
The boilerplate supports IP and network-based access restrictions (configured in code):
```python
# In src/app/admin/initialize.py - customize as needed
admin = CRUDAdmin(
# ... other settings ...
allowed_ips=settings.CRUD_ADMIN_ALLOWED_IPS_LIST, # Specific IP addresses
allowed_networks=settings.CRUD_ADMIN_ALLOWED_NETWORKS_LIST, # CIDR network ranges
)
```
To implement IP restrictions, extend the `CRUDAdminSettings` class in `src/app/core/config.py`.
## Integration with Application Settings
The admin panel leverages your existing application configuration for seamless integration.
### Shared Security Settings
```bash
# Uses your application's main secret key
SECRET_KEY="your-application-secret-key" # Shared with admin panel
# Inherits database settings
POSTGRES_USER="dbuser" # Admin uses same database
POSTGRES_PASSWORD="dbpass"
POSTGRES_SERVER="localhost"
POSTGRES_DB="yourapp"
```
### Automatic Configuration Loading
The admin panel automatically inherits settings from your application:
```python
# In src/app/admin/initialize.py
admin = CRUDAdmin(
session=async_get_db, # Your app's database session
SECRET_KEY=settings.SECRET_KEY.get_secret_value(), # Your app's secret key
enforce_https=settings.ENVIRONMENT == EnvironmentOption.PRODUCTION,
# ... other settings from your app configuration
)
```
## Deployment Examples
### Development Environment
Perfect for local development with minimal setup:
```bash
# .env.development
ENVIRONMENT="local"
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="dev-admin"
ADMIN_PASSWORD="dev123"
CRUD_ADMIN_MOUNT_PATH="/admin"
# Memory sessions - no external dependencies
CRUD_ADMIN_REDIS_ENABLED=false
# Optional tracking for testing
CRUD_ADMIN_TRACK_EVENTS=false
CRUD_ADMIN_TRACK_SESSIONS=false
```
### Staging Environment
Staging environment with Redis but relaxed security:
```bash
# .env.staging
ENVIRONMENT="staging"
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="staging-admin"
ADMIN_PASSWORD="StagingPassword123!"
# Redis sessions for testing production behavior
CRUD_ADMIN_REDIS_ENABLED=true
CRUD_ADMIN_REDIS_HOST="staging-redis.example.com"
CRUD_ADMIN_REDIS_PASSWORD="staging-redis-pass"
# Enable tracking for testing
CRUD_ADMIN_TRACK_EVENTS=true
CRUD_ADMIN_TRACK_SESSIONS=true
SESSION_SECURE_COOKIES=true
```
### Production Environment
Production-ready configuration with full security:
```bash
# .env.production
ENVIRONMENT="production"
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="prod-admin"
ADMIN_PASSWORD="VerySecureProductionPassword123!"
# Redis sessions for scalability
CRUD_ADMIN_REDIS_ENABLED=true
CRUD_ADMIN_REDIS_HOST="redis.internal.company.com"
CRUD_ADMIN_REDIS_PORT=6379
CRUD_ADMIN_REDIS_PASSWORD="ultra-secure-redis-password"
CRUD_ADMIN_REDIS_SSL=true
# Full security and tracking
SESSION_SECURE_COOKIES=true
CRUD_ADMIN_TRACK_EVENTS=true
CRUD_ADMIN_TRACK_SESSIONS=true
CRUD_ADMIN_MAX_SESSIONS=5
CRUD_ADMIN_SESSION_TIMEOUT=480 # 8 hours for security
```
### Docker Deployment
Configure for containerized deployments:
```yaml
# docker-compose.yml
services:
web:
build: .
environment:
- ENVIRONMENT=production
- ADMIN_USERNAME=${ADMIN_USERNAME}
- ADMIN_PASSWORD=${ADMIN_PASSWORD}
# Redis connection
- CRUD_ADMIN_REDIS_ENABLED=true
- CRUD_ADMIN_REDIS_HOST=redis
- CRUD_ADMIN_REDIS_PORT=6379
- CRUD_ADMIN_REDIS_PASSWORD=${REDIS_PASSWORD}
depends_on:
- redis
- postgres
redis:
image: redis:7-alpine
command: redis-server --requirepass ${REDIS_PASSWORD}
volumes:
- redis_data:/data
```
```bash
# .env file for Docker
ADMIN_USERNAME="docker-admin"
ADMIN_PASSWORD="DockerSecurePassword123!"
REDIS_PASSWORD="docker-redis-password"
```
## Configuration Validation
The boilerplate automatically validates your configuration at startup and provides helpful error messages.
### Common Configuration Issues
**Missing Required Variables:**
```bash
# Error: Admin credentials not provided
# Solution: Add to .env
ADMIN_USERNAME="your-admin"
ADMIN_PASSWORD="your-password"
```
**Invalid Redis Configuration:**
```bash
# Error: Redis connection failed
# Check Redis server and credentials
CRUD_ADMIN_REDIS_HOST="correct-redis-host"
CRUD_ADMIN_REDIS_PASSWORD="correct-password"
```
**Security Warnings:**
```bash
# Warning: Weak admin password
# Use stronger password with mixed case, numbers, symbols
ADMIN_PASSWORD="StrongerPassword123!"
```
## What's Next
With your admin panel configured, you're ready to:
1. **[Adding Models](adding-models.md)** - Register your application models with the admin interface
2. **[User Management](user-management.md)** - Manage admin users and implement security best practices
The configuration system provides flexibility for any deployment scenario while maintaining consistency across environments.
# Admin Panel
The FastAPI boilerplate comes with a pre-configured web-based admin interface powered by [CRUDAdmin](https://github.com/benavlabs/crudadmin) that provides instant database management capabilities. Learn how to access, configure, and customize the admin panel for your development and production needs.
> **Powered by CRUDAdmin**: This admin panel is built with [CRUDAdmin](https://github.com/benavlabs/crudadmin), a modern admin interface generator for FastAPI applications.
>
> - **📚 CRUDAdmin Documentation**: [benavlabs.github.io/crudadmin](https://benavlabs.github.io/crudadmin/)
> - **💻 CRUDAdmin GitHub**: [github.com/benavlabs/crudadmin](https://github.com/benavlabs/crudadmin)
## What You'll Learn
- **[Configuration](configuration.md)** - Environment variables and deployment settings
- **[Adding Models](adding-models.md)** - Register your new models with the admin interface
- **[User Management](user-management.md)** - Manage admin users and security
## Admin Panel Overview
Your FastAPI boilerplate includes a fully configured admin interface that's ready to use out of the box. The admin panel automatically provides web-based management for your database models without requiring any additional setup.
**What's Already Configured:**
- Complete admin interface mounted at `/admin`
- User, Tier, and Post models already registered
- Automatic form generation and validation
- Session management with configurable backends
- Security features and access controls
**Accessing the Admin Panel:**
1. Start your application: `uv run fastapi dev`
2. Navigate to: `http://localhost:8000/admin`
3. Login with default credentials (configured via environment variables)
## Pre-Registered Models
The boilerplate comes with three models already set up in the admin interface:
### User Management
```python
# Already registered in your admin
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
allowed_actions={"view", "create", "update"},
password_transformer=password_transformer, # Automatic password hashing
)
```
**Features:**
- Create and manage application users
- Automatic password hashing with bcrypt
- User profile management (name, username, email)
- Tier assignment for subscription management
### Tier Management
```python
# Subscription tiers for your application
admin.add_view(
model=Tier,
create_schema=TierCreate,
update_schema=TierUpdate,
allowed_actions={"view", "create", "update", "delete"}
)
```
**Features:**
- Manage subscription tiers and pricing
- Configure rate limits per tier
- Full CRUD operations available
### Content Management
```python
# Post/content management
admin.add_view(
model=Post,
create_schema=PostCreateAdmin, # Special admin schema
update_schema=PostUpdate,
allowed_actions={"view", "create", "update", "delete"}
)
```
**Features:**
- Manage user-generated content
- Handle media URLs and content validation
- Associate posts with users
## Quick Start
### 1. Set Up Admin Credentials
Configure your admin login in your `.env` file:
```bash
# Admin Panel Access
ADMIN_USERNAME="your-admin-username"
ADMIN_PASSWORD="YourSecurePassword123!"
# Basic Configuration
CRUD_ADMIN_ENABLED=true
CRUD_ADMIN_MOUNT_PATH="/admin"
```
### 2. Start the Application
```bash
# Development
uv run fastapi dev
# The admin panel will be available at:
# http://localhost:8000/admin
```
### 3. Login and Explore
1. **Access**: Navigate to `/admin` in your browser
2. **Login**: Use the credentials from your environment variables
3. **Explore**: Browse the pre-configured models (Users, Tiers, Posts)
## Environment Configuration
The admin panel is configured entirely through environment variables, making it easy to adapt for different deployment environments.
### Basic Settings
```bash
# Enable/disable admin panel
CRUD_ADMIN_ENABLED=true # Set to false to disable completely
# Admin interface path
CRUD_ADMIN_MOUNT_PATH="/admin" # Change the URL path
# Admin user credentials (created automatically)
ADMIN_USERNAME="admin" # Your admin username
ADMIN_PASSWORD="SecurePassword123!" # Your admin password
```
### Session Management
```bash
# Session configuration
CRUD_ADMIN_MAX_SESSIONS=10 # Max concurrent sessions per user
CRUD_ADMIN_SESSION_TIMEOUT=1440         # Session timeout in minutes (24 hours)
SESSION_SECURE_COOKIES=true # HTTPS-only cookies
```
### Production Security
```bash
# Security settings for production
ENVIRONMENT="production" # Enables HTTPS enforcement
CRUD_ADMIN_TRACK_EVENTS=true # Log admin actions
CRUD_ADMIN_TRACK_SESSIONS=true # Track session activity
```
### Redis Session Storage
For production deployments with multiple server instances:
```bash
# Enable Redis sessions
CRUD_ADMIN_REDIS_ENABLED=true
CRUD_ADMIN_REDIS_HOST="localhost"
CRUD_ADMIN_REDIS_PORT=6379
CRUD_ADMIN_REDIS_DB=0
CRUD_ADMIN_REDIS_PASSWORD="your-redis-password"
CRUD_ADMIN_REDIS_SSL=false
```
## How It Works
The admin panel integrates seamlessly with your FastAPI application through several key components:
### Automatic Initialization
```python
# In src/app/main.py - already configured
admin = create_admin_interface()
@asynccontextmanager
async def lifespan_with_admin(app: FastAPI):
async with default_lifespan(app):
if admin:
await admin.initialize() # Sets up admin database
yield
# Admin is mounted automatically at your configured path
if admin:
app.mount(settings.CRUD_ADMIN_MOUNT_PATH, admin.app)
```
### Configuration Integration
```python
# In src/app/admin/initialize.py - uses your existing settings
admin = CRUDAdmin(
session=async_get_db, # Your database session
    SECRET_KEY=settings.SECRET_KEY.get_secret_value(),  # Your app's secret key
mount_path=settings.CRUD_ADMIN_MOUNT_PATH, # Configurable path
secure_cookies=settings.SESSION_SECURE_COOKIES,
enforce_https=settings.ENVIRONMENT == EnvironmentOption.PRODUCTION,
# ... all configured via environment variables
)
```
### Model Registration
```python
# In src/app/admin/views.py - pre-configured models
def register_admin_views(admin: CRUDAdmin):
# Password handling for User model
password_transformer = PasswordTransformer(
password_field="password",
hashed_field="hashed_password",
hash_function=get_password_hash, # Uses your app's password hashing
)
# Register your models with appropriate schemas
admin.add_view(model=User, create_schema=UserCreate, ...)
admin.add_view(model=Tier, create_schema=TierCreate, ...)
admin.add_view(model=Post, create_schema=PostCreateAdmin, ...)
```
## Development vs Production
### Development Setup
For local development, minimal configuration is needed:
```bash
# .env for development
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="admin123"
ENVIRONMENT="local"
# Uses memory sessions (fast, no external dependencies)
CRUD_ADMIN_REDIS_ENABLED=false
```
### Production Setup
For production deployments, enable additional security features:
```bash
# .env for production
CRUD_ADMIN_ENABLED=true
ADMIN_USERNAME="production-admin"
ADMIN_PASSWORD="VerySecureProductionPassword123!"
ENVIRONMENT="production"
# Redis sessions for scalability
CRUD_ADMIN_REDIS_ENABLED=true
CRUD_ADMIN_REDIS_HOST="your-redis-host"
CRUD_ADMIN_REDIS_PASSWORD="secure-redis-password"
CRUD_ADMIN_REDIS_SSL=true
# Enhanced security
SESSION_SECURE_COOKIES=true
CRUD_ADMIN_TRACK_EVENTS=true
CRUD_ADMIN_TRACK_SESSIONS=true
```
## Getting Started Guide
### 1. **[Configuration](configuration.md)** - Environment Setup
Learn about all available environment variables and how to configure the admin panel for different deployment scenarios. Understand session backends and security settings.
Perfect for setting up development environments and preparing for production deployment.
### 2. **[Adding Models](adding-models.md)** - Extend the Admin Interface
Discover how to register your new models with the admin interface. Learn from the existing User, Tier, and Post implementations to add your own models.
Essential when you create new database models and want them managed through the admin interface.
### 3. **[User Management](user-management.md)** - Admin Security
Understand how admin authentication works, how to create additional admin users, and implement security best practices for production environments.
Critical for production deployments where multiple team members need admin access.
## What's Next
Ready to start using your admin panel? Follow this path:
1. **[Configuration](configuration.md)** - Set up your environment variables and understand deployment options
2. **[Adding Models](adding-models.md)** - Add your new models to the admin interface
3. **[User Management](user-management.md)** - Implement secure admin authentication
The admin panel is ready to use immediately with sensible defaults, and each guide shows you how to customize it for your specific needs.
# User Management
Learn how to manage admin users in your FastAPI boilerplate's admin panel. The boilerplate automatically creates admin users from environment variables and provides a separate authentication system (powered by [CRUDAdmin](https://github.com/benavlabs/crudadmin)) from your application users.
> **CRUDAdmin Authentication**: For advanced authentication features and session management, see the [CRUDAdmin documentation](https://benavlabs.github.io/crudadmin/).
## Initial Admin Setup
### Configure Admin Credentials
Set your admin credentials in your `.env` file:
```bash
# Required admin credentials
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="SecurePassword123!"
# Optional details
ADMIN_NAME="Administrator"
ADMIN_EMAIL="admin@yourcompany.com"
```
### Access the Admin Panel
Start your application and access the admin panel:
```bash
# Start application
uv run fastapi dev
# Visit: http://localhost:8000/admin
# Login with your ADMIN_USERNAME and ADMIN_PASSWORD
```
The boilerplate automatically creates the initial admin user from your environment variables when the application starts.
## Managing Admin Users
### Creating Additional Admin Users
Once logged in, you can create more admin users through the admin interface:
1. Navigate to the admin users section in the admin panel
2. Click "Create" or "Add New"
3. Fill in the required fields:
- Username (must be unique)
- Password (will be hashed automatically)
- Email (optional)
### Admin User Requirements
- **Username**: 3-50 characters, letters/numbers/underscores/hyphens
- **Password**: Minimum 8 characters with mixed case, numbers, and symbols
- **Email**: Valid email format (optional)
### Updating and Removing Users
- **Update**: Find the user in the admin panel and click "Edit"
- **Remove**: Click "Delete" (ensure you have alternative admin access first)
## Security Configuration
### Environment-Specific Settings
Configure different security levels for each environment:
```bash
# Development
ADMIN_USERNAME="dev-admin"
ADMIN_PASSWORD="DevPass123!"
ENVIRONMENT="local"
# Production
ADMIN_USERNAME="prod-admin"
ADMIN_PASSWORD="VerySecurePassword123!"
ENVIRONMENT="production"
CRUD_ADMIN_TRACK_EVENTS=true
CRUD_ADMIN_TRACK_SESSIONS=true
SESSION_SECURE_COOKIES=true
```
### Session Management
Control admin sessions with these settings:
```bash
# Session limits and timeouts
CRUD_ADMIN_MAX_SESSIONS=10 # Max concurrent sessions per user
CRUD_ADMIN_SESSION_TIMEOUT=1440 # Timeout in minutes (24 hours)
SESSION_SECURE_COOKIES=true # HTTPS-only cookies
```
### Enable Tracking
Monitor admin activity by enabling event tracking:
```bash
# Track admin actions and sessions
CRUD_ADMIN_TRACK_EVENTS=true # Log all admin actions
CRUD_ADMIN_TRACK_SESSIONS=true # Track session lifecycle
```
## Production Deployment
### Secure Credential Management
For production, use Docker secrets or Kubernetes secrets instead of plain text:
```yaml
# docker-compose.yml
services:
web:
secrets:
- admin_username
- admin_password
environment:
- ADMIN_USERNAME_FILE=/run/secrets/admin_username
- ADMIN_PASSWORD_FILE=/run/secrets/admin_password
secrets:
admin_username:
file: ./secrets/admin_username.txt
admin_password:
file: ./secrets/admin_password.txt
```
### Production Security Settings
```bash
# Production .env
ENVIRONMENT="production"
ADMIN_USERNAME="prod-admin"
ADMIN_PASSWORD="UltraSecurePassword123!"
# Enhanced security
CRUD_ADMIN_REDIS_ENABLED=true
CRUD_ADMIN_REDIS_HOST="redis.internal.company.com"
CRUD_ADMIN_REDIS_PASSWORD="secure-redis-password"
CRUD_ADMIN_REDIS_SSL=true
# Monitoring
CRUD_ADMIN_TRACK_EVENTS=true
CRUD_ADMIN_TRACK_SESSIONS=true
SESSION_SECURE_COOKIES=true
CRUD_ADMIN_MAX_SESSIONS=5
CRUD_ADMIN_SESSION_TIMEOUT=480 # 8 hours
```
## Application User Management
### Admin vs Application Users
Your boilerplate maintains two separate user systems:
- **Admin Users**: Access the admin panel (stored by CRUDAdmin)
- **Application Users**: Use your application (stored in your User model)
### Managing Application Users
Through the admin panel, you can manage your application's users:
1. Navigate to "Users" section (your application users)
2. View, create, update user profiles
3. Manage user tiers and subscriptions
4. View user-generated content (posts)
The User model is already registered with password hashing and proper permissions.
## Emergency Recovery
### Lost Admin Password
If you lose admin access, update your environment variables:
```bash
# Update .env file
ADMIN_USERNAME="emergency-admin"
ADMIN_PASSWORD="EmergencyPassword123!"
# Restart application
uv run fastapi dev
```
### Database Recovery (Advanced)
For direct database password reset:
```python
# Generate bcrypt hash
import bcrypt
password = "NewPassword123!"
hashed = bcrypt.hashpw(password.encode('utf-8'), bcrypt.gensalt())
print(hashed.decode('utf-8'))
```
```sql
-- Update in database
UPDATE admin_users
SET password_hash = '<bcrypt-hash>'
WHERE username = 'admin';
```
## What's Next
Your admin user management is now configured with:
- Automatic admin user creation from environment variables
- Secure authentication separate from application users
- Environment-specific security settings
- Production-ready credential management
- Emergency recovery procedures
You can now securely manage both admin users and your application users through the admin panel.
# API Endpoints
This guide shows you how to create API endpoints using the boilerplate's established patterns. You'll learn the common patterns you need for building CRUD APIs.
## Quick Start
Here's how to create a typical endpoint using the boilerplate's patterns:
```python
from typing import Annotated

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.db.database import async_get_db
from app.crud.crud_users import crud_users
from app.schemas.user import UserRead, UserCreate
from app.api.dependencies import get_current_user
router = APIRouter(prefix="/users", tags=["users"])
@router.get("/{user_id}", response_model=UserRead)
async def get_user(
user_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
"""Get a user by ID."""
user = await crud_users.get(db=db, id=user_id, schema_to_select=UserRead)
if not user:
raise HTTPException(status_code=404, detail="User not found")
return user
```
That's it! The boilerplate handles the rest.
## Common Endpoint Patterns
### 1. Get Single Item
```python
@router.get("/{user_id}", response_model=UserRead)
async def get_user(
user_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
user = await crud_users.get(db=db, id=user_id, schema_to_select=UserRead)
if not user:
raise HTTPException(status_code=404, detail="User not found")
return user
```
### 2. Get Multiple Items (with Pagination)
```python
from fastcrud.paginated import PaginatedListResponse, paginated_response
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10,
):
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True
)
return paginated_response(
crud_data=users,
page=page,
items_per_page=items_per_page
)
```
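The `(page - 1) * items_per_page` arithmetic is a classic off-by-one trap; a tiny helper (names here are illustrative) makes the 1-based page convention explicit:

```python
def page_offset(page: int, items_per_page: int) -> int:
    """Convert a 1-based page number to a 0-based row offset."""
    if page < 1 or items_per_page < 1:
        raise ValueError("page and items_per_page must be >= 1")
    return (page - 1) * items_per_page

# page 1 starts at row 0; page 3 with 10 per page starts at row 20
```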
### 3. Create Item
```python
@router.post("/", response_model=UserRead, status_code=201)
async def create_user(
user_data: UserCreate,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
# Check if user already exists
if await crud_users.exists(db=db, email=user_data.email):
raise HTTPException(status_code=409, detail="Email already exists")
# Create user
new_user = await crud_users.create(db=db, object=user_data)
return new_user
```
### 4. Update Item
```python
@router.patch("/{user_id}", response_model=UserRead)
async def update_user(
user_id: int,
user_data: UserUpdate,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
# Check if user exists
if not await crud_users.exists(db=db, id=user_id):
raise HTTPException(status_code=404, detail="User not found")
# Update user
updated_user = await crud_users.update(db=db, object=user_data, id=user_id)
return updated_user
```
### 5. Delete Item (Soft Delete)
```python
@router.delete("/{user_id}")
async def delete_user(
user_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
if not await crud_users.exists(db=db, id=user_id):
raise HTTPException(status_code=404, detail="User not found")
await crud_users.delete(db=db, id=user_id)
return {"message": "User deleted"}
```
## Adding Authentication
To require login, add the `get_current_user` dependency:
```python
@router.get("/me", response_model=UserRead)
async def get_my_profile(
current_user: Annotated[dict, Depends(get_current_user)]
):
"""Get current user's profile."""
return current_user
@router.post("/", response_model=UserRead)
async def create_user(
user_data: UserCreate,
current_user: Annotated[dict, Depends(get_current_user)], # Requires login
db: Annotated[AsyncSession, Depends(async_get_db)]
):
# Only logged-in users can create users
new_user = await crud_users.create(db=db, object=user_data)
return new_user
```
## Adding Admin-Only Endpoints
For admin-only endpoints, use `get_current_superuser`:
```python
from app.api.dependencies import get_current_superuser
@router.delete("/{user_id}/permanent", dependencies=[Depends(get_current_superuser)])
async def permanently_delete_user(
user_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
"""Admin-only: Permanently delete user from database."""
await crud_users.db_delete(db=db, id=user_id)
return {"message": "User permanently deleted"}
```
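The two delete endpoints above differ in what happens to the row: `delete` flags it, `db_delete` removes it. A minimal in-memory sketch of that distinction (a hypothetical `FakeStore`, not the boilerplate's actual CRUD code):

```python
# Hypothetical in-memory sketch of soft vs. hard delete -- illustrates the
# idea behind crud_users.delete() vs. crud_users.db_delete(), nothing more.
class FakeStore:
    def __init__(self):
        self.rows = {1: {"name": "alice", "is_deleted": False}}

    def delete(self, id: int) -> None:
        # Soft delete: flag the row, keep the data for audits/recovery.
        self.rows[id]["is_deleted"] = True

    def db_delete(self, id: int) -> None:
        # Hard delete: the row is gone for good.
        del self.rows[id]


store = FakeStore()
store.delete(1)
print(store.rows[1]["is_deleted"])  # True -- row still exists, just flagged

store.db_delete(1)
print(1 in store.rows)  # False -- row removed entirely
```

Soft-deleted rows stay queryable (and should usually be filtered out with `is_deleted=False`), which is why the admin endpoint above exists for permanent removal.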
## Query Parameters
### Simple Parameters
```python
@router.get("/search")
async def search_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    name: str | None = None,   # Optional string
    age: int | None = None,    # Optional integer
    is_active: bool = True,    # Boolean with default
):
filters = {"is_active": is_active}
if name:
filters["name"] = name
if age:
filters["age"] = age
users = await crud_users.get_multi(db=db, **filters)
return users["data"]
```
### Parameters with Validation
```python
from fastapi import Query
@router.get("/")
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: Annotated[int, Query(ge=1)] = 1,  # Must be >= 1
    limit: Annotated[int, Query(ge=1, le=100)] = 10,  # Between 1-100
    search: Annotated[str | None, Query(max_length=50)] = None,  # Max 50 chars
):
# Use the validated parameters
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * limit,
limit=limit
)
return users["data"]
```
## Error Handling
The boilerplate includes custom exceptions you can use:
```python
from app.core.exceptions.http_exceptions import (
NotFoundException,
DuplicateValueException,
ForbiddenException
)
@router.get("/{user_id}")
async def get_user(user_id: int, db: AsyncSession):
user = await crud_users.get(db=db, id=user_id)
if not user:
raise NotFoundException("User not found") # Returns 404
return user
@router.post("/")
async def create_user(user_data: UserCreate, db: AsyncSession):
if await crud_users.exists(db=db, email=user_data.email):
raise DuplicateValueException("Email already exists") # Returns 409
return await crud_users.create(db=db, object=user_data)
```
## File Uploads
```python
from fastapi import UploadFile, File
@router.post("/{user_id}/avatar")
async def upload_avatar(
user_id: int,
    current_user: Annotated[dict, Depends(get_current_user)],
    db: Annotated[AsyncSession, Depends(async_get_db)],
    file: UploadFile = File(...)
):
# Check file type
if not file.content_type.startswith('image/'):
raise HTTPException(status_code=400, detail="File must be an image")
# Save file and update user
# ... file handling logic ...
return {"message": "Avatar uploaded successfully"}
```
## Creating New Endpoints
### Step 1: Create the Router File
Create `src/app/api/v1/posts.py`:
```python
from fastapi import APIRouter, Depends, HTTPException
from typing import Annotated
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.db.database import async_get_db
from app.crud.crud_posts import crud_posts # You'll create this
from app.schemas.post import PostRead, PostCreate, PostUpdate # You'll create these
from app.api.dependencies import get_current_user
router = APIRouter(prefix="/posts", tags=["posts"])
@router.get("/", response_model=list[PostRead])
async def get_posts(db: Annotated[AsyncSession, Depends(async_get_db)]):
posts = await crud_posts.get_multi(db=db, schema_to_select=PostRead)
return posts["data"]
@router.post("/", response_model=PostRead, status_code=201)
async def create_post(
post_data: PostCreate,
current_user: Annotated[dict, Depends(get_current_user)],
db: Annotated[AsyncSession, Depends(async_get_db)]
):
# Add current user as post author
post_dict = post_data.model_dump()
post_dict["author_id"] = current_user["id"]
new_post = await crud_posts.create(db=db, object=post_dict)
return new_post
```
### Step 2: Register the Router
In `src/app/api/v1/__init__.py`, add:
```python
from .posts import router as posts_router
api_router.include_router(posts_router)
```
### Step 3: Test Your Endpoints
Your new endpoints will be available at:
- `GET /api/v1/posts/` - Get all posts
- `POST /api/v1/posts/` - Create new post (requires login)
## Best Practices
1. **Always use the database dependency**: `Depends(async_get_db)`
2. **Use existing CRUD methods**: `crud_users.get()`, `crud_users.create()`, etc.
3. **Check if items exist before operations**: Use `crud_users.exists()`
4. **Use proper HTTP status codes**: `status_code=201` for creation
5. **Add authentication when needed**: `Depends(get_current_user)`
6. **Use response models**: `response_model=UserRead`
7. **Handle errors with custom exceptions**: `NotFoundException`, `DuplicateValueException`
## What's Next
Now that you understand basic endpoints:
- **[Pagination](pagination.md)** - Add pagination to your endpoints<br>
- **[Exceptions](exceptions.md)** - Custom error handling and HTTP exceptions<br>
- **[CRUD Operations](../database/crud.md)** - Understand the CRUD layer<br>
The boilerplate provides everything you need - just follow these patterns!

# API Exception Handling
Learn how to handle errors properly in your API endpoints using the boilerplate's built-in exceptions and patterns.
## Quick Start
The boilerplate provides ready-to-use exceptions that return proper HTTP status codes:
```python
from app.core.exceptions.http_exceptions import NotFoundException
@router.get("/{user_id}")
async def get_user(user_id: int, db: AsyncSession):
user = await crud_users.get(db=db, id=user_id)
if not user:
raise NotFoundException("User not found") # Returns 404
return user
```
That's it! The exception automatically becomes a proper JSON error response.
## Built-in Exceptions
The boilerplate includes common HTTP exceptions you'll need:
### NotFoundException (404)
```python
from app.core.exceptions.http_exceptions import NotFoundException
@router.get("/{user_id}")
async def get_user(user_id: int):
user = await crud_users.get(db=db, id=user_id)
if not user:
raise NotFoundException("User not found")
return user
# Returns:
# Status: 404
# {"detail": "User not found"}
```
### DuplicateValueException (409)
```python
from app.core.exceptions.http_exceptions import DuplicateValueException
@router.post("/")
async def create_user(user_data: UserCreate):
if await crud_users.exists(db=db, email=user_data.email):
raise DuplicateValueException("Email already exists")
return await crud_users.create(db=db, object=user_data)
# Returns:
# Status: 409
# {"detail": "Email already exists"}
```
### ForbiddenException (403)
```python
from app.core.exceptions.http_exceptions import ForbiddenException
@router.delete("/{user_id}")
async def delete_user(
user_id: int,
current_user: Annotated[dict, Depends(get_current_user)]
):
if current_user["id"] != user_id and not current_user["is_superuser"]:
raise ForbiddenException("You can only delete your own account")
await crud_users.delete(db=db, id=user_id)
return {"message": "User deleted"}
# Returns:
# Status: 403
# {"detail": "You can only delete your own account"}
```
### UnauthorizedException (401)
```python
from app.core.exceptions.http_exceptions import UnauthorizedException
# This is typically used in the auth system, but you can use it too:
@router.get("/admin-only")
async def admin_endpoint():
# Some validation logic
if not user_is_admin:
raise UnauthorizedException("Admin access required")
return {"data": "secret admin data"}
# Returns:
# Status: 401
# {"detail": "Admin access required"}
```
## Common Patterns
### Check Before Create
```python
@router.post("/", response_model=UserRead)
async def create_user(user_data: UserCreate, db: AsyncSession):
# Check email
if await crud_users.exists(db=db, email=user_data.email):
raise DuplicateValueException("Email already exists")
# Check username
if await crud_users.exists(db=db, username=user_data.username):
raise DuplicateValueException("Username already taken")
# Create user
return await crud_users.create(db=db, object=user_data)
# For public registration endpoints, consider rate limiting
# to prevent email enumeration attacks
```
### Check Before Update
```python
@router.patch("/{user_id}", response_model=UserRead)
async def update_user(
user_id: int,
user_data: UserUpdate,
db: AsyncSession
):
# Check if user exists
if not await crud_users.exists(db=db, id=user_id):
raise NotFoundException("User not found")
# Check for email conflicts (if email is being updated)
if user_data.email:
existing = await crud_users.get(db=db, email=user_data.email)
if existing and existing.id != user_id:
raise DuplicateValueException("Email already taken")
# Update user
return await crud_users.update(db=db, object=user_data, id=user_id)
```
### Check Ownership
```python
@router.get("/{post_id}")
async def get_post(
post_id: int,
current_user: Annotated[dict, Depends(get_current_user)],
db: AsyncSession
):
post = await crud_posts.get(db=db, id=post_id)
if not post:
raise NotFoundException("Post not found")
# Check if user owns the post or is admin
if post.author_id != current_user["id"] and not current_user["is_superuser"]:
raise ForbiddenException("You can only view your own posts")
return post
```
## Validation Errors
FastAPI automatically handles Pydantic validation errors, but you can catch and customize them:
```python
from fastapi import HTTPException
from pydantic import ValidationError
@router.post("/")
async def create_user(user_data: UserCreate):
try:
# If user_data fails validation, Pydantic raises ValidationError
# FastAPI automatically converts this to a 422 response
return await crud_users.create(db=db, object=user_data)
except ValidationError as e:
# You can catch and customize if needed
raise HTTPException(
status_code=400,
detail=f"Invalid data: {e.errors()}"
)
```
## Standard HTTP Exceptions
For other status codes, use FastAPI's HTTPException:
```python
from fastapi import HTTPException
# Bad Request (400)
@router.post("/")
async def create_something(data: dict):
if not data.get("required_field"):
raise HTTPException(
status_code=400,
detail="required_field is missing"
)
# Too Many Requests (429)
@router.post("/")
async def rate_limited_endpoint():
if rate_limit_exceeded():
raise HTTPException(
status_code=429,
detail="Rate limit exceeded. Try again later."
)
# Internal Server Error (500)
@router.get("/")
async def risky_endpoint():
try:
# Some operation that might fail
result = risky_operation()
return result
except Exception as e:
# Log the error
logger.error(f"Unexpected error: {e}")
raise HTTPException(
status_code=500,
detail="An unexpected error occurred"
)
```
## Creating Custom Exceptions
If you need custom exceptions, follow the boilerplate's pattern:
```python
# In app/core/exceptions/http_exceptions.py (add to existing file)
from fastapi import HTTPException
class PaymentRequiredException(HTTPException):
"""402 Payment Required"""
def __init__(self, detail: str = "Payment required"):
super().__init__(status_code=402, detail=detail)
class TooManyRequestsException(HTTPException):
"""429 Too Many Requests"""
def __init__(self, detail: str = "Too many requests"):
super().__init__(status_code=429, detail=detail)
# Use them in your endpoints
from app.core.exceptions.http_exceptions import PaymentRequiredException
@router.get("/premium-feature")
async def premium_feature(current_user: dict):
if current_user["tier"] == "free":
raise PaymentRequiredException("Upgrade to access this feature")
return {"data": "premium content"}
```
## Error Response Format
All exceptions return consistent JSON responses:
```json
{
"detail": "Error message here"
}
```
For validation errors (422), you get more detail:
```json
{
"detail": [
{
"type": "missing",
"loc": ["body", "email"],
"msg": "Field required",
"input": null
}
]
}
```
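The nested `detail` list in a 422 response can be flattened into readable `location: message` strings on the client side; a small sketch (the helper name is ours, not part of the boilerplate):

```python
def summarize_validation_errors(detail: list[dict]) -> list[str]:
    """Flatten FastAPI's 422 'detail' entries into 'location: message' strings."""
    return [
        ".".join(str(part) for part in err["loc"]) + ": " + err["msg"]
        for err in detail
    ]


# The example error shown above:
detail = [{"type": "missing", "loc": ["body", "email"], "msg": "Field required", "input": None}]
print(summarize_validation_errors(detail))  # ['body.email: Field required']
```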
## Global Exception Handling
The boilerplate includes global exception handlers. You can add your own in `main.py`:
```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
app = FastAPI()
@app.exception_handler(ValueError)
async def value_error_handler(request: Request, exc: ValueError):
"""Handle ValueError exceptions globally"""
return JSONResponse(
status_code=400,
content={"detail": f"Invalid value: {str(exc)}"}
)
@app.exception_handler(Exception)
async def general_exception_handler(request: Request, exc: Exception):
"""Catch-all exception handler"""
# Log the error
logger.error(f"Unhandled exception: {exc}")
return JSONResponse(
status_code=500,
content={"detail": "An unexpected error occurred"}
)
```
## Security Considerations
### Authentication Endpoints - Use Generic Messages
For security, authentication endpoints should use generic error messages to prevent information disclosure:
```python
# SECURITY: Don't reveal if username exists
@router.post("/login")
async def login(credentials: LoginCredentials):
user = await crud_users.get(db=db, username=credentials.username)
# Don't do this - reveals if username exists
# if not user:
# raise NotFoundException("User not found")
# if not verify_password(credentials.password, user.hashed_password):
# raise UnauthorizedException("Invalid password")
# Do this - generic message for all auth failures
if not user or not verify_password(credentials.password, user.hashed_password):
raise UnauthorizedException("Invalid username or password")
return create_access_token(user.id)
# SECURITY: Don't reveal if email is registered during password reset
@router.post("/forgot-password")
async def forgot_password(email: str):
user = await crud_users.get(db=db, email=email)
# Don't do this - reveals if email exists
# if not user:
# raise NotFoundException("Email not found")
# Do this - always return success message
if user:
await send_password_reset_email(user.email)
# Always return the same message
return {"message": "If the email exists, a reset link has been sent"}
```
### Resource Access - Be Specific When Safe
For non-auth operations, specific messages help developers:
```python
# Safe to be specific for resource operations
@router.get("/{post_id}")
async def get_post(
post_id: int,
current_user: Annotated[dict, Depends(get_current_user)]
):
post = await crud_posts.get(db=db, id=post_id)
if not post:
raise NotFoundException("Post not found") # Safe to be specific
if post.author_id != current_user["id"]:
# Don't reveal post exists if user can't access it
raise NotFoundException("Post not found") # Generic, not "Access denied"
return post
```
## Best Practices
### 1. Use Specific Exceptions (When Safe)
```python
# Good for non-sensitive operations
if not user:
raise NotFoundException("User not found")
# Good for validation errors
raise DuplicateValueException("Username already taken")
```
### 2. Use Generic Messages for Security
```python
# Good for authentication
raise UnauthorizedException("Invalid username or password")
# Good for authorization (don't reveal resource exists)
raise NotFoundException("Resource not found") # Instead of "Access denied"
```
### 3. Check Permissions Early
```python
@router.delete("/{user_id}")
async def delete_user(
user_id: int,
current_user: Annotated[dict, Depends(get_current_user)]
):
# Check permission first
if current_user["id"] != user_id:
raise ForbiddenException("Cannot delete other users")
# Then check if user exists
if not await crud_users.exists(db=db, id=user_id):
raise NotFoundException("User not found")
await crud_users.delete(db=db, id=user_id)
```
### 4. Log Important Errors
```python
import logging
logger = logging.getLogger(__name__)
@router.post("/")
async def create_user(user_data: UserCreate):
try:
return await crud_users.create(db=db, object=user_data)
except Exception as e:
logger.error(f"Failed to create user: {e}")
raise HTTPException(status_code=500, detail="User creation failed")
```
## Testing Exceptions
Test that your endpoints raise the right exceptions:
```python
import pytest
from httpx import AsyncClient
@pytest.mark.asyncio
async def test_user_not_found(client: AsyncClient):
response = await client.get("/api/v1/users/99999")
assert response.status_code == 404
assert "User not found" in response.json()["detail"]
@pytest.mark.asyncio
async def test_duplicate_email(client: AsyncClient):
# Create a user
await client.post("/api/v1/users/", json={
"name": "Test User",
"username": "test1",
"email": "test@example.com",
"password": "Password123!"
})
# Try to create another with same email
response = await client.post("/api/v1/users/", json={
"name": "Test User 2",
"username": "test2",
"email": "test@example.com", # Same email
"password": "Password123!"
})
assert response.status_code == 409
assert "Email already exists" in response.json()["detail"]
```
## What's Next
Now that you understand error handling:
- **[Versioning](versioning.md)** - Learn how to version your APIs<br>
- **[Database CRUD](../database/crud.md)** - Understand the database operations<br>
- **[Authentication](../authentication/index.md)** - Add user authentication to your APIs
Proper error handling makes your API much more user-friendly and easier to debug!

# API Development
Learn how to build REST APIs with the FastAPI Boilerplate. This section covers everything you need to create robust, production-ready APIs.
## What You'll Learn
- **[Endpoints](endpoints.md)** - Create CRUD endpoints with authentication and validation
- **[Pagination](pagination.md)** - Add pagination to handle large datasets
- **[Exception Handling](exceptions.md)** - Handle errors properly with built-in exceptions
- **[API Versioning](versioning.md)** - Version your APIs and maintain backward compatibility
- **Database Integration** - Use the boilerplate's CRUD layer and schemas
## Quick Overview
The boilerplate provides everything you need for API development:
```python
from typing import Annotated
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
from app.crud.crud_users import crud_users
from app.schemas.user import UserRead, UserCreate
from app.core.db.database import async_get_db
router = APIRouter(prefix="/users", tags=["users"])
@router.get("/", response_model=list[UserRead])
async def get_users(db: Annotated[AsyncSession, Depends(async_get_db)]):
users = await crud_users.get_multi(db=db, schema_to_select=UserRead)
return users["data"]
@router.post("/", response_model=UserRead, status_code=201)
async def create_user(
user_data: UserCreate,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
return await crud_users.create(db=db, object=user_data)
```
## Key Features
### 🔐 **Built-in Authentication**
Add authentication to any endpoint:
```python
from app.api.dependencies import get_current_user
@router.get("/me", response_model=UserRead)
async def get_profile(current_user: Annotated[dict, Depends(get_current_user)]):
return current_user
```
### 📊 **Easy Pagination**
Paginate any endpoint with one line:
```python
from fastcrud.paginated import PaginatedListResponse
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(page: int = 1, items_per_page: int = 10):
    ...  # Add pagination to any endpoint
```
### ✅ **Automatic Validation**
Request and response validation is handled automatically:
```python
@router.post("/", response_model=UserRead)
async def create_user(user_data: UserCreate): # ← Validates input
return await crud_users.create(object=user_data) # ← Validates output
```
### 🛡️ **Error Handling**
Use built-in exceptions for consistent error responses:
```python
from app.core.exceptions.http_exceptions import NotFoundException
@router.get("/{user_id}")
async def get_user(user_id: int):
user = await crud_users.get(id=user_id)
if not user:
raise NotFoundException("User not found") # Returns proper 404
return user
```
## Architecture
The boilerplate follows a layered architecture:
```
API Endpoint
    ↓
Pydantic Schema (validation)
    ↓
CRUD Layer (database operations)
    ↓
SQLAlchemy Model (database)
```
This separation makes your code:
- **Testable** - Mock any layer easily
- **Maintainable** - Clear separation of concerns
- **Scalable** - Add features without breaking existing code
## Directory Structure
```text
src/app/api/
├── dependencies.py # Shared dependencies (auth, rate limiting)
└── v1/ # API version 1
├── users.py # User endpoints
├── posts.py # Post endpoints
├── login.py # Authentication
└── ... # Other endpoints
```
## What's Next
Start with the basics:
1. **[Endpoints](endpoints.md)** - Learn the common patterns for creating API endpoints
2. **[Pagination](pagination.md)** - Add pagination to handle large datasets
3. **[Exception Handling](exceptions.md)** - Handle errors properly with built-in exceptions
4. **[API Versioning](versioning.md)** - Version your APIs and maintain backward compatibility
Then dive deeper into the foundation:
5. **[Database Schemas](../database/schemas.md)** - Create schemas for your data
6. **[CRUD Operations](../database/crud.md)** - Understand the database layer
Each guide builds on the previous one with practical examples you can use immediately.

# API Pagination
This guide shows you how to add pagination to your API endpoints using the boilerplate's built-in utilities. Pagination helps you handle large datasets efficiently.
## Quick Start
Here's how to add basic pagination to any endpoint:
```python
from fastcrud.paginated import PaginatedListResponse, paginated_response
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10
):
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True
)
return paginated_response(
crud_data=users,
page=page,
items_per_page=items_per_page
)
```
That's it! Your endpoint now returns paginated results with metadata.
## What You Get
The response includes everything frontends need:
```json
{
"data": [
{
"id": 1,
"name": "John Doe",
"username": "johndoe",
"email": "john@example.com"
}
// ... more users
],
"total_count": 150,
"has_more": true,
"page": 1,
"items_per_page": 10,
"total_pages": 15
}
```
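The `offset` passed to `get_multi` and the `has_more`/`total_pages` metadata follow directly from `page`, `items_per_page`, and `total_count`. A sketch of the arithmetic (our illustration of what `paginated_response` computes, not its actual source):

```python
import math

def pagination_meta(page: int, items_per_page: int, total_count: int) -> dict:
    """Offset for the query, plus the derived metadata fields in the response."""
    return {
        "offset": (page - 1) * items_per_page,            # rows to skip
        "total_pages": math.ceil(total_count / items_per_page),
        "has_more": page * items_per_page < total_count,  # rows left after this page
    }


print(pagination_meta(page=1, items_per_page=10, total_count=150))
# {'offset': 0, 'total_pages': 15, 'has_more': True}
```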
## Adding Filters
You can easily add filtering to paginated endpoints:
```python
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10,
    # Add filter parameters
    search: str | None = None,
    is_active: bool | None = None,
    tier_id: int | None = None
):
# Build filters
filters = {}
if search:
filters["name__icontains"] = search # Search by name
if is_active is not None:
filters["is_active"] = is_active
if tier_id:
filters["tier_id"] = tier_id
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True,
**filters
)
return paginated_response(
crud_data=users,
page=page,
items_per_page=items_per_page
)
```
Now you can call:
- `/users/?search=john` - Find users with "john" in their name
- `/users/?is_active=true` - Only active users
- `/users/?tier_id=1&page=2` - Users in tier 1, page 2
## Adding Sorting
Add sorting options to your paginated endpoints:
```python
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10,
    # Add sorting parameters
    sort_by: str = "created_at",
    sort_order: str = "desc"
):
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True,
sort_columns=sort_by,
sort_orders=sort_order
)
return paginated_response(
crud_data=users,
page=page,
items_per_page=items_per_page
)
```
Usage:
- `/users/?sort_by=name&sort_order=asc` - Sort by name A-Z
- `/users/?sort_by=created_at&sort_order=desc` - Newest first
## Validation
Add validation to prevent issues:
```python
from fastapi import Query
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: Annotated[int, Query(ge=1)] = 1,  # Must be >= 1
    items_per_page: Annotated[int, Query(ge=1, le=100)] = 10  # Between 1-100
):
    ...  # Your pagination logic here
```
## Complete Example
Here's a full-featured paginated endpoint:
```python
@router.get("/", response_model=PaginatedListResponse[UserRead])
async def get_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    # Pagination
    page: Annotated[int, Query(ge=1)] = 1,
    items_per_page: Annotated[int, Query(ge=1, le=100)] = 10,
    # Filtering
    search: Annotated[str | None, Query(max_length=100)] = None,
    is_active: bool | None = None,
    tier_id: int | None = None,
    # Sorting
    sort_by: str = "created_at",
    sort_order: str = "desc"
):
"""Get paginated users with filtering and sorting."""
# Build filters
filters = {"is_deleted": False} # Always exclude deleted users
if is_active is not None:
filters["is_active"] = is_active
if tier_id:
filters["tier_id"] = tier_id
# Handle search
search_criteria = []
if search:
from sqlalchemy import or_, func
search_criteria = [
or_(
func.lower(User.name).contains(search.lower()),
func.lower(User.username).contains(search.lower()),
func.lower(User.email).contains(search.lower())
)
]
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True,
sort_columns=sort_by,
sort_orders=sort_order,
**filters,
**{"filter_criteria": search_criteria} if search_criteria else {}
)
return paginated_response(
crud_data=users,
page=page,
items_per_page=items_per_page
)
```
This endpoint supports:
- `/users/` - First 10 users
- `/users/?page=2&items_per_page=20` - Page 2, 20 items
- `/users/?search=john&is_active=true` - Active users named john
- `/users/?sort_by=name&sort_order=asc` - Sorted by name
## Simple List (No Pagination)
Sometimes you just want a simple list without pagination:
```python
@router.get("/all", response_model=list[UserRead])
async def get_all_users(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    limit: int = 100  # Prevent too many results
):
users = await crud_users.get_multi(
db=db,
limit=limit,
schema_to_select=UserRead,
return_as_model=True
)
return users["data"]
```
## Performance Tips
1. **Always set a maximum page size**:
```python
items_per_page: Annotated[int, Query(ge=1, le=100)] = 10 # Max 100 items
```
2. **Use `schema_to_select` to only fetch needed fields**:
```python
users = await crud_users.get_multi(
schema_to_select=UserRead, # Only fetch UserRead fields
return_as_model=True
)
```
3. **Add database indexes** for columns you sort by:
```sql
-- In your migration
CREATE INDEX idx_users_created_at ON users(created_at);
CREATE INDEX idx_users_name ON users(name);
```
## Common Patterns
### Admin List with All Users
```python
@router.get("/admin", dependencies=[Depends(get_current_superuser)])
async def get_all_users_admin(
    db: Annotated[AsyncSession, Depends(async_get_db)],
    include_deleted: bool = False,
    page: int = 1,
    items_per_page: int = 50
):
filters = {}
if not include_deleted:
filters["is_deleted"] = False
    users = await crud_users.get_multi(
        db=db,
        offset=(page - 1) * items_per_page,
        limit=items_per_page,
        return_total_count=True,
        **filters
    )
    return paginated_response(crud_data=users, page=page, items_per_page=items_per_page)
```
### User's Own Items
```python
@router.get("/my-posts", response_model=PaginatedListResponse[PostRead])
async def get_my_posts(
    current_user: Annotated[dict, Depends(get_current_user)],
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10
):
posts = await crud_posts.get_multi(
db=db,
author_id=current_user["id"], # Only user's own posts
offset=(page - 1) * items_per_page,
limit=items_per_page
)
return paginated_response(posts, page, items_per_page)
```
## What's Next
Now that you understand pagination:
- **[Database CRUD](../database/crud.md)** - Learn more about the CRUD operations
- **[Database Schemas](../database/schemas.md)** - Create schemas for your data
- **[Authentication](../authentication/index.md)** - Add user authentication to your endpoints
The boilerplate makes pagination simple - just use these patterns!

# API Versioning
Learn how to version your APIs properly using the boilerplate's built-in versioning structure and best practices for maintaining backward compatibility.
## Quick Start
The boilerplate is already set up for versioning with a `v1` structure:
```text
src/app/api/
├── dependencies.py # Shared across all versions
└── v1/ # Version 1 of your API
├── __init__.py # Router registration
├── users.py # User endpoints
├── posts.py # Post endpoints
└── ... # Other endpoints
```
Your endpoints are automatically available at `/api/v1/...`:
- `GET /api/v1/users/` - Get users
- `POST /api/v1/users/` - Create user
- `GET /api/v1/posts/` - Get posts
## Current Structure
### Version 1 (v1)
The current API version is in `src/app/api/v1/`:
```python
# src/app/api/v1/__init__.py
from fastapi import APIRouter
from .users import router as users_router
from .posts import router as posts_router
from .login import router as login_router
# Main v1 router
api_router = APIRouter()
# Include all v1 endpoints
api_router.include_router(users_router)
api_router.include_router(posts_router)
api_router.include_router(login_router)
```
### Main App Registration
In `src/app/main.py`, v1 is registered:
```python
from fastapi import FastAPI
from app.api.v1 import api_router as api_v1_router
app = FastAPI()
# Register v1 API
app.include_router(api_v1_router, prefix="/api/v1")
```
## Adding Version 2
When you need to make breaking changes, create a new version:
### Step 1: Create v2 Directory
```text
src/app/api/
├── dependencies.py
├── v1/ # Keep v1 unchanged
│ ├── __init__.py
│ ├── users.py
│ └── ...
└── v2/ # New version
├── __init__.py
├── users.py # Updated user endpoints
└── ...
```
### Step 2: Create v2 Router
```python
# src/app/api/v2/__init__.py
from fastapi import APIRouter
from .users import router as users_router
# Import other v2 routers
# Main v2 router
api_router = APIRouter()
# Include v2 endpoints
api_router.include_router(users_router)
```
### Step 3: Register v2 in Main App
```python
# src/app/main.py
from fastapi import FastAPI
from app.api.v1 import api_router as api_v1_router
from app.api.v2 import api_router as api_v2_router
app = FastAPI()
# Register both versions
app.include_router(api_v1_router, prefix="/api/v1")
app.include_router(api_v2_router, prefix="/api/v2")
```
## Version 2 Example
Here's how you might evolve the user endpoints in v2:
### v1 User Endpoint
```python
# src/app/api/v1/users.py
from app.schemas.user import UserRead, UserCreate
@router.get("/", response_model=list[UserRead])
async def get_users(db: Annotated[AsyncSession, Depends(async_get_db)]):
users = await crud_users.get_multi(db=db, schema_to_select=UserRead)
return users["data"]
@router.post("/", response_model=UserRead)
async def create_user(user_data: UserCreate, db: Annotated[AsyncSession, Depends(async_get_db)]):
return await crud_users.create(db=db, object=user_data)
```
### v2 User Endpoint (with breaking changes)
```python
# src/app/api/v2/users.py
from app.schemas.user import UserReadV2, UserCreateV2 # New schemas
from fastcrud.paginated import PaginatedListResponse
# Breaking change: Always return paginated response
@router.get("/", response_model=PaginatedListResponse[UserReadV2])
async def get_users(page: int = 1, items_per_page: int = 10):
users = await crud_users.get_multi(
db=db,
offset=(page - 1) * items_per_page,
limit=items_per_page,
schema_to_select=UserReadV2
)
return paginated_response(users, page, items_per_page)
# Breaking change: Require authentication
@router.post("/", response_model=UserReadV2)
async def create_user(
user_data: UserCreateV2,
current_user: Annotated[dict, Depends(get_current_user)] # Now required
):
return await crud_users.create(db=db, object=user_data)
```
## Schema Versioning
Create separate schemas for different versions:
### Version 1 Schema
```python
# src/app/schemas/user.py (existing)
from pydantic import BaseModel

class UserRead(BaseModel):
id: int
name: str
username: str
email: str
profile_image_url: str
tier_id: int | None
class UserCreate(BaseModel):
name: str
username: str
email: str
password: str
```
### Version 2 Schema (with changes)
```python
# src/app/schemas/user_v2.py (new file)
from datetime import datetime

from pydantic import BaseModel

class UserReadV2(BaseModel):
id: int
name: str
username: str
email: str
avatar_url: str # Changed from profile_image_url
subscription_tier: str # Changed from tier_id to string
created_at: datetime # New field
is_verified: bool # New field
class UserCreateV2(BaseModel):
name: str
username: str
email: str
password: str
accept_terms: bool # New required field
```
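When both versions read the same database rows, a small adapter that maps a v1-shaped record to the v2 field names keeps the conversion in one place. A minimal sketch following the schemas above (`TIER_NAMES` is a hypothetical lookup from `tier_id` to a subscription name, not part of the boilerplate):

```python
# Hypothetical lookup table: tier_id -> subscription tier name
TIER_NAMES = {1: "free", 2: "pro"}

def user_v1_to_v2(user: dict) -> dict:
    """Map a v1-shaped user record to the v2 field names."""
    return {
        "id": user["id"],
        "name": user["name"],
        "username": user["username"],
        "email": user["email"],
        "avatar_url": user["profile_image_url"],  # renamed in v2
        "subscription_tier": TIER_NAMES.get(user["tier_id"], "free"),  # id -> string
        "created_at": user.get("created_at"),  # new in v2
        "is_verified": user.get("is_verified", False),  # new in v2
    }
```

A v2 endpoint could call this on each record before validation, so the v1 storage format never leaks into v2 responses.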
## Gradual Migration Strategy
### 1. Keep Both Versions Running
```python
# Both versions work simultaneously
# v1: GET /api/v1/users/ -> list[UserRead]
# v2: GET /api/v2/users/ -> PaginatedListResponse[UserReadV2]
```
### 2. Add Deprecation Warnings
```python
# src/app/api/v1/users.py
from fastapi import Response
@router.get("/", response_model=list[UserRead])
async def get_users(response: Response):
# Add deprecation header
response.headers["X-API-Deprecation"] = "v1 is deprecated. Use v2."
response.headers["X-API-Sunset"] = "2024-12-31" # When v1 will be removed
users = await crud_users.get_multi(db=db, schema_to_select=UserRead)
return users["data"]
```
### 3. Monitor Usage
Track which versions are being used:
```python
# src/app/api/middleware.py
from fastapi import Request
import logging
logger = logging.getLogger(__name__)
async def version_tracking_middleware(request: Request, call_next):
if request.url.path.startswith("/api/v1/"):
logger.info(f"v1 usage: {request.method} {request.url.path}")
elif request.url.path.startswith("/api/v2/"):
logger.info(f"v2 usage: {request.method} {request.url.path}")
    response = await call_next(request)
    return response

# Register in main.py: app.middleware("http")(version_tracking_middleware)
```
## Shared Code Between Versions
Keep common logic in shared modules:
### Shared Dependencies
```python
# src/app/api/dependencies.py - shared across all versions
async def get_current_user(...):
# Authentication logic used by all versions
pass
async def get_db():
# Database connection used by all versions
pass
```
### Shared CRUD Operations
```python
# The CRUD layer can be shared between versions
# Only the schemas and endpoints change
# v1 endpoint
@router.get("/", response_model=list[UserRead])
async def get_users_v1():
users = await crud_users.get_multi(schema_to_select=UserRead)
return users["data"]
# v2 endpoint
@router.get("/", response_model=PaginatedListResponse[UserReadV2])
async def get_users_v2(page: int = 1, items_per_page: int = 10):
    users = await crud_users.get_multi(
        db=db,
        offset=(page - 1) * items_per_page,
        limit=items_per_page,
        schema_to_select=UserReadV2,
    )
    return paginated_response(users, page, items_per_page)
```
## Version Discovery
Let clients discover available versions:
```python
# src/app/api/versions.py
from fastapi import APIRouter
router = APIRouter()
@router.get("/versions")
async def get_api_versions():
return {
"available_versions": ["v1", "v2"],
"current_version": "v2",
"deprecated_versions": [],
"sunset_dates": {
"v1": "2024-12-31"
}
}
```
Register it in main.py:
```python
# src/app/main.py
from app.api.versions import router as versions_router
app.include_router(versions_router, prefix="/api")
# Now available at GET /api/versions
```
## Testing Multiple Versions
Test both versions to ensure compatibility:
```python
# tests/test_api_versioning.py
import pytest
from httpx import AsyncClient
@pytest.mark.asyncio
async def test_v1_users(client: AsyncClient):
"""Test v1 returns simple list"""
response = await client.get("/api/v1/users/")
assert response.status_code == 200
data = response.json()
assert isinstance(data, list) # v1 returns list
@pytest.mark.asyncio
async def test_v2_users(client: AsyncClient):
"""Test v2 returns paginated response"""
response = await client.get("/api/v2/users/")
assert response.status_code == 200
data = response.json()
assert "data" in data # v2 returns paginated response
assert "total_count" in data
assert "page" in data
```
## OpenAPI Documentation
Each version gets its own docs:
```python
# src/app/main.py
from fastapi import FastAPI
# Create separate apps for documentation
v1_app = FastAPI(title="My API v1", version="1.0.0")
v2_app = FastAPI(title="My API v2", version="2.0.0")
# Register routes
v1_app.include_router(api_v1_router)
v2_app.include_router(api_v2_router)
# Mount as sub-applications
main_app = FastAPI()
main_app.mount("/api/v1", v1_app)
main_app.mount("/api/v2", v2_app)
```
Now you have separate documentation:
- `/api/v1/docs` - v1 documentation
- `/api/v2/docs` - v2 documentation
## Best Practices
### 1. Semantic Versioning
- **v1.0** → **v1.1**: New features (backward compatible)
- **v1.1** → **v2.0**: Breaking changes (new version)
### 2. Clear Migration Path
```python
# Document what changed in v2
"""
API v2 Changes:
- GET /users/ now returns paginated response instead of array
- POST /users/ now requires authentication
- UserRead.profile_image_url renamed to avatar_url
- UserRead.tier_id changed to subscription_tier (string)
- Added UserRead.created_at and is_verified fields
- UserCreate now requires accept_terms field
"""
```
### 3. Gradual Deprecation
1. Release v2 alongside v1
2. Add deprecation warnings to v1
3. Set sunset date for v1
4. Monitor v1 usage
5. Remove v1 after sunset date
### 4. Consistent Patterns
Keep the same patterns across versions:
- Same URL structure: `/api/v{number}/resource`
- Same HTTP methods and status codes
- Same authentication approach
- Same error response format
## What's Next
Now that you understand API versioning:
- **[Database Migrations](../database/migrations.md)** - Handle database schema changes
- **[Testing](../testing.md)** - Test multiple API versions
- **[Production](../production.md)** - Deploy versioned APIs
Proper versioning lets you evolve your API without breaking existing clients!

# Authentication & Security
Learn how to implement secure authentication in your FastAPI application. The boilerplate provides a complete JWT-based authentication system with user management, permissions, and security best practices.
## What You'll Learn
- **[JWT Tokens](jwt-tokens.md)** - Understand access and refresh token management
- **[User Management](user-management.md)** - Handle registration, login, and user profiles
- **[Permissions](permissions.md)** - Implement role-based access control and authorization
## Authentication Overview
The system uses JWT tokens with refresh token rotation for secure, stateless authentication:
```python
# Basic login flow
@router.post("/login", response_model=Token)
async def login_for_access_token(response: Response, form_data: OAuth2PasswordRequestForm):
user = await authenticate_user(form_data.username, form_data.password, db)
access_token = await create_access_token(data={"sub": user["username"]})
refresh_token = await create_refresh_token(data={"sub": user["username"]})
# Set secure HTTP-only cookie for refresh token
response.set_cookie("refresh_token", refresh_token, httponly=True, secure=True)
return {"access_token": access_token, "token_type": "bearer"}
```
## Key Features
### JWT Token System
- **Access tokens**: Short-lived (30 minutes), for API requests
- **Refresh tokens**: Long-lived (7 days), stored in secure cookies
- **Token blacklisting**: Secure logout implementation
- **Automatic expiration**: Built-in token lifecycle management
### User Management
- **Flexible authentication**: Username or email login
- **Secure passwords**: bcrypt hashing with salt
- **Profile management**: Complete user CRUD operations
- **Soft delete**: User deactivation without data loss
### Permission System
- **Superuser privileges**: Administrative access control
- **Resource ownership**: User-specific data access
- **User tiers**: Subscription-based feature access
- **Rate limiting**: Per-user and per-tier API limits
## Authentication Patterns
### Endpoint Protection
```python
# Required authentication
@router.get("/protected")
async def protected_endpoint(current_user: dict = Depends(get_current_user)):
return {"message": f"Hello {current_user['username']}"}
# Optional authentication
@router.get("/public")
async def public_endpoint(user: dict | None = Depends(get_optional_user)):
if user:
return {"premium_content": True}
return {"premium_content": False}
# Superuser only
@router.get("/admin", dependencies=[Depends(get_current_superuser)])
async def admin_endpoint():
return {"admin_data": "sensitive"}
```
### Resource Ownership
```python
@router.patch("/posts/{post_id}")
async def update_post(post_id: int, current_user: dict = Depends(get_current_user)):
post = await crud_posts.get(db=db, id=post_id)
# Check ownership or admin privileges
if post["created_by_user_id"] != current_user["id"] and not current_user["is_superuser"]:
raise ForbiddenException("Cannot update other users' posts")
return await crud_posts.update(db=db, id=post_id, object=updates)
```
## Security Features
### Token Security
- Short-lived access tokens limit exposure
- HTTP-only refresh token cookies prevent XSS
- Token blacklisting enables secure logout
- Configurable token expiration times
### Password Security
- bcrypt hashing with automatic salt generation
- Configurable password complexity requirements
- No plain text passwords stored anywhere
- Rate limiting on authentication endpoints
### API Protection
- CORS policies for cross-origin request control
- Rate limiting prevents brute force attacks
- Input validation prevents injection attacks
- Consistent error messages prevent information disclosure
## Configuration
### JWT Settings
```env
SECRET_KEY="your-super-secret-key-here"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
```
### Security Settings
```env
# Cookie security
COOKIE_SECURE=true
COOKIE_SAMESITE="lax"
# Password requirements
PASSWORD_MIN_LENGTH=8
ENABLE_PASSWORD_COMPLEXITY=true
```
## Getting Started
Follow this progressive learning path:
### 1. **[JWT Tokens](jwt-tokens.md)** - Foundation
Understand how JWT tokens work, including access and refresh token management, verification, and blacklisting.
### 2. **[User Management](user-management.md)** - Core Features
Implement user registration, login, profile management, and administrative operations.
### 3. **[Permissions](permissions.md)** - Access Control
Set up role-based access control, resource ownership checking, and tier-based permissions.
## Implementation Examples
### Quick Authentication Setup
```python
# Protect an endpoint
@router.get("/my-data")
async def get_my_data(current_user: dict = Depends(get_current_user)):
return await get_user_specific_data(current_user["id"])
# Check user permissions
def check_tier_access(user: dict, required_tier: str):
if not user.get("tier") or user["tier"]["name"] != required_tier:
raise ForbiddenException(f"Requires {required_tier} tier")
# Custom authentication dependency
async def get_premium_user(current_user: dict = Depends(get_current_user)):
check_tier_access(current_user, "Pro")
return current_user
```
### Frontend Integration
```javascript
// Basic authentication flow
class AuthManager {
async login(username, password) {
const response = await fetch('/api/v1/login', {
method: 'POST',
headers: {'Content-Type': 'application/x-www-form-urlencoded'},
body: new URLSearchParams({username, password})
});
    const tokens = await response.json();
    // NOTE: sessionStorage or an in-memory variable limits XSS exposure vs. localStorage
    localStorage.setItem('access_token', tokens.access_token);
return tokens;
}
async makeAuthenticatedRequest(url, options = {}) {
const token = localStorage.getItem('access_token');
return fetch(url, {
...options,
headers: {
...options.headers,
'Authorization': `Bearer ${token}`
}
});
}
}
```
## What's Next
Start building your authentication system:
1. **[JWT Tokens](jwt-tokens.md)** - Learn token creation, verification, and lifecycle management
2. **[User Management](user-management.md)** - Implement registration, login, and profile operations
3. **[Permissions](permissions.md)** - Add authorization patterns and access control
The authentication system provides a secure foundation for your API. Each guide includes practical examples and implementation details for production-ready authentication.

# JWT Tokens
JSON Web Tokens (JWT) form the backbone of modern web authentication. This comprehensive guide explains how the boilerplate implements a secure, stateless authentication system using access and refresh tokens.
## Understanding JWT Authentication
JWT tokens are self-contained, digitally signed packages of information that can be safely transmitted between parties. Unlike traditional session-based authentication that requires server-side storage, JWT tokens are stateless - all the information needed to verify a user's identity is contained within the token itself.
### Why Use JWT?
**Stateless Design**: No need to store session data on the server, making it perfect for distributed systems and microservices.
**Scalability**: Since tokens contain all necessary information, they work seamlessly across multiple servers without shared session storage.
**Security**: Digital signatures ensure tokens can't be tampered with, and expiration times limit exposure if compromised.
**Cross-Domain Support**: Unlike cookies, JWT tokens work across different domains and can be used in mobile applications.
## Token Types
The authentication system uses a **dual-token approach** for maximum security and user experience:
### Access Tokens
Access tokens are short-lived credentials that prove a user's identity for API requests. Think of them as temporary keys that grant access to protected resources.
- **Purpose**: Authenticate API requests and authorize actions
- **Lifetime**: 30 minutes (configurable) - short enough to limit damage if compromised
- **Storage**: Authorization header (`Bearer <token>`) - sent with each API request
- **Usage**: Include in every call to protected endpoints
**Why Short-Lived?** If an access token is stolen (e.g., through XSS), the damage window is limited to 30 minutes before it expires naturally.
### Refresh Tokens
Refresh tokens are longer-lived credentials used solely to generate new access tokens. They provide a balance between security and user convenience.
- **Purpose**: Generate new access tokens without requiring re-login
- **Lifetime**: 7 days (configurable) - long enough for good UX, short enough for security
- **Storage**: Secure HTTP-only cookie - inaccessible to JavaScript, preventing XSS attacks
- **Usage**: Automatically used by the browser when access tokens need refreshing
**Why HTTP-Only Cookies?** This prevents malicious JavaScript from accessing refresh tokens, providing protection against XSS attacks while allowing automatic renewal.
## Token Creation
Understanding how tokens are created helps you customize the authentication system for your specific needs.
### Creating Access Tokens
Access tokens are generated during login and token refresh operations. The process involves encoding user information with an expiration time and signing it with your secret key.
```python
from datetime import timedelta
from app.core.security import create_access_token, ACCESS_TOKEN_EXPIRE_MINUTES
# Basic access token with default expiration
access_token = await create_access_token(data={"sub": username})
# Custom expiration for special cases (e.g., admin sessions)
custom_expires = timedelta(minutes=60)
access_token = await create_access_token(
data={"sub": username},
expires_delta=custom_expires
)
```
**When to Customize Expiration:**
- **High-security environments**: Shorter expiration (15 minutes)
- **Development/testing**: Longer expiration for convenience
- **Admin operations**: Variable expiration based on sensitivity
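Under the hood, the expiration is simply "now plus the delta", encoded as a Unix timestamp in the token's `exp` claim. A quick standard-library illustration of how a 60-minute delta becomes an `exp` value (a sketch of the mechanism, not the boilerplate's exact implementation):

```python
from datetime import datetime, timedelta, timezone

def exp_claim(expires_delta: timedelta) -> int:
    """Compute the Unix timestamp stored in a JWT `exp` claim."""
    return int((datetime.now(timezone.utc) + expires_delta).timestamp())

claim = exp_claim(timedelta(minutes=60))  # roughly 3600 seconds in the future
```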
### Creating Refresh Tokens
Refresh tokens follow the same creation pattern but with longer expiration times. They're typically created only during login.
```python
from app.core.security import create_refresh_token, REFRESH_TOKEN_EXPIRE_DAYS
# Standard refresh token
refresh_token = await create_refresh_token(data={"sub": username})
# Extended refresh token for "remember me" functionality
extended_expires = timedelta(days=30)
refresh_token = await create_refresh_token(
data={"sub": username},
expires_delta=extended_expires
)
```
### Token Structure
JWT tokens consist of three parts separated by dots: `header.payload.signature`. The payload contains the actual user information and metadata.
```python
# Access token payload structure
{
"sub": "username", # Subject (user identifier)
"exp": 1234567890, # Expiration timestamp (Unix)
"token_type": "access", # Distinguishes from refresh tokens
"iat": 1234567890 # Issued at (automatic)
}
# Refresh token payload structure
{
"sub": "username", # Same user identifier
"exp": 1234567890, # Longer expiration time
"token_type": "refresh", # Prevents confusion/misuse
"iat": 1234567890 # Issue timestamp
}
```
**Key Fields Explained:**
- **`sub` (Subject)**: Identifies the user - can be username, email, or user ID
- **`exp` (Expiration)**: Unix timestamp when token becomes invalid
- **`token_type`**: Custom field preventing tokens from being used incorrectly
- **`iat` (Issued At)**: Useful for token rotation and audit trails
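Because the payload segment is just base64url-encoded JSON, you can inspect these claims without any JWT library. A debugging sketch using only the standard library; it does not verify the signature, so never use it for authentication decisions:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode the (unverified!) payload segment of a JWT for inspection."""
    payload_b64 = token.split(".")[1]
    # base64url strips '=' padding; restore it before decoding
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

This is handy for checking `sub`, `exp`, and `token_type` while debugging, since signature verification belongs to `verify_token`, not to inspection tooling.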
## Token Verification
Token verification is a multi-step process that ensures both the token's authenticity and the user's current authorization status.
### Verifying Access Tokens
Every protected endpoint must verify the access token before processing the request. This involves checking the signature, expiration, and blacklist status.
```python
from app.core.security import verify_token, TokenType
# Verify access token in endpoint
token_data = await verify_token(token, TokenType.ACCESS, db)
if token_data:
username = token_data.username_or_email
# Token is valid, proceed with request processing
else:
# Token is invalid, expired, or blacklisted
raise UnauthorizedException("Invalid or expired token")
```
### Verifying Refresh Tokens
Refresh token verification follows the same process but with different validation rules and outcomes.
```python
# Verify refresh token for renewal
token_data = await verify_token(token, TokenType.REFRESH, db)
if token_data:
# Generate new access token
new_access_token = await create_access_token(
data={"sub": token_data.username_or_email}
)
return {"access_token": new_access_token, "token_type": "bearer"}
else:
# Refresh token invalid - user must log in again
raise UnauthorizedException("Invalid refresh token")
```
### Token Verification Process
The verification process includes several security checks to prevent various attack vectors:
```python
async def verify_token(token: str, expected_token_type: TokenType, db: AsyncSession) -> TokenData | None:
# 1. Check blacklist first (prevents use of logged-out tokens)
is_blacklisted = await crud_token_blacklist.exists(db, token=token)
if is_blacklisted:
return None
try:
# 2. Verify signature and decode payload
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
# 3. Extract and validate claims
username_or_email: str | None = payload.get("sub")
token_type: str | None = payload.get("token_type")
# 4. Ensure token type matches expectation
if username_or_email is None or token_type != expected_token_type:
return None
# 5. Return validated data
return TokenData(username_or_email=username_or_email)
except JWTError:
# Token is malformed, expired, or signature invalid
return None
```
**Security Checks Explained:**
1. **Blacklist Check**: Prevents use of tokens from logged-out users
2. **Signature Verification**: Ensures token hasn't been tampered with
3. **Expiration Check**: Automatically handled by JWT library
4. **Type Validation**: Prevents refresh tokens from being used as access tokens
5. **Subject Validation**: Ensures token contains valid user identifier
## Client-Side Authentication Flow
Understanding the complete authentication flow helps frontend developers integrate properly with the API.
### Recommended Client Flow
**1. Login Process**
```javascript
// Send credentials to login endpoint
const response = await fetch('/api/v1/login', {
method: 'POST',
headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
body: 'username=user&password=pass',
credentials: 'include' // Important: includes cookies
});
const { access_token, token_type } = await response.json();
// Store access token in memory (not localStorage)
sessionStorage.setItem('access_token', access_token);
```
**2. Making Authenticated Requests**
```javascript
// Include access token in Authorization header
const response = await fetch('/api/v1/protected-endpoint', {
headers: {
'Authorization': `Bearer ${sessionStorage.getItem('access_token')}`
},
credentials: 'include'
});
```
**3. Handling Token Expiration**
```javascript
// Automatic token refresh on 401 errors
async function apiCall(url, options = {}) {
let response = await fetch(url, {
...options,
headers: {
...options.headers,
'Authorization': `Bearer ${sessionStorage.getItem('access_token')}`
},
credentials: 'include'
});
// If token expired, try to refresh
if (response.status === 401) {
const refreshResponse = await fetch('/api/v1/refresh', {
method: 'POST',
credentials: 'include' // Sends refresh token cookie
});
if (refreshResponse.ok) {
const { access_token } = await refreshResponse.json();
sessionStorage.setItem('access_token', access_token);
// Retry original request
response = await fetch(url, {
...options,
headers: {
...options.headers,
'Authorization': `Bearer ${access_token}`
},
credentials: 'include'
});
} else {
// Refresh failed - redirect to login
window.location.href = '/login';
}
}
return response;
}
```
**4. Logout Process**
```javascript
// Clear tokens and call logout endpoint
await fetch('/api/v1/logout', {
method: 'POST',
credentials: 'include'
});
sessionStorage.removeItem('access_token');
// Refresh token cookie is cleared by server
```
### Cookie Configuration
The refresh token cookie is configured for maximum security:
```python
response.set_cookie(
key="refresh_token",
value=refresh_token,
httponly=True, # Prevents JavaScript access (XSS protection)
secure=True, # HTTPS only in production
samesite="Lax", # CSRF protection with good usability
max_age=REFRESH_TOKEN_EXPIRE_DAYS * 24 * 60 * 60
)
```
**SameSite Options:**
- **`Lax`** (Recommended): Cookies sent on top-level navigation but not cross-site requests
- **`Strict`**: Maximum security but may break some user flows
- **`None`**: Required for cross-origin requests (must use with Secure)
## Token Blacklisting
Token blacklisting solves a fundamental problem with JWT tokens: once issued, they remain valid until expiration, even if the user logs out. Blacklisting provides immediate token revocation.
### Why Blacklisting Matters
Without blacklisting, logged-out users could continue accessing your API until their tokens naturally expire. This creates security risks, especially on shared computers or if tokens are compromised.
### Blacklisting Implementation
The system uses a database table to track invalidated tokens:
```python
# models/token_blacklist.py
class TokenBlacklist(Base):
__tablename__ = "token_blacklist"
id: Mapped[int] = mapped_column(primary_key=True)
token: Mapped[str] = mapped_column(unique=True, index=True) # Full token string
expires_at: Mapped[datetime] = mapped_column() # When to clean up
created_at: Mapped[datetime] = mapped_column(default=datetime.utcnow)
```
**Design Considerations:**
- **Unique constraint**: Prevents duplicate entries
- **Index on token**: Fast lookup during verification
- **Expires_at field**: Enables automatic cleanup of old entries
### Blacklisting Tokens
The system provides functions for both single token and dual token blacklisting:
```python
from app.core.security import blacklist_token, blacklist_tokens
# Single token blacklisting (for specific scenarios)
await blacklist_token(token, db)
# Dual token blacklisting (standard logout)
await blacklist_tokens(access_token, refresh_token, db)
```
### Blacklisting Process
The blacklisting process extracts the expiration time from the token to set an appropriate cleanup schedule:
```python
async def blacklist_token(token: str, db: AsyncSession) -> None:
# 1. Decode token to extract expiration (no verification needed)
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
exp_timestamp = payload.get("exp")
if exp_timestamp is not None:
# 2. Convert Unix timestamp to datetime
expires_at = datetime.fromtimestamp(exp_timestamp)
# 3. Store in blacklist with expiration
await crud_token_blacklist.create(
db,
object=TokenBlacklistCreate(token=token, expires_at=expires_at)
)
```
**Cleanup Strategy**: Blacklisted tokens can be automatically removed from the database after their natural expiration time, preventing unlimited database growth.
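The cleanup itself is a single delete of rows whose `expires_at` has passed, run on a schedule. Illustrated here with the standard library's `sqlite3` to show the query shape; the boilerplate itself uses async SQLAlchemy, so this is only a sketch of the idea:

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE token_blacklist (token TEXT UNIQUE, expires_at TEXT)")

now = datetime(2025, 1, 1)
rows = [
    ("old-token", (now - timedelta(days=1)).isoformat()),   # already expired
    ("live-token", (now + timedelta(days=7)).isoformat()),  # still relevant
]
conn.executemany("INSERT INTO token_blacklist VALUES (?, ?)", rows)

# Purge entries whose natural expiration has already passed
conn.execute("DELETE FROM token_blacklist WHERE expires_at < ?", (now.isoformat(),))
remaining = [row[0] for row in conn.execute("SELECT token FROM token_blacklist")]
```

Expired entries are safe to delete because the JWT library would reject those tokens anyway; the blacklist only needs to cover tokens that are still within their lifetime.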
## Login Flow Implementation
### Complete Login Endpoint
```python
@router.post("/login", response_model=Token)
async def login_for_access_token(
response: Response,
form_data: Annotated[OAuth2PasswordRequestForm, Depends()],
db: Annotated[AsyncSession, Depends(async_get_db)],
) -> dict[str, str]:
# 1. Authenticate user
user = await authenticate_user(
username_or_email=form_data.username,
password=form_data.password,
db=db
)
if not user:
raise HTTPException(
status_code=401,
detail="Incorrect username or password"
)
# 2. Create access token
access_token = await create_access_token(data={"sub": user["username"]})
# 3. Create refresh token
refresh_token = await create_refresh_token(data={"sub": user["username"]})
# 4. Set refresh token as HTTP-only cookie
response.set_cookie(
key="refresh_token",
value=refresh_token,
httponly=True,
secure=True,
samesite="strict",
max_age=REFRESH_TOKEN_EXPIRE_DAYS * 24 * 60 * 60
)
return {"access_token": access_token, "token_type": "bearer"}
```
### Token Refresh Endpoint
```python
@router.post("/refresh", response_model=Token)
async def refresh_access_token(
response: Response,
db: Annotated[AsyncSession, Depends(async_get_db)],
refresh_token: str = Cookie(None)
) -> dict[str, str]:
if not refresh_token:
raise HTTPException(status_code=401, detail="Refresh token missing")
# 1. Verify refresh token
token_data = await verify_token(refresh_token, TokenType.REFRESH, db)
if not token_data:
raise HTTPException(status_code=401, detail="Invalid refresh token")
# 2. Create new access token
new_access_token = await create_access_token(
data={"sub": token_data.username_or_email}
)
# 3. Optionally create new refresh token (token rotation)
new_refresh_token = await create_refresh_token(
data={"sub": token_data.username_or_email}
)
# 4. Blacklist old refresh token
await blacklist_token(refresh_token, db)
# 5. Set new refresh token cookie
response.set_cookie(
key="refresh_token",
value=new_refresh_token,
httponly=True,
secure=True,
samesite="strict",
max_age=REFRESH_TOKEN_EXPIRE_DAYS * 24 * 60 * 60
)
return {"access_token": new_access_token, "token_type": "bearer"}
```
### Logout Implementation
```python
@router.post("/logout")
async def logout(
response: Response,
db: Annotated[AsyncSession, Depends(async_get_db)],
current_user: dict = Depends(get_current_user),
token: str = Depends(oauth2_scheme),
refresh_token: str = Cookie(None)
) -> dict[str, str]:
# 1. Blacklist access token
await blacklist_token(token, db)
# 2. Blacklist refresh token if present
if refresh_token:
await blacklist_token(refresh_token, db)
# 3. Clear refresh token cookie
response.delete_cookie(
key="refresh_token",
httponly=True,
secure=True,
samesite="strict"
)
return {"message": "Successfully logged out"}
```
## Authentication Dependencies
### get_current_user
```python
async def get_current_user(
db: AsyncSession = Depends(async_get_db),
token: str = Depends(oauth2_scheme)
) -> dict:
# 1. Verify token
token_data = await verify_token(token, TokenType.ACCESS, db)
if not token_data:
raise HTTPException(status_code=401, detail="Invalid token")
# 2. Get user from database
user = await crud_users.get(
db=db,
username=token_data.username_or_email,
schema_to_select=UserRead
)
if user is None:
raise HTTPException(status_code=401, detail="User not found")
return user
```
### get_optional_user
```python
async def get_optional_user(
db: AsyncSession = Depends(async_get_db),
token: str = Depends(optional_oauth2_scheme)
) -> dict | None:
if not token:
return None
try:
return await get_current_user(db=db, token=token)
except HTTPException:
return None
```
### get_current_superuser
```python
async def get_current_superuser(
current_user: dict = Depends(get_current_user)
) -> dict:
if not current_user.get("is_superuser", False):
raise HTTPException(
status_code=403,
detail="Not enough permissions"
)
return current_user
```
## Configuration
### Environment Variables
```bash
# JWT Configuration
SECRET_KEY=your-secret-key-here
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
# Security Headers
SECURE_COOKIES=true
CORS_ORIGINS=["http://localhost:3000", "https://yourapp.com"]
```
### Security Configuration
```python
# app/core/config.py
class Settings(BaseSettings):
SECRET_KEY: SecretStr
ALGORITHM: str = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES: int = 30
REFRESH_TOKEN_EXPIRE_DAYS: int = 7
# Cookie settings
SECURE_COOKIES: bool = True
COOKIE_DOMAIN: str | None = None
COOKIE_SAMESITE: str = "strict"
```
## Security Best Practices
### Token Security
- **Use strong secrets**: Generate cryptographically secure SECRET_KEY
- **Rotate secrets**: Regularly change SECRET_KEY in production
- **Environment separation**: Different secrets for dev/staging/production
- **Secure transmission**: Always use HTTPS in production
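A cryptographically strong SECRET_KEY can be generated with the standard library's `secrets` module:

```python
import secrets

# 32 random bytes, URL-safe base64 encoded: suitable as an HS256 signing key
secret_key = secrets.token_urlsafe(32)
# Paste the printed value into your .env as SECRET_KEY
```

Generate a different key per environment, and rotate by issuing a new key and letting old tokens expire.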
### Cookie Security
- **HttpOnly flag**: Prevents JavaScript access to refresh tokens
- **Secure flag**: Ensures cookies only sent over HTTPS
- **SameSite attribute**: Prevents CSRF attacks
- **Domain restrictions**: Set cookie domain appropriately
### Implementation Security
- **Input validation**: Validate all token inputs
- **Rate limiting**: Implement login attempt limits
- **Audit logging**: Log authentication events
- **Token rotation**: Regularly refresh tokens
## Common Patterns
### API Key Authentication
For service-to-service communication:
```python
async def get_api_key_user(
api_key: str = Header(None),
db: AsyncSession = Depends(async_get_db)
) -> dict:
if not api_key:
raise HTTPException(status_code=401, detail="API key required")
# Verify API key
user = await crud_users.get(db=db, api_key=api_key)
if not user:
raise HTTPException(status_code=401, detail="Invalid API key")
return user
```
### Multiple Authentication Methods
```python
async def get_authenticated_user(
db: AsyncSession = Depends(async_get_db),
token: str = Depends(optional_oauth2_scheme),
api_key: str = Header(None)
) -> dict:
# Try JWT token first
if token:
try:
return await get_current_user(db=db, token=token)
except HTTPException:
pass
# Fall back to API key
if api_key:
return await get_api_key_user(api_key=api_key, db=db)
raise HTTPException(status_code=401, detail="Authentication required")
```
## Troubleshooting
### Common Issues
**Token Expired**: Implement automatic refresh using refresh tokens
**Invalid Signature**: Check SECRET_KEY consistency across environments
**Blacklisted Token**: User logged out - redirect to login
**Missing Token**: Ensure Authorization header is properly set
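The "Invalid Signature" case follows directly from how HS256 works: the signature is an HMAC-SHA256 over the token's header and payload, so environments with different SECRET_KEY values can never validate each other's tokens. A stdlib illustration (the signing input is a stand-in for the real base64url segments):

```python
import hashlib
import hmac

signing_input = b"header.payload"  # stand-in for the base64url-encoded segments
sig_a = hmac.new(b"secret-key-A", signing_input, hashlib.sha256).digest()
sig_b = hmac.new(b"secret-key-B", signing_input, hashlib.sha256).digest()
print(sig_a == sig_b)  # different keys -> different signatures
```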
### Debugging Tips
```python
# Enable debug logging
import logging
logging.getLogger("app.core.security").setLevel(logging.DEBUG)
# Test token validation
async def debug_token(token: str, db: AsyncSession):
try:
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
print(f"Token payload: {payload}")
is_blacklisted = await crud_token_blacklist.exists(db, token=token)
print(f"Is blacklisted: {is_blacklisted}")
except JWTError as e:
print(f"JWT Error: {e}")
```
This comprehensive JWT implementation provides secure, scalable authentication for your FastAPI application.

# Permissions and Authorization
Authorization determines what authenticated users can do within your application. While authentication answers "who are you?", authorization answers "what can you do?". This section covers the permission system, access control patterns, and how to implement secure authorization in your endpoints.
## Understanding Authorization
Authorization is a multi-layered security concept that protects resources and operations based on user identity, roles, and contextual information. The boilerplate implements several authorization patterns to handle different security requirements.
### Authorization vs Authentication
**Authentication**: Verifies user identity - confirms the user is who they claim to be
**Authorization**: Determines user permissions - decides what the authenticated user can access
These work together: you must authenticate first (prove identity) before you can authorize (check permissions).
### Authorization Patterns
The system implements several common authorization patterns:
1. **Role-Based Access Control (RBAC)**: Users have roles (superuser, regular user) that determine permissions
2. **Resource Ownership**: Users can only access resources they own
3. **Tiered Access**: Different user tiers have different capabilities and limits
4. **Contextual Authorization**: Permissions based on request context (rate limits, time-based access)
## Core Authorization Patterns
### Superuser Permissions
Superusers have elevated privileges for administrative operations. This pattern is essential for system management but must be carefully controlled.
```python
from app.api.dependencies import get_current_superuser
# Superuser-only endpoint
@router.get("/admin/users/", dependencies=[Depends(get_current_superuser)])
async def get_all_users(
db: AsyncSession = Depends(async_get_db)
) -> list[UserRead]:
# Only superusers can access this endpoint
users = await crud_users.get_multi(
db=db,
schema_to_select=UserRead,
return_as_model=True
)
return users.data
```
**When to Use Superuser Authorization:**
- **User management operations**: Creating, deleting, or modifying other users
- **System configuration**: Changing application settings or configuration
- **Data export/import**: Bulk operations on sensitive data
- **Administrative reporting**: Access to system-wide analytics and logs
**Security Considerations:**
- **Minimal Assignment**: Only assign superuser status when absolutely necessary
- **Regular Audits**: Periodically review who has superuser access
- **Activity Logging**: Log all superuser actions for security monitoring
- **Time-Limited Access**: Consider temporary superuser elevation for specific tasks
### Resource Ownership
Resource ownership ensures users can only access and modify their own data. This is the most common authorization pattern in user-facing applications.
```python
@router.get("/posts/me/")
async def get_my_posts(
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> list[PostRead]:
# Get posts owned by current user
posts = await crud_posts.get_multi(
db=db,
created_by_user_id=current_user["id"],
schema_to_select=PostRead,
return_as_model=True
)
return posts.data
@router.delete("/posts/{post_id}")
async def delete_post(
post_id: int,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> dict[str, str]:
# 1. Get the post
post = await crud_posts.get(db=db, id=post_id)
if not post:
raise NotFoundException("Post not found")
# 2. Check ownership
if post["created_by_user_id"] != current_user["id"]:
raise ForbiddenException("You can only delete your own posts")
# 3. Delete the post
await crud_posts.delete(db=db, id=post_id)
return {"message": "Post deleted"}
```
**Ownership Validation Pattern:**
1. **Retrieve Resource**: Get the resource from the database
2. **Check Ownership**: Compare resource owner with current user
3. **Authorize or Deny**: Allow action if user owns resource, deny otherwise
### User Tiers and Rate Limiting
User tiers provide differentiated access based on subscription levels or user status. This enables business models with different feature sets for different user types.
```python
@router.post("/posts/", response_model=PostRead)
async def create_post(
post: PostCreate,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> PostRead:
# Check rate limits based on user tier
await check_rate_limit(
resource="posts",
user_id=current_user["id"],
tier_id=current_user.get("tier_id"),
db=db
)
# Create post with user association
post_internal = PostCreateInternal(
**post.model_dump(),
created_by_user_id=current_user["id"]
)
created_post = await crud_posts.create(db=db, object=post_internal)
return created_post
```
**Rate Limiting Implementation:**
```python
async def check_rate_limit(
resource: str,
user_id: int,
tier_id: int | None,
db: AsyncSession
) -> None:
# 1. Get user's tier information
if tier_id:
tier = await crud_tiers.get(db=db, id=tier_id)
limit = tier["rate_limit_posts"] if tier else 10 # Default limit
else:
limit = 5 # Free tier limit
# 2. Count recent posts (last 24 hours)
recent_posts = await crud_posts.count(
db=db,
created_by_user_id=user_id,
created_at__gte=datetime.utcnow() - timedelta(hours=24)
)
# 3. Check if limit exceeded
if recent_posts >= limit:
raise RateLimitException(f"Daily {resource} limit exceeded ({limit})")
```
**Tier-Based Authorization Benefits:**
- **Business Model Support**: Different features for different subscription levels
- **Resource Protection**: Prevents abuse by limiting free tier usage
- **Progressive Enhancement**: Encourages upgrades by showing tier benefits
- **Fair Usage**: Ensures equitable resource distribution among users
### Custom Permission Helpers
Custom permission functions provide reusable authorization logic for complex scenarios.
```python
# Permission helper functions
async def can_edit_post(user: dict, post_id: int, db: AsyncSession) -> bool:
"""Check if user can edit a specific post."""
post = await crud_posts.get(db=db, id=post_id)
if not post:
return False
# Superusers can edit any post
if user.get("is_superuser", False):
return True
# Users can edit their own posts
if post["created_by_user_id"] == user["id"]:
return True
return False
async def can_access_admin_panel(user: dict) -> bool:
"""Check if user can access admin panel."""
return user.get("is_superuser", False)
async def has_tier_feature(user: dict, feature: str, db: AsyncSession) -> bool:
"""Check if user's tier includes a specific feature."""
tier_id = user.get("tier_id")
if not tier_id:
return False # Free tier - no premium features
tier = await crud_tiers.get(db=db, id=tier_id)
if not tier:
return False
# Check tier features (example)
return tier.get(f"allows_{feature}", False)
# Usage in endpoints
@router.put("/posts/{post_id}")
async def update_post(
post_id: int,
post_updates: PostUpdate,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> PostRead:
# Use permission helper
if not await can_edit_post(current_user, post_id, db):
raise ForbiddenException("Cannot edit this post")
updated_post = await crud_posts.update(
db=db,
object=post_updates,
id=post_id
)
return updated_post
```
**Permission Helper Benefits:**
- **Reusability**: Same logic used across multiple endpoints
- **Consistency**: Ensures uniform permission checking
- **Maintainability**: Changes to permissions only need updates in one place
- **Testability**: Permission logic can be unit tested separately
## Authorization Dependencies
### Basic Authorization Dependencies
```python
# Required authentication
async def get_current_user(
token: str = Depends(oauth2_scheme),
db: AsyncSession = Depends(async_get_db)
) -> dict:
"""Get currently authenticated user."""
token_data = await verify_token(token, TokenType.ACCESS, db)
if not token_data:
raise HTTPException(status_code=401, detail="Invalid token")
user = await crud_users.get(db=db, username=token_data.username_or_email)
if not user:
raise HTTPException(status_code=401, detail="User not found")
return user
# Optional authentication
async def get_optional_user(
token: str = Depends(optional_oauth2_scheme),
db: AsyncSession = Depends(async_get_db)
) -> dict | None:
"""Get currently authenticated user, or None if not authenticated."""
if not token:
return None
try:
return await get_current_user(token=token, db=db)
except HTTPException:
return None
# Superuser requirement
async def get_current_superuser(
current_user: dict = Depends(get_current_user)
) -> dict:
"""Get current user and ensure they are a superuser."""
if not current_user.get("is_superuser", False):
raise HTTPException(status_code=403, detail="Not enough permissions")
return current_user
```
### Advanced Authorization Dependencies
```python
# Tier-based access control
def require_tier(minimum_tier: str):
"""Factory function for tier-based dependencies."""
async def check_user_tier(
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> dict:
tier_id = current_user.get("tier_id")
if not tier_id:
raise HTTPException(status_code=403, detail="No subscription tier")
tier = await crud_tiers.get(db=db, id=tier_id)
if not tier or tier["name"] != minimum_tier:
raise HTTPException(
status_code=403,
detail=f"Requires {minimum_tier} tier"
)
return current_user
return check_user_tier
# Resource ownership dependency
def require_resource_ownership(resource_type: str):
"""Factory function for resource ownership dependencies."""
async def check_ownership(
resource_id: int,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> dict:
if resource_type == "post":
resource = await crud_posts.get(db=db, id=resource_id)
owner_field = "created_by_user_id"
else:
raise ValueError(f"Unknown resource type: {resource_type}")
if not resource:
raise HTTPException(status_code=404, detail="Resource not found")
# Superusers can access any resource
if current_user.get("is_superuser", False):
return current_user
# Check ownership
if resource[owner_field] != current_user["id"]:
raise HTTPException(
status_code=403,
detail="You don't own this resource"
)
return current_user
return check_ownership
# Usage examples
@router.get("/premium-feature", dependencies=[Depends(require_tier("Premium"))])
async def premium_feature():
return {"message": "Premium feature accessed"}
@router.put("/posts/{post_id}")
async def update_post(
post_id: int,
post_update: PostUpdate,
current_user: dict = Depends(require_resource_ownership("post")),
db: AsyncSession = Depends(async_get_db)
) -> PostRead:
# User ownership already verified by dependency
updated_post = await crud_posts.update(db=db, object=post_update, id=post_id)
return updated_post
```
## Security Best Practices
### Principle of Least Privilege
Always grant the minimum permissions necessary for users to complete their tasks.
**Implementation:**
- **Default Deny**: Start with no permissions and explicitly grant what's needed
- **Regular Review**: Periodically audit user permissions and remove unnecessary access
- **Role Segregation**: Separate administrative and user-facing permissions
- **Temporary Elevation**: Use temporary permissions for one-time administrative tasks
### Defense in Depth
Implement multiple layers of authorization checks throughout your application.
**Authorization Layers:**
1. **API Gateway**: Route-level permission checks
2. **Endpoint Dependencies**: FastAPI dependency injection for common patterns
3. **Business Logic**: Method-level permission validation
4. **Database**: Row-level security where applicable
### Input Validation and Sanitization
Always validate and sanitize user input, even from authorized users.
```python
@router.post("/admin/users/{user_id}/tier")
async def update_user_tier(
user_id: int,
tier_update: UserTierUpdate,
current_user: dict = Depends(get_current_superuser),
db: AsyncSession = Depends(async_get_db)
) -> dict[str, str]:
# 1. Validate tier exists
tier = await crud_tiers.get(db=db, id=tier_update.tier_id)
if not tier:
raise NotFoundException("Tier not found")
# 2. Validate user exists
user = await crud_users.get(db=db, id=user_id)
if not user:
raise NotFoundException("User not found")
# 3. Prevent self-demotion (optional business rule)
if user_id == current_user["id"] and tier["name"] == "free":
raise ForbiddenException("Cannot demote yourself to free tier")
# 4. Update user tier
await crud_users.update(
db=db,
object={"tier_id": tier_update.tier_id},
id=user_id
)
return {"message": f"User tier updated to {tier['name']}"}
```
### Audit Logging
Log all significant authorization decisions for security monitoring and compliance.
```python
import logging
security_logger = logging.getLogger("security")
async def log_authorization_event(
user_id: int,
action: str,
resource: str,
result: str,
    details: dict | None = None
):
"""Log authorization events for security auditing."""
security_logger.info(
f"Authorization {result}: User {user_id} attempted {action} on {resource}",
extra={
"user_id": user_id,
"action": action,
"resource": resource,
"result": result,
"details": details or {}
}
)
# Usage in permission checks
async def delete_user_account(user_id: int, current_user: dict, db: AsyncSession):
if current_user["id"] != user_id and not current_user.get("is_superuser"):
await log_authorization_event(
user_id=current_user["id"],
action="delete_account",
resource=f"user:{user_id}",
result="denied",
details={"reason": "insufficient_permissions"}
)
raise ForbiddenException("Cannot delete other users' accounts")
await log_authorization_event(
user_id=current_user["id"],
action="delete_account",
resource=f"user:{user_id}",
result="granted"
)
# Proceed with deletion
await crud_users.delete(db=db, id=user_id)
```
## Common Authorization Patterns
### Multi-Tenant Authorization
For applications serving multiple organizations or tenants:
```python
@router.get("/organizations/{org_id}/users/")
async def get_organization_users(
org_id: int,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> list[UserRead]:
# Check if user belongs to organization
membership = await crud_org_members.get(
db=db,
organization_id=org_id,
user_id=current_user["id"]
)
if not membership:
raise ForbiddenException("Not a member of this organization")
# Check if user has admin role in organization
    if membership["role"] not in ["admin", "owner"]:
raise ForbiddenException("Insufficient organization permissions")
# Get organization users
users = await crud_users.get_multi(
db=db,
organization_id=org_id,
schema_to_select=UserRead,
return_as_model=True
)
return users.data
```
### Time-Based Permissions
For permissions that change based on time or schedule:
```python
from datetime import datetime, time
async def check_business_hours_access(user: dict) -> bool:
"""Check if user can access during business hours only."""
now = datetime.now()
business_start = time(9, 0) # 9 AM
business_end = time(17, 0) # 5 PM
# Superusers can always access
if user.get("is_superuser", False):
return True
# Regular users only during business hours
current_time = now.time()
return business_start <= current_time <= business_end
# Usage in dependency
async def require_business_hours(
current_user: dict = Depends(get_current_user)
) -> dict:
"""Require access during business hours for non-admin users."""
if not await check_business_hours_access(current_user):
raise ForbiddenException("Access only allowed during business hours")
return current_user
@router.post("/business-operation", dependencies=[Depends(require_business_hours)])
async def business_operation():
return {"message": "Business operation completed"}
```
### Role-Based Access Control (RBAC)
For more complex permission systems:
```python
# Role definitions
class Role(str, Enum):
USER = "user"
MODERATOR = "moderator"
ADMIN = "admin"
SUPERUSER = "superuser"
# Permission checking
def has_role(user: dict, required_role: Role) -> bool:
"""Check if user has required role or higher."""
role_hierarchy = {
Role.USER: 0,
Role.MODERATOR: 1,
Role.ADMIN: 2,
Role.SUPERUSER: 3
}
user_role = Role(user.get("role", "user"))
return role_hierarchy[user_role] >= role_hierarchy[required_role]
# Role-based dependency
def require_role(minimum_role: Role):
"""Factory for role-based dependencies."""
async def check_role(current_user: dict = Depends(get_current_user)) -> dict:
if not has_role(current_user, minimum_role):
raise HTTPException(
status_code=403,
detail=f"Requires {minimum_role.value} role or higher"
)
return current_user
return check_role
# Usage
@router.delete("/posts/{post_id}", dependencies=[Depends(require_role(Role.MODERATOR))])
async def moderate_delete_post(post_id: int, db: AsyncSession = Depends(async_get_db)):
await crud_posts.delete(db=db, id=post_id)
return {"message": "Post deleted by moderator"}
```
### Feature Flags and Permissions
For gradual feature rollouts:
```python
async def has_feature_access(user: dict, feature: str, db: AsyncSession) -> bool:
"""Check if user has access to a specific feature."""
# Check feature flags
feature_flag = await crud_feature_flags.get(db=db, name=feature)
    if not feature_flag or not feature_flag["enabled"]:
        return False
    # Check user tier permissions
    if feature_flag["requires_tier"]:
        tier_id = user.get("tier_id")
        if not tier_id:
            return False
        tier = await crud_tiers.get(db=db, id=tier_id)
        if not tier or tier["level"] < feature_flag["minimum_tier_level"]:
            return False
    # Check beta user status
    if feature_flag["beta_only"]:
return user.get("is_beta_user", False)
return True
# Feature flag dependency
def require_feature(feature_name: str):
"""Factory for feature flag dependencies."""
async def check_feature_access(
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db)
) -> dict:
if not await has_feature_access(current_user, feature_name, db):
raise HTTPException(
status_code=403,
detail=f"Access to {feature_name} feature not available"
)
return current_user
return check_feature_access
@router.get("/beta-feature", dependencies=[Depends(require_feature("beta_analytics"))])
async def get_beta_analytics():
return {"analytics": "beta_data"}
```
This comprehensive permissions system provides flexible, secure authorization patterns that can be adapted to your specific application requirements while maintaining security best practices.

# User Management
User management forms the core of any authentication system, handling everything from user registration and login to profile updates and account deletion. This section covers the complete user lifecycle with secure authentication flows and administrative operations.
## Understanding User Lifecycle
The user lifecycle in the boilerplate follows a secure, well-defined process that protects user data while providing a smooth experience. Understanding this flow helps you customize the system for your specific needs.
**Registration → Authentication → Profile Management → Administrative Operations**
Each stage has specific security considerations and business logic that ensure data integrity and user safety.
## User Registration
User registration is the entry point to your application. The process must be secure, user-friendly, and prevent common issues like duplicate accounts or weak passwords.
### Registration Process
The registration endpoint performs several validation steps before creating a user account. This multi-step validation prevents common registration issues and ensures data quality.
```python
# User registration endpoint
@router.post("/user", response_model=UserRead, status_code=201)
async def write_user(
user: UserCreate,
    db: AsyncSession = Depends(async_get_db)
) -> UserRead:
# 1. Check if email exists
email_row = await crud_users.exists(db=db, email=user.email)
if email_row:
raise DuplicateValueException("Email is already registered")
# 2. Check if username exists
username_row = await crud_users.exists(db=db, username=user.username)
if username_row:
raise DuplicateValueException("Username not available")
# 3. Hash password
user_internal_dict = user.model_dump()
user_internal_dict["hashed_password"] = get_password_hash(
password=user_internal_dict["password"]
)
del user_internal_dict["password"]
# 4. Create user
user_internal = UserCreateInternal(**user_internal_dict)
created_user = await crud_users.create(db=db, object=user_internal)
return created_user
```
**Security Steps Explained:**
1. **Email Uniqueness**: Prevents multiple accounts with the same email, which could cause confusion and security issues
2. **Username Uniqueness**: Ensures usernames are unique identifiers within your system
3. **Password Hashing**: Converts plain text passwords into secure hashes before database storage
4. **Data Separation**: Plain text passwords are immediately removed from memory after hashing
### Registration Schema
The registration schema defines what data is required and how it's validated. This ensures consistent data quality and prevents malformed user accounts.
```python
# User registration input
class UserCreate(UserBase):
model_config = ConfigDict(extra="forbid")
password: Annotated[
str,
Field(
            pattern=r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[^a-zA-Z0-9]).{8,}$",
examples=["Str1ngst!"]
)
]
# Internal schema for database storage
class UserCreateInternal(UserBase):
hashed_password: str
```
**Schema Design Principles:**
- **`extra="forbid"`**: Rejects unexpected fields, preventing injection of unauthorized data
- **Password Patterns**: Enforces minimum security requirements for passwords
- **Separation of Concerns**: External schema accepts passwords, internal schema stores hashes
## User Authentication
Authentication verifies user identity using credentials. The process must be secure against common attacks while remaining user-friendly.
### Authentication Process
```python
async def authenticate_user(username_or_email: str, password: str, db: AsyncSession) -> dict | Literal[False]:
# 1. Get user by email or username
if "@" in username_or_email:
db_user = await crud_users.get(db=db, email=username_or_email, is_deleted=False)
else:
db_user = await crud_users.get(db=db, username=username_or_email, is_deleted=False)
if not db_user:
return False
# 2. Verify password
if not await verify_password(password, db_user["hashed_password"]):
return False
return db_user
```
**Security Considerations:**
- **Flexible Login**: Accepts both username and email for better user experience
- **Soft Delete Check**: `is_deleted=False` prevents deleted users from logging in
- **Consistent Timing**: Both user lookup and password verification take similar time
### Password Security
Password security is critical for protecting user accounts. The system uses industry-standard bcrypt hashing with automatic salt generation.
```python
import bcrypt
async def verify_password(plain_password: str, hashed_password: str) -> bool:
"""Verify a plain password against its hash."""
correct_password: bool = bcrypt.checkpw(
plain_password.encode(),
hashed_password.encode()
)
return correct_password
def get_password_hash(password: str) -> str:
"""Generate password hash with salt."""
hashed_password: str = bcrypt.hashpw(
password.encode(),
bcrypt.gensalt()
).decode()
return hashed_password
```
**Why bcrypt?**
- **Adaptive Hashing**: Computationally expensive, making brute force attacks impractical
- **Automatic Salt**: Each password gets a unique salt, preventing rainbow table attacks
- **Future-Proof**: Can increase computational cost as hardware improves
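The salt-plus-adaptive-cost idea can be demonstrated with the standard library's `scrypt` (the boilerplate itself uses bcrypt, as shown above; the cost parameters here are illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique salt per password (defeats rainbow tables)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("Str1ngst!")
print(check_password("Str1ngst!", salt, digest))
```

Because every hash carries its own salt, hashing the same password twice yields different digests.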
### Login Validation
Client-side validation provides immediate feedback but should never be the only validation layer.
```python
# Password validation pattern (server-side)
PASSWORD_PATTERN = r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[^a-zA-Z0-9]).{8,}$"
```

```javascript
// Frontend validation (example)
function validatePassword(password) {
    const minLength = password.length >= 8;
    const hasNumber = /[0-9]/.test(password);
    const hasUpper = /[A-Z]/.test(password);
    const hasLower = /[a-z]/.test(password);
    const hasSpecial = /[^a-zA-Z0-9]/.test(password);
    return minLength && hasNumber && hasUpper && hasLower && hasSpecial;
}
```
**Validation Strategy:**
- **Server-Side**: Always validate on the server - client validation can be bypassed
- **Client-Side**: Provides immediate feedback for better user experience
- **Progressive**: Validate as user types to catch issues early
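A server-side check that requires every character class simultaneously can be written with lookaheads; a sketch (this exact pattern is an assumption, not taken from the boilerplate):

```python
import re

PASSWORD_PATTERN = r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[^a-zA-Z0-9]).{8,}$"

def is_valid_password(password: str) -> bool:
    """True only if length >= 8 and all four character classes appear."""
    return re.match(PASSWORD_PATTERN, password) is not None

print(is_valid_password("Str1ngst!"))  # meets every requirement
```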
## Profile Management
Profile management allows users to update their information while maintaining security and data integrity.
### Get Current User Profile
Retrieving the current user's profile is a fundamental operation that should be fast and secure.
```python
@router.get("/user/me/", response_model=UserRead)
async def read_users_me(current_user: dict = Depends(get_current_user)) -> dict:
return current_user
```

```javascript
// Frontend usage
async function getCurrentUser() {
    const token = localStorage.getItem('access_token');
    const response = await fetch('/api/v1/user/me/', {
        headers: {
            'Authorization': `Bearer ${token}`
        }
    });
    if (response.ok) {
        return await response.json();
    }
    throw new Error('Failed to get user profile');
}
```
**Design Decisions:**
- **`/me` Endpoint**: Common pattern that's intuitive for users and developers
- **Current User Dependency**: Automatically handles authentication and user lookup
- **Minimal Data**: Returns only safe, user-relevant information
### Update User Profile
Profile updates require careful validation to prevent unauthorized changes and maintain data integrity.
```python
@router.patch("/user/{username}")
async def patch_user(
values: UserUpdate,
username: str,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db),
) -> dict[str, str]:
# 1. Get user from database
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if db_user is None:
raise NotFoundException("User not found")
# 2. Check ownership (users can only update their own profile)
if db_user["username"] != current_user["username"]:
raise ForbiddenException("Cannot update other users")
# 3. Validate unique constraints
if values.username and values.username != db_user["username"]:
existing_username = await crud_users.exists(db=db, username=values.username)
if existing_username:
raise DuplicateValueException("Username not available")
if values.email and values.email != db_user["email"]:
existing_email = await crud_users.exists(db=db, email=values.email)
if existing_email:
raise DuplicateValueException("Email is already registered")
# 4. Update user
await crud_users.update(db=db, object=values, username=username)
return {"message": "User updated"}
```
**Security Measures:**
1. **Ownership Verification**: Users can only update their own profiles
2. **Uniqueness Checks**: Prevents conflicts when changing username/email
3. **Partial Updates**: Only provided fields are updated
4. **Input Validation**: Pydantic schemas validate all input data
## User Deletion
User deletion requires careful consideration of data retention, user rights, and system integrity.
### Self-Deletion
Users should be able to delete their own accounts, but the process should be secure and potentially reversible.
```python
@router.delete("/user/{username}")
async def erase_user(
username: str,
current_user: dict = Depends(get_current_user),
db: AsyncSession = Depends(async_get_db),
token: str = Depends(oauth2_scheme),
) -> dict[str, str]:
# 1. Get user from database
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if not db_user:
raise NotFoundException("User not found")
# 2. Check ownership
if username != current_user["username"]:
raise ForbiddenException()
# 3. Soft delete user
await crud_users.delete(db=db, username=username)
# 4. Blacklist current token
await blacklist_token(token=token, db=db)
return {"message": "User deleted"}
```
**Soft Delete Benefits:**
- **Data Recovery**: Users can be restored if needed
- **Audit Trail**: Maintain records for compliance
- **Relationship Integrity**: Related data (posts, comments) remain accessible
- **Gradual Cleanup**: Allow time for data migration or backup
### Admin Deletion (Hard Delete)
Administrators may need to permanently remove users in specific circumstances.
```python
@router.delete("/db_user/{username}", dependencies=[Depends(get_current_superuser)])
async def erase_db_user(
username: str,
db: AsyncSession = Depends(async_get_db),
token: str = Depends(oauth2_scheme),
) -> dict[str, str]:
# 1. Check if user exists
db_user = await crud_users.exists(db=db, username=username)
if not db_user:
raise NotFoundException("User not found")
# 2. Hard delete from database
await crud_users.db_delete(db=db, username=username)
# 3. Blacklist current token
await blacklist_token(token=token, db=db)
return {"message": "User deleted from the database"}
```
**When to Use Hard Delete:**
- **Legal Requirements**: GDPR "right to be forgotten" requests
- **Data Breach Response**: Complete removal of compromised accounts
- **Spam/Abuse**: Permanent removal of malicious accounts
## Administrative Operations
### List All Users
```python
@router.get("/users", response_model=PaginatedListResponse[UserRead])
async def read_users(
db: AsyncSession = Depends(async_get_db),
page: int = 1,
items_per_page: int = 10
) -> dict:
users_data = await crud_users.get_multi(
db=db,
offset=compute_offset(page, items_per_page),
limit=items_per_page,
is_deleted=False,
)
response: dict[str, Any] = paginated_response(
crud_data=users_data,
page=page,
items_per_page=items_per_page
)
return response
```
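The `compute_offset` helper used above is simple pagination arithmetic for 1-indexed pages; a sketch consistent with that usage:

```python
def compute_offset(page: int, items_per_page: int) -> int:
    """Rows to skip so that `page` starts at the right record (pages are 1-indexed)."""
    return (page - 1) * items_per_page

print(compute_offset(3, 10))  # page 3 with 10 per page skips 20 rows
```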
### Get User by Username
```python
@router.get("/user/{username}", response_model=UserRead)
async def read_user(
username: str,
db: AsyncSession = Depends(async_get_db)
) -> UserRead:
db_user = await crud_users.get(
db=db,
username=username,
is_deleted=False,
schema_to_select=UserRead
)
if db_user is None:
raise NotFoundException("User not found")
return db_user
```
### User with Tier Information
```python
@router.get("/user/{username}/tier")
async def read_user_tier(
username: str,
db: AsyncSession = Depends(async_get_db)
) -> dict | None:
# 1. Get user
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if db_user is None:
raise NotFoundException("User not found")
# 2. Return None if no tier assigned
if db_user["tier_id"] is None:
return None
# 3. Get tier information
db_tier = await crud_tiers.get(db=db, id=db_user["tier_id"], schema_to_select=TierRead)
if not db_tier:
raise NotFoundException("Tier not found")
# 4. Combine user and tier data
user_dict = dict(db_user) # Convert to dict if needed
tier_dict = dict(db_tier) # Convert to dict if needed
for key, value in tier_dict.items():
user_dict[f"tier_{key}"] = value
return user_dict
```
## User Tiers and Permissions
### Assign User Tier
```python
@router.patch("/user/{username}/tier", dependencies=[Depends(get_current_superuser)])
async def patch_user_tier(
username: str,
values: UserTierUpdate,
db: AsyncSession = Depends(async_get_db)
) -> dict[str, str]:
# 1. Verify user exists
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if db_user is None:
raise NotFoundException("User not found")
# 2. Verify tier exists
tier_exists = await crud_tiers.exists(db=db, id=values.tier_id)
if not tier_exists:
raise NotFoundException("Tier not found")
# 3. Update user tier
await crud_users.update(db=db, object=values, username=username)
return {"message": "User tier updated"}
# Tier update schema
class UserTierUpdate(BaseModel):
tier_id: int
```
### User Rate Limits
```python
@router.get("/user/{username}/rate_limits", dependencies=[Depends(get_current_superuser)])
async def read_user_rate_limits(
username: str,
db: AsyncSession = Depends(async_get_db)
) -> dict[str, Any]:
# 1. Get user
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if db_user is None:
raise NotFoundException("User not found")
user_dict = dict(db_user) # Convert to dict if needed
# 2. No tier assigned
if db_user["tier_id"] is None:
user_dict["tier_rate_limits"] = []
return user_dict
# 3. Get tier and rate limits
db_tier = await crud_tiers.get(db=db, id=db_user["tier_id"], schema_to_select=TierRead)
if db_tier is None:
raise NotFoundException("Tier not found")
db_rate_limits = await crud_rate_limits.get_multi(db=db, tier_id=db_tier["id"])
user_dict["tier_rate_limits"] = db_rate_limits["data"]
return user_dict
```
## User Model Structure
### Database Model
```python
class User(Base):
__tablename__ = "user"
id: Mapped[int] = mapped_column(primary_key=True)
name: Mapped[str] = mapped_column(String(30))
username: Mapped[str] = mapped_column(String(20), unique=True, index=True)
email: Mapped[str] = mapped_column(String(50), unique=True, index=True)
hashed_password: Mapped[str]
profile_image_url: Mapped[str] = mapped_column(default="https://www.profileimageurl.com")
is_superuser: Mapped[bool] = mapped_column(default=False)
tier_id: Mapped[int | None] = mapped_column(ForeignKey("tier.id"), default=None)
# Timestamps
created_at: Mapped[datetime] = mapped_column(default=datetime.utcnow)
updated_at: Mapped[datetime | None] = mapped_column(default=None)
# Soft delete
is_deleted: Mapped[bool] = mapped_column(default=False)
deleted_at: Mapped[datetime | None] = mapped_column(default=None)
# Relationships
tier: Mapped["Tier"] = relationship(back_populates="users")
posts: Mapped[list["Post"]] = relationship(back_populates="created_by_user")
```
### User Schemas
```python
# Base schema with common fields
class UserBase(BaseModel):
name: Annotated[str, Field(min_length=2, max_length=30)]
username: Annotated[str, Field(min_length=2, max_length=20, pattern=r"^[a-z0-9]+$")]
email: Annotated[EmailStr, Field(examples=["user@example.com"])]
# Reading user data (API responses)
class UserRead(BaseModel):
id: int
name: str
username: str
email: str
profile_image_url: str
tier_id: int | None
# Full user data (internal use)
class User(TimestampSchema, UserBase, UUIDSchema, PersistentDeletion):
profile_image_url: str = "https://www.profileimageurl.com"
hashed_password: str
is_superuser: bool = False
tier_id: int | None = None
```
## Common User Operations
### Check User Existence
```python
# By email
email_exists = await crud_users.exists(db=db, email="user@example.com")
# By username
username_exists = await crud_users.exists(db=db, username="johndoe")
# By ID
user_exists = await crud_users.exists(db=db, id=123)
```
### Search Users
```python
# Get active users only
active_users = await crud_users.get_multi(
db=db,
is_deleted=False,
limit=10
)
# Get users by tier
tier_users = await crud_users.get_multi(
db=db,
tier_id=1,
is_deleted=False
)
# Get superusers
superusers = await crud_users.get_multi(
db=db,
is_superuser=True,
is_deleted=False
)
```
### User Statistics
```python
async def get_user_stats(db: AsyncSession) -> dict:
# Total users
total_users = await crud_users.count(db=db, is_deleted=False)
# Active users (logged in recently)
# This would require tracking last_login_at
# Users by tier
tier_stats = {}
tiers = await crud_tiers.get_multi(db=db)
for tier in tiers["data"]:
count = await crud_users.count(db=db, tier_id=tier["id"], is_deleted=False)
tier_stats[tier["name"]] = count
return {
"total_users": total_users,
"tier_distribution": tier_stats
}
```
## Frontend Integration
### Complete User Management Component
```javascript
class UserManager {
constructor(baseUrl = '/api/v1') {
this.baseUrl = baseUrl;
this.token = localStorage.getItem('access_token');
}
async register(userData) {
const response = await fetch(`${this.baseUrl}/user`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify(userData)
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail);
}
return await response.json();
}
async login(username, password) {
const response = await fetch(`${this.baseUrl}/login`, {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
},
body: new URLSearchParams({
username: username,
password: password
})
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail);
}
const tokens = await response.json();
localStorage.setItem('access_token', tokens.access_token);
this.token = tokens.access_token;
return tokens;
}
async getProfile() {
const response = await fetch(`${this.baseUrl}/user/me/`, {
headers: {
'Authorization': `Bearer ${this.token}`
}
});
if (!response.ok) {
throw new Error('Failed to get profile');
}
return await response.json();
}
async updateProfile(username, updates) {
const response = await fetch(`${this.baseUrl}/user/${username}`, {
method: 'PATCH',
headers: {
'Authorization': `Bearer ${this.token}`,
'Content-Type': 'application/json'
},
body: JSON.stringify(updates)
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail);
}
return await response.json();
}
async deleteAccount(username) {
const response = await fetch(`${this.baseUrl}/user/${username}`, {
method: 'DELETE',
headers: {
'Authorization': `Bearer ${this.token}`
}
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail);
}
// Clear local storage
localStorage.removeItem('access_token');
this.token = null;
return await response.json();
}
async logout() {
const response = await fetch(`${this.baseUrl}/logout`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${this.token}`
}
});
// Clear local storage regardless of response
localStorage.removeItem('access_token');
this.token = null;
if (response.ok) {
return await response.json();
}
}
}
// Usage
const userManager = new UserManager();
// Register new user
try {
const user = await userManager.register({
name: "John Doe",
username: "johndoe",
email: "john@example.com",
password: "SecurePass123!"
});
console.log('User registered:', user);
} catch (error) {
console.error('Registration failed:', error.message);
}
// Login
try {
const tokens = await userManager.login('johndoe', 'SecurePass123!');
console.log('Login successful');
// Get profile
const profile = await userManager.getProfile();
console.log('User profile:', profile);
} catch (error) {
console.error('Login failed:', error.message);
}
```
## Security Considerations
### Input Validation
```python
# Server-side validation
class UserCreate(UserBase):
password: Annotated[
str,
Field(
min_length=8,
            pattern=r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[@$!%*?&])[A-Za-z\d@$!%*?&]+$",
description="Password must contain uppercase, lowercase, number, and special character"
)
]
```
### Rate Limiting
```python
# Protect registration endpoint
@router.post("/user", dependencies=[Depends(rate_limiter_dependency)])
async def write_user(user: UserCreate, db: AsyncSession = Depends(async_get_db)):
# Registration logic
pass
# Protect login endpoint
@router.post("/login", dependencies=[Depends(rate_limiter_dependency)])
async def login_for_access_token():
# Login logic
pass
```
### Data Sanitization
```python
def sanitize_user_input(user_data: dict) -> dict:
"""Sanitize user input to prevent XSS and injection."""
import html
sanitized = {}
for key, value in user_data.items():
if isinstance(value, str):
# HTML escape
sanitized[key] = html.escape(value.strip())
else:
sanitized[key] = value
return sanitized
```
## Next Steps
Now that you understand user management:
1. **[Permissions](permissions.md)** - Learn about role-based access control and authorization
2. **[Production Guide](../production.md)** - Implement production-grade security measures
3. **[JWT Tokens](jwt-tokens.md)** - Review token management if needed
User management provides the core functionality for authentication systems. Master these patterns before implementing advanced permission systems.
## Common Authentication Tasks
### Protect New Endpoints
```python
# Add authentication dependency to your router
@router.get("/my-endpoint")
async def my_endpoint(current_user: dict = Depends(get_current_user)):
# Endpoint now requires authentication
return {"user_specific_data": f"Hello {current_user['username']}"}
# Optional authentication for public endpoints
@router.get("/public-endpoint")
async def public_endpoint(user: dict | None = Depends(get_optional_user)):
if user:
return {"message": f"Hello {user['username']}", "premium_features": True}
return {"message": "Hello anonymous user", "premium_features": False}
```
### Complete Authentication Flow
```python
# 1. User registration
user_data = UserCreate(
name="John Doe",
username="johndoe",
email="john@example.com",
password="SecurePassword123!"
)
user = await crud_users.create(db=db, object=user_data)
# 2. User login
form_data = {"username": "johndoe", "password": "SecurePassword123!"}
user = await authenticate_user(form_data["username"], form_data["password"], db)
# 3. Token generation (handled in login endpoint)
access_token = await create_access_token(data={"sub": user["username"]})
refresh_token = await create_refresh_token(data={"sub": user["username"]})
# 4. API access with token
headers = {"Authorization": f"Bearer {access_token}"}
response = requests.get("/api/v1/users/me", headers=headers)
# 5. Token refresh when access token expires
response = requests.post("/api/v1/refresh") # Uses refresh token cookie
new_access_token = response.json()["access_token"]
# 6. Secure logout (blacklists both tokens)
await logout_user(access_token=access_token, refresh_token=refresh_token, db=db)
```
### Check User Permissions
```python
def check_user_permission(user: dict, required_tier: str | None = None):
"""Check if user has required permissions."""
if not user.get("is_active", True):
raise UnauthorizedException("User account is disabled")
if required_tier and user.get("tier", {}).get("name") != required_tier:
raise ForbiddenException(f"Requires {required_tier} tier")
# Usage in endpoint
@router.get("/premium-feature")
async def premium_feature(current_user: dict = Depends(get_current_user)):
check_user_permission(current_user, "Pro")
return {"premium_data": "exclusive_content"}
```
### Custom Authentication Logic
```python
async def get_user_with_posts(
    current_user: dict = Depends(get_current_user),
    db: AsyncSession = Depends(async_get_db),
):
"""Custom dependency that adds user's posts."""
posts = await crud_posts.get_multi(db=db, created_by_user_id=current_user["id"])
current_user["posts"] = posts
return current_user
# Usage
@router.get("/dashboard")
async def get_dashboard(user_with_posts: dict = Depends(get_user_with_posts)):
return {
"user": user_with_posts,
"post_count": len(user_with_posts["posts"])
}
```

# Background Tasks
The boilerplate includes a robust background task system built on ARQ (Async Redis Queue) for handling long-running operations asynchronously. This enables your API to remain responsive while processing intensive tasks in the background.
## Overview
Background tasks are essential for operations that:
- **Take longer than 2 seconds** to complete
- **Don't block user interactions** in your frontend
- **Can be processed asynchronously** without immediate user feedback
- **Require intensive computation** or external API calls
## Quick Example
```python
# Define a background task
async def send_welcome_email(ctx: Worker, user_id: int, email: str) -> str:
# Send email logic here
await send_email_service(email, "Welcome!")
return f"Welcome email sent to {email}"
# Enqueue the task from an API endpoint
@router.post("/users/", response_model=UserRead)
async def create_user(user_data: UserCreate, db: AsyncSession = Depends(async_get_db)):
# Create user in database
user = await crud_users.create(db=db, object=user_data)
# Queue welcome email in background
await queue.pool.enqueue_job("send_welcome_email", user["id"], user["email"])
return user
```
## Architecture
### ARQ Worker System
- **Redis-Based**: Uses Redis as the message broker for job queues
- **Async Processing**: Fully asynchronous task execution
- **Worker Pool**: Multiple workers can process tasks concurrently
- **Job Persistence**: Tasks survive application restarts
### Task Lifecycle
1. **Enqueue**: Tasks are added to Redis queue from API endpoints
2. **Processing**: ARQ workers pick up and execute tasks
3. **Results**: Task results are stored and can be retrieved
4. **Monitoring**: Track task status and execution history
## Key Features
**Scalable Processing**
- Multiple worker instances for high throughput
- Automatic load balancing across workers
- Configurable concurrency per worker
**Reliable Execution**
- Task retry mechanisms for failed jobs
- Dead letter queues for problematic tasks
- Graceful shutdown and task cleanup
**Database Integration**
- Shared database sessions with main application
- CRUD operations available in background tasks
- Transaction management and error handling
## Common Use Cases
- **Email Processing**: Welcome emails, notifications, newsletters
- **File Operations**: Image processing, PDF generation, file uploads
- **External APIs**: Third-party integrations, webhooks, data sync
- **Data Processing**: Report generation, analytics, batch operations
- **ML/AI Tasks**: Model inference, data analysis, predictions
## Getting Started
The boilerplate provides everything needed to start using background tasks immediately. Simply define your task functions, register them in the worker settings, and enqueue them from your API endpoints.
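The three steps above — define, register, enqueue — can be sketched in one place. This is a minimal outline, not the boilerplate's exact code: the task body and setting values are illustrative, while `RedisSettings`, `WorkerSettings.functions`, and the `arq` CLI invocation are standard ARQ conventions.

```python
# worker.py -- define tasks and register them with ARQ
from arq.connections import RedisSettings

async def send_welcome_email(ctx, user_id: int, email: str) -> str:
    # Task body runs inside the worker process, not the API process
    return f"Welcome email sent to {email}"

class WorkerSettings:
    functions = [send_welcome_email]  # register every task function here
    redis_settings = RedisSettings(host="localhost", port=6379)

# Start the worker process with:  arq worker.WorkerSettings
```

Once the worker is running, any endpoint can enqueue a registered task by name, as shown in the Quick Example above.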
## Configuration
Basic Redis queue configuration:
```bash
# Redis Queue Settings
REDIS_QUEUE_HOST=localhost
REDIS_QUEUE_PORT=6379
```
The system automatically handles Redis connection pooling and worker lifecycle management.
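For reference, creating an ARQ pool from those environment variables looks roughly like this (a sketch: `create_pool` and `RedisSettings` are ARQ's API, but the helper name and exact settings wiring in the boilerplate may differ):

```python
import os

from arq import create_pool
from arq.connections import RedisSettings

async def create_queue_pool():
    """Build an ARQ Redis pool from the queue settings above."""
    settings = RedisSettings(
        host=os.getenv("REDIS_QUEUE_HOST", "localhost"),
        port=int(os.getenv("REDIS_QUEUE_PORT", "6379")),
    )
    return await create_pool(settings)  # reused by enqueue_job calls
```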
## Next Steps
Check the [ARQ documentation](https://arq-docs.helpmanual.io/) for advanced usage patterns and refer to the boilerplate's example implementation in `src/app/core/worker/` and `src/app/api/v1/tasks.py`.

# Cache Strategies
Effective cache strategies balance performance gains with data consistency. This section covers invalidation patterns, cache warming, and optimization techniques for building robust caching systems.
## Cache Invalidation Strategies
Cache invalidation is one of the hardest problems in computer science. The boilerplate provides several strategies to handle different scenarios while maintaining data consistency.
### Understanding Cache Invalidation
**Cache invalidation** ensures that cached data doesn't become stale when the underlying data changes. Poor invalidation leads to users seeing outdated information, while over-aggressive invalidation negates caching benefits.
### Basic Invalidation Patterns
#### Time-Based Expiration (TTL)
The simplest strategy relies on cache expiration times:
```python
# Set different TTL based on data characteristics
# (each decorator applies to its own endpoint)
@cache(key_prefix="user_profile", expiration=3600)  # 1 hour for profiles
async def read_user_profile(request: Request, user_id: int): ...

@cache(key_prefix="post_content", expiration=1800)  # 30 min for posts
async def read_post(request: Request, post_id: int): ...

@cache(key_prefix="live_stats", expiration=60)  # 1 min for live data
async def read_live_stats(request: Request): ...
```
**Pros:**
- Simple to implement and understand
- Guarantees cache freshness within TTL period
- Works well for data with predictable change patterns
**Cons:**
- May serve stale data until TTL expires
- Difficult to optimize TTL for all scenarios
- Cache miss storms when many keys expire simultaneously
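A common mitigation for the expiry-storm problem is to add random jitter to each TTL, so keys written together do not all expire in the same instant. A small sketch (the helper name is illustrative):

```python
import random

def jittered_ttl(base_ttl: int, jitter_ratio: float = 0.1) -> int:
    """Return base_ttl shifted by up to +/- jitter_ratio of its value."""
    jitter = int(base_ttl * jitter_ratio)
    return base_ttl + random.randint(-jitter, jitter)

# Usage: @cache(key_prefix="post_content", expiration=jittered_ttl(1800))
```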
#### Write-Through Invalidation
Automatically invalidate cache when data is modified:
```python
@router.put("/posts/{post_id}")
@cache(
key_prefix="post_cache",
resource_id_name="post_id",
to_invalidate_extra={
"user_posts": "{user_id}", # User's post list
"category_posts": "{category_id}", # Category post list
"recent_posts": "global" # Global recent posts
}
)
async def update_post(
request: Request,
post_id: int,
post_data: PostUpdate,
user_id: int,
category_id: int
):
# Update triggers automatic cache invalidation
updated_post = await crud_posts.update(db=db, id=post_id, object=post_data)
return updated_post
```
**Pros:**
- Immediate consistency when data changes
- No stale data served to users
- Precise control over what gets invalidated
**Cons:**
- More complex implementation
- Can impact write performance
- Risk of over-invalidation
### Advanced Invalidation Patterns
#### Pattern-Based Invalidation
Use Redis pattern matching for bulk invalidation:
```python
@router.put("/users/{user_id}/profile")
@cache(
key_prefix="user_profile",
resource_id_name="user_id",
pattern_to_invalidate_extra=[
"user_{user_id}_*", # All user-related caches
"*_user_{user_id}_*", # Caches containing this user
"leaderboard_*", # Leaderboards might change
"search_users_*" # User search results
]
)
async def update_user_profile(request: Request, user_id: int, profile_data: ProfileUpdate):
await crud_users.update(db=db, id=user_id, object=profile_data)
return {"message": "Profile updated"}
```
**Pattern Examples:**
```python
# User-specific patterns
"user_{user_id}_posts_*" # All paginated post lists for user
"user_{user_id}_*_cache" # All cached data for user
"*_following_{user_id}" # All caches tracking this user's followers
# Content patterns
"posts_category_{category_id}_*" # All posts in category
"comments_post_{post_id}_*" # All comments for post
"search_*_{query}" # All search results for query
# Time-based patterns
"daily_stats_*" # All daily statistics
"hourly_*" # All hourly data
"temp_*" # Temporary cache entries
```
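Under the hood, placeholders such as `{user_id}` in these templates are filled from the endpoint's arguments before the matching keys are deleted (typically via Redis `SCAN` plus `DEL`). A minimal sketch of the template-filling step — the helper is illustrative, not the decorator's actual internals:

```python
def render_patterns(templates: list[str], **ids) -> list[str]:
    """Fill endpoint argument values into invalidation pattern templates."""
    return [template.format(**ids) for template in templates]

patterns = render_patterns(
    ["user_{user_id}_*", "*_following_{user_id}", "leaderboard_*"],
    user_id=42,
)
# patterns == ["user_42_*", "*_following_42", "leaderboard_*"]
# Each rendered pattern is then handed to the Redis client, e.g.
#   async for key in redis.scan_iter(match=pattern): await redis.delete(key)
```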
## Cache Warming Strategies
Cache warming proactively loads data into cache to avoid cache misses during peak usage.
### Application Startup Warming
```python
# core/startup.py
async def warm_critical_caches():
"""Warm up critical caches during application startup."""
logger.info("Starting cache warming...")
# Warm up reference data
await warm_reference_data()
# Warm up popular content
await warm_popular_content()
# Warm up user session data for active users
await warm_active_user_data()
logger.info("Cache warming completed")
async def warm_reference_data():
"""Warm up reference data that rarely changes."""
# Countries, currencies, timezones, etc.
reference_data = await crud_reference.get_all_countries()
for country in reference_data:
cache_key = f"country:{country['code']}"
await cache.client.set(cache_key, json.dumps(country), ex=86400) # 24 hours
# Categories
categories = await crud_categories.get_all()
await cache.client.set("all_categories", json.dumps(categories), ex=3600)
async def warm_popular_content():
"""Warm up frequently accessed content."""
# Most viewed posts
popular_posts = await crud_posts.get_popular(limit=100)
for post in popular_posts:
cache_key = f"post_cache:{post['id']}"
await cache.client.set(cache_key, json.dumps(post), ex=1800)
# Trending topics
trending = await crud_posts.get_trending_topics(limit=50)
await cache.client.set("trending_topics", json.dumps(trending), ex=600)
async def warm_active_user_data():
"""Warm up data for recently active users."""
# Get users active in last 24 hours
active_users = await crud_users.get_recently_active(hours=24)
for user in active_users:
# Warm user profile
profile_key = f"user_profile:{user['id']}"
await cache.client.set(profile_key, json.dumps(user), ex=3600)
# Warm user's recent posts
user_posts = await crud_posts.get_user_posts(user['id'], limit=10)
posts_key = f"user_{user['id']}_posts:page_1"
await cache.client.set(posts_key, json.dumps(user_posts), ex=1800)
# Add to startup events
@app.on_event("startup")
async def startup_event():
await create_redis_cache_pool()
await warm_critical_caches()
```
These cache strategies provide a comprehensive approach to building performant, consistent caching systems that scale with your application's needs while maintaining data integrity.

# Client Cache
Client-side caching leverages HTTP cache headers to instruct browsers and CDNs to cache responses locally. This reduces server load and improves user experience by serving cached content directly from the client.
## Understanding Client Caching
Client caching works by setting HTTP headers that tell browsers, proxies, and CDNs how long they should cache responses. When implemented correctly, subsequent requests for the same resource are served instantly from the local cache.
### Benefits of Client Caching
**Reduced Latency**: Instant response from local cache eliminates network round trips
**Lower Server Load**: Fewer requests reach your server infrastructure
**Bandwidth Savings**: Cached responses don't consume network bandwidth
**Better User Experience**: Faster page loads and improved responsiveness
**Cost Reduction**: Lower server resource usage and bandwidth costs
## Cache-Control Headers
The `Cache-Control` header is the primary mechanism for controlling client-side caching behavior.
### Header Components
```http
Cache-Control: public, max-age=3600, s-maxage=7200, must-revalidate
```
**Directive Breakdown:**
- **`public`**: Response can be cached by any cache (browsers, CDNs, proxies)
- **`private`**: Response can only be cached by browsers, not shared caches
- **`max-age=3600`**: Cache for 3600 seconds (1 hour) in browsers
- **`s-maxage=7200`**: Cache for 7200 seconds (2 hours) in shared caches (CDNs)
- **`must-revalidate`**: Must check with server when cache expires
- **`no-cache`**: Must revalidate with server before using cached response
- **`no-store`**: Must not store any part of the response
### Common Cache Patterns
```python
# Static assets (images, CSS, JS)
"Cache-Control: public, max-age=31536000, immutable" # 1 year
# API data that changes rarely
"Cache-Control: public, max-age=3600" # 1 hour
# User-specific data
"Cache-Control: private, max-age=1800" # 30 minutes, browser only
# Real-time data
"Cache-Control: no-cache, must-revalidate" # Always validate
# Sensitive data
"Cache-Control: no-store, no-cache, must-revalidate" # Never cache
```
## Middleware Implementation
The boilerplate includes middleware that automatically adds cache headers to responses.
### ClientCacheMiddleware
```python
# middleware/client_cache_middleware.py
from fastapi import FastAPI, Request, Response
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
class ClientCacheMiddleware(BaseHTTPMiddleware):
"""Middleware to set Cache-Control headers for client-side caching."""
def __init__(self, app: FastAPI, max_age: int = 60) -> None:
super().__init__(app)
self.max_age = max_age
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
response: Response = await call_next(request)
response.headers["Cache-Control"] = f"public, max-age={self.max_age}"
return response
```
### Adding Middleware to Application
```python
# main.py
from fastapi import FastAPI
from app.middleware.client_cache_middleware import ClientCacheMiddleware
app = FastAPI()
# Add client caching middleware
app.add_middleware(
ClientCacheMiddleware,
max_age=300 # 5 minutes default cache
)
```
### Custom Middleware Configuration
```python
class AdvancedClientCacheMiddleware(BaseHTTPMiddleware):
"""Advanced client cache middleware with path-specific configurations."""
def __init__(
self,
app: FastAPI,
default_max_age: int = 300,
        path_configs: dict[str, dict] | None = None
):
super().__init__(app)
self.default_max_age = default_max_age
self.path_configs = path_configs or {}
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
response = await call_next(request)
# Get path-specific configuration
cache_config = self._get_cache_config(request.url.path)
# Set cache headers based on configuration
if cache_config.get("no_cache", False):
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "0"
else:
max_age = cache_config.get("max_age", self.default_max_age)
visibility = "private" if cache_config.get("private", False) else "public"
cache_control = f"{visibility}, max-age={max_age}"
if cache_config.get("must_revalidate", False):
cache_control += ", must-revalidate"
if cache_config.get("immutable", False):
cache_control += ", immutable"
response.headers["Cache-Control"] = cache_control
return response
def _get_cache_config(self, path: str) -> dict:
"""Get cache configuration for a specific path."""
for pattern, config in self.path_configs.items():
if path.startswith(pattern):
return config
return {}
# Usage with path-specific configurations
app.add_middleware(
AdvancedClientCacheMiddleware,
default_max_age=300,
path_configs={
"/api/v1/static/": {"max_age": 31536000, "immutable": True}, # 1 year for static assets
"/api/v1/auth/": {"no_cache": True}, # No cache for auth endpoints
"/api/v1/users/me": {"private": True, "max_age": 900}, # 15 min private cache for user data
"/api/v1/public/": {"max_age": 1800}, # 30 min for public data
}
)
```
## Manual Cache Control
Set cache headers manually in specific endpoints for fine-grained control.
### Response Header Manipulation
```python
from fastapi import APIRouter, Response
router = APIRouter()
@router.get("/api/v1/static-data")
async def get_static_data(response: Response):
"""Endpoint with long-term caching for static data."""
# Set cache headers for static data
response.headers["Cache-Control"] = "public, max-age=86400, immutable" # 24 hours
response.headers["Last-Modified"] = "Wed, 21 Oct 2023 07:28:00 GMT"
response.headers["ETag"] = '"abc123"'
return {"data": "static content that rarely changes"}
@router.get("/api/v1/user-data")
async def get_user_data(response: Response, current_user: dict = Depends(get_current_user)):
"""Endpoint with private caching for user-specific data."""
# Private cache for user-specific data
response.headers["Cache-Control"] = "private, max-age=1800" # 30 minutes
response.headers["Vary"] = "Authorization" # Cache varies by auth header
return {"user_id": current_user["id"], "preferences": "user data"}
@router.get("/api/v1/real-time-data")
async def get_real_time_data(response: Response):
"""Endpoint that should not be cached."""
# Prevent caching for real-time data
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "0"
return {"timestamp": datetime.utcnow(), "live_data": "current status"}
```
### Conditional Caching
Implement conditional caching based on request parameters:
```python
@router.get("/api/v1/posts")
async def get_posts(
    response: Response,
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    per_page: int = 10,
    category: str | None = None,
):
"""Conditional caching based on parameters."""
# Different cache strategies based on parameters
if category:
# Category-specific data changes less frequently
response.headers["Cache-Control"] = "public, max-age=1800" # 30 minutes
elif page == 1:
# First page cached more aggressively
response.headers["Cache-Control"] = "public, max-age=600" # 10 minutes
else:
# Other pages cached for shorter duration
response.headers["Cache-Control"] = "public, max-age=300" # 5 minutes
# Add ETag for efficient revalidation
content_hash = hashlib.md5(f"{page}{per_page}{category}".encode()).hexdigest()
response.headers["ETag"] = f'"{content_hash}"'
posts = await crud_posts.get_multi(
db=db,
offset=(page - 1) * per_page,
limit=per_page,
category=category
)
return {"posts": posts, "page": page, "per_page": per_page}
```
## ETag Implementation
ETags enable efficient cache validation by allowing clients to check if content has changed.
### ETag Generation
```python
import hashlib
import json
from typing import Any
def generate_etag(data: Any) -> str:
"""Generate ETag from data content."""
content = json.dumps(data, sort_keys=True, default=str)
return hashlib.md5(content.encode()).hexdigest()
@router.get("/api/v1/users/{user_id}")
async def get_user(
request: Request,
response: Response,
user_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
"""Endpoint with ETag support for efficient caching."""
user = await crud_users.get(db=db, id=user_id)
if not user:
raise HTTPException(status_code=404, detail="User not found")
# Generate ETag from user data
etag = generate_etag(user)
# Check if client has current version
if_none_match = request.headers.get("If-None-Match")
if if_none_match == f'"{etag}"':
# Content hasn't changed, return 304 Not Modified
        return Response(status_code=304)
# Set ETag and cache headers
response.headers["ETag"] = f'"{etag}"'
response.headers["Cache-Control"] = "private, max-age=1800, must-revalidate"
return user
```
### Last-Modified Headers
Use Last-Modified headers for time-based cache validation:
```python
@router.get("/api/v1/posts/{post_id}")
async def get_post(
request: Request,
response: Response,
post_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)]
):
"""Endpoint with Last-Modified header support."""
post = await crud_posts.get(db=db, id=post_id)
if not post:
raise HTTPException(status_code=404, detail="Post not found")
# Use post's updated_at timestamp
last_modified = post["updated_at"]
# Check If-Modified-Since header
if_modified_since = request.headers.get("If-Modified-Since")
if if_modified_since:
client_time = datetime.strptime(if_modified_since, "%a, %d %b %Y %H:%M:%S GMT")
if last_modified <= client_time:
            return Response(status_code=304)
# Set Last-Modified header
response.headers["Last-Modified"] = last_modified.strftime("%a, %d %b %Y %H:%M:%S GMT")
response.headers["Cache-Control"] = "public, max-age=3600, must-revalidate"
return post
```
## Cache Strategy by Content Type
Different types of content require different caching strategies.
### Static Assets
```python
@router.get("/static/{file_path:path}")
async def serve_static(response: Response, file_path: str):
"""Serve static files with aggressive caching."""
# Static assets can be cached for a long time
response.headers["Cache-Control"] = "public, max-age=31536000, immutable" # 1 year
response.headers["Vary"] = "Accept-Encoding" # Vary by compression
# Add file-specific ETag based on file modification time
file_stat = os.stat(f"static/{file_path}")
etag = hashlib.md5(f"{file_path}{file_stat.st_mtime}".encode()).hexdigest()
response.headers["ETag"] = f'"{etag}"'
return FileResponse(f"static/{file_path}")
```
### API Responses
```python
# Reference data (rarely changes)
@router.get("/api/v1/countries")
async def get_countries(response: Response, db: Annotated[AsyncSession, Depends(async_get_db)]):
response.headers["Cache-Control"] = "public, max-age=86400" # 24 hours
return await crud_countries.get_all(db=db)
# User-generated content (moderate changes)
@router.get("/api/v1/posts")
async def get_posts(response: Response, db: Annotated[AsyncSession, Depends(async_get_db)]):
response.headers["Cache-Control"] = "public, max-age=1800" # 30 minutes
return await crud_posts.get_multi(db=db, is_deleted=False)
# Personal data (private caching only)
@router.get("/api/v1/users/me/notifications")
async def get_notifications(
    response: Response,
    db: Annotated[AsyncSession, Depends(async_get_db)],
    current_user: dict = Depends(get_current_user),
):
response.headers["Cache-Control"] = "private, max-age=300" # 5 minutes
response.headers["Vary"] = "Authorization"
return await crud_notifications.get_user_notifications(db=db, user_id=current_user["id"])
# Real-time data (no caching)
@router.get("/api/v1/system/status")
async def get_system_status(response: Response):
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
return {"status": "online", "timestamp": datetime.utcnow()}
```
## Vary Header Usage
The `Vary` header tells caches which request headers affect the response, enabling proper cache key generation.
### Common Vary Patterns
```python
# Cache varies by authorization (user-specific content)
response.headers["Vary"] = "Authorization"
# Cache varies by accepted language
response.headers["Vary"] = "Accept-Language"
# Cache varies by compression support
response.headers["Vary"] = "Accept-Encoding"
# Multiple varying headers
response.headers["Vary"] = "Authorization, Accept-Language, Accept-Encoding"
# Example implementation
@router.get("/api/v1/dashboard")
async def get_dashboard(
request: Request,
response: Response,
current_user: dict = Depends(get_current_user)
):
"""Dashboard content that varies by user and language."""
# Content varies by user (Authorization) and language preference
response.headers["Vary"] = "Authorization, Accept-Language"
response.headers["Cache-Control"] = "private, max-age=900" # 15 minutes
language = request.headers.get("Accept-Language", "en")
dashboard_data = await generate_dashboard(
user_id=current_user["id"],
language=language
)
return dashboard_data
```
## CDN Integration
Configure cache headers for optimal CDN performance.
### CDN-Specific Headers
```python
@router.get("/api/v1/public-content")
async def get_public_content(response: Response):
"""Content optimized for CDN caching."""
# Different cache times for browser vs CDN
response.headers["Cache-Control"] = "public, max-age=300, s-maxage=3600" # 5 min browser, 1 hour CDN
    # CDN-specific headers (Cloudflare Enterprise example; header names vary by CDN)
    response.headers["Cache-Tag"] = "public-content,api-v1"  # Cache tags for targeted purging
return await get_public_content_data()
```
### Cache Purging
Implement cache purging for content updates:
```python
@router.put("/api/v1/posts/{post_id}")
async def update_post(
    response: Response,
    post_id: int,
    post_data: PostUpdate,
    db: Annotated[AsyncSession, Depends(async_get_db)],
    current_user: dict = Depends(get_current_user),
):
"""Update post and invalidate related caches."""
# Update the post
updated_post = await crud_posts.update(db=db, id=post_id, object=post_data)
if not updated_post:
raise HTTPException(status_code=404, detail="Post not found")
# Set headers to indicate cache invalidation is needed
response.headers["Cache-Control"] = "no-cache"
response.headers["X-Cache-Purge"] = f"post-{post_id},user-{current_user['id']}-posts"
# In production, trigger CDN purge here
# await purge_cdn_cache([f"post-{post_id}", f"user-{current_user['id']}-posts"])
return updated_post
```
## Best Practices
### Cache Duration Guidelines
```python
# Choose appropriate cache durations based on content characteristics:
# Static assets (CSS, JS, images with versioning)
max_age = 31536000 # 1 year
# API reference data (countries, categories)
max_age = 86400 # 24 hours
# User-generated content (posts, comments)
max_age = 1800 # 30 minutes
# User-specific data (profiles, preferences)
max_age = 900 # 15 minutes
# Search results
max_age = 600 # 10 minutes
# Real-time data (live scores, chat)
max_age = 0 # No caching
```
### Security Considerations
```python
# Never cache sensitive data
@router.get("/api/v1/admin/secrets")
async def get_secrets(response: Response):
response.headers["Cache-Control"] = "no-store, no-cache, must-revalidate, private"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "0"
return {"secret": "sensitive_data"}
# Use private caching for user-specific content
@router.get("/api/v1/users/me/private-data")
async def get_private_data(response: Response):
response.headers["Cache-Control"] = "private, max-age=300, must-revalidate"
response.headers["Vary"] = "Authorization"
return {"private": "user_data"}
```
Client-side caching, when properly implemented, provides significant performance improvements while maintaining security and data freshness through intelligent cache control strategies.

# Caching
The boilerplate includes a comprehensive caching system built on Redis that improves performance through server-side caching and client-side cache control. This section covers the complete caching implementation.
## Overview
The caching system provides multiple layers of optimization:
- **Server-Side Caching**: Redis-based caching with automatic invalidation
- **Client-Side Caching**: HTTP cache headers for browser optimization
- **Cache Invalidation**: Smart invalidation strategies for data consistency
## Quick Example
```python
from app.core.utils.cache import cache
@router.get("/posts/{post_id}")
@cache(key_prefix="post_cache", expiration=3600)
async def get_post(request: Request, post_id: int):
# Cached for 1 hour, automatic invalidation on updates
return await crud_posts.get(db=db, id=post_id)
```
## Architecture
### Server-Side Caching
- **Redis Integration**: Connection pooling and async operations
- **Decorator-Based**: Simple `@cache` decorator for endpoints
- **Smart Invalidation**: Automatic cache clearing on data changes
- **Pattern Matching**: Bulk invalidation using Redis patterns
### Client-Side Caching
- **HTTP Headers**: Cache-Control headers for browser caching
- **Middleware**: Automatic header injection
- **Configurable TTL**: Customizable cache duration
## Key Features
**Automatic Cache Management**
- Caches GET requests automatically
- Invalidates cache on PUT/POST/DELETE operations
- Supports complex invalidation patterns
**Flexible Configuration**
- Per-endpoint expiration times
- Custom cache key generation
- Environment-specific Redis settings
**Performance Optimization**
- Connection pooling for Redis
- Efficient key pattern matching
- Minimal overhead for cache operations
## Getting Started
1. **[Redis Cache](redis-cache.md)** - Server-side caching with Redis
2. **[Client Cache](client-cache.md)** - Browser caching with HTTP headers
3. **[Cache Strategies](cache-strategies.md)** - Invalidation patterns and best practices
Each section provides detailed implementation examples and configuration options for building a robust caching layer.
## Configuration
Basic Redis configuration in your environment:
```bash
# Redis Cache Settings
REDIS_CACHE_HOST=localhost
REDIS_CACHE_PORT=6379
```
The caching system automatically handles connection pooling and provides efficient cache operations for your FastAPI endpoints.
## Next Steps
Start with **[Redis Cache](redis-cache.md)** to understand the core server-side caching implementation, then explore client-side caching and advanced invalidation strategies.

# Redis Cache
Redis-based server-side caching provides fast, in-memory storage for API responses. The boilerplate includes a sophisticated caching decorator that automatically handles cache storage, retrieval, and invalidation.
## Understanding Redis Caching
Redis serves as a high-performance cache layer between your API and database. When properly implemented, it can reduce response times from hundreds of milliseconds to single-digit milliseconds by serving data directly from memory.
### Why Redis?
**Performance**: In-memory storage provides sub-millisecond data access
**Scalability**: Handles thousands of concurrent connections efficiently
**Persistence**: Optional data persistence for cache warm-up after restarts
**Atomic Operations**: Thread-safe operations for concurrent applications
**Pattern Matching**: Advanced key pattern operations for bulk cache invalidation
## Cache Decorator
The `@cache` decorator provides a simple interface for adding caching to any FastAPI endpoint.
### Basic Usage
```python
from fastapi import APIRouter, Depends, Request
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.db.database import async_get_db
from app.core.utils.cache import cache

router = APIRouter()

@router.get("/posts/{post_id}")
@cache(key_prefix="post_cache", expiration=3600)
async def get_post(request: Request, post_id: int, db: AsyncSession = Depends(async_get_db)):
# This function's result will be cached for 1 hour
post = await crud_posts.get(db=db, id=post_id)
return post
```
**How It Works:**
1. **Cache Check**: On GET requests, checks Redis for existing cached data
2. **Cache Miss**: If no cache exists, executes the function and stores the result
3. **Cache Hit**: Returns cached data directly, bypassing function execution
4. **Invalidation**: Automatically removes cache on non-GET requests (POST, PUT, DELETE)
### Decorator Parameters
```python
@cache(
key_prefix: str, # Cache key prefix
resource_id_name: str = None, # Explicit resource ID parameter
expiration: int = 3600, # Cache TTL in seconds
resource_id_type: type | tuple[type, ...] = int, # Expected ID type
to_invalidate_extra: dict[str, str] = None, # Additional keys to invalidate
pattern_to_invalidate_extra: list[str] = None # Pattern-based invalidation
)
```
#### Key Prefix
The key prefix creates unique cache identifiers:
```python
# Simple prefix
@cache(key_prefix="user_data")
# Generates keys like: "user_data:123"
# Dynamic prefix with placeholders
@cache(key_prefix="{username}_posts")
# Generates keys like: "johndoe_posts:456"
# Complex prefix with multiple parameters
@cache(key_prefix="user_{user_id}_posts_page_{page}")
# Generates keys like: "user_123_posts_page_2:789"
```
#### Resource ID Handling
```python
# Automatic ID inference (looks for 'id' parameter)
@cache(key_prefix="post_cache")
async def get_post(request: Request, post_id: int):
# Uses post_id automatically
# Explicit ID parameter
@cache(key_prefix="user_cache", resource_id_name="username")
async def get_user(request: Request, username: str):
# Uses username instead of looking for 'id'
# Multiple ID types
@cache(key_prefix="search", resource_id_type=(int, str))
async def search(request: Request, query: str, page: int):
# Accepts either string or int as resource ID
```
### Advanced Caching Patterns
#### Paginated Data Caching
```python
@router.get("/users/{username}/posts")
@cache(
key_prefix="{username}_posts:page_{page}:items_per_page_{items_per_page}",
resource_id_name="username",
expiration=300 # 5 minutes for paginated data
)
async def get_user_posts(
    request: Request,
    username: str,
    db: Annotated[AsyncSession, Depends(async_get_db)],
    page: int = 1,
    items_per_page: int = 10
):
    user = await crud_users.get(db=db, username=username)
    offset = compute_offset(page, items_per_page)
    posts = await crud_posts.get_multi(
        db=db,
        offset=offset,
        limit=items_per_page,
        created_by_user_id=user["id"]
    )
    return paginated_response(posts, page, items_per_page)
```
#### Hierarchical Data Caching
```python
@router.get("/organizations/{org_id}/departments/{dept_id}/employees")
@cache(
key_prefix="org_{org_id}_dept_{dept_id}_employees",
resource_id_name="dept_id",
expiration=1800 # 30 minutes
)
async def get_department_employees(
    request: Request,
    org_id: int,
    dept_id: int,
    db: Annotated[AsyncSession, Depends(async_get_db)]
):
employees = await crud_employees.get_multi(
db=db,
department_id=dept_id,
organization_id=org_id
)
return employees
```
## Cache Invalidation
Cache invalidation ensures data consistency when the underlying data changes.
### Automatic Invalidation
The cache decorator automatically invalidates cache entries on non-GET requests:
```python
@router.put("/posts/{post_id}")
@cache(key_prefix="post_cache", resource_id_name="post_id")
async def update_post(request: Request, post_id: int, data: PostUpdate, db: Annotated[AsyncSession, Depends(async_get_db)]):
# Automatically invalidates "post_cache:123" when called with PUT/POST/DELETE
await crud_posts.update(db=db, id=post_id, object=data)
return {"message": "Post updated"}
```
### Extra Key Invalidation
Invalidate related cache entries when data changes:
```python
@router.post("/posts")
@cache(
key_prefix="new_post",
resource_id_name="user_id",
to_invalidate_extra={
"user_posts": "{user_id}", # Invalidate user's post list
"latest_posts": "global", # Invalidate global latest posts
"user_stats": "{user_id}" # Invalidate user statistics
}
)
async def create_post(request: Request, post: PostCreate, user_id: int, db: Annotated[AsyncSession, Depends(async_get_db)]):
# Creating a post invalidates related cached data
new_post = await crud_posts.create(db=db, object=post)
return new_post
```
### Pattern-Based Invalidation
Use Redis pattern matching for bulk invalidation:
```python
@router.put("/users/{user_id}/profile")
@cache(
key_prefix="user_profile",
resource_id_name="user_id",
pattern_to_invalidate_extra=[
"user_{user_id}_*", # All user-related caches
"*_user_{user_id}_*", # Caches that include this user
"search_results_*" # All search result caches
]
)
async def update_user_profile(request: Request, user_id: int, data: UserUpdate, db: Annotated[AsyncSession, Depends(async_get_db)]):
# Invalidates all matching cache patterns
await crud_users.update(db=db, id=user_id, object=data)
return {"message": "Profile updated"}
```
**Pattern Examples:**
- `user_*` - All keys starting with "user_"
- `*_posts` - All keys ending with "_posts"
- `user_*_posts_*` - Complex patterns with wildcards
- `temp_*` - Temporary cache entries
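Redis glob patterns behave much like shell wildcards, so `fnmatch` from the standard library can demonstrate which keys a given pattern would invalidate. This only illustrates the matching rules; the boilerplate performs the matching server-side with Redis `SCAN`:

```python
from fnmatch import fnmatch

keys = [
    "user_42_profile",
    "user_42_posts_page_1",
    "org_7_user_42_stats",
    "search_results_python",
    "temp_upload_abc",
]

def matching_keys(pattern: str) -> list[str]:
    """Return the cache keys a Redis-style glob pattern would match."""
    return [k for k in keys if fnmatch(k, pattern)]

assert matching_keys("user_42_*") == ["user_42_profile", "user_42_posts_page_1"]
assert matching_keys("*_user_42_*") == ["org_7_user_42_stats"]
assert matching_keys("temp_*") == ["temp_upload_abc"]
```

Note how `user_42_*` does not catch `org_7_user_42_stats`; invalidating every cache that mentions a user requires the broader `*_user_42_*` pattern as well, which is why the example above lists both.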
## Configuration
### Redis Settings
Configure Redis connection in your environment settings:
```python
# core/config.py
class RedisCacheSettings(BaseSettings):
REDIS_CACHE_HOST: str = config("REDIS_CACHE_HOST", default="localhost")
REDIS_CACHE_PORT: int = config("REDIS_CACHE_PORT", default=6379)
REDIS_CACHE_PASSWORD: str = config("REDIS_CACHE_PASSWORD", default="")
REDIS_CACHE_DB: int = config("REDIS_CACHE_DB", default=0)
REDIS_CACHE_URL: str = f"redis://:{REDIS_CACHE_PASSWORD}@{REDIS_CACHE_HOST}:{REDIS_CACHE_PORT}/{REDIS_CACHE_DB}"
```
### Environment Variables
```bash
# Basic Configuration
REDIS_CACHE_HOST=localhost
REDIS_CACHE_PORT=6379
# Production Configuration
REDIS_CACHE_HOST=redis.production.com
REDIS_CACHE_PORT=6379
REDIS_CACHE_PASSWORD=your-secure-password
REDIS_CACHE_DB=0
# Docker Compose
REDIS_CACHE_HOST=redis
REDIS_CACHE_PORT=6379
```
### Connection Pool Setup
The boilerplate automatically configures Redis connection pooling:
```python
# core/setup.py
async def create_redis_cache_pool() -> None:
"""Initialize Redis connection pool for caching."""
cache.pool = redis.ConnectionPool.from_url(
settings.REDIS_CACHE_URL,
max_connections=20, # Maximum connections in pool
retry_on_timeout=True, # Retry on connection timeout
socket_timeout=5.0, # Socket timeout in seconds
health_check_interval=30 # Health check frequency
)
cache.client = redis.Redis.from_pool(cache.pool)
```
### Cache Client Usage
Direct Redis client access for custom caching logic:
```python
from app.core.utils.cache import client
async def custom_cache_operation():
if client is None:
raise MissingClientError("Redis client not initialized")
# Set custom cache entry
await client.set("custom_key", "custom_value", ex=3600)
# Get cached value
cached_value = await client.get("custom_key")
# Delete cache entry
await client.delete("custom_key")
# Bulk operations
pipe = client.pipeline()
pipe.set("key1", "value1")
pipe.set("key2", "value2")
pipe.expire("key1", 3600)
await pipe.execute()
```
## Performance Optimization
### Connection Pooling
Connection pooling prevents the overhead of creating new Redis connections for each request:
```python
# Benefits of connection pooling:
# - Reuses existing connections
# - Handles connection failures gracefully
# - Provides connection health checks
# - Supports concurrent operations
# Pool configuration
redis.ConnectionPool.from_url(
settings.REDIS_CACHE_URL,
max_connections=20, # Adjust based on expected load
retry_on_timeout=True, # Handle network issues
socket_keepalive=True, # Keep connections alive
socket_keepalive_options={}
)
```
### Cache Key Generation
The cache decorator automatically generates keys using this pattern:
```python
# Decorator generates: "{formatted_key_prefix}:{resource_id}"
@cache(key_prefix="post_cache", resource_id_name="post_id")
# Generates: "post_cache:123"
@cache(key_prefix="{username}_posts:page_{page}")
# Generates: "johndoe_posts:page_1:456" (where 456 is the resource_id)
# The system handles key formatting automatically - you just provide the prefix template
```
**What you control:**
- `key_prefix` template with placeholders like `{username}`, `{page}`
- `resource_id_name` to specify which parameter to use as the ID
- The decorator handles the rest
**Generated key examples from the boilerplate:**
```python
# From posts.py
"{username}_posts:page_{page}:items_per_page_{items_per_page}"  ->  "john_posts:page_1:items_per_page_10:789"
"{username}_post_cache"  ->  "john_post_cache:123"
```
### Expiration Strategies
Choose appropriate expiration times based on data characteristics:
```python
# Static reference data (rarely changes)
@cache(key_prefix="countries", expiration=86400) # 24 hours
# User-generated content (changes moderately)
@cache(key_prefix="user_posts", expiration=1800) # 30 minutes
# Real-time data (changes frequently)
@cache(key_prefix="live_stats", expiration=60) # 1 minute
# Search results (can be stale)
@cache(key_prefix="search", expiration=3600) # 1 hour
```
This comprehensive Redis caching system provides high-performance data access while maintaining data consistency through intelligent invalidation strategies.

# Docker Setup
Learn how to configure and run the FastAPI Boilerplate using Docker Compose. The project includes a complete containerized setup with PostgreSQL, Redis, background workers, and optional services.
## Docker Compose Architecture
The boilerplate includes these core services:
```yaml
services:
web: # FastAPI application (uvicorn or gunicorn)
worker: # ARQ background task worker
db: # PostgreSQL 13 database
redis: # Redis Alpine for caching/queues
# Optional services (commented out by default):
# pgadmin: # Database administration
# nginx: # Reverse proxy
# create_superuser: # One-time superuser creation
# create_tier: # One-time tier creation
```
## Basic Docker Compose
### Main Configuration
The main `docker-compose.yml` includes:
```yaml
version: '3.8'
services:
web:
build:
context: .
dockerfile: Dockerfile
# Development mode (reload enabled)
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
# Production mode (uncomment for production)
# command: gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
env_file:
- ./src/.env
ports:
- "8000:8000"
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
worker:
build:
context: .
dockerfile: Dockerfile
command: arq app.core.worker.settings.WorkerSettings
env_file:
- ./src/.env
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
db:
image: postgres:13
env_file:
- ./src/.env
volumes:
- postgres-data:/var/lib/postgresql/data
expose:
- "5432"
redis:
image: redis:alpine
volumes:
- redis-data:/data
expose:
- "6379"
volumes:
postgres-data:
redis-data:
```
### Environment File Loading
All services automatically load environment variables from `./src/.env`:
```yaml
env_file:
- ./src/.env
```
The Docker services use these environment variables:
- `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_DB` for database
- `REDIS_*_HOST` variables set to the service name (`redis`), which Docker's internal DNS resolves
- All application settings from your `.env` file
## Service Details
### Web Service (FastAPI Application)
The web service runs your FastAPI application:
```yaml
web:
build:
context: .
dockerfile: Dockerfile
# Development: uvicorn with reload
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
# Production: gunicorn with multiple workers (commented out)
# command: gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
env_file:
- ./src/.env
ports:
- "8000:8000" # Direct access in development
volumes:
- ./src/app:/code/app # Live code reloading
- ./src/.env:/code/.env
```
**Key Features:**
- **Development mode**: Uses uvicorn with `--reload` for automatic code reloading
- **Production mode**: Switch to gunicorn with multiple workers (commented out)
- **Live reloading**: Source code mounted as volume for development
- **Port exposure**: Direct access on port 8000 (can be disabled for nginx)
### Worker Service (Background Tasks)
Handles background job processing with ARQ:
```yaml
worker:
build:
context: .
dockerfile: Dockerfile
command: arq app.core.worker.settings.WorkerSettings
env_file:
- ./src/.env
depends_on:
- db
- redis
volumes:
- ./src/app:/code/app
- ./src/.env:/code/.env
```
**Features:**
- Runs ARQ worker for background job processing
- Shares the same codebase and environment as web service
- Automatically connects to Redis for job queues
- Source mounted as a volume, so code changes are picked up when the worker restarts
### Database Service (PostgreSQL 13)
```yaml
db:
image: postgres:13
env_file:
- ./src/.env
volumes:
- postgres-data:/var/lib/postgresql/data
expose:
- "5432" # Internal network only
```
**Configuration:**
- Uses environment variables: `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_DB`
- Data persisted in named volume `postgres-data`
- Only exposed to internal Docker network (no external port)
- To enable external access, uncomment the ports section
### Redis Service
```yaml
redis:
image: redis:alpine
volumes:
- redis-data:/data
expose:
- "6379" # Internal network only
```
**Features:**
- Lightweight Alpine Linux image
- Data persistence with named volume
- Used for caching, job queues, and rate limiting
- Internal network access only
## Optional Services
### Database Administration (pgAdmin)
Uncomment to enable web-based database management:
```yaml
pgadmin:
container_name: pgadmin4
image: dpage/pgadmin4:latest
restart: always
ports:
- "5050:80"
volumes:
- pgadmin-data:/var/lib/pgadmin
env_file:
- ./src/.env
depends_on:
- db
```
**Usage:**
- Access at `http://localhost:5050`
- Requires `PGADMIN_DEFAULT_EMAIL` and `PGADMIN_DEFAULT_PASSWORD` in `.env`
- Connect to database using service name `db` and port `5432`
### Reverse Proxy (Nginx)
Uncomment for production-style reverse proxy:
```yaml
nginx:
image: nginx:latest
ports:
- "80:80"
volumes:
- ./default.conf:/etc/nginx/conf.d/default.conf
depends_on:
- web
```
**Configuration:**
The included `default.conf` provides:
```nginx
server {
listen 80;
location / {
proxy_pass http://web:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
```
**When using nginx:**
1. Uncomment the nginx service
2. Comment out the `ports` section in the web service
3. Uncomment `expose: ["8000"]` in the web service
### Initialization Services
#### Create First Superuser
```yaml
create_superuser:
build:
context: .
dockerfile: Dockerfile
env_file:
- ./src/.env
depends_on:
- db
- web
command: python -m src.scripts.create_first_superuser
volumes:
- ./src:/code/src
```
#### Create First Tier
```yaml
create_tier:
build:
context: .
dockerfile: Dockerfile
env_file:
- ./src/.env
depends_on:
- db
- web
command: python -m src.scripts.create_first_tier
volumes:
- ./src:/code/src
```
**Usage:**
- These are one-time setup services
- Uncomment when you need to initialize data
- Run once, then comment out again
## Dockerfile Details
The project uses a multi-stage Dockerfile with `uv` for fast Python package management:
### Builder Stage
```dockerfile
FROM ghcr.io/astral-sh/uv:python3.11-bookworm-slim AS builder
ENV UV_COMPILE_BYTECODE=1
ENV UV_LINK_MODE=copy
WORKDIR /app
# Install dependencies (cached layer)
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=uv.lock,target=uv.lock \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
uv sync --locked --no-install-project
# Copy and install project
COPY . /app
RUN --mount=type=cache,target=/root/.cache/uv \
uv sync --locked --no-editable
```
### Final Stage
```dockerfile
FROM python:3.11-slim-bookworm
# Create non-root user for security
RUN groupadd --gid 1000 app \
&& useradd --uid 1000 --gid app --shell /bin/bash --create-home app
# Copy virtual environment from builder
COPY --from=builder --chown=app:app /app/.venv /app/.venv
ENV PATH="/app/.venv/bin:$PATH"
USER app
WORKDIR /code
# Default command (can be overridden)
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
```
**Security Features:**
- Non-root user execution
- Multi-stage build for smaller final image
- Cached dependency installation
## Common Docker Commands
### Development Workflow
```bash
# Start all services
docker compose up
# Start in background
docker compose up -d
# Rebuild and start (after code changes)
docker compose up --build
# View logs
docker compose logs -f web
docker compose logs -f worker
# Stop services
docker compose down
# Stop and remove volumes (reset data)
docker compose down -v
```
### Service Management
```bash
# Start specific services
docker compose up web db redis
# Scale workers
docker compose up --scale worker=3
# Execute commands in running containers
docker compose exec web bash
docker compose exec db psql -U postgres
docker compose exec redis redis-cli
# View service status
docker compose ps
```
### Production Mode
To switch to production mode:
1. **Enable Gunicorn:**
```yaml
# Comment out uvicorn line
# command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
# Uncomment gunicorn line
command: gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
```
2. **Enable Nginx** (optional):
```yaml
# Uncomment nginx service
nginx:
image: nginx:latest
ports:
- "80:80"
# In web service, comment out ports and uncomment expose
# ports:
# - "8000:8000"
expose:
- "8000"
```
3. **Remove development volumes:**
```yaml
# Remove or comment out for production
# volumes:
# - ./src/app:/code/app
# - ./src/.env:/code/.env
```
## Environment Configuration
### Service Communication
Services communicate using service names:
```yaml
# In your .env file for Docker
POSTGRES_SERVER=db # Not localhost
REDIS_CACHE_HOST=redis # Not localhost
REDIS_QUEUE_HOST=redis
REDIS_RATE_LIMIT_HOST=redis
```
### Port Management
**Development (default):**
- Web: `localhost:8000` (direct access)
- Database: `localhost:5432` (uncomment ports to enable)
- Redis: `localhost:6379` (uncomment ports to enable)
- pgAdmin: `localhost:5050` (if enabled)
**Production with Nginx:**
- Web: `localhost:80` (through nginx)
- Database: Internal only
- Redis: Internal only
## Troubleshooting
### Common Issues
**Container won't start:**
```bash
# Check logs
docker compose logs web
# Rebuild image
docker compose build --no-cache web
# Check environment file
docker compose exec web env | grep POSTGRES
```
**Database connection issues:**
```bash
# Check if db service is running
docker compose ps db
# Test connection from web container
docker compose exec web ping db
# Check database logs
docker compose logs db
```
**Port conflicts:**
```bash
# Check what's using the port
lsof -i :8000
# Use different ports
ports:
- "8001:8000" # Use port 8001 instead
```
### Development vs Production
**Development features:**
- Live code reloading with volume mounts
- Direct port access
- uvicorn with `--reload`
- Exposed database/redis ports for debugging
**Production optimizations:**
- No volume mounts (code baked into image)
- Nginx reverse proxy
- Gunicorn with multiple workers
- Internal service networking only
- Resource limits and health checks
## Best Practices
### Development
- Use volume mounts for live code reloading
- Enable direct port access for debugging
- Use uvicorn with reload for fast development
- Enable optional services (pgAdmin) as needed
### Production
- Switch to gunicorn with multiple workers
- Use nginx for reverse proxy and load balancing
- Remove volume mounts and bake code into images
- Use internal networking only
- Set resource limits and health checks
### Security
- Containers run as non-root user
- Use internal networking for service communication
- Don't expose database/redis ports externally
- Use Docker secrets for sensitive data in production
### Monitoring
- Use `docker compose logs` to monitor services
- Set up health checks for all services
- Monitor resource usage with `docker stats`
- Use structured logging for better observability
The Docker setup provides everything you need for both development and production. Start with the default configuration and customize as your needs grow!

# Environment-Specific Configuration
Learn how to configure your FastAPI application for different environments (development, staging, production) with appropriate security, performance, and monitoring settings.
## Environment Types
The boilerplate supports three environment types:
- **`local`** - Development environment with full debugging
- **`staging`** - Pre-production testing environment
- **`production`** - Production environment with security hardening
Set the environment type with:
```env
ENVIRONMENT="local" # or "staging" or "production"
```
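If you keep one dotenv file per environment, as the rest of this section does, a small helper can select the right file at startup. This is a hypothetical convenience, not part of the boilerplate (which reads a single `src/.env`):

```python
ENV_FILES = {
    "local": "src/.env.development",
    "staging": "src/.env.staging",
    "production": "src/.env.production",
}

def env_file_for(environment: str) -> str:
    """Pick the dotenv file for the given ENVIRONMENT value (defaults to src/.env)."""
    return ENV_FILES.get(environment, "src/.env")

# e.g. pass env_file_for(os.environ.get("ENVIRONMENT", "local")) to your settings loader
```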
## Development Environment
### Local Development Settings
Create `src/.env.development`:
```env
# ------------- environment -------------
ENVIRONMENT="local"
DEBUG=true
# ------------- app settings -------------
APP_NAME="MyApp (Development)"
APP_VERSION="0.1.0-dev"
# ------------- database -------------
POSTGRES_USER="dev_user"
POSTGRES_PASSWORD="dev_password"
POSTGRES_SERVER="localhost"
POSTGRES_PORT=5432
POSTGRES_DB="myapp_dev"
# ------------- crypt -------------
SECRET_KEY="dev-secret-key-not-for-production-use"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=60 # Longer for development
REFRESH_TOKEN_EXPIRE_DAYS=30 # Longer for development
# ------------- redis -------------
REDIS_CACHE_HOST="localhost"
REDIS_CACHE_PORT=6379
REDIS_QUEUE_HOST="localhost"
REDIS_QUEUE_PORT=6379
REDIS_RATE_LIMIT_HOST="localhost"
REDIS_RATE_LIMIT_PORT=6379
# ------------- caching -------------
CLIENT_CACHE_MAX_AGE=0 # Disable caching for development
# ------------- rate limiting -------------
DEFAULT_RATE_LIMIT_LIMIT=1000 # Higher limits for development
DEFAULT_RATE_LIMIT_PERIOD=3600
# ------------- admin -------------
ADMIN_NAME="Dev Admin"
ADMIN_EMAIL="admin@localhost"
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="admin123"
# ------------- tier -------------
TIER_NAME="dev_tier"
# ------------- logging -------------
DATABASE_ECHO=true # Log all SQL queries
```
### Development Features
```python
# Development-specific features
if settings.ENVIRONMENT == "local":
# Enable detailed error pages
app.add_middleware(
CORSMiddleware,
allow_origins=["*"], # Allow all origins in development
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
    # Enable API documentation; note that these URLs must be passed to FastAPI()
    # at creation time, e.g. FastAPI(openapi_url="/openapi.json", docs_url="/docs",
    # redoc_url="/redoc") - assigning them after the app is created has no effect
```
### Docker Development Override
`docker-compose.override.yml`:
```yaml
version: '3.8'
services:
web:
environment:
- ENVIRONMENT=local
- DEBUG=true
- DATABASE_ECHO=true
volumes:
- ./src:/code/src:cached
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
ports:
- "8000:8000"
db:
environment:
- POSTGRES_DB=myapp_dev
ports:
- "5432:5432"
redis:
ports:
- "6379:6379"
# Development tools
adminer:
image: adminer
ports:
- "8080:8080"
depends_on:
- db
```
## Staging Environment
### Staging Settings
Create `src/.env.staging`:
```env
# ------------- environment -------------
ENVIRONMENT="staging"
DEBUG=false
# ------------- app settings -------------
APP_NAME="MyApp (Staging)"
APP_VERSION="0.1.0-staging"
# ------------- database -------------
POSTGRES_USER="staging_user"
POSTGRES_PASSWORD="complex_staging_password_123!"
POSTGRES_SERVER="staging-db.example.com"
POSTGRES_PORT=5432
POSTGRES_DB="myapp_staging"
# ------------- crypt -------------
SECRET_KEY="staging-secret-key-different-from-production"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
# ------------- redis -------------
REDIS_CACHE_HOST="staging-redis.example.com"
REDIS_CACHE_PORT=6379
REDIS_QUEUE_HOST="staging-redis.example.com"
REDIS_QUEUE_PORT=6379
REDIS_RATE_LIMIT_HOST="staging-redis.example.com"
REDIS_RATE_LIMIT_PORT=6379
# ------------- caching -------------
CLIENT_CACHE_MAX_AGE=300 # 5 minutes
# ------------- rate limiting -------------
DEFAULT_RATE_LIMIT_LIMIT=100
DEFAULT_RATE_LIMIT_PERIOD=3600
# ------------- admin -------------
ADMIN_NAME="Staging Admin"
ADMIN_EMAIL="admin@staging.example.com"
ADMIN_USERNAME="staging_admin"
ADMIN_PASSWORD="secure_staging_password_456!"
# ------------- tier -------------
TIER_NAME="staging_tier"
# ------------- logging -------------
DATABASE_ECHO=false
```
### Staging Features
```python
# Staging-specific features
if settings.ENVIRONMENT == "staging":
    # Restricted CORS
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["https://staging.example.com"],
        allow_credentials=True,
        allow_methods=["GET", "POST", "PUT", "DELETE"],
        allow_headers=["*"],
    )

    # API docs available to superusers only
    @app.get("/docs", include_in_schema=False)
    async def custom_swagger_ui(current_user: User = Depends(get_current_superuser)):
        return get_swagger_ui_html(openapi_url="/openapi.json")
```
### Docker Staging Configuration
`docker-compose.staging.yml`:
```yaml
version: '3.8'

services:
  web:
    environment:
      - ENVIRONMENT=staging
      - DEBUG=false
    deploy:
      replicas: 2
      resources:
        limits:
          memory: 1G
        reservations:
          memory: 512M
    restart: always

  db:
    environment:
      - POSTGRES_DB=myapp_staging
    volumes:
      - postgres_staging_data:/var/lib/postgresql/data
    restart: always

  redis:
    restart: always

  worker:
    deploy:
      replicas: 2
    restart: always

volumes:
  postgres_staging_data:
```
## Production Environment
### Production Settings
Create `src/.env.production`:
```env
# ------------- environment -------------
ENVIRONMENT="production"
DEBUG=false
# ------------- app settings -------------
APP_NAME="MyApp"
APP_VERSION="1.0.0"
CONTACT_NAME="Support Team"
CONTACT_EMAIL="support@example.com"
# ------------- database -------------
POSTGRES_USER="prod_user"
POSTGRES_PASSWORD="ultra_secure_production_password_789!"
POSTGRES_SERVER="prod-db.example.com"
POSTGRES_PORT=5433 # Custom port for security
POSTGRES_DB="myapp_production"
# ------------- crypt -------------
SECRET_KEY="ultra-secure-production-key-generated-with-openssl-rand-hex-32"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=15 # Shorter for security
REFRESH_TOKEN_EXPIRE_DAYS=3 # Shorter for security
# ------------- redis -------------
REDIS_CACHE_HOST="prod-redis.example.com"
REDIS_CACHE_PORT=6380 # Custom port for security
REDIS_QUEUE_HOST="prod-redis.example.com"
REDIS_QUEUE_PORT=6380
REDIS_RATE_LIMIT_HOST="prod-redis.example.com"
REDIS_RATE_LIMIT_PORT=6380
# ------------- caching -------------
CLIENT_CACHE_MAX_AGE=3600 # 1 hour
# ------------- rate limiting -------------
DEFAULT_RATE_LIMIT_LIMIT=100
DEFAULT_RATE_LIMIT_PERIOD=3600
# ------------- admin -------------
ADMIN_NAME="System Administrator"
ADMIN_EMAIL="admin@example.com"
ADMIN_USERNAME="sysadmin"
ADMIN_PASSWORD="extremely_secure_admin_password_with_symbols_#$%!"
# ------------- tier -------------
TIER_NAME="production_tier"
# ------------- logging -------------
DATABASE_ECHO=false
```
### Production Security Features
```python
# Production-specific features
if settings.ENVIRONMENT == "production":
    # Strict CORS
    app.add_middleware(
        CORSMiddleware,
        allow_origins=["https://example.com", "https://www.example.com"],
        allow_credentials=True,
        allow_methods=["GET", "POST", "PUT", "DELETE"],
        allow_headers=["Authorization", "Content-Type"],
    )

    # Disable API documentation
    app.openapi_url = None
    app.docs_url = None
    app.redoc_url = None

    # Add security headers
    @app.middleware("http")
    async def add_security_headers(request: Request, call_next):
        response = await call_next(request)
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "DENY"
        response.headers["X-XSS-Protection"] = "1; mode=block"
        response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
        return response
```
### Docker Production Configuration
`docker-compose.prod.yml`:
```yaml
version: '3.8'

services:
  web:
    environment:
      - ENVIRONMENT=production
      - DEBUG=false
    deploy:
      replicas: 3
      resources:
        limits:
          memory: 2G
          cpus: '1'
        reservations:
          memory: 1G
          cpus: '0.5'
    restart: always
    ports: []  # No direct exposure

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx/ssl:/etc/nginx/ssl
      - ./nginx/htpasswd:/etc/nginx/htpasswd
    depends_on:
      - web
    restart: always

  db:
    environment:
      - POSTGRES_DB=myapp_production
    volumes:
      - postgres_prod_data:/var/lib/postgresql/data
    ports: []  # No external access
    deploy:
      resources:
        limits:
          memory: 4G
        reservations:
          memory: 2G
    restart: always

  redis:
    volumes:
      - redis_prod_data:/data
    ports: []  # No external access
    deploy:
      resources:
        limits:
          memory: 1G
        reservations:
          memory: 512M
    restart: always

  worker:
    deploy:
      replicas: 2
      resources:
        limits:
          memory: 1G
        reservations:
          memory: 512M
    restart: always

volumes:
  postgres_prod_data:
  redis_prod_data:
```
## Environment Detection
### Runtime Environment Checks
```python
# src/app/core/config.py
class Settings(BaseSettings):
    @computed_field
    @property
    def IS_DEVELOPMENT(self) -> bool:
        return self.ENVIRONMENT == "local"

    @computed_field
    @property
    def IS_PRODUCTION(self) -> bool:
        return self.ENVIRONMENT == "production"

    @computed_field
    @property
    def IS_STAGING(self) -> bool:
        return self.ENVIRONMENT == "staging"

# Use in application
if settings.IS_DEVELOPMENT:
    # Development-only code
    pass

if settings.IS_PRODUCTION:
    # Production-only code
    pass
```
### Environment-Specific Validation
```python
@model_validator(mode="after")
def validate_environment_config(self) -> "Settings":
    if self.ENVIRONMENT == "production":
        # Production validation
        if self.DEBUG:
            raise ValueError("DEBUG must be False in production")
        if len(self.SECRET_KEY) < 32:
            raise ValueError("SECRET_KEY must be at least 32 characters in production")
        if "dev" in self.SECRET_KEY.lower():
            raise ValueError("Production SECRET_KEY cannot contain 'dev'")

    if self.ENVIRONMENT == "local":
        # Development warnings
        if not self.DEBUG:
            logger.warning("DEBUG is False in development environment")

    return self
```
## Configuration Management
### Environment File Templates
Create template files for each environment:
```bash
# Create environment templates
cp src/.env.example src/.env.development
cp src/.env.example src/.env.staging
cp src/.env.example src/.env.production
# Use environment-specific files
ln -sf .env.development src/.env # For development
ln -sf .env.staging src/.env # For staging
ln -sf .env.production src/.env # For production
```
### Configuration Validation
```python
# src/scripts/validate_config.py
import asyncio

from src.app.core.config import settings
from src.app.core.db.database import async_get_db

async def validate_configuration():
    """Validate configuration for the current environment."""
    print(f"Validating configuration for {settings.ENVIRONMENT} environment...")

    # Basic settings validation
    assert settings.APP_NAME, "APP_NAME is required"
    assert settings.SECRET_KEY, "SECRET_KEY is required"
    assert len(settings.SECRET_KEY) >= 32, "SECRET_KEY must be at least 32 characters"

    # Environment-specific validation
    if settings.ENVIRONMENT == "production":
        assert not settings.DEBUG, "DEBUG must be False in production"
        assert "dev" not in settings.SECRET_KEY.lower(), "Production SECRET_KEY invalid"
        assert settings.POSTGRES_PORT != 5432, "Use custom PostgreSQL port in production"

    # Test database connection
    try:
        db = await anext(async_get_db())
        print("✓ Database connection successful")
        await db.close()
    except Exception as e:
        print(f"✗ Database connection failed: {e}")
        return False

    print("✓ Configuration validation passed")
    return True

if __name__ == "__main__":
    asyncio.run(validate_configuration())
```
### Environment Switching
```bash
#!/bin/bash
# scripts/switch_env.sh
ENV=$1

if [ -z "$ENV" ]; then
    echo "Usage: $0 <development|staging|production>"
    exit 1
fi

case $ENV in
    development)
        ln -sf .env.development src/.env
        echo "Switched to development environment"
        ;;
    staging)
        ln -sf .env.staging src/.env
        echo "Switched to staging environment"
        ;;
    production)
        ln -sf .env.production src/.env
        echo "Switched to production environment"
        echo "WARNING: Make sure to review all settings before deployment!"
        ;;
    *)
        echo "Invalid environment: $ENV"
        echo "Valid options: development, staging, production"
        exit 1
        ;;
esac

# Validate configuration
python -c "from src.app.core.config import settings; print(f'Current environment: {settings.ENVIRONMENT}')"
```
## Security Best Practices
### Environment-Specific Security
```python
# Different security levels per environment
SECURITY_CONFIGS = {
    "local": {
        "token_expire_minutes": 60,
        "enable_cors_origins": ["*"],
        "enable_docs": True,
        "log_level": "DEBUG",
    },
    "staging": {
        "token_expire_minutes": 30,
        "enable_cors_origins": ["https://staging.example.com"],
        "enable_docs": True,  # For testing
        "log_level": "INFO",
    },
    "production": {
        "token_expire_minutes": 15,
        "enable_cors_origins": ["https://example.com"],
        "enable_docs": False,
        "log_level": "WARNING",
    },
}

config = SECURITY_CONFIGS[settings.ENVIRONMENT]
```
### Secrets Management
```env
# Use secrets management in production instead of plain-text values
POSTGRES_PASSWORD_FILE="/run/secrets/postgres_password"
SECRET_KEY_FILE="/run/secrets/jwt_secret"
```

Docker secrets are declared in the Compose file:

```yaml
services:
  web:
    secrets:
      - postgres_password
      - jwt_secret
    environment:
      - POSTGRES_PASSWORD_FILE=/run/secrets/postgres_password
      - SECRET_KEY_FILE=/run/secrets/jwt_secret

secrets:
  postgres_password:
    external: true
  jwt_secret:
    external: true
```
## Monitoring and Logging
### Environment-Specific Logging
```python
LOGGING_CONFIG = {
    "local": {
        "level": "DEBUG",
        "format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s",
        "handlers": ["console"],
    },
    "staging": {
        "level": "INFO",
        "format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s",
        "handlers": ["console", "file"],
    },
    "production": {
        "level": "WARNING",
        "format": "%(asctime)s - %(name)s - %(levelname)s - %(funcName)s:%(lineno)d - %(message)s",
        "handlers": ["file", "syslog"],
    },
}
```
### Health Checks by Environment
```python
@app.get("/health")
async def health_check():
    health_info = {
        "status": "healthy",
        "environment": settings.ENVIRONMENT,
        "version": settings.APP_VERSION,
    }

    # Add detailed info in non-production
    if not settings.IS_PRODUCTION:
        health_info.update({
            "database": await check_database_health(),
            "redis": await check_redis_health(),
            "worker_queue": await check_worker_health(),
        })

    return health_info
```
## Best Practices
### Security
- Use different secret keys for each environment
- Disable debug mode in staging and production
- Use custom ports in production
- Implement proper CORS policies
- Remove API documentation in production
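For the first point, per-environment keys can be generated with the standard library alone; the loop below is a sketch equivalent to running `openssl rand -hex 32` once per environment:

```python
import secrets

# One independent 64-hex-character key per environment.
for env in ("development", "staging", "production"):
    print(f"{env}: SECRET_KEY={secrets.token_hex(32)}")
```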
### Performance
- Configure appropriate resource limits per environment
- Use caching in staging and production
- Set shorter token expiration in production
- Use connection pooling in production
### Configuration
- Keep environment files in version control (except production)
- Use validation to prevent misconfiguration
- Document all environment-specific settings
- Test configuration changes in staging first
### Monitoring
- Use appropriate log levels per environment
- Monitor different metrics in each environment
- Set up alerts for production only
- Use health checks for all environments
Environment-specific configuration ensures your application runs securely and efficiently in each deployment stage. Start with development settings and progressively harden for production!

# Configuration Guide
This guide covers all configuration options available in the FastAPI Boilerplate, including environment variables, settings classes, and advanced deployment configurations.
## Configuration Overview
The boilerplate uses a layered configuration approach:
- **Environment Variables** (`.env` file) - Primary configuration method
- **Settings Classes** (`src/app/core/config.py`) - Python-based configuration
- **Docker Configuration** (`docker-compose.yml`) - Container orchestration
- **Database Configuration** (`alembic.ini`) - Database migrations
## Environment Variables Reference
All configuration is managed through environment variables defined in the `.env` file located in the `src/` directory.
### Application Settings
Basic application metadata displayed in API documentation:
```env
# ------------- app settings -------------
APP_NAME="Your App Name"
APP_DESCRIPTION="Your app description here"
APP_VERSION="0.1.0"
CONTACT_NAME="Your Name"
CONTACT_EMAIL="your.email@example.com"
LICENSE_NAME="MIT"
```
**Variables Explained:**
- `APP_NAME`: Displayed in API documentation and responses
- `APP_DESCRIPTION`: Shown in OpenAPI documentation
- `APP_VERSION`: API version for documentation and headers
- `CONTACT_NAME`: Contact information for API documentation
- `CONTACT_EMAIL`: Support email for API users
- `LICENSE_NAME`: License type for the API
### Database Configuration
PostgreSQL database connection settings:
```env
# ------------- database -------------
POSTGRES_USER="your_postgres_user"
POSTGRES_PASSWORD="your_secure_password"
POSTGRES_SERVER="localhost"
POSTGRES_PORT=5432
POSTGRES_DB="your_database_name"
```
**Variables Explained:**
- `POSTGRES_USER`: Database user with appropriate permissions
- `POSTGRES_PASSWORD`: Strong password for database access
- `POSTGRES_SERVER`: Hostname or IP of PostgreSQL server
- `POSTGRES_PORT`: PostgreSQL port (default: 5432)
- `POSTGRES_DB`: Name of the database to connect to
**Environment-Specific Values:**
```env
# Local development
POSTGRES_SERVER="localhost"
# Docker Compose
POSTGRES_SERVER="db"
# Production
POSTGRES_SERVER="your-prod-db-host.com"
```
### Security & Authentication
JWT and password security configuration:
```env
# ------------- crypt -------------
SECRET_KEY="your-super-secret-key-here"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
```
**Variables Explained:**
- `SECRET_KEY`: Used for JWT token signing (generate with `openssl rand -hex 32`)
- `ALGORITHM`: JWT signing algorithm (HS256 recommended)
- `ACCESS_TOKEN_EXPIRE_MINUTES`: How long access tokens remain valid
- `REFRESH_TOKEN_EXPIRE_DAYS`: How long refresh tokens remain valid
!!! danger "Security Warning"
    Never use default values in production. Generate a strong secret key:
    ```bash
    openssl rand -hex 32
    ```
### Redis Configuration
Redis is used for caching, job queues, and rate limiting:
```env
# ------------- redis cache -------------
REDIS_CACHE_HOST="localhost" # Use "redis" for Docker Compose
REDIS_CACHE_PORT=6379
# ------------- redis queue -------------
REDIS_QUEUE_HOST="localhost" # Use "redis" for Docker Compose
REDIS_QUEUE_PORT=6379
# ------------- redis rate limit -------------
REDIS_RATE_LIMIT_HOST="localhost" # Use "redis" for Docker Compose
REDIS_RATE_LIMIT_PORT=6379
```
**Best Practices:**
- **Development**: Use the same Redis instance for all services
- **Production**: Use separate Redis instances for better isolation
```env
# Production example with separate instances
REDIS_CACHE_HOST="cache.redis.example.com"
REDIS_QUEUE_HOST="queue.redis.example.com"
REDIS_RATE_LIMIT_HOST="ratelimit.redis.example.com"
```
### Caching Settings
Client-side and server-side caching configuration:
```env
# ------------- redis client-side cache -------------
CLIENT_CACHE_MAX_AGE=30 # seconds
```
**Variables Explained:**
- `CLIENT_CACHE_MAX_AGE`: How long browsers should cache responses
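As an illustrative sketch (not the boilerplate's actual middleware), the setting ultimately becomes a `Cache-Control` response header; the helper name below is hypothetical:

```python
CLIENT_CACHE_MAX_AGE = 30  # seconds, mirroring the env variable

def cache_control_header(max_age: int, public: bool = True) -> str:
    """Build the Cache-Control value a response middleware would attach."""
    scope = "public" if public else "private"
    return f"{scope}, max-age={max_age}"

# A middleware would then set:
# response.headers["Cache-Control"] = cache_control_header(CLIENT_CACHE_MAX_AGE)
```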
### Rate Limiting
Default rate limiting configuration:
```env
# ------------- default rate limit settings -------------
DEFAULT_RATE_LIMIT_LIMIT=10 # requests per period
DEFAULT_RATE_LIMIT_PERIOD=3600 # period in seconds (1 hour)
```
**Variables Explained:**
- `DEFAULT_RATE_LIMIT_LIMIT`: Number of requests allowed per period
- `DEFAULT_RATE_LIMIT_PERIOD`: Time window in seconds
### Admin User
First superuser account configuration:
```env
# ------------- admin -------------
ADMIN_NAME="Admin User"
ADMIN_EMAIL="admin@example.com"
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="secure_admin_password"
```
**Variables Explained:**
- `ADMIN_NAME`: Display name for the admin user
- `ADMIN_EMAIL`: Email address for the admin account
- `ADMIN_USERNAME`: Username for admin login
- `ADMIN_PASSWORD`: Initial password (change after first login)
### User Tiers
Initial tier configuration:
```env
# ------------- first tier -------------
TIER_NAME="free"
```
**Variables Explained:**
- `TIER_NAME`: Name of the default user tier
### Environment Type
Controls API documentation visibility and behavior:
```env
# ------------- environment -------------
ENVIRONMENT="local" # local, staging, or production
```
**Environment Types:**
- **local**: Full API docs available publicly at `/docs`
- **staging**: API docs available to superusers only
- **production**: API docs completely disabled
## Docker Compose Configuration
### Basic Setup
Docker Compose automatically loads the `.env` file:
```yaml
# In docker-compose.yml
services:
  web:
    env_file:
      - ./src/.env
```
### Development Overrides
Create `docker-compose.override.yml` for local customizations:
```yaml
version: '3.8'

services:
  web:
    ports:
      - "8001:8000"  # Use different port
    environment:
      - DEBUG=true
    volumes:
      - ./custom-logs:/code/logs
```
### Service Configuration
Understanding each Docker service:
```yaml
services:
  web:     # FastAPI application
  db:      # PostgreSQL database
  redis:   # Redis for caching/queues
  worker:  # ARQ background task worker
  nginx:   # Reverse proxy (optional)
```
## Python Settings Classes
Advanced configuration is handled in `src/app/core/config.py`:
### Settings Composition
The main `Settings` class inherits from multiple setting groups:
```python
class Settings(
    AppSettings,
    PostgresSettings,
    CryptSettings,
    FirstUserSettings,
    RedisCacheSettings,
    ClientSideCacheSettings,
    RedisQueueSettings,
    RedisRateLimiterSettings,
    DefaultRateLimitSettings,
    EnvironmentSettings,
):
    pass
```
### Adding Custom Settings
Create your own settings group:
```python
class CustomSettings(BaseSettings):
    CUSTOM_API_KEY: str = ""
    CUSTOM_TIMEOUT: int = 30
    ENABLE_FEATURE_X: bool = False

# Add to main Settings class
class Settings(
    AppSettings,
    # ... other settings ...
    CustomSettings,
):
    pass
```
### Opting Out of Services
Remove unused services by excluding their settings:
```python
# Minimal setup without Redis services
class Settings(
    AppSettings,
    PostgresSettings,
    CryptSettings,
    FirstUserSettings,
    # Removed: RedisCacheSettings
    # Removed: RedisQueueSettings
    # Removed: RedisRateLimiterSettings
    EnvironmentSettings,
):
    pass
```
## Database Configuration
### Alembic Configuration
Database migrations are configured in `src/alembic.ini`:
```ini
[alembic]
script_location = migrations
sqlalchemy.url = postgresql://%(POSTGRES_USER)s:%(POSTGRES_PASSWORD)s@%(POSTGRES_SERVER)s:%(POSTGRES_PORT)s/%(POSTGRES_DB)s
```
### Connection Pooling
SQLAlchemy connection pool settings in `src/app/core/db/database.py`:
```python
engine = create_async_engine(
    DATABASE_URL,
    pool_size=20,       # Number of connections to maintain
    max_overflow=30,    # Additional connections allowed
    pool_timeout=30,    # Seconds to wait for a connection
    pool_recycle=1800,  # Seconds before connection refresh
)
```
### Database Best Practices
**Connection Pool Sizing:**
- Start with `pool_size=20`, `max_overflow=30`
- Monitor connection usage and adjust based on load
- Use connection pooling monitoring tools
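A quick back-of-the-envelope check ties these numbers together; the replica count and server limit below are assumptions, not boilerplate defaults:

```python
pool_size = 20
max_overflow = 30
replicas = 3                    # e.g. web replicas in production (assumption)
postgres_max_connections = 200  # PostgreSQL server-side limit (assumption)

# Each replica can open at most pool_size + max_overflow connections,
# so the fleet-wide peak must stay under the server limit with headroom.
peak = replicas * (pool_size + max_overflow)
print(f"peak app connections: {peak}")  # 150, comfortably under 200
```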
**Migration Strategy:**
- Always backup database before running migrations
- Test migrations on staging environment first
- Use `alembic revision --autogenerate` for model changes
## Security Configuration
### JWT Token Configuration
Customize JWT behavior in `src/app/core/security.py`:
```python
def create_access_token(data: dict, expires_delta: timedelta | None = None):
    to_encode = data.copy()
    if expires_delta:
        expire = datetime.utcnow() + expires_delta
    else:
        expire = datetime.utcnow() + timedelta(
            minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES
        )
    to_encode.update({"exp": expire})
    return jwt.encode(to_encode, settings.SECRET_KEY, algorithm=settings.ALGORITHM)
```
### CORS Configuration
Configure Cross-Origin Resource Sharing in `src/app/main.py`:
```python
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],  # Specify allowed origins
    allow_credentials=True,
    allow_methods=["GET", "POST"],  # Specify allowed methods
    allow_headers=["*"],
)
```
**Production CORS Settings:**
```python
# Never use wildcard (*) in production
allow_origins=[
    "https://yourapp.com",
    "https://www.yourapp.com",
],
```
### Security Headers
Add security headers middleware:
```python
from starlette.middleware.base import BaseHTTPMiddleware

class SecurityHeadersMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request, call_next):
        response = await call_next(request)
        response.headers["X-Frame-Options"] = "DENY"
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-XSS-Protection"] = "1; mode=block"
        return response
```
## Logging Configuration
### Basic Logging Setup
Configure logging in `src/app/core/logger.py`:
```python
import logging
from logging.handlers import RotatingFileHandler

# Set log level
LOGGING_LEVEL = logging.INFO

# Configure file rotation
file_handler = RotatingFileHandler(
    "logs/app.log",
    maxBytes=10485760,  # 10MB
    backupCount=5,      # Keep 5 backup files
)

# Attach the handler to the application logger
logger = logging.getLogger("app")
logger.setLevel(LOGGING_LEVEL)
logger.addHandler(file_handler)
```
### Structured Logging
Use structured logging for better observability:
```python
import structlog

structlog.configure(
    processors=[
        structlog.stdlib.filter_by_level,
        structlog.stdlib.add_logger_name,
        structlog.stdlib.add_log_level,
        structlog.processors.JSONRenderer(),
    ],
    logger_factory=structlog.stdlib.LoggerFactory(),
)
```
### Log Levels by Environment
```python
# Environment-specific log levels
LOG_LEVELS = {
    "local": logging.DEBUG,
    "staging": logging.INFO,
    "production": logging.WARNING,
}

LOGGING_LEVEL = LOG_LEVELS.get(settings.ENVIRONMENT, logging.INFO)
```
## Environment-Specific Configurations
### Development (.env.development)
```env
ENVIRONMENT="local"
POSTGRES_SERVER="localhost"
REDIS_CACHE_HOST="localhost"
SECRET_KEY="dev-secret-key-not-for-production"
ACCESS_TOKEN_EXPIRE_MINUTES=60 # Longer for development
DEBUG=true
```
### Staging (.env.staging)
```env
ENVIRONMENT="staging"
POSTGRES_SERVER="staging-db.example.com"
REDIS_CACHE_HOST="staging-redis.example.com"
SECRET_KEY="staging-secret-key-different-from-prod"
ACCESS_TOKEN_EXPIRE_MINUTES=30
DEBUG=false
```
### Production (.env.production)
```env
ENVIRONMENT="production"
POSTGRES_SERVER="prod-db.example.com"
REDIS_CACHE_HOST="prod-redis.example.com"
SECRET_KEY="ultra-secure-production-key-generated-with-openssl"
ACCESS_TOKEN_EXPIRE_MINUTES=15
DEBUG=false
REDIS_CACHE_PORT=6380 # Custom port for security
POSTGRES_PORT=5433 # Custom port for security
```
## Advanced Configuration
### Custom Middleware
Add custom middleware in `src/app/core/setup.py`:
```python
def create_application(router, settings, **kwargs):
    app = FastAPI(...)

    # Add custom middleware
    app.add_middleware(CustomMiddleware, setting=value)
    app.add_middleware(TimingMiddleware)
    app.add_middleware(RequestIDMiddleware)

    return app
```
### Feature Toggles
Implement feature flags:
```python
class FeatureSettings(BaseSettings):
    ENABLE_ADVANCED_CACHING: bool = False
    ENABLE_ANALYTICS: bool = True
    ENABLE_EXPERIMENTAL_FEATURES: bool = False
    ENABLE_API_VERSIONING: bool = True

# Use in endpoints
if settings.ENABLE_ADVANCED_CACHING:
    # Advanced caching logic
    pass
```
### Health Checks
Configure health check endpoints:
```python
@app.get("/health")
async def health_check():
    return {
        "status": "healthy",
        "database": await check_database_health(),
        "redis": await check_redis_health(),
        "version": settings.APP_VERSION,
    }
```
## Configuration Validation
### Environment Validation
Add validation to prevent misconfiguration:
```python
def validate_settings():
    if not settings.SECRET_KEY:
        raise ValueError("SECRET_KEY must be set")

    if settings.ENVIRONMENT == "production":
        if settings.SECRET_KEY == "dev-secret-key":
            raise ValueError("Production must use secure SECRET_KEY")
        if settings.DEBUG:
            raise ValueError("DEBUG must be False in production")
```
### Runtime Checks
Add validation to application startup:
```python
@app.on_event("startup")
async def startup_event():
    validate_settings()
    await check_database_connection()
    await check_redis_connection()
    logger.info(f"Application started in {settings.ENVIRONMENT} mode")
```
## Configuration Troubleshooting
### Common Issues
**Environment Variables Not Loading:**
```bash
# Check file location and permissions
ls -la src/.env
# Check file format (no spaces around =)
cat src/.env | grep "=" | head -5
# Verify environment loading in Python
python -c "from src.app.core.config import settings; print(settings.APP_NAME)"
```
**Database Connection Failed:**
```bash
# Test connection manually
psql -h localhost -U postgres -d myapp
# Check if PostgreSQL is running
systemctl status postgresql
# or on macOS
brew services list | grep postgresql
```
**Redis Connection Failed:**
```bash
# Test Redis connection
redis-cli -h localhost -p 6379 ping
# Check Redis status
systemctl status redis
# or on macOS
brew services list | grep redis
```
### Configuration Testing
Test your configuration with a simple script:
```python
# test_config.py
import asyncio

from src.app.core.config import settings
from src.app.core.db.database import async_get_db

async def test_config():
    print(f"App: {settings.APP_NAME}")
    print(f"Environment: {settings.ENVIRONMENT}")

    # Test database
    try:
        db = await anext(async_get_db())
        print("✓ Database connection successful")
        await db.close()
    except Exception as e:
        print(f"✗ Database connection failed: {e}")

    # Test Redis (if enabled)
    try:
        from src.app.core.utils.cache import redis_client
        await redis_client.ping()
        print("✓ Redis connection successful")
    except Exception as e:
        print(f"✗ Redis connection failed: {e}")

if __name__ == "__main__":
    asyncio.run(test_config())
```
Run with:
```bash
uv run python test_config.py
```

# Configuration
Learn how to configure your FastAPI Boilerplate application for different environments and use cases. Everything is configured through environment variables and Python settings classes.
## What You'll Learn
- **[Environment Variables](environment-variables.md)** - Configure through `.env` files
- **[Settings Classes](settings-classes.md)** - Python-based configuration management
- **[Docker Setup](docker-setup.md)** - Container and service configuration
- **[Environment-Specific](environment-specific.md)** - Development, staging, and production configs
## Quick Start
The boilerplate uses environment variables as the primary configuration method:
```bash
# Copy the example file
cp src/.env.example src/.env
# Edit with your values
nano src/.env
```
Essential variables to set:
```env
# Application
APP_NAME="My FastAPI App"
SECRET_KEY="your-super-secret-key-here"
# Database
POSTGRES_USER="your_user"
POSTGRES_PASSWORD="your_password"
POSTGRES_DB="your_database"
# Admin Account
ADMIN_EMAIL="admin@example.com"
ADMIN_PASSWORD="secure_password"
```
## Configuration Architecture
The configuration system has three layers:
```
Environment Variables (.env files)
        ↓
Settings Classes (Python validation)
        ↓
Application Configuration (Runtime)
```
### Layer 1: Environment Variables
Primary configuration through `.env` files:
```env
POSTGRES_USER="myuser"
POSTGRES_PASSWORD="mypassword"
REDIS_CACHE_HOST="localhost"
SECRET_KEY="your-secret-key"
```
### Layer 2: Settings Classes
Python classes that validate and structure configuration:
```python
class PostgresSettings(BaseSettings):
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str = Field(min_length=8)
    POSTGRES_SERVER: str = "localhost"
    POSTGRES_PORT: int = 5432
    POSTGRES_DB: str
```
### Layer 3: Application Use
Configuration injected throughout the application:
```python
from app.core.config import settings
# Use anywhere in your code
DATABASE_URL = f"postgresql+asyncpg://{settings.POSTGRES_USER}:{settings.POSTGRES_PASSWORD}@{settings.POSTGRES_SERVER}:{settings.POSTGRES_PORT}/{settings.POSTGRES_DB}"
```
## Key Configuration Areas
### Security Settings
```env
SECRET_KEY="your-super-secret-key-here"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
```
### Database Configuration
```env
POSTGRES_USER="your_user"
POSTGRES_PASSWORD="your_password"
POSTGRES_SERVER="localhost"
POSTGRES_PORT=5432
POSTGRES_DB="your_database"
```
### Redis Services
```env
# Cache
REDIS_CACHE_HOST="localhost"
REDIS_CACHE_PORT=6379
# Background jobs
REDIS_QUEUE_HOST="localhost"
REDIS_QUEUE_PORT=6379
# Rate limiting
REDIS_RATE_LIMIT_HOST="localhost"
REDIS_RATE_LIMIT_PORT=6379
```
### Application Settings
```env
APP_NAME="Your App Name"
APP_VERSION="1.0.0"
ENVIRONMENT="local" # local, staging, production
DEBUG=true
```
### Rate Limiting
```env
DEFAULT_RATE_LIMIT_LIMIT=100
DEFAULT_RATE_LIMIT_PERIOD=3600 # 1 hour in seconds
```
### Admin User
```env
ADMIN_NAME="Admin User"
ADMIN_EMAIL="admin@example.com"
ADMIN_USERNAME="admin"
ADMIN_PASSWORD="secure_password"
```
## Environment-Specific Configurations
### Development
```env
ENVIRONMENT="local"
DEBUG=true
POSTGRES_SERVER="localhost"
REDIS_CACHE_HOST="localhost"
ACCESS_TOKEN_EXPIRE_MINUTES=60 # Longer for development
```
### Staging
```env
ENVIRONMENT="staging"
DEBUG=false
POSTGRES_SERVER="staging-db.example.com"
REDIS_CACHE_HOST="staging-redis.example.com"
ACCESS_TOKEN_EXPIRE_MINUTES=30
```
### Production
```env
ENVIRONMENT="production"
DEBUG=false
POSTGRES_SERVER="prod-db.example.com"
REDIS_CACHE_HOST="prod-redis.example.com"
ACCESS_TOKEN_EXPIRE_MINUTES=15
# Use custom ports for security
POSTGRES_PORT=5433
REDIS_CACHE_PORT=6380
```
## Docker Configuration
### Basic Setup
Docker Compose automatically loads your `.env` file:
```yaml
services:
  web:
    env_file:
      - ./src/.env
    environment:
      - DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER}:${POSTGRES_PASSWORD}@db:5432/${POSTGRES_DB}
```
### Service Overview
```yaml
services:
  web:     # FastAPI application
  db:      # PostgreSQL database
  redis:   # Redis for caching/queues
  worker:  # Background task worker
```
## Common Configuration Patterns
### Feature Flags
```python
# In settings class
class FeatureSettings(BaseSettings):
    ENABLE_CACHING: bool = True
    ENABLE_ANALYTICS: bool = False
    ENABLE_BACKGROUND_JOBS: bool = True

# Use in code
if settings.ENABLE_CACHING:
    cache_result = await get_from_cache(key)
```
### Environment Detection
```python
@app.get("/docs", include_in_schema=False)
async def custom_swagger_ui():
if settings.ENVIRONMENT == "production":
raise HTTPException(404, "Documentation not available")
return get_swagger_ui_html(openapi_url="/openapi.json")
```
### Health Checks
```python
@app.get("/health")
async def health_check():
    return {
        "status": "healthy",
        "environment": settings.ENVIRONMENT,
        "version": settings.APP_VERSION,
        "database": await check_database_health(),
        "redis": await check_redis_health(),
    }
```
## Quick Configuration Tasks
### Generate Secret Key
```bash
# Generate a secure secret key
openssl rand -hex 32
```
### Test Configuration
```python
# test_config.py
from app.core.config import settings
print(f"App: {settings.APP_NAME}")
print(f"Environment: {settings.ENVIRONMENT}")
print(f"Database: {settings.POSTGRES_DB}")
```
### Environment File Templates
```bash
# Development
cp src/.env.example src/.env.development
# Staging
cp src/.env.example src/.env.staging
# Production
cp src/.env.example src/.env.production
```
## Best Practices
### Security
- Never commit `.env` files to version control
- Use different secret keys for each environment
- Disable debug mode in production
- Use secure passwords and keys
### Performance
- Configure appropriate connection pool sizes
- Set reasonable token expiration times
- Use Redis for caching in production
- Configure proper rate limits
### Maintenance
- Document all custom environment variables
- Use validation in settings classes
- Test configurations in staging first
- Monitor configuration changes
### Testing
- Use separate test environment variables
- Mock external services in tests
- Validate configuration on startup
- Test with different environment combinations
## Getting Started
Follow this path to configure your application:
### 1. **[Environment Variables](environment-variables.md)** - Start here
Learn about all available environment variables, their purposes, and recommended values for different environments.
### 2. **[Settings Classes](settings-classes.md)** - Validation layer
Understand how Python settings classes validate and structure your configuration with type hints and validation rules.
### 3. **[Docker Setup](docker-setup.md)** - Container configuration
Configure Docker Compose services, networking, and environment-specific overrides.
### 4. **[Environment-Specific](environment-specific.md)** - Deployment configs
Set up configuration for development, staging, and production environments with best practices.
## What's Next
Each guide provides practical examples and copy-paste configurations:
1. **[Environment Variables](environment-variables.md)** - Complete reference and examples
2. **[Settings Classes](settings-classes.md)** - Custom validation and organization
3. **[Docker Setup](docker-setup.md)** - Service configuration and overrides
4. **[Environment-Specific](environment-specific.md)** - Production-ready configurations
The boilerplate provides sensible defaults - just customize what you need!

# Settings Classes
Learn how Python settings classes validate, structure, and organize your application configuration. The boilerplate uses Pydantic's `BaseSettings` for type-safe configuration management.
## Settings Architecture
The main `Settings` class inherits from multiple specialized setting groups:
```python
# src/app/core/config.py
class Settings(
AppSettings,
PostgresSettings,
CryptSettings,
FirstUserSettings,
RedisCacheSettings,
ClientSideCacheSettings,
RedisQueueSettings,
RedisRateLimiterSettings,
DefaultRateLimitSettings,
EnvironmentSettings,
):
pass
# Single instance used throughout the app
settings = Settings()
```
## Built-in Settings Groups
### Application Settings
Basic app metadata and configuration:
```python
class AppSettings(BaseSettings):
APP_NAME: str = "FastAPI"
APP_DESCRIPTION: str = "A FastAPI project"
APP_VERSION: str = "0.1.0"
CONTACT_NAME: str = "Your Name"
CONTACT_EMAIL: str = "your.email@example.com"
LICENSE_NAME: str = "MIT"
```
### Database Settings
PostgreSQL connection configuration:
```python
class PostgresSettings(BaseSettings):
POSTGRES_USER: str
POSTGRES_PASSWORD: str
POSTGRES_SERVER: str = "localhost"
POSTGRES_PORT: int = 5432
POSTGRES_DB: str
@computed_field
@property
def DATABASE_URL(self) -> str:
return (
f"postgresql+asyncpg://{self.POSTGRES_USER}:"
f"{self.POSTGRES_PASSWORD}@{self.POSTGRES_SERVER}:"
f"{self.POSTGRES_PORT}/{self.POSTGRES_DB}"
)
```
### Security Settings
JWT and authentication configuration:
```python
class CryptSettings(BaseSettings):
SECRET_KEY: str
ALGORITHM: str = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES: int = 30
REFRESH_TOKEN_EXPIRE_DAYS: int = 7
@field_validator("SECRET_KEY")
@classmethod
def validate_secret_key(cls, v: str) -> str:
if len(v) < 32:
raise ValueError("SECRET_KEY must be at least 32 characters")
return v
```
### Redis Settings
Separate Redis instances for different services:
```python
class RedisCacheSettings(BaseSettings):
REDIS_CACHE_HOST: str = "localhost"
REDIS_CACHE_PORT: int = 6379
class RedisQueueSettings(BaseSettings):
REDIS_QUEUE_HOST: str = "localhost"
REDIS_QUEUE_PORT: int = 6379
class RedisRateLimiterSettings(BaseSettings):
REDIS_RATE_LIMIT_HOST: str = "localhost"
REDIS_RATE_LIMIT_PORT: int = 6379
```
### Rate Limiting Settings
Default rate limiting configuration:
```python
class DefaultRateLimitSettings(BaseSettings):
DEFAULT_RATE_LIMIT_LIMIT: int = 10
DEFAULT_RATE_LIMIT_PERIOD: int = 3600 # 1 hour
```
### Admin User Settings
First superuser account creation:
```python
class FirstUserSettings(BaseSettings):
ADMIN_NAME: str = "Admin"
ADMIN_EMAIL: str
ADMIN_USERNAME: str = "admin"
ADMIN_PASSWORD: str
@field_validator("ADMIN_EMAIL")
@classmethod
def validate_admin_email(cls, v: str) -> str:
if "@" not in v:
raise ValueError("ADMIN_EMAIL must be a valid email")
return v
```
## Creating Custom Settings
### Basic Custom Settings
Add your own settings group:
```python
class CustomSettings(BaseSettings):
CUSTOM_API_KEY: str = ""
CUSTOM_TIMEOUT: int = 30
ENABLE_FEATURE_X: bool = False
MAX_UPLOAD_SIZE: int = 10485760 # 10MB
@field_validator("MAX_UPLOAD_SIZE")
@classmethod
def validate_upload_size(cls, v: int) -> int:
if v < 1024: # 1KB minimum
raise ValueError("MAX_UPLOAD_SIZE must be at least 1KB")
if v > 104857600: # 100MB maximum
raise ValueError("MAX_UPLOAD_SIZE cannot exceed 100MB")
return v
# Add to main Settings class
class Settings(
AppSettings,
PostgresSettings,
# ... other settings ...
CustomSettings, # Add your custom settings
):
pass
```
### Advanced Custom Settings
Settings with complex validation and computed fields:
```python
class EmailSettings(BaseSettings):
SMTP_HOST: str = ""
SMTP_PORT: int = 587
SMTP_USERNAME: str = ""
SMTP_PASSWORD: str = ""
SMTP_USE_TLS: bool = True
EMAIL_FROM: str = ""
EMAIL_FROM_NAME: str = ""
@computed_field
@property
def EMAIL_ENABLED(self) -> bool:
return bool(self.SMTP_HOST and self.SMTP_USERNAME)
@model_validator(mode="after")
def validate_email_config(self) -> "EmailSettings":
if self.SMTP_HOST and not self.EMAIL_FROM:
raise ValueError("EMAIL_FROM required when SMTP_HOST is set")
if self.SMTP_USERNAME and not self.SMTP_PASSWORD:
raise ValueError("SMTP_PASSWORD required when SMTP_USERNAME is set")
return self
```
### Feature Flag Settings
Organize feature toggles:
```python
class FeatureSettings(BaseSettings):
# Core features
ENABLE_CACHING: bool = True
ENABLE_RATE_LIMITING: bool = True
ENABLE_BACKGROUND_JOBS: bool = True
# Optional features
ENABLE_ANALYTICS: bool = False
ENABLE_EMAIL_NOTIFICATIONS: bool = False
ENABLE_FILE_UPLOADS: bool = False
# Experimental features
ENABLE_EXPERIMENTAL_API: bool = False
ENABLE_BETA_FEATURES: bool = False
@model_validator(mode="after")
def validate_feature_dependencies(self) -> "FeatureSettings":
if self.ENABLE_EMAIL_NOTIFICATIONS and not self.ENABLE_BACKGROUND_JOBS:
raise ValueError("Email notifications require background jobs")
return self
```
## Settings Validation
### Field Validation
Validate individual fields:
```python
class DatabaseSettings(BaseSettings):
DB_POOL_SIZE: int = 20
DB_MAX_OVERFLOW: int = 30
DB_TIMEOUT: int = 30
@field_validator("DB_POOL_SIZE")
@classmethod
def validate_pool_size(cls, v: int) -> int:
if v < 1:
raise ValueError("Pool size must be at least 1")
if v > 100:
raise ValueError("Pool size should not exceed 100")
return v
@field_validator("DB_TIMEOUT")
@classmethod
def validate_timeout(cls, v: int) -> int:
if v < 5:
raise ValueError("Timeout must be at least 5 seconds")
return v
```
### Model Validation
Validate across multiple fields:
```python
class SecuritySettings(BaseSettings):
ENABLE_HTTPS: bool = False
SSL_CERT_PATH: str = ""
SSL_KEY_PATH: str = ""
FORCE_SSL: bool = False
@model_validator(mode="after")
def validate_ssl_config(self) -> "SecuritySettings":
if self.ENABLE_HTTPS:
if not self.SSL_CERT_PATH:
raise ValueError("SSL_CERT_PATH required when HTTPS enabled")
if not self.SSL_KEY_PATH:
raise ValueError("SSL_KEY_PATH required when HTTPS enabled")
if self.FORCE_SSL and not self.ENABLE_HTTPS:
raise ValueError("Cannot force SSL without enabling HTTPS")
return self
```
### Environment-Specific Validation
Different validation rules per environment:
```python
class EnvironmentSettings(BaseSettings):
ENVIRONMENT: str = "local"
DEBUG: bool = True
@model_validator(mode="after")
def validate_environment_config(self) -> "EnvironmentSettings":
if self.ENVIRONMENT == "production":
if self.DEBUG:
raise ValueError("DEBUG must be False in production")
if self.ENVIRONMENT not in ["local", "staging", "production"]:
raise ValueError("ENVIRONMENT must be local, staging, or production")
return self
```
## Computed Properties
### Dynamic Configuration
Create computed values from other settings:
```python
class StorageSettings(BaseSettings):
STORAGE_TYPE: str = "local" # local, s3, gcs
# Local storage
LOCAL_STORAGE_PATH: str = "./uploads"
# S3 settings
AWS_ACCESS_KEY_ID: str = ""
AWS_SECRET_ACCESS_KEY: str = ""
AWS_BUCKET_NAME: str = ""
AWS_REGION: str = "us-east-1"
@computed_field
@property
def STORAGE_ENABLED(self) -> bool:
if self.STORAGE_TYPE == "local":
return bool(self.LOCAL_STORAGE_PATH)
elif self.STORAGE_TYPE == "s3":
return bool(self.AWS_ACCESS_KEY_ID and self.AWS_SECRET_ACCESS_KEY and self.AWS_BUCKET_NAME)
return False
@computed_field
@property
def STORAGE_CONFIG(self) -> dict:
if self.STORAGE_TYPE == "local":
return {"path": self.LOCAL_STORAGE_PATH}
elif self.STORAGE_TYPE == "s3":
return {
"bucket": self.AWS_BUCKET_NAME,
"region": self.AWS_REGION,
"credentials": {
"access_key": self.AWS_ACCESS_KEY_ID,
"secret_key": self.AWS_SECRET_ACCESS_KEY,
}
}
return {}
```
## Organizing Settings
### Service-Based Organization
Group settings by service or domain:
```python
# Authentication service settings
class AuthSettings(BaseSettings):
JWT_SECRET_KEY: str
JWT_ALGORITHM: str = "HS256"
ACCESS_TOKEN_EXPIRE: int = 30
REFRESH_TOKEN_EXPIRE: int = 7200
PASSWORD_MIN_LENGTH: int = 8
# Notification service settings
class NotificationSettings(BaseSettings):
EMAIL_ENABLED: bool = False
SMS_ENABLED: bool = False
PUSH_ENABLED: bool = False
# Email settings
SMTP_HOST: str = ""
SMTP_PORT: int = 587
# SMS settings (example with Twilio)
TWILIO_ACCOUNT_SID: str = ""
TWILIO_AUTH_TOKEN: str = ""
# Main settings
class Settings(
AppSettings,
AuthSettings,
NotificationSettings,
# ... other settings
):
pass
```
### Conditional Settings Loading
Load different settings based on environment:
```python
class BaseAppSettings(BaseSettings):
APP_NAME: str = "FastAPI App"
DEBUG: bool = False
class DevelopmentSettings(BaseAppSettings):
DEBUG: bool = True
LOG_LEVEL: str = "DEBUG"
DATABASE_ECHO: bool = True
class ProductionSettings(BaseAppSettings):
DEBUG: bool = False
LOG_LEVEL: str = "WARNING"
DATABASE_ECHO: bool = False
def get_settings() -> BaseAppSettings:
environment = os.getenv("ENVIRONMENT", "local")
if environment == "production":
return ProductionSettings()
else:
return DevelopmentSettings()
settings = get_settings()
```
## Removing Unused Services
### Minimal Configuration
Remove services you don't need:
```python
# Minimal setup without Redis services
class MinimalSettings(
AppSettings,
PostgresSettings,
CryptSettings,
FirstUserSettings,
# Removed: RedisCacheSettings
# Removed: RedisQueueSettings
# Removed: RedisRateLimiterSettings
EnvironmentSettings,
):
pass
```
### Service Feature Flags
Use feature flags to conditionally enable services:
```python
class ServiceSettings(BaseSettings):
ENABLE_REDIS: bool = True
ENABLE_CELERY: bool = True
ENABLE_MONITORING: bool = False
def build_settings() -> BaseSettings:
    # Compose the Settings class only from the services that are enabled.
    # (Swapping self.__class__ after __init__ would not work: Pydantic
    # reads the environment during __init__, so late-added fields would
    # never be loaded.)
    bases: list[type] = [AppSettings, PostgresSettings, CryptSettings, ServiceSettings]
    if ServiceSettings().ENABLE_REDIS:
        bases.append(RedisCacheSettings)
    return type("ConditionalSettings", tuple(bases), {})()
settings = build_settings()
```
## Testing Settings
### Test Configuration
Create separate settings for testing:
```python
class TestSettings(BaseSettings):
# Override database for testing
POSTGRES_DB: str = "test_database"
# Disable external services
ENABLE_REDIS: bool = False
ENABLE_EMAIL: bool = False
# Speed up tests
ACCESS_TOKEN_EXPIRE_MINUTES: int = 5
# Test-specific settings
TEST_USER_EMAIL: str = "test@example.com"
TEST_USER_PASSWORD: str = "testpassword123"
# Use in tests
@pytest.fixture
def test_settings():
return TestSettings()
```
### Settings Validation Testing
Test your custom settings:
```python
def test_custom_settings_validation():
# Test valid configuration
settings = CustomSettings(
CUSTOM_API_KEY="test-key",
CUSTOM_TIMEOUT=60,
MAX_UPLOAD_SIZE=5242880 # 5MB
)
assert settings.CUSTOM_TIMEOUT == 60
# Test validation error
with pytest.raises(ValueError, match="MAX_UPLOAD_SIZE cannot exceed 100MB"):
CustomSettings(MAX_UPLOAD_SIZE=209715200) # 200MB
def test_settings_computed_fields():
settings = StorageSettings(
STORAGE_TYPE="s3",
AWS_ACCESS_KEY_ID="test-key",
AWS_SECRET_ACCESS_KEY="test-secret",
AWS_BUCKET_NAME="test-bucket"
)
assert settings.STORAGE_ENABLED is True
assert settings.STORAGE_CONFIG["bucket"] == "test-bucket"
```
## Best Practices
### Organization
- Group related settings in dedicated classes
- Use descriptive names for settings groups
- Keep validation logic close to the settings
- Document complex validation rules
### Security
- Validate sensitive settings like secret keys
- Never set default values for secrets in production
- Use computed fields to derive connection strings
- Separate test and production configurations
### Performance
- Use `@computed_field` for expensive calculations
- Cache settings instances appropriately
- Avoid complex validation in hot paths
- Use model validators for cross-field validation
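One common way to cache a settings instance is `functools.lru_cache` around a factory function; a sketch using a plain class as a stand-in for the Pydantic `BaseSettings` subclass:

```python
from functools import lru_cache

class Settings:  # stand-in for the pydantic BaseSettings subclass
    APP_NAME = "FastAPI"

@lru_cache
def get_settings() -> Settings:
    # Constructed once; every later call returns the same instance,
    # so environment files are only parsed a single time.
    return Settings()

assert get_settings() is get_settings()
```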
### Testing
- Create separate test settings classes
- Test all validation rules
- Mock external service settings in tests
- Use dependency injection for settings in tests
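The dependency-injection point can be sketched without any framework; the names below are illustrative, not the boilerplate's exact API:

```python
class ProdSettings:
    POSTGRES_DB = "app"

class TestSettings:
    POSTGRES_DB = "test_database"

def make_db_url(settings) -> str:
    # Code depends on whatever settings object it is handed,
    # so tests can inject TestSettings without patching globals.
    return f"postgresql://localhost/{settings.POSTGRES_DB}"

assert make_db_url(TestSettings()).endswith("/test_database")
```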
The settings system provides type safety, validation, and organization for your application configuration. Start with the built-in settings and extend them as your application grows!

# CRUD Operations
This guide covers all CRUD (Create, Read, Update, Delete) operations available in the FastAPI Boilerplate using FastCRUD, a powerful library that provides consistent and efficient database operations.
## Overview
The boilerplate uses [FastCRUD](https://github.com/igorbenav/fastcrud) for all database operations. FastCRUD provides:
- **Consistent API** across all models
- **Type safety** with generic type parameters
- **Automatic pagination** support
- **Advanced filtering** and joining capabilities
- **Soft delete** support
- **Optimized queries** with selective field loading
## CRUD Class Structure
Each model has a corresponding CRUD class that defines the available operations:
```python
# src/app/crud/crud_users.py
from fastcrud import FastCRUD
from app.models.user import User
from app.schemas.user import (
UserCreateInternal, UserUpdate, UserUpdateInternal,
UserDelete, UserRead
)
CRUDUser = FastCRUD[
User, # Model class
UserCreateInternal, # Create schema
UserUpdate, # Update schema
UserUpdateInternal, # Internal update schema
UserDelete, # Delete schema
UserRead # Read schema
]
crud_users = CRUDUser(User)
```
## Read Operations
### Get Single Record
Retrieve a single record by any field:
```python
# Get user by ID
user = await crud_users.get(db=db, id=user_id)
# Get user by username
user = await crud_users.get(db=db, username="john_doe")
# Get user by email
user = await crud_users.get(db=db, email="john@example.com")
# Get with specific fields only
user = await crud_users.get(
db=db,
schema_to_select=UserRead, # Only select fields defined in UserRead
id=user_id,
)
```
**Real usage from the codebase:**
```python
# From src/app/api/v1/users.py
db_user = await crud_users.get(
db=db,
schema_to_select=UserRead,
username=username,
is_deleted=False,
)
```
### Get Multiple Records
Retrieve multiple records with filtering and pagination:
```python
# Get all users
users = await crud_users.get_multi(db=db)
# Get with pagination
users = await crud_users.get_multi(
db=db,
    offset=0,  # Number of records to skip
    limit=10,  # Maximum number of records to return
)
# Get with filtering
active_users = await crud_users.get_multi(
db=db,
is_deleted=False, # Filter condition
offset=compute_offset(page, items_per_page),
limit=items_per_page
)
```
**Pagination response structure:**
```python
{
"data": [
{"id": 1, "username": "john", "email": "john@example.com"},
{"id": 2, "username": "jane", "email": "jane@example.com"}
],
"total_count": 25,
    "has_more": True,
"page": 1,
"items_per_page": 10
}
```
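The `offset` and `has_more` values follow from simple arithmetic; a stand-in sketch of the helpers (`fastcrud.paginated` provides the real `compute_offset`):

```python
def compute_offset(page: int, items_per_page: int) -> int:
    # Page numbers are 1-based: page 1 -> offset 0.
    return (page - 1) * items_per_page

def has_more(page: int, items_per_page: int, total_count: int) -> bool:
    return page * items_per_page < total_count

# Values matching the sample response above
assert compute_offset(1, 10) == 0
assert has_more(1, 10, 25) is True
```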
### Check Existence
Check if a record exists without fetching it:
```python
# Check if user exists
user_exists = await crud_users.exists(db=db, email="john@example.com")
# Returns True or False
# Check if username is already taken
username_taken = await crud_users.exists(db=db, username="john_doe")
```
**Real usage example:**
```python
# From src/app/api/v1/users.py - checking before creating
email_row = await crud_users.exists(db=db, email=user.email)
if email_row:
raise DuplicateValueException("Email is already registered")
```
### Count Records
Get count of records matching criteria:
```python
# Count all users
total_users = await crud_users.count(db=db)
# Count active users
active_count = await crud_users.count(db=db, is_deleted=False)
# Count by specific criteria
admin_count = await crud_users.count(db=db, is_superuser=True)
```
## Create Operations
### Basic Creation
Create new records using Pydantic schemas:
```python
# Create user
user_data = UserCreateInternal(
username="john_doe",
email="john@example.com",
hashed_password="hashed_password_here"
)
created_user = await crud_users.create(db=db, object=user_data)
```
**Real creation example:**
```python
# From src/app/api/v1/users.py
user_internal_dict = user.model_dump()
user_internal_dict["hashed_password"] = get_password_hash(password=user_internal_dict["password"])
del user_internal_dict["password"]
user_internal = UserCreateInternal(**user_internal_dict)
created_user = await crud_users.create(db=db, object=user_internal)
```
### Create with Relationships
When creating records with foreign keys:
```python
# Create post for a user
post_data = PostCreateInternal(
title="My First Post",
content="This is the content of my post",
created_by_user_id=user.id # Foreign key reference
)
created_post = await crud_posts.create(db=db, object=post_data)
```
## Update Operations
### Basic Updates
Update records by any field:
```python
# Update user by ID
update_data = UserUpdate(email="newemail@example.com")
await crud_users.update(db=db, object=update_data, id=user_id)
# Update by username
await crud_users.update(db=db, object=update_data, username="john_doe")
# Update multiple fields
update_data = UserUpdate(
email="newemail@example.com",
profile_image_url="https://newimage.com/photo.jpg"
)
await crud_users.update(db=db, object=update_data, id=user_id)
```
### Conditional Updates
Update with validation:
```python
# From real endpoint - check before updating
if values.username != db_user.username:
existing_username = await crud_users.exists(db=db, username=values.username)
if existing_username:
raise DuplicateValueException("Username not available")
await crud_users.update(db=db, object=values, username=username)
```
### Bulk Updates
Update multiple records at once:
```python
# Update all users with specific criteria
update_data = {"is_active": False}
await crud_users.update(db=db, object=update_data, is_deleted=True)
```
## Delete Operations
### Soft Delete
For models with soft delete fields (like User, Post):
```python
# Soft delete - sets is_deleted=True, deleted_at=now()
await crud_users.delete(db=db, username="john_doe")
# The record stays in the database but is marked as deleted
user = await crud_users.get(db=db, username="john_doe", is_deleted=True)
```
### Hard Delete
Permanently remove records from the database:
```python
# Permanently delete from database
await crud_users.db_delete(db=db, username="john_doe")
# The record is completely removed
```
**Real deletion example:**
```python
# From src/app/api/v1/users.py
# Regular users get soft delete
await crud_users.delete(db=db, username=username)
# Superusers can hard delete
await crud_users.db_delete(db=db, username=username)
```
## Advanced Operations
### Joined Queries
Get data from multiple related tables:
```python
# Get posts with user information
posts_with_users = await crud_posts.get_multi_joined(
db=db,
join_model=User,
join_on=Post.created_by_user_id == User.id,
schema_to_select=PostRead,
join_schema_to_select=UserRead,
join_prefix="user_"
)
```
Result structure:
```python
{
"id": 1,
"title": "My Post",
"content": "Post content",
"user_id": 123,
"user_username": "john_doe",
"user_email": "john@example.com"
}
```
### Custom Filtering
Advanced filtering with FastCRUD's double-underscore filter operators:
```python
from datetime import datetime
# Operators such as __gt, __lte, and __in translate to SQL comparisons
users = await crud_users.get_multi(
    db=db,
    is_deleted=False,
    created_at__gt=datetime(2024, 1, 1),
)
```
### Optimized Field Selection
Select only needed fields for better performance:
```python
# Only select id and username
users = await crud_users.get_multi(
db=db,
schema_to_select=UserRead, # Use schema to define fields
limit=100
)
# Or specify fields directly
users = await crud_users.get_multi(
db=db,
schema_to_select=["id", "username", "email"],
limit=100
)
```
## Practical Examples
### Complete CRUD Workflow
Here's a complete example showing all CRUD operations:
```python
from sqlalchemy.ext.asyncio import AsyncSession
from app.crud.crud_users import crud_users
from app.schemas.user import UserCreateInternal, UserUpdate, UserRead
async def user_management_example(db: AsyncSession):
# 1. CREATE
user_data = UserCreateInternal(
username="demo_user",
email="demo@example.com",
hashed_password="hashed_password"
)
new_user = await crud_users.create(db=db, object=user_data)
print(f"Created user: {new_user.id}")
# 2. READ
user = await crud_users.get(
db=db,
id=new_user.id,
schema_to_select=UserRead
)
print(f"Retrieved user: {user.username}")
# 3. UPDATE
update_data = UserUpdate(email="updated@example.com")
await crud_users.update(db=db, object=update_data, id=new_user.id)
print("User updated")
# 4. DELETE (soft delete)
await crud_users.delete(db=db, id=new_user.id)
print("User soft deleted")
# 5. VERIFY DELETION
deleted_user = await crud_users.get(db=db, id=new_user.id, is_deleted=True)
print(f"User deleted at: {deleted_user.deleted_at}")
```
### Pagination Helper
Using FastCRUD's pagination utilities:
```python
from fastcrud.paginated import compute_offset, paginated_response
async def get_paginated_users(
db: AsyncSession,
page: int = 1,
items_per_page: int = 10
):
users_data = await crud_users.get_multi(
db=db,
offset=compute_offset(page, items_per_page),
limit=items_per_page,
is_deleted=False,
schema_to_select=UserRead
)
return paginated_response(
crud_data=users_data,
page=page,
items_per_page=items_per_page
)
```
### Error Handling
Proper error handling with CRUD operations:
```python
from app.core.exceptions.http_exceptions import NotFoundException, DuplicateValueException
async def safe_user_creation(db: AsyncSession, user_data: UserCreate):
# Check for duplicates
if await crud_users.exists(db=db, email=user_data.email):
raise DuplicateValueException("Email already registered")
if await crud_users.exists(db=db, username=user_data.username):
raise DuplicateValueException("Username not available")
# Create user
try:
user_internal = UserCreateInternal(**user_data.model_dump())
created_user = await crud_users.create(db=db, object=user_internal)
return created_user
except Exception as e:
# Handle database errors
await db.rollback()
raise e
```
## Performance Tips
### 1. Use Schema Selection
Always specify `schema_to_select` to avoid loading unnecessary data:
```python
# Good - only loads needed fields
user = await crud_users.get(db=db, id=user_id, schema_to_select=UserRead)
# Avoid - loads all fields
user = await crud_users.get(db=db, id=user_id)
```
### 2. Batch Operations
For multiple operations, use transactions:
```python
async def batch_user_updates(db: AsyncSession, updates: list[dict]):
try:
for update in updates:
await crud_users.update(db=db, object=update["data"], id=update["id"])
await db.commit()
except Exception:
await db.rollback()
raise
```
### 3. Use Exists for Checks
Use `exists()` instead of `get()` when you only need to check existence:
```python
# Good - faster, doesn't load data
if await crud_users.exists(db=db, email=email):
raise DuplicateValueException("Email taken")
# Avoid - slower, loads unnecessary data
user = await crud_users.get(db=db, email=email)
if user:
raise DuplicateValueException("Email taken")
```
## Next Steps
- **[Database Migrations](migrations.md)** - Managing database schema changes
- **[API Development](../api/index.md)** - Using CRUD in API endpoints
- **[Caching](../caching/index.md)** - Optimizing CRUD with caching

# Database Layer
Learn how to work with the database layer in the FastAPI Boilerplate. This section covers everything you need to store and retrieve data effectively.
## What You'll Learn
- **[Models](models.md)** - Define database tables with SQLAlchemy models
- **[Schemas](schemas.md)** - Validate and serialize data with Pydantic schemas
- **[CRUD Operations](crud.md)** - Perform database operations with FastCRUD
- **[Migrations](migrations.md)** - Manage database schema changes with Alembic
## Quick Overview
The boilerplate uses a layered architecture that separates concerns:
```python
# API Endpoint
@router.post("/", response_model=UserRead)
async def create_user(user_data: UserCreate, db: AsyncSession):
return await crud_users.create(db=db, object=user_data)
# The layers work together:
# 1. UserCreate schema validates the input
# 2. crud_users handles the database operation
# 3. User model defines the database table
# 4. UserRead schema formats the response
```
## Architecture
The database layer follows a clear separation:
```
API Request
    ↓
Pydantic Schema (validation & serialization)
    ↓
CRUD Layer (business logic & database operations)
    ↓
SQLAlchemy Model (database table definition)
    ↓
PostgreSQL Database
```
## Key Features
### 🗄️ **SQLAlchemy 2.0 Models**
Modern async SQLAlchemy with type hints:
```python
class User(Base):
__tablename__ = "user"
id: Mapped[int] = mapped_column(primary_key=True)
username: Mapped[str] = mapped_column(String(50), unique=True)
email: Mapped[str] = mapped_column(String(100), unique=True)
created_at: Mapped[datetime] = mapped_column(default=datetime.utcnow)
```
### ✅ **Pydantic Schemas**
Automatic validation and serialization:
```python
class UserCreate(BaseModel):
username: str = Field(min_length=2, max_length=50)
email: EmailStr
password: str = Field(min_length=8)
class UserRead(BaseModel):
id: int
username: str
email: str
created_at: datetime
# Note: no password field in read schema
```
### 🔧 **FastCRUD Operations**
Consistent database operations:
```python
# Create
user = await crud_users.create(db=db, object=user_create)
# Read
user = await crud_users.get(db=db, id=user_id)
users = await crud_users.get_multi(db=db, offset=0, limit=10)
# Update
user = await crud_users.update(db=db, object=user_update, id=user_id)
# Delete (soft delete)
await crud_users.delete(db=db, id=user_id)
```
### 🔄 **Database Migrations**
Track schema changes with Alembic:
```bash
# Generate migration
alembic revision --autogenerate -m "Add user table"
# Apply migrations
alembic upgrade head
# Rollback if needed
alembic downgrade -1
```
## Database Setup
The boilerplate is configured for PostgreSQL with async support:
### Environment Configuration
```bash
# .env file
POSTGRES_USER=your_user
POSTGRES_PASSWORD=your_password
POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
POSTGRES_DB=your_database
```
### Connection Management
```python
# Database session dependency
async def async_get_db() -> AsyncIterator[AsyncSession]:
async with async_session_maker() as session:
yield session
# Use in endpoints
@router.get("/users/")
async def get_users(db: Annotated[AsyncSession, Depends(async_get_db)]):
return await crud_users.get_multi(db=db)
```
## Included Models
The boilerplate includes four example models:
### **User Model** - Authentication & user management
- Username, email, password (hashed)
- Soft delete support
- Tier-based access control
### **Post Model** - Content with user relationships
- Title, content, creation metadata
- Foreign key to user (no SQLAlchemy relationships)
- Soft delete built-in
### **Tier Model** - User subscription levels
- Name-based tiers (free, premium, etc.)
- Links to rate limiting system
### **Rate Limit Model** - API access control
- Path-specific rate limits per tier
- Configurable limits and time periods
## Directory Structure
```text
src/app/
├── models/ # SQLAlchemy models (database tables)
│ ├── __init__.py
│ ├── user.py # User table definition
│ ├── post.py # Post table definition
│ └── ...
├── schemas/ # Pydantic schemas (validation)
│ ├── __init__.py
│ ├── user.py # User validation schemas
│ ├── post.py # Post validation schemas
│ └── ...
├── crud/ # Database operations
│ ├── __init__.py
│ ├── crud_users.py # User CRUD operations
│ ├── crud_posts.py # Post CRUD operations
│ └── ...
└── core/db/ # Database configuration
├── database.py # Connection and session setup
└── models.py # Base classes and mixins
```
## Common Patterns
### Create with Validation
```python
@router.post("/users/", response_model=UserRead)
async def create_user(
user_data: UserCreate, # Validates input automatically
db: Annotated[AsyncSession, Depends(async_get_db)]
):
# Check for duplicates
if await crud_users.exists(db=db, email=user_data.email):
raise DuplicateValueException("Email already exists")
# Create user (password gets hashed automatically)
return await crud_users.create(db=db, object=user_data)
```
### Query with Filters
```python
# Get active users only
users = await crud_users.get_multi(
db=db,
is_active=True,
is_deleted=False,
offset=0,
limit=10
)
# Search users
users = await crud_users.get_multi(
db=db,
    username__ilike="%john%",  # Case-insensitive match containing "john"
schema_to_select=UserRead
)
```
### Soft Delete Pattern
```python
# Soft delete (sets is_deleted=True)
await crud_users.delete(db=db, id=user_id)
# Hard delete (actually removes from database)
await crud_users.db_delete(db=db, id=user_id)
# Get only non-deleted records
users = await crud_users.get_multi(db=db, is_deleted=False)
```
## What's Next
Each guide builds on the previous one with practical examples:
1. **[Models](models.md)** - Define your database structure
2. **[Schemas](schemas.md)** - Add validation and serialization
3. **[CRUD Operations](crud.md)** - Implement business logic
4. **[Migrations](migrations.md)** - Deploy changes safely
The boilerplate provides a solid foundation - just follow these patterns to build your data layer!

# Database Migrations
This guide covers database migrations using Alembic, the migration tool for SQLAlchemy. Learn how to manage database schema changes safely and efficiently in development and production.
## Overview
The FastAPI Boilerplate uses [Alembic](https://alembic.sqlalchemy.org/) for database migrations. Alembic provides:
- **Version-controlled schema changes** - Track every database modification
- **Automatic migration generation** - Generate migrations from model changes
- **Reversible migrations** - Upgrade and downgrade database versions
- **Environment-specific configurations** - Different settings for dev/staging/production
- **Safe schema evolution** - Apply changes incrementally
## Simple Setup: Automatic Table Creation
For simple projects or development, the boilerplate includes `create_tables_on_start` parameter that automatically creates all tables on application startup:
```python
# This is enabled by default in create_application()
app = create_application(
router=router,
settings=settings,
create_tables_on_start=True # Default: True
)
```
**When to use:**
- ✅ **Development** - Quick setup without migration management
- ✅ **Simple projects** - When you don't need migration history
- ✅ **Prototyping** - Fast iteration without migration complexity
- ✅ **Testing** - Clean database state for each test run
**When NOT to use:**
- ❌ **Production** - No migration history or rollback capability
- ❌ **Team development** - Can't track schema changes between developers
- ❌ **Data migrations** - Only handles schema, not data transformations
- ❌ **Complex deployments** - No control over when/how schema changes apply
```python
# Disable for production environments
app = create_application(
router=router,
settings=settings,
create_tables_on_start=False # Use migrations instead
)
```
For production deployments and team development, use proper Alembic migrations as described below.
## Configuration
### Alembic Setup
Alembic is configured in `src/alembic.ini`:
```ini
[alembic]
# Path to migration files
script_location = migrations
# Database URL with environment variable substitution
sqlalchemy.url = postgresql://%(POSTGRES_USER)s:%(POSTGRES_PASSWORD)s@%(POSTGRES_SERVER)s:%(POSTGRES_PORT)s/%(POSTGRES_DB)s
# Other configurations
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d_%%(rev)s_%%(slug)s
timezone = UTC
```
### Environment Configuration
Migration environment is configured in `src/migrations/env.py`:
```python
# src/migrations/env.py
from alembic import context
from sqlalchemy import engine_from_config, pool
from app.core.db.database import Base
from app.core.config import settings
# Import all models to ensure they're registered
from app.models import * # This imports all models
config = context.config
# Override database URL from environment
config.set_main_option("sqlalchemy.url", settings.DATABASE_URL)
target_metadata = Base.metadata
```
## Migration Workflow
### 1. Creating Migrations
Generate migrations automatically when you change models:
```bash
# Navigate to src directory
cd src
# Generate migration from model changes
uv run alembic revision --autogenerate -m "Add user profile fields"
```
**What happens:**
- Alembic compares current models with database schema
- Generates a new migration file in `src/migrations/versions/`
- Migration includes upgrade and downgrade functions
### 2. Review Generated Migration
Always review auto-generated migrations before applying:
```python
# Example migration file: src/migrations/versions/20241215_1430_add_user_profile_fields.py
"""Add user profile fields
Revision ID: abc123def456
Revises: previous_revision_id
Create Date: 2024-12-15 14:30:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers
revision = 'abc123def456'
down_revision = 'previous_revision_id'
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add new columns
op.add_column('user', sa.Column('bio', sa.String(500), nullable=True))
op.add_column('user', sa.Column('website', sa.String(255), nullable=True))
# Create index
op.create_index('ix_user_website', 'user', ['website'])
def downgrade() -> None:
# Remove changes (reverse order)
op.drop_index('ix_user_website', 'user')
op.drop_column('user', 'website')
op.drop_column('user', 'bio')
```
### 3. Apply Migration
Apply migrations to update database schema:
```bash
# Apply all pending migrations
uv run alembic upgrade head
# Apply specific number of migrations
uv run alembic upgrade +2
# Apply to specific revision
uv run alembic upgrade abc123def456
```
### 4. Verify Migration
Check migration status and current version:
```bash
# Show current database version
uv run alembic current
# Show migration history
uv run alembic history
# Show pending migrations
uv run alembic show head
```
## Common Migration Scenarios
### Adding New Model
1. **Create the model** in `src/app/models/`:
```python
# src/app/models/category.py
from datetime import UTC, datetime

from sqlalchemy import DateTime, String
from sqlalchemy.orm import Mapped, mapped_column

from app.core.db.database import Base

class Category(Base):
    __tablename__ = "category"

    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True, init=False)
    name: Mapped[str] = mapped_column(String(50), unique=True, nullable=False)
    slug: Mapped[str] = mapped_column(String(50), unique=True, nullable=False)
    description: Mapped[str | None] = mapped_column(String(255), nullable=True)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=lambda: datetime.now(UTC))
```
2. **Import in __init__.py**:
```python
# src/app/models/__init__.py
from .user import User
from .post import Post
from .tier import Tier
from .rate_limit import RateLimit
from .category import Category # Add new import
```
3. **Generate migration**:
```bash
uv run alembic revision --autogenerate -m "Add category model"
```
### Adding Foreign Key
1. **Update model with foreign key**:
```python
# Add to Post model
category_id: Mapped[Optional[int]] = mapped_column(ForeignKey("category.id"), nullable=True)
```
2. **Generate migration**:
```bash
uv run alembic revision --autogenerate -m "Add category_id to posts"
```
3. **Review and apply**:
```python
# Generated migration will include:
def upgrade() -> None:
op.add_column('post', sa.Column('category_id', sa.Integer(), nullable=True))
op.create_foreign_key('fk_post_category_id', 'post', 'category', ['category_id'], ['id'])
op.create_index('ix_post_category_id', 'post', ['category_id'])
```
### Data Migrations
Sometimes you need to migrate data, not just schema:
```python
# Example: Populate default category for existing posts
def upgrade() -> None:
    # Add the column
    op.add_column('post', sa.Column('category_id', sa.Integer(), nullable=True))

    # Data migration (SQLAlchemy 2.0 requires text() for raw SQL)
    connection = op.get_bind()

    # Create default category
    connection.execute(
        sa.text("INSERT INTO category (name, slug, description) VALUES ('General', 'general', 'Default category')")
    )

    # Get default category ID
    result = connection.execute(sa.text("SELECT id FROM category WHERE slug = 'general'"))
    default_category_id = result.fetchone()[0]

    # Update existing posts (bound parameter instead of string interpolation)
    connection.execute(
        sa.text("UPDATE post SET category_id = :category_id WHERE category_id IS NULL"),
        {"category_id": default_category_id},
    )

    # Make column non-nullable after data migration
    op.alter_column('post', 'category_id', nullable=False)
```
### Renaming Columns
```python
def upgrade() -> None:
# Rename column
op.alter_column('user', 'full_name', new_column_name='name')
def downgrade() -> None:
# Reverse the rename
op.alter_column('user', 'name', new_column_name='full_name')
```
### Dropping Tables
```python
def upgrade() -> None:
# Drop table (be careful!)
op.drop_table('old_table')
def downgrade() -> None:
# Recreate table structure
op.create_table('old_table',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(50), nullable=True),
sa.PrimaryKeyConstraint('id')
)
```
## Production Migration Strategy
### 1. Development Workflow
```bash
# 1. Make model changes
# 2. Generate migration
uv run alembic revision --autogenerate -m "Descriptive message"
# 3. Review migration file
# 4. Test migration
uv run alembic upgrade head
# 5. Test downgrade (optional)
uv run alembic downgrade -1
uv run alembic upgrade head
```
### 2. Staging Deployment
```bash
# 1. Deploy code with migrations
# 2. Backup database
pg_dump -h staging-db -U user dbname > backup_$(date +%Y%m%d_%H%M%S).sql
# 3. Apply migrations
uv run alembic upgrade head
# 4. Verify application works
# 5. Run tests
```
### 3. Production Deployment
```bash
# 1. Schedule maintenance window
# 2. Create database backup
pg_dump -h prod-db -U user dbname > prod_backup_$(date +%Y%m%d_%H%M%S).sql
# 3. Apply migrations (with monitoring)
uv run alembic upgrade head
# 4. Verify health checks pass
# 5. Monitor application metrics
```
## Docker Considerations
### Development with Docker Compose
For local development, migrations run automatically:
```yaml
# docker-compose.yml
services:
web:
# ... other config
depends_on:
- db
command: |
sh -c "
uv run alembic upgrade head &&
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
"
```
### Production Docker
In production, run migrations separately:
```dockerfile
# Dockerfile migration stage
FROM python:3.11-slim AS migration
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY src/ /app/
WORKDIR /app
CMD ["alembic", "upgrade", "head"]
```
```yaml
# docker-compose.prod.yml
services:
migrate:
build:
context: .
target: migration
env_file:
- .env
depends_on:
- db
command: alembic upgrade head
web:
# ... web service config
depends_on:
- migrate
```
## Migration Best Practices
### 1. Always Review Generated Migrations
```python
# Check for issues like:
# - Missing imports
# - Incorrect nullable settings
# - Missing indexes
# - Data loss operations
```
### 2. Use Descriptive Messages
```bash
# Good
uv run alembic revision --autogenerate -m "Add user email verification fields"
# Bad
uv run alembic revision --autogenerate -m "Update user model"
```
### 3. Handle Nullable Columns Carefully
```python
# When adding non-nullable columns to existing tables:
def upgrade() -> None:
# 1. Add as nullable first
op.add_column('user', sa.Column('phone', sa.String(20), nullable=True))
# 2. Populate with default data
    op.execute('UPDATE "user" SET phone = \'\' WHERE phone IS NULL')  # quote "user": reserved word in PostgreSQL
# 3. Make non-nullable
op.alter_column('user', 'phone', nullable=False)
```
### 4. Test Rollbacks
```bash
# Test that your downgrade works
uv run alembic downgrade -1
uv run alembic upgrade head
```
### 5. Use Transactions for Complex Migrations
```python
def upgrade() -> None:
    # Note: on PostgreSQL, Alembic already runs each migration inside a
    # transaction; explicit transaction control like this is mainly useful
    # on databases or configurations where that is not the case
    connection = op.get_bind()
    trans = connection.begin()
    try:
        # Multiple operations
        op.create_table(...)
        op.add_column(...)
        connection.execute(sa.text("UPDATE ..."))
        trans.commit()
    except Exception:
        trans.rollback()
        raise
```
## Next Steps
- **[CRUD Operations](crud.md)** - Working with migrated database schema
- **[API Development](../api/index.md)** - Building endpoints for your models
- **[Testing](../testing.md)** - Testing database migrations

# Database Models
This section explains how SQLAlchemy models are implemented in the boilerplate, how to create new models, and the patterns used for relationships, validation, and data integrity.
## Model Structure
Models are defined in `src/app/models/` using SQLAlchemy 2.0's declarative syntax with `Mapped` type annotations.
### Base Model
All models inherit from `Base` defined in `src/app/core/db/database.py`:
```python
from sqlalchemy.orm import DeclarativeBase
class Base(DeclarativeBase):
pass
```
**SQLAlchemy 2.0 Change**: Uses `DeclarativeBase` instead of the older `declarative_base()` function. This provides better type checking and IDE support.
### Model File Structure
Each model is in its own file:
```text
src/app/models/
├── __init__.py # Imports all models for Alembic discovery
├── user.py # User authentication model
├── post.py # Example content model with relationships
├── tier.py # User subscription tiers
└── rate_limit.py # API rate limiting configuration
```
**Import Requirement**: Models must be imported in `__init__.py` for Alembic to detect them during migration generation.
## Design Decision: No SQLAlchemy Relationships
The boilerplate deliberately avoids using SQLAlchemy's `relationship()` feature. This is an intentional architectural choice with specific benefits.
### Why No Relationships
**Performance Concerns**:
- **N+1 Query Problem**: Relationships can trigger multiple queries when accessing related data
- **Lazy Loading**: Unpredictable when queries execute, making performance optimization difficult
- **Memory Usage**: Loading large object graphs consumes significant memory
**Code Clarity**:
- **Explicit Data Fetching**: Developers see exactly what data is being loaded and when
- **Predictable Queries**: No "magic" queries triggered by attribute access
- **Easier Debugging**: SQL queries are explicit in the code, not hidden in relationship configuration
**Flexibility**:
- **Query Optimization**: Can optimize each query for its specific use case
- **Selective Loading**: Load only the fields needed for each operation
- **Join Control**: Use FastCRUD's join methods when needed, skip when not
### What This Means in Practice
Instead of this (traditional SQLAlchemy):
```python
# Not used in the boilerplate
class User(Base):
posts: Mapped[List["Post"]] = relationship("Post", back_populates="created_by_user")
class Post(Base):
created_by_user: Mapped["User"] = relationship("User", back_populates="posts")
```
The boilerplate uses this approach:
```python
# DO - Explicit and controlled
class User(Base):
# Only foreign key, no relationship
tier_id: Mapped[int | None] = mapped_column(ForeignKey("tier.id"), index=True, default=None)
class Post(Base):
# Only foreign key, no relationship
created_by_user_id: Mapped[int] = mapped_column(ForeignKey("user.id"), index=True)
# Explicit queries - you control exactly what's loaded
user = await crud_users.get(db=db, id=1)
posts = await crud_posts.get_multi(db=db, created_by_user_id=user.id)
# Or use joins when needed
posts_with_users = await crud_posts.get_multi_joined(
db=db,
join_model=User,
schema_to_select=PostRead,
join_schema_to_select=UserRead
)
```
### Benefits of This Approach
**Predictable Performance**:
- Every database query is explicit in the code
- No surprise queries from accessing relationships
- Easier to identify and optimize slow operations
**Better Caching**:
- Can cache individual models without worrying about related data
- Cache invalidation is simpler and more predictable
**API Design**:
- Forces thinking about what data clients actually need
- Prevents over-fetching in API responses
- Encourages lean, focused endpoints
**Testing**:
- Easier to mock database operations
- No complex relationship setup in test fixtures
- More predictable test data requirements
### When You Need Related Data
Use FastCRUD's join capabilities:
```python
# Single record with related data
post_with_author = await crud_posts.get_joined(
db=db,
join_model=User,
schema_to_select=PostRead,
join_schema_to_select=UserRead,
id=post_id
)
# Multiple records with joins
posts_with_authors = await crud_posts.get_multi_joined(
db=db,
join_model=User,
offset=0,
limit=10
)
```
### Alternative Approaches
If you need relationships in your project, you can add them:
```python
# Add relationships if needed for your use case
from sqlalchemy.orm import relationship
class User(Base):
# ... existing fields ...
posts: Mapped[List["Post"]] = relationship("Post", back_populates="created_by_user")
class Post(Base):
# ... existing fields ...
created_by_user: Mapped["User"] = relationship("User", back_populates="posts")
```
But consider the trade-offs and whether explicit queries might be better for your use case.
## User Model Implementation
The User model (`src/app/models/user.py`) demonstrates authentication patterns:
```python
import uuid as uuid_pkg
from datetime import UTC, datetime
from sqlalchemy import DateTime, ForeignKey, String
from sqlalchemy.orm import Mapped, mapped_column
from ..core.db.database import Base
class User(Base):
__tablename__ = "user"
id: Mapped[int] = mapped_column("id", autoincrement=True, nullable=False, unique=True, primary_key=True, init=False)
# User data
name: Mapped[str] = mapped_column(String(30))
username: Mapped[str] = mapped_column(String(20), unique=True, index=True)
email: Mapped[str] = mapped_column(String(50), unique=True, index=True)
hashed_password: Mapped[str] = mapped_column(String)
# Profile
profile_image_url: Mapped[str] = mapped_column(String, default="https://profileimageurl.com")
# UUID for external references
uuid: Mapped[uuid_pkg.UUID] = mapped_column(default_factory=uuid_pkg.uuid4, primary_key=True, unique=True)
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default_factory=lambda: datetime.now(UTC))
updated_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
deleted_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
# Status flags
is_deleted: Mapped[bool] = mapped_column(default=False, index=True)
is_superuser: Mapped[bool] = mapped_column(default=False)
# Foreign key to tier system (no relationship defined)
tier_id: Mapped[int | None] = mapped_column(ForeignKey("tier.id"), index=True, default=None, init=False)
```
### Key Implementation Details
**Type Annotations**: `Mapped[type]` provides type hints for SQLAlchemy 2.0. IDE and mypy can validate types.
**String Lengths**: Explicit lengths (`String(50)`) prevent database errors and define constraints clearly.
**Nullable Fields**: Explicitly set `nullable=False` for required fields, `nullable=True` for optional ones.
**Default Values**: Use `default=` for database-level defaults, Python functions for computed defaults.
## Post Model with Foreign Keys
The Post model (`src/app/models/post.py`) shows foreign key references and soft deletion:
```python
import uuid as uuid_pkg
from datetime import UTC, datetime
from sqlalchemy import DateTime, ForeignKey, String
from sqlalchemy.orm import Mapped, mapped_column
from ..core.db.database import Base
class Post(Base):
__tablename__ = "post"
id: Mapped[int] = mapped_column("id", autoincrement=True, nullable=False, unique=True, primary_key=True, init=False)
# Content
title: Mapped[str] = mapped_column(String(30))
text: Mapped[str] = mapped_column(String(63206)) # Large text field
media_url: Mapped[str | None] = mapped_column(String, default=None)
# UUID for external references
uuid: Mapped[uuid_pkg.UUID] = mapped_column(default_factory=uuid_pkg.uuid4, primary_key=True, unique=True)
# Foreign key (no relationship defined)
created_by_user_id: Mapped[int] = mapped_column(ForeignKey("user.id"), index=True)
# Timestamps (built-in soft delete pattern)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default_factory=lambda: datetime.now(UTC))
updated_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
deleted_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
is_deleted: Mapped[bool] = mapped_column(default=False, index=True)
```
### Soft Deletion Pattern
Soft deletion is built directly into models:
```python
# Built into each model that needs soft deletes
class Post(Base):
# ... other fields ...
# Soft delete fields
is_deleted: Mapped[bool] = mapped_column(default=False, index=True)
deleted_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
```
**Usage**: When `crud_posts.delete()` is called, it sets `is_deleted=True` and `deleted_at=datetime.now(UTC)` instead of removing the database row.
## Tier and Rate Limiting Models
### Tier Model
```python
# src/app/models/tier.py
class Tier(Base):
    __tablename__ = "tier"

    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True, init=False)
    name: Mapped[str] = mapped_column(String(50), unique=True, nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=lambda: datetime.now(UTC), nullable=False)
```
### Rate Limit Model
```python
# src/app/models/rate_limit.py
class RateLimit(Base):
__tablename__ = "rate_limit"
id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True, init=False)
tier_id: Mapped[int] = mapped_column(ForeignKey("tier.id"), nullable=False)
path: Mapped[str] = mapped_column(String(255), nullable=False)
limit: Mapped[int] = mapped_column(nullable=False) # requests allowed
period: Mapped[int] = mapped_column(nullable=False) # time period in seconds
    name: Mapped[str | None] = mapped_column(String(100), nullable=True)
```
**Purpose**: Links API endpoints (`path`) to rate limits (`limit` requests per `period` seconds) for specific user tiers.
## Creating New Models
### Step-by-Step Process
1. **Create model file** in `src/app/models/your_model.py`
2. **Define model class** inheriting from `Base`
3. **Add to imports** in `src/app/models/__init__.py`
4. **Generate migration** with `alembic revision --autogenerate`
5. **Apply migration** with `alembic upgrade head`
### Example: Creating a Category Model
```python
# src/app/models/category.py
from datetime import UTC, datetime

from sqlalchemy import DateTime, String
from sqlalchemy.orm import Mapped, mapped_column

from app.core.db.database import Base

class Category(Base):
    __tablename__ = "category"

    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True, init=False)
    name: Mapped[str] = mapped_column(String(50), unique=True, nullable=False)
    description: Mapped[str | None] = mapped_column(String(255), nullable=True)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=lambda: datetime.now(UTC), nullable=False)
```
To relate `Category` to `Post`, add a foreign key column to the `Post` model:
```python
class Post(Base):
__tablename__ = "post"
...
# Foreign key (no relationship defined)
category_id: Mapped[int] = mapped_column(ForeignKey("category.id"), index=True)
```
### Import in __init__.py
```python
# src/app/models/__init__.py
from .user import User
from .post import Post
from .tier import Tier
from .rate_limit import RateLimit
from .category import Category # Add new model
```
**Critical**: Without this import, Alembic won't detect the model for migrations.
## Model Validation and Constraints
### Database-Level Constraints
```python
from sqlalchemy import CheckConstraint, Index
class Product(Base):
__tablename__ = "product"
price: Mapped[float] = mapped_column(nullable=False)
quantity: Mapped[int] = mapped_column(nullable=False)
# Table-level constraints
__table_args__ = (
CheckConstraint('price > 0', name='positive_price'),
CheckConstraint('quantity >= 0', name='non_negative_quantity'),
Index('idx_product_price', 'price'),
)
```
### Unique Constraints
```python
# Single column unique
email: Mapped[str] = mapped_column(String(100), unique=True)
# Multi-column unique constraint
__table_args__ = (
UniqueConstraint('user_id', 'category_id', name='unique_user_category'),
)
```
## Common Model Patterns
### Timestamp Tracking
```python
class TimestampedModel:
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), default=lambda: datetime.now(UTC), nullable=False
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        default=lambda: datetime.now(UTC),
        onupdate=lambda: datetime.now(UTC),
        nullable=False,
    )

# Use as mixin
class Post(Base, TimestampedModel):
    # Model automatically gets created_at and updated_at
    __tablename__ = "post"

    id: Mapped[int] = mapped_column(primary_key=True)
```
### Enumeration Fields
```python
from enum import Enum
from sqlalchemy import Enum as SQLEnum
class UserStatus(Enum):
ACTIVE = "active"
INACTIVE = "inactive"
SUSPENDED = "suspended"
class User(Base):
status: Mapped[UserStatus] = mapped_column(SQLEnum(UserStatus), default=UserStatus.ACTIVE)
```
### JSON Fields
```python
from sqlalchemy.dialects.postgresql import JSONB
class UserProfile(Base):
    preferences: Mapped[dict | None] = mapped_column(JSONB, nullable=True)
    # "metadata" is reserved on declarative classes (Base.metadata),
    # so map the attribute under another name but keep the column name
    extra_metadata: Mapped[dict] = mapped_column("metadata", JSONB, default=lambda: {})
```
**PostgreSQL-specific**: Uses JSONB for efficient JSON storage and querying.
## Model Testing
### Basic Model Tests
```python
# tests/test_models.py
import pytest
from sqlalchemy.exc import IntegrityError
from app.models.user import User
def test_user_creation():
    user = User(
        name="Test User",
        username="testuser",
        email="test@example.com",
        hashed_password="hashed123",
    )
    assert user.username == "testuser"
    assert user.is_superuser is False  # Default value

def test_user_unique_constraint(db_session):
    # Assumes a session fixture backed by a test database
    db_session.add(User(name="A", username="a", email="dup@example.com", hashed_password="x"))
    db_session.commit()
    db_session.add(User(name="B", username="b", email="dup@example.com", hashed_password="x"))
    with pytest.raises(IntegrityError):
        db_session.commit()
```
## Migration Considerations
### Backwards Compatible Changes
Safe changes that don't break existing code:
- Adding nullable columns
- Adding new tables
- Adding indexes
- Increasing column lengths
### Breaking Changes
Changes requiring careful migration:
- Making columns non-nullable
- Removing columns
- Changing column types
- Removing tables
## Next Steps
Now that you understand model implementation:
1. **[Schemas](schemas.md)** - Learn Pydantic validation and serialization
2. **[CRUD Operations](crud.md)** - Implement database operations with FastCRUD
3. **[Migrations](migrations.md)** - Manage schema changes with Alembic
The next section covers how Pydantic schemas provide validation and API contracts separate from database models.

# Database Schemas
This section explains how Pydantic schemas handle data validation, serialization, and API contracts in the boilerplate. Schemas are separate from SQLAlchemy models and define what data enters and exits your API.
## Schema Purpose and Structure
Schemas serve three main purposes:
1. **Input Validation** - Validate incoming API request data
2. **Output Serialization** - Format database data for API responses
3. **API Contracts** - Define clear interfaces between frontend and backend
### Schema File Organization
Schemas are organized in `src/app/schemas/` with one file per model:
```text
src/app/schemas/
├── __init__.py # Imports for easy access
├── user.py # User-related schemas
├── post.py # Post-related schemas
├── tier.py # Tier schemas
├── rate_limit.py # Rate limit schemas
└── job.py # Background job schemas
```
## User Schema Implementation
The User schemas (`src/app/schemas/user.py`) demonstrate common validation patterns:
```python
from datetime import datetime
from typing import Annotated
from pydantic import BaseModel, ConfigDict, EmailStr, Field
from ..core.schemas import PersistentDeletion, TimestampSchema, UUIDSchema
# Base schema with common fields
class UserBase(BaseModel):
name: Annotated[
str,
Field(
min_length=2,
max_length=30,
examples=["User Userson"]
)
]
username: Annotated[
str,
Field(
min_length=2,
max_length=20,
pattern=r"^[a-z0-9]+$",
examples=["userson"]
)
]
email: Annotated[EmailStr, Field(examples=["user.userson@example.com"])]
# Full User data
class User(TimestampSchema, UserBase, UUIDSchema, PersistentDeletion):
profile_image_url: Annotated[
str,
Field(default="https://www.profileimageurl.com")
]
hashed_password: str
is_superuser: bool = False
tier_id: int | None = None
# Schema for reading user data (API output)
class UserRead(BaseModel):
id: int
name: Annotated[
str,
Field(
min_length=2,
max_length=30,
examples=["User Userson"]
)
]
username: Annotated[
str,
Field(
min_length=2,
max_length=20,
pattern=r"^[a-z0-9]+$",
examples=["userson"]
)
]
email: Annotated[EmailStr, Field(examples=["user.userson@example.com"])]
profile_image_url: str
tier_id: int | None
# Schema for creating new users (API input)
class UserCreate(UserBase): # Inherits from UserBase
model_config = ConfigDict(extra="forbid")
password: Annotated[
str,
Field(
pattern=r"^.{8,}|[0-9]+|[A-Z]+|[a-z]+|[^a-zA-Z0-9]+$",
examples=["Str1ngst!"]
)
]
# Schema that FastCRUD will use to store just the hash
class UserCreateInternal(UserBase):
hashed_password: str
# Schema for updating users
class UserUpdate(BaseModel):
model_config = ConfigDict(extra="forbid")
name: Annotated[
str | None,
Field(
min_length=2,
max_length=30,
examples=["User Userberg"],
default=None
)
]
username: Annotated[
str | None,
Field(
min_length=2,
max_length=20,
pattern=r"^[a-z0-9]+$",
examples=["userberg"],
default=None
)
]
email: Annotated[
EmailStr | None,
Field(
examples=["user.userberg@example.com"],
default=None
)
]
profile_image_url: Annotated[
str | None,
Field(
pattern=r"^(https?|ftp)://[^\s/$.?#].[^\s]*$",
examples=["https://www.profileimageurl.com"],
default=None
),
]
# Internal update schema
class UserUpdateInternal(UserUpdate):
updated_at: datetime
# Schema to update tier id
class UserTierUpdate(BaseModel):
tier_id: int
# Schema for user deletion (soft delete timestamps)
class UserDelete(BaseModel):
model_config = ConfigDict(extra="forbid")
is_deleted: bool
deleted_at: datetime
# User specific schema
class UserRestoreDeleted(BaseModel):
is_deleted: bool
```
### Key Implementation Details
**Field Validation**: Uses `Annotated[type, Field(...)]` for validation rules. `Field` parameters include:
- `min_length/max_length` - String length constraints
- `gt/ge/lt/le` - Numeric constraints
- `pattern` - Pattern matching (regex)
- `default` - Default values
**EmailStr**: Validates email format and normalizes the value.
**ConfigDict**: Replaces the old `Config` class. `from_attributes=True` allows creating schemas from SQLAlchemy model instances.
**Internal vs External**: Separate schemas for internal operations (like password hashing) vs API exposure.
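The split can be sketched like this; `fake_hash` and `to_internal` are illustrative stand-ins for the boilerplate's password hashing helper and service-layer logic:

```python
from pydantic import BaseModel

class UserCreate(BaseModel):
    username: str
    password: str  # plaintext, only ever seen at the API boundary

class UserCreateInternal(BaseModel):
    username: str
    hashed_password: str  # what actually gets persisted

def fake_hash(raw: str) -> str:
    # Stand-in for a real password hashing function (e.g. bcrypt)
    return f"hashed::{raw[::-1]}"

def to_internal(data: UserCreate) -> UserCreateInternal:
    values = data.model_dump()
    raw_password = values.pop("password")
    return UserCreateInternal(**values, hashed_password=fake_hash(raw_password))

internal = to_internal(UserCreate(username="userson", password="Str1ngst!"))
# The plaintext password never reaches the persistence layer
```

Because the internal schema has no `password` field, there is no way to accidentally store or serialize the plaintext value.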
## Schema Patterns
### Base Schema Pattern
```python
# Common fields shared across operations
class PostBase(BaseModel):
title: Annotated[
str,
Field(
min_length=1,
max_length=100
)
]
content: Annotated[
str,
Field(
min_length=1,
max_length=10000
)
]
# Specific operation schemas inherit from base
class PostCreate(PostBase):
pass # Only title and content needed for creation
class PostRead(PostBase):
model_config = ConfigDict(from_attributes=True)
id: int
created_at: datetime
created_by_user_id: int
is_deleted: bool = False # From model's soft delete fields
```
**Purpose**: Reduces duplication and ensures consistency across related schemas.
### Optional Fields in Updates
```python
class PostUpdate(BaseModel):
title: Annotated[
str | None,
Field(
min_length=1,
max_length=100,
default=None
)
]
content: Annotated[
str | None,
Field(
min_length=1,
max_length=10000,
default=None
)
]
```
**Pattern**: All fields optional in update schemas. Only provided fields are updated in the database.
### Nested Schemas
```python
# Post schema with user information
class PostWithUser(PostRead):
created_by_user: UserRead # Nested user data
# Alternative: Custom nested schema
class PostAuthor(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: int
username: str
# Only include fields needed for this context
class PostRead(PostBase):
created_by_user: PostAuthor
```
**Usage**: Include related model data in responses without exposing all fields.
## Validation Patterns
### Custom Validators
```python
from pydantic import field_validator, model_validator
class UserCreateWithConfirm(UserBase):
password: str
confirm_password: str
@field_validator('username')
@classmethod
def validate_username(cls, v):
if v.lower() in ['admin', 'root', 'system']:
raise ValueError('Username not allowed')
return v.lower() # Normalize to lowercase
@model_validator(mode='after')
def validate_passwords_match(self):
if self.password != self.confirm_password:
raise ValueError('Passwords do not match')
return self
```
**field_validator**: Validates individual fields. Can transform values.
**model_validator**: Validates across multiple fields. Access to full model data.
### Computed Fields
```python
from pydantic import computed_field
class UserReadWithComputed(UserRead):
created_at: datetime # Would need to be added to actual UserRead
@computed_field
@property
def age_days(self) -> int:
        return (datetime.now(UTC) - self.created_at).days
@computed_field
@property
def display_name(self) -> str:
return f"@{self.username}"
```
**Purpose**: Add computed values to API responses without storing them in the database.
### Conditional Validation
```python
class PostCreate(BaseModel):
title: str
content: str
category: Optional[str] = None
is_premium: bool = False
@model_validator(mode='after')
def validate_premium_content(self):
if self.is_premium and not self.category:
raise ValueError('Premium posts must have a category')
return self
```
## Schema Configuration
### Model Config Options
```python
class UserRead(BaseModel):
model_config = ConfigDict(
from_attributes=True, # Allow creation from SQLAlchemy models
extra="forbid", # Reject extra fields
str_strip_whitespace=True, # Strip whitespace from strings
validate_assignment=True, # Validate on field assignment
populate_by_name=True, # Allow field names and aliases
)
```
### Field Aliases
```python
class UserResponse(BaseModel):
user_id: Annotated[
int,
Field(alias="id")
]
username: str
email_address: Annotated[
str,
Field(alias="email")
]
model_config = ConfigDict(populate_by_name=True)
```
**Usage**: API can accept both `id` and `user_id`, `email` and `email_address`.
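A quick check of both spellings, assuming Pydantic v2:

```python
from typing import Annotated

from pydantic import BaseModel, ConfigDict, Field

class UserResponse(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    user_id: Annotated[int, Field(alias="id")]
    username: str

from_alias = UserResponse(id=1, username="userson")       # alias, e.g. raw DB shape
from_name = UserResponse(user_id=1, username="userson")   # field name, internal code
```

Both instances validate to the same model; without `populate_by_name=True`, only the alias form would be accepted.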
## Response Schema Patterns
### Multi-Record Responses
[FastCRUD's](https://benavlabs.github.io/fastcrud/) `get_multi` method returns a `GetMultiResponse`:
```python
# Using get_multi directly
users = await crud_users.get_multi(
db=db,
offset=0,
limit=10,
schema_to_select=UserRead,
return_as_model=True,
return_total_count=True
)
# Returns GetMultiResponse structure:
# {
# "data": [UserRead, ...],
# "total_count": 150
# }
```
### Paginated Responses
For pagination with page numbers, use `PaginatedListResponse`:
```python
from fastcrud.paginated import PaginatedListResponse
# In API endpoint - ONLY for paginated list responses
@router.get("/users/", response_model=PaginatedListResponse[UserRead])
async def get_users(page: int = 1, items_per_page: int = 10):
# Returns paginated structure with additional pagination fields:
# {
# "data": [UserRead, ...],
# "total_count": 150,
# "has_more": true,
# "page": 1,
# "items_per_page": 10
# }
# Single user endpoints return UserRead directly
@router.get("/users/{user_id}", response_model=UserRead)
async def get_user(user_id: int):
# Returns single UserRead object:
# {
# "id": 1,
# "name": "User Userson",
# "username": "userson",
# "email": "user.userson@example.com",
# "profile_image_url": "https://...",
# "tier_id": null
# }
```
### Error Response Schemas
```python
class ErrorResponse(BaseModel):
detail: str
error_code: Optional[str] = None
class ValidationErrorResponse(BaseModel):
detail: str
errors: list[dict] # Pydantic validation errors
```
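As a quick illustration of how these error schemas serialize (the `error_code` value here is made up for the example):

```python
from typing import Optional

from pydantic import BaseModel

class ErrorResponse(BaseModel):
    detail: str
    error_code: Optional[str] = None

# The dict FastAPI would serialize as the error response body
body = ErrorResponse(detail="Category not found", error_code="NOT_FOUND").model_dump()
print(body)  # {'detail': 'Category not found', 'error_code': 'NOT_FOUND'}
```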
### Success Response Wrapper
```python
from typing import Generic, TypeVar
T = TypeVar('T')
class SuccessResponse(BaseModel, Generic[T]):
success: bool = True
data: T
message: Optional[str] = None
# Usage in endpoint
@router.post("/users/", response_model=SuccessResponse[UserRead])
async def create_user(user_data: UserCreate):
user = await crud_users.create(db=db, object=user_data)
return SuccessResponse(data=user, message="User created successfully")
```
## Creating New Schemas
### Step-by-Step Process
1. **Create schema file** in `src/app/schemas/your_model.py`
2. **Define base schema** with common fields
3. **Create operation-specific schemas** (Create, Read, Update, Delete)
4. **Add validation rules** as needed
5. **Import in `__init__.py`** for easy access
### Example: Category Schemas
```python
# src/app/schemas/category.py
from datetime import datetime
from typing import Annotated
from pydantic import BaseModel, Field, ConfigDict

from .post import PostRead  # needed by CategoryWithPosts below
class CategoryBase(BaseModel):
name: Annotated[
str,
Field(
min_length=1,
max_length=50
)
]
description: Annotated[
str | None,
Field(
max_length=255,
default=None
)
]
class CategoryCreate(CategoryBase):
pass
class CategoryRead(CategoryBase):
model_config = ConfigDict(from_attributes=True)
id: int
created_at: datetime
class CategoryUpdate(BaseModel):
name: Annotated[
str | None,
Field(
min_length=1,
max_length=50,
default=None
)
]
description: Annotated[
str | None,
Field(
max_length=255,
default=None
)
]
class CategoryWithPosts(CategoryRead):
posts: list[PostRead] = [] # Include related posts
```
### Import in `__init__.py`
```python
# src/app/schemas/__init__.py
from .user import UserCreate, UserRead, UserUpdate
from .post import PostCreate, PostRead, PostUpdate
from .category import CategoryCreate, CategoryRead, CategoryUpdate
```
## Schema Testing
### Validation Testing
```python
# tests/test_schemas.py
import pytest
from pydantic import ValidationError
from app.schemas.user import UserCreate
def test_user_create_valid():
user_data = {
"name": "Test User",
"username": "testuser",
"email": "test@example.com",
"password": "Str1ngst!"
}
user = UserCreate(**user_data)
assert user.username == "testuser"
assert user.name == "Test User"
def test_user_create_invalid_email():
with pytest.raises(ValidationError) as exc_info:
UserCreate(
name="Test User",
username="test",
email="invalid-email",
password="Str1ngst!"
)
errors = exc_info.value.errors()
assert any(error['type'] == 'value_error' for error in errors)
def test_password_validation():
with pytest.raises(ValidationError) as exc_info:
UserCreate(
name="Test User",
username="test",
email="test@example.com",
password="123" # Doesn't match pattern
)
```
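The password pattern behind that last test isn't shown in this section; a plausible stand-in (an assumption, not necessarily the boilerplate's exact regex) that `Str1ngst!` passes and `123` fails is:

```python
import re

# Hypothetical password rule: >= 8 chars with at least one lowercase letter,
# one uppercase letter, one digit, and one special character.
PASSWORD_PATTERN = re.compile(r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[!?@#$%^&*]).{8,}$")

print(bool(PASSWORD_PATTERN.match("Str1ngst!")))  # True
print(bool(PASSWORD_PATTERN.match("123")))        # False
```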
### Serialization Testing
```python
from datetime import datetime

from app.models.user import User
from app.schemas.user import UserRead
def test_user_read_from_model():
# Create model instance
user_model = User(
id=1,
name="Test User",
username="testuser",
email="test@example.com",
profile_image_url="https://example.com/image.jpg",
hashed_password="hashed123",
is_superuser=False,
tier_id=None,
created_at=datetime.utcnow()
)
# Convert to schema
user_schema = UserRead.model_validate(user_model)
assert user_schema.username == "testuser"
assert user_schema.id == 1
assert user_schema.name == "Test User"
# hashed_password not included in UserRead
```
## Common Pitfalls
### Model vs Schema Field Names
```python
# DON'T - Exposing sensitive fields
class UserRead(BaseModel):
hashed_password: str # Never expose password hashes
# DO - Only expose safe fields
class UserRead(BaseModel):
id: int
name: str
username: str
email: str
profile_image_url: str
tier_id: int | None
```
### Validation Performance
```python
# DON'T - Complex validation in every request
@field_validator('email')
@classmethod
def validate_email_unique(cls, v):
    # Database query in validator - slow!
    if crud_users.exists(email=v):
        raise ValueError('Email already exists')
    return v  # validators must return the value
# DO - Handle uniqueness in business logic
# Let database unique constraints handle this
```
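The "let the database handle it" approach looks like this in miniature, sketched here with the stdlib `sqlite3` module rather than PostgreSQL/SQLAlchemy (table and email values are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "user" (id INTEGER PRIMARY KEY, email TEXT UNIQUE)')
conn.execute('INSERT INTO "user" (email) VALUES (?)', ("user@example.com",))

# Insert first, catch the constraint violation, then translate it into an
# API-level error (e.g. a DuplicateValueException / HTTP 409) in the endpoint.
try:
    conn.execute('INSERT INTO "user" (email) VALUES (?)', ("user@example.com",))
    duplicate = False
except sqlite3.IntegrityError:
    duplicate = True

print(duplicate)  # True
```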
## Next Steps
Now that you understand schema implementation:
1. **[CRUD Operations](crud.md)** - Learn how schemas integrate with database operations
2. **[Migrations](migrations.md)** - Manage database schema changes
3. **[API Endpoints](../api/endpoints.md)** - Use schemas in FastAPI endpoints
The next section covers CRUD operations and how they use these schemas for data validation and transformation.
# Development Guide
This guide covers everything you need to know about extending, customizing, and developing with the FastAPI boilerplate.
## Extending the Boilerplate
### Adding New Models
Follow this step-by-step process to add new entities to your application:
#### 1. Create SQLAlchemy Model
Create a new file in `src/app/models/` (e.g., `category.py`):
```python
from sqlalchemy import String, ForeignKey
from sqlalchemy.orm import Mapped, mapped_column, relationship
from ..core.db.database import Base
class Category(Base):
__tablename__ = "category"
id: Mapped[int] = mapped_column(
"id",
autoincrement=True,
nullable=False,
unique=True,
primary_key=True,
init=False
)
name: Mapped[str] = mapped_column(String(50))
description: Mapped[str | None] = mapped_column(String(255), default=None)
# Relationships
posts: Mapped[list["Post"]] = relationship(back_populates="category")
```
#### 2. Create Pydantic Schemas
Create `src/app/schemas/category.py`:
```python
from datetime import datetime
from typing import Annotated
from pydantic import BaseModel, Field, ConfigDict
class CategoryBase(BaseModel):
name: Annotated[str, Field(min_length=1, max_length=50)]
description: Annotated[str | None, Field(max_length=255, default=None)]
class CategoryCreate(CategoryBase):
model_config = ConfigDict(extra="forbid")
class CategoryRead(CategoryBase):
model_config = ConfigDict(from_attributes=True)
id: int
created_at: datetime
class CategoryUpdate(BaseModel):
model_config = ConfigDict(extra="forbid")
name: Annotated[str | None, Field(min_length=1, max_length=50, default=None)]
description: Annotated[str | None, Field(max_length=255, default=None)]
class CategoryUpdateInternal(CategoryUpdate):
updated_at: datetime
class CategoryDelete(BaseModel):
model_config = ConfigDict(extra="forbid")
is_deleted: bool
deleted_at: datetime
```
#### 3. Create CRUD Operations
Create `src/app/crud/crud_categories.py`:
```python
from fastcrud import FastCRUD
from ..models.category import Category
from ..schemas.category import CategoryCreate, CategoryUpdate, CategoryUpdateInternal, CategoryDelete
CRUDCategory = FastCRUD[Category, CategoryCreate, CategoryUpdate, CategoryUpdateInternal, CategoryDelete]
crud_categories = CRUDCategory(Category)
```
#### 4. Update Model Imports
Add your new model to `src/app/models/__init__.py`:
```python
from .category import Category
from .user import User
from .post import Post
# ... other imports
```
#### 5. Create Database Migration
Generate and apply the migration:
```bash
# From the src/ directory
uv run alembic revision --autogenerate -m "Add category model"
uv run alembic upgrade head
```
#### 6. Create API Endpoints
Create `src/app/api/v1/categories.py`:
```python
from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException, Request
from fastcrud.paginated import PaginatedListResponse, compute_offset
from sqlalchemy.ext.asyncio import AsyncSession
from ...api.dependencies import get_current_superuser, get_current_user
from ...core.db.database import async_get_db
from ...core.exceptions.http_exceptions import DuplicateValueException, NotFoundException
from ...crud.crud_categories import crud_categories
from ...schemas.category import CategoryCreate, CategoryRead, CategoryUpdate
router = APIRouter(tags=["categories"])
@router.post("/category", response_model=CategoryRead, status_code=201)
async def write_category(
request: Request,
category: CategoryCreate,
current_user: Annotated[dict, Depends(get_current_user)],
db: Annotated[AsyncSession, Depends(async_get_db)],
):
category_row = await crud_categories.exists(db=db, name=category.name)
if category_row:
raise DuplicateValueException("Category name already exists")
return await crud_categories.create(db=db, object=category)
@router.get("/categories", response_model=PaginatedListResponse[CategoryRead])
async def read_categories(
request: Request,
db: Annotated[AsyncSession, Depends(async_get_db)],
page: int = 1,
items_per_page: int = 10,
):
categories_data = await crud_categories.get_multi(
db=db,
offset=compute_offset(page, items_per_page),
limit=items_per_page,
schema_to_select=CategoryRead,
is_deleted=False,
)
return categories_data
@router.get("/category/{category_id}", response_model=CategoryRead)
async def read_category(
request: Request,
category_id: int,
db: Annotated[AsyncSession, Depends(async_get_db)],
):
db_category = await crud_categories.get(
db=db,
schema_to_select=CategoryRead,
id=category_id,
is_deleted=False
)
if not db_category:
raise NotFoundException("Category not found")
return db_category
@router.patch("/category/{category_id}", response_model=CategoryRead)
async def patch_category(
request: Request,
category_id: int,
values: CategoryUpdate,
current_user: Annotated[dict, Depends(get_current_user)],
db: Annotated[AsyncSession, Depends(async_get_db)],
):
db_category = await crud_categories.get(db=db, id=category_id, is_deleted=False)
if not db_category:
raise NotFoundException("Category not found")
    if values.name:
        # FastCRUD's `exists` returns a bool, so fetch the row to compare ids
        existing_category = await crud_categories.get(db=db, name=values.name)
        if existing_category and existing_category["id"] != category_id:
            raise DuplicateValueException("Category name already exists")
return await crud_categories.update(db=db, object=values, id=category_id)
@router.delete("/category/{category_id}")
async def erase_category(
request: Request,
category_id: int,
current_user: Annotated[dict, Depends(get_current_superuser)],
db: Annotated[AsyncSession, Depends(async_get_db)],
):
db_category = await crud_categories.get(db=db, id=category_id, is_deleted=False)
if not db_category:
raise NotFoundException("Category not found")
await crud_categories.delete(db=db, db_row=db_category, garbage_collection=False)
return {"message": "Category deleted"}
```
#### 7. Register Router
Add your router to `src/app/api/v1/__init__.py`:
```python
from fastapi import APIRouter
from .categories import router as categories_router
# ... other imports
router = APIRouter()
router.include_router(categories_router, prefix="/categories")
# ... other router includes
```
### Creating Custom Middleware
Create middleware in `src/app/middleware/`:
```python
import time

from fastapi import Request, Response
from starlette.middleware.base import BaseHTTPMiddleware
class CustomHeaderMiddleware(BaseHTTPMiddleware):
async def dispatch(self, request: Request, call_next):
# Pre-processing
start_time = time.time()
# Process request
response = await call_next(request)
# Post-processing
process_time = time.time() - start_time
response.headers["X-Process-Time"] = str(process_time)
return response
```
Register in `src/app/main.py`:
```python
from .middleware.custom_header_middleware import CustomHeaderMiddleware
app.add_middleware(CustomHeaderMiddleware)
```
## Testing
### Test Configuration
The boilerplate uses pytest for testing. Test configuration is in `pytest.ini` and test dependencies in `pyproject.toml`.
### Database Testing Setup
Create test database fixtures in `tests/conftest.py`:
```python
import asyncio
import pytest
import pytest_asyncio
from httpx import ASGITransport, AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker

from src.app.core.config import settings
from src.app.core.db.database import Base, async_get_db
from src.app.main import app

# Test database URL
TEST_DATABASE_URL = "postgresql+asyncpg://test_user:test_pass@localhost:5432/test_db"

# Create test engine
test_engine = create_async_engine(TEST_DATABASE_URL, echo=True)
TestSessionLocal = sessionmaker(
    test_engine, class_=AsyncSession, expire_on_commit=False
)

@pytest_asyncio.fixture
async def async_session():
    async with test_engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
    async with TestSessionLocal() as session:
        yield session
    async with test_engine.begin() as conn:
        await conn.run_sync(Base.metadata.drop_all)

@pytest_asyncio.fixture
async def async_client(async_session):
    def get_test_db():
        return async_session

    app.dependency_overrides[async_get_db] = get_test_db
    # httpx >= 0.27 removed the `app=` shortcut; pass an ASGITransport instead
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as client:
        yield client
    app.dependency_overrides.clear()
```
### Writing Tests
#### Model Tests
```python
# tests/test_models.py
import pytest
import pytest_asyncio

from src.app.models.user import User
@pytest_asyncio.fixture
async def test_user(async_session):
user = User(
name="Test User",
username="testuser",
email="test@example.com",
hashed_password="hashed_password"
)
async_session.add(user)
await async_session.commit()
await async_session.refresh(user)
return user
async def test_user_creation(test_user):
assert test_user.name == "Test User"
assert test_user.username == "testuser"
assert test_user.email == "test@example.com"
```
#### API Endpoint Tests
```python
# tests/test_api.py
import pytest
from httpx import AsyncClient
async def test_create_user(async_client: AsyncClient):
user_data = {
"name": "New User",
"username": "newuser",
"email": "new@example.com",
"password": "SecurePass123!"
}
response = await async_client.post("/api/v1/users", json=user_data)
assert response.status_code == 201
data = response.json()
assert data["name"] == "New User"
assert data["username"] == "newuser"
assert "hashed_password" not in data # Ensure password not exposed
async def test_read_users(async_client: AsyncClient):
response = await async_client.get("/api/v1/users")
assert response.status_code == 200
data = response.json()
assert "data" in data
assert "total_count" in data
```
#### CRUD Tests
```python
# tests/test_crud.py
import pytest
from src.app.crud.crud_users import crud_users
from src.app.schemas.user import UserCreate
async def test_crud_create_user(async_session):
user_data = UserCreate(
name="CRUD User",
username="cruduser",
email="crud@example.com",
password="password123"
)
user = await crud_users.create(db=async_session, object=user_data)
assert user["name"] == "CRUD User"
assert user["username"] == "cruduser"
async def test_crud_get_user(async_session, test_user):
retrieved_user = await crud_users.get(
db=async_session,
id=test_user.id
)
assert retrieved_user["name"] == test_user.name
```
### Running Tests
```bash
# Run all tests
uv run pytest
# Run with coverage
uv run pytest --cov=src
# Run specific test file
uv run pytest tests/test_api.py
# Run with verbose output
uv run pytest -v
# Run tests matching pattern
uv run pytest -k "test_user"
```
## Customization
### Environment-Specific Configuration
Create environment-specific settings:
```python
# src/app/core/config.py
import os

class LocalSettings(Settings):
ENVIRONMENT: str = "local"
DEBUG: bool = True
class ProductionSettings(Settings):
ENVIRONMENT: str = "production"
DEBUG: bool = False
# Production-specific settings
def get_settings():
env = os.getenv("ENVIRONMENT", "local")
if env == "production":
return ProductionSettings()
return LocalSettings()
settings = get_settings()
```
### Custom Logging
Configure logging in `src/app/core/config.py`:
```python
import logging
from pythonjsonlogger import jsonlogger
def setup_logging():
# JSON logging for production
if settings.ENVIRONMENT == "production":
logHandler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter()
logHandler.setFormatter(formatter)
logger = logging.getLogger()
logger.addHandler(logHandler)
logger.setLevel(logging.INFO)
else:
# Simple logging for development
logging.basicConfig(
level=logging.DEBUG,
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
```
## Opting Out of Services
### Disabling Redis Caching
1. Remove cache decorators from endpoints
2. Update dependencies in `src/app/core/config.py`:
```python
class Settings(BaseSettings):
# Comment out or remove Redis cache settings
# REDIS_CACHE_HOST: str = "localhost"
# REDIS_CACHE_PORT: int = 6379
pass
```
3. Remove Redis cache imports and usage
### Disabling Background Tasks (ARQ)
1. Remove ARQ from `pyproject.toml` dependencies
2. Remove worker configuration from `docker-compose.yml`
3. Delete `src/app/core/worker/` directory
4. Remove task-related endpoints
### Disabling Rate Limiting
1. Remove rate limiting dependencies from endpoints:
```python
# Remove this dependency
dependencies=[Depends(rate_limiter_dependency)]
```
2. Remove rate limiting models and schemas
3. Update database migrations to remove rate limit tables
### Disabling Authentication
1. Remove JWT dependencies from protected endpoints
2. Remove user-related models and endpoints
3. Update database to remove user tables
4. Remove authentication middleware
### Minimal FastAPI Setup
For a minimal setup with just basic FastAPI:
```python
# src/app/main.py (minimal version)
from fastapi import FastAPI
app = FastAPI(
title="Minimal API",
description="Basic FastAPI application",
version="1.0.0"
)
@app.get("/")
async def root():
return {"message": "Hello World"}
@app.get("/health")
async def health_check():
return {"status": "healthy"}
```
## Best Practices
### Code Organization
- Keep models, schemas, and CRUD operations in separate files
- Use consistent naming conventions across the application
- Group related functionality in modules
- Follow FastAPI and Pydantic best practices
### Database Operations
- Always use transactions for multi-step operations
- Implement soft deletes for important data
- Use database constraints for data integrity
- Index frequently queried columns
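The first bullet is worth a concrete sketch. Here is the pattern with the stdlib `sqlite3` module (with async SQLAlchemy the equivalent is `async with session.begin(): ...`); the account table and amounts are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO account (balance) VALUES (?)", [(100,), (50,)])
conn.commit()

def transfer(conn, src, dst, amount):
    # `with conn:` commits if the block succeeds and rolls back on any
    # exception, so both updates apply or neither does.
    with conn:
        conn.execute("UPDATE account SET balance = balance - ? WHERE id = ?", (amount, src))
        (balance,) = conn.execute("SELECT balance FROM account WHERE id = ?", (src,)).fetchone()
        if balance < 0:
            raise ValueError("insufficient funds")
        conn.execute("UPDATE account SET balance = balance + ? WHERE id = ?", (amount, dst))

transfer(conn, 1, 2, 30)
try:
    transfer(conn, 1, 2, 1000)  # overdraws: raises, and the partial update rolls back
except ValueError:
    pass

balances = [row[0] for row in conn.execute("SELECT balance FROM account ORDER BY id")]
print(balances)  # [70, 80]
```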
### API Design
- Use consistent response formats
- Implement proper error handling
- Version your APIs from the start
- Document all endpoints with proper schemas
### Security
- Never expose sensitive data in API responses
- Use proper authentication and authorization
- Validate all input data
- Implement rate limiting for public endpoints
- Use HTTPS in production
### Performance
- Use async/await consistently
- Implement caching for expensive operations
- Use database connection pooling
- Monitor and optimize slow queries
- Use pagination for large datasets
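The pagination bullet reduces to simple offset arithmetic; a standalone sketch of the formula that `fastcrud.paginated.compute_offset` implements for 1-based page numbers (`has_more` is a hypothetical helper added for illustration):

```python
def compute_offset(page: int, items_per_page: int) -> int:
    # Rows to skip before the current page (pages are 1-based)
    return (page - 1) * items_per_page

def has_more(total_count: int, page: int, items_per_page: int) -> bool:
    # True while rows remain beyond the current page
    return compute_offset(page, items_per_page) + items_per_page < total_count

print(compute_offset(1, 10), compute_offset(3, 10))         # 0 20
print(has_more(total_count=25, page=2, items_per_page=10))  # True
print(has_more(total_count=25, page=3, items_per_page=10))  # False
```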
## Troubleshooting
### Common Issues
**Import Errors**: Ensure all new models are imported in `__init__.py` files
**Migration Failures**: Check model definitions and relationships before generating migrations
**Test Failures**: Verify test database configuration and isolation
**Performance Issues**: Check for N+1 queries and missing database indexes
**Authentication Problems**: Verify JWT configuration and token expiration settings
### Debugging Tips
- Use FastAPI's automatic interactive docs at `/docs`
- Enable SQL query logging in development
- Use proper logging throughout the application
- Test endpoints with realistic data volumes
- Monitor database performance with query analysis
## Database Migrations
!!! warning "Important Setup for Docker Users"
If you're using the database in Docker, you need to expose the port to run migrations. Change this in `docker-compose.yml`:
```yaml
db:
image: postgres:13
env_file:
- ./src/.env
volumes:
- postgres-data:/var/lib/postgresql/data
# -------- replace with comment to run migrations with docker --------
ports:
- 5432:5432
# expose:
# - "5432"
```
### Creating Migrations
!!! warning "Model Import Requirement"
To create tables if you haven't created endpoints yet, ensure you import the models in `src/app/models/__init__.py`. This step is crucial for Alembic to detect new tables.
While in the `src` folder, run Alembic migrations:
```bash
# Generate migration file
uv run alembic revision --autogenerate -m "Description of changes"
# Apply migrations
uv run alembic upgrade head
```
!!! note "Without uv"
If you don't have uv, run `pip install alembic` first, then use `alembic` commands directly.
### Migration Workflow
1. **Make Model Changes** - Modify your SQLAlchemy models
2. **Import Models** - Ensure models are imported in `src/app/models/__init__.py`
3. **Generate Migration** - Run `alembic revision --autogenerate`
4. **Review Migration** - Check the generated migration file in `src/migrations/versions/`
5. **Apply Migration** - Run `alembic upgrade head`
6. **Test Changes** - Verify your changes work as expected
### Common Migration Tasks
#### Adding a New Model
```python
# 1. Create the model file (e.g., src/app/models/category.py)
from sqlalchemy import String
from sqlalchemy.orm import Mapped, mapped_column
from app.core.db.database import Base
class Category(Base):
__tablename__ = "categories"
id: Mapped[int] = mapped_column(primary_key=True)
name: Mapped[str] = mapped_column(String(50))
    description: Mapped[str | None] = mapped_column(String(255), nullable=True)
```
```python
# 2. Import in src/app/models/__init__.py
from .user import User
from .post import Post
from .tier import Tier
from .rate_limit import RateLimit
from .category import Category # Add this line
```
```bash
# 3. Generate and apply migration
cd src
uv run alembic revision --autogenerate -m "Add categories table"
uv run alembic upgrade head
```
#### Modifying Existing Models
```python
# 1. Modify your model
class User(Base):
# ... existing fields ...
    bio: Mapped[str | None] = mapped_column(String(500), nullable=True)  # New field
```
```bash
# 2. Generate migration
uv run alembic revision --autogenerate -m "Add bio field to users"
# 3. Review the generated migration file
# 4. Apply migration
uv run alembic upgrade head
```
This guide provides the foundation for extending and customizing the FastAPI boilerplate. For specific implementation details, refer to the existing code examples throughout the boilerplate.

# User Guide
This user guide provides comprehensive information about using and understanding the FastAPI Boilerplate. Whether you're building your first API or looking to understand advanced features, this guide covers everything you need to know.
## What You'll Learn
This guide covers all aspects of working with the FastAPI Boilerplate:
### Project Understanding
- **[Project Structure](project-structure.md)** - Navigate the codebase organization and understand architectural decisions
- **[Configuration](configuration/index.md)** - Configure your application for different environments
### Core Components
### Database Operations
- **[Database Overview](database/index.md)** - Understand the data layer architecture
- **[Models](database/models.md)** - Define and work with SQLAlchemy models
- **[Schemas](database/schemas.md)** - Create Pydantic schemas for data validation
- **[CRUD Operations](database/crud.md)** - Implement create, read, update, and delete operations
- **[Migrations](database/migrations.md)** - Manage database schema changes with Alembic
### API Development
- **[API Overview](api/index.md)** - Build robust REST APIs with FastAPI
- **[Endpoints](api/endpoints.md)** - Create and organize API endpoints
- **[Pagination](api/pagination.md)** - Implement efficient data pagination
- **[Exception Handling](api/exceptions.md)** - Handle errors gracefully
- **[API Versioning](api/versioning.md)** - Manage API versions and backward compatibility
### Security & Authentication
- **[Authentication Overview](authentication/index.md)** - Secure your API with JWT authentication
- **[JWT Tokens](authentication/jwt-tokens.md)** - Understand access and refresh token management
- **[User Management](authentication/user-management.md)** - Handle user registration, login, and profiles
- **[Permissions](authentication/permissions.md)** - Implement role-based access control
### Admin Panel
Powered by [CRUDAdmin](https://github.com/benavlabs/crudadmin) - a modern admin interface generator for FastAPI.
- **[Admin Panel Overview](admin-panel/index.md)** - Web-based database management interface
- **[Configuration](admin-panel/configuration.md)** - Setup, session backends, and environment variables
- **[Adding Models](admin-panel/adding-models.md)** - Register models, schemas, and customization
- **[User Management](admin-panel/user-management.md)** - Admin users, authentication, and security
### Performance & Caching
- **[Caching Overview](caching/index.md)** - Improve performance with Redis caching
- **[Redis Cache](caching/redis-cache.md)** - Server-side caching with Redis
- **[Client Cache](caching/client-cache.md)** - HTTP caching headers and browser caching
- **[Cache Strategies](caching/cache-strategies.md)** - Advanced caching patterns and invalidation
### Background Processing
- **[Background Tasks](background-tasks/index.md)** - Handle long-running operations with ARQ
### Rate Limiting
- **[Rate Limiting](rate-limiting/index.md)** - Protect your API from abuse with Redis-based rate limiting
## Prerequisites
Before diving into this guide, ensure you have:
- Completed the [Getting Started](../getting-started/index.md) section
- A running FastAPI Boilerplate instance
- Basic understanding of Python, FastAPI, and REST APIs
- Familiarity with SQL databases (PostgreSQL knowledge is helpful)
## Next Steps
Ready to dive in? Here are recommended learning paths:
### For New Users
1. Start with [Project Structure](project-structure.md) to understand the codebase
2. Learn [Database Models](database/models.md) and [Schemas](database/schemas.md)
3. Create your first [API Endpoints](api/endpoints.md)
4. Add [Authentication](authentication/index.md) to secure your API
### For Experienced Developers
1. Review [Database CRUD Operations](database/crud.md) for advanced patterns
2. Implement [Caching Strategies](caching/index.md) for performance
3. Set up [Background Tasks](background-tasks/index.md) for async processing
4. Configure [Rate Limiting](rate-limiting/index.md) for production use
### For Production Deployment
1. Understand [Cache Strategies](caching/cache-strategies.md) patterns
2. Configure [Rate Limiting](rate-limiting/index.md) with user tiers
3. Set up [Background Task Processing](background-tasks/index.md)
4. Review the [Production Guide](production.md) for deployment considerations
Choose your path based on your needs and experience level. Each section builds upon previous concepts while remaining self-contained for reference use.

# Production Deployment
This guide covers deploying the FastAPI boilerplate to production with proper performance, security, and reliability configurations.
## Production Architecture
The recommended production setup uses:
- **Gunicorn** - Process manager supervising Uvicorn workers
- **Uvicorn Workers** - ASGI server handling FastAPI requests
- **NGINX** - Reverse proxy and load balancer
- **PostgreSQL** - Production database
- **Redis** - Caching and background tasks
- **Docker** - Containerization
## Environment Configuration
### Production Environment Variables
Update your `.env` file for production:
```bash
# ------------- environment -------------
ENVIRONMENT="production"
# ------------- app settings -------------
APP_NAME="Your Production App"
DEBUG=false
# ------------- database -------------
POSTGRES_USER="prod_user"
POSTGRES_PASSWORD="secure_production_password"
POSTGRES_SERVER="db" # or your database host
POSTGRES_PORT=5432
POSTGRES_DB="prod_database"
# ------------- redis -------------
REDIS_CACHE_HOST="redis"
REDIS_CACHE_PORT=6379
REDIS_QUEUE_HOST="redis"
REDIS_QUEUE_PORT=6379
REDIS_RATE_LIMIT_HOST="redis"
REDIS_RATE_LIMIT_PORT=6379
# ------------- security -------------
SECRET_KEY="your-super-secure-secret-key-generate-with-openssl"
ALGORITHM="HS256"
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7
# ------------- logging -------------
LOG_LEVEL="INFO"
```
### Docker Configuration
#### Production Dockerfile
```dockerfile
FROM python:3.11-slim
WORKDIR /code
# Install system dependencies
RUN apt-get update && apt-get install -y \
gcc \
&& rm -rf /var/lib/apt/lists/*
# Install UV
RUN pip install uv
# Copy dependency files
COPY pyproject.toml uv.lock ./
# Install dependencies
RUN uv sync --frozen --no-dev
# Copy application code
COPY src/ ./src/
# Create non-root user
RUN useradd --create-home --shell /bin/bash app \
&& chown -R app:app /code
USER app
# Production command with Gunicorn
CMD ["uv", "run", "gunicorn", "src.app.main:app", "-w", "4", "-k", "uvicorn.workers.UvicornWorker", "--bind", "0.0.0.0:8000"]
```
#### Production Docker Compose
```yaml
version: '3.8'
services:
web:
build: .
ports:
- "8000:8000"
env_file:
- ./src/.env
depends_on:
- db
- redis
restart: unless-stopped
deploy:
resources:
limits:
memory: 1G
reservations:
memory: 512M
worker:
build: .
command: uv run arq src.app.core.worker.settings.WorkerSettings
env_file:
- ./src/.env
depends_on:
- db
- redis
restart: unless-stopped
deploy:
replicas: 2
db:
image: postgres:15
volumes:
- postgres_data:/var/lib/postgresql/data/
environment:
- POSTGRES_USER=${POSTGRES_USER}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
- POSTGRES_DB=${POSTGRES_DB}
restart: unless-stopped
deploy:
resources:
limits:
memory: 2G
reservations:
memory: 1G
redis:
image: redis:7-alpine
restart: unless-stopped
volumes:
- redis_data:/data
deploy:
resources:
limits:
memory: 512M
reservations:
memory: 256M
nginx:
image: nginx:alpine
ports:
- "80:80"
- "443:443"
volumes:
- ./nginx/nginx.conf:/etc/nginx/nginx.conf
- ./nginx/ssl:/etc/nginx/ssl
depends_on:
- web
restart: unless-stopped
volumes:
postgres_data:
redis_data:
```
## Gunicorn Configuration
### Basic Gunicorn Setup
Create `gunicorn.conf.py`:
```python
import multiprocessing
# Server socket
bind = "0.0.0.0:8000"
backlog = 2048
# Worker processes
workers = multiprocessing.cpu_count() * 2 + 1
worker_class = "uvicorn.workers.UvicornWorker"
worker_connections = 1000
# Restart each worker after this many requests, with up to 50 jitter
max_requests = 1000
max_requests_jitter = 50
preload_app = True
# Logging
accesslog = "-"
errorlog = "-"
loglevel = "info"
access_log_format = '%(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s" %(D)s'
# Process naming
proc_name = "fastapi-boilerplate"
# Server mechanics
daemon = False
pidfile = "/tmp/gunicorn.pid"
user = None
group = None
tmp_upload_dir = None
# SSL (if terminating SSL at application level)
# keyfile = "/path/to/keyfile"
# certfile = "/path/to/certfile"
# Worker timeout
timeout = 30
keepalive = 2
```
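The worker formula above, pulled out on its own (the core counts here are just examples):

```python
def recommended_workers(cpu_count: int) -> int:
    # Gunicorn's rule of thumb: (2 x cores) + 1
    return cpu_count * 2 + 1

print(recommended_workers(4))   # 9
print(recommended_workers(16))  # 33
```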
### Running with Gunicorn
```bash
# Basic command
uv run gunicorn src.app.main:app -w 4 -k uvicorn.workers.UvicornWorker
# With configuration file
uv run gunicorn src.app.main:app -c gunicorn.conf.py
# With specific bind address
uv run gunicorn src.app.main:app -w 4 -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000
```
## NGINX Configuration
### Single Server Setup
Create `nginx/nginx.conf`:
```nginx
events {
worker_connections 1024;
}
http {
    # Rate limiting zone (limit_req_zone is only valid at the http level)
    limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;

    upstream fastapi_backend {
        server web:8000;
    }

    server {
        listen 80;
        server_name your-domain.com;

        # Redirect HTTP to HTTPS
        return 301 https://$server_name$request_uri;
    }

    server {
        listen 443 ssl http2;
        server_name your-domain.com;

        # SSL Configuration
        ssl_certificate /etc/nginx/ssl/cert.pem;
        ssl_certificate_key /etc/nginx/ssl/key.pem;
        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384;
        ssl_prefer_server_ciphers off;

        # Security headers
        add_header X-Frame-Options DENY;
        add_header X-Content-Type-Options nosniff;
        add_header X-XSS-Protection "1; mode=block";
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

        # Gzip compression
        gzip on;
        gzip_vary on;
        gzip_min_length 10240;
        gzip_proxied expired no-cache no-store private must-revalidate auth;
        gzip_types
            text/plain
            text/css
            text/xml
            text/javascript
            application/javascript
            application/xml+rss
            application/json;
location / {
limit_req zone=api burst=20 nodelay;
proxy_pass http://fastapi_backend;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# Timeouts
proxy_connect_timeout 60s;
proxy_send_timeout 60s;
proxy_read_timeout 60s;
# Buffer settings
proxy_buffering on;
proxy_buffer_size 8k;
proxy_buffers 8 8k;
}
# Health check endpoint (no rate limiting)
location /health {
proxy_pass http://fastapi_backend;
proxy_set_header Host $host;
access_log off;
}
# Static files (if any)
location /static/ {
alias /code/static/;
expires 1y;
add_header Cache-Control "public, immutable";
}
}
}
```
### Simple Single Server (default.conf)
For basic production setup, create `default.conf`:
```nginx
# ---------------- Running With One Server ----------------
server {
    listen 80;

    location / {
        proxy_pass http://web:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
### Load Balancing Multiple Servers
For horizontal scaling with multiple FastAPI instances:
```nginx
# ---------------- To Run with Multiple Servers ----------------
upstream fastapi_app {
    server fastapi1:8000;  # Replace with actual server names
    server fastapi2:8000;
    # Add more servers as needed
}

server {
    listen 80;

    location / {
        proxy_pass http://fastapi_app;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
### Advanced Load Balancing
For production with advanced features:
```nginx
upstream fastapi_backend {
    least_conn;
    server web1:8000 weight=3;
    server web2:8000 weight=2;
    server web3:8000 weight=1;

    # Keep a pool of idle connections to the upstream servers
    keepalive 32;
}

server {
    listen 443 ssl http2;
    server_name your-domain.com;

    location / {
        proxy_pass http://fastapi_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # HTTP/1.1 and an empty Connection header are required for upstream keepalive
        proxy_http_version 1.1;
        proxy_set_header Connection "";
    }
}
```
### SSL Certificate Setup
#### Using Let's Encrypt (Certbot)
```bash
# Install certbot
sudo apt-get update
sudo apt-get install certbot python3-certbot-nginx
# Obtain certificate
sudo certbot --nginx -d your-domain.com
# Auto-renewal (add to crontab)
0 2 * * 1 /usr/bin/certbot renew --quiet
```
#### Manual SSL Setup
```bash
# Generate self-signed certificate (development only)
mkdir -p nginx/ssl
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
-keyout nginx/ssl/key.pem \
-out nginx/ssl/cert.pem
```
## Production Best Practices
### Database Optimization
#### PostgreSQL Configuration
```sql
-- Optimize PostgreSQL for production
ALTER SYSTEM SET shared_buffers = '256MB';
ALTER SYSTEM SET effective_cache_size = '1GB';
ALTER SYSTEM SET random_page_cost = 1.1;
ALTER SYSTEM SET effective_io_concurrency = 200;
SELECT pg_reload_conf();
```
#### Connection Pooling
```python
# src/app/core/db/database.py
from sqlalchemy.ext.asyncio import create_async_engine
# Production database settings
engine = create_async_engine(
    DATABASE_URL,
    echo=False,  # Disable SQL echoing in production
    pool_size=20,
    max_overflow=0,
    pool_pre_ping=True,
    pool_recycle=3600,
)
```
### Redis Configuration
#### Redis Production Settings
```bash
# redis.conf adjustments
maxmemory 512mb
maxmemory-policy allkeys-lru
save 900 1
save 300 10
save 60 10000
```
### Application Optimization
#### Logging Configuration
```python
# src/app/core/config.py
import logging
from pythonjsonlogger import jsonlogger
def setup_production_logging():
    logHandler = logging.StreamHandler()
    formatter = jsonlogger.JsonFormatter(
        "%(asctime)s %(name)s %(levelname)s %(message)s"
    )
    logHandler.setFormatter(formatter)

    logger = logging.getLogger()
    logger.addHandler(logHandler)
    logger.setLevel(logging.INFO)

    # Reduce noise from third-party libraries
    logging.getLogger("uvicorn.access").setLevel(logging.WARNING)
    logging.getLogger("sqlalchemy.engine").setLevel(logging.WARNING)
```
#### Performance Monitoring
```python
# src/app/middleware/monitoring.py
import logging
import time

from fastapi import Request
from starlette.middleware.base import BaseHTTPMiddleware

logger = logging.getLogger(__name__)


class MonitoringMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        start_time = time.time()
        response = await call_next(request)
        process_time = time.time() - start_time
        response.headers["X-Process-Time"] = str(process_time)

        # Log slow requests
        if process_time > 1.0:
            logger.warning(f"Slow request: {request.method} {request.url} - {process_time:.2f}s")
        return response
```
### Security Configuration
#### Environment Security
```python
# src/app/core/config.py
from pydantic import Field


class ProductionSettings(Settings):
    # Hide docs in production
    ENVIRONMENT: str = "production"

    # Security settings
    SECRET_KEY: str = Field(..., min_length=32)
    ALLOWED_HOSTS: list[str] = ["your-domain.com", "api.your-domain.com"]

    # Database security
    POSTGRES_PASSWORD: str = Field(..., min_length=16)

    class Config:
        case_sensitive = True
```
#### Rate Limiting
```python
# Adjust rate limits for production
DEFAULT_RATE_LIMIT_LIMIT = 100 # requests per period
DEFAULT_RATE_LIMIT_PERIOD = 3600 # 1 hour
```
### Health Checks
#### Application Health Check
```python
# src/app/api/v1/health.py
from datetime import UTC, datetime

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncSession

from ...core.db.database import async_get_db
from ...core.utils.cache import redis_client

router = APIRouter()


@router.get("/health")
async def health_check():
    return {"status": "healthy", "timestamp": datetime.now(UTC)}


@router.get("/health/detailed")
async def detailed_health_check(db: AsyncSession = Depends(async_get_db)):
    health_status = {"status": "healthy", "services": {}}

    # Check database (SQLAlchemy 2.0 requires textual SQL to be wrapped in text())
    try:
        await db.execute(text("SELECT 1"))
        health_status["services"]["database"] = "healthy"
    except Exception:
        health_status["services"]["database"] = "unhealthy"
        health_status["status"] = "unhealthy"

    # Check Redis
    try:
        await redis_client.ping()
        health_status["services"]["redis"] = "healthy"
    except Exception:
        health_status["services"]["redis"] = "unhealthy"
        health_status["status"] = "unhealthy"

    if health_status["status"] == "unhealthy":
        raise HTTPException(status_code=503, detail=health_status)
    return health_status
```
### Deployment Process
#### CI/CD Pipeline (GitHub Actions)
```yaml
# .github/workflows/deploy.yml
name: Deploy to Production

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Build and push Docker image
        env:
          DOCKER_REGISTRY: your-registry.com
        run: |
          docker build -t $DOCKER_REGISTRY/fastapi-app:latest .
          docker push $DOCKER_REGISTRY/fastapi-app:latest

      - name: Deploy to production
        run: |
          # Your deployment commands
          ssh production-server "docker compose pull && docker compose up -d"
```
#### Zero-Downtime Deployment
```bash
#!/bin/bash
# deploy.sh - Zero-downtime deployment script
set -euo pipefail

# Pull new images
docker compose pull

# Start new containers alongside the old ones
docker compose up -d --no-deps --scale web=2 web

# Wait until the app answers its health check (adjust URL/port to your setup)
until curl -fsS http://localhost:8000/health > /dev/null; do
    sleep 2
done

# Stop old containers
docker compose up -d --no-deps --scale web=1 web

# Clean up
docker system prune -f
```
### Monitoring and Alerting
#### Basic Monitoring Setup
```python
# Basic metrics collection
import psutil
from fastapi import APIRouter

router = APIRouter()


@router.get("/metrics")
async def get_metrics():
    return {
        "cpu_percent": psutil.cpu_percent(),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_usage": psutil.disk_usage("/").percent,
    }
```
### Backup Strategy
#### Database Backup
```bash
#!/bin/bash
# backup-db.sh
BACKUP_DIR="/backups"
DATE=$(date +%Y%m%d_%H%M%S)

pg_dump -h localhost -U "$POSTGRES_USER" "$POSTGRES_DB" | gzip > "$BACKUP_DIR/backup_$DATE.sql.gz"

# Keep only the last 7 days of backups
find "$BACKUP_DIR" -name "backup_*.sql.gz" -mtime +7 -delete
```
## Troubleshooting
### Common Production Issues
**High Memory Usage**: Check for memory leaks, optimize database queries, adjust worker counts
**Slow Response Times**: Enable query logging, check database indexes, optimize N+1 queries
**Connection Timeouts**: Adjust proxy timeouts, check database connection pool settings
**SSL Certificate Issues**: Verify certificate paths, check renewal process
### Performance Tuning
- Monitor database query performance
- Implement proper caching strategies
- Use connection pooling
- Optimize Docker image layers
- Configure proper resource limits
This production guide provides a solid foundation for deploying the FastAPI boilerplate to production environments with proper performance, security, and reliability configurations.

# Project Structure
Understanding the project structure is essential for navigating the FastAPI Boilerplate effectively. This guide explains the organization of the codebase, the purpose of each directory, and how components interact with each other.
## Overview
The FastAPI Boilerplate follows a clean, modular architecture that separates concerns and promotes maintainability. The structure is designed to scale from simple APIs to complex applications while maintaining code organization and clarity.
## Root Directory Structure
```text
FastAPI-boilerplate/
├── Dockerfile # Container configuration
├── docker-compose.yml # Multi-service orchestration
├── pyproject.toml # Project configuration and dependencies
├── uv.lock # Dependency lock file
├── README.md # Project documentation
├── LICENSE.md # License information
├── tests/ # Test suite
├── docs/ # Documentation
└── src/ # Source code
```
### Configuration Files
| File | Purpose |
|------|---------|
| `Dockerfile` | Defines the container image for the application |
| `docker-compose.yml` | Orchestrates multiple services (API, database, Redis, worker) |
| `pyproject.toml` | Modern Python project configuration with dependencies and metadata |
| `uv.lock` | Locks exact dependency versions for reproducible builds |
## Source Code Structure
The `src/` directory contains all application code:
```text
src/
├── app/                      # Main application package
│   ├── main.py               # Application entry point
│   ├── api/                  # API layer
│   ├── core/                 # Core utilities and configurations
│   ├── crud/                 # Database operations
│   ├── models/               # SQLAlchemy models
│   ├── schemas/              # Pydantic schemas
│   ├── middleware/           # Custom middleware
│   └── logs/                 # Application logs
├── migrations/               # Database migrations
└── scripts/                  # Utility scripts
```
## Core Application (`src/app/`)
### Entry Point
- **`main.py`** - FastAPI application instance and configuration
### API Layer (`api/`)
```text
api/
├── dependencies.py           # Shared dependencies
└── v1/                       # API version 1
    ├── login.py              # Authentication endpoints
    ├── logout.py             # Logout functionality
    ├── users.py              # User management
    ├── posts.py              # Post operations
    ├── tasks.py              # Background task endpoints
    ├── tiers.py              # User tier management
    └── rate_limits.py        # Rate limiting endpoints
```
**Purpose**: Contains all API endpoints organized by functionality and version.
### Core System (`core/`)
```text
core/
├── config.py # Application settings
├── logger.py # Logging configuration
├── schemas.py # Core Pydantic schemas
├── security.py # Security utilities
├── setup.py # Application factory
├── db/ # Database core
├── exceptions/ # Custom exceptions
├── utils/ # Utility functions
└── worker/ # Background worker
```
**Purpose**: Houses core functionality, configuration, and shared utilities.
#### Database Core (`core/db/`)
```text
db/
├── database.py # Database connection and session management
├── models.py # Base models and mixins
├── crud_token_blacklist.py # Token blacklist operations
└── token_blacklist.py # Token blacklist model
```
#### Exceptions (`core/exceptions/`)
```text
exceptions/
├── cache_exceptions.py # Cache-related exceptions
└── http_exceptions.py # HTTP exceptions
```
#### Utilities (`core/utils/`)
```text
utils/
├── cache.py # Caching utilities
├── queue.py # Task queue management
└── rate_limit.py # Rate limiting utilities
```
#### Worker (`core/worker/`)
```text
worker/
├── settings.py # Worker configuration
└── functions.py # Background task definitions
```
### Data Layer
#### Models (`models/`)
```text
models/
├── user.py # User model
├── post.py # Post model
├── tier.py # User tier model
└── rate_limit.py # Rate limit model
```
**Purpose**: SQLAlchemy ORM models defining database schema.
#### Schemas (`schemas/`)
```text
schemas/
├── user.py # User validation schemas
├── post.py # Post validation schemas
├── tier.py # Tier validation schemas
├── rate_limit.py # Rate limit schemas
└── job.py # Background job schemas
```
**Purpose**: Pydantic schemas for request/response validation and serialization.
#### CRUD Operations (`crud/`)
```text
crud/
├── crud_base.py # Base CRUD class
├── crud_users.py # User operations
├── crud_posts.py # Post operations
├── crud_tier.py # Tier operations
├── crud_rate_limit.py # Rate limit operations
└── helper.py # CRUD helper functions
```
**Purpose**: Database operations using FastCRUD for consistent data access patterns.
### Additional Components
#### Middleware (`middleware/`)
```text
middleware/
└── client_cache_middleware.py # Client-side caching middleware
```
#### Logs (`logs/`)
```text
logs/
└── app.log # Application log file
```
## Database Migrations (`src/migrations/`)
```text
migrations/
├── README # Migration instructions
├── env.py # Alembic environment configuration
├── script.py.mako # Migration template
└── versions/ # Individual migration files
```
**Purpose**: Alembic database migrations for schema version control.
## Utility Scripts (`src/scripts/`)
```text
scripts/
├── create_first_superuser.py # Create initial admin user
└── create_first_tier.py # Create initial user tier
```
**Purpose**: Initialization and maintenance scripts.
## Testing Structure (`tests/`)
```text
tests/
├── conftest.py               # Pytest configuration and fixtures
├── test_user_unit.py         # User-related unit tests
└── helpers/                  # Test utilities
    ├── generators.py         # Test data generators
    └── mocks.py              # Mock objects and functions
```
## Architectural Patterns
### Layered Architecture
The boilerplate implements a clean layered architecture:
1. **API Layer** (`api/`) - Handles HTTP requests and responses
2. **Business Logic** (`crud/`) - Implements business rules and data operations
3. **Data Access** (`models/`) - Defines data structure and database interaction
4. **Core Services** (`core/`) - Provides shared functionality and configuration
### Dependency Injection
FastAPI's dependency injection system is used throughout:
- **Database Sessions** - Injected into endpoints via `async_get_db`
- **Authentication** - User context provided by `get_current_user`
- **Rate Limiting** - Applied via `rate_limiter_dependency`
- **Caching** - Managed through decorators and middleware
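A hypothetical endpoint combining these dependencies might look like this (a sketch only; the actual names live in `src/app/api/dependencies.py` and `src/app/core/db/database.py`):

```python
from typing import Annotated

from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession

from ..dependencies import get_current_user, rate_limiter_dependency
from ...core.db.database import async_get_db

router = APIRouter()


@router.get("/me/posts", dependencies=[Depends(rate_limiter_dependency)])
async def read_my_posts(
    db: Annotated[AsyncSession, Depends(async_get_db)],   # injected DB session
    current_user: Annotated[dict, Depends(get_current_user)],  # injected auth context
):
    # The endpoint body only deals with business logic; session lifecycle,
    # authentication, and rate limiting are all handled by the dependencies.
    return await crud_posts.get_multi(db=db, created_by_user_id=current_user["id"])
```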
### Configuration Management
All configuration is centralized in `core/config.py`:
- **Environment Variables** - Loaded from `.env` file
- **Settings Classes** - Organized by functionality (database, security, etc.)
- **Type Safety** - Using Pydantic for validation
### Error Handling
Centralized exception handling:
- **Custom Exceptions** - Defined in `core/exceptions/`
- **HTTP Status Codes** - Consistent error responses
- **Logging** - Automatic error logging and tracking
## Design Principles
### Single Responsibility
Each module has a clear, single purpose:
- Models define data structure
- Schemas handle validation
- CRUD manages data operations
- API endpoints handle requests
### Separation of Concerns
- Business logic separated from presentation
- Database operations isolated from API logic
- Configuration centralized and environment-aware
### Modularity
- Features can be added/removed independently
- Services can be disabled via configuration
- Clear interfaces between components
### Scalability
- Async/await throughout the application
- Connection pooling for database access
- Caching and background task support
- Horizontal scaling ready
## Navigation Tips
### Finding Code
- **Models** → `src/app/models/`
- **API Endpoints** → `src/app/api/v1/`
- **Database Operations** → `src/app/crud/`
- **Configuration** → `src/app/core/config.py`
- **Business Logic** → Distributed across CRUD and API layers
### Adding New Features
1. **Model** → Define in `models/`
2. **Schema** → Create in `schemas/`
3. **CRUD** → Implement in `crud/`
4. **API** → Add endpoints in `api/v1/`
5. **Migration** → Generate with Alembic
### Understanding Data Flow
```text
Request → API Endpoint → Dependencies → CRUD → Model → Database
Response ← API Response ← Schema ← CRUD ← Query Result ← Database
```
This structure provides a solid foundation for building scalable, maintainable APIs while keeping the codebase organized and easy to navigate.

# Rate Limiting
The boilerplate includes a sophisticated rate limiting system built on Redis that protects your API from abuse while supporting user tiers with different access levels. This system provides flexible, scalable rate limiting for production applications.
## Overview
Rate limiting controls how many requests users can make within a specific time period. The boilerplate implements:
- **Redis-Based Storage**: Fast, distributed rate limiting using Redis
- **User Tier System**: Different limits for different user types
- **Path-Specific Limits**: Granular control per API endpoint
- **Fallback Protection**: Default limits for unauthenticated users
## Quick Example
```python
from fastapi import Depends

from app.api.dependencies import rate_limiter_dependency


@router.post("/api/v1/posts", dependencies=[Depends(rate_limiter_dependency)])
async def create_post(post_data: PostCreate, db: AsyncSession = Depends(async_get_db)):
    # This endpoint is automatically rate limited based on:
    # - The user's tier (basic, premium, enterprise)
    # - Specific limits for the /posts endpoint
    # - Default limits for unauthenticated users
    return await crud_posts.create(db=db, object=post_data)
```
## Architecture
### Rate Limiting Components
**Rate Limiter Class**: Singleton Redis client for checking limits<br>
**User Tiers**: Database-stored user subscription levels<br>
**Rate Limit Rules**: Path-specific limits per tier<br>
**Dependency Injection**: Automatic enforcement via FastAPI dependencies<br>
### How It Works
1. **Request Arrives**: User makes API request to protected endpoint
2. **User Identification**: System identifies user and their tier
3. **Limit Lookup**: Finds applicable rate limit for user tier + endpoint
4. **Redis Check**: Increments a counter for the current fixed time window in Redis
5. **Allow/Deny**: Request proceeds or returns 429 Too Many Requests
## User Tier System
### Default Tiers
The system supports flexible user tiers with different access levels:
```python
# Example tier configuration
tiers = {
    "free": {
        "requests_per_minute": 10,
        "requests_per_hour": 100,
        "special_endpoints": {
            "/api/v1/ai/generate": {"limit": 2, "period": 3600},  # 2 per hour
            "/api/v1/exports": {"limit": 1, "period": 86400},     # 1 per day
        },
    },
    "premium": {
        "requests_per_minute": 60,
        "requests_per_hour": 1000,
        "special_endpoints": {
            "/api/v1/ai/generate": {"limit": 50, "period": 3600},
            "/api/v1/exports": {"limit": 10, "period": 86400},
        },
    },
    "enterprise": {
        "requests_per_minute": 300,
        "requests_per_hour": 10000,
        "special_endpoints": {
            "/api/v1/ai/generate": {"limit": 500, "period": 3600},
            "/api/v1/exports": {"limit": 100, "period": 86400},
        },
    },
}
```
### Rate Limit Database Structure
```python
# Rate limits are stored per tier and path
class RateLimit:
    id: int
    tier_id: int  # Links to user tier
    name: str     # Descriptive name
    path: str     # API path (sanitized)
    limit: int    # Number of requests allowed
    period: int   # Time period in seconds
```
## Implementation Details
### Automatic Rate Limiting
The system automatically applies rate limiting through dependency injection:
```python
@router.post("/protected-endpoint", dependencies=[Depends(rate_limiter_dependency)])
async def protected_endpoint():
    """This endpoint is automatically rate limited."""
    pass

# The dependency:
# 1. Identifies the user and their tier
# 2. Looks up rate limits for this path
# 3. Checks the Redis counter
# 4. Allows or blocks the request
```
#### Example Dependency Implementation
To make the rate limiting dependency functional, you must implement how user tiers and paths resolve to actual rate limits.
Below is a complete example using Redis and the database to determine per-tier and per-path restrictions.
```python
async def rate_limiter_dependency(
    request: Request,
    db: AsyncSession = Depends(async_get_db),
    user=Depends(get_current_user_optional),
):
    """
    Enforces rate limits per user tier and API path.

    - Identifies the user (or falls back to an IP-based anonymous limit)
    - Finds the tier-specific limit for the request path
    - Checks the Redis counter to decide whether the request is allowed
    """
    path = sanitize_path(request.url.path)
    user_id = getattr(user, "id", None) or request.client.host or "anonymous"

    # Determine user tier (default to "free" or anonymous)
    if user and getattr(user, "tier_id", None):
        tier = await crud_tiers.get(db=db, id=user.tier_id)
    else:
        tier = await crud_tiers.get(db=db, name="free")

    if not tier:
        raise RateLimitException("Tier configuration not found")

    # Find the specific rate limit rule for this path + tier
    rate_limit_rule = await crud_rate_limits.get_by_path_and_tier(
        db=db, path=path, tier_id=tier.id
    )

    # Use default limits if no specific rule is found
    limit = getattr(rate_limit_rule, "limit", 100)
    period = getattr(rate_limit_rule, "period", 3600)

    # Check the rate limit in Redis
    is_limited = await rate_limiter.is_rate_limited(
        user_id=user_id,
        path=path,
        limit=limit,
        period=period,
    )
    if is_limited:
        raise RateLimitException(
            f"Rate limit exceeded for path '{path}'. Try again later."
        )
```
### Redis-Based Counting
The rate limiter uses Redis for distributed, high-performance counting:
```python
# Fixed-window counter (window boundaries aligned to the period)
async def is_rate_limited(self, user_id: int, path: str, limit: int, period: int) -> bool:
    current_timestamp = int(datetime.now(UTC).timestamp())
    window_start = current_timestamp - (current_timestamp % period)

    # Create a unique key for this user/path/window
    key = f"ratelimit:{user_id}:{path}:{window_start}"

    # Increment the counter
    current_count = await self.client.incr(key)

    # Set the expiration on first increment
    if current_count == 1:
        await self.client.expire(key, period)

    # Check if the limit is exceeded
    return current_count > limit
```
### Path Sanitization
API paths are sanitized for consistent Redis key generation:
```python
def sanitize_path(path: str) -> str:
    return path.strip("/").replace("/", "_")

# Examples:
# "/api/v1/users" → "api_v1_users"
# "/posts/{id}"   → "posts_{id}"
```
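As a quick sanity check, the sanitizer composes with the Redis key layout used in the counting example (re-declared here so the snippet runs standalone; the exact key format is an assumption of this guide):

```python
def sanitize_path(path: str) -> str:
    # Same sanitizer as above, re-declared so this snippet runs on its own
    return path.strip("/").replace("/", "_")


def rate_limit_key(user_id: int, path: str, window_start: int) -> str:
    # Key layout assumed from the fixed-window counting example
    return f"ratelimit:{user_id}:{sanitize_path(path)}:{window_start}"


print(rate_limit_key(42, "/api/v1/users", 1700000000))
# → ratelimit:42:api_v1_users:1700000000
```

Sanitizing before keying means `/api/v1/users` and `api/v1/users/` collapse to the same counter, which is usually what you want.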
## Configuration
### Environment Variables
```bash
# Rate Limiting Settings
DEFAULT_RATE_LIMIT_LIMIT=100 # Default requests per period
DEFAULT_RATE_LIMIT_PERIOD=3600 # Default period (1 hour)
# Redis Rate Limiter Settings
REDIS_RATE_LIMITER_HOST=localhost
REDIS_RATE_LIMITER_PORT=6379
REDIS_RATE_LIMITER_DB=2 # Separate from cache/queue
```
### Creating User Tiers
```python
# Create tiers via API (superuser only)
POST /api/v1/tiers
{
"name": "premium",
"description": "Premium subscription with higher limits"
}
# Assign tier to user
PUT /api/v1/users/{user_id}/tier
{
"tier_id": 2
}
```
### Setting Rate Limits
```python
# Create rate limits per tier and endpoint
POST /api/v1/tier/premium/rate_limit
{
"name": "premium_posts_limit",
"path": "/api/v1/posts",
"limit": 100, # 100 requests
"period": 3600 # per hour
}
# Different limits for different endpoints
POST /api/v1/tier/free/rate_limit
{
"name": "free_ai_limit",
"path": "/api/v1/ai/generate",
"limit": 5, # 5 requests
"period": 86400 # per day
}
```
## Usage Patterns
### Basic Protection
```python
# Protect all endpoints in a router
router = APIRouter(dependencies=[Depends(rate_limiter_dependency)])


@router.get("/users")
async def get_users():
    """Rate limited based on user tier."""
    pass


@router.post("/posts")
async def create_post():
    """Rate limited based on user tier."""
    pass
```
### Selective Protection
```python
# Protect only specific endpoints
@router.get("/public-data")
async def get_public_data():
    """No rate limiting - public endpoint."""
    pass


@router.post("/premium-feature", dependencies=[Depends(rate_limiter_dependency)])
async def premium_feature():
    """Rate limited - premium feature."""
    pass
```
### Custom Error Handling
```python
from app.core.exceptions.http_exceptions import RateLimitException


@app.exception_handler(RateLimitException)
async def rate_limit_handler(request: Request, exc: RateLimitException):
    """Custom rate limit error response."""
    return JSONResponse(
        status_code=429,
        content={
            "error": "Rate limit exceeded",
            "message": "Too many requests. Please try again later.",
            "retry_after": 60,  # Suggested retry time
        },
        headers={"Retry-After": "60"},
    )
```
## Monitoring and Analytics
### Rate Limit Metrics
```python
@router.get("/admin/rate-limit-stats")
async def get_rate_limit_stats():
    """Monitor rate limiting effectiveness."""
    # Get Redis statistics
    redis_info = await rate_limiter.client.info()

    # Count current rate limit keys
    # (KEYS blocks Redis; prefer SCAN on large production datasets)
    pattern = "ratelimit:*"
    keys = await rate_limiter.client.keys(pattern)

    # Analyze by endpoint (key layout: ratelimit:<user_id>:<path>:<window>)
    endpoint_stats = {}
    for key in keys:
        parts = key.split(":")
        if len(parts) >= 3:
            endpoint = parts[2]
            endpoint_stats[endpoint] = endpoint_stats.get(endpoint, 0) + 1

    return {
        "total_active_limits": len(keys),
        "redis_memory_usage": redis_info.get("used_memory_human"),
        "endpoint_stats": endpoint_stats,
    }
```
### User Analytics
```python
async def analyze_user_usage(user_id: int, days: int = 7):
    """Analyze a user's API usage patterns."""
    # This would require additional logging/analytics
    # implementation to track request patterns
    return {
        "user_id": user_id,
        "tier": "premium",
        "requests_last_7_days": 2540,
        "average_requests_per_day": 363,
        "top_endpoints": [
            {"path": "/api/v1/posts", "count": 1200},
            {"path": "/api/v1/users", "count": 800},
            {"path": "/api/v1/ai/generate", "count": 540},
        ],
        "rate_limit_hits": 12,  # Times the user hit rate limits
        "suggested_tier": "enterprise",  # Based on usage patterns
    }
```
## Best Practices
### Rate Limit Design
```python
# Design limits based on resource cost
expensive_endpoints = {
    "/api/v1/ai/generate": {"limit": 10, "period": 3600},     # AI is expensive
    "/api/v1/reports/export": {"limit": 3, "period": 86400},  # Export is heavy
    "/api/v1/bulk/import": {"limit": 1, "period": 3600},      # Import is intensive
}

# More generous limits for lightweight endpoints
lightweight_endpoints = {
    "/api/v1/users/me": {"limit": 1000, "period": 3600},  # Profile access
    "/api/v1/posts": {"limit": 300, "period": 3600},      # Content browsing
    "/api/v1/search": {"limit": 500, "period": 3600},     # Search queries
}
```
### Production Considerations
```python
# Use separate Redis database for rate limiting
REDIS_RATE_LIMITER_DB=2 # Isolate from cache and queues
# Set appropriate Redis memory policies
# maxmemory-policy volatile-lru # Remove expired rate limit keys first
# Monitor Redis memory usage
# Rate limit keys can accumulate quickly under high load
# Consider rate limit key cleanup
async def cleanup_expired_rate_limits():
    """Remove rate limit keys that never received an expiration."""
    pattern = "ratelimit:*"
    keys = await redis_client.keys(pattern)
    for key in keys:
        ttl = await redis_client.ttl(key)
        if ttl == -1:  # No TTL set (e.g. expire() failed); key would persist forever
            await redis_client.delete(key)
```
### Security Considerations
```python
# Rate limit by IP for unauthenticated users
if not user:
    user_id = request.client.host if request.client else "unknown"
    limit, period = DEFAULT_LIMIT, DEFAULT_PERIOD

# Prevent rate limit enumeration attacks:
# - Don't expose the exact number of remaining requests in error messages
# - Use progressive delays for repeated violations
# - Consider temporary bans for severe abuse

# Log rate limit violations for security monitoring
if is_limited:
    logger.warning(
        "Rate limit exceeded",
        extra={
            "user_id": user_id,
            "path": path,
            "ip": request.client.host if request.client else "unknown",
            "user_agent": request.headers.get("user-agent"),
        },
    )
```
## Common Use Cases
### API Monetization
```python
# Different tiers for different pricing levels
# Different tiers for different pricing levels
tiers = {
    "free": {"daily_requests": 1000, "cost": 0},
    "starter": {"daily_requests": 10000, "cost": 29},
    "professional": {"daily_requests": 100000, "cost": 99},
    "enterprise": {"daily_requests": 1000000, "cost": 499},
}
```
### Resource Protection
```python
# Protect expensive operations
@router.post("/ai/generate-image", dependencies=[Depends(rate_limiter_dependency)])
async def generate_image():
    """Expensive AI operation - heavily rate limited."""
    pass


@router.get("/data/export", dependencies=[Depends(rate_limiter_dependency)])
async def export_data():
    """Database-intensive operation - rate limited."""
    pass
```
### Abuse Prevention
```python
# Strict limits on user-generated content
@router.post("/posts", dependencies=[Depends(rate_limiter_dependency)])
async def create_post():
    """Prevent spam posting."""
    pass


@router.post("/comments", dependencies=[Depends(rate_limiter_dependency)])
async def create_comment():
    """Prevent comment spam."""
    pass
```
This comprehensive rate limiting system provides robust protection against API abuse while supporting flexible business models through user tiers and granular endpoint controls.

# Testing Guide
This guide covers comprehensive testing strategies for the FastAPI boilerplate, including unit tests, integration tests, and API testing.
## Test Setup
### Testing Dependencies
The boilerplate uses these testing libraries:
- **pytest** - Testing framework
- **pytest-asyncio** - Async test support
- **httpx** - Async HTTP client for API tests
- **pytest-cov** - Coverage reporting
- **faker** - Test data generation
### Test Configuration
#### pytest.ini
```ini
[pytest]
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*
addopts =
    -v
    --strict-markers
    --strict-config
    --cov=src
    --cov-report=term-missing
    --cov-report=html
    --cov-report=xml
    --cov-fail-under=80
markers =
    unit: Unit tests
    integration: Integration tests
    api: API tests
    slow: Slow tests
asyncio_mode = auto
```
#### Test Database Setup
Create `tests/conftest.py`:
```python
import asyncio
import pytest
import pytest_asyncio
from typing import AsyncGenerator
from httpx import AsyncClient
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
from faker import Faker
from src.app.core.config import settings
from src.app.core.db.database import Base, async_get_db
from src.app.main import app
from src.app.models.user import User
from src.app.models.post import Post
from src.app.core.security import get_password_hash
# Test database configuration
TEST_DATABASE_URL = "postgresql+asyncpg://test_user:test_pass@localhost:5432/test_db"
# Create test engine and session
test_engine = create_async_engine(TEST_DATABASE_URL, echo=False)
TestSessionLocal = sessionmaker(
test_engine, class_=AsyncSession, expire_on_commit=False
)
fake = Faker()
@pytest_asyncio.fixture
async def async_session() -> AsyncGenerator[AsyncSession, None]:
"""Create a fresh database session for each test."""
async with test_engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)
async with TestSessionLocal() as session:
yield session
async with test_engine.begin() as conn:
await conn.run_sync(Base.metadata.drop_all)
@pytest_asyncio.fixture
async def async_client(async_session: AsyncSession) -> AsyncGenerator[AsyncClient, None]:
    """Create an async HTTP client that calls the app in-process."""
    from httpx import ASGITransport  # httpx >= 0.27; the `app=` shortcut is removed in 0.28
    def get_test_db():
        return async_session
    app.dependency_overrides[async_get_db] = get_test_db
    async with AsyncClient(transport=ASGITransport(app=app), base_url="http://test") as client:
        yield client
    app.dependency_overrides.clear()
@pytest_asyncio.fixture
async def test_user(async_session: AsyncSession) -> User:
"""Create a test user."""
user = User(
name=fake.name(),
username=fake.user_name(),
email=fake.email(),
hashed_password=get_password_hash("testpassword123"),
is_superuser=False
)
async_session.add(user)
await async_session.commit()
await async_session.refresh(user)
return user
@pytest_asyncio.fixture
async def test_superuser(async_session: AsyncSession) -> User:
"""Create a test superuser."""
user = User(
name="Super Admin",
username="superadmin",
email="admin@test.com",
hashed_password=get_password_hash("superpassword123"),
is_superuser=True
)
async_session.add(user)
await async_session.commit()
await async_session.refresh(user)
return user
@pytest_asyncio.fixture
async def test_post(async_session: AsyncSession, test_user: User) -> Post:
"""Create a test post."""
post = Post(
title=fake.sentence(),
content=fake.text(),
created_by_user_id=test_user.id
)
async_session.add(post)
await async_session.commit()
await async_session.refresh(post)
return post
@pytest_asyncio.fixture
async def auth_headers(async_client: AsyncClient, test_user: User) -> dict:
"""Get authentication headers for a test user."""
login_data = {
"username": test_user.username,
"password": "testpassword123"
}
response = await async_client.post("/api/v1/auth/login", data=login_data)
token = response.json()["access_token"]
return {"Authorization": f"Bearer {token}"}
@pytest_asyncio.fixture
async def superuser_headers(async_client: AsyncClient, test_superuser: User) -> dict:
"""Get authentication headers for a test superuser."""
login_data = {
"username": test_superuser.username,
"password": "superpassword123"
}
response = await async_client.post("/api/v1/auth/login", data=login_data)
token = response.json()["access_token"]
return {"Authorization": f"Bearer {token}"}
```
## Unit Tests
### Model Tests
```python
# tests/test_models.py
import pytest
from datetime import datetime
from src.app.models.user import User
from src.app.models.post import Post
@pytest.mark.unit
class TestUserModel:
"""Test User model functionality."""
async def test_user_creation(self, async_session):
"""Test creating a user."""
user = User(
name="Test User",
username="testuser",
email="test@example.com",
hashed_password="hashed_password"
)
async_session.add(user)
await async_session.commit()
await async_session.refresh(user)
assert user.id is not None
assert user.name == "Test User"
assert user.username == "testuser"
assert user.email == "test@example.com"
assert user.created_at is not None
assert user.is_superuser is False
assert user.is_deleted is False
async def test_user_relationships(self, async_session, test_user):
"""Test user relationships."""
post = Post(
title="Test Post",
content="Test content",
created_by_user_id=test_user.id
)
async_session.add(post)
await async_session.commit()
# Test relationship
await async_session.refresh(test_user)
assert len(test_user.posts) == 1
assert test_user.posts[0].title == "Test Post"
@pytest.mark.unit
class TestPostModel:
"""Test Post model functionality."""
async def test_post_creation(self, async_session, test_user):
"""Test creating a post."""
post = Post(
title="Test Post",
content="This is test content",
created_by_user_id=test_user.id
)
async_session.add(post)
await async_session.commit()
await async_session.refresh(post)
assert post.id is not None
assert post.title == "Test Post"
assert post.content == "This is test content"
assert post.created_by_user_id == test_user.id
assert post.created_at is not None
assert post.is_deleted is False
```
### Schema Tests
```python
# tests/test_schemas.py
import pytest
from pydantic import ValidationError
from src.app.schemas.user import UserCreate, UserRead, UserUpdate
from src.app.schemas.post import PostCreate, PostRead, PostUpdate
@pytest.mark.unit
class TestUserSchemas:
"""Test User schema validation."""
def test_user_create_valid(self):
"""Test valid user creation schema."""
user_data = {
"name": "John Doe",
"username": "johndoe",
"email": "john@example.com",
"password": "SecurePass123!"
}
user = UserCreate(**user_data)
assert user.name == "John Doe"
assert user.username == "johndoe"
assert user.email == "john@example.com"
assert user.password == "SecurePass123!"
def test_user_create_invalid_email(self):
"""Test invalid email validation."""
with pytest.raises(ValidationError) as exc_info:
UserCreate(
name="John Doe",
username="johndoe",
email="invalid-email",
password="SecurePass123!"
)
errors = exc_info.value.errors()
assert any(error['type'] == 'value_error' for error in errors)
def test_user_create_short_password(self):
"""Test password length validation."""
with pytest.raises(ValidationError) as exc_info:
UserCreate(
name="John Doe",
username="johndoe",
email="john@example.com",
password="123"
)
        errors = exc_info.value.errors()
        assert any(error["loc"] == ("password",) for error in errors)
def test_user_update_partial(self):
"""Test partial user update."""
update_data = {"name": "Jane Doe"}
user_update = UserUpdate(**update_data)
assert user_update.name == "Jane Doe"
assert user_update.username is None
assert user_update.email is None
@pytest.mark.unit
class TestPostSchemas:
"""Test Post schema validation."""
def test_post_create_valid(self):
"""Test valid post creation."""
post_data = {
"title": "Test Post",
"content": "This is a test post content"
}
post = PostCreate(**post_data)
assert post.title == "Test Post"
assert post.content == "This is a test post content"
def test_post_create_empty_title(self):
"""Test empty title validation."""
with pytest.raises(ValidationError):
PostCreate(
title="",
content="This is a test post content"
)
def test_post_create_long_title(self):
"""Test title length validation."""
with pytest.raises(ValidationError):
PostCreate(
title="x" * 101, # Exceeds max length
content="This is a test post content"
)
```
### CRUD Tests
```python
# tests/test_crud.py
import pytest
from src.app.crud.crud_users import crud_users
from src.app.crud.crud_posts import crud_posts
from src.app.schemas.user import UserCreate, UserUpdate
from src.app.schemas.post import PostCreate, PostUpdate
@pytest.mark.unit
class TestUserCRUD:
"""Test User CRUD operations."""
async def test_create_user(self, async_session):
"""Test creating a user."""
user_data = UserCreate(
name="CRUD User",
username="cruduser",
email="crud@example.com",
password="password123"
)
user = await crud_users.create(db=async_session, object=user_data)
assert user["name"] == "CRUD User"
assert user["username"] == "cruduser"
assert user["email"] == "crud@example.com"
assert "id" in user
async def test_get_user(self, async_session, test_user):
"""Test getting a user."""
retrieved_user = await crud_users.get(
db=async_session,
id=test_user.id
)
assert retrieved_user is not None
assert retrieved_user["id"] == test_user.id
assert retrieved_user["name"] == test_user.name
assert retrieved_user["username"] == test_user.username
async def test_get_user_by_email(self, async_session, test_user):
"""Test getting a user by email."""
retrieved_user = await crud_users.get(
db=async_session,
email=test_user.email
)
assert retrieved_user is not None
assert retrieved_user["email"] == test_user.email
async def test_update_user(self, async_session, test_user):
"""Test updating a user."""
update_data = UserUpdate(name="Updated Name")
updated_user = await crud_users.update(
db=async_session,
object=update_data,
id=test_user.id
)
assert updated_user["name"] == "Updated Name"
assert updated_user["id"] == test_user.id
async def test_delete_user(self, async_session, test_user):
"""Test soft deleting a user."""
await crud_users.delete(db=async_session, id=test_user.id)
# User should be soft deleted
deleted_user = await crud_users.get(
db=async_session,
id=test_user.id,
is_deleted=True
)
assert deleted_user is not None
assert deleted_user["is_deleted"] is True
async def test_get_multi_users(self, async_session):
"""Test getting multiple users."""
# Create multiple users
for i in range(5):
user_data = UserCreate(
name=f"User {i}",
username=f"user{i}",
email=f"user{i}@example.com",
password="password123"
)
await crud_users.create(db=async_session, object=user_data)
# Get users with pagination
result = await crud_users.get_multi(
db=async_session,
offset=0,
limit=3
)
assert len(result["data"]) == 3
assert result["total_count"] == 5
assert result["has_more"] is True
@pytest.mark.unit
class TestPostCRUD:
"""Test Post CRUD operations."""
async def test_create_post(self, async_session, test_user):
"""Test creating a post."""
post_data = PostCreate(
title="Test Post",
content="This is test content"
)
post = await crud_posts.create(
db=async_session,
object=post_data,
created_by_user_id=test_user.id
)
assert post["title"] == "Test Post"
assert post["content"] == "This is test content"
assert post["created_by_user_id"] == test_user.id
async def test_get_posts_by_user(self, async_session, test_user):
"""Test getting posts by user."""
# Create multiple posts
for i in range(3):
post_data = PostCreate(
title=f"Post {i}",
content=f"Content {i}"
)
await crud_posts.create(
db=async_session,
object=post_data,
created_by_user_id=test_user.id
)
# Get posts by user
result = await crud_posts.get_multi(
db=async_session,
created_by_user_id=test_user.id
)
assert len(result["data"]) == 3
assert result["total_count"] == 3
```
## Integration Tests
### API Endpoint Tests
```python
# tests/test_api_users.py
import pytest
from httpx import AsyncClient
@pytest.mark.integration
class TestUserAPI:
"""Test User API endpoints."""
async def test_create_user(self, async_client: AsyncClient):
"""Test user creation endpoint."""
user_data = {
"name": "New User",
"username": "newuser",
"email": "new@example.com",
"password": "SecurePass123!"
}
response = await async_client.post("/api/v1/users", json=user_data)
assert response.status_code == 201
data = response.json()
assert data["name"] == "New User"
assert data["username"] == "newuser"
assert data["email"] == "new@example.com"
assert "hashed_password" not in data
assert "id" in data
async def test_create_user_duplicate_email(self, async_client: AsyncClient, test_user):
"""Test creating user with duplicate email."""
user_data = {
"name": "Duplicate User",
"username": "duplicateuser",
"email": test_user.email, # Use existing email
"password": "SecurePass123!"
}
response = await async_client.post("/api/v1/users", json=user_data)
assert response.status_code == 409 # Conflict
async def test_get_users(self, async_client: AsyncClient):
"""Test getting users list."""
response = await async_client.get("/api/v1/users")
assert response.status_code == 200
data = response.json()
assert "data" in data
assert "total_count" in data
assert "has_more" in data
assert isinstance(data["data"], list)
async def test_get_user_by_id(self, async_client: AsyncClient, test_user):
"""Test getting specific user."""
response = await async_client.get(f"/api/v1/users/{test_user.id}")
assert response.status_code == 200
data = response.json()
assert data["id"] == test_user.id
assert data["name"] == test_user.name
assert data["username"] == test_user.username
async def test_get_user_not_found(self, async_client: AsyncClient):
"""Test getting non-existent user."""
response = await async_client.get("/api/v1/users/99999")
assert response.status_code == 404
async def test_update_user_authorized(self, async_client: AsyncClient, test_user, auth_headers):
"""Test updating user with proper authorization."""
update_data = {"name": "Updated Name"}
response = await async_client.patch(
f"/api/v1/users/{test_user.id}",
json=update_data,
headers=auth_headers
)
assert response.status_code == 200
data = response.json()
assert data["name"] == "Updated Name"
assert data["id"] == test_user.id
async def test_update_user_unauthorized(self, async_client: AsyncClient, test_user):
"""Test updating user without authorization."""
update_data = {"name": "Updated Name"}
response = await async_client.patch(
f"/api/v1/users/{test_user.id}",
json=update_data
)
assert response.status_code == 401
async def test_delete_user_superuser(self, async_client: AsyncClient, test_user, superuser_headers):
"""Test deleting user as superuser."""
response = await async_client.delete(
f"/api/v1/users/{test_user.id}",
headers=superuser_headers
)
assert response.status_code == 200
async def test_delete_user_forbidden(self, async_client: AsyncClient, test_user, auth_headers):
"""Test deleting user without superuser privileges."""
response = await async_client.delete(
f"/api/v1/users/{test_user.id}",
headers=auth_headers
)
assert response.status_code == 403
@pytest.mark.integration
class TestAuthAPI:
"""Test Authentication API endpoints."""
async def test_login_success(self, async_client: AsyncClient, test_user):
"""Test successful login."""
login_data = {
"username": test_user.username,
"password": "testpassword123"
}
response = await async_client.post("/api/v1/auth/login", data=login_data)
assert response.status_code == 200
data = response.json()
assert "access_token" in data
assert "refresh_token" in data
assert data["token_type"] == "bearer"
async def test_login_invalid_credentials(self, async_client: AsyncClient, test_user):
"""Test login with invalid credentials."""
login_data = {
"username": test_user.username,
"password": "wrongpassword"
}
response = await async_client.post("/api/v1/auth/login", data=login_data)
assert response.status_code == 401
async def test_get_current_user(self, async_client: AsyncClient, test_user, auth_headers):
"""Test getting current user information."""
response = await async_client.get("/api/v1/auth/me", headers=auth_headers)
assert response.status_code == 200
data = response.json()
assert data["id"] == test_user.id
assert data["username"] == test_user.username
async def test_refresh_token(self, async_client: AsyncClient, test_user):
"""Test token refresh."""
# First login to get refresh token
login_data = {
"username": test_user.username,
"password": "testpassword123"
}
login_response = await async_client.post("/api/v1/auth/login", data=login_data)
refresh_token = login_response.json()["refresh_token"]
# Use refresh token to get new access token
refresh_response = await async_client.post(
"/api/v1/auth/refresh",
headers={"Authorization": f"Bearer {refresh_token}"}
)
assert refresh_response.status_code == 200
data = refresh_response.json()
assert "access_token" in data
```
## Running Tests
### Basic Test Commands
```bash
# Run all tests
uv run pytest
# Run specific test categories
uv run pytest -m unit
uv run pytest -m integration
uv run pytest -m api
# Run tests with coverage
uv run pytest --cov=src --cov-report=html
# Run tests in parallel (requires pytest-xdist)
uv run pytest -n auto
# Run specific test file
uv run pytest tests/test_api_users.py
# Run with verbose output
uv run pytest -v
# Run tests matching pattern
uv run pytest -k "test_user"
# Run tests and stop on first failure
uv run pytest -x
# Run slow tests
uv run pytest -m slow
```
### Test Environment Setup
```bash
# Set up test database
createdb test_db
# Run tests with specific environment
ENVIRONMENT=testing uv run pytest
# Run tests with debug output
uv run pytest -s --log-cli-level=DEBUG
```
## Testing Best Practices
### Test Organization
- **Separate concerns**: Unit tests for business logic, integration tests for API endpoints
- **Use fixtures**: Create reusable test data and setup
- **Test isolation**: Each test should be independent
- **Clear naming**: Test names should describe what they're testing
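Applied to this guide, that separation gives a layout like the one its examples already assume:
```text
tests/
├── conftest.py         # shared fixtures (DB session, client, users)
├── test_models.py      # unit: model behavior
├── test_schemas.py     # unit: validation
├── test_crud.py        # unit: CRUD layer
└── test_api_users.py   # integration: API endpoints
```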
### Test Data
- **Use factories**: Create test data programmatically
- **Avoid hardcoded values**: Use variables and constants
- **Clean up**: Ensure tests don't leave data behind
- **Realistic data**: Use faker or similar libraries for realistic test data
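A factory can be as small as a function that yields unique, valid payloads; this sketch uses a counter instead of `faker` so the pattern stays visible (overrides keep hardcoded values out of individual tests):

```python
import itertools

# Monotonic counter guarantees unique usernames/emails across calls
_seq = itertools.count(1)

def user_factory(**overrides) -> dict:
    """Build a unique, valid user payload; keyword arguments override defaults."""
    i = next(_seq)
    data = {
        "name": f"User {i}",
        "username": f"user{i}",
        "email": f"user{i}@example.com",
        "password": "SecurePass123!",
    }
    data.update(overrides)  # pin only the fields a test actually cares about
    return data
```

`user_factory(email=test_user.email)` then expresses "duplicate email" without repeating the rest of the payload.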
### Assertions
- **Specific assertions**: Test specific behaviors, not just "it works"
- **Multiple assertions**: Test all relevant aspects of the response
- **Error cases**: Test error conditions and edge cases
- **Performance**: Include performance tests for critical paths
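One way to keep assertions specific is a shared helper that checks the whole response contract at once (a sketch; the field names follow the user payloads shown earlier in this guide):

```python
def assert_user_payload(data: dict) -> None:
    """Assert the full shape of a user response, not merely that the call 'worked'."""
    assert isinstance(data["id"], int) and data["id"] > 0
    assert data["name"] and data["username"]
    assert "@" in data["email"]
    assert "hashed_password" not in data  # secrets must never leak into responses
```

Calling it after every user endpoint test keeps the checks consistent and makes a leaked field fail loudly everywhere.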
### Mocking
```python
# Example of mocking external dependencies
import pytest
from unittest.mock import AsyncMock, patch

@pytest.mark.unit
async def test_external_api_call():
    """Test a function that calls an external API."""
    # new_callable=AsyncMock makes the patched coroutine awaitable
    with patch("src.app.services.external_api.make_request", new_callable=AsyncMock) as mock_request:
        mock_request.return_value = {"status": "success"}
        result = await some_function_that_calls_external_api()
        assert result["status"] == "success"
        mock_request.assert_awaited_once()
```
### Continuous Integration
```yaml
# .github/workflows/test.yml
name: Tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_USER: test_user
          POSTGRES_PASSWORD: test_pass
          POSTGRES_DB: test_db
        ports:
          - 5432:5432  # expose so tests on the runner can reach localhost:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
        with:
          python-version: "3.11"
- name: Install dependencies
run: |
pip install uv
uv sync
- name: Run tests
run: uv run pytest --cov=src --cov-report=xml
      - name: Upload coverage
        uses: codecov/codecov-action@v3
        with:
          files: ./coverage.xml
```
This testing guide provides comprehensive coverage of testing strategies for the FastAPI boilerplate, ensuring reliable and maintainable code.

mkdocs.yml Normal file
site_name: FastAPI Boilerplate
site_description: A production-ready FastAPI boilerplate with async support, JWT authentication, Redis caching, and more.
site_author: Benav Labs
site_url: https://github.com/benavlabs/fastapi-boilerplate
theme:
name: material
font:
text: Ubuntu
logo: assets/FastAPI-boilerplate.png
favicon: assets/FastAPI-boilerplate.png
features:
- navigation.instant
- navigation.instant.prefetch
- navigation.tabs
- navigation.indexes
- search.suggest
- content.code.copy
- content.code.annotate
- navigation.top
- navigation.footer
palette:
- media: "(prefers-color-scheme: light)"
scheme: default
primary: custom
accent: custom
toggle:
icon: material/brightness-7
name: Switch to dark mode
- media: "(prefers-color-scheme: dark)"
scheme: slate
primary: custom
accent: custom
toggle:
icon: material/brightness-4
name: Switch to light mode
plugins:
- search
- mkdocstrings:
handlers:
python:
rendering:
show_source: true
nav:
- Home: index.md
- Getting Started:
- Overview: getting-started/index.md
- Installation: getting-started/installation.md
- Configuration: getting-started/configuration.md
- First Run: getting-started/first-run.md
- User Guide:
- Overview: user-guide/index.md
- Project Structure: user-guide/project-structure.md
- Configuration:
- Overview: user-guide/configuration/index.md
- Environment Variables: user-guide/configuration/environment-variables.md
- Settings Classes: user-guide/configuration/settings-classes.md
- Docker Setup: user-guide/configuration/docker-setup.md
- Environment-Specific: user-guide/configuration/environment-specific.md
- Database:
- Overview: user-guide/database/index.md
- Models: user-guide/database/models.md
- Schemas: user-guide/database/schemas.md
- CRUD Operations: user-guide/database/crud.md
- Migrations: user-guide/database/migrations.md
- API:
- Overview: user-guide/api/index.md
- Endpoints: user-guide/api/endpoints.md
- Pagination: user-guide/api/pagination.md
- Exceptions: user-guide/api/exceptions.md
- Versioning: user-guide/api/versioning.md
- Authentication:
- Overview: user-guide/authentication/index.md
- JWT Tokens: user-guide/authentication/jwt-tokens.md
- User Management: user-guide/authentication/user-management.md
- Permissions: user-guide/authentication/permissions.md
- Admin Panel:
- user-guide/admin-panel/index.md
- Configuration: user-guide/admin-panel/configuration.md
- Adding Models: user-guide/admin-panel/adding-models.md
- User Management: user-guide/admin-panel/user-management.md
- Caching:
- Overview: user-guide/caching/index.md
- Redis Cache: user-guide/caching/redis-cache.md
- Client Cache: user-guide/caching/client-cache.md
- Cache Strategies: user-guide/caching/cache-strategies.md
- Background Tasks: user-guide/background-tasks/index.md
- Rate Limiting: user-guide/rate-limiting/index.md
- Development: user-guide/development.md
- Production: user-guide/production.md
- Testing: user-guide/testing.md
- Community: community.md
# - Examples:
# - Overview: examples/index.md
# - Basic CRUD: examples/basic-crud.md
# - Authentication Flow: examples/authentication-flow.md
# - Background Job Workflow: examples/background-job-workflow.md
# - Caching Patterns: examples/caching-patterns.md
# - Production Setup: examples/production-setup.md
# - Reference:
# - Overview: reference/index.md
# - API Reference: reference/api-reference.md
# - Configuration Reference: reference/configuration-reference.md
# - Database Schema: reference/database-schema.md
# - Middleware Reference: reference/middleware-reference.md
# - Dependencies Reference: reference/dependencies-reference.md
# - Contributing:
# - Overview: contributing/index.md
# - Development Setup: contributing/development-setup.md
# - Coding Standards: contributing/coding-standards.md
# - Pull Request Process: contributing/pull-request-process.md
# - Testing Guidelines: contributing/testing-guidelines.md
# - Migration Guides:
# - Overview: migration-guides/index.md
# - Version Migrations: migration-guides/from-v1-to-v2.md
# - From Other Frameworks: migration-guides/from-other-frameworks.md
# - FAQ: faq.md
markdown_extensions:
- admonition
- codehilite
- toc:
permalink: true
- pymdownx.details
- pymdownx.highlight:
anchor_linenums: true
line_spans: __span
pygments_lang_class: true
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.superfences
- pymdownx.tabbed:
alternate_style: true
- pymdownx.tasklist:
custom_checkbox: true
- attr_list
- md_in_html
extra:
social:
- icon: fontawesome/brands/github
link: https://github.com/benavlabs/fastapi-boilerplate
- icon: fontawesome/brands/python
link: https://pypi.org/project/fastapi/
version:
provider: mike
analytics:
provider: google
property: !ENV [GOOGLE_ANALYTICS_KEY, ""]
extra_css:
- stylesheets/extra.css
repo_name: benavlabs/fastapi-boilerplate
repo_url: https://github.com/benavlabs/fastapi-boilerplate
edit_uri: edit/main/docs/

pyproject.toml Normal file
[project]
name = "fastapi-boilerplate"
version = "0.1.0"
description = "A fully Async FastAPI boilerplate using SQLAlchemy and Pydantic 2"
authors = [{ name = "Igor Magalhaes", email = "igor.magalhaes.r@gmail.com" }]
license = { text = "MIT" }
readme = "README.md"
requires-python = "~=3.11"
dependencies = [
"python-dotenv>=1.0.0",
"pydantic[email]>=2.6.1",
"fastapi>=0.109.1",
"uvicorn>=0.27.0",
"uvloop>=0.19.0",
"httptools>=0.6.1",
"uuid>=1.30",
"uuid6>=2024.1.12",
"alembic>=1.13.1",
"asyncpg>=0.29.0",
"SQLAlchemy-Utils>=0.41.1",
"python-jose>=3.3.0",
"SQLAlchemy>=2.0.25",
"python-multipart>=0.0.9",
"greenlet>=2.0.2",
"httpx>=0.26.0",
"pydantic-settings>=2.0.3",
"redis>=5.0.1",
"arq>=0.25.0",
"bcrypt>=4.1.1",
"psycopg2-binary>=2.9.9",
"fastcrud>=0.15.5",
"crudadmin>=0.4.2",
"gunicorn>=23.0.0",
"ruff>=0.11.13",
"mypy>=1.16.0",
"niquests>=3.15.2",
]
[project.optional-dependencies]
dev = [
"pytest>=7.4.2",
"pytest-mock>=3.14.0",
"faker>=26.0.0",
"mypy>=1.8.0",
"types-redis>=4.6.0",
"ruff>=0.1.0",
]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.sdist]
include = ["src/"]
[tool.hatch.build.targets.wheel]
include = ["src/"]
packages = ["src"]
[tool.ruff]
target-version = "py311"
line-length = 120
fix = true
[tool.ruff.lint]
select = [
# https://docs.astral.sh/ruff/rules/#pyflakes-f
"F", # Pyflakes
# https://docs.astral.sh/ruff/rules/#pycodestyle-e-w
"E", # pycodestyle
"W", # Warning
# https://docs.astral.sh/ruff/rules/#flake8-comprehensions-c4
# https://docs.astral.sh/ruff/rules/#mccabe-c90
"C", # Complexity (mccabe+) & comprehensions
# https://docs.astral.sh/ruff/rules/#pyupgrade-up
"UP", # pyupgrade
# https://docs.astral.sh/ruff/rules/#isort-i
"I", # isort
]
ignore = [
# https://docs.astral.sh/ruff/rules/#pycodestyle-e-w
"E402", # module level import not at top of file
# https://docs.astral.sh/ruff/rules/#pyupgrade-up
"UP006", # use-pep585-annotation
"UP007", # use-pep604-annotation
"E741", # Ambiguous variable name
# "UP035", # deprecated-assertion
]
[tool.ruff.lint.per-file-ignores]
"__init__.py" = [
"F401", # unused import
"F403", # star imports
]
[tool.ruff.lint.mccabe]
max-complexity = 24
[tool.ruff.lint.pydocstyle]
convention = "numpy"
[tool.pytest.ini_options]
filterwarnings = [
"ignore::PendingDeprecationWarning:starlette.formparsers",
]
[dependency-groups]
dev = [
"openapi-generator-cli>=7.16.0",
"pytest-asyncio>=1.0.0",
]
[tool.mypy]
python_version = "3.11"
warn_return_any = true
warn_unused_configs = true
ignore_missing_imports = true
mypy_path = "src"
explicit_package_bases = true
[[tool.mypy.overrides]]
module = "src.app.*"
disallow_untyped_defs = true

src/__init__.py Normal file
src/alembic.ini Normal file
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = migrations
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:migrations/versions
# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

src/app/__init__.py Normal file
from typing import Optional
from crudadmin import CRUDAdmin
from ..core.config import EnvironmentOption, settings
from ..core.db.database import async_get_db
from .views import register_admin_views
def create_admin_interface() -> Optional[CRUDAdmin]:
"""Create and configure the admin interface."""
if not settings.CRUD_ADMIN_ENABLED:
return None
session_backend = "memory"
redis_config = None
if settings.CRUD_ADMIN_REDIS_ENABLED:
session_backend = "redis"
redis_config = {
"host": settings.CRUD_ADMIN_REDIS_HOST,
"port": settings.CRUD_ADMIN_REDIS_PORT,
"db": settings.CRUD_ADMIN_REDIS_DB,
"password": settings.CRUD_ADMIN_REDIS_PASSWORD if settings.CRUD_ADMIN_REDIS_PASSWORD != "None" else None,
}
admin = CRUDAdmin(
session=async_get_db,
SECRET_KEY=settings.SECRET_KEY.get_secret_value(),
mount_path=settings.CRUD_ADMIN_MOUNT_PATH,
session_backend=session_backend,
redis_config=redis_config,
allowed_ips=settings.CRUD_ADMIN_ALLOWED_IPS_LIST if settings.CRUD_ADMIN_ALLOWED_IPS_LIST else None,
allowed_networks=settings.CRUD_ADMIN_ALLOWED_NETWORKS_LIST
if settings.CRUD_ADMIN_ALLOWED_NETWORKS_LIST
else None,
max_sessions_per_user=settings.CRUD_ADMIN_MAX_SESSIONS,
session_timeout_minutes=settings.CRUD_ADMIN_SESSION_TIMEOUT,
secure_cookies=settings.SESSION_SECURE_COOKIES,
enforce_https=settings.ENVIRONMENT == EnvironmentOption.PRODUCTION,
track_events=settings.CRUD_ADMIN_TRACK_EVENTS,
track_sessions_in_db=settings.CRUD_ADMIN_TRACK_SESSIONS,
initial_admin={
"username": settings.ADMIN_USERNAME,
"password": settings.ADMIN_PASSWORD,
}
if settings.ADMIN_USERNAME and settings.ADMIN_PASSWORD
else None,
)
register_admin_views(admin)
return admin

src/app/admin/views.py Normal file
from typing import Annotated
from crudadmin import CRUDAdmin
from crudadmin.admin_interface.model_view import PasswordTransformer
from pydantic import BaseModel, Field
from ..core.security import get_password_hash
from ..models.user import User
from ..schemas.user import UserCreate, UserCreateInternal, UserUpdate
class PostCreateAdmin(BaseModel):
title: Annotated[str, Field(min_length=2, max_length=30, examples=["This is my post"])]
text: Annotated[str, Field(min_length=1, max_length=63206, examples=["This is the content of my post."])]
created_by_user_id: int
media_url: Annotated[
str | None,
Field(pattern=r"^(https?|ftp)://[^\s/$.?#].[^\s]*$", examples=["https://www.postimageurl.com"], default=None),
]
def register_admin_views(admin: CRUDAdmin) -> None:
"""Register all models and their schemas with the admin interface.
This function adds all available models to the admin interface with appropriate
schemas and permissions.
"""
password_transformer = PasswordTransformer(
password_field="password",
hashed_field="hashed_password",
hash_function=get_password_hash,
required_fields=["name", "username", "email"],
)
admin.add_view(
model=User,
create_schema=UserCreate,
update_schema=UserUpdate,
update_internal_schema=UserCreateInternal,
password_transformer=password_transformer,
allowed_actions={"view", "create", "update"},
)

src/app/api/__init__.py Normal file
from fastapi import APIRouter
from ..api.v1 import router as v1_router
router = APIRouter(prefix="/api")
router.include_router(v1_router)

@@ -0,0 +1,80 @@
from typing import Annotated, Any
from fastapi import Depends, HTTPException, Request
from sqlalchemy.ext.asyncio import AsyncSession
from ..core.config import settings
from ..core.db.database import async_get_db
from ..core.exceptions.http_exceptions import ForbiddenException, UnauthorizedException
from ..core.logger import logging
from ..core.security import TokenType, oauth2_scheme, verify_token
from ..crud.crud_users import crud_users
from ..lib.tbank_client.client import TBankClient
logger = logging.getLogger(__name__)
DEFAULT_LIMIT = settings.DEFAULT_RATE_LIMIT_LIMIT
DEFAULT_PERIOD = settings.DEFAULT_RATE_LIMIT_PERIOD
async def get_current_user(
token: Annotated[str, Depends(oauth2_scheme)], db: Annotated[AsyncSession, Depends(async_get_db)]
) -> dict[str, Any] | None:
token_data = await verify_token(token, TokenType.ACCESS, db)
if token_data is None:
raise UnauthorizedException("User not authenticated.")
if "@" in token_data.username_or_email:
user = await crud_users.get(db=db, email=token_data.username_or_email, is_deleted=False)
else:
user = await crud_users.get(db=db, username=token_data.username_or_email, is_deleted=False)
if user:
if hasattr(user, 'model_dump'):
return user.model_dump()
else:
return user
raise UnauthorizedException("User not authenticated.")
async def get_optional_user(request: Request, db: AsyncSession = Depends(async_get_db)) -> dict | None:
token = request.headers.get("Authorization")
if not token:
return None
try:
token_type, _, token_value = token.partition(" ")
if token_type.lower() != "bearer" or not token_value:
return None
token_data = await verify_token(token_value, TokenType.ACCESS, db)
if token_data is None:
return None
return await get_current_user(token_value, db=db)
except HTTPException as http_exc:
if http_exc.status_code != 401:
logger.error(f"Unexpected HTTPException in get_optional_user: {http_exc.detail}")
return None
except Exception as exc:
logger.error(f"Unexpected error in get_optional_user: {exc}")
return None
async def get_current_superuser(current_user: Annotated[dict, Depends(get_current_user)]) -> dict:
if not current_user["is_superuser"]:
raise ForbiddenException("You do not have enough privileges.")
return current_user
async def get_tbank_client():
api_key = settings.TBANK_API_KEY
if not api_key:
raise HTTPException(status_code=500, detail="TBank API key is not configured")
client = TBankClient(api_key=api_key)
return client

@@ -0,0 +1,13 @@
from fastapi import APIRouter
from .login import router as login_router
from .logout import router as logout_router
from .tasks import router as tasks_router
from .users import router as users_router
from .counterparty import router as counterparty_router
router = APIRouter(prefix="/v1")
router.include_router(login_router)
router.include_router(logout_router)
router.include_router(users_router)
router.include_router(tasks_router)
router.include_router(counterparty_router)

@@ -0,0 +1,20 @@
from typing import Annotated
from fastapi import APIRouter, Depends
from ..dependencies import get_tbank_client
from ...lib import TBankClient
router = APIRouter(tags=["counterparty"], prefix="/counterparty")
tbank_client_dependency = Annotated[TBankClient, Depends(get_tbank_client)]
@router.get(
"/excerpt/by-inn",
)
async def get_counterparty_excerpt_by_inn(
inn: str,
tbank_client: tbank_client_dependency,
):
return await tbank_client.get_counterparty_excerpt_by_inn(inn)

src/app/api/v1/login.py Normal file

@@ -0,0 +1,58 @@
from datetime import timedelta
from typing import Annotated
from fastapi import APIRouter, Depends, Request, Response
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.ext.asyncio import AsyncSession
from ...core.config import settings
from ...core.db.database import async_get_db
from ...core.exceptions.http_exceptions import UnauthorizedException
from ...core.schemas import Token
from ...core.security import (
ACCESS_TOKEN_EXPIRE_MINUTES,
TokenType,
authenticate_user,
create_access_token,
create_refresh_token,
verify_token,
)
router = APIRouter(tags=["login"])
@router.post("/login", response_model=Token)
async def login_for_access_token(
response: Response,
form_data: Annotated[OAuth2PasswordRequestForm, Depends()],
db: Annotated[AsyncSession, Depends(async_get_db)],
) -> dict[str, str]:
user = await authenticate_user(username_or_email=form_data.username, password=form_data.password, db=db)
if not user:
raise UnauthorizedException("Wrong username, email or password.")
access_token_expires = timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
access_token = await create_access_token(data={"sub": user["username"]}, expires_delta=access_token_expires)
refresh_token = await create_refresh_token(data={"sub": user["username"]})
max_age = settings.REFRESH_TOKEN_EXPIRE_DAYS * 24 * 60 * 60
response.set_cookie(
key="refresh_token", value=refresh_token, httponly=True, secure=True, samesite="lax", max_age=max_age
)
return {"access_token": access_token, "token_type": "bearer"}
@router.post("/refresh")
async def refresh_access_token(request: Request, db: AsyncSession = Depends(async_get_db)) -> dict[str, str]:
refresh_token = request.cookies.get("refresh_token")
if not refresh_token:
raise UnauthorizedException("Refresh token missing.")
user_data = await verify_token(refresh_token, TokenType.REFRESH, db)
if not user_data:
raise UnauthorizedException("Invalid refresh token.")
new_access_token = await create_access_token(data={"sub": user_data.username_or_email})
return {"access_token": new_access_token, "token_type": "bearer"}

src/app/api/v1/logout.py Normal file

@@ -0,0 +1,31 @@
from typing import Optional
from fastapi import APIRouter, Cookie, Depends, Response
from jose import JWTError
from sqlalchemy.ext.asyncio import AsyncSession
from ...core.db.database import async_get_db
from ...core.exceptions.http_exceptions import UnauthorizedException
from ...core.security import blacklist_tokens, oauth2_scheme
router = APIRouter(tags=["login"])
@router.post("/logout")
async def logout(
response: Response,
access_token: str = Depends(oauth2_scheme),
refresh_token: Optional[str] = Cookie(None, alias="refresh_token"),
db: AsyncSession = Depends(async_get_db),
) -> dict[str, str]:
try:
if not refresh_token:
raise UnauthorizedException("Refresh token not found")
await blacklist_tokens(access_token=access_token, refresh_token=refresh_token, db=db)
response.delete_cookie(key="refresh_token")
return {"message": "Logged out successfully"}
except JWTError:
raise UnauthorizedException("Invalid token.")

src/app/api/v1/tasks.py Normal file

@@ -0,0 +1,58 @@
from typing import Any
from arq.jobs import Job as ArqJob
from fastapi import APIRouter, HTTPException
from ...core.utils import queue
from ...schemas.job import Job
router = APIRouter(prefix="/tasks", tags=["tasks"])
@router.post("/task", response_model=Job, status_code=201)
async def create_task(message: str) -> dict[str, str]:
"""Create a new background task.
Parameters
----------
message: str
The message or data to be processed by the task.
Returns
-------
dict[str, str]
A dictionary containing the ID of the created task.
"""
if queue.pool is None:
raise HTTPException(status_code=503, detail="Queue is not available")
job = await queue.pool.enqueue_job("sample_background_task", message)
if job is None:
raise HTTPException(status_code=500, detail="Failed to create task")
return {"id": job.job_id}
@router.get("/task/{task_id}")
async def get_task(task_id: str) -> dict[str, Any] | None:
"""Get information about a specific background task.
Parameters
----------
task_id: str
The ID of the task.
Returns
-------
dict[str, Any] | None
A dictionary containing information about the task if found, or None otherwise.
"""
if queue.pool is None:
raise HTTPException(status_code=503, detail="Queue is not available")
job = ArqJob(task_id, queue.pool)
job_info = await job.info()
if job_info is None:
return None
return job_info.__dict__

src/app/api/v1/users.py Normal file

@@ -0,0 +1,145 @@
from typing import Annotated, Any, cast
from fastapi import APIRouter, Depends, Request
from fastcrud.paginated import PaginatedListResponse, compute_offset, paginated_response
from sqlalchemy.ext.asyncio import AsyncSession
from ...api.dependencies import get_current_superuser, get_current_user
from ...core.db.database import async_get_db
from ...core.exceptions.http_exceptions import DuplicateValueException, ForbiddenException, NotFoundException
from ...core.security import blacklist_token, get_password_hash, oauth2_scheme
from ...crud.crud_users import crud_users
from ...schemas.user import UserCreate, UserCreateInternal, UserRead, UserUpdate
router = APIRouter(tags=["users"])
@router.post("/user", response_model=UserRead, status_code=201)
async def write_user(
request: Request, user: UserCreate, db: Annotated[AsyncSession, Depends(async_get_db)]
) -> UserRead:
email_row = await crud_users.exists(db=db, email=user.email)
if email_row:
raise DuplicateValueException("Email is already registered")
username_row = await crud_users.exists(db=db, username=user.username)
if username_row:
raise DuplicateValueException("Username not available")
user_internal_dict = user.model_dump()
user_internal_dict["hashed_password"] = get_password_hash(password=user_internal_dict["password"])
del user_internal_dict["password"]
user_internal = UserCreateInternal(**user_internal_dict)
created_user = await crud_users.create(db=db, object=user_internal)
user_read = await crud_users.get(db=db, id=created_user.id, schema_to_select=UserRead)
if user_read is None:
raise NotFoundException("Created user not found")
return cast(UserRead, user_read)
@router.get("/users", response_model=PaginatedListResponse[UserRead])
async def read_users(
request: Request, db: Annotated[AsyncSession, Depends(async_get_db)], page: int = 1, items_per_page: int = 10
) -> dict:
users_data = await crud_users.get_multi(
db=db,
offset=compute_offset(page, items_per_page),
limit=items_per_page,
is_deleted=False,
)
response: dict[str, Any] = paginated_response(crud_data=users_data, page=page, items_per_page=items_per_page)
return response
@router.get("/user/me/", response_model=UserRead)
async def read_users_me(request: Request, current_user: Annotated[dict, Depends(get_current_user)]) -> dict:
return current_user
@router.get("/user/{username}", response_model=UserRead)
async def read_user(request: Request, username: str, db: Annotated[AsyncSession, Depends(async_get_db)]) -> UserRead:
db_user = await crud_users.get(db=db, username=username, is_deleted=False, schema_to_select=UserRead)
if db_user is None:
raise NotFoundException("User not found")
return cast(UserRead, db_user)
@router.patch("/user/{username}")
async def patch_user(
request: Request,
values: UserUpdate,
username: str,
current_user: Annotated[dict, Depends(get_current_user)],
db: Annotated[AsyncSession, Depends(async_get_db)],
) -> dict[str, str]:
db_user = await crud_users.get(db=db, username=username)
if db_user is None:
raise NotFoundException("User not found")
if isinstance(db_user, dict):
db_username = db_user["username"]
db_email = db_user["email"]
else:
db_username = db_user.username
db_email = db_user.email
if db_username != current_user["username"]:
raise ForbiddenException()
if values.email is not None and values.email != db_email:
if await crud_users.exists(db=db, email=values.email):
raise DuplicateValueException("Email is already registered")
if values.username is not None and values.username != db_username:
if await crud_users.exists(db=db, username=values.username):
raise DuplicateValueException("Username not available")
await crud_users.update(db=db, object=values, username=username)
return {"message": "User updated"}
@router.delete("/user/{username}")
async def erase_user(
request: Request,
username: str,
current_user: Annotated[dict, Depends(get_current_user)],
db: Annotated[AsyncSession, Depends(async_get_db)],
token: str = Depends(oauth2_scheme),
) -> dict[str, str]:
db_user = await crud_users.get(db=db, username=username, schema_to_select=UserRead)
if not db_user:
raise NotFoundException("User not found")
if username != current_user["username"]:
raise ForbiddenException()
await crud_users.delete(db=db, username=username)
await blacklist_token(token=token, db=db)
return {"message": "User deleted"}
@router.delete("/db_user/{username}", dependencies=[Depends(get_current_superuser)])
async def erase_db_user(
request: Request,
username: str,
db: Annotated[AsyncSession, Depends(async_get_db)],
token: str = Depends(oauth2_scheme),
) -> dict[str, str]:
db_user = await crud_users.exists(db=db, username=username)
if not db_user:
raise NotFoundException("User not found")
await crud_users.db_delete(db=db, username=username)
await blacklist_token(token=token, db=db)
return {"message": "User deleted from the database"}
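`read_users` combines `compute_offset` with a `limit` of `items_per_page`. Assuming fastcrud's `compute_offset` uses the standard 1-indexed formula, the pagination math can be sketched as:

```python
def compute_offset_sketch(page: int, items_per_page: int) -> int:
    """Row offset for 1-indexed pages: page 1 starts at row 0.

    Illustrative stand-in for fastcrud's compute_offset helper.
    """
    if page < 1 or items_per_page < 1:
        raise ValueError("page and items_per_page must be >= 1")
    return (page - 1) * items_per_page
```

So page 3 with 10 items per page skips the first 20 rows and the query returns rows 20 through 29.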

src/app/core/__init__.py Normal file

src/app/core/config.py Normal file

@@ -0,0 +1,154 @@
import os
from enum import Enum
from pydantic import SecretStr
from pydantic_settings import BaseSettings
from starlette.config import Config
current_file_dir = os.path.dirname(os.path.realpath(__file__))
env_path = os.path.join(current_file_dir, "..", "..", ".env")
config = Config(env_path)
class AppSettings(BaseSettings):
APP_NAME: str = config("APP_NAME", default="FastAPI app")
APP_DESCRIPTION: str | None = config("APP_DESCRIPTION", default=None)
APP_VERSION: str | None = config("APP_VERSION", default=None)
LICENSE_NAME: str | None = config("LICENSE", default=None)
CONTACT_NAME: str | None = config("CONTACT_NAME", default=None)
CONTACT_EMAIL: str | None = config("CONTACT_EMAIL", default=None)
class CryptSettings(BaseSettings):
SECRET_KEY: SecretStr = config("SECRET_KEY", cast=SecretStr)
ALGORITHM: str = config("ALGORITHM", default="HS256")
ACCESS_TOKEN_EXPIRE_MINUTES: int = config("ACCESS_TOKEN_EXPIRE_MINUTES", default=30)
REFRESH_TOKEN_EXPIRE_DAYS: int = config("REFRESH_TOKEN_EXPIRE_DAYS", default=7)
class DatabaseSettings(BaseSettings):
pass
class SQLiteSettings(DatabaseSettings):
SQLITE_URI: str = config("SQLITE_URI", default="./sql_app.db")
SQLITE_SYNC_PREFIX: str = config("SQLITE_SYNC_PREFIX", default="sqlite:///")
SQLITE_ASYNC_PREFIX: str = config("SQLITE_ASYNC_PREFIX", default="sqlite+aiosqlite:///")
class MySQLSettings(DatabaseSettings):
MYSQL_USER: str = config("MYSQL_USER", default="username")
MYSQL_PASSWORD: str = config("MYSQL_PASSWORD", default="password")
MYSQL_SERVER: str = config("MYSQL_SERVER", default="localhost")
MYSQL_PORT: int = config("MYSQL_PORT", default=3306)
MYSQL_DB: str = config("MYSQL_DB", default="dbname")
MYSQL_URI: str = f"{MYSQL_USER}:{MYSQL_PASSWORD}@{MYSQL_SERVER}:{MYSQL_PORT}/{MYSQL_DB}"
MYSQL_SYNC_PREFIX: str = config("MYSQL_SYNC_PREFIX", default="mysql://")
MYSQL_ASYNC_PREFIX: str = config("MYSQL_ASYNC_PREFIX", default="mysql+aiomysql://")
MYSQL_URL: str | None = config("MYSQL_URL", default=None)
class PostgresSettings(DatabaseSettings):
POSTGRES_USER: str = config("POSTGRES_USER", default="postgres")
POSTGRES_PASSWORD: str = config("POSTGRES_PASSWORD", default="postgres")
POSTGRES_SERVER: str = config("POSTGRES_SERVER", default="localhost")
POSTGRES_PORT: int = config("POSTGRES_PORT", default=5432)
POSTGRES_DB: str = config("POSTGRES_DB", default="postgres")
POSTGRES_SYNC_PREFIX: str = config("POSTGRES_SYNC_PREFIX", default="postgresql://")
POSTGRES_ASYNC_PREFIX: str = config("POSTGRES_ASYNC_PREFIX", default="postgresql+asyncpg://")
POSTGRES_URI: str = f"{POSTGRES_USER}:{POSTGRES_PASSWORD}@{POSTGRES_SERVER}:{POSTGRES_PORT}/{POSTGRES_DB}"
POSTGRES_URL: str | None = config("POSTGRES_URL", default=None)
class FirstUserSettings(BaseSettings):
ADMIN_NAME: str = config("ADMIN_NAME", default="admin")
ADMIN_EMAIL: str = config("ADMIN_EMAIL", default="admin@admin.com")
ADMIN_USERNAME: str = config("ADMIN_USERNAME", default="admin")
ADMIN_PASSWORD: str = config("ADMIN_PASSWORD", default="!Ch4ng3Th1sP4ssW0rd!")
class TestSettings(BaseSettings): ...
class RedisCacheSettings(BaseSettings):
REDIS_CACHE_HOST: str = config("REDIS_CACHE_HOST", default="localhost")
REDIS_CACHE_PORT: int = config("REDIS_CACHE_PORT", default=6379)
REDIS_CACHE_URL: str = f"redis://{REDIS_CACHE_HOST}:{REDIS_CACHE_PORT}"
class ClientSideCacheSettings(BaseSettings):
CLIENT_CACHE_MAX_AGE: int = config("CLIENT_CACHE_MAX_AGE", default=60)
class RedisQueueSettings(BaseSettings):
REDIS_QUEUE_HOST: str = config("REDIS_QUEUE_HOST", default="localhost")
REDIS_QUEUE_PORT: int = config("REDIS_QUEUE_PORT", default=6379)
class RedisRateLimiterSettings(BaseSettings):
REDIS_RATE_LIMIT_HOST: str = config("REDIS_RATE_LIMIT_HOST", default="localhost")
REDIS_RATE_LIMIT_PORT: int = config("REDIS_RATE_LIMIT_PORT", default=6379)
REDIS_RATE_LIMIT_URL: str = f"redis://{REDIS_RATE_LIMIT_HOST}:{REDIS_RATE_LIMIT_PORT}"
class DefaultRateLimitSettings(BaseSettings):
DEFAULT_RATE_LIMIT_LIMIT: int = config("DEFAULT_RATE_LIMIT_LIMIT", default=10)
DEFAULT_RATE_LIMIT_PERIOD: int = config("DEFAULT_RATE_LIMIT_PERIOD", default=3600)
class CRUDAdminSettings(BaseSettings):
CRUD_ADMIN_ENABLED: bool = config("CRUD_ADMIN_ENABLED", default=True)
CRUD_ADMIN_MOUNT_PATH: str = config("CRUD_ADMIN_MOUNT_PATH", default="/admin")
CRUD_ADMIN_ALLOWED_IPS_LIST: list[str] | None = None
CRUD_ADMIN_ALLOWED_NETWORKS_LIST: list[str] | None = None
CRUD_ADMIN_MAX_SESSIONS: int = config("CRUD_ADMIN_MAX_SESSIONS", default=10)
CRUD_ADMIN_SESSION_TIMEOUT: int = config("CRUD_ADMIN_SESSION_TIMEOUT", default=1440)
SESSION_SECURE_COOKIES: bool = config("SESSION_SECURE_COOKIES", default=True)
CRUD_ADMIN_TRACK_EVENTS: bool = config("CRUD_ADMIN_TRACK_EVENTS", default=True)
CRUD_ADMIN_TRACK_SESSIONS: bool = config("CRUD_ADMIN_TRACK_SESSIONS", default=True)
CRUD_ADMIN_REDIS_ENABLED: bool = config("CRUD_ADMIN_REDIS_ENABLED", default=False)
CRUD_ADMIN_REDIS_HOST: str = config("CRUD_ADMIN_REDIS_HOST", default="localhost")
CRUD_ADMIN_REDIS_PORT: int = config("CRUD_ADMIN_REDIS_PORT", default=6379)
CRUD_ADMIN_REDIS_DB: int = config("CRUD_ADMIN_REDIS_DB", default=0)
CRUD_ADMIN_REDIS_PASSWORD: str | None = config("CRUD_ADMIN_REDIS_PASSWORD", default=None)
CRUD_ADMIN_REDIS_SSL: bool = config("CRUD_ADMIN_REDIS_SSL", default=False)
class EnvironmentOption(Enum):
LOCAL = "local"
STAGING = "staging"
PRODUCTION = "production"
class EnvironmentSettings(BaseSettings):
ENVIRONMENT: EnvironmentOption = config("ENVIRONMENT", default=EnvironmentOption.LOCAL)
class TBankSettings(BaseSettings):
TBANK_API_KEY: str = config("TBANK_API_KEY", default="")
TBANK_API_URL: str = config("TBANK_API_URL", default="https://business.tbank.ru/openapi/api")
class Settings(
AppSettings,
SQLiteSettings,
PostgresSettings,
CryptSettings,
FirstUserSettings,
TestSettings,
RedisCacheSettings,
ClientSideCacheSettings,
RedisQueueSettings,
RedisRateLimiterSettings,
DefaultRateLimitSettings,
CRUDAdminSettings,
EnvironmentSettings,
TBankSettings,
):
pass
settings = Settings()
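`Settings` merges the concern-specific classes above through multiple inheritance, so each parent contributes its own fields to one flat settings object. A plain-Python sketch of the same composition pattern (pydantic-settings resolves fields across the MRO in essentially this way):

```python
class AppPart:
    APP_NAME = "FastAPI app"


class RedisPart:
    REDIS_CACHE_PORT = 6379


class ComposedSettings(AppPart, RedisPart):
    """Analogue of Settings: inherits every parent's attributes."""


composed = ComposedSettings()
```

Splitting settings this way keeps each concern (app metadata, crypt, each database, each Redis pool) independently testable while the application only ever imports the single `settings` instance.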

@@ -0,0 +1,14 @@
from fastcrud import FastCRUD
from ..db.token_blacklist import TokenBlacklist
from ..schemas import TokenBlacklistCreate, TokenBlacklistRead, TokenBlacklistUpdate
CRUDTokenBlacklist = FastCRUD[
TokenBlacklist,
TokenBlacklistCreate,
TokenBlacklistUpdate,
TokenBlacklistUpdate,
TokenBlacklistUpdate,
TokenBlacklistRead,
]
crud_token_blacklist = CRUDTokenBlacklist(TokenBlacklist)

@@ -0,0 +1,26 @@
from collections.abc import AsyncGenerator
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from sqlalchemy.ext.asyncio.session import AsyncSession
from sqlalchemy.orm import DeclarativeBase, MappedAsDataclass
from ..config import settings
class Base(DeclarativeBase, MappedAsDataclass):
pass
DATABASE_URI = settings.POSTGRES_URI
DATABASE_PREFIX = settings.POSTGRES_ASYNC_PREFIX
DATABASE_URL = f"{DATABASE_PREFIX}{DATABASE_URI}"
async_engine = create_async_engine(DATABASE_URL, echo=False, future=True)
local_session = async_sessionmaker(bind=async_engine, class_=AsyncSession, expire_on_commit=False)
async def async_get_db() -> AsyncGenerator[AsyncSession, None]:
async with local_session() as db:
yield db

src/app/core/db/models.py Normal file

@@ -0,0 +1,27 @@
import uuid as uuid_pkg
from uuid6 import uuid7
from datetime import UTC, datetime
from sqlalchemy import Boolean, DateTime, text
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.orm import Mapped, mapped_column
class UUIDMixin:
uuid: Mapped[uuid_pkg.UUID] = mapped_column(
UUID(as_uuid=True), primary_key=True, default=uuid7, server_default=text("gen_random_uuid()")
)
class TimestampMixin:
created_at: Mapped[datetime] = mapped_column(
DateTime, default=lambda: datetime.now(UTC).replace(tzinfo=None), server_default=text("current_timestamp(0)")
)
updated_at: Mapped[datetime | None] = mapped_column(
DateTime, nullable=True, onupdate=lambda: datetime.now(UTC).replace(tzinfo=None), server_default=text("current_timestamp(0)")
)
class SoftDeleteMixin:
deleted_at: Mapped[datetime | None] = mapped_column(DateTime, nullable=True)
is_deleted: Mapped[bool] = mapped_column(Boolean, default=False)

@@ -0,0 +1,14 @@
from datetime import datetime
from sqlalchemy import DateTime, String
from sqlalchemy.orm import Mapped, mapped_column
from .database import Base
class TokenBlacklist(Base):
__tablename__ = "token_blacklist"
id: Mapped[int] = mapped_column("id", autoincrement=True, nullable=False, unique=True, primary_key=True, init=False)
token: Mapped[str] = mapped_column(String, unique=True, index=True)
expires_at: Mapped[datetime] = mapped_column(DateTime)

@@ -0,0 +1,16 @@
class CacheIdentificationInferenceError(Exception):
def __init__(self, message: str = "Could not infer id for resource being cached.") -> None:
self.message = message
super().__init__(self.message)
class InvalidRequestError(Exception):
def __init__(self, message: str = "Type of request not supported.") -> None:
self.message = message
super().__init__(self.message)
class MissingClientError(Exception):
def __init__(self, message: str = "Client is None.") -> None:
self.message = message
super().__init__(self.message)

@@ -0,0 +1,11 @@
# ruff: noqa
from fastcrud.exceptions.http_exceptions import (
CustomException,
BadRequestException,
NotFoundException,
ForbiddenException,
UnauthorizedException,
UnprocessableEntityException,
DuplicateValueException,
RateLimitException,
)

src/app/core/logger.py Normal file

@@ -0,0 +1,20 @@
import logging
import os
from logging.handlers import RotatingFileHandler
LOG_DIR = os.path.join(os.path.dirname(os.path.dirname(__file__)), "logs")
if not os.path.exists(LOG_DIR):
os.makedirs(LOG_DIR)
LOG_FILE_PATH = os.path.join(LOG_DIR, "app.log")
LOGGING_LEVEL = logging.INFO
LOGGING_FORMAT = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
logging.basicConfig(level=LOGGING_LEVEL, format=LOGGING_FORMAT)
file_handler = RotatingFileHandler(LOG_FILE_PATH, maxBytes=10485760, backupCount=5)
file_handler.setLevel(LOGGING_LEVEL)
file_handler.setFormatter(logging.Formatter(LOGGING_FORMAT))
logging.getLogger("").addHandler(file_handler)
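The handler above rotates at `maxBytes=10485760`, i.e. 10 MiB, keeping five backups. A self-contained sketch of the same setup that writes to a temporary directory instead of `src/app/logs`:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Same configuration as logger.py, but against a throwaway directory;
# 10 * 1024 * 1024 bytes is the 10485760 used above.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "app.log")

handler = RotatingFileHandler(log_path, maxBytes=10 * 1024 * 1024, backupCount=5)
handler.setFormatter(logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s"))

sketch_logger = logging.getLogger("rotation_sketch")
sketch_logger.setLevel(logging.INFO)
sketch_logger.addHandler(handler)
sketch_logger.info("hello")
handler.flush()
```

When `app.log` reaches the size cap, the handler renames it to `app.log.1` (shifting older backups up to `.5`) and starts a fresh file, so disk usage stays bounded at roughly 60 MiB.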

src/app/core/schemas.py Normal file

@@ -0,0 +1,75 @@
import uuid as uuid_pkg
from uuid6 import uuid7
from datetime import UTC, datetime
from typing import Any
from pydantic import BaseModel, Field, field_serializer
class HealthCheck(BaseModel):
name: str
version: str
description: str
# -------------- mixins --------------
class UUIDSchema(BaseModel):
uuid: uuid_pkg.UUID = Field(default_factory=uuid7)
class TimestampSchema(BaseModel):
created_at: datetime = Field(default_factory=lambda: datetime.now(UTC).replace(tzinfo=None))
updated_at: datetime | None = Field(default=None)
@field_serializer("created_at")
def serialize_dt(self, created_at: datetime | None, _info: Any) -> str | None:
if created_at is not None:
return created_at.isoformat()
return None
@field_serializer("updated_at")
def serialize_updated_at(self, updated_at: datetime | None, _info: Any) -> str | None:
if updated_at is not None:
return updated_at.isoformat()
return None
class PersistentDeletion(BaseModel):
deleted_at: datetime | None = Field(default=None)
is_deleted: bool = False
@field_serializer("deleted_at")
def serialize_dates(self, deleted_at: datetime | None, _info: Any) -> str | None:
if deleted_at is not None:
return deleted_at.isoformat()
return None
# -------------- token --------------
class Token(BaseModel):
access_token: str
token_type: str
class TokenData(BaseModel):
username_or_email: str
class TokenBlacklistBase(BaseModel):
token: str
expires_at: datetime
class TokenBlacklistRead(TokenBlacklistBase):
id: int
class TokenBlacklistCreate(TokenBlacklistBase):
pass
class TokenBlacklistUpdate(TokenBlacklistBase):
pass

src/app/core/security.py Normal file

@@ -0,0 +1,137 @@
from datetime import UTC, datetime, timedelta
from enum import Enum
from typing import Any, Literal, cast
import bcrypt
from fastapi.security import OAuth2PasswordBearer
from jose import JWTError, jwt
from pydantic import SecretStr
from sqlalchemy.ext.asyncio import AsyncSession
from ..crud.crud_users import crud_users
from .config import settings
from .db.crud_token_blacklist import crud_token_blacklist
from .schemas import TokenBlacklistCreate, TokenData
SECRET_KEY: SecretStr = settings.SECRET_KEY
ALGORITHM = settings.ALGORITHM
ACCESS_TOKEN_EXPIRE_MINUTES = settings.ACCESS_TOKEN_EXPIRE_MINUTES
REFRESH_TOKEN_EXPIRE_DAYS = settings.REFRESH_TOKEN_EXPIRE_DAYS
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/api/v1/login")
class TokenType(str, Enum):
ACCESS = "access"
REFRESH = "refresh"
async def verify_password(plain_password: str, hashed_password: str) -> bool:
correct_password: bool = bcrypt.checkpw(plain_password.encode(), hashed_password.encode())
return correct_password
def get_password_hash(password: str) -> str:
hashed_password: str = bcrypt.hashpw(password.encode(), bcrypt.gensalt()).decode()
return hashed_password
async def authenticate_user(username_or_email: str, password: str, db: AsyncSession) -> dict[str, Any] | Literal[False]:
if "@" in username_or_email:
db_user = await crud_users.get(db=db, email=username_or_email, is_deleted=False)
else:
db_user = await crud_users.get(db=db, username=username_or_email, is_deleted=False)
if not db_user:
return False
db_user = cast(dict[str, Any], db_user)
if not await verify_password(password, db_user["hashed_password"]):
return False
return db_user
async def create_access_token(data: dict[str, Any], expires_delta: timedelta | None = None) -> str:
to_encode = data.copy()
if expires_delta:
expire = datetime.now(UTC).replace(tzinfo=None) + expires_delta
else:
expire = datetime.now(UTC).replace(tzinfo=None) + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
to_encode.update({"exp": expire, "token_type": TokenType.ACCESS})
encoded_jwt: str = jwt.encode(to_encode, SECRET_KEY.get_secret_value(), algorithm=ALGORITHM)
return encoded_jwt
async def create_refresh_token(data: dict[str, Any], expires_delta: timedelta | None = None) -> str:
to_encode = data.copy()
if expires_delta:
expire = datetime.now(UTC).replace(tzinfo=None) + expires_delta
else:
expire = datetime.now(UTC).replace(tzinfo=None) + timedelta(days=REFRESH_TOKEN_EXPIRE_DAYS)
to_encode.update({"exp": expire, "token_type": TokenType.REFRESH})
encoded_jwt: str = jwt.encode(to_encode, SECRET_KEY.get_secret_value(), algorithm=ALGORITHM)
return encoded_jwt
async def verify_token(token: str, expected_token_type: TokenType, db: AsyncSession) -> TokenData | None:
"""Verify a JWT token and return TokenData if valid.
Parameters
----------
token: str
The JWT token to be verified.
expected_token_type: TokenType
The expected type of token (access or refresh)
db: AsyncSession
Database session for performing database operations.
Returns
-------
TokenData | None
TokenData instance if the token is valid, None otherwise.
"""
is_blacklisted = await crud_token_blacklist.exists(db, token=token)
if is_blacklisted:
return None
try:
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
username_or_email: str | None = payload.get("sub")
token_type: str | None = payload.get("token_type")
if username_or_email is None or token_type != expected_token_type:
return None
return TokenData(username_or_email=username_or_email)
except JWTError:
return None
async def blacklist_tokens(access_token: str, refresh_token: str, db: AsyncSession) -> None:
"""Blacklist both access and refresh tokens.
Parameters
----------
access_token: str
The access token to blacklist
refresh_token: str
The refresh token to blacklist
db: AsyncSession
Database session for performing database operations.
"""
for token in [access_token, refresh_token]:
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
exp_timestamp = payload.get("exp")
if exp_timestamp is not None:
expires_at = datetime.fromtimestamp(exp_timestamp, UTC).replace(tzinfo=None)
await crud_token_blacklist.create(db, object=TokenBlacklistCreate(token=token, expires_at=expires_at))
async def blacklist_token(token: str, db: AsyncSession) -> None:
payload = jwt.decode(token, SECRET_KEY.get_secret_value(), algorithms=[ALGORITHM])
exp_timestamp = payload.get("exp")
if exp_timestamp is not None:
expires_at = datetime.fromtimestamp(exp_timestamp, UTC).replace(tzinfo=None)
await crud_token_blacklist.create(db, object=TokenBlacklistCreate(token=token, expires_at=expires_at))

src/app/core/setup.py Normal file

@@ -0,0 +1,229 @@
from collections.abc import AsyncGenerator, Callable
from contextlib import _AsyncGeneratorContextManager, asynccontextmanager
from typing import Any
import anyio
import anyio.to_thread
import fastapi
import redis.asyncio as redis
from arq import create_pool
from arq.connections import RedisSettings
from fastapi import APIRouter, Depends, FastAPI
from fastapi.openapi.docs import get_redoc_html, get_swagger_ui_html
from fastapi.openapi.utils import get_openapi
from ..api.dependencies import get_current_superuser
from ..core.utils.rate_limit import rate_limiter
from ..middleware.client_cache_middleware import ClientCacheMiddleware
from ..models import * # noqa: F403
from .config import (
AppSettings,
ClientSideCacheSettings,
DatabaseSettings,
EnvironmentOption,
EnvironmentSettings,
RedisCacheSettings,
RedisQueueSettings,
RedisRateLimiterSettings,
settings,
)
from .db.database import Base
from .db.database import async_engine as engine
from .utils import cache, queue
# -------------- database --------------
async def create_tables() -> None:
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)
# -------------- cache --------------
async def create_redis_cache_pool() -> None:
cache.pool = redis.ConnectionPool.from_url(settings.REDIS_CACHE_URL)
cache.client = redis.Redis.from_pool(cache.pool) # type: ignore
async def close_redis_cache_pool() -> None:
if cache.client is not None:
await cache.client.aclose() # type: ignore
# -------------- queue --------------
async def create_redis_queue_pool() -> None:
queue.pool = await create_pool(RedisSettings(host=settings.REDIS_QUEUE_HOST, port=settings.REDIS_QUEUE_PORT))
async def close_redis_queue_pool() -> None:
if queue.pool is not None:
await queue.pool.aclose() # type: ignore
# -------------- rate limit --------------
async def create_redis_rate_limit_pool() -> None:
rate_limiter.initialize(settings.REDIS_RATE_LIMIT_URL) # type: ignore
async def close_redis_rate_limit_pool() -> None:
if rate_limiter.client is not None:
await rate_limiter.client.aclose() # type: ignore
# -------------- application --------------
async def set_threadpool_tokens(number_of_tokens: int = 100) -> None:
limiter = anyio.to_thread.current_default_thread_limiter()
limiter.total_tokens = number_of_tokens
def lifespan_factory(
settings: (
DatabaseSettings
| RedisCacheSettings
| AppSettings
| ClientSideCacheSettings
| RedisQueueSettings
| RedisRateLimiterSettings
| EnvironmentSettings
),
create_tables_on_start: bool = True,
) -> Callable[[FastAPI], _AsyncGeneratorContextManager[Any]]:
"""Factory to create a lifespan async context manager for a FastAPI app."""
@asynccontextmanager
async def lifespan(app: FastAPI) -> AsyncGenerator:
from asyncio import Event
initialization_complete = Event()
app.state.initialization_complete = initialization_complete
await set_threadpool_tokens()
        try:
            if isinstance(settings, DatabaseSettings) and create_tables_on_start:
                async with engine.begin() as conn:
                    await conn.run_sync(Base.metadata.create_all)
            if isinstance(settings, RedisCacheSettings):
await create_redis_cache_pool()
if isinstance(settings, RedisQueueSettings):
await create_redis_queue_pool()
if isinstance(settings, RedisRateLimiterSettings):
await create_redis_rate_limit_pool()
initialization_complete.set()
yield
finally:
if isinstance(settings, RedisCacheSettings):
await close_redis_cache_pool()
if isinstance(settings, RedisQueueSettings):
await close_redis_queue_pool()
if isinstance(settings, RedisRateLimiterSettings):
await close_redis_rate_limit_pool()
return lifespan
# -------------- application --------------
def create_application(
router: APIRouter,
settings: (
DatabaseSettings
| RedisCacheSettings
| AppSettings
| ClientSideCacheSettings
| RedisQueueSettings
| RedisRateLimiterSettings
| EnvironmentSettings
),
create_tables_on_start: bool = True,
lifespan: Callable[[FastAPI], _AsyncGeneratorContextManager[Any]] | None = None,
**kwargs: Any,
) -> FastAPI:
"""Creates and configures a FastAPI application based on the provided settings.
This function initializes a FastAPI application and configures it with various settings
and handlers based on the type of the `settings` object provided.
Parameters
----------
router : APIRouter
The APIRouter object containing the routes to be included in the FastAPI application.
settings
An instance representing the settings for configuring the FastAPI application.
It determines the configuration applied:
- AppSettings: Configures basic app metadata like name, description, contact, and license info.
- DatabaseSettings: Adds event handlers for initializing database tables during startup.
- RedisCacheSettings: Sets up event handlers for creating and closing a Redis cache pool.
- ClientSideCacheSettings: Integrates middleware for client-side caching.
- RedisQueueSettings: Sets up event handlers for creating and closing a Redis queue pool.
- RedisRateLimiterSettings: Sets up event handlers for creating and closing a Redis rate limiter pool.
- EnvironmentSettings: Conditionally sets documentation URLs and integrates custom routes for API documentation
based on the environment type.
create_tables_on_start : bool
A flag to indicate whether to create database tables on application startup.
Defaults to True.
**kwargs
Additional keyword arguments passed directly to the FastAPI constructor.
Returns
-------
FastAPI
A fully configured FastAPI application instance.
The function configures the FastAPI application with different features and behaviors
based on the provided settings. It includes setting up database connections, Redis pools
for caching, queue, and rate limiting, client-side caching, and customizing the API documentation
based on the environment settings.
"""
# --- before creating application ---
if isinstance(settings, AppSettings):
to_update = {
"title": settings.APP_NAME,
"description": settings.APP_DESCRIPTION,
"contact": {"name": settings.CONTACT_NAME, "email": settings.CONTACT_EMAIL},
"license_info": {"name": settings.LICENSE_NAME},
}
kwargs.update(to_update)
if isinstance(settings, EnvironmentSettings):
kwargs.update({"docs_url": None, "redoc_url": None, "openapi_url": None})
# Use custom lifespan if provided, otherwise use default factory
if lifespan is None:
lifespan = lifespan_factory(settings, create_tables_on_start=create_tables_on_start)
application = FastAPI(lifespan=lifespan, **kwargs)
application.include_router(router)
if isinstance(settings, ClientSideCacheSettings):
application.add_middleware(ClientCacheMiddleware, max_age=settings.CLIENT_CACHE_MAX_AGE)
if isinstance(settings, EnvironmentSettings):
if settings.ENVIRONMENT != EnvironmentOption.PRODUCTION:
docs_router = APIRouter()
if settings.ENVIRONMENT != EnvironmentOption.LOCAL:
docs_router = APIRouter(dependencies=[Depends(get_current_superuser)])
@docs_router.get("/docs", include_in_schema=False)
async def get_swagger_documentation() -> fastapi.responses.HTMLResponse:
return get_swagger_ui_html(openapi_url="/openapi.json", title="docs")
@docs_router.get("/redoc", include_in_schema=False)
async def get_redoc_documentation() -> fastapi.responses.HTMLResponse:
return get_redoc_html(openapi_url="/openapi.json", title="docs")
@docs_router.get("/openapi.json", include_in_schema=False)
async def openapi() -> dict[str, Any]:
out: dict = get_openapi(title=application.title, version=application.version, routes=application.routes)
return out
application.include_router(docs_router)
return application
src/app/core/utils/cache.py Normal file
@@ -0,0 +1,337 @@
import functools
import json
import re
from collections.abc import Callable
from typing import Any
from fastapi import Request
from fastapi.encoders import jsonable_encoder
from redis.asyncio import ConnectionPool, Redis
from ..exceptions.cache_exceptions import CacheIdentificationInferenceError, InvalidRequestError, MissingClientError
pool: ConnectionPool | None = None
client: Redis | None = None
def _infer_resource_id(kwargs: dict[str, Any], resource_id_type: type | tuple[type, ...]) -> int | str:
"""Infer the resource ID from a dictionary of keyword arguments.
Parameters
----------
kwargs: Dict[str, Any]
A dictionary of keyword arguments.
resource_id_type: Union[type, Tuple[type, ...]]
The expected type of the resource ID, which can be integer (int) or a string (str).
    Returns
    -------
    int | str
        The inferred resource ID.
    Raises
    ------
    CacheIdentificationInferenceError
        If the resource ID cannot be inferred or does not match the expected type.
Note
----
- When `resource_id_type` is `int`, the function looks for an argument with the key 'id'.
- When `resource_id_type` is `str`, it attempts to infer the resource ID as a string.
"""
resource_id: int | str | None = None
    for arg_name, arg_value in kwargs.items():
        if isinstance(arg_value, resource_id_type):
            if resource_id_type is int and "id" in arg_name:
                resource_id = arg_value
            elif resource_id_type is str:
                resource_id = arg_value
if resource_id is None:
raise CacheIdentificationInferenceError
return resource_id
def _extract_data_inside_brackets(input_string: str) -> list[str]:
"""Extract data inside curly brackets from a given string using regular expressions.
Parameters
----------
input_string: str
The input string in which to find data enclosed within curly brackets.
Returns
-------
List[str]
A list of strings containing the data found inside the curly brackets within the input string.
Example
-------
>>> _extract_data_inside_brackets("The {quick} brown {fox} jumps over the {lazy} dog.")
['quick', 'fox', 'lazy']
"""
data_inside_brackets = re.findall(r"{(.*?)}", input_string)
return data_inside_brackets
def _construct_data_dict(data_inside_brackets: list[str], kwargs: dict[str, Any]) -> dict[str, Any]:
"""Construct a dictionary based on data inside brackets and keyword arguments.
Parameters
----------
data_inside_brackets: List[str]
A list of keys inside brackets.
kwargs: Dict[str, Any]
A dictionary of keyword arguments.
Returns
-------
Dict[str, Any]: A dictionary with keys from data_inside_brackets and corresponding values from kwargs.
"""
data_dict = {}
for key in data_inside_brackets:
data_dict[key] = kwargs[key]
return data_dict
def _format_prefix(prefix: str, kwargs: dict[str, Any]) -> str:
"""Format a prefix using keyword arguments.
Parameters
----------
prefix: str
The prefix template to be formatted.
kwargs: Dict[str, Any]
A dictionary of keyword arguments.
Returns
-------
str: The formatted prefix.
"""
data_inside_brackets = _extract_data_inside_brackets(prefix)
data_dict = _construct_data_dict(data_inside_brackets, kwargs)
formatted_prefix = prefix.format(**data_dict)
return formatted_prefix
def _format_extra_data(to_invalidate_extra: dict[str, str], kwargs: dict[str, Any]) -> dict[str, Any]:
"""Format extra data based on provided templates and keyword arguments.
This function takes a dictionary of templates and their associated values and a dictionary of keyword arguments.
It formats the templates with the corresponding values from the keyword arguments and returns a dictionary
where keys are the formatted templates and values are the associated keyword argument values.
Parameters
----------
to_invalidate_extra: Dict[str, str]
A dictionary where keys are templates and values are the associated values.
kwargs: Dict[str, Any]
A dictionary of keyword arguments.
Returns
-------
Dict[str, Any]: A dictionary where keys are formatted templates and values
are associated keyword argument values.
"""
formatted_extra = {}
for prefix, id_template in to_invalidate_extra.items():
formatted_prefix = _format_prefix(prefix, kwargs)
        id_key = _extract_data_inside_brackets(id_template)[0]
        formatted_extra[formatted_prefix] = kwargs[id_key]
return formatted_extra
async def _delete_keys_by_pattern(pattern: str) -> None:
"""Delete keys from Redis that match a given pattern using the SCAN command.
This function iteratively scans the Redis key space for keys that match a specific pattern
and deletes them. It uses the SCAN command to efficiently find keys, which is more
performance-friendly compared to the KEYS command, especially for large datasets.
The function scans the key space in an iterative manner using a cursor-based approach.
It retrieves a batch of keys matching the pattern on each iteration and deletes them
until no matching keys are left.
Parameters
----------
pattern: str
The pattern to match keys against. The pattern can include wildcards,
such as '*' for matching any character sequence. Example: 'user:*'
Notes
-----
- The SCAN command is used with a count of 100 to retrieve keys in batches.
This count can be adjusted based on the size of your dataset and Redis performance.
- The function uses the delete command to remove keys in bulk. If the dataset
is extremely large, consider implementing additional logic to handle bulk deletion
more efficiently.
- Be cautious with patterns that could match a large number of keys, as deleting
many keys simultaneously may impact the performance of the Redis server.
"""
if client is None:
return
cursor = 0
while True:
cursor, keys = await client.scan(cursor, match=pattern, count=100)
if keys:
await client.delete(*keys)
if cursor == 0:
break
def cache(
key_prefix: str,
resource_id_name: Any = None,
expiration: int = 3600,
resource_id_type: type | tuple[type, ...] = int,
to_invalidate_extra: dict[str, Any] | None = None,
pattern_to_invalidate_extra: list[str] | None = None,
) -> Callable:
"""Cache decorator for FastAPI endpoints.
This decorator enables caching the results of FastAPI endpoint functions to improve response times
and reduce the load on the application by storing and retrieving data in a cache.
Parameters
----------
key_prefix: str
A unique prefix to identify the cache key.
resource_id_name: Any, optional
The name of the resource ID argument in the decorated function. If provided, it is used directly;
otherwise, the resource ID is inferred from the function's arguments.
expiration: int, optional
The expiration time for the cached data in seconds. Defaults to 3600 seconds (1 hour).
resource_id_type: Union[type, Tuple[type, ...]], default int
The expected type of the resource ID.
This can be a single type (e.g., int) or a tuple of types (e.g., (int, str)).
Defaults to int. This is used only if resource_id_name is not provided.
to_invalidate_extra: Dict[str, Any] | None, optional
A dictionary where keys are cache key prefixes and values are templates for cache key suffixes.
These keys are invalidated when the decorated function is called with a method other than GET.
pattern_to_invalidate_extra: List[str] | None, optional
A list of string patterns for cache keys that should be invalidated when the decorated function is called.
This allows for bulk invalidation of cache keys based on a matching pattern.
Returns
-------
Callable
A decorator function that can be applied to FastAPI endpoint functions.
Example usage
-------------
```python
from fastapi import FastAPI, Request
from my_module import cache # Replace with your actual module and imports
app = FastAPI()
# Define a sample endpoint with caching
@app.get("/sample/{resource_id}")
@cache(key_prefix="sample_data", expiration=3600, resource_id_type=int)
async def sample_endpoint(request: Request, resource_id: int):
# Your endpoint logic here
return {"data": "your_data"}
```
This decorator caches the response data of the endpoint function using a unique cache key.
The cached data is retrieved for GET requests, and the cache is invalidated for other types of requests.
Advanced Example Usage
-------------
```python
from fastapi import FastAPI, Request
from my_module import cache
app = FastAPI()
@app.get("/users/{user_id}/items")
@cache(key_prefix="user_items", resource_id_name="user_id", expiration=1200)
async def read_user_items(request: Request, user_id: int):
# Endpoint logic to fetch user's items
return {"items": "user specific items"}
@app.put("/items/{item_id}")
@cache(
key_prefix="item_data",
resource_id_name="item_id",
to_invalidate_extra={"user_items": "{user_id}"},
pattern_to_invalidate_extra=["user_*_items:*"],
)
async def update_item(request: Request, item_id: int, data: dict, user_id: int):
# Update logic for an item
# Invalidate both the specific item cache and all user-specific item lists
return {"status": "updated"}
```
In this example:
- When reading user items, the response is cached under a key formed with 'user_items' prefix and 'user_id'.
- When updating an item, the cache for this specific item (under 'item_data:item_id') and all caches with keys
starting with 'user_{user_id}_items:' are invalidated. The `to_invalidate_extra` parameter specifically targets
the cache for user-specific item lists, while `pattern_to_invalidate_extra` allows bulk invalidation of all keys
matching the pattern 'user_*_items:*', covering all users.
Note
----
- resource_id_type is used only if resource_id is not passed.
- `to_invalidate_extra` and `pattern_to_invalidate_extra` are used for cache invalidation on methods other than GET.
- Using `pattern_to_invalidate_extra` can be resource-intensive on large datasets. Use it judiciously and
consider the potential impact on Redis performance.
"""
def wrapper(func: Callable) -> Callable:
@functools.wraps(func)
async def inner(request: Request, *args: Any, **kwargs: Any) -> Any:
if client is None:
raise MissingClientError
if resource_id_name:
resource_id = kwargs[resource_id_name]
else:
resource_id = _infer_resource_id(kwargs=kwargs, resource_id_type=resource_id_type)
formatted_key_prefix = _format_prefix(key_prefix, kwargs)
cache_key = f"{formatted_key_prefix}:{resource_id}"
if request.method == "GET":
if to_invalidate_extra is not None or pattern_to_invalidate_extra is not None:
raise InvalidRequestError
cached_data = await client.get(cache_key)
if cached_data:
return json.loads(cached_data.decode())
result = await func(request, *args, **kwargs)
if request.method == "GET":
serializable_data = jsonable_encoder(result)
serialized_data = json.dumps(serializable_data)
await client.set(cache_key, serialized_data)
await client.expire(cache_key, expiration)
return json.loads(serialized_data)
else:
await client.delete(cache_key)
if to_invalidate_extra is not None:
formatted_extra = _format_extra_data(to_invalidate_extra, kwargs)
                for prefix, extra_id in formatted_extra.items():
                    extra_cache_key = f"{prefix}:{extra_id}"
await client.delete(extra_cache_key)
if pattern_to_invalidate_extra is not None:
for pattern in pattern_to_invalidate_extra:
formatted_pattern = _format_prefix(pattern, kwargs)
await _delete_keys_by_pattern(formatted_pattern + "*")
return result
return inner
return wrapper
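As an aside on how the key templates above resolve: the decorator extracts placeholder names from `key_prefix`, substitutes them from the endpoint's kwargs, and appends the resource ID. A minimal stdlib sketch of that resolution (the helper names `extract_placeholders` and `resolve_template` are mine, not part of the project):

```python
import re


def extract_placeholders(template: str) -> list[str]:
    # Collect every placeholder name inside curly brackets,
    # e.g. "user_{user_id}_posts" -> ["user_id"]
    return re.findall(r"{(.*?)}", template)


def resolve_template(template: str, kwargs: dict) -> str:
    # Substitute each placeholder with the matching keyword argument
    data = {name: kwargs[name] for name in extract_placeholders(template)}
    return template.format(**data)


# The decorator builds the final key as "<formatted prefix>:<resource id>"
prefix = resolve_template("user_{user_id}_posts", {"user_id": 7, "page": 2})
cache_key = f"{prefix}:{3}"
print(cache_key)  # user_7_posts:3
```

Extra kwargs that are not referenced by the template (like `page` above) are simply ignored, which is why the decorator can be applied to endpoints with arbitrary signatures.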
@@ -0,0 +1,3 @@
from arq.connections import ArqRedis
pool: ArqRedis | None = None
@@ -0,0 +1,61 @@
from datetime import UTC, datetime
from typing import Optional
from redis.asyncio import ConnectionPool, Redis
from sqlalchemy.ext.asyncio import AsyncSession
from ...core.logger import logging
logger = logging.getLogger(__name__)
class RateLimiter:
_instance: Optional["RateLimiter"] = None
pool: Optional[ConnectionPool] = None
client: Optional[Redis] = None
def __new__(cls) -> "RateLimiter":
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
@classmethod
def initialize(cls, redis_url: str) -> None:
instance = cls()
if instance.pool is None:
instance.pool = ConnectionPool.from_url(redis_url)
instance.client = Redis(connection_pool=instance.pool)
@classmethod
def get_client(cls) -> Redis:
instance = cls()
if instance.client is None:
logger.error("Redis client is not initialized.")
raise Exception("Redis client is not initialized.")
return instance.client
    async def is_rate_limited(self, db: AsyncSession, user_id: int, path: str, limit: int, period: int) -> bool:
        # Rate limiting is currently disabled; the Redis-backed fixed-window
        # implementation below is kept commented out for reference.
        return False
# client = self.get_client()
# current_timestamp = int(datetime.now(UTC).timestamp())
# window_start = current_timestamp - (current_timestamp % period)
#
# sanitized_path = sanitize_path(path)
# key = f"ratelimit:{user_id}:{sanitized_path}:{window_start}"
#
# try:
# current_count = await client.incr(key)
# if current_count == 1:
# await client.expire(key, period)
#
# if current_count > limit:
# return True
#
# except Exception as e:
# logger.exception(f"Error checking rate limit for user {user_id} on path {path}: {e}")
# raise e
rate_limiter = RateLimiter()
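The commented-out logic above is fixed-window counting: each request lands in the bucket starting at `timestamp - (timestamp % period)`, and the per-bucket counter is compared against the limit. A minimal in-memory sketch of that bookkeeping (a plain dict stands in for Redis `INCR`/`EXPIRE`; all names here are illustrative):

```python
from collections import defaultdict


def window_start(timestamp: int, period: int) -> int:
    # Floor the timestamp to the start of its fixed window
    return timestamp - (timestamp % period)


counters: dict[str, int] = defaultdict(int)


def is_rate_limited(user_id: int, path: str, limit: int, period: int, now: int) -> bool:
    # One counter per (user, path, window); in Redis this key would also
    # carry an expiry equal to the period so stale windows clean themselves up
    key = f"ratelimit:{user_id}:{path}:{window_start(now, period)}"
    counters[key] += 1
    return counters[key] > limit


# Three calls with limit=2 inside one 60-second window: the third is limited
print([is_rate_limited(1, "/items", 2, 60, 1000 + i) for i in range(3)])  # [False, False, True]
```

Note the classic fixed-window caveat: a burst straddling a window boundary can see up to twice the limit, which is the trade-off for the very cheap `INCR`-based accounting.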
@@ -0,0 +1,24 @@
import asyncio
import logging
import uvloop
from arq.worker import Worker
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
logging.basicConfig(level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
# -------- background tasks --------
async def sample_background_task(ctx: Worker, name: str) -> str:
await asyncio.sleep(5)
return f"Task {name} is complete!"
# -------- base functions --------
async def startup(ctx: Worker) -> None:
logging.info("Worker Started")
async def shutdown(ctx: Worker) -> None:
    logging.info("Worker Stopped")
@@ -0,0 +1,15 @@
from arq.connections import RedisSettings
from ...core.config import settings
from .functions import sample_background_task, shutdown, startup
REDIS_QUEUE_HOST = settings.REDIS_QUEUE_HOST
REDIS_QUEUE_PORT = settings.REDIS_QUEUE_PORT
class WorkerSettings:
functions = [sample_background_task]
redis_settings = RedisSettings(host=REDIS_QUEUE_HOST, port=REDIS_QUEUE_PORT)
on_startup = startup
on_shutdown = shutdown
handle_signals = False
src/app/crud/__init__.py Normal file (0 lines)
@@ -0,0 +1,7 @@
from fastcrud import FastCRUD
from ..models.user import User
from ..schemas.user import UserCreateInternal, UserDelete, UserRead, UserUpdate, UserUpdateInternal
CRUDUser = FastCRUD[User, UserCreateInternal, UserUpdate, UserUpdateInternal, UserDelete, UserRead]
crud_users = CRUDUser(User)
src/app/main.py Normal file
@@ -0,0 +1,39 @@
from collections.abc import AsyncGenerator
from contextlib import asynccontextmanager
from fastapi import FastAPI
from .admin.initialize import create_admin_interface
from .api import router
from .core.config import settings
from .core.setup import create_application, lifespan_factory
admin = create_admin_interface()
@asynccontextmanager
async def lifespan_with_admin(app: FastAPI) -> AsyncGenerator[None, None]:
"""Custom lifespan that includes admin initialization."""
# Get the default lifespan
default_lifespan = lifespan_factory(settings)
# Run the default lifespan initialization and our admin initialization
async with default_lifespan(app):
# Initialize admin interface if it exists
if admin:
# Initialize admin database and setup
await admin.initialize()
yield
app = create_application(
router=router,
settings=settings,
lifespan=lifespan_with_admin,
create_tables_on_start=False
)
# Mount admin interface if enabled
if admin:
app.mount(settings.CRUD_ADMIN_MOUNT_PATH, admin.app)
@@ -0,0 +1,56 @@
from fastapi import FastAPI, Request, Response
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
class ClientCacheMiddleware(BaseHTTPMiddleware):
"""Middleware to set the `Cache-Control` header for client-side caching on all responses.
Parameters
----------
app: FastAPI
The FastAPI application instance.
max_age: int, optional
Duration (in seconds) for which the response should be cached. Defaults to 60 seconds.
Attributes
----------
max_age: int
Duration (in seconds) for which the response should be cached.
Methods
-------
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
Process the request and set the `Cache-Control` header in the response.
Note
----
- The `Cache-Control` header instructs clients (e.g., browsers)
to cache the response for the specified duration.
"""
def __init__(self, app: FastAPI, max_age: int = 60) -> None:
super().__init__(app)
self.max_age = max_age
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
"""Process the request and set the `Cache-Control` header in the response.
Parameters
----------
request: Request
The incoming request.
call_next: RequestResponseEndpoint
The next middleware or route handler in the processing chain.
Returns
-------
Response
The response object with the `Cache-Control` header set.
Note
----
- This method is automatically called by Starlette for processing the request-response cycle.
"""
response: Response = await call_next(request)
response.headers["Cache-Control"] = f"public, max-age={self.max_age}"
return response
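The middleware's whole effect is a single response header. A tiny sketch of the value it emits (`cache_control_header` is an illustrative helper, not part of the codebase):

```python
def cache_control_header(max_age: int = 60) -> tuple[str, str]:
    # The (header name, value) pair the middleware attaches to every response
    return ("Cache-Control", f"public, max-age={max_age}")


print(cache_control_header(3600))  # ('Cache-Control', 'public, max-age=3600')
```

`public` permits any cache (browser or intermediary) to store the response, so this middleware is only appropriate for responses that carry no user-specific data, or should be paired with route-level overrides.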
@@ -0,0 +1 @@
from .user import User
src/app/models/user.py Normal file
@@ -0,0 +1,28 @@
import uuid as uuid_pkg
from datetime import UTC, datetime
from uuid6 import uuid7
from sqlalchemy import DateTime, ForeignKey, String
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.orm import Mapped, mapped_column
from ..core.db.database import Base
class User(Base):
__tablename__ = "user"
id: Mapped[int] = mapped_column(autoincrement=True, primary_key=True, init=False)
name: Mapped[str] = mapped_column(String(30))
username: Mapped[str] = mapped_column(String(20), unique=True, index=True)
email: Mapped[str] = mapped_column(String(50), unique=True, index=True)
hashed_password: Mapped[str] = mapped_column(String)
profile_image_url: Mapped[str] = mapped_column(String, default="https://profileimageurl.com")
uuid: Mapped[uuid_pkg.UUID] = mapped_column(UUID(as_uuid=True), default_factory=uuid7, unique=True)
created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default_factory=lambda: datetime.now(UTC))
updated_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
deleted_at: Mapped[datetime | None] = mapped_column(DateTime(timezone=True), default=None)
is_deleted: Mapped[bool] = mapped_column(default=False, index=True)
is_superuser: Mapped[bool] = mapped_column(default=False)
src/app/schemas/job.py Normal file
@@ -0,0 +1,5 @@
from pydantic import BaseModel
class Job(BaseModel):
id: str
src/app/schemas/user.py Normal file
@@ -0,0 +1,70 @@
from datetime import datetime
from typing import Annotated
from pydantic import BaseModel, ConfigDict, EmailStr, Field
from ..core.schemas import PersistentDeletion, TimestampSchema, UUIDSchema
class UserBase(BaseModel):
name: Annotated[str, Field(min_length=2, max_length=30, examples=["User Userson"])]
username: Annotated[str, Field(min_length=2, max_length=20, pattern=r"^[a-z0-9]+$", examples=["userson"])]
email: Annotated[EmailStr, Field(examples=["user.userson@example.com"])]
class User(TimestampSchema, UserBase, UUIDSchema, PersistentDeletion):
profile_image_url: Annotated[str, Field(default="https://www.profileimageurl.com")]
hashed_password: str
is_superuser: bool = False
tier_id: int | None = None
class UserRead(BaseModel):
id: int
name: Annotated[str, Field(min_length=2, max_length=30, examples=["User Userson"])]
username: Annotated[str, Field(min_length=2, max_length=20, pattern=r"^[a-z0-9]+$", examples=["userson"])]
email: Annotated[EmailStr, Field(examples=["user.userson@example.com"])]
profile_image_url: str
tier_id: int | None
class UserCreate(UserBase):
model_config = ConfigDict(extra="forbid")
password: Annotated[str, Field(pattern=r"^.{8,}|[0-9]+|[A-Z]+|[a-z]+|[^a-zA-Z0-9]+$", examples=["Str1ngst!"])]
class UserCreateInternal(UserBase):
hashed_password: str
class UserUpdate(BaseModel):
model_config = ConfigDict(extra="forbid")
name: Annotated[str | None, Field(min_length=2, max_length=30, examples=["User Userberg"], default=None)]
username: Annotated[
str | None, Field(min_length=2, max_length=20, pattern=r"^[a-z0-9]+$", examples=["userberg"], default=None)
]
email: Annotated[EmailStr | None, Field(examples=["user.userberg@example.com"], default=None)]
profile_image_url: Annotated[
str | None,
Field(
pattern=r"^(https?|ftp)://[^\s/$.?#].[^\s]*$", examples=["https://www.profileimageurl.com"], default=None
),
]
class UserUpdateInternal(UserUpdate):
updated_at: datetime
class UserDelete(BaseModel):
model_config = ConfigDict(extra="forbid")
is_deleted: bool
deleted_at: datetime
class UserRestoreDeleted(BaseModel):
is_deleted: bool
src/app/test.py Normal file
@@ -0,0 +1,16 @@
from src.app.core.config import settings
from src.app.lib import TBankClient
async def main() -> None:
    api_key = settings.TBANK_API_KEY
    client = TBankClient(api_key=api_key)
    data = await client.get_counterparty_excerpt_by_inn("9726096281")
    print(data)
if __name__ == "__main__":
import asyncio
asyncio.run(main())
src/migrations/README Normal file
@@ -0,0 +1 @@
Generic single-database configuration.
Some files were not shown because too many files have changed in this diff.