feat: Tubearr — full project state through M006/S01
Migrated git root from W:/programming/Projects/ to W:/programming/Projects/Tubearr/. Previous history preserved in Tubearr-full-backup.bundle at parent directory. Completed milestones: M001 through M005 Active: M006/S02 (Add Channel UX)
Commit: 4606dce553
218 changed files with 64040 additions and 0 deletions

File: .agents/skills/drizzle-migrations/SKILL.md (new file, +518 lines)

---
name: drizzle-migrations
description: "Migration-first database development workflow using Drizzle ORM for TypeScript/J..."
version: 1.0.0
tags: []
progressive_disclosure:
  entry_point:
    summary: "Migration-first database development workflow using Drizzle ORM for TypeScript/J..."
    when_to_use: "When working with drizzle-migrations or related functionality."
    quick_start: "1. Review the core concepts below. 2. Apply patterns to your use case. 3. Follow best practices for implementation."
---

# Drizzle ORM Database Migrations (TypeScript)

Migration-first database development workflow using Drizzle ORM for TypeScript/JavaScript projects.

## When to Use This Skill

Use this skill when:
- Working with Drizzle ORM in TypeScript/JavaScript projects
- Creating or modifying database schema
- Adopting a migration-first development workflow
- Setting up new database tables or columns
- Ensuring schema consistency across environments

## Core Principle: Migration-First Development

**Critical Rule**: Schema changes ALWAYS start with migrations, never code-first.

### Why Migration-First?
- ✅ SQL migrations are the single source of truth
- ✅ Prevents schema drift between environments
- ✅ Enables rollback and versioning
- ✅ Forces explicit schema design decisions
- ✅ TypeScript types generated from migrations
- ✅ CI/CD can validate schema changes

### Anti-Pattern (Code-First)
❌ **WRONG**: Writing the TypeScript schema first
```typescript
// DON'T DO THIS FIRST
export const users = pgTable('users', {
  id: uuid('id').primaryKey(),
  email: text('email').notNull(),
});
```

### Correct Pattern (Migration-First)
✅ **CORRECT**: Write the SQL migration first
```sql
-- drizzle/0001_add_users_table.sql
CREATE TABLE users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email TEXT NOT NULL UNIQUE,
  created_at TIMESTAMP DEFAULT NOW()
);
```

## Complete Migration Workflow

### Step 1: Design Schema in SQL Migration

Create a descriptively named SQL migration file:

```sql
-- drizzle/0001_create_school_calendars.sql
CREATE TABLE school_calendars (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  school_id UUID NOT NULL REFERENCES schools(id) ON DELETE CASCADE,
  start_date DATE NOT NULL,
  end_date DATE NOT NULL,
  academic_year TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT NOW(),
  updated_at TIMESTAMP DEFAULT NOW()
);

-- Add indexes for query performance
CREATE INDEX idx_school_calendars_school_id ON school_calendars(school_id);
CREATE INDEX idx_school_calendars_academic_year ON school_calendars(academic_year);

-- Add constraints
ALTER TABLE school_calendars
  ADD CONSTRAINT check_date_range
  CHECK (end_date > start_date);
```

**Naming Convention**:
- Use sequential numbers: `0001_`, `0002_`, etc.
- Use descriptive names: `create_school_calendars`, `add_user_roles`
- Format: `XXXX_descriptive_name.sql`

### Step 2: Generate TypeScript Definitions

Drizzle Kit generates TypeScript types from SQL:

```bash
# Generate TypeScript schema and snapshots
pnpm drizzle-kit generate

# Or using npm
npm run db:generate
```

**What This Creates**:
1. TypeScript schema files (if using `drizzle-kit push`)
2. Snapshot files in `drizzle/meta/XXXX_snapshot.json`
3. Migration metadata

### Step 3: Create Schema Snapshot

Snapshots enable schema drift detection:

```json
// drizzle/meta/0001_snapshot.json (auto-generated)
{
  "version": "5",
  "dialect": "postgresql",
  "tables": {
    "school_calendars": {
      "name": "school_calendars",
      "columns": {
        "id": {
          "name": "id",
          "type": "uuid",
          "primaryKey": true,
          "notNull": true,
          "default": "gen_random_uuid()"
        },
        "school_id": {
          "name": "school_id",
          "type": "uuid",
          "notNull": true
        }
      }
    }
  }
}
```

**Snapshots in Version Control**:
- ✅ Commit snapshots to git
- ✅ Enables drift detection in CI
- ✅ Documents schema history

### Step 4: Implement TypeScript Schema

Now write the TypeScript schema that mirrors the SQL migration:

```typescript
// src/lib/db/schema/school/calendar.ts
import { pgTable, uuid, date, text, timestamp } from 'drizzle-orm/pg-core';
import { schools } from './school';

export const schoolCalendars = pgTable('school_calendars', {
  id: uuid('id').primaryKey().defaultRandom(),
  schoolId: uuid('school_id')
    .notNull()
    .references(() => schools.id, { onDelete: 'cascade' }),
  startDate: date('start_date').notNull(),
  endDate: date('end_date').notNull(),
  academicYear: text('academic_year').notNull(),
  createdAt: timestamp('created_at').defaultNow(),
  updatedAt: timestamp('updated_at').defaultNow(),
});

// Type inference
export type SchoolCalendar = typeof schoolCalendars.$inferSelect;
export type NewSchoolCalendar = typeof schoolCalendars.$inferInsert;
```

**Key Points**:
- Column names match SQL exactly: `school_id` → `'school_id'`
- TypeScript property names use camelCase: `schoolId`
- Constraints and indexes are defined in SQL, not TypeScript
- Foreign keys reference other tables
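
The inferred types then flow through to call sites. A minimal sketch, assuming a `db` client module like the one configured later in this skill (the import paths and UUID value are illustrative, not from the project):

```typescript
import { db } from '../../client'; // hypothetical client module
import { schoolCalendars, type NewSchoolCalendar, type SchoolCalendar } from './calendar';

// The payload is checked against NewSchoolCalendar at compile time,
// so a missing academicYear or a misspelled key fails the build.
const payload: NewSchoolCalendar = {
  schoolId: '00000000-0000-0000-0000-000000000000', // placeholder UUID
  startDate: '2025-08-15',
  endDate: '2026-06-05',
  academicYear: '2025-2026',
};

// .returning() yields an array of inserted rows, typed as SchoolCalendar
const [calendar]: SchoolCalendar[] = await db
  .insert(schoolCalendars)
  .values(payload)
  .returning();
```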

### Step 5: Organize Schemas by Domain

Structure schemas for maintainability:

```
src/lib/db/schema/
├── index.ts              # Export all schemas
├── school/
│   ├── index.ts
│   ├── district.ts
│   ├── holiday.ts
│   ├── school.ts
│   └── calendar.ts
├── providers.ts
├── cart.ts
└── users.ts
```

**index.ts** (export all):
```typescript
// src/lib/db/schema/index.ts
export * from './school';
export * from './providers';
export * from './cart';
export * from './users';
```

**school/index.ts**:
```typescript
// src/lib/db/schema/school/index.ts
export * from './district';
export * from './holiday';
export * from './school';
export * from './calendar';
```

### Step 6: Add Quality Checks to CI

Validate schema consistency in CI/CD:

```yaml
# .github/workflows/quality.yml
name: Quality Checks

on:
  pull_request:
    branches: [main, develop]
  push:
    branches: [main]

jobs:
  quality:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'pnpm'

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Check database schema drift
        run: pnpm drizzle-kit check

      - name: Verify migrations (dry-run)
        run: pnpm drizzle-kit push --dry-run
        env:
          DATABASE_URL: ${{ secrets.STAGING_DATABASE_URL }}

      - name: Run type checking
        run: pnpm tsc --noEmit

      - name: Lint code
        run: pnpm lint
```

**CI Checks Explained**:
- `drizzle-kit check`: Validates that snapshots match the schema
- `drizzle-kit push --dry-run`: Tests the migration without applying it
- Type checking: Ensures TypeScript compiles
- Linting: Enforces code style

### Step 7: Test on Staging

Before production, test the migration on staging:

```bash
# 1. Run migration on staging
STAGING_DATABASE_URL="..." pnpm drizzle-kit push

# 2. Verify schema
pnpm drizzle-kit check

# 3. Test affected API routes
curl https://staging.example.com/api/schools/calendars

# 4. Check for data integrity issues
# Run queries to verify data looks correct

# 5. Monitor logs for errors
# Check application logs for migration-related errors
```

**Staging Checklist**:
- [ ] Migration runs without errors
- [ ] Schema drift check passes
- [ ] API routes using the new schema work correctly
- [ ] No data integrity issues
- [ ] Application logs show no errors
- [ ] Query performance is acceptable

## Common Migration Patterns

### Adding a Column

```sql
-- drizzle/0005_add_user_phone.sql
ALTER TABLE users
  ADD COLUMN phone TEXT;

-- Add index if querying by phone
CREATE INDEX idx_users_phone ON users(phone);
```

TypeScript:
```typescript
export const users = pgTable('users', {
  id: uuid('id').primaryKey(),
  email: text('email').notNull(),
  phone: text('phone'), // New column
});
```

### Creating a Junction Table

```sql
-- drizzle/0006_create_provider_specialties.sql
CREATE TABLE provider_specialties (
  provider_id UUID NOT NULL REFERENCES providers(id) ON DELETE CASCADE,
  specialty_id UUID NOT NULL REFERENCES specialties(id) ON DELETE CASCADE,
  PRIMARY KEY (provider_id, specialty_id)
);

CREATE INDEX idx_provider_specialties_provider ON provider_specialties(provider_id);
CREATE INDEX idx_provider_specialties_specialty ON provider_specialties(specialty_id);
```

TypeScript:
```typescript
export const providerSpecialties = pgTable('provider_specialties', {
  providerId: uuid('provider_id')
    .notNull()
    .references(() => providers.id, { onDelete: 'cascade' }),
  specialtyId: uuid('specialty_id')
    .notNull()
    .references(() => specialties.id, { onDelete: 'cascade' }),
}, (table) => ({
  pk: primaryKey({ columns: [table.providerId, table.specialtyId] }),
}));
```

### Modifying a Column Type

```sql
-- drizzle/0007_change_price_to_decimal.sql
ALTER TABLE services
  ALTER COLUMN price TYPE DECIMAL(10, 2);
```

TypeScript:
```typescript
import { decimal } from 'drizzle-orm/pg-core';

export const services = pgTable('services', {
  id: uuid('id').primaryKey(),
  name: text('name').notNull(),
  price: decimal('price', { precision: 10, scale: 2 }).notNull(),
});
```

### Adding Constraints

```sql
-- drizzle/0008_add_email_constraint.sql
ALTER TABLE users
  ADD CONSTRAINT users_email_unique UNIQUE (email);

ALTER TABLE users
  ADD CONSTRAINT users_email_format CHECK (email ~* '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}$');
```

## Configuration

### drizzle.config.ts

```typescript
import type { Config } from 'drizzle-kit';

export default {
  schema: './src/lib/db/schema/index.ts',
  out: './drizzle',
  driver: 'pg',
  dbCredentials: {
    connectionString: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

### package.json Scripts

```json
{
  "scripts": {
    "db:generate": "drizzle-kit generate:pg",
    "db:push": "drizzle-kit push:pg",
    "db:studio": "drizzle-kit studio",
    "db:check": "drizzle-kit check:pg",
    "db:up": "drizzle-kit up:pg"
  }
}
```

## Migration Testing Workflow

### Local Testing

```bash
# 1. Create migration
echo "CREATE TABLE test (...)" > drizzle/0009_test.sql

# 2. Generate TypeScript
pnpm db:generate

# 3. Push to local database
pnpm db:push

# 4. Verify schema
pnpm db:check

# 5. Test in application
pnpm dev
# Manually test affected features

# 6. Run tests
pnpm test
```

### Rollback Strategy

```sql
-- drizzle/0010_add_feature.sql (up migration)
CREATE TABLE new_feature (...);

-- drizzle/0010_add_feature_down.sql (down migration)
DROP TABLE new_feature;
```

Apply the rollback:
```bash
# Manually run the down migration
psql $DATABASE_URL -f drizzle/0010_add_feature_down.sql
```

## Best Practices

### Do's
- ✅ Write SQL migrations first
- ✅ Use descriptive migration names
- ✅ Add indexes for foreign keys
- ✅ Include constraints in migrations
- ✅ Test migrations on staging before production
- ✅ Commit snapshots to version control
- ✅ Organize schemas by domain
- ✅ Use `drizzle-kit check` in CI

### Don'ts
- ❌ Never write the TypeScript schema before the SQL migration
- ❌ Don't skip staging testing
- ❌ Don't modify old migrations (create new ones)
- ❌ Don't forget to add indexes
- ❌ Don't use `drizzle-kit push` in production (use proper migrations)
- ❌ Don't commit generated files without snapshots
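
As an alternative to `push` in production, committed SQL migrations can be applied programmatically with Drizzle's migrator. A minimal sketch, assuming the node-postgres driver (the script path is illustrative):

```typescript
// scripts/migrate.ts — apply committed SQL migrations in order
import { drizzle } from 'drizzle-orm/node-postgres';
import { migrate } from 'drizzle-orm/node-postgres/migrator';
import { Pool } from 'pg';

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const db = drizzle(pool);

// Runs every migration in ./drizzle that has not been applied yet,
// tracking applied migrations in a dedicated bookkeeping table
await migrate(db, { migrationsFolder: './drizzle' });
await pool.end();
```

Run this as a deploy step (e.g. `tsx scripts/migrate.ts`) so the database is migrated before the new application version starts.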

## Troubleshooting

### Schema Drift Detected
**Error**: `Schema drift detected`

**Solution**:
```bash
# Check what changed
pnpm drizzle-kit check

# Regenerate snapshots
pnpm drizzle-kit generate

# Review changes and commit
git add drizzle/meta/
git commit -m "Update schema snapshots"
```

### Migration Fails on Staging
**Error**: Migration fails with a data constraint violation

**Solution**:
1. Roll back the migration
2. Create a data migration script
3. Run the data migration first
4. Then run the schema migration

```sql
-- First: Migrate data
UPDATE users SET status = 'active' WHERE status IS NULL;

-- Then: Add constraint
ALTER TABLE users
  ALTER COLUMN status SET NOT NULL;
```

### TypeScript Types Out of Sync
**Error**: TypeScript types don't match the database

**Solution**:
```bash
# Regenerate everything
pnpm db:generate
pnpm tsc --noEmit

# If still broken, check the schema files
# Ensure column names match SQL exactly
```

## Related Skills

- `universal-data-database-migration` - Universal migration patterns
- `toolchains-typescript-data-drizzle` - Drizzle ORM usage patterns
- `toolchains-typescript-core` - TypeScript best practices
- `universal-debugging-verification-before-completion` - Verification workflows
File: .agents/skills/drizzle-orm/SKILL.md (new file, +396 lines)

---
name: drizzle-orm
description: "Type-safe SQL ORM for TypeScript with zero runtime overhead"
progressive_disclosure:
  entry_point:
    summary: "Type-safe SQL ORM for TypeScript with zero runtime overhead"
    when_to_use: "When working with drizzle-orm or related functionality."
    quick_start: "1. Review the core concepts below. 2. Apply patterns to your use case. 3. Follow best practices for implementation."
references:
  - advanced-schemas.md
  - performance.md
  - query-patterns.md
  - vs-prisma.md
---

# Drizzle ORM

Modern TypeScript-first ORM with zero dependencies, compile-time type safety, and SQL-like syntax. Optimized for edge runtimes and serverless environments.

## Quick Start

### Installation

```bash
# Core ORM
npm install drizzle-orm

# Database driver (choose one)
npm install pg              # PostgreSQL
npm install mysql2          # MySQL
npm install better-sqlite3  # SQLite

# Drizzle Kit (migrations)
npm install -D drizzle-kit
```

### Basic Setup

```typescript
// db/schema.ts
import { pgTable, serial, text, timestamp } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull().unique(),
  name: text('name').notNull(),
  createdAt: timestamp('created_at').defaultNow(),
});

// db/client.ts
import { drizzle } from 'drizzle-orm/node-postgres';
import { Pool } from 'pg';
import * as schema from './schema';

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
export const db = drizzle(pool, { schema });
```

### First Query

```typescript
import { db } from './db/client';
import { users } from './db/schema';
import { eq } from 'drizzle-orm';

// Insert (.returning() yields an array of inserted rows)
const [newUser] = await db.insert(users).values({
  email: 'user@example.com',
  name: 'John Doe',
}).returning();

// Select
const allUsers = await db.select().from(users);

// Where (select always returns an array)
const [user] = await db.select().from(users).where(eq(users.id, 1));

// Update
await db.update(users).set({ name: 'Jane Doe' }).where(eq(users.id, 1));

// Delete
await db.delete(users).where(eq(users.id, 1));
```

## Schema Definition

### Column Types Reference

| PostgreSQL | MySQL | SQLite | TypeScript |
|------------|-------|--------|------------|
| `serial()` | `serial()` | `integer()` | `number` |
| `text()` | `text()` | `text()` | `string` |
| `integer()` | `int()` | `integer()` | `number` |
| `boolean()` | `boolean()` | `integer()` | `boolean` |
| `timestamp()` | `datetime()` | `integer()` | `Date` |
| `json()` | `json()` | `text()` | `unknown` |
| `uuid()` | `varchar(36)` | `text()` | `string` |

### Common Schema Patterns

```typescript
import { pgTable, serial, text, varchar, boolean, timestamp, json, unique } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }).notNull().unique(),
  passwordHash: varchar('password_hash', { length: 255 }).notNull(),
  role: text('role', { enum: ['admin', 'user', 'guest'] }).default('user'),
  metadata: json('metadata').$type<{ theme: string; locale: string }>(),
  isActive: boolean('is_active').default(true),
  createdAt: timestamp('created_at').defaultNow().notNull(),
  updatedAt: timestamp('updated_at').defaultNow().notNull(),
}, (table) => ({
  emailIdx: unique('email_unique_idx').on(table.email),
}));

// Infer TypeScript types
type User = typeof users.$inferSelect;
type NewUser = typeof users.$inferInsert;
```

## Relations

### One-to-Many

```typescript
import { pgTable, serial, text, integer } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const authors = pgTable('authors', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  authorId: integer('author_id').notNull().references(() => authors.id),
});

export const authorsRelations = relations(authors, ({ many }) => ({
  posts: many(posts),
}));

export const postsRelations = relations(posts, ({ one }) => ({
  author: one(authors, {
    fields: [posts.authorId],
    references: [authors.id],
  }),
}));

// Query with relations
const authorsWithPosts = await db.query.authors.findMany({
  with: { posts: true },
});
```

### Many-to-Many

```typescript
import { pgTable, serial, text, integer, primaryKey } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const groups = pgTable('groups', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const usersToGroups = pgTable('users_to_groups', {
  userId: integer('user_id').notNull().references(() => users.id),
  groupId: integer('group_id').notNull().references(() => groups.id),
}, (table) => ({
  pk: primaryKey({ columns: [table.userId, table.groupId] }),
}));

export const usersRelations = relations(users, ({ many }) => ({
  groups: many(usersToGroups),
}));

export const groupsRelations = relations(groups, ({ many }) => ({
  users: many(usersToGroups),
}));

export const usersToGroupsRelations = relations(usersToGroups, ({ one }) => ({
  user: one(users, { fields: [usersToGroups.userId], references: [users.id] }),
  group: one(groups, { fields: [usersToGroups.groupId], references: [groups.id] }),
}));
```
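
With the junction relations defined as above, the relational query API can traverse both hops. A sketch, assuming the `db` client was created with `{ schema }` so that `db.query` knows the relations:

```typescript
// Each user row carries its usersToGroups rows, each hydrated with its group
const usersWithGroups = await db.query.users.findMany({
  with: {
    groups: {           // junction rows for this user
      with: { group: true }, // hydrate the related group row
    },
  },
});

// Flatten the junction rows to group names, e.g.:
// usersWithGroups[0].groups.map((g) => g.group.name)
```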

## Queries

### Filtering

```typescript
import { eq, ne, gt, gte, lt, lte, like, ilike, inArray, isNull, isNotNull, and, or, between } from 'drizzle-orm';

// Equality
await db.select().from(users).where(eq(users.email, 'user@example.com'));

// Comparison
await db.select().from(users).where(gt(users.id, 10));

// Pattern matching
await db.select().from(users).where(like(users.name, '%John%'));

// Multiple conditions
await db.select().from(users).where(
  and(
    eq(users.role, 'admin'),
    gt(users.createdAt, new Date('2024-01-01'))
  )
);

// IN clause
await db.select().from(users).where(inArray(users.id, [1, 2, 3]));

// NULL checks
await db.select().from(users).where(isNull(users.deletedAt));
```

### Joins

```typescript
import { eq, count } from 'drizzle-orm';

// Inner join
const innerJoined = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .innerJoin(posts, eq(users.id, posts.authorId));

// Left join
const leftJoined = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId));

// Join with aggregation
const postCounts = await db
  .select({
    authorName: authors.name,
    postCount: count(posts.id),
  })
  .from(authors)
  .leftJoin(posts, eq(authors.id, posts.authorId))
  .groupBy(authors.id);
```

### Pagination & Sorting

```typescript
import { desc, asc } from 'drizzle-orm';

// Order by
await db.select().from(users).orderBy(desc(users.createdAt));

// Limit & offset
await db.select().from(users).limit(10).offset(20);

// Pagination helper (page is zero-based)
function paginate(page: number, pageSize: number = 10) {
  return db.select().from(users)
    .limit(pageSize)
    .offset(page * pageSize);
}
```

## Transactions

```typescript
// Auto-rollback on error
await db.transaction(async (tx) => {
  await tx.insert(users).values({ email: 'user@example.com', name: 'John' });
  await tx.insert(posts).values({ title: 'First Post', authorId: 1 });
  // If any query fails, the entire transaction rolls back
});

// Manual control
await db.transaction(async (tx) => {
  const [user] = await tx.insert(users).values({ ... }).returning();

  if (!user) {
    tx.rollback();
    return;
  }

  await tx.insert(posts).values({ authorId: user.id });
});
```

## Migrations

### Drizzle Kit Configuration

```typescript
// drizzle.config.ts
import type { Config } from 'drizzle-kit';

export default {
  schema: './db/schema.ts',
  out: './drizzle',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

### Migration Workflow

```bash
# Generate migration
npx drizzle-kit generate

# View SQL
cat drizzle/0000_migration.sql

# Apply migration
npx drizzle-kit migrate

# Introspect existing database
npx drizzle-kit introspect

# Drizzle Studio (database GUI)
npx drizzle-kit studio
```

### Example Migration

```sql
-- drizzle/0000_initial.sql
CREATE TABLE IF NOT EXISTS "users" (
  "id" serial PRIMARY KEY NOT NULL,
  "email" varchar(255) NOT NULL,
  "name" text NOT NULL,
  "created_at" timestamp DEFAULT now() NOT NULL,
  CONSTRAINT "users_email_unique" UNIQUE("email")
);
```

## Navigation

### Detailed References

- **[🏗️ Advanced Schemas](./references/advanced-schemas.md)** - Custom types, composite keys, indexes, constraints, multi-tenant patterns. Load when designing complex database schemas.

- **[🔍 Query Patterns](./references/query-patterns.md)** - Subqueries, CTEs, raw SQL, prepared statements, batch operations. Load when optimizing queries or handling complex filtering.

- **[⚡ Performance](./references/performance.md)** - Connection pooling, query optimization, N+1 prevention, prepared statements, edge runtime integration. Load when scaling or optimizing database performance.

- **[🔄 vs Prisma](./references/vs-prisma.md)** - Feature comparison, migration guide, when to choose Drizzle over Prisma. Load when evaluating ORMs or migrating from Prisma.

## Red Flags

**Stop and reconsider if:**
- Using `any` or `unknown` for JSON columns without a type annotation
- Building raw SQL strings without the `sql` template (SQL injection risk)
- Not using transactions for multi-step data modifications
- Fetching all rows without pagination in production queries
- Missing indexes on foreign keys or frequently queried columns
- Using `select()` without specifying columns for large tables
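
For the raw-SQL flag above, the `sql` template parameterizes interpolated values instead of concatenating them into the query string. A sketch (the `input` value stands in for untrusted request data):

```typescript
import { sql } from 'drizzle-orm';
import { db } from './db/client';

const input = 'John'; // imagine this arrived from a request

// UNSAFE: string concatenation lets crafted input rewrite the query
// await db.execute(`SELECT * FROM users WHERE name = '${input}'`);

// SAFE: interpolations in the sql template are sent as bound parameters
const rows = await db.execute(
  sql`SELECT * FROM users WHERE name = ${input}`
);
```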

## Performance Benefits vs Prisma

| Metric | Drizzle | Prisma |
|--------|---------|--------|
| **Bundle Size** | ~35KB | ~230KB |
| **Cold Start** | ~10ms | ~250ms |
| **Query Speed** | Baseline | ~2-3x slower |
| **Memory** | ~10MB | ~50MB |
| **Type Generation** | Runtime inference | Build-time generation |

## Integration

- **typescript-core**: Type-safe schema inference with `satisfies`
- **nextjs-core**: Server Actions, Route Handlers, Middleware integration
- **Database Migration**: Safe schema evolution patterns

## Related Skills

When using Drizzle, these skills enhance your workflow:
- **prisma**: Alternative ORM comparison: Drizzle vs Prisma trade-offs
- **typescript**: Advanced TypeScript patterns for type-safe queries
- **nextjs**: Drizzle with Next.js Server Actions and API routes
- **sqlalchemy**: SQLAlchemy patterns for Python developers learning Drizzle

[Full documentation available in these skills if deployed in your bundle]
File: .agents/skills/drizzle-orm/references/advanced-schemas.md (new file, +380 lines)

# Advanced Schemas
|
||||
|
||||
Deep dive into complex schema patterns, custom types, and database-specific features in Drizzle ORM.
|
||||
|
||||
## Custom Column Types
|
||||
|
||||
### Enums
|
||||
|
||||
```typescript
|
||||
import { pgEnum, pgTable, serial } from 'drizzle-orm/pg-core';
|
||||
|
||||
// PostgreSQL native enum
|
||||
export const roleEnum = pgEnum('role', ['admin', 'user', 'guest']);
|
||||
|
||||
export const users = pgTable('users', {
|
||||
id: serial('id').primaryKey(),
|
||||
role: roleEnum('role').default('user'),
|
||||
});
|
||||
|
||||
// MySQL/SQLite: Use text with constraints
|
||||
import { mysqlTable, text } from 'drizzle-orm/mysql-core';
|
||||
|
||||
export const users = mysqlTable('users', {
|
||||
role: text('role', { enum: ['admin', 'user', 'guest'] }).default('user'),
|
||||
});
|
||||
```
|
||||
|
||||
### Custom JSON Types
|
||||
|
||||
```typescript
|
||||
import { pgTable, serial, json } from 'drizzle-orm/pg-core';
|
||||
import { z } from 'zod';
|
||||
|
||||
// Type-safe JSON with Zod
|
||||
const MetadataSchema = z.object({
|
||||
theme: z.enum(['light', 'dark']),
|
||||
locale: z.string(),
|
||||
notifications: z.boolean(),
|
||||
});
|
||||
|
||||
type Metadata = z.infer<typeof MetadataSchema>;
|
||||
|
||||
export const users = pgTable('users', {
|
||||
id: serial('id').primaryKey(),
|
||||
metadata: json('metadata').$type<Metadata>(),
|
||||
});
|
||||
|
||||
// Runtime validation
|
||||
async function updateMetadata(userId: number, metadata: unknown) {
|
||||
const validated = MetadataSchema.parse(metadata);
|
||||
await db.update(users).set({ metadata: validated }).where(eq(users.id, userId));
|
||||
}
|
||||
```
|
||||
|
||||
### Arrays
|
||||
|
||||
```typescript
|
||||
import { pgTable, serial, text } from 'drizzle-orm/pg-core';
|
||||
|
||||
export const posts = pgTable('posts', {
|
||||
id: serial('id').primaryKey(),
|
||||
tags: text('tags').array(),
|
||||
});
|
||||
|
||||
// Query array columns
|
||||
import { arrayContains, arrayContained } from 'drizzle-orm';
|
||||
|
||||
await db.select().from(posts).where(arrayContains(posts.tags, ['typescript', 'drizzle']));
|
||||
```
|
||||
|
||||
## Indexes

### Basic Indexes

```typescript
import { pgTable, serial, text, varchar, index, uniqueIndex } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }).notNull(),
  name: text('name'),
  city: text('city'),
}, (table) => ({
  emailIdx: uniqueIndex('email_idx').on(table.email),
  nameIdx: index('name_idx').on(table.name),
  cityNameIdx: index('city_name_idx').on(table.city, table.name),
}));
```

### Partial Indexes

```typescript
import { pgTable, serial, varchar, timestamp, uniqueIndex } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }),
  deletedAt: timestamp('deleted_at'),
}, (table) => ({
  activeEmailIdx: uniqueIndex('active_email_idx')
    .on(table.email)
    .where(sql`${table.deletedAt} IS NULL`),
}));
```

### Full-Text Search

```typescript
import { pgTable, serial, text, index } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  content: text('content').notNull(),
}, (table) => ({
  searchIdx: index('search_idx').using(
    'gin',
    sql`to_tsvector('english', ${table.title} || ' ' || ${table.content})`
  ),
}));

// Full-text search query
const results = await db.select().from(posts).where(
  sql`to_tsvector('english', ${posts.title} || ' ' || ${posts.content}) @@ plainto_tsquery('english', 'typescript orm')`
);
```

## Composite Keys

```typescript
import { pgTable, integer, text, primaryKey } from 'drizzle-orm/pg-core';

export const userPreferences = pgTable('user_preferences', {
  userId: integer('user_id').notNull(),
  key: text('key').notNull(),
  value: text('value').notNull(),
}, (table) => ({
  pk: primaryKey({ columns: [table.userId, table.key] }),
}));
```

## Check Constraints

```typescript
import { pgTable, serial, integer, check } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const products = pgTable('products', {
  id: serial('id').primaryKey(),
  price: integer('price').notNull(),
  discountPrice: integer('discount_price'),
}, (table) => ({
  priceCheck: check('price_check', sql`${table.price} > 0`),
  discountCheck: check('discount_check', sql`${table.discountPrice} < ${table.price}`),
}));
```

## Generated Columns

```typescript
import { pgTable, serial, text } from 'drizzle-orm/pg-core';
import { sql, SQL } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  firstName: text('first_name').notNull(),
  lastName: text('last_name').notNull(),
  fullName: text('full_name').generatedAlwaysAs(
    (): SQL => sql`${users.firstName} || ' ' || ${users.lastName}`,
    { mode: 'stored' }
  ),
});
```

## Multi-Tenant Patterns

### Row-Level Security (PostgreSQL)

```typescript
import { pgTable, serial, text, uuid } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const tenants = pgTable('tenants', {
  id: uuid('id').defaultRandom().primaryKey(),
  name: text('name').notNull(),
});

export const documents = pgTable('documents', {
  id: serial('id').primaryKey(),
  tenantId: uuid('tenant_id').notNull().references(() => tenants.id),
  title: text('title').notNull(),
  content: text('content'),
});

// Apply RLS policy (via migration SQL)
/*
ALTER TABLE documents ENABLE ROW LEVEL SECURITY;

CREATE POLICY tenant_isolation ON documents
  USING (tenant_id = current_setting('app.current_tenant_id')::uuid);
*/

// Set tenant context
await db.execute(sql`SET app.current_tenant_id = ${tenantId}`);
```

### Schema-Per-Tenant

```typescript
import { drizzle } from 'drizzle-orm/node-postgres';

// Create schema-aware connection.
// NOTE: illustrative only — drizzle() has no `schemaPrefix` option; in practice,
// define per-tenant tables via pgSchema(schemaName) or set search_path per connection.
function getTenantDb(tenantId: string) {
  const schemaName = `tenant_${tenantId}`;

  return drizzle(pool, {
    schema: {
      ...schema,
    },
    schemaPrefix: schemaName,
  });
}

// Use tenant-specific DB
const tenantDb = getTenantDb('tenant123');
await tenantDb.select().from(users);
```

## Database-Specific Features

### PostgreSQL: JSONB Operations

```typescript
import { pgTable, serial, jsonb } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const settings = pgTable('settings', {
  id: serial('id').primaryKey(),
  config: jsonb('config').$type<Record<string, unknown>>(),
});

// JSONB operators
await db.select().from(settings).where(
  sql`${settings.config}->>'theme' = 'dark'`
);

// JSONB containment query
await db.select().from(settings).where(
  sql`${settings.config} @> '{"notifications": {"email": true}}'::jsonb`
);
```

### MySQL: Spatial Types

```typescript
import { mysqlTable, serial, geometry } from 'drizzle-orm/mysql-core';
import { sql } from 'drizzle-orm';

export const locations = mysqlTable('locations', {
  id: serial('id').primaryKey(),
  point: geometry('point', { type: 'point', srid: 4326 }),
});

// Spatial query
await db.select().from(locations).where(
  sql`ST_Distance_Sphere(${locations.point}, POINT(${lng}, ${lat})) < 1000`
);
```

### SQLite: FTS5

```typescript
import { sqliteTable, text } from 'drizzle-orm/sqlite-core';

export const documents = sqliteTable('documents', {
  title: text('title'),
  content: text('content'),
});

// Create FTS5 virtual table (via migration)
/*
CREATE VIRTUAL TABLE documents_fts USING fts5(title, content, content='documents');
*/
```

## Schema Versioning

### Migration Strategy

```typescript
// db/schema.ts
import { pgTable, serial, timestamp } from 'drizzle-orm/pg-core';
import { desc } from 'drizzle-orm';

export const schemaVersion = pgTable('schema_version', {
  version: serial('version').primaryKey(),
  appliedAt: timestamp('applied_at').defaultNow(),
});

// Track migrations
await db.insert(schemaVersion).values({ version: 1 });

// Check version
const [currentVersion] = await db
  .select()
  .from(schemaVersion)
  .orderBy(desc(schemaVersion.version))
  .limit(1);
```

## Type Inference Helpers

```typescript
import { pgTable, serial, text } from 'drizzle-orm/pg-core';
import type { InferSelectModel, InferInsertModel } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull(),
  name: text('name'),
});

// Generate types
export type User = InferSelectModel<typeof users>;
export type NewUser = InferInsertModel<typeof users>;

// Partial updates
export type UserUpdate = Partial<NewUser>;

// Nested relation types
export type UserWithPosts = User & {
  posts: Post[];
};
```

## Best Practices

### Schema Organization

```typescript
// db/schema/users.ts
export const users = pgTable('users', { ... });
export const userRelations = relations(users, { ... });

// db/schema/posts.ts
export const posts = pgTable('posts', { ... });
export const postRelations = relations(posts, { ... });

// db/schema/index.ts
export * from './users';
export * from './posts';

// db/client.ts
import * as schema from './schema';
export const db = drizzle(pool, { schema });
```

### Naming Conventions

```typescript
// ✅ Good: Consistent naming
export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  firstName: text('first_name'),
  createdAt: timestamp('created_at'),
});

// ❌ Bad: Inconsistent naming
export const Users = pgTable('user', {
  ID: serial('userId').primaryKey(),
  first_name: text('firstname'),
});
```

### Default Values

```typescript
import { pgTable, serial, text, integer, timestamp, uuid } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  slug: text('slug').notNull(),
  views: integer('views').default(0),
  createdAt: timestamp('created_at').defaultNow(),
  updatedAt: timestamp('updated_at').default(sql`CURRENT_TIMESTAMP`),
  uuid: uuid('uuid').defaultRandom(),
});
```

594
.agents/skills/drizzle-orm/references/performance.md
Normal file

@@ -0,0 +1,594 @@
# Performance Optimization

Connection pooling, query optimization, edge runtime integration, and performance best practices.

## Connection Pooling

### PostgreSQL (node-postgres)

```typescript
import { Pool } from 'pg';
import { drizzle } from 'drizzle-orm/node-postgres';

const pool = new Pool({
  host: process.env.DB_HOST,
  port: parseInt(process.env.DB_PORT || '5432'),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  max: 20,                        // Maximum pool size
  idleTimeoutMillis: 30000,       // Close idle clients after 30s
  connectionTimeoutMillis: 2000,  // Timeout for new connection attempts
});

export const db = drizzle(pool);

// Graceful shutdown
process.on('SIGTERM', async () => {
  await pool.end();
});
```

### MySQL (mysql2)

```typescript
import mysql from 'mysql2/promise';
import { drizzle } from 'drizzle-orm/mysql2';

const poolConnection = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  waitForConnections: true,
  connectionLimit: 10,
  maxIdle: 10,
  idleTimeout: 60000,
  queueLimit: 0,
  enableKeepAlive: true,
  keepAliveInitialDelay: 0,
});

export const db = drizzle(poolConnection);
```

### SQLite (better-sqlite3)

```typescript
import Database from 'better-sqlite3';
import { drizzle } from 'drizzle-orm/better-sqlite3';

const sqlite = new Database('sqlite.db', {
  readonly: false,
  fileMustExist: false,
  timeout: 5000,
  verbose: console.log, // Remove in production
});

// Performance pragmas
sqlite.pragma('journal_mode = WAL');
sqlite.pragma('synchronous = normal');
sqlite.pragma('cache_size = -64000'); // 64MB cache
sqlite.pragma('temp_store = memory');

export const db = drizzle(sqlite);

process.on('exit', () => sqlite.close());
```

## Query Optimization

### Select Only Needed Columns

```typescript
// ❌ Bad: Fetch all columns
const allUsers = await db.select().from(users);

// ✅ Good: Fetch only needed columns
const userSummaries = await db.select({
  id: users.id,
  email: users.email,
  name: users.name,
}).from(users);
```

### Use Indexes Effectively

```typescript
import { pgTable, serial, text, varchar, index } from 'drizzle-orm/pg-core';
import { and, eq } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }).notNull(),
  city: text('city'),
  status: text('status'),
}, (table) => ({
  // Index frequently queried columns
  emailIdx: index('email_idx').on(table.email),

  // Composite index for common query patterns
  cityStatusIdx: index('city_status_idx').on(table.city, table.status),
}));

// Query uses the composite index
const activeUsersInNYC = await db.select()
  .from(users)
  .where(and(
    eq(users.city, 'NYC'),
    eq(users.status, 'active')
  ));
```

### Analyze Query Plans

```typescript
import { sql } from 'drizzle-orm';

// PostgreSQL EXPLAIN
const plan = await db.execute(
  sql`EXPLAIN ANALYZE SELECT * FROM ${users} WHERE ${users.email} = 'user@example.com'`
);

console.log(plan.rows);

// Check for:
// - "Seq Scan" (bad) vs "Index Scan" (good)
// - Actual time vs estimated time
// - Rows removed by filter
```

### Pagination Performance

```typescript
import { gt, lt, asc, desc } from 'drizzle-orm';

// ❌ Bad: OFFSET on large datasets (gets slower as the offset grows)
const page = await db.select()
  .from(users)
  .limit(20)
  .offset(10000); // Scans 10,020 rows!

// ✅ Good: Cursor-based pagination (constant time)
const nextPage = await db.select()
  .from(users)
  .where(gt(users.id, lastSeenId))
  .orderBy(asc(users.id))
  .limit(20);

// ✅ Good: Seek method for timestamp-based pagination
const recentPage = await db.select()
  .from(posts)
  .where(lt(posts.createdAt, lastSeenTimestamp))
  .orderBy(desc(posts.createdAt))
  .limit(20);
```

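Cursor-based pagination also needs an opaque cursor to hand back to API clients. A minimal sketch (the helper names are hypothetical, not part of Drizzle) that base64url-encodes the last-seen sort key:

```typescript
// Hypothetical helpers for opaque pagination cursors (not part of Drizzle).
// The cursor encodes the last-seen sort key so the next request can resume
// from `where(gt(users.id, decodeCursor(cursor).id))`.
function encodeCursor(lastSeenId: number): string {
  return Buffer.from(JSON.stringify({ id: lastSeenId })).toString('base64url');
}

function decodeCursor(cursor: string): { id: number } {
  return JSON.parse(Buffer.from(cursor, 'base64url').toString('utf8'));
}

const cursor = encodeCursor(120);
const resume = decodeCursor(cursor); // { id: 120 }
```

Encoding keeps clients from depending on (or tampering predictably with) the raw id, and lets the cursor format grow extra fields later (e.g. a timestamp for the seek method) without an API change.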
## Edge Runtime Integration

### Cloudflare Workers (D1)

```typescript
import { drizzle } from 'drizzle-orm/d1';

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const db = drizzle(env.DB);

    const result = await db.select().from(users).limit(10);

    return Response.json(result);
  },
};
```

### Vercel Edge (Neon)

```typescript
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';

export const runtime = 'edge';

export async function GET() {
  const sql = neon(process.env.DATABASE_URL!);
  const db = drizzle(sql);

  const result = await db.select().from(users);

  return Response.json(result);
}
```

### Supabase Edge Functions

```typescript
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';

Deno.serve(async (req) => {
  const client = postgres(Deno.env.get('DATABASE_URL')!);
  const db = drizzle(client);

  const data = await db.select().from(users);

  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json' },
  });
});
```

## Caching Strategies

### In-Memory Cache

```typescript
import { LRUCache } from 'lru-cache';
import { eq } from 'drizzle-orm';

const cache = new LRUCache<string, any>({
  max: 500,
  ttl: 1000 * 60 * 5, // 5 minutes
});

async function getCachedUser(id: number) {
  const key = `user:${id}`;
  const cached = cache.get(key);

  if (cached) return cached;

  const user = await db.select().from(users).where(eq(users.id, id));
  cache.set(key, user);

  return user;
}
```

### Redis Cache Layer

```typescript
import { Redis } from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

async function getCachedData<T>(
  key: string,
  fetcher: () => Promise<T>,
  ttl: number = 300
): Promise<T> {
  // Try cache first
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);

  // Fetch from database
  const data = await fetcher();

  // Store in cache
  await redis.setex(key, ttl, JSON.stringify(data));

  return data;
}

// Usage
const allUsers = await getCachedData(
  'users:all',
  () => db.select().from(users),
  600
);
```

### Materialized Views (PostgreSQL)

```typescript
import { pgMaterializedView } from 'drizzle-orm/pg-core';
import { eq, sql } from 'drizzle-orm';

// Create materialized view (via migration)
/*
CREATE MATERIALIZED VIEW user_stats AS
SELECT
  u.id,
  u.name,
  COUNT(p.id) AS post_count,
  COUNT(c.id) AS comment_count
FROM users u
LEFT JOIN posts p ON p.author_id = u.id
LEFT JOIN comments c ON c.user_id = u.id
GROUP BY u.id;

CREATE UNIQUE INDEX ON user_stats (id);
*/

// Define schema
export const userStats = pgMaterializedView('user_stats').as((qb) =>
  qb.select({
    id: users.id,
    name: users.name,
    postCount: sql<number>`COUNT(${posts.id})`,
    commentCount: sql<number>`COUNT(${comments.id})`,
  })
    .from(users)
    .leftJoin(posts, eq(posts.authorId, users.id))
    .leftJoin(comments, eq(comments.userId, users.id))
    .groupBy(users.id)
);

// Refresh materialized view
await db.execute(sql`REFRESH MATERIALIZED VIEW CONCURRENTLY user_stats`);

// Query materialized view (fast!)
const stats = await db.select().from(userStats);
```

## Batch Operations Optimization

### Batch Insert with COPY (PostgreSQL)

```typescript
import { from as copyFrom } from 'pg-copy-streams';
import { pipeline } from 'stream/promises';
import { Readable } from 'stream';

async function bulkInsert(data: any[]) {
  const client = await pool.connect();

  try {
    const stream = client.query(
      copyFrom(`COPY users (email, name) FROM STDIN WITH (FORMAT csv)`)
    );

    const input = Readable.from(
      data.map(row => `${row.email},${row.name}\n`)
    );

    await pipeline(input, stream);
  } finally {
    client.release();
  }
}

// 10x faster than batch INSERT for large datasets
```

### Chunk Processing

```typescript
import { eq } from 'drizzle-orm';

function* chunked<T>(array: T[], size: number) {
  for (let i = 0; i < array.length; i += size) {
    yield array.slice(i, i + size);
  }
}

async function bulkUpdate(updates: { id: number; name: string }[]) {
  for (const chunk of chunked(updates, 100)) {
    await db.transaction(async (tx) => {
      for (const update of chunk) {
        await tx.update(users)
          .set({ name: update.name })
          .where(eq(users.id, update.id));
      }
    });
  }
}
```

## Connection Management

### Serverless Optimization

```typescript
// ❌ Bad: New connection per request
export async function handler() {
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });
  const db = drizzle(pool);

  const result = await db.select().from(users);

  await pool.end();
  return result;
}

// ✅ Good: Reuse connection across warm starts
let cachedDb: ReturnType<typeof drizzle> | null = null;

export async function handler() {
  if (!cachedDb) {
    const pool = new Pool({
      connectionString: process.env.DATABASE_URL,
      max: 1, // Serverless: single connection per instance
    });
    cachedDb = drizzle(pool);
  }

  const result = await cachedDb.select().from(users);
  return result;
}
```

### HTTP-based Databases (Neon, Turso)

```typescript
// No connection pooling needed - uses HTTP
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';

const sql = neon(process.env.DATABASE_URL!);
const db = drizzle(sql);

// Each query is a single HTTP request
const result = await db.select().from(users);
```

## Read Replicas

```typescript
import { Pool } from 'pg';
import { drizzle } from 'drizzle-orm/node-postgres';

// Primary (writes)
const primaryPool = new Pool({ connectionString: process.env.PRIMARY_DB_URL });
const primaryDb = drizzle(primaryPool);

// Replica (reads)
const replicaPool = new Pool({ connectionString: process.env.REPLICA_DB_URL });
const replicaDb = drizzle(replicaPool);

// Route queries appropriately
async function getUsers() {
  return replicaDb.select().from(users); // Read from replica
}

async function createUser(data: NewUser) {
  return primaryDb.insert(users).values(data).returning(); // Write to primary
}
```

## Monitoring & Profiling

### Query Logging

```typescript
import { drizzle } from 'drizzle-orm/node-postgres';

const db = drizzle(pool, {
  logger: {
    logQuery(query: string, params: unknown[]) {
      console.log('Query:', query);
      console.log('Params:', params);
    },
  },
});

// Custom logger with metrics
class MetricsLogger {
  private queries: Map<string, { count: number; totalTime: number }> = new Map();

  logQuery(query: string) {
    const start = Date.now();

    return () => {
      const duration = Date.now() - start;
      const stats = this.queries.get(query) || { count: 0, totalTime: 0 };

      this.queries.set(query, {
        count: stats.count + 1,
        totalTime: stats.totalTime + duration,
      });

      if (duration > 1000) {
        console.warn(`Slow query (${duration}ms):`, query);
      }
    };
  }

  getStats() {
    return Array.from(this.queries.entries()).map(([query, stats]) => ({
      query,
      count: stats.count,
      avgTime: stats.totalTime / stats.count,
    }));
  }
}
```

### Performance Monitoring

```typescript
import { performance } from 'perf_hooks';

async function measureQuery<T>(
  name: string,
  query: Promise<T>
): Promise<T> {
  const start = performance.now();

  try {
    const result = await query;
    const duration = performance.now() - start;

    console.log(`[${name}] completed in ${duration.toFixed(2)}ms`);

    return result;
  } catch (error) {
    const duration = performance.now() - start;
    console.error(`[${name}] failed after ${duration.toFixed(2)}ms`, error);
    throw error;
  }
}

// Usage
const recentUsers = await measureQuery(
  'fetchUsers',
  db.select().from(users).limit(100)
);
```

## Database-Specific Optimizations

### PostgreSQL

```typescript
// Connection optimization
const pool = new Pool({
  max: 20,
  application_name: 'myapp',
  statement_timeout: 30000, // 30s query timeout
  query_timeout: 30000,
  connectionTimeoutMillis: 5000,
  idle_in_transaction_session_timeout: 10000,
});

// Session optimization
await db.execute(sql`SET work_mem = '256MB'`);
await db.execute(sql`SET maintenance_work_mem = '512MB'`);
await db.execute(sql`SET effective_cache_size = '4GB'`);
```

### MySQL

```typescript
const pool = mysql.createPool({
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  enableKeepAlive: true,
  keepAliveInitialDelay: 0,
  dateStrings: false,
  supportBigNumbers: true,
  bigNumberStrings: false,
  multipleStatements: false, // Security
  timezone: 'Z', // UTC
});
```

### SQLite

```typescript
// WAL mode for concurrent reads
sqlite.pragma('journal_mode = WAL');

// Optimize for performance
sqlite.pragma('synchronous = NORMAL');
sqlite.pragma('cache_size = -64000'); // 64MB
sqlite.pragma('temp_store = MEMORY');
sqlite.pragma('mmap_size = 30000000000'); // 30GB mmap

// Use a prepared statement inside a transaction for bulk inserts
const stmt = sqlite.prepare('INSERT INTO users (email, name) VALUES (?, ?)');

const insertMany = sqlite.transaction((users) => {
  for (const user of users) {
    stmt.run(user.email, user.name);
  }
});

insertMany(users); // 100x faster than individual inserts
```

## Best Practices Summary

1. **Always use connection pooling** in long-running processes
2. **Select only needed columns** to reduce network transfer
3. **Add indexes** on frequently queried columns and foreign keys
4. **Use cursor-based pagination** instead of OFFSET for large datasets
5. **Batch operations** when inserting/updating multiple records
6. **Cache expensive queries** with appropriate TTL
7. **Monitor slow queries** and optimize with EXPLAIN ANALYZE
8. **Use prepared statements** for frequently executed queries
9. **Implement read replicas** for high-traffic read operations
10. **Use HTTP-based databases** (Neon, Turso) for edge/serverless

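Prepared statements (point 8) are only shown above for better-sqlite3; a minimal sketch of Drizzle's own `.prepare()` with `sql.placeholder()`, assuming the `db` and `users` definitions from earlier examples:

```typescript
import { eq, sql } from 'drizzle-orm';

// Built once; the query plan can be cached and reused per execution.
const getUserById = db
  .select()
  .from(users)
  .where(eq(users.id, sql.placeholder('id')))
  .prepare('get_user_by_id');

// Execute many times with different parameters
const user = await getUserById.execute({ id: 42 });
```

The statement name argument applies to the PostgreSQL driver; other dialects take no name. This amortizes query building and parsing across repeated executions.
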
577
.agents/skills/drizzle-orm/references/query-patterns.md
Normal file

@@ -0,0 +1,577 @@
# Query Patterns

Advanced querying techniques, subqueries, CTEs, and raw SQL in Drizzle ORM.

## Subqueries

### SELECT Subqueries

```typescript
import { sql, gt, avg } from 'drizzle-orm';

// Scalar subquery
const avgPrice = db.select({ value: avg(products.price) }).from(products);

const expensiveProducts = await db
  .select()
  .from(products)
  .where(gt(products.price, avgPrice));

// Correlated subquery
const authorsWithPostCount = await db
  .select({
    author: authors,
    postCount: sql<number>`(
      SELECT COUNT(*)
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`,
  })
  .from(authors);
```

### EXISTS Subqueries

```typescript
// Find authors with posts
const authorsWithPosts = await db
  .select()
  .from(authors)
  .where(
    sql`EXISTS (
      SELECT 1
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`
  );

// Find authors without posts
const authorsWithoutPosts = await db
  .select()
  .from(authors)
  .where(
    sql`NOT EXISTS (
      SELECT 1
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`
  );
```

### IN Subqueries

```typescript
// Find users who commented
const usersWhoCommented = await db
  .select()
  .from(users)
  .where(
    sql`${users.id} IN (
      SELECT DISTINCT ${comments.userId}
      FROM ${comments}
    )`
  );
```

## Common Table Expressions (CTEs)

### Basic CTE

```typescript
import { eq, sql } from 'drizzle-orm';

const topAuthors = db.$with('top_authors').as(
  db.select({
    id: authors.id,
    name: authors.name,
    postCount: sql<number>`COUNT(${posts.id})`.as('post_count'),
  })
    .from(authors)
    .leftJoin(posts, eq(authors.id, posts.authorId))
    .groupBy(authors.id)
    .having(sql`COUNT(${posts.id}) > 10`)
);

const result = await db
  .with(topAuthors)
  .select()
  .from(topAuthors);
```

### Recursive CTE

```typescript
import { pgTable, serial, text, integer, AnyPgColumn } from 'drizzle-orm/pg-core';
import { sql, isNull } from 'drizzle-orm';

// Organizational hierarchy
export const employees = pgTable('employees', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  managerId: integer('manager_id').references((): AnyPgColumn => employees.id),
});

const employeeHierarchy = db.$with('employee_hierarchy').as(
  db.select({
    id: employees.id,
    name: employees.name,
    managerId: employees.managerId,
    level: sql<number>`1`.as('level'),
  })
    .from(employees)
    .where(isNull(employees.managerId))
    .unionAll(
      db.select({
        id: employees.id,
        name: employees.name,
        managerId: employees.managerId,
        level: sql<number>`employee_hierarchy.level + 1`,
      })
        .from(employees)
        .innerJoin(
          sql`employee_hierarchy`,
          sql`${employees.managerId} = employee_hierarchy.id`
        )
    )
);

const hierarchy = await db
  .with(employeeHierarchy)
  .select()
  .from(employeeHierarchy);
```

### Multiple CTEs

```typescript
import { eq, gt, sql } from 'drizzle-orm';

const activeUsers = db.$with('active_users').as(
  db.select().from(users).where(eq(users.isActive, true))
);

const recentPosts = db.$with('recent_posts').as(
  db.select().from(posts).where(gt(posts.createdAt, sql`NOW() - INTERVAL '30 days'`))
);

const result = await db
  .with(activeUsers, recentPosts)
  .select({
    user: activeUsers,
    post: recentPosts,
  })
  .from(activeUsers)
  .leftJoin(recentPosts, eq(activeUsers.id, recentPosts.authorId));
```

## Raw SQL

### Safe Raw Queries

```typescript
import { sql } from 'drizzle-orm';

// Parameterized query (safe from SQL injection)
const userId = 123;
const user = await db.execute(
  sql`SELECT * FROM ${users} WHERE ${users.id} = ${userId}`
);

// Raw SQL with type safety
const result = await db.execute<{ count: number }>(
  sql`SELECT COUNT(*) as count FROM ${users}`
);
```

### SQL Template Composition

```typescript
// Reusable SQL fragments
function whereActive() {
  return sql`${users.isActive} = true`;
}

function whereRole(role: string) {
  return sql`${users.role} = ${role}`;
}

// Compose fragments
const admins = await db
  .select()
  .from(users)
  .where(sql`${whereActive()} AND ${whereRole('admin')}`);
```

### Dynamic WHERE Clauses

```typescript
import { and, eq, like, SQL } from 'drizzle-orm';

interface Filters {
  name?: string;
  role?: string;
  isActive?: boolean;
}

function buildFilters(filters: Filters): SQL | undefined {
  const conditions: SQL[] = [];

  if (filters.name) {
    conditions.push(like(users.name, `%${filters.name}%`));
  }

  if (filters.role) {
    conditions.push(eq(users.role, filters.role));
  }

  if (filters.isActive !== undefined) {
    conditions.push(eq(users.isActive, filters.isActive));
  }

  return conditions.length > 0 ? and(...conditions) : undefined;
}

// Usage
const filters: Filters = { name: 'John', isActive: true };
const matchingUsers = await db
  .select()
  .from(users)
  .where(buildFilters(filters));
```

## Aggregations
|
||||
|
||||
### Basic Aggregates
|
||||
|
||||
```typescript
|
||||
import { count, sum, avg, min, max, sql } from 'drizzle-orm';
|
||||
|
||||
// Count
|
||||
const userCount = await db.select({ count: count() }).from(users);
|
||||
|
||||
// Sum
|
||||
const totalRevenue = await db.select({ total: sum(orders.amount) }).from(orders);
|
||||
|
||||
// Average
|
||||
const avgPrice = await db.select({ avg: avg(products.price) }).from(products);
|
||||
|
||||
// Multiple aggregates
|
||||
const stats = await db
|
||||
.select({
|
||||
count: count(),
|
||||
total: sum(orders.amount),
|
||||
avg: avg(orders.amount),
|
||||
min: min(orders.amount),
|
||||
max: max(orders.amount),
|
||||
})
|
||||
.from(orders);
|
||||
```
|
||||
|
||||
### GROUP BY with HAVING
|
||||
|
||||
```typescript
|
||||
// Authors with more than 5 posts
|
||||
const prolificAuthors = await db
|
||||
.select({
|
||||
author: authors.name,
|
||||
postCount: count(posts.id),
|
||||
})
|
||||
.from(authors)
|
||||
.leftJoin(posts, eq(authors.id, posts.authorId))
|
||||
.groupBy(authors.id)
|
||||
.having(sql`COUNT(${posts.id}) > 5`);
|
||||
```
|
||||
|
||||
### Window Functions
|
||||
|
||||
```typescript
|
||||
// Rank products by price within category
|
||||
const rankedProducts = await db
|
||||
.select({
|
||||
product: products,
|
||||
priceRank: sql<number>`RANK() OVER (PARTITION BY ${products.categoryId} ORDER BY ${products.price} DESC)`,
|
||||
})
|
||||
.from(products);
|
||||
|
||||
// Running total
|
||||
const ordersWithRunningTotal = await db
|
||||
.select({
|
||||
order: orders,
|
||||
runningTotal: sql<number>`SUM(${orders.amount}) OVER (ORDER BY ${orders.createdAt})`,
|
||||
})
|
||||
.from(orders);
|
||||
|
||||
// Row number
|
||||
const numberedUsers = await db
|
||||
.select({
|
||||
user: users,
|
||||
rowNum: sql<number>`ROW_NUMBER() OVER (ORDER BY ${users.createdAt})`,
|
||||
})
|
||||
.from(users);
|
||||
```
|
||||
|
||||
## Prepared Statements
|
||||
|
||||
### Reusable Queries
|
||||
|
||||
```typescript
|
||||
// Prepare once, execute many times
|
||||
const getUserById = db
|
||||
.select()
|
||||
.from(users)
|
||||
.where(eq(users.id, sql.placeholder('id')))
|
||||
.prepare('get_user_by_id');
|
||||
|
||||
// Execute with different parameters
|
||||
const user1 = await getUserById.execute({ id: 1 });
|
||||
const user2 = await getUserById.execute({ id: 2 });
|
||||
|
||||
// Complex prepared statement
|
||||
const searchUsers = db
|
||||
.select()
|
||||
.from(users)
|
||||
.where(
|
||||
and(
|
||||
like(users.name, sql.placeholder('name')),
|
||||
eq(users.role, sql.placeholder('role'))
|
||||
)
|
||||
)
|
||||
.prepare('search_users');
|
||||
|
||||
const admins = await searchUsers.execute({ name: '%John%', role: 'admin' });
|
||||
```
|
||||
|
||||
## Batch Operations
|
||||
|
||||
### Batch Insert
|
||||
|
||||
```typescript
|
||||
// Insert multiple rows
|
||||
const newUsers = await db.insert(users).values([
|
||||
{ email: 'user1@example.com', name: 'User 1' },
|
||||
{ email: 'user2@example.com', name: 'User 2' },
|
||||
{ email: 'user3@example.com', name: 'User 3' },
|
||||
]).returning();
|
||||
|
||||
// Batch with onConflictDoNothing
|
||||
await db.insert(users).values(bulkUsers).onConflictDoNothing();
|
||||
|
||||
// Batch with onConflictDoUpdate (upsert)
|
||||
await db.insert(users)
|
||||
.values(bulkUsers)
|
||||
.onConflictDoUpdate({
|
||||
target: users.email,
|
||||
set: { name: sql`EXCLUDED.name` },
|
||||
});
|
||||
```
|
||||
|
||||
### Batch Update
|
||||
|
||||
```typescript
|
||||
// Update multiple specific rows
|
||||
await db.transaction(async (tx) => {
|
||||
for (const update of updates) {
|
||||
await tx.update(users)
|
||||
.set({ name: update.name })
|
||||
.where(eq(users.id, update.id));
|
||||
}
|
||||
});
|
||||
|
||||
// Bulk update with CASE
|
||||
await db.execute(sql`
|
||||
UPDATE ${users}
|
||||
SET ${users.role} = CASE ${users.id}
|
||||
${sql.join(
|
||||
updates.map((u) => sql`WHEN ${u.id} THEN ${u.role}`),
|
||||
sql.raw(' ')
|
||||
)}
|
||||
END
|
||||
WHERE ${users.id} IN (${sql.join(updates.map((u) => u.id), sql.raw(', '))})
|
||||
`);
|
||||
```
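The CASE form trades readability for a single round trip. The statement-plus-parameters shape it ultimately produces can be sketched with a small standalone builder (plain TypeScript with `?` placeholders; names are illustrative, not Drizzle's API):

```typescript
interface RoleUpdate {
  id: number;
  role: string;
}

// Build one UPDATE ... CASE statement with positional parameters.
function buildBulkRoleUpdate(updates: RoleUpdate[]): { text: string; params: (number | string)[] } {
  const params: (number | string)[] = [];
  const whenClauses = updates
    .map((u) => {
      params.push(u.id, u.role); // WHEN <id> THEN <role>
      return 'WHEN ? THEN ?';
    })
    .join(' ');
  const idList = updates
    .map((u) => {
      params.push(u.id); // restrict the UPDATE to affected rows
      return '?';
    })
    .join(', ');
  return {
    text: `UPDATE users SET role = CASE id ${whenClauses} END WHERE id IN (${idList})`,
    params,
  };
}

const stmt = buildBulkRoleUpdate([
  { id: 1, role: 'admin' },
  { id: 2, role: 'viewer' },
]);
// stmt.text:
//   UPDATE users SET role = CASE id WHEN ? THEN ? WHEN ? THEN ? END WHERE id IN (?, ?)
// stmt.params: [1, 'admin', 2, 'viewer', 1, 2]
```

The transaction loop is simpler and fine for small batches; the CASE statement wins when the per-query round-trip latency dominates.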

### Batch Delete

```typescript
// Delete multiple IDs
await db.delete(users).where(inArray(users.id, [1, 2, 3, 4, 5]));

// Conditional batch delete
await db.delete(posts).where(
  and(
    lt(posts.createdAt, new Date('2023-01-01')),
    eq(posts.isDraft, true)
  )
);
```

## LATERAL Joins

```typescript
// Get top 3 posts for each author
const authorsWithTopPosts = await db
  .select({
    author: authors,
    post: posts,
  })
  .from(authors)
  .leftJoin(
    sql`LATERAL (
      SELECT * FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
      ORDER BY ${posts.views} DESC
      LIMIT 3
    ) AS ${posts}`,
    sql`true`
  );
```

## UNION Queries

```typescript
// Combine results from multiple queries
const allContent = await db
  .select({ id: posts.id, title: posts.title, type: sql<string>`'post'` })
  .from(posts)
  .union(
    db.select({ id: articles.id, title: articles.title, type: sql<string>`'article'` })
      .from(articles)
  );

// UNION ALL (includes duplicates)
const allItems = await db
  .select({ id: products.id, name: products.name })
  .from(products)
  .unionAll(
    db.select({ id: services.id, name: services.name }).from(services)
  );
```

## Distinct Queries

```typescript
// DISTINCT
const uniqueRoles = await db.selectDistinct({ role: users.role }).from(users);

// DISTINCT ON (PostgreSQL)
const latestPostPerAuthor = await db
  .selectDistinctOn([posts.authorId], {
    post: posts,
  })
  .from(posts)
  .orderBy(posts.authorId, desc(posts.createdAt));
```

## Locking Strategies

```typescript
// FOR UPDATE (pessimistic locking)
await db.transaction(async (tx) => {
  const [user] = await tx
    .select()
    .from(users)
    .where(eq(users.id, userId))
    .for('update');

  // Critical section - user row is locked
  await tx.update(users)
    .set({ balance: user.balance - amount })
    .where(eq(users.id, userId));
});

// FOR SHARE (shared lock)
const [sharedUser] = await db
  .select()
  .from(users)
  .where(eq(users.id, userId))
  .for('share');

// SKIP LOCKED
const [availableTask] = await db
  .select()
  .from(tasks)
  .where(eq(tasks.status, 'pending'))
  .limit(1)
  .for('update', { skipLocked: true });
```

## Query Builder Patterns

### Type-Safe Query Builder

```typescript
class UserQueryBuilder {
  // .$dynamic() lets the builder be reassigned across method calls
  private query = db.select().from(users).$dynamic();

  whereRole(role: string) {
    this.query = this.query.where(eq(users.role, role));
    return this;
  }

  whereActive() {
    this.query = this.query.where(eq(users.isActive, true));
    return this;
  }

  orderByCreated() {
    this.query = this.query.orderBy(desc(users.createdAt));
    return this;
  }

  async execute() {
    return await this.query;
  }
}

// Usage
const admins = await new UserQueryBuilder()
  .whereRole('admin')
  .whereActive()
  .orderByCreated()
  .execute();
```

## Best Practices

### Avoid N+1 Queries

```typescript
// ❌ Bad: N+1 query
const allAuthors = await db.select().from(authors);
for (const author of allAuthors) {
  author.posts = await db.select().from(posts).where(eq(posts.authorId, author.id));
}

// ✅ Good: Single query with join
const authorsWithPosts = await db.query.authors.findMany({
  with: { posts: true },
});

// ✅ Good: Dataloader pattern
import DataLoader from 'dataloader';

const postLoader = new DataLoader(async (authorIds: readonly number[]) => {
  const allPosts = await db.select().from(posts).where(inArray(posts.authorId, [...authorIds]));

  const grouped = authorIds.map((id) =>
    allPosts.filter((post) => post.authorId === id)
  );

  return grouped;
});
```

### Query Timeouts

```typescript
// PostgreSQL statement timeout
await db.execute(sql`SET statement_timeout = '5s'`);

// Per-query timeout
const withTimeout = async <T>(promise: Promise<T>, ms: number): Promise<T> => {
  const timeout = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error('Query timeout')), ms)
  );
  return Promise.race([promise, timeout]);
};

const result = await withTimeout(
  db.select().from(users),
  5000
);
```
503 .agents/skills/drizzle-orm/references/vs-prisma.md Normal file
@@ -0,0 +1,503 @@
# Drizzle vs Prisma Comparison

Feature comparison, migration guide, and decision framework for choosing between Drizzle and Prisma.

## Quick Comparison

| Feature | Drizzle ORM | Prisma |
|---------|-------------|--------|
| **Type Safety** | ✅ Compile-time inference | ✅ Generated types |
| **Bundle Size** | **~35KB** | ~230KB |
| **Runtime** | **Zero dependencies** | Heavy runtime |
| **Cold Start** | **~10ms** | ~250ms |
| **Query Performance** | **Faster (native SQL)** | Slower (translation layer) |
| **Learning Curve** | Moderate (SQL knowledge helpful) | Easier (abstracted) |
| **Migrations** | SQL-based | Declarative schema |
| **Raw SQL** | **First-class support** | Limited support |
| **Edge Runtime** | **Fully compatible** | Limited support |
| **Ecosystem** | Growing | Mature |
| **Studio (GUI)** | ✅ Drizzle Studio | ✅ Prisma Studio |

## When to Choose Drizzle

### ✅ Choose Drizzle if you need:

1. **Performance-critical applications**
   - Microservices with tight latency requirements
   - High-throughput APIs (>10K req/s)
   - Serverless/edge functions with cold start concerns

2. **Minimal bundle size**
   - Client-side database (SQLite in browser)
   - Edge runtime deployments
   - Mobile applications with bundle constraints

3. **SQL control**
   - Complex queries with CTEs, window functions
   - Raw SQL for specific database features
   - Database-specific optimizations

4. **Type inference over generation**
   - No build step for type generation
   - Immediate TypeScript feedback
   - Schema changes reflected instantly

### Example: Edge Function with Drizzle

```typescript
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';

export const runtime = 'edge';

export async function GET() {
  const sql = neon(process.env.DATABASE_URL!);
  const db = drizzle(sql); // ~35KB bundle, <10ms cold start

  const rows = await db.select().from(users);
  return Response.json(rows);
}
```

## When to Choose Prisma

### ✅ Choose Prisma if you need:

1. **Rapid prototyping**
   - Quick schema iterations
   - Automatic migrations
   - Less SQL knowledge required

2. **Team with varied SQL experience**
   - Abstracted query interface
   - Declarative migrations
   - Generated documentation

3. **Mature ecosystem**
   - Extensive community resources
   - Third-party integrations (Nexus, tRPC)
   - Enterprise support options

4. **Rich developer experience**
   - Prisma Studio (GUI)
   - VS Code extension
   - Comprehensive documentation

### Example: Next.js App with Prisma

```prisma
// schema.prisma
model User {
  id    Int    @id @default(autoincrement())
  email String @unique
  posts Post[]
}

model Post {
  id       Int    @id @default(autoincrement())
  title    String
  authorId Int
  author   User   @relation(fields: [authorId], references: [id])
}
```

```typescript
// app/api/users/route.ts
import { prisma } from '@/lib/prisma';

export async function GET() {
  const users = await prisma.user.findMany({
    include: { posts: true },
  });
  return Response.json(users);
}
```

## Feature Comparison

### Schema Definition

**Drizzle** (TypeScript-first):
```typescript
import { pgTable, serial, text, integer } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull().unique(),
});

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  authorId: integer('author_id').notNull().references(() => users.id),
});

export const usersRelations = relations(users, ({ many }) => ({
  posts: many(posts),
}));
```

**Prisma** (Schema DSL):
```prisma
model User {
  id    Int    @id @default(autoincrement())
  email String @unique
  posts Post[]
}

model Post {
  id       Int    @id @default(autoincrement())
  title    String
  authorId Int
  author   User   @relation(fields: [authorId], references: [id])
}
```

### Querying

**Drizzle** (SQL-like):
```typescript
import { eq, like, and, gt } from 'drizzle-orm';

// Simple query
const user = await db.select().from(users).where(eq(users.id, 1));

// Complex filtering
const results = await db.select()
  .from(users)
  .where(
    and(
      like(users.email, '%@example.com'),
      gt(users.createdAt, new Date('2024-01-01'))
    )
  );

// Joins
const usersWithPosts = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId));
```

**Prisma** (Fluent API):
```typescript
// Simple query
const user = await prisma.user.findUnique({ where: { id: 1 } });

// Complex filtering
const results = await prisma.user.findMany({
  where: {
    email: { endsWith: '@example.com' },
    createdAt: { gt: new Date('2024-01-01') },
  },
});

// Relations
const usersWithPosts = await prisma.user.findMany({
  include: { posts: true },
});
```

### Migrations

**Drizzle** (SQL-based):
```bash
# Generate migration
npx drizzle-kit generate

# Output: drizzle/0000_migration.sql
# CREATE TABLE "users" (
#   "id" serial PRIMARY KEY,
#   "email" text NOT NULL UNIQUE
# );

# Apply migration
npx drizzle-kit migrate
```

**Prisma** (Declarative):
```bash
# Generate and apply migration
npx prisma migrate dev --name add_users

# Prisma compares schema.prisma to database
# Generates SQL automatically
# Applies migration
```

### Type Generation

**Drizzle** (Inferred):
```typescript
// Types are inferred at compile time
type User = typeof users.$inferSelect;
type NewUser = typeof users.$inferInsert;

// Immediate feedback in IDE
const allUsers: User[] = await db.select().from(users);
```

**Prisma** (Generated):
```typescript
// Types generated after schema change
// Run: npx prisma generate

import { User, Post } from '@prisma/client';

const user: User | null = await prisma.user.findUnique({ where: { id: 1 } });
```

### Raw SQL

**Drizzle** (First-class):
```typescript
import { sql } from 'drizzle-orm';

// Tagged template with type safety
const result = await db.execute(
  sql`SELECT * FROM ${users} WHERE ${users.email} = ${email}`
);

// Mix ORM and raw SQL
const customQuery = await db
  .select({
    user: users,
    postCount: sql<number>`COUNT(${posts.id})`,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId))
  .groupBy(users.id);
```

**Prisma** (Limited):
```typescript
// Raw query (loses type safety)
const result = await prisma.$queryRaw`
  SELECT * FROM users WHERE email = ${email}
`;

// Typed raw query (manual type annotation)
const users = await prisma.$queryRaw<User[]>`
  SELECT * FROM users
`;
```

## Performance Benchmarks

### Query Execution Time (1000 queries)

| Operation | Drizzle | Prisma | Difference |
|-----------|---------|--------|------------|
| findUnique | 1.2s | 3.1s | **2.6x faster** |
| findMany (10 rows) | 1.5s | 3.8s | **2.5x faster** |
| findMany (100 rows) | 2.1s | 5.2s | **2.5x faster** |
| create | 1.8s | 4.1s | **2.3x faster** |
| update | 1.7s | 3.9s | **2.3x faster** |

### Bundle Size Impact

```bash
# Next.js production build

# With Drizzle
├─ Client (First Load JS)
│  └─ pages/index.js: 85 KB (+35KB Drizzle)

# With Prisma
├─ Client (First Load JS)
│  └─ pages/index.js: 280 KB (+230KB Prisma)
```

### Cold Start Times (AWS Lambda)

| Database | Drizzle | Prisma |
|----------|---------|--------|
| PostgreSQL | ~50ms | ~300ms |
| MySQL | ~45ms | ~280ms |
| SQLite | ~10ms | ~150ms |

## Migration from Prisma to Drizzle

### Step 1: Install Drizzle

```bash
npm install drizzle-orm
npm install -D drizzle-kit

# Keep Prisma temporarily
# npm uninstall prisma @prisma/client
```

### Step 2: Introspect Existing Database

```typescript
// drizzle.config.ts
import type { Config } from 'drizzle-kit';

export default {
  schema: './db/schema.ts',
  out: './drizzle',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

```bash
# Generate Drizzle schema from existing database
npx drizzle-kit introspect
```

### Step 3: Convert Queries

**Prisma**:
```typescript
// Before (Prisma)
const users = await prisma.user.findMany({
  where: { email: { contains: 'example.com' } },
  include: { posts: true },
  orderBy: { createdAt: 'desc' },
  take: 10,
});
```

**Drizzle**:
```typescript
// After (Drizzle)
import { like, desc } from 'drizzle-orm';

const results = await db.query.users.findMany({
  where: like(users.email, '%example.com%'),
  with: { posts: true },
  orderBy: [desc(users.createdAt)],
  limit: 10,
});

// Or SQL-style
const rows = await db
  .select()
  .from(users)
  .where(like(users.email, '%example.com%'))
  .orderBy(desc(users.createdAt))
  .limit(10);
```

### Step 4: Conversion Patterns

```typescript
// Prisma → Drizzle mapping

// findUnique
await prisma.user.findUnique({ where: { id: 1 } });
await db.select().from(users).where(eq(users.id, 1));

// findMany with filters
await prisma.user.findMany({ where: { role: 'admin' } });
await db.select().from(users).where(eq(users.role, 'admin'));

// create
await prisma.user.create({ data: { email: 'user@example.com' } });
await db.insert(users).values({ email: 'user@example.com' }).returning();

// update
await prisma.user.update({ where: { id: 1 }, data: { name: 'John' } });
await db.update(users).set({ name: 'John' }).where(eq(users.id, 1));

// delete
await prisma.user.delete({ where: { id: 1 } });
await db.delete(users).where(eq(users.id, 1));

// count
await prisma.user.count();
await db.select({ count: count() }).from(users);

// aggregate
await prisma.post.aggregate({ _avg: { views: true } });
await db.select({ avg: avg(posts.views) }).from(posts);
```

### Step 5: Test & Remove Prisma

```bash
# Run tests with Drizzle
npm test

# Remove Prisma when confident
npm uninstall prisma @prisma/client
rm -rf prisma/
```

## Decision Matrix

| Requirement | Drizzle | Prisma |
|-------------|---------|--------|
| Need minimal bundle size | ✅ | ❌ |
| Edge runtime deployment | ✅ | ⚠️ |
| Team unfamiliar with SQL | ❌ | ✅ |
| Complex raw SQL queries | ✅ | ❌ |
| Rapid prototyping | ⚠️ | ✅ |
| Type-safe migrations | ✅ | ✅ |
| Performance critical | ✅ | ❌ |
| Mature ecosystem | ⚠️ | ✅ |
| First-class TypeScript | ✅ | ✅ |
| Zero dependencies | ✅ | ❌ |

## Hybrid Approach

You can use both in the same project:

```typescript
// Use Drizzle for performance-critical paths
import { db as drizzleDb } from './lib/drizzle';

export async function GET() {
  const rows = await drizzleDb.select().from(users);
  return Response.json(rows);
}

// Use Prisma for admin dashboards (less performance-critical)
import { prisma } from './lib/prisma';

export async function getStaticProps() {
  const stats = await prisma.user.aggregate({
    _count: true,
    _avg: { posts: true },
  });
  return { props: { stats } };
}
```

## Community & Resources

### Drizzle
- Docs: [orm.drizzle.team](https://orm.drizzle.team)
- Discord: [drizzle.team/discord](https://drizzle.team/discord)
- GitHub: [drizzle-team/drizzle-orm](https://github.com/drizzle-team/drizzle-orm)

### Prisma
- Docs: [prisma.io/docs](https://prisma.io/docs)
- Discord: [pris.ly/discord](https://pris.ly/discord)
- GitHub: [prisma/prisma](https://github.com/prisma/prisma)

## Final Recommendation

**Choose Drizzle for:**
- Greenfield projects prioritizing performance
- Edge/serverless applications
- Teams comfortable with SQL
- Minimal bundle size requirements

**Choose Prisma for:**
- Established teams with Prisma experience
- Rapid MVP development
- Teams new to databases
- Reliance on the Prisma ecosystem (Nexus, etc.)

**Consider migrating when:**
- Performance becomes a bottleneck
- Bundle size impacts user experience
- Edge runtime deployment is needed
- Team SQL proficiency increases
75 .agents/skills/fastify-best-practices/SKILL.md Normal file
@@ -0,0 +1,75 @@
---
name: fastify-best-practices
description: "Guides development of Fastify Node.js backend servers and REST APIs using TypeScript or JavaScript. Use when building, configuring, or debugging a Fastify application — including defining routes, implementing plugins, setting up JSON Schema validation, handling errors, optimising performance, managing authentication, configuring CORS and security headers, integrating databases, working with WebSockets, and deploying to production. Covers the full Fastify request lifecycle (hooks, serialization, logging with Pino) and TypeScript integration via strip types. Trigger terms: Fastify, Node.js server, REST API, API routes, backend framework, fastify.config, server.ts, app.ts."
metadata:
  tags: fastify, nodejs, typescript, backend, api, server, http
---

## When to use

Use this skill when you need to:
- Develop backend applications using Fastify
- Implement Fastify plugins and route handlers
- Get guidance on Fastify architecture and patterns
- Use TypeScript with Fastify (strip types)
- Implement testing with Fastify's inject method
- Configure validation, serialization, and error handling

## Quick Start

A minimal, runnable Fastify server to get started immediately:

```ts
import Fastify from 'fastify'

const app = Fastify({ logger: true })

app.get('/health', async (request, reply) => {
  return { status: 'ok' }
})

const start = async () => {
  await app.listen({ port: 3000, host: '0.0.0.0' })
}
start()
```

## Recommended Reading Order for Common Scenarios

- **New to Fastify?** Start with `plugins.md` → `routes.md` → `schemas.md`
- **Adding authentication:** `plugins.md` → `hooks.md` → `authentication.md`
- **Improving performance:** `schemas.md` → `serialization.md` → `performance.md`
- **Setting up testing:** `routes.md` → `testing.md`
- **Going to production:** `logging.md` → `configuration.md` → `deployment.md`

## How to use

Read individual rule files for detailed explanations and code examples:

- [rules/plugins.md](rules/plugins.md) - Plugin development and encapsulation
- [rules/routes.md](rules/routes.md) - Route organization and handlers
- [rules/schemas.md](rules/schemas.md) - JSON Schema validation
- [rules/error-handling.md](rules/error-handling.md) - Error handling patterns
- [rules/hooks.md](rules/hooks.md) - Hooks and request lifecycle
- [rules/authentication.md](rules/authentication.md) - Authentication and authorization
- [rules/testing.md](rules/testing.md) - Testing with inject()
- [rules/performance.md](rules/performance.md) - Performance optimization
- [rules/logging.md](rules/logging.md) - Logging with Pino
- [rules/typescript.md](rules/typescript.md) - TypeScript integration
- [rules/decorators.md](rules/decorators.md) - Decorators and extensions
- [rules/content-type.md](rules/content-type.md) - Content type parsing
- [rules/serialization.md](rules/serialization.md) - Response serialization
- [rules/cors-security.md](rules/cors-security.md) - CORS and security headers
- [rules/websockets.md](rules/websockets.md) - WebSocket support
- [rules/database.md](rules/database.md) - Database integration patterns
- [rules/configuration.md](rules/configuration.md) - Application configuration
- [rules/deployment.md](rules/deployment.md) - Production deployment
- [rules/http-proxy.md](rules/http-proxy.md) - HTTP proxying and reply.from()

## Core Principles

- **Encapsulation**: Fastify's plugin system provides automatic encapsulation
- **Schema-first**: Define schemas for validation and serialization
- **Performance**: Fastify is optimized for speed; use its features correctly
- **Async/await**: All handlers and hooks support async functions
- **Minimal dependencies**: Prefer Fastify's built-in features and official plugins
521 .agents/skills/fastify-best-practices/rules/authentication.md Normal file
@@ -0,0 +1,521 @@
---
|
||||
name: authentication
|
||||
description: Authentication and authorization patterns in Fastify
|
||||
metadata:
|
||||
tags: auth, jwt, session, oauth, security, authorization
|
||||
---
|
||||
|
||||
# Authentication and Authorization
|
||||
|
||||
## JWT Authentication with @fastify/jwt
|
||||
|
||||
Use `@fastify/jwt` for JSON Web Token authentication:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
import fastifyJwt from '@fastify/jwt';
|
||||
|
||||
const app = Fastify();
|
||||
|
||||
app.register(fastifyJwt, {
|
||||
secret: process.env.JWT_SECRET,
|
||||
sign: {
|
||||
expiresIn: '1h',
|
||||
},
|
||||
});
|
||||
|
||||
// Decorate request with authentication method
|
||||
app.decorate('authenticate', async function (request, reply) {
|
||||
try {
|
||||
await request.jwtVerify();
|
||||
} catch (err) {
|
||||
reply.code(401).send({ error: 'Unauthorized' });
|
||||
}
|
||||
});
|
||||
|
||||
// Login route
|
||||
app.post('/login', {
|
||||
schema: {
|
||||
body: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
email: { type: 'string', format: 'email' },
|
||||
password: { type: 'string' },
|
||||
},
|
||||
required: ['email', 'password'],
|
||||
},
|
||||
},
|
||||
}, async (request, reply) => {
|
||||
const { email, password } = request.body;
|
||||
const user = await validateCredentials(email, password);
|
||||
|
||||
if (!user) {
|
||||
return reply.code(401).send({ error: 'Invalid credentials' });
|
||||
}
|
||||
|
||||
const token = app.jwt.sign({
|
||||
id: user.id,
|
||||
email: user.email,
|
||||
role: user.role,
|
||||
});
|
||||
|
||||
return { token };
|
||||
});
|
||||
|
||||
// Protected route
|
||||
app.get('/profile', {
|
||||
onRequest: [app.authenticate],
|
||||
}, async (request) => {
|
||||
return { user: request.user };
|
||||
});
|
||||
```

## Refresh Tokens

Implement refresh token rotation:

```typescript
import fastifyJwt from '@fastify/jwt';
import { randomBytes } from 'node:crypto';

app.register(fastifyJwt, {
  secret: process.env.JWT_SECRET,
  sign: {
    expiresIn: '15m', // Short-lived access tokens
  },
});

// Store refresh tokens (use Redis in production)
const refreshTokens = new Map<string, { userId: string; expires: number }>();

app.post('/auth/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await validateCredentials(email, password);

  if (!user) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  const accessToken = app.jwt.sign({ id: user.id, role: user.role });
  const refreshToken = randomBytes(32).toString('hex');

  refreshTokens.set(refreshToken, {
    userId: user.id,
    expires: Date.now() + 7 * 24 * 60 * 60 * 1000, // 7 days
  });

  return { accessToken, refreshToken };
});

app.post('/auth/refresh', async (request, reply) => {
  const { refreshToken } = request.body;
  const stored = refreshTokens.get(refreshToken);

  if (!stored || stored.expires < Date.now()) {
    refreshTokens.delete(refreshToken);
    return reply.code(401).send({ error: 'Invalid refresh token' });
  }

  // Delete old token (rotation)
  refreshTokens.delete(refreshToken);

  const user = await db.users.findById(stored.userId);
  const accessToken = app.jwt.sign({ id: user.id, role: user.role });
  const newRefreshToken = randomBytes(32).toString('hex');

  refreshTokens.set(newRefreshToken, {
    userId: user.id,
    expires: Date.now() + 7 * 24 * 60 * 60 * 1000,
  });

  return { accessToken, refreshToken: newRefreshToken };
});

app.post('/auth/logout', async (request, reply) => {
  const { refreshToken } = request.body;
  refreshTokens.delete(refreshToken);
  return { success: true };
});
```
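One gap in the in-memory store above: expired entries are only deleted when someone presents them, so the Map grows unboundedly under abuse. A periodic sweep keeps it bounded (a sketch; the Map shape matches the example, the sweep interval is an assumption - with Redis you would use key TTLs instead):

```typescript
// Prune expired refresh tokens so the in-memory store stays bounded.
const refreshTokens = new Map<string, { userId: string; expires: number }>();

function pruneExpiredTokens(now: number = Date.now()): number {
  let removed = 0;
  // Deleting the current entry while iterating a Map is safe in JS.
  for (const [token, entry] of refreshTokens) {
    if (entry.expires < now) {
      refreshTokens.delete(token);
      removed++;
    }
  }
  return removed;
}

// Run alongside the server, e.g. once a minute:
// setInterval(pruneExpiredTokens, 60_000);

refreshTokens.set('live', { userId: 'u1', expires: Date.now() + 60_000 });
refreshTokens.set('stale', { userId: 'u2', expires: Date.now() - 1 });
console.log(pruneExpiredTokens()); // 1
```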

## Role-Based Access Control

Implement RBAC with decorators:

```typescript
type Role = 'admin' | 'user' | 'moderator';

// Create authorization decorator
app.decorate('authorize', function (...allowedRoles: Role[]) {
  return async (request, reply) => {
    await request.jwtVerify();

    const userRole = request.user.role as Role;
    if (!allowedRoles.includes(userRole)) {
      return reply.code(403).send({
        error: 'Forbidden',
        message: `Role '${userRole}' is not authorized for this resource`,
      });
    }
  };
});

// Admin only route
app.get('/admin/users', {
  onRequest: [app.authorize('admin')],
}, async (request) => {
  return db.users.findAll();
});

// Admin or moderator
app.delete('/posts/:id', {
  onRequest: [app.authorize('admin', 'moderator')],
}, async (request) => {
  await db.posts.delete(request.params.id);
  return { deleted: true };
});
```

## Permission-Based Authorization

Fine-grained permission checks:

```typescript
interface Permission {
  resource: string;
  action: 'create' | 'read' | 'update' | 'delete';
}

const rolePermissions: Record<string, Permission[]> = {
  admin: [
    { resource: '*', action: 'create' },
    { resource: '*', action: 'read' },
    { resource: '*', action: 'update' },
    { resource: '*', action: 'delete' },
  ],
  user: [
    { resource: 'posts', action: 'create' },
    { resource: 'posts', action: 'read' },
    { resource: 'comments', action: 'create' },
    { resource: 'comments', action: 'read' },
  ],
};

function hasPermission(role: string, resource: string, action: string): boolean {
  const permissions = rolePermissions[role] || [];
  return permissions.some(
    (p) =>
      (p.resource === '*' || p.resource === resource) &&
      p.action === action
  );
}

app.decorate('checkPermission', function (resource: string, action: string) {
  return async (request, reply) => {
    await request.jwtVerify();

    if (!hasPermission(request.user.role, resource, action)) {
      return reply.code(403).send({
        error: 'Forbidden',
        message: `Not allowed to ${action} ${resource}`,
      });
    }
  };
});

// Usage
app.post('/posts', {
  onRequest: [app.checkPermission('posts', 'create')],
}, createPostHandler);

app.delete('/posts/:id', {
  onRequest: [app.checkPermission('posts', 'delete')],
}, deletePostHandler);
```
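The permission matrix above is plain data, so it can be sanity-checked in isolation; a trimmed copy of the table and lookup function shows the wildcard and unknown-role behavior:

```typescript
interface Permission {
  resource: string;
  action: 'create' | 'read' | 'update' | 'delete';
}

// Same shape as the matrix above, trimmed to two roles
const rolePermissions: Record<string, Permission[]> = {
  admin: [{ resource: '*', action: 'delete' }],
  user: [
    { resource: 'posts', action: 'create' },
    { resource: 'posts', action: 'read' },
  ],
};

function hasPermission(role: string, resource: string, action: string): boolean {
  const permissions = rolePermissions[role] || [];
  return permissions.some(
    (p) => (p.resource === '*' || p.resource === resource) && p.action === action,
  );
}

console.log(hasPermission('admin', 'comments', 'delete')); // true: '*' wildcard
console.log(hasPermission('user', 'posts', 'create')); // true
console.log(hasPermission('user', 'posts', 'delete')); // false
console.log(hasPermission('ghost', 'posts', 'read')); // false: unknown role falls back to []
```

The `|| []` fallback matters: an unrecognized role silently gets no permissions rather than throwing, which is the safe default for authorization code.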

## API Key / Bearer Token Authentication

Use `@fastify/bearer-auth` for API key and bearer token authentication:

```typescript
import bearerAuth from '@fastify/bearer-auth';

const validKeys = new Set([process.env.API_KEY]);

app.register(bearerAuth, {
  keys: validKeys,
  errorResponse: (err) => ({
    error: 'Unauthorized',
    message: 'Invalid API key',
  }),
});

// All routes are now protected
app.get('/api/data', async (request) => {
  return { data: [] };
});
```

For database-backed API keys with custom validation:

```typescript
import bearerAuth from '@fastify/bearer-auth';

app.register(bearerAuth, {
  auth: async (key, request) => {
    const apiKey = await db.apiKeys.findByKey(key);

    if (!apiKey || !apiKey.active) {
      return false;
    }

    // Track usage (fire and forget)
    db.apiKeys.recordUsage(apiKey.id, {
      ip: request.ip,
      timestamp: new Date(),
    });

    request.apiKey = apiKey;
    return true;
  },
  errorResponse: (err) => ({
    error: 'Unauthorized',
    message: 'Invalid API key',
  }),
});
```

## OAuth 2.0 Integration

Integrate with OAuth providers using `@fastify/oauth2`:

```typescript
import fastifyOauth2 from '@fastify/oauth2';

app.register(fastifyOauth2, {
  name: 'googleOAuth2',
  scope: ['profile', 'email'],
  credentials: {
    client: {
      id: process.env.GOOGLE_CLIENT_ID,
      secret: process.env.GOOGLE_CLIENT_SECRET,
    },
  },
  startRedirectPath: '/auth/google',
  callbackUri: 'http://localhost:3000/auth/google/callback',
  discovery: {
    issuer: 'https://accounts.google.com',
  },
});

app.get('/auth/google/callback', async (request, reply) => {
  const { token } = await app.googleOAuth2.getAccessTokenFromAuthorizationCodeFlow(request);

  // Fetch user info from Google
  const userInfo = await fetch('https://www.googleapis.com/oauth2/v2/userinfo', {
    headers: { Authorization: `Bearer ${token.access_token}` },
  }).then((r) => r.json());

  // Find or create user
  let user = await db.users.findByEmail(userInfo.email);
  if (!user) {
    user = await db.users.create({
      email: userInfo.email,
      name: userInfo.name,
      provider: 'google',
      providerId: userInfo.id,
    });
  }

  // Generate JWT
  const jwt = app.jwt.sign({ id: user.id, role: user.role });

  // Redirect to frontend with token
  return reply.redirect(`/auth/success?token=${jwt}`);
});
```

## Session-Based Authentication

Use `@fastify/session` for session management:

```typescript
import fastifyCookie from '@fastify/cookie';
import fastifySession from '@fastify/session';
import RedisStore from 'connect-redis';
import { createClient } from 'redis';

const redisClient = createClient({ url: process.env.REDIS_URL });
await redisClient.connect();

app.register(fastifyCookie);
app.register(fastifySession, {
  secret: process.env.SESSION_SECRET,
  store: new RedisStore({ client: redisClient }),
  cookie: {
    secure: process.env.NODE_ENV === 'production',
    httpOnly: true,
    maxAge: 24 * 60 * 60 * 1000, // 1 day
  },
});

app.post('/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await validateCredentials(email, password);

  if (!user) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  request.session.userId = user.id;
  request.session.role = user.role;

  return { success: true };
});

app.decorate('requireSession', async function (request, reply) {
  if (!request.session.userId) {
    return reply.code(401).send({ error: 'Not authenticated' });
  }
});

app.get('/profile', {
  onRequest: [app.requireSession],
}, async (request) => {
  const user = await db.users.findById(request.session.userId);
  return { user };
});

app.post('/logout', async (request, reply) => {
  await request.session.destroy();
  return { success: true };
});
```

## Resource-Based Authorization

Check ownership of resources:

```typescript
app.decorate('checkOwnership', function (getResourceOwnerId: (request) => Promise<string>) {
  return async (request, reply) => {
    const ownerId = await getResourceOwnerId(request);

    if (ownerId !== request.user.id && request.user.role !== 'admin') {
      return reply.code(403).send({
        error: 'Forbidden',
        message: 'You do not own this resource',
      });
    }
  };
});

// Check post ownership
app.put('/posts/:id', {
  onRequest: [
    app.authenticate,
    app.checkOwnership(async (request) => {
      const post = await db.posts.findById(request.params.id);
      return post?.authorId;
    }),
  ],
}, updatePostHandler);

// Alternative: inline check
app.put('/posts/:id', {
  onRequest: [app.authenticate],
}, async (request, reply) => {
  const post = await db.posts.findById(request.params.id);

  if (!post) {
    return reply.code(404).send({ error: 'Post not found' });
  }

  if (post.authorId !== request.user.id && request.user.role !== 'admin') {
    return reply.code(403).send({ error: 'Forbidden' });
  }

  return db.posts.update(post.id, request.body);
});
```

## Password Hashing

Use secure password hashing with argon2:

```typescript
import { hash, verify } from '@node-rs/argon2';

async function hashPassword(password: string): Promise<string> {
  return hash(password, {
    memoryCost: 65536, // 64 MiB (unit is KiB)
    timeCost: 3, // iterations
    parallelism: 4, // lanes
  });
}

async function verifyPassword(hash: string, password: string): Promise<boolean> {
  return verify(hash, password);
}

app.post('/register', async (request, reply) => {
  const { email, password } = request.body;

  const hashedPassword = await hashPassword(password);
  const user = await db.users.create({
    email,
    password: hashedPassword,
  });

  reply.code(201);
  return { id: user.id, email: user.email };
});

app.post('/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await db.users.findByEmail(email);

  if (!user || !(await verifyPassword(user.password, password))) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  const token = app.jwt.sign({ id: user.id, role: user.role });
  return { token };
});
```
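argon2 is the recommended choice. Where a native dependency is unavailable, `node:crypto`'s built-in scrypt is a reasonable fallback; this is a sketch under that assumption, not a drop-in replacement for the argon2 code above (it uses scrypt's defaults, which you should tune like the argon2 parameters):

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from 'node:crypto';

// Sketch: salted scrypt hashing with a constant-time comparison.
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString('hex');
  const derived = scryptSync(password, salt, 64).toString('hex');
  return `${salt}:${derived}`; // store salt alongside the derived key
}

function verifyPassword(stored: string, password: string): boolean {
  const [salt, derived] = stored.split(':');
  const candidate = scryptSync(password, salt, 64);
  // timingSafeEqual avoids leaking how many leading bytes matched
  return timingSafeEqual(Buffer.from(derived, 'hex'), candidate);
}

const stored = hashPassword('correct horse');
console.log(verifyPassword(stored, 'correct horse')); // true
console.log(verifyPassword(stored, 'battery staple')); // false
```

Note the random per-password salt: hashing the same password twice yields different stored strings, which defeats precomputed-table attacks.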

## Rate Limiting for Auth Endpoints

Protect auth endpoints against brute-force attacks. **IMPORTANT: for production security, you MUST configure rate limiting with a Redis backend.** In-memory rate limiting keeps separate counters per instance, so in a distributed deployment the effective limit multiplies and can be bypassed.

```typescript
import fastifyRateLimit from '@fastify/rate-limit';
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

// Global rate limit with Redis backend
app.register(fastifyRateLimit, {
  max: 100,
  timeWindow: '1 minute',
  redis, // REQUIRED for production - ensures rate limiting works across all instances
});

// Stricter limit for auth endpoints
app.register(async function authRoutes(fastify) {
  await fastify.register(fastifyRateLimit, {
    max: 5,
    timeWindow: '1 minute',
    redis, // REQUIRED for production
    keyGenerator: (request) => {
      // Rate limit by IP + email combination
      const email = request.body?.email || '';
      return `${request.ip}:${email}`;
    },
  });

  fastify.post('/login', loginHandler);
  fastify.post('/register', registerHandler);
  fastify.post('/forgot-password', forgotPasswordHandler);
}, { prefix: '/auth' });
```
217
.agents/skills/fastify-best-practices/rules/configuration.md
Normal file
---
name: configuration
description: Application configuration in Fastify using env-schema
metadata:
  tags: configuration, environment, env, settings, env-schema
---

# Application Configuration

## Use env-schema for Configuration

**Always use `env-schema` for configuration validation.** It provides JSON Schema validation for environment variables with sensible defaults.

```typescript
import Fastify from 'fastify';
import envSchema from 'env-schema';
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  PORT: Type.Number({ default: 3000 }),
  HOST: Type.String({ default: '0.0.0.0' }),
  DATABASE_URL: Type.String(),
  JWT_SECRET: Type.String({ minLength: 32 }),
  LOG_LEVEL: Type.Union([
    Type.Literal('trace'),
    Type.Literal('debug'),
    Type.Literal('info'),
    Type.Literal('warn'),
    Type.Literal('error'),
    Type.Literal('fatal'),
  ], { default: 'info' }),
});

type Config = Static<typeof schema>;

const config = envSchema<Config>({
  schema,
  dotenv: true, // Load from .env file
});

const app = Fastify({
  logger: { level: config.LOG_LEVEL },
});

app.decorate('config', config);

declare module 'fastify' {
  interface FastifyInstance {
    config: Config;
  }
}

await app.listen({ port: config.PORT, host: config.HOST });
```
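For contrast, the hand-rolled coercion that `env-schema` eliminates looks something like this (a sketch; environment variables are always strings, so every numeric or length constraint needs manual parsing and checking):

```typescript
// Boilerplate env-schema replaces: manual coercion + validation.
function loadConfig(env: Record<string, string | undefined>) {
  const port = Number(env.PORT ?? 3000);
  if (!Number.isInteger(port) || port <= 0) {
    throw new Error(`Invalid PORT: ${env.PORT}`);
  }
  const jwtSecret = env.JWT_SECRET ?? '';
  if (jwtSecret.length < 32) {
    throw new Error('JWT_SECRET must be at least 32 characters');
  }
  return { PORT: port, HOST: env.HOST ?? '0.0.0.0', JWT_SECRET: jwtSecret };
}

const config = loadConfig({ PORT: '8080', JWT_SECRET: 'x'.repeat(32) });
console.log(config.PORT); // 8080 (coerced from string)
console.log(config.HOST); // '0.0.0.0' (default applied)
```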

## Configuration as Plugin

Encapsulate configuration in a plugin for reuse:

```typescript
import fp from 'fastify-plugin';
import envSchema from 'env-schema';
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  PORT: Type.Number({ default: 3000 }),
  HOST: Type.String({ default: '0.0.0.0' }),
  DATABASE_URL: Type.String(),
  JWT_SECRET: Type.String({ minLength: 32 }),
  LOG_LEVEL: Type.String({ default: 'info' }),
});

type Config = Static<typeof schema>;

declare module 'fastify' {
  interface FastifyInstance {
    config: Config;
  }
}

export default fp(async function configPlugin(fastify) {
  const config = envSchema<Config>({
    schema,
    dotenv: true,
  });

  fastify.decorate('config', config);
}, {
  name: 'config',
});
```

## Secrets Management

Handle secrets securely:

```typescript
// Never log secrets
const app = Fastify({
  logger: {
    level: config.LOG_LEVEL,
    redact: ['req.headers.authorization', '*.password', '*.secret', '*.apiKey'],
  },
});

// For production, use secret managers (AWS Secrets Manager, Vault, etc.)
// Pass secrets through environment variables - never commit them
```

## Feature Flags

Implement feature flags via environment variables:

```typescript
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  // ... other config
  FEATURE_NEW_DASHBOARD: Type.Boolean({ default: false }),
  FEATURE_BETA_API: Type.Boolean({ default: false }),
});

type Config = Static<typeof schema>;

const config = envSchema<Config>({ schema, dotenv: true });

// Use in routes
app.get('/dashboard', async (request) => {
  if (app.config.FEATURE_NEW_DASHBOARD) {
    return { version: 'v2', data: await getNewDashboardData() };
  }
  return { version: 'v1', data: await getOldDashboardData() };
});
```

## Anti-Patterns to Avoid

### NEVER use configuration files

```typescript
// ❌ NEVER DO THIS - configuration files are an antipattern
import config from './config/production.json';

// ❌ NEVER DO THIS - per-environment config files
const env = process.env.NODE_ENV || 'development';
const config = await import(`./config/${env}.js`);
```

Configuration files lead to:
- Security risks (secrets in files)
- Deployment complexity
- Environment drift
- Difficult secret rotation

### NEVER use per-environment configuration

```typescript
// ❌ NEVER DO THIS
const configs = {
  development: { logLevel: 'debug' },
  production: { logLevel: 'info' },
  test: { logLevel: 'silent' },
};
const config = configs[process.env.NODE_ENV];
```

Instead, use a single configuration source (environment variables) with sensible defaults. The environment controls the values, not conditional code.

### Use specific environment variables, not NODE_ENV

```typescript
// ❌ AVOID checking NODE_ENV
if (process.env.NODE_ENV === 'production') {
  // do something
}

// ✅ BETTER - use explicit feature flags or configuration
if (app.config.ENABLE_DETAILED_LOGGING) {
  // do something
}
```

## Dynamic Configuration

For configuration that needs to change without restart, fetch from an external service:

```typescript
interface DynamicConfig {
  rateLimit: number;
  maintenanceMode: boolean;
}

let dynamicConfig: DynamicConfig = {
  rateLimit: 100,
  maintenanceMode: false,
};

async function refreshConfig() {
  try {
    const newConfig = await fetchConfigFromService();
    dynamicConfig = newConfig;
    app.log.info('Configuration refreshed');
  } catch (error) {
    app.log.error({ err: error }, 'Failed to refresh configuration');
  }
}

// Refresh periodically
setInterval(refreshConfig, 60000);

// Use in hooks
app.addHook('onRequest', async (request, reply) => {
  if (dynamicConfig.maintenanceMode && !request.url.startsWith('/health')) {
    reply.code(503).send({ error: 'Service under maintenance' });
  }
});
```
387
.agents/skills/fastify-best-practices/rules/content-type.md
Normal file
---
name: content-type
description: Content type parsing in Fastify
metadata:
  tags: content-type, parsing, body, multipart, json
---

# Content Type Parsing

## Default Content Type Parsers

Fastify includes parsers for common content types:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Built-in parsers:
// - application/json
// - text/plain

app.post('/json', async (request) => {
  // request.body is the parsed JSON object
  return { received: request.body };
});

app.post('/text', async (request) => {
  // request.body is a string for text/plain
  return { text: request.body };
});
```

## Custom Content Type Parsers

Add parsers for additional content types:

```typescript
// Callback-style parser for application/x-www-form-urlencoded
app.addContentTypeParser(
  'application/x-www-form-urlencoded',
  { parseAs: 'string' },
  (request, body, done) => {
    const parsed = new URLSearchParams(body);
    done(null, Object.fromEntries(parsed));
  },
);

// Alternatively, the same parser in async style
// (register only one parser per content type)
app.addContentTypeParser(
  'application/x-www-form-urlencoded',
  { parseAs: 'string' },
  async (request, body) => {
    const parsed = new URLSearchParams(body);
    return Object.fromEntries(parsed);
  },
);
```
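The `URLSearchParams` conversion used by the parser above can be exercised on its own:

```typescript
// Same transformation the form-urlencoded parser applies to a raw body string
const body = 'name=Ada&email=ada%40example.com';
const parsed = Object.fromEntries(new URLSearchParams(body));

console.log(parsed); // { name: 'Ada', email: 'ada@example.com' }
```

Note that `Object.fromEntries` keeps only the last value for a repeated key (`tag=a&tag=b` yields `{ tag: 'b' }`); use `URLSearchParams.getAll` if repeated fields matter.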

## XML Parsing

Parse XML content:

```typescript
import { XMLParser } from 'fast-xml-parser';

const xmlParser = new XMLParser({
  ignoreAttributes: false,
  attributeNamePrefix: '@_',
});

app.addContentTypeParser(
  'application/xml',
  { parseAs: 'string' },
  async (request, body) => {
    return xmlParser.parse(body);
  },
);

app.addContentTypeParser(
  'text/xml',
  { parseAs: 'string' },
  async (request, body) => {
    return xmlParser.parse(body);
  },
);

app.post('/xml', async (request) => {
  // request.body is the parsed XML as a JavaScript object
  return { data: request.body };
});
```

## Multipart Form Data

Use `@fastify/multipart` for file uploads. **Configure these critical options:**

```typescript
import fastifyMultipart from '@fastify/multipart';

app.register(fastifyMultipart, {
  // CRITICAL: Always set explicit limits
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 1024 * 1024, // Max field value size (1MB)
    fields: 10, // Max number of non-file fields
    fileSize: 10 * 1024 * 1024, // Max file size (10MB)
    files: 5, // Max number of files
    headerPairs: 2000, // Max number of header pairs
    parts: 1000, // Max number of parts (fields + files)
  },
  // IMPORTANT: Throw on limit exceeded (default is to truncate silently!)
  throwFileSizeLimit: true,
  // Attach all fields to request.body for easier access
  attachFieldsToBody: true,
  // Only accept specific file types (security!)
  // onFile: async (part) => {
  //   if (!['image/jpeg', 'image/png'].includes(part.mimetype)) {
  //     throw new Error('Invalid file type');
  //   }
  // },
});

// Handle file upload
app.post('/upload', async (request, reply) => {
  const data = await request.file();

  if (!data) {
    return reply.code(400).send({ error: 'No file uploaded' });
  }

  // data.file is a stream
  const buffer = await data.toBuffer();

  return {
    filename: data.filename,
    mimetype: data.mimetype,
    size: buffer.length,
  };
});

// Handle multiple files
app.post('/upload-multiple', async (request) => {
  const files = [];

  for await (const part of request.files()) {
    const buffer = await part.toBuffer();
    files.push({
      filename: part.filename,
      mimetype: part.mimetype,
      size: buffer.length,
    });
  }

  return { files };
});

// Handle mixed form data
app.post('/form', async (request) => {
  const parts = request.parts();
  const fields: Record<string, string> = {};
  const files: Array<{ name: string; size: number }> = [];

  for await (const part of parts) {
    if (part.type === 'file') {
      const buffer = await part.toBuffer();
      files.push({ name: part.filename, size: buffer.length });
    } else {
      fields[part.fieldname] = part.value as string;
    }
  }

  return { fields, files };
});
```

## Stream Processing

Process the body as a stream for large payloads:

```typescript
import { pipeline } from 'node:stream/promises';
import { createWriteStream } from 'node:fs';

// Add a parser that returns the raw stream
app.addContentTypeParser(
  'application/octet-stream',
  async (request, payload) => {
    return payload; // Return stream directly
  },
);

app.post('/upload-stream', async (request, reply) => {
  const destination = createWriteStream('./upload.bin');

  await pipeline(request.body, destination);

  return { success: true };
});
```

## Custom JSON Parser

Replace the default JSON parser:

```typescript
// Remove default parser
app.removeContentTypeParser('application/json');

// Add custom parser with error handling
app.addContentTypeParser(
  'application/json',
  { parseAs: 'string' },
  async (request, body) => {
    try {
      return JSON.parse(body);
    } catch (error) {
      throw {
        statusCode: 400,
        code: 'INVALID_JSON',
        message: 'Invalid JSON payload',
      };
    }
  },
);
```

## Content Type with Parameters

Handle content types with parameters:

```typescript
// Match a content type with an explicit charset parameter
app.addContentTypeParser(
  'application/json; charset=utf-8',
  { parseAs: 'string' },
  async (request, body) => {
    return JSON.parse(body);
  },
);

// Use a regex for flexible matching (e.g. application/vnd.api+json)
app.addContentTypeParser(
  /^application\/.*\+json$/,
  { parseAs: 'string' },
  async (request, body) => {
    return JSON.parse(body);
  },
);
```
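The `+json` regex above targets structured-syntax suffix types (the `application/vnd.api+json` family), not plain `application/json`, which is worth verifying directly:

```typescript
const jsonSuffix = /^application\/.*\+json$/;

console.log(jsonSuffix.test('application/vnd.api+json')); // true
console.log(jsonSuffix.test('application/problem+json')); // true
console.log(jsonSuffix.test('application/json')); // false: no '+json' suffix
```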

## Catch-All Parser

Handle unknown content types:

```typescript
app.addContentTypeParser('*', async (request, payload) => {
  const chunks: Buffer[] = [];

  for await (const chunk of payload) {
    chunks.push(chunk);
  }

  const buffer = Buffer.concat(chunks);

  // Try to determine content type
  const contentType = request.headers['content-type'];

  if (contentType?.includes('json')) {
    return JSON.parse(buffer.toString('utf-8'));
  }

  if (contentType?.includes('text')) {
    return buffer.toString('utf-8');
  }

  return buffer;
});
```

## Body Limit Configuration

Configure body size limits:

```typescript
// Global limit
const app = Fastify({
  bodyLimit: 1048576, // 1MB
});

// Per-route limit
app.post('/large-upload', {
  bodyLimit: 52428800, // 50MB for this route
}, async (request) => {
  return { size: JSON.stringify(request.body).length };
});

// Per content type limit
app.addContentTypeParser('application/json', {
  parseAs: 'string',
  bodyLimit: 2097152, // 2MB for JSON
}, async (request, body) => {
  return JSON.parse(body);
});
```

## Protocol Buffers

Parse protobuf content:

```typescript
import protobuf from 'protobufjs';

const root = await protobuf.load('./schema.proto');
const MessageType = root.lookupType('package.MessageType');

app.addContentTypeParser(
  'application/x-protobuf',
  { parseAs: 'buffer' },
  async (request, body) => {
    const message = MessageType.decode(body);
    return MessageType.toObject(message);
  },
);
```

## Form Data with @fastify/formbody

Simple form parsing:

```typescript
import formbody from '@fastify/formbody';

app.register(formbody);

app.post('/form', async (request) => {
  // request.body is the parsed form data
  const { name, email } = request.body as { name: string; email: string };
  return { name, email };
});
```

## Content Negotiation

Handle different request formats:

```typescript
app.post('/data', async (request, reply) => {
  // Body is already parsed by the appropriate parser
  const data = request.body;

  // Respond based on Accept header
  const accept = request.headers.accept;

  if (accept?.includes('application/xml')) {
    reply.type('application/xml');
    return `<data>${JSON.stringify(data)}</data>`;
  }

  reply.type('application/json');
  return data;
});
```

## Validation After Parsing

Validate parsed content:

```typescript
app.post('/users', {
  schema: {
    body: {
      type: 'object',
      properties: {
        name: { type: 'string', minLength: 1 },
        email: { type: 'string', format: 'email' },
      },
      required: ['name', 'email'],
    },
  },
}, async (request) => {
  // Body is parsed AND validated
  return request.body;
});
```
445
.agents/skills/fastify-best-practices/rules/cors-security.md
Normal file
---
name: cors-security
description: CORS and security headers in Fastify
metadata:
  tags: cors, security, headers, helmet, csrf
---

# CORS and Security

## CORS with @fastify/cors

Enable Cross-Origin Resource Sharing:

```typescript
import Fastify from 'fastify';
import cors from '@fastify/cors';

const app = Fastify();

// Simple CORS - allow all origins
app.register(cors);

// Configured CORS
app.register(cors, {
  origin: ['https://example.com', 'https://app.example.com'],
  methods: ['GET', 'POST', 'PUT', 'DELETE'],
  allowedHeaders: ['Content-Type', 'Authorization'],
  exposedHeaders: ['X-Total-Count'],
  credentials: true,
  maxAge: 86400, // 24 hours
});
```

## Dynamic CORS Origin

Validate origins dynamically:

```typescript
app.register(cors, {
  origin: (origin, callback) => {
    // Allow requests with no origin (mobile apps, curl, etc.)
    if (!origin) {
      return callback(null, true);
    }

    // Check against allowed origins
    const allowedOrigins = [
      'https://example.com',
      'https://app.example.com',
      /\.example\.com$/,
    ];

    const isAllowed = allowedOrigins.some((allowed) => {
      if (allowed instanceof RegExp) {
        return allowed.test(origin);
      }
      return allowed === origin;
    });

    if (isAllowed) {
      callback(null, true);
    } else {
      callback(new Error('Not allowed by CORS'), false);
    }
  },
  credentials: true,
});
```
|
||||
|
||||
## Per-Route CORS
|
||||
|
||||
Configure CORS for specific routes:
|
||||
|
||||
```typescript
|
||||
app.register(cors, {
|
||||
origin: true, // Reflect request origin
|
||||
credentials: true,
|
||||
});
|
||||
|
||||
// Or disable CORS for specific routes
|
||||
app.route({
|
||||
method: 'GET',
|
||||
url: '/internal',
|
||||
config: {
|
||||
cors: false,
|
||||
},
|
||||
handler: async () => {
|
||||
return { internal: true };
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Security Headers with @fastify/helmet
|
||||
|
||||
Add security headers:
|
||||
|
||||
```typescript
|
||||
import helmet from '@fastify/helmet';
|
||||
|
||||
app.register(helmet, {
|
||||
contentSecurityPolicy: {
|
||||
directives: {
|
||||
defaultSrc: ["'self'"],
|
||||
scriptSrc: ["'self'", "'unsafe-inline'"],
|
||||
styleSrc: ["'self'", "'unsafe-inline'"],
|
||||
imgSrc: ["'self'", 'data:', 'https:'],
|
||||
connectSrc: ["'self'", 'https://api.example.com'],
|
||||
},
|
||||
},
|
||||
crossOriginEmbedderPolicy: false, // Disable if embedding external resources
|
||||
});
|
||||
```
|
||||
|
||||
## Configure Individual Headers
|
||||
|
||||
Fine-tune security headers:
|
||||
|
||||
```typescript
|
||||
app.register(helmet, {
|
||||
// Strict Transport Security
|
||||
hsts: {
|
||||
maxAge: 31536000, // 1 year
|
||||
includeSubDomains: true,
|
||||
preload: true,
|
||||
},
|
||||
|
||||
// Content Security Policy
|
||||
contentSecurityPolicy: {
|
||||
useDefaults: true,
|
||||
directives: {
|
||||
'script-src': ["'self'", 'https://trusted-cdn.com'],
|
||||
},
|
||||
},
|
||||
|
||||
// X-Frame-Options
|
||||
frameguard: {
|
||||
action: 'deny', // or 'sameorigin'
|
||||
},
|
||||
|
||||
// X-Content-Type-Options
|
||||
noSniff: true,
|
||||
|
||||
// X-XSS-Protection (legacy)
|
||||
xssFilter: true,
|
||||
|
||||
// Referrer-Policy
|
||||
referrerPolicy: {
|
||||
policy: 'strict-origin-when-cross-origin',
|
||||
},
|
||||
|
||||
// X-Permitted-Cross-Domain-Policies
|
||||
permittedCrossDomainPolicies: false,
|
||||
|
||||
// X-DNS-Prefetch-Control
|
||||
dnsPrefetchControl: {
|
||||
allow: false,
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Rate Limiting
|
||||
|
||||
Protect against abuse:
|
||||
|
||||
```typescript
|
||||
import rateLimit from '@fastify/rate-limit';
|
||||
|
||||
app.register(rateLimit, {
|
||||
max: 100,
|
||||
timeWindow: '1 minute',
|
||||
errorResponseBuilder: (request, context) => ({
|
||||
statusCode: 429,
|
||||
error: 'Too Many Requests',
|
||||
message: `Rate limit exceeded. Retry in ${context.after}`,
|
||||
retryAfter: context.after,
|
||||
}),
|
||||
});
|
||||
|
||||
// Per-route rate limit
|
||||
app.get('/expensive', {
|
||||
config: {
|
||||
rateLimit: {
|
||||
max: 10,
|
||||
timeWindow: '1 minute',
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
|
||||
// Skip rate limit for certain routes
|
||||
app.get('/health', {
|
||||
config: {
|
||||
rateLimit: false,
|
||||
},
|
||||
}, () => ({ status: 'ok' }));
|
||||
```
|
||||
|
||||
## Redis-Based Rate Limiting
|
||||
|
||||
Use Redis for distributed rate limiting:
|
||||
|
||||
```typescript
|
||||
import rateLimit from '@fastify/rate-limit';
|
||||
import Redis from 'ioredis';
|
||||
|
||||
const redis = new Redis(process.env.REDIS_URL);
|
||||
|
||||
app.register(rateLimit, {
|
||||
max: 100,
|
||||
timeWindow: '1 minute',
|
||||
redis,
|
||||
nameSpace: 'rate-limit:',
|
||||
keyGenerator: (request) => {
|
||||
// Rate limit by user ID if authenticated, otherwise by IP
|
||||
return request.user?.id || request.ip;
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## CSRF Protection
|
||||
|
||||
Protect against Cross-Site Request Forgery:
|
||||
|
||||
```typescript
|
||||
import fastifyCsrf from '@fastify/csrf-protection';
|
||||
import fastifyCookie from '@fastify/cookie';
|
||||
|
||||
app.register(fastifyCookie);
|
||||
app.register(fastifyCsrf, {
|
||||
cookieOpts: {
|
||||
signed: true,
|
||||
httpOnly: true,
|
||||
sameSite: 'strict',
|
||||
},
|
||||
});
|
||||
|
||||
// Generate token
|
||||
app.get('/csrf-token', async (request, reply) => {
|
||||
const token = reply.generateCsrf();
|
||||
return { token };
|
||||
});
|
||||
|
||||
// Protected route
|
||||
app.post('/transfer', {
|
||||
preHandler: app.csrfProtection,
|
||||
}, async (request) => {
|
||||
// CSRF token validated
|
||||
return { success: true };
|
||||
});
|
||||
```
|
||||
|
||||
## Custom Security Headers
|
||||
|
||||
Add custom headers:
|
||||
|
||||
```typescript
|
||||
app.addHook('onSend', async (request, reply) => {
|
||||
// Custom security headers
|
||||
reply.header('X-Request-ID', request.id);
|
||||
reply.header('X-Content-Type-Options', 'nosniff');
|
||||
reply.header('X-Frame-Options', 'DENY');
|
||||
reply.header('Permissions-Policy', 'geolocation=(), camera=()');
|
||||
});
|
||||
|
||||
// Per-route headers
|
||||
app.get('/download', async (request, reply) => {
|
||||
reply.header('Content-Disposition', 'attachment; filename="file.pdf"');
|
||||
reply.header('X-Download-Options', 'noopen');
|
||||
return reply.send(fileStream);
|
||||
});
|
||||
```
|
||||
|
||||
## Secure Cookies
|
||||
|
||||
Configure secure cookies:
|
||||
|
||||
```typescript
|
||||
import cookie from '@fastify/cookie';
|
||||
|
||||
app.register(cookie, {
|
||||
secret: process.env.COOKIE_SECRET,
|
||||
parseOptions: {
|
||||
httpOnly: true,
|
||||
secure: process.env.NODE_ENV === 'production',
|
||||
sameSite: 'strict',
|
||||
path: '/',
|
||||
maxAge: 3600, // 1 hour
|
||||
},
|
||||
});
|
||||
|
||||
// Set secure cookie
|
||||
app.post('/login', async (request, reply) => {
|
||||
const token = await createSession(request.body);
|
||||
|
||||
reply.setCookie('session', token, {
|
||||
httpOnly: true,
|
||||
secure: true,
|
||||
sameSite: 'strict',
|
||||
path: '/',
|
||||
maxAge: 86400,
|
||||
signed: true,
|
||||
});
|
||||
|
||||
return { success: true };
|
||||
});
|
||||
|
||||
// Read signed cookie
|
||||
app.get('/profile', async (request) => {
|
||||
const session = request.cookies.session;
|
||||
const unsigned = request.unsignCookie(session);
|
||||
|
||||
if (!unsigned.valid) {
|
||||
throw { statusCode: 401, message: 'Invalid session' };
|
||||
}
|
||||
|
||||
return { sessionId: unsigned.value };
|
||||
});
|
||||
```
|
||||
|
||||
## Request Validation Security
|
||||
|
||||
Validate and sanitize input:
|
||||
|
||||
```typescript
|
||||
// Schema-based validation protects against injection
|
||||
app.post('/users', {
|
||||
schema: {
|
||||
body: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
email: {
|
||||
type: 'string',
|
||||
format: 'email',
|
||||
maxLength: 254,
|
||||
},
|
||||
name: {
|
||||
type: 'string',
|
||||
minLength: 1,
|
||||
maxLength: 100,
|
||||
pattern: '^[a-zA-Z\\s]+$', // Only letters and spaces
|
||||
},
|
||||
},
|
||||
required: ['email', 'name'],
|
||||
additionalProperties: false,
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
```
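The constraints in that schema can be sanity-checked outside Fastify. A minimal sketch, assuming only that JSON Schema's `pattern` keyword compiles to an ordinary unanchored RegExp (which is why the schema anchors it with `^` and `$`):

```typescript
// Standalone check of the `name` rules from the schema above: length
// bounds plus the anchored letters-and-spaces pattern.
const namePattern = /^[a-zA-Z\s]+$/;

function isValidName(name: string): boolean {
  return name.length >= 1 && name.length <= 100 && namePattern.test(name);
}

console.log(isValidName('Ada Lovelace')); // true
console.log(isValidName('Robert"); DROP TABLE users;--')); // false
```

The injection-looking input fails not because anything scans for SQL, but because quotes, digits, and punctuation simply are not in the allowed character class; allow-listing is what makes schema validation a security control.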
## IP Filtering

Restrict access by IP:

```typescript
const allowedIps = new Set([
  '192.168.1.0/24',
  '10.0.0.0/8',
]);

app.addHook('onRequest', async (request, reply) => {
  if (request.url.startsWith('/admin')) {
    const clientIp = request.ip;

    if (!isIpAllowed(clientIp, allowedIps)) {
      reply.code(403).send({ error: 'Forbidden' });
    }
  }
});

function isIpAllowed(ip: string, allowed: Set<string>): boolean {
  // Implement IP/CIDR matching
  for (const range of allowed) {
    if (ipInRange(ip, range)) return true;
  }
  return false;
}
```
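The `ipInRange` helper is left as a stub above. One way to fill it in, as a sketch for IPv4 only (no IPv6, no input validation beyond dotted-quad parsing; production code would more likely use a library such as `ipaddr.js`):

```typescript
// Convert a dotted-quad IPv4 address to an unsigned 32-bit integer.
function ipToInt(ip: string): number {
  return ip.split('.').reduce((acc, octet) => (acc << 8) + parseInt(octet, 10), 0) >>> 0;
}

// Match an address against a plain address or a CIDR range like '10.0.0.0/8'.
function ipInRange(ip: string, range: string): boolean {
  if (!range.includes('/')) return ip === range;
  const [base, bitsStr] = range.split('/');
  const bits = parseInt(bitsStr, 10);
  // Build the network mask: the top `bits` bits set, the rest clear.
  const mask = bits === 0 ? 0 : (~0 << (32 - bits)) >>> 0;
  return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(base) & mask) >>> 0);
}

console.log(ipInRange('10.1.2.3', '10.0.0.0/8')); // true
console.log(ipInRange('192.168.2.5', '192.168.1.0/24')); // false
```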
## Trust Proxy

Configure for reverse proxy environments:

```typescript
const app = Fastify({
  trustProxy: true, // Trust X-Forwarded-* headers
});

// Or specific proxy configuration
const app = Fastify({
  trustProxy: ['127.0.0.1', '10.0.0.0/8'],
});

// Now request.ip returns the real client IP
app.get('/ip', async (request) => {
  return {
    ip: request.ip,
    ips: request.ips, // Array of all IPs in chain
  };
});
```

## HTTPS Redirect

Force HTTPS in production:

```typescript
app.addHook('onRequest', async (request, reply) => {
  if (
    process.env.NODE_ENV === 'production' &&
    request.headers['x-forwarded-proto'] !== 'https'
  ) {
    const httpsUrl = `https://${request.hostname}${request.url}`;
    reply.redirect(301, httpsUrl);
  }
});
```

## Security Best Practices Summary

```typescript
import Fastify from 'fastify';
import cors from '@fastify/cors';
import helmet from '@fastify/helmet';
import rateLimit from '@fastify/rate-limit';

const app = Fastify({
  trustProxy: true,
  bodyLimit: 1048576, // 1MB max body
});

// Security plugins
app.register(helmet);
app.register(cors, {
  origin: process.env.ALLOWED_ORIGINS?.split(','),
  credentials: true,
});
app.register(rateLimit, {
  max: 100,
  timeWindow: '1 minute',
});

// Validate all input with schemas
// Never expose internal errors in production
// Use parameterized queries for database
// Keep dependencies updated
```

320
.agents/skills/fastify-best-practices/rules/database.md
Normal file
---
name: database
description: Database integration with Fastify using official adapters
metadata:
  tags: database, postgres, mysql, mongodb, redis, sql
---

# Database Integration

## Use Official Fastify Database Adapters

Always use the official Fastify database plugins from the `@fastify` organization. They provide proper connection pooling, encapsulation, and integration with Fastify's lifecycle.

## PostgreSQL with @fastify/postgres

```typescript
import Fastify from 'fastify';
import fastifyPostgres from '@fastify/postgres';

const app = Fastify({ logger: true });

app.register(fastifyPostgres, {
  connectionString: process.env.DATABASE_URL,
});

// Use in routes
app.get('/users', async (request) => {
  const client = await app.pg.connect();
  try {
    const { rows } = await client.query('SELECT * FROM users');
    return rows;
  } finally {
    client.release();
  }
});

// Or use the pool directly for simple queries
app.get('/users/:id', async (request) => {
  const { id } = request.params;
  const { rows } = await app.pg.query(
    'SELECT * FROM users WHERE id = $1',
    [id],
  );
  return rows[0];
});

// Transactions
app.post('/transfer', async (request) => {
  const { fromId, toId, amount } = request.body;
  const client = await app.pg.connect();

  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId],
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId],
    );
    await client.query('COMMIT');
    return { success: true };
  } catch (error) {
    await client.query('ROLLBACK');
    throw error;
  } finally {
    client.release();
  }
});
```

## MySQL with @fastify/mysql

```typescript
import Fastify from 'fastify';
import fastifyMysql from '@fastify/mysql';

const app = Fastify({ logger: true });

app.register(fastifyMysql, {
  promise: true,
  connectionString: process.env.MYSQL_URL,
});

app.get('/users', async (request) => {
  const connection = await app.mysql.getConnection();
  try {
    const [rows] = await connection.query('SELECT * FROM users');
    return rows;
  } finally {
    connection.release();
  }
});
```

## MongoDB with @fastify/mongodb

```typescript
import Fastify from 'fastify';
import fastifyMongo from '@fastify/mongodb';

const app = Fastify({ logger: true });

app.register(fastifyMongo, {
  url: process.env.MONGODB_URL,
});

app.get('/users', async (request) => {
  const users = await app.mongo.db
    .collection('users')
    .find({})
    .toArray();
  return users;
});

app.get('/users/:id', async (request) => {
  const { id } = request.params;
  const user = await app.mongo.db
    .collection('users')
    .findOne({ _id: new app.mongo.ObjectId(id) });
  return user;
});

app.post('/users', async (request) => {
  const result = await app.mongo.db
    .collection('users')
    .insertOne(request.body);
  return { id: result.insertedId };
});
```

## Redis with @fastify/redis

```typescript
import Fastify from 'fastify';
import fastifyRedis from '@fastify/redis';

const app = Fastify({ logger: true });

app.register(fastifyRedis, {
  url: process.env.REDIS_URL,
});

// Caching example
app.get('/data/:key', async (request) => {
  const { key } = request.params;

  // Try cache first
  const cached = await app.redis.get(`cache:${key}`);
  if (cached) {
    return JSON.parse(cached);
  }

  // Fetch from database
  const data = await fetchFromDatabase(key);

  // Cache for 5 minutes
  await app.redis.setex(`cache:${key}`, 300, JSON.stringify(data));

  return data;
});
```

## Database as Plugin

Encapsulate database access in a plugin:

```typescript
// plugins/database.ts
import fp from 'fastify-plugin';
import fastifyPostgres from '@fastify/postgres';

export default fp(async function databasePlugin(fastify) {
  await fastify.register(fastifyPostgres, {
    connectionString: fastify.config.DATABASE_URL,
  });

  // Add health check
  fastify.decorate('checkDatabaseHealth', async () => {
    try {
      await fastify.pg.query('SELECT 1');
      return true;
    } catch {
      return false;
    }
  });
}, {
  name: 'database',
  dependencies: ['config'],
});
```

## Repository Pattern

Abstract database access with repositories:

```typescript
// repositories/user.repository.ts
import type { FastifyInstance } from 'fastify';

export interface User {
  id: string;
  email: string;
  name: string;
}

export function createUserRepository(app: FastifyInstance) {
  return {
    async findById(id: string): Promise<User | null> {
      const { rows } = await app.pg.query(
        'SELECT * FROM users WHERE id = $1',
        [id],
      );
      return rows[0] || null;
    },

    async findByEmail(email: string): Promise<User | null> {
      const { rows } = await app.pg.query(
        'SELECT * FROM users WHERE email = $1',
        [email],
      );
      return rows[0] || null;
    },

    async create(data: Omit<User, 'id'>): Promise<User> {
      const { rows } = await app.pg.query(
        'INSERT INTO users (email, name) VALUES ($1, $2) RETURNING *',
        [data.email, data.name],
      );
      return rows[0];
    },

    async update(id: string, data: Partial<User>): Promise<User | null> {
      const fields = Object.keys(data);
      const values = Object.values(data);
      const setClause = fields
        .map((f, i) => `${f} = $${i + 2}`)
        .join(', ');

      const { rows } = await app.pg.query(
        `UPDATE users SET ${setClause} WHERE id = $1 RETURNING *`,
        [id, ...values],
      );
      return rows[0] || null;
    },

    async delete(id: string): Promise<boolean> {
      const { rowCount } = await app.pg.query(
        'DELETE FROM users WHERE id = $1',
        [id],
      );
      return rowCount > 0;
    },
  };
}

// Usage in plugin
import fp from 'fastify-plugin';
import { createUserRepository } from './repositories/user.repository.js';

export default fp(async function repositoriesPlugin(fastify) {
  fastify.decorate('repositories', {
    users: createUserRepository(fastify),
  });
}, {
  name: 'repositories',
  dependencies: ['database'],
});
```
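The dynamic `SET` clause built inside `update()` is the only piece of that repository that assembles SQL from strings, so it is worth checking in isolation. A sketch of the same mapping as a standalone function:

```typescript
// The SET-clause builder from update() above, isolated. Placeholders start
// at $2 because $1 is reserved for the id in the WHERE clause.
// Note: field names are interpolated directly into SQL, so this is only
// safe when the keys come from trusted code, never from request input.
function buildSetClause(data: Record<string, unknown>): string {
  return Object.keys(data)
    .map((field, i) => `${field} = $${i + 2}`)
    .join(', ');
}

console.log(buildSetClause({ name: 'Ada', email: 'ada@example.com' }));
// name = $2, email = $3
```

Values still travel through the parameter array, so they are never interpolated; only the column names are, which is why the repository restricts `data` to `Partial<User>`.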
## Testing with Database

Use transactions for test isolation:

```typescript
import { describe, it, beforeEach, afterEach } from 'node:test';
import { build } from './app.js';

describe('User API', () => {
  let app;
  let client;

  beforeEach(async () => {
    app = await build();
    client = await app.pg.connect();
    await client.query('BEGIN');
  });

  afterEach(async () => {
    await client.query('ROLLBACK');
    client.release();
    await app.close();
  });

  it('should create a user', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: { email: 'test@example.com', name: 'Test' },
    });

    t.assert.equal(response.statusCode, 201);
  });
});
```

## Connection Pool Configuration

Configure connection pools appropriately:

```typescript
app.register(fastifyPostgres, {
  connectionString: process.env.DATABASE_URL,
  // Pool configuration
  max: 20, // Maximum pool size
  idleTimeoutMillis: 30000, // Close idle clients after 30s
  connectionTimeoutMillis: 5000, // Timeout for new connections
});
```

416
.agents/skills/fastify-best-practices/rules/decorators.md
Normal file
---
name: decorators
description: Decorators and request/reply extensions in Fastify
metadata:
  tags: decorators, extensions, customization, utilities
---

# Decorators and Extensions

## Understanding Decorators

Decorators add custom properties and methods to Fastify instances, requests, and replies:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Decorate the Fastify instance
app.decorate('utility', {
  formatDate: (date: Date) => date.toISOString(),
  generateId: () => crypto.randomUUID(),
});

// Use in routes
app.get('/example', async function (request, reply) {
  const id = this.utility.generateId();
  return { id, timestamp: this.utility.formatDate(new Date()) };
});
```

## Decorator Types

Three types of decorators for different contexts:

```typescript
// Instance decorator - available on fastify instance
app.decorate('config', { apiVersion: '1.0.0' });
app.decorate('db', databaseConnection);
app.decorate('cache', cacheClient);

// Request decorator - available on each request
app.decorateRequest('user', null); // Object property
app.decorateRequest('startTime', 0); // Primitive
app.decorateRequest('getData', function () { // Method
  return this.body;
});

// Reply decorator - available on each reply
app.decorateReply('sendError', function (code: number, message: string) {
  return this.code(code).send({ error: message });
});
app.decorateReply('success', function (data: unknown) {
  return this.send({ success: true, data });
});
```

## TypeScript Declaration Merging

Extend Fastify types for type safety:

```typescript
// Declare custom properties
declare module 'fastify' {
  interface FastifyInstance {
    config: {
      apiVersion: string;
      environment: string;
    };
    db: DatabaseClient;
    cache: CacheClient;
  }

  interface FastifyRequest {
    user: {
      id: string;
      email: string;
      roles: string[];
    } | null;
    startTime: number;
    requestId: string;
  }

  interface FastifyReply {
    sendError: (code: number, message: string) => void;
    success: (data: unknown) => void;
  }
}

// Register decorators
app.decorate('config', {
  apiVersion: '1.0.0',
  environment: process.env.NODE_ENV,
});

app.decorateRequest('user', null);
app.decorateRequest('startTime', 0);

app.decorateReply('sendError', function (code: number, message: string) {
  this.code(code).send({ error: message });
});
```

## Decorator Initialization

Initialize request/reply decorators in hooks:

```typescript
// Decorators with primitive defaults are copied
app.decorateRequest('startTime', 0);

// Initialize in hook
app.addHook('onRequest', async (request) => {
  request.startTime = Date.now();
});

// Object decorators need getter pattern for proper initialization
app.decorateRequest('context', null);

app.addHook('onRequest', async (request) => {
  request.context = {
    traceId: request.headers['x-trace-id'] || crypto.randomUUID(),
    clientIp: request.ip,
    userAgent: request.headers['user-agent'],
  };
});
```

## Dependency Injection with Decorators

Use decorators for dependency injection:

```typescript
import fp from 'fastify-plugin';

// Database plugin
export default fp(async function databasePlugin(fastify, options) {
  const db = await createDatabaseConnection(options.connectionString);

  fastify.decorate('db', db);

  fastify.addHook('onClose', async () => {
    await db.close();
  });
});

// User service plugin
export default fp(async function userServicePlugin(fastify) {
  // Depends on db decorator
  if (!fastify.hasDecorator('db')) {
    throw new Error('Database plugin must be registered first');
  }

  const userService = {
    findById: (id: string) => fastify.db.query('SELECT * FROM users WHERE id = $1', [id]),
    create: (data: CreateUserInput) => fastify.db.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [data.name, data.email]
    ),
  };

  fastify.decorate('userService', userService);
}, {
  dependencies: ['database-plugin'],
});

// Use in routes
app.get('/users/:id', async function (request) {
  const user = await this.userService.findById(request.params.id);
  return user;
});
```

## Request Context Pattern

Build rich request context:

```typescript
interface RequestContext {
  traceId: string;
  user: User | null;
  permissions: Set<string>;
  startTime: number;
  metadata: Map<string, unknown>;
}

declare module 'fastify' {
  interface FastifyRequest {
    ctx: RequestContext;
  }
}

app.decorateRequest('ctx', null);

app.addHook('onRequest', async (request) => {
  request.ctx = {
    traceId: request.headers['x-trace-id']?.toString() || crypto.randomUUID(),
    user: null,
    permissions: new Set(),
    startTime: Date.now(),
    metadata: new Map(),
  };
});

// Auth hook populates user
app.addHook('preHandler', async (request) => {
  const token = request.headers.authorization;
  if (token) {
    const user = await verifyToken(token);
    request.ctx.user = user;
    request.ctx.permissions = new Set(user.permissions);
  }
});

// Use in handlers
app.get('/profile', async (request, reply) => {
  if (!request.ctx.user) {
    return reply.code(401).send({ error: 'Unauthorized' });
  }

  if (!request.ctx.permissions.has('read:profile')) {
    return reply.code(403).send({ error: 'Forbidden' });
  }

  return request.ctx.user;
});
```

## Reply Helpers

Create consistent response methods:

```typescript
declare module 'fastify' {
  interface FastifyReply {
    ok: (data?: unknown) => void;
    created: (data: unknown) => void;
    noContent: () => void;
    badRequest: (message: string, details?: unknown) => void;
    unauthorized: (message?: string) => void;
    forbidden: (message?: string) => void;
    notFound: (resource?: string) => void;
    conflict: (message: string) => void;
    serverError: (message?: string) => void;
  }
}

app.decorateReply('ok', function (data?: unknown) {
  this.code(200).send(data ?? { success: true });
});

app.decorateReply('created', function (data: unknown) {
  this.code(201).send(data);
});

app.decorateReply('noContent', function () {
  this.code(204).send();
});

app.decorateReply('badRequest', function (message: string, details?: unknown) {
  this.code(400).send({
    statusCode: 400,
    error: 'Bad Request',
    message,
    details,
  });
});

app.decorateReply('unauthorized', function (message = 'Authentication required') {
  this.code(401).send({
    statusCode: 401,
    error: 'Unauthorized',
    message,
  });
});

app.decorateReply('notFound', function (resource = 'Resource') {
  this.code(404).send({
    statusCode: 404,
    error: 'Not Found',
    message: `${resource} not found`,
  });
});

// Usage
app.get('/users/:id', async (request, reply) => {
  const user = await db.users.findById(request.params.id);
  if (!user) {
    return reply.notFound('User');
  }
  return reply.ok(user);
});

app.post('/users', async (request, reply) => {
  const user = await db.users.create(request.body);
  return reply.created(user);
});
```

## Checking Decorators

Check if decorators exist before using:

```typescript
// Check at registration time
app.register(async function (fastify) {
  if (!fastify.hasDecorator('db')) {
    throw new Error('Database decorator required');
  }

  if (!fastify.hasRequestDecorator('user')) {
    throw new Error('User request decorator required');
  }

  if (!fastify.hasReplyDecorator('sendError')) {
    throw new Error('sendError reply decorator required');
  }

  // Safe to use decorators
});
```

## Decorator Encapsulation

Decorators respect encapsulation by default:

```typescript
app.register(async function pluginA(fastify) {
  fastify.decorate('pluginAUtil', () => 'A');

  fastify.get('/a', async function () {
    return this.pluginAUtil(); // Works
  });
});

app.register(async function pluginB(fastify) {
  // this.pluginAUtil is NOT available here (encapsulated)

  fastify.get('/b', async function () {
    // this.pluginAUtil() would be undefined
  });
});
```

Use `fastify-plugin` to share decorators:

```typescript
import fp from 'fastify-plugin';

export default fp(async function sharedDecorator(fastify) {
  fastify.decorate('sharedUtil', () => 'shared');
});

// Now available to parent and sibling plugins
```

## Functional Decorators

Create decorators that return functions:

```typescript
declare module 'fastify' {
  interface FastifyInstance {
    createValidator: <T>(schema: object) => (data: unknown) => T;
    createRateLimiter: (options: RateLimitOptions) => RateLimiter;
  }
}

app.decorate('createValidator', function <T>(schema: object) {
  const validate = ajv.compile(schema);
  return (data: unknown): T => {
    if (!validate(data)) {
      throw new ValidationError(validate.errors);
    }
    return data as T;
  };
});

// Usage
const validateUser = app.createValidator<User>(userSchema);

app.post('/users', async (request) => {
  const user = validateUser(request.body);
  return db.users.create(user);
});
```

## Async Decorator Initialization

Handle async initialization properly:

```typescript
import fp from 'fastify-plugin';

export default fp(async function asyncPlugin(fastify) {
  // Async initialization
  const connection = await createAsyncConnection();
  const cache = await initializeCache();

  fastify.decorate('asyncService', {
    connection,
    cache,
    query: async (sql: string) => connection.query(sql),
  });

  fastify.addHook('onClose', async () => {
    await connection.close();
    await cache.disconnect();
  });
});

// Plugin is fully initialized before routes execute
app.get('/data', async function () {
  return this.asyncService.query('SELECT * FROM data');
});
```

425
.agents/skills/fastify-best-practices/rules/deployment.md
Normal file
---
name: deployment
description: Production deployment for Fastify applications
metadata:
  tags: deployment, production, docker, kubernetes, scaling
---

# Production Deployment

## Graceful Shutdown with close-with-grace

Use `close-with-grace` for proper shutdown handling:

```typescript
import Fastify from 'fastify';
import closeWithGrace from 'close-with-grace';

const app = Fastify({ logger: true });

// Register plugins and routes
await app.register(import('./plugins/index.js'));
await app.register(import('./routes/index.js'));

// Graceful shutdown handler
closeWithGrace({ delay: 10000 }, async ({ signal, err }) => {
  if (err) {
    app.log.error({ err }, 'Server closing due to error');
  } else {
    app.log.info({ signal }, 'Server closing due to signal');
  }

  await app.close();
});

// Start server
await app.listen({
  port: parseInt(process.env.PORT || '3000', 10),
  host: '0.0.0.0',
});

app.log.info(`Server listening on ${app.server.address()}`);
```

## Health Check Endpoints

Implement comprehensive health checks:

```typescript
app.get('/health', async () => {
  return { status: 'ok', timestamp: new Date().toISOString() };
});

app.get('/health/live', async () => {
  return { status: 'ok' };
});

app.get('/health/ready', async (request, reply) => {
  const checks = {
    database: false,
    cache: false,
  };

  try {
    await app.db`SELECT 1`;
    checks.database = true;
  } catch {
    // Database not ready
  }

  try {
    await app.cache.ping();
    checks.cache = true;
  } catch {
    // Cache not ready
  }

  const allHealthy = Object.values(checks).every(Boolean);

  if (!allHealthy) {
    reply.code(503);
  }

  return {
    status: allHealthy ? 'ok' : 'degraded',
    checks,
    timestamp: new Date().toISOString(),
  };
});

// Detailed health for monitoring
app.get('/health/details', {
  preHandler: [app.authenticate, app.requireAdmin],
}, async () => {
  const memory = process.memoryUsage();

  return {
    status: 'ok',
    uptime: process.uptime(),
    memory: {
      heapUsed: Math.round(memory.heapUsed / 1024 / 1024),
      heapTotal: Math.round(memory.heapTotal / 1024 / 1024),
      rss: Math.round(memory.rss / 1024 / 1024),
    },
    version: process.env.APP_VERSION,
    nodeVersion: process.version,
  };
});
```

## Docker Configuration

Create an optimized Dockerfile:

```dockerfile
# Build stage
FROM node:22-alpine AS builder

WORKDIR /app

COPY package*.json ./
RUN npm ci --omit=dev

COPY . .

# Production stage
FROM node:22-alpine

WORKDIR /app

# Run as non-root user
RUN addgroup -g 1001 -S nodejs && \
    adduser -S nodejs -u 1001

# Copy from builder
COPY --from=builder --chown=nodejs:nodejs /app/node_modules ./node_modules
COPY --from=builder --chown=nodejs:nodejs /app/src ./src
COPY --from=builder --chown=nodejs:nodejs /app/package.json ./

USER nodejs

EXPOSE 3000

ENV NODE_ENV=production
ENV PORT=3000

# Health check
HEALTHCHECK --interval=30s --timeout=5s --start-period=5s --retries=3 \
  CMD wget --no-verbose --tries=1 --spider http://localhost:3000/health || exit 1

CMD ["node", "src/app.ts"]
```

```yaml
# docker-compose.yml
services:
  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - DATABASE_URL=postgres://user:pass@db:5432/app
      - JWT_SECRET=${JWT_SECRET}
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped

  db:
    image: postgres:16-alpine
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=app
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d app"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  pgdata:
```

## Kubernetes Deployment

Deploy to Kubernetes:

```yaml
# deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fastify-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: fastify-api
  template:
    metadata:
      labels:
        app: fastify-api
    spec:
      containers:
        - name: api
          image: my-registry/fastify-api:latest
          ports:
            - containerPort: 3000
          env:
            - name: NODE_ENV
              value: "production"
            - name: DATABASE_URL
              valueFrom:
                secretKeyRef:
                  name: api-secrets
                  key: database-url
          resources:
            requests:
              memory: "256Mi"
              cpu: "100m"
            limits:
              memory: "512Mi"
              cpu: "500m"
          livenessProbe:
            httpGet:
              path: /health/live
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 10
          readinessProbe:
            httpGet:
              path: /health/ready
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 5
          lifecycle:
            preStop:
              exec:
                command: ["/bin/sh", "-c", "sleep 5"]
---
apiVersion: v1
kind: Service
metadata:
  name: fastify-api
spec:
  selector:
    app: fastify-api
  ports:
    - port: 80
      targetPort: 3000
  type: ClusterIP
```

## Production Logger Configuration

Configure logging for production:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: {
    level: process.env.LOG_LEVEL || 'info',
    // JSON output for log aggregation
    formatters: {
      level: (label) => ({ level: label }),
      bindings: (bindings) => ({
        pid: bindings.pid,
        hostname: bindings.hostname,
        service: 'fastify-api',
        version: process.env.APP_VERSION,
      }),
    },
    timestamp: () => `,"time":"${new Date().toISOString()}"`,
    // Redact sensitive data
    redact: {
      paths: [
        'req.headers.authorization',
        'req.headers.cookie',
        '*.password',
        '*.token',
        '*.secret',
      ],
      censor: '[REDACTED]',
    },
  },
});
```

## Request Timeouts

Configure appropriate timeouts:

```typescript
const app = Fastify({
  connectionTimeout: 30000, // 30s connection timeout
  keepAliveTimeout: 72000,  // 72s keep-alive (longer than ALB's 60s)
  requestTimeout: 30000,    // 30s request timeout
  bodyLimit: 1048576,       // 1MB body limit
});

// Per-route timeout
app.get('/long-operation', {
  config: {
    timeout: 60000, // 60s for this route
  },
}, longOperationHandler);
```

## Trust Proxy Settings

Configure for load balancers:

```typescript
const app = Fastify({
  // Trust first proxy (load balancer)
  trustProxy: true,
});

// Alternatively, trust specific proxies:
//   trustProxy: ['127.0.0.1', '10.0.0.0/8']
// Or a number of proxy hops to trust:
//   trustProxy: 1

// Now request.ip returns the real client IP
```

## Static File Serving

Serve static files efficiently. **Always use `import.meta.dirname` as the base path**, never `process.cwd()`:

```typescript
import fastifyStatic from '@fastify/static';
import { join } from 'node:path';

app.register(fastifyStatic, {
  root: join(import.meta.dirname, '..', 'public'),
  prefix: '/static/',
  maxAge: '1d',
  immutable: true,
  etag: true,
  lastModified: true,
});
```

## Compression

Enable response compression:

```typescript
import fastifyCompress from '@fastify/compress';

app.register(fastifyCompress, {
  global: true,
  threshold: 1024, // Only compress responses larger than 1KB
  encodings: ['gzip', 'deflate'],
});
```

## Metrics and Monitoring

Expose Prometheus metrics:

```typescript
import { register, collectDefaultMetrics, Counter, Histogram } from 'prom-client';

collectDefaultMetrics();

const httpRequestDuration = new Histogram({
  name: 'http_request_duration_seconds',
  help: 'Duration of HTTP requests in seconds',
  labelNames: ['method', 'route', 'status'],
  buckets: [0.01, 0.05, 0.1, 0.5, 1, 5],
});

const httpRequestTotal = new Counter({
  name: 'http_requests_total',
  help: 'Total number of HTTP requests',
  labelNames: ['method', 'route', 'status'],
});

app.addHook('onResponse', (request, reply, done) => {
  const route = request.routeOptions.url || request.url;
  const labels = {
    method: request.method,
    route,
    status: reply.statusCode,
  };

  httpRequestDuration.observe(labels, reply.elapsedTime / 1000);
  httpRequestTotal.inc(labels);
  done();
});

app.get('/metrics', async (request, reply) => {
  reply.header('Content-Type', register.contentType);
  return register.metrics();
});
```

## Zero-Downtime Deployments

Support rolling updates:

```typescript
import closeWithGrace from 'close-with-grace';

// Stop accepting new connections gracefully
closeWithGrace({ delay: 30000 }, async ({ signal }) => {
  app.log.info({ signal }, 'Received shutdown signal');

  // Stop accepting new connections;
  // existing connections continue to be served.
  // In-flight requests get up to `delay` ms to finish.
  await app.close();

  app.log.info('Server closed');
});
```
412
.agents/skills/fastify-best-practices/rules/error-handling.md
Normal file
---
name: error-handling
description: Error handling patterns in Fastify
metadata:
  tags: errors, exceptions, error-handler, validation
---

# Error Handling in Fastify

## Default Error Handler

Fastify has a built-in error handler. Thrown errors automatically become HTTP responses:

```typescript
import Fastify from 'fastify';

const app = Fastify({ logger: true });

app.get('/users/:id', async (request) => {
  const user = await findUser(request.params.id);
  if (!user) {
    // Throwing an error with statusCode sets the response status
    const error = new Error('User not found');
    error.statusCode = 404;
    throw error;
  }
  return user;
});
```

## Custom Error Classes

Use `@fastify/error` for creating typed errors:

```typescript
import createError from '@fastify/error';

const NotFoundError = createError('NOT_FOUND', '%s not found', 404);
const UnauthorizedError = createError('UNAUTHORIZED', 'Authentication required', 401);
const ForbiddenError = createError('FORBIDDEN', 'Access denied: %s', 403);
const ValidationError = createError('VALIDATION_ERROR', '%s', 400);
const ConflictError = createError('CONFLICT', '%s already exists', 409);

// Usage
app.get('/users/:id', async (request) => {
  const user = await findUser(request.params.id);
  if (!user) {
    throw new NotFoundError('User');
  }
  return user;
});

app.post('/users', async (request) => {
  const exists = await userExists(request.body.email);
  if (exists) {
    throw new ConflictError('Email');
  }
  return createUser(request.body);
});
```

## Custom Error Handler

Implement a centralized error handler:

```typescript
import Fastify from 'fastify';
import type { FastifyError, FastifyRequest, FastifyReply } from 'fastify';

const app = Fastify({ logger: true });

app.setErrorHandler((error: FastifyError, request: FastifyRequest, reply: FastifyReply) => {
  // Log the error
  request.log.error({ err: error }, 'Request error');

  // Handle validation errors
  if (error.validation) {
    return reply.code(400).send({
      statusCode: 400,
      error: 'Bad Request',
      message: 'Validation failed',
      details: error.validation,
    });
  }

  // Handle known errors with status codes
  const statusCode = error.statusCode ?? 500;
  const code = error.code ?? 'INTERNAL_ERROR';

  // Don't expose internal error details in production
  const message = statusCode >= 500 && process.env.NODE_ENV === 'production'
    ? 'Internal Server Error'
    : error.message;

  return reply.code(statusCode).send({
    statusCode,
    error: code,
    message,
  });
});
```

## Error Response Schema

Define consistent error response schemas:

```typescript
app.addSchema({
  $id: 'httpError',
  type: 'object',
  properties: {
    statusCode: { type: 'integer' },
    error: { type: 'string' },
    message: { type: 'string' },
    details: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          field: { type: 'string' },
          message: { type: 'string' },
        },
      },
    },
  },
  required: ['statusCode', 'error', 'message'],
});

// Use in route schemas
app.get('/users/:id', {
  schema: {
    params: {
      type: 'object',
      properties: { id: { type: 'string' } },
      required: ['id'],
    },
    response: {
      200: { $ref: 'user#' },
      404: { $ref: 'httpError#' },
      500: { $ref: 'httpError#' },
    },
  },
}, handler);
```

## Reply Helpers with @fastify/sensible

Use `@fastify/sensible` for standard HTTP errors:

```typescript
import fastifySensible from '@fastify/sensible';

app.register(fastifySensible);

app.get('/users/:id', async (request, reply) => {
  const user = await findUser(request.params.id);
  if (!user) {
    return reply.notFound('User not found');
  }
  if (!hasAccess(request.user, user)) {
    return reply.forbidden('You cannot access this user');
  }
  return user;
});

// Available methods:
// reply.badRequest(message?)
// reply.unauthorized(message?)
// reply.forbidden(message?)
// reply.notFound(message?)
// reply.methodNotAllowed(message?)
// reply.conflict(message?)
// reply.gone(message?)
// reply.unprocessableEntity(message?)
// reply.tooManyRequests(message?)
// reply.internalServerError(message?)
// reply.notImplemented(message?)
// reply.badGateway(message?)
// reply.serviceUnavailable(message?)
// reply.gatewayTimeout(message?)
```

## Async Error Handling

Errors in async handlers are automatically caught:

```typescript
// Errors are automatically caught and passed to the error handler
app.get('/users', async (request) => {
  const users = await db.users.findAll(); // If this throws, the error handler catches it
  return users;
});

// Explicit error handling for custom logic
app.get('/users/:id', async (request, reply) => {
  try {
    const user = await db.users.findById(request.params.id);
    if (!user) {
      return reply.code(404).send({ error: 'User not found' });
    }
    return user;
  } catch (error) {
    // Transform database errors
    if (error.code === 'CONNECTION_ERROR') {
      request.log.error({ err: error }, 'Database connection failed');
      return reply.code(503).send({ error: 'Service temporarily unavailable' });
    }
    throw error; // Re-throw for the error handler
  }
});
```

## Hook Error Handling

Errors in hooks are handled the same way:

```typescript
app.addHook('onRequest', async (request, reply) => {
  const token = request.headers.authorization;
  if (!token) {
    // This error goes to the error handler
    throw new UnauthorizedError();
  }

  try {
    request.user = await verifyToken(token);
  } catch (error) {
    throw new UnauthorizedError();
  }
});

// Or use reply to send the response directly
app.addHook('onRequest', async (request, reply) => {
  if (!request.headers.authorization) {
    reply.code(401).send({ error: 'Unauthorized' });
    return; // Must return to stop processing
  }
});
```

## Not Found Handler

Customize the 404 response:

```typescript
app.setNotFoundHandler(async (request, reply) => {
  return reply.code(404).send({
    statusCode: 404,
    error: 'Not Found',
    message: `Route ${request.method} ${request.url} not found`,
  });
});

// With hooks attached to the 404 handler
app.setNotFoundHandler({
  preValidation: async (request, reply) => {
    // Pre-validation hook for the 404 handler
  },
}, async (request, reply) => {
  return reply.code(404).send({ error: 'Not Found' });
});
```

## Error Wrapping

Wrap external errors with context:

```typescript
import createError from '@fastify/error';

const DatabaseError = createError('DATABASE_ERROR', 'Database operation failed: %s', 500);
const ExternalServiceError = createError('EXTERNAL_SERVICE_ERROR', 'External service failed: %s', 502);

app.get('/users/:id', async (request) => {
  try {
    return await db.users.findById(request.params.id);
  } catch (error) {
    throw new DatabaseError(error.message, { cause: error });
  }
});

app.get('/weather', async (request) => {
  try {
    return await weatherApi.fetch(request.query.city);
  } catch (error) {
    throw new ExternalServiceError(error.message, { cause: error });
  }
});
```

## Validation Error Customization

Customize the validation error format:

```typescript
app.setErrorHandler((error, request, reply) => {
  if (error.validation) {
    const details = error.validation.map((err) => {
      const field = err.instancePath
        ? err.instancePath.slice(1).replace(/\//g, '.')
        : err.params?.missingProperty || 'unknown';

      return {
        field,
        message: err.message,
        value: err.data,
      };
    });

    return reply.code(400).send({
      statusCode: 400,
      error: 'Validation Error',
      message: `Invalid ${error.validationContext}: ${details.map(d => d.field).join(', ')}`,
      details,
    });
  }

  // Handle other errors...
  throw error;
});
```

## Error Cause Chain

Preserve error chains for debugging:

```typescript
app.get('/complex-operation', async (request) => {
  try {
    await step1();
  } catch (error) {
    const wrapped = new Error('Step 1 failed', { cause: error });
    wrapped.statusCode = 500;
    throw wrapped;
  }
});

// In the error handler, log the full chain
app.setErrorHandler((error, request, reply) => {
  // Log the error with its cause chain
  let current = error;
  const chain = [];
  while (current) {
    chain.push({
      message: current.message,
      code: current.code,
      stack: current.stack,
    });
    current = current.cause;
  }

  request.log.error({ errorChain: chain }, 'Request failed');

  reply.code(error.statusCode || 500).send({
    error: error.message,
  });
});
```

## Plugin-Scoped Error Handlers

Set error handlers at the plugin level:

```typescript
app.register(async function apiRoutes(fastify) {
  // This error handler only applies to routes in this plugin
  fastify.setErrorHandler((error, request, reply) => {
    request.log.error({ err: error }, 'API error');

    reply.code(error.statusCode || 500).send({
      error: {
        code: error.code || 'API_ERROR',
        message: error.message,
      },
    });
  });

  fastify.get('/data', async () => {
    throw new Error('API-specific error');
  });
}, { prefix: '/api' });
```

## Graceful Error Recovery

Handle errors gracefully without crashing:

```typescript
app.get('/resilient', async (request, reply) => {
  const results = await Promise.allSettled([
    fetchPrimaryData(),
    fetchSecondaryData(),
    fetchOptionalData(),
  ]);

  const [primary, secondary, optional] = results;

  if (primary.status === 'rejected') {
    // Primary data is required
    throw new Error('Primary data unavailable');
  }

  return {
    data: primary.value,
    secondary: secondary.status === 'fulfilled' ? secondary.value : null,
    optional: optional.status === 'fulfilled' ? optional.value : null,
    warnings: results
      .filter((r) => r.status === 'rejected')
      .map((r) => r.reason.message),
  };
});
```
464
.agents/skills/fastify-best-practices/rules/hooks.md
Normal file
---
name: hooks
description: Hooks and request lifecycle in Fastify
metadata:
  tags: hooks, lifecycle, middleware, onRequest, preHandler
---

# Hooks and Request Lifecycle

## Request Lifecycle Overview

Fastify executes hooks in a specific order:

```
Incoming Request
    ↓
onRequest
    ↓
preParsing
    ↓
preValidation
    ↓
preHandler
    ↓
Handler
    ↓
preSerialization
    ↓
onSend
    ↓
onResponse
```

## onRequest Hook

First hook to execute, before body parsing. Use it for authentication and request ID setup:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Global onRequest hook
app.addHook('onRequest', async (request, reply) => {
  request.startTime = Date.now();
  request.log.info({ url: request.url, method: request.method }, 'Request started');
});

// Authentication check
app.addHook('onRequest', async (request, reply) => {
  // Skip auth for public routes
  if (request.url.startsWith('/public')) {
    return;
  }

  const token = request.headers.authorization?.replace('Bearer ', '');
  if (!token) {
    reply.code(401).send({ error: 'Unauthorized' });
    return; // Stop processing
  }

  try {
    request.user = await verifyToken(token);
  } catch {
    reply.code(401).send({ error: 'Invalid token' });
  }
});
```

## preParsing Hook

Executes before body parsing and can modify the payload stream:

```typescript
import zlib from 'node:zlib';

app.addHook('preParsing', async (request, reply, payload) => {
  // Log raw payload size
  request.log.debug({ contentLength: request.headers['content-length'] }, 'Parsing body');

  // Return the (possibly modified) payload stream
  return payload;
});

// Decompress incoming data
app.addHook('preParsing', async (request, reply, payload) => {
  if (request.headers['content-encoding'] === 'gzip') {
    return payload.pipe(zlib.createGunzip());
  }
  return payload;
});
```

## preValidation Hook

Executes after parsing, before schema validation:

```typescript
app.addHook('preValidation', async (request, reply) => {
  // Modify the body before validation
  if (request.body && typeof request.body === 'object') {
    // Normalize data
    request.body.email = request.body.email?.toLowerCase().trim();
  }
});

// Rate limiting check
app.addHook('preValidation', async (request, reply) => {
  const key = request.ip;
  const count = await redis.incr(`ratelimit:${key}`);

  if (count === 1) {
    await redis.expire(`ratelimit:${key}`, 60);
  }

  if (count > 100) {
    reply.code(429).send({ error: 'Too many requests' });
  }
});
```

## preHandler Hook

The most common hook; executes after validation, before the handler:

```typescript
// Authorization check
app.addHook('preHandler', async (request, reply) => {
  const { userId } = request.params as { userId: string };

  if (request.user.id !== userId && !request.user.isAdmin) {
    reply.code(403).send({ error: 'Forbidden' });
  }
});

// Load related data
app.addHook('preHandler', async (request, reply) => {
  if (request.params?.projectId) {
    request.project = await db.projects.findById(request.params.projectId);
    if (!request.project) {
      reply.code(404).send({ error: 'Project not found' });
    }
  }
});

// Transaction wrapper
app.addHook('preHandler', async (request) => {
  request.transaction = await db.beginTransaction();
});

app.addHook('onResponse', async (request) => {
  if (request.transaction) {
    await request.transaction.commit();
  }
});

app.addHook('onError', async (request, reply, error) => {
  if (request.transaction) {
    await request.transaction.rollback();
  }
});
```

## preSerialization Hook

Modify the payload before serialization:

```typescript
app.addHook('preSerialization', async (request, reply, payload) => {
  // Add metadata to all responses
  if (payload && typeof payload === 'object') {
    return {
      ...payload,
      _meta: {
        requestId: request.id,
        timestamp: new Date().toISOString(),
      },
    };
  }
  return payload;
});

// Remove sensitive fields
app.addHook('preSerialization', async (request, reply, payload) => {
  if (payload?.user?.password) {
    const { password, ...user } = payload.user;
    return { ...payload, user };
  }
  return payload;
});
```

## onSend Hook

Modify the response after serialization:

```typescript
import { promisify } from 'node:util';
import zlib from 'node:zlib';

const gzip = promisify(zlib.gzip);

app.addHook('onSend', async (request, reply, payload) => {
  // Add response headers
  reply.header('X-Response-Time', Date.now() - request.startTime);

  // Compress large responses
  if (typeof payload === 'string' && payload.length > 1024) {
    const compressed = await gzip(payload);
    reply.header('Content-Encoding', 'gzip');
    return compressed;
  }

  return payload;
});

// Transform a JSON string response
app.addHook('onSend', async (request, reply, payload) => {
  if (reply.getHeader('content-type')?.includes('application/json')) {
    // payload is already a serialized string at this point
    return payload;
  }
  return payload;
});
```

## onResponse Hook

Executes after the response is sent; it cannot modify the response:

```typescript
app.addHook('onResponse', async (request, reply) => {
  // Log response time
  const responseTime = Date.now() - request.startTime;
  request.log.info({
    method: request.method,
    url: request.url,
    statusCode: reply.statusCode,
    responseTime,
  }, 'Request completed');

  // Track metrics
  metrics.histogram('http_request_duration', responseTime, {
    method: request.method,
    route: request.routeOptions.url,
    status: reply.statusCode,
  });
});
```

## onError Hook

Executes when an error is thrown:

```typescript
app.addHook('onError', async (request, reply, error) => {
  // Log error details
  request.log.error({
    err: error,
    url: request.url,
    method: request.method,
    body: request.body,
  }, 'Request error');

  // Track error metrics
  metrics.increment('http_errors', {
    error: error.code || 'UNKNOWN',
    route: request.routeOptions.url,
  });

  // Clean up resources
  if (request.tempFile) {
    await fs.unlink(request.tempFile).catch(() => {});
  }
});
```

## onTimeout Hook

Executes when a request times out:

```typescript
const app = Fastify({
  connectionTimeout: 30000, // 30 seconds
});

app.addHook('onTimeout', async (request, reply) => {
  request.log.warn({
    url: request.url,
    method: request.method,
  }, 'Request timeout');

  // Cleanup
  if (request.abortController) {
    request.abortController.abort();
  }
});
```

## onRequestAbort Hook

Executes when the client closes the connection:

```typescript
app.addHook('onRequestAbort', async (request) => {
  request.log.info('Client aborted request');

  // Cancel ongoing operations
  if (request.abortController) {
    request.abortController.abort();
  }

  // Clean up uploaded files
  if (request.uploadedFiles) {
    for (const file of request.uploadedFiles) {
      await fs.unlink(file.path).catch(() => {});
    }
  }
});
```

## Application Lifecycle Hooks

Hooks that run at application startup and shutdown:

```typescript
// After all plugins are loaded
app.addHook('onReady', async function () {
  this.log.info('Server is ready');

  // Initialize connections
  await this.db.connect();
  await this.redis.connect();

  // Warm caches
  await this.cache.warmup();
});

// When the server is closing
app.addHook('onClose', async function () {
  this.log.info('Server is closing');

  // Clean up connections
  await this.db.close();
  await this.redis.disconnect();
});

// When a route is registered
app.addHook('onRoute', (routeOptions) => {
  console.log(`Route registered: ${routeOptions.method} ${routeOptions.url}`);

  // Track all routes
  routes.push({
    method: routeOptions.method,
    url: routeOptions.url,
    schema: routeOptions.schema,
  });
});

// When a plugin is registered
app.addHook('onRegister', (instance, options) => {
  console.log(`Plugin registered with prefix: ${options.prefix}`);
});
```
|
||||
|
||||
## Scoped Hooks
|
||||
|
||||
Hooks are scoped to their encapsulation context:
|
||||
|
||||
```typescript
|
||||
app.addHook('onRequest', async (request) => {
|
||||
// Runs for ALL routes
|
||||
request.log.info('Global hook');
|
||||
});
|
||||
|
||||
app.register(async function adminRoutes(fastify) {
|
||||
// Only runs for routes in this plugin
|
||||
fastify.addHook('onRequest', async (request, reply) => {
|
||||
if (!request.user?.isAdmin) {
|
||||
reply.code(403).send({ error: 'Admin only' });
|
||||
}
|
||||
});
|
||||
|
||||
fastify.get('/admin/users', async () => {
|
||||
return { users: [] };
|
||||
});
|
||||
}, { prefix: '/admin' });
|
||||
```
|
||||
|
||||
## Hook Execution Order
|
||||
|
||||
Multiple hooks of the same type execute in registration order:
|
||||
|
||||
```typescript
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('First');
|
||||
});
|
||||
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('Second');
|
||||
});
|
||||
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('Third');
|
||||
});
|
||||
|
||||
// Output: First, Second, Third
|
||||
```
|
||||
|
||||
## Stopping Hook Execution
|
||||
|
||||
Return early from hooks to stop processing:
|
||||
|
||||
```typescript
|
||||
app.addHook('preHandler', async (request, reply) => {
|
||||
if (!request.user) {
|
||||
// Send response and return to stop further processing
|
||||
reply.code(401).send({ error: 'Unauthorized' });
|
||||
return;
|
||||
}
|
||||
// Continue to next hook and handler
|
||||
});
|
||||
```
|
||||
|
||||
## Route-Level Hooks
|
||||
|
||||
Add hooks to specific routes:
|
||||
|
||||
```typescript
|
||||
const adminOnlyHook = async (request, reply) => {
|
||||
if (!request.user?.isAdmin) {
|
||||
reply.code(403).send({ error: 'Forbidden' });
|
||||
}
|
||||
};
|
||||
|
||||
app.get('/admin/settings', {
|
||||
preHandler: [adminOnlyHook],
|
||||
handler: async (request) => {
|
||||
return { settings: {} };
|
||||
},
|
||||
});
|
||||
|
||||
// Multiple hooks
|
||||
app.post('/orders', {
|
||||
preValidation: [validateApiKey],
|
||||
preHandler: [loadUser, checkQuota, logOrder],
|
||||
handler: createOrderHandler,
|
||||
});
|
||||
```
|
||||
|
||||
## Async Hook Patterns
|
||||
|
||||
Always use async/await in hooks:
|
||||
|
||||
```typescript
|
||||
// GOOD - async hook
|
||||
app.addHook('preHandler', async (request, reply) => {
|
||||
const user = await loadUser(request.headers.authorization);
|
||||
request.user = user;
|
||||
});
|
||||
|
||||
// AVOID - callback style (deprecated)
|
||||
app.addHook('preHandler', (request, reply, done) => {
|
||||
loadUser(request.headers.authorization)
|
||||
.then((user) => {
|
||||
request.user = user;
|
||||
done();
|
||||
})
|
||||
.catch(done);
|
||||
});
|
||||
```
|
||||
247
.agents/skills/fastify-best-practices/rules/http-proxy.md
Normal file
@ -0,0 +1,247 @@
---
name: http-proxy
description: HTTP proxying and reply.from() in Fastify
metadata:
  tags: proxy, gateway, reverse-proxy, microservices
---

# HTTP Proxy and Reply.from()

## @fastify/http-proxy

Use `@fastify/http-proxy` for simple reverse proxy scenarios:

```typescript
import Fastify from 'fastify';
import httpProxy from '@fastify/http-proxy';

const app = Fastify({ logger: true });

// Proxy all requests to /api/* to another service
app.register(httpProxy, {
  upstream: 'http://backend-service:3001',
  prefix: '/api',
  rewritePrefix: '/v1',
  http2: false,
});

// With authentication
app.register(httpProxy, {
  upstream: 'http://internal-api:3002',
  prefix: '/internal',
  preHandler: async (request, reply) => {
    // Verify authentication before proxying
    if (!request.headers.authorization) {
      reply.code(401).send({ error: 'Unauthorized' });
    }
  },
});

await app.listen({ port: 3000 });
```

## @fastify/reply-from

For more control over proxying, use `@fastify/reply-from` with `reply.from()`:

```typescript
import Fastify from 'fastify';
import replyFrom from '@fastify/reply-from';

const app = Fastify({ logger: true });

app.register(replyFrom, {
  base: 'http://backend-service:3001',
  http2: false,
});

// Proxy with request/response manipulation
app.get('/users/:id', async (request, reply) => {
  const { id } = request.params;

  return reply.from(`/api/users/${id}`, {
    // Modify request before forwarding
    rewriteRequestHeaders: (originalReq, headers) => ({
      ...headers,
      'x-request-id': request.id,
      'x-forwarded-for': request.ip,
    }),
    // Modify response before sending
    onResponse: (request, reply, res) => {
      reply.header('x-proxy', 'fastify');
      reply.send(res);
    },
  });
});

// Conditional routing
app.all('/api/*', async (request, reply) => {
  const upstream = selectUpstream(request);

  return reply.from(request.url, {
    base: upstream,
  });
});

function selectUpstream(request) {
  // Route to different backends based on request
  if (request.headers['x-beta']) {
    return 'http://beta-backend:3001';
  }
  return 'http://stable-backend:3001';
}
```

## API Gateway Pattern

Build an API gateway with multiple backends:

```typescript
import Fastify from 'fastify';
import replyFrom from '@fastify/reply-from';

const app = Fastify({ logger: true });

// Configure multiple upstreams
const services = {
  users: 'http://users-service:3001',
  orders: 'http://orders-service:3002',
  products: 'http://products-service:3003',
};

app.register(replyFrom);

// Route to users service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/users', ''), {
      base: services.users,
    });
  });
}, { prefix: '/users' });

// Route to orders service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/orders', ''), {
      base: services.orders,
    });
  });
}, { prefix: '/orders' });

// Route to products service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/products', ''), {
      base: services.products,
    });
  });
}, { prefix: '/products' });
```

## Request Body Handling

Handle request bodies when proxying:

```typescript
app.post('/api/data', async (request, reply) => {
  return reply.from('/data', {
    body: request.body,
    contentType: request.headers['content-type'],
  });
});

// Stream large bodies
app.post('/upload', async (request, reply) => {
  return reply.from('/upload', {
    body: request.raw,
    contentType: request.headers['content-type'],
  });
});
```

## Error Handling

Handle upstream errors gracefully:

```typescript
app.register(replyFrom, {
  base: 'http://backend:3001',
  // Called when upstream returns an error
  onError: (reply, error) => {
    reply.log.error({ err: error }, 'Proxy error');
    reply.code(502).send({
      error: 'Bad Gateway',
      message: 'Upstream service unavailable',
    });
  },
});

// Custom error handling per route
app.get('/data', async (request, reply) => {
  try {
    return await reply.from('/data');
  } catch (error) {
    request.log.error({ err: error }, 'Failed to proxy request');
    return reply.code(503).send({
      error: 'Service Unavailable',
      retryAfter: 30,
    });
  }
});
```

## WebSocket Proxying

Proxy WebSocket connections:

```typescript
import Fastify from 'fastify';
import httpProxy from '@fastify/http-proxy';

const app = Fastify({ logger: true });

app.register(httpProxy, {
  upstream: 'http://ws-backend:3001',
  prefix: '/ws',
  websocket: true,
});
```

## Timeout Configuration

Configure proxy timeouts:

```typescript
app.register(replyFrom, {
  base: 'http://backend:3001',
  http: {
    requestOptions: {
      timeout: 30000, // 30 seconds
    },
  },
});
```

## Caching Proxied Responses

Add caching to proxied responses:

```typescript
import { createCache } from 'async-cache-dedupe';

const cache = createCache({
  ttl: 60,
  storage: { type: 'memory' },
});

cache.define('proxyGet', async (url: string) => {
  const response = await fetch(`http://backend:3001${url}`);
  return response.json();
});

app.get('/cached/*', async (request, reply) => {
  const data = await cache.proxyGet(request.url);
  return data;
});
```
402
.agents/skills/fastify-best-practices/rules/logging.md
Normal file
@ -0,0 +1,402 @@
---
name: logging
description: Logging with Pino in Fastify
metadata:
  tags: logging, pino, debugging, observability
---

# Logging with Pino

## Built-in Pino Integration

Fastify uses Pino for high-performance logging:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: true, // Enable default logging
});

// Or with configuration
const app = Fastify({
  logger: {
    level: 'info',
    transport: {
      target: 'pino-pretty',
      options: {
        colorize: true,
      },
    },
  },
});
```

## Log Levels

Available log levels (in order of severity):

```typescript
app.log.trace('Detailed debugging');
app.log.debug('Debugging information');
app.log.info('General information');
app.log.warn('Warning messages');
app.log.error('Error messages');
app.log.fatal('Fatal errors');
```
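The configured `level` acts as a threshold: calls below it are dropped entirely. Pino's default level numbers (trace=10 through fatal=60) make this a simple numeric comparison, which a small standalone sketch can illustrate (this mimics the mechanism, it is not Pino's actual code):

```typescript
// Pino assigns each level a number; a log call is emitted only when its
// level number is >= the configured threshold.
const levels: Record<string, number> = {
  trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60,
};

function isEnabled(configured: string, callLevel: string): boolean {
  return levels[callLevel] >= levels[configured];
}

console.log(isEnabled('info', 'debug')); // suppressed: false
console.log(isEnabled('info', 'error')); // emitted: true
```

So with `level: 'info'`, `app.log.debug(...)` is effectively free apart from the level check.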
## Request-Scoped Logging

Each request has its own logger with request context:

```typescript
app.get('/users/:id', async (request) => {
  // Logs include request ID automatically
  request.log.info('Fetching user');

  const user = await db.users.findById(request.params.id);

  if (!user) {
    request.log.warn({ userId: request.params.id }, 'User not found');
    return { error: 'Not found' };
  }

  request.log.info({ userId: user.id }, 'User fetched');
  return user;
});
```

## Structured Logging

Always use structured logging with objects:

```typescript
// GOOD - structured, searchable
request.log.info({
  action: 'user_created',
  userId: user.id,
  email: user.email,
}, 'User created successfully');

request.log.error({
  err: error,
  userId: request.params.id,
  operation: 'fetch_user',
}, 'Failed to fetch user');

// BAD - unstructured, hard to parse
request.log.info(`User ${user.id} created with email ${user.email}`);
request.log.error(`Failed to fetch user: ${error.message}`);
```

## Logging Configuration by Environment

```typescript
function getLoggerConfig() {
  if (process.env.NODE_ENV === 'production') {
    return {
      level: 'info',
      // JSON output for log aggregation
    };
  }

  if (process.env.NODE_ENV === 'test') {
    return false; // Disable logging in tests
  }

  // Development
  return {
    level: 'debug',
    transport: {
      target: 'pino-pretty',
      options: {
        colorize: true,
        translateTime: 'HH:MM:ss Z',
        ignore: 'pid,hostname',
      },
    },
  };
}

const app = Fastify({
  logger: getLoggerConfig(),
});
```

## Custom Serializers

Customize how objects are serialized:

```typescript
const app = Fastify({
  logger: {
    level: 'info',
    serializers: {
      // Customize request serialization
      req: (request) => ({
        method: request.method,
        url: request.url,
        headers: {
          host: request.headers.host,
          'user-agent': request.headers['user-agent'],
        },
        remoteAddress: request.ip,
      }),

      // Customize response serialization
      res: (response) => ({
        statusCode: response.statusCode,
      }),

      // Custom serializer for users
      user: (user) => ({
        id: user.id,
        email: user.email,
        // Exclude sensitive fields
      }),
    },
  },
});

// Use custom serializer
request.log.info({ user: request.user }, 'User action');
```

## Redacting Sensitive Data

Prevent logging sensitive information:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: {
    level: 'info',
    redact: {
      paths: [
        'req.headers.authorization',
        'req.headers.cookie',
        'body.password',
        'body.creditCard',
        '*.password',
        '*.secret',
        '*.token',
      ],
      censor: '[REDACTED]',
    },
  },
});
```

## Child Loggers

Create child loggers with additional context:

```typescript
app.addHook('onRequest', async (request) => {
  // Add user context to all logs for this request
  if (request.user) {
    request.log = request.log.child({
      userId: request.user.id,
      userRole: request.user.role,
    });
  }
});

// Service-level child logger
const userService = {
  log: app.log.child({ service: 'UserService' }),

  async create(data) {
    this.log.info({ email: data.email }, 'Creating user');
    // ...
  },
};
```

## Request Logging Configuration

Customize automatic request logging:

```typescript
const app = Fastify({
  logger: true,
  disableRequestLogging: true, // Disable default request/response logs
});

// Custom request logging
app.addHook('onRequest', async (request) => {
  request.log.info({
    method: request.method,
    url: request.url,
    query: request.query,
  }, 'Request received');
});

app.addHook('onResponse', async (request, reply) => {
  request.log.info({
    statusCode: reply.statusCode,
    responseTime: reply.elapsedTime,
  }, 'Request completed');
});
```

## Logging Errors

Properly log errors with stack traces:

```typescript
app.setErrorHandler((error, request, reply) => {
  // Log error with full details
  request.log.error({
    err: error, // Pino serializes error objects properly
    url: request.url,
    method: request.method,
    body: request.body,
    query: request.query,
  }, 'Request error');

  reply.code(error.statusCode || 500).send({
    error: error.message,
  });
});

// In handlers
app.get('/data', async (request) => {
  try {
    return await fetchData();
  } catch (error) {
    request.log.error({ err: error }, 'Failed to fetch data');
    throw error;
  }
});
```

## Log Destinations

Configure where logs are sent:

```typescript
import { createWriteStream } from 'node:fs';

// File output
const app = Fastify({
  logger: {
    level: 'info',
    stream: createWriteStream('./app.log'),
  },
});

// Multiple destinations with pino.multistream
import pino from 'pino';

const streams = [
  { stream: process.stdout },
  { stream: createWriteStream('./app.log') },
  { level: 'error', stream: createWriteStream('./error.log') },
];

const app = Fastify({
  logger: pino({ level: 'info' }, pino.multistream(streams)),
});
```

## Log Rotation

Use pino-roll for log rotation:

```bash
node app.js | pino-roll --frequency daily --extension .log
```

Or configure programmatically:

```typescript
import { createStream } from 'rotating-file-stream';

const stream = createStream('app.log', {
  size: '10M', // Rotate every 10MB
  interval: '1d', // Rotate daily
  compress: 'gzip',
  path: './logs',
});

const app = Fastify({
  logger: {
    level: 'info',
    stream,
  },
});
```

## Log Aggregation

Format logs for aggregation services:

```typescript
// For ELK Stack, Datadog, etc. - use default JSON format
const app = Fastify({
  logger: {
    level: 'info',
    // Default JSON output works with most log aggregators
  },
});

// Add service metadata
const app = Fastify({
  logger: {
    level: 'info',
    base: {
      service: 'user-api',
      version: process.env.APP_VERSION,
      environment: process.env.NODE_ENV,
    },
  },
});
```

## Request ID Tracking

Use request IDs for distributed tracing:

```typescript
const app = Fastify({
  logger: true,
  requestIdHeader: 'x-request-id', // Use incoming header
  genReqId: (request) => {
    // Generate ID if not provided
    return request.headers['x-request-id'] || crypto.randomUUID();
  },
});

// Forward request ID to downstream services
app.addHook('onRequest', async (request) => {
  request.requestId = request.id;
});

// Include in outgoing requests
const response = await fetch('http://other-service/api', {
  headers: {
    'x-request-id': request.id,
  },
});
```

## Performance Considerations

Pino is fast, but consider:

```typescript
// Avoid string concatenation in log calls
// BAD
request.log.info('User ' + user.id + ' did ' + action);

// GOOD
request.log.info({ userId: user.id, action }, 'User action');

// Use appropriate log levels
// Don't log at info level in hot paths
if (app.log.isLevelEnabled('debug')) {
  request.log.debug({ details: expensiveToCompute() }, 'Debug info');
}
```
425
.agents/skills/fastify-best-practices/rules/performance.md
Normal file
@ -0,0 +1,425 @@
---
name: performance
description: Performance optimization for Fastify applications
metadata:
  tags: performance, optimization, speed, benchmarking
---

# Performance Optimization

## Fastify is Fast by Default

Fastify is designed for performance. Key optimizations are built-in:

- Fast JSON serialization with `fast-json-stringify`
- Efficient routing with `find-my-way`
- Schema-based validation with `ajv` (compiled validators)
- Low overhead request/response handling
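The serialization win is the easiest of these to picture: `fast-json-stringify` compiles a JSON Schema into a specialized function that hard-codes the output shape instead of walking arbitrary objects on every call. A toy sketch of that idea (purely illustrative, not the library's actual generated code):

```typescript
// A toy "compiled" serializer for a fixed shape { id: string, name: string }.
// Because the keys are known ahead of time, the function just concatenates
// them, skipping the generic traversal JSON.stringify performs per call.
type User = { id: string; name: string };

function compileUserSerializer(): (u: User) => string {
  return (u) =>
    `{"id":${JSON.stringify(u.id)},"name":${JSON.stringify(u.name)}}`;
}

const serializeUser = compileUserSerializer();
console.log(serializeUser({ id: '1', name: 'Ada' }));
// {"id":"1","name":"Ada"}
```

This is why declaring response schemas (next section) is not just validation hygiene but a direct throughput optimization.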
## Use @fastify/under-pressure for Load Shedding

Protect your application from overload with `@fastify/under-pressure`:

```typescript
import underPressure from '@fastify/under-pressure';

app.register(underPressure, {
  maxEventLoopDelay: 1000, // Max event loop delay in ms
  maxHeapUsedBytes: 1000000000, // Max heap used (~1GB)
  maxRssBytes: 1500000000, // Max RSS (~1.5GB)
  maxEventLoopUtilization: 0.98, // Max event loop utilization
  pressureHandler: (request, reply, type, value) => {
    reply.code(503).send({
      error: 'Service Unavailable',
      message: `Server under pressure: ${type}`,
    });
  },
});

// Health check that respects pressure
app.get('/health', async (request, reply) => {
  return { status: 'ok' };
});
```

## Always Define Response Schemas

Response schemas enable `fast-json-stringify`, which is significantly faster than `JSON.stringify`:

```typescript
// FAST - uses fast-json-stringify
app.get('/users', {
  schema: {
    response: {
      200: {
        type: 'array',
        items: {
          type: 'object',
          properties: {
            id: { type: 'string' },
            name: { type: 'string' },
            email: { type: 'string' },
          },
        },
      },
    },
  },
}, async () => {
  return db.users.findAll();
});

// SLOW - uses JSON.stringify
app.get('/users-slow', async () => {
  return db.users.findAll();
});
```

## Avoid Dynamic Schema Compilation

Add schemas at startup, not at request time:

```typescript
// GOOD - schemas compiled at startup
app.addSchema({ $id: 'user', ... });

app.get('/users', {
  schema: { response: { 200: { $ref: 'user#' } } },
}, handler);

// BAD - schema compiled per request
app.get('/users', async (request, reply) => {
  const schema = getSchemaForUser(request.user);
  // This is slow!
});
```

## Use Logger Wisely

Pino is fast, but excessive logging has overhead:

```typescript
import Fastify from 'fastify';

// Set log level via environment variable
const app = Fastify({
  logger: {
    level: process.env.LOG_LEVEL || 'info',
  },
});

// Avoid logging large objects
app.get('/data', async (request) => {
  // BAD - logs entire payload
  request.log.info({ data: largeObject }, 'Processing');

  // GOOD - log only what's needed
  request.log.info({ id: largeObject.id }, 'Processing');

  return largeObject;
});
```

## Connection Pooling

Use connection pools for databases:

```typescript
import postgres from 'postgres';

// Create pool at startup
const sql = postgres(process.env.DATABASE_URL, {
  max: 20, // Maximum pool size
  idle_timeout: 20,
  connect_timeout: 10,
});

app.decorate('db', sql);

// Connections are reused
app.get('/users', async () => {
  return app.db`SELECT * FROM users LIMIT 100`;
});
```

## Avoid Blocking the Event Loop

Use `piscina` for CPU-intensive operations. It provides a robust worker thread pool:

```typescript
import Piscina from 'piscina';
import { join } from 'node:path';

const piscina = new Piscina({
  filename: join(import.meta.dirname, 'workers', 'compute.js'),
});

app.post('/compute', async (request) => {
  const result = await piscina.run(request.body);
  return result;
});
```

```typescript
// workers/compute.js
export default function compute(data) {
  // CPU-intensive work here
  return processedResult;
}
```

## Stream Large Responses

Stream large payloads instead of buffering:

```typescript
import { createReadStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

// GOOD - stream file
app.get('/large-file', async (request, reply) => {
  const stream = createReadStream('./large-file.json');
  reply.type('application/json');
  return reply.send(stream);
});

// BAD - load entire file into memory
app.get('/large-file-bad', async () => {
  const content = await fs.readFile('./large-file.json', 'utf-8');
  return JSON.parse(content);
});

// Stream database results
app.get('/export', async (request, reply) => {
  reply.type('application/json');

  const cursor = db.users.findCursor();
  reply.raw.write('[');

  let first = true;
  for await (const user of cursor) {
    if (!first) reply.raw.write(',');
    reply.raw.write(JSON.stringify(user));
    first = false;
  }

  reply.raw.write(']');
  reply.raw.end();
});
```

## Caching Strategies

Implement caching for expensive operations:

```typescript
import { LRUCache } from 'lru-cache';

const cache = new LRUCache<string, unknown>({
  max: 1000,
  ttl: 60000, // 1 minute
});

app.get('/expensive/:id', async (request) => {
  const { id } = request.params;
  const cacheKey = `expensive:${id}`;

  const cached = cache.get(cacheKey);
  if (cached) {
    return cached;
  }

  const result = await expensiveOperation(id);
  cache.set(cacheKey, result);

  return result;
});

// Cache control headers
app.get('/static-data', async (request, reply) => {
  reply.header('Cache-Control', 'public, max-age=3600');
  return { data: 'static' };
});
```

## Request Coalescing with async-cache-dedupe

Use `async-cache-dedupe` to deduplicate concurrent identical requests and cache the results:

```typescript
import { createCache } from 'async-cache-dedupe';

const cache = createCache({
  ttl: 60, // seconds
  stale: 5, // serve stale while revalidating
  storage: { type: 'memory' },
});

cache.define('fetchData', async (id: string) => {
  return db.findById(id);
});

app.get('/data/:id', async (request) => {
  const { id } = request.params;
  // Automatically deduplicates concurrent requests for the same id
  // and caches the result
  return cache.fetchData(id);
});
```

For distributed caching, use Redis storage:

```typescript
import { createCache } from 'async-cache-dedupe';
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

const cache = createCache({
  ttl: 60,
  storage: { type: 'redis', options: { client: redis } },
});
```
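The coalescing mechanism itself is worth understanding: concurrent callers with the same key share a single in-flight promise instead of each triggering the underlying work. A minimal dependency-free illustration of the idea (not the library's implementation):

```typescript
// Minimal request coalescing: concurrent calls with the same key await
// the same in-flight promise, so the underlying work runs once.
const inFlight = new Map<string, Promise<unknown>>();

async function coalesce<T>(key: string, fn: () => Promise<T>): Promise<T> {
  const existing = inFlight.get(key);
  if (existing) return existing as Promise<T>;

  // Register the promise before awaiting so later callers can join it,
  // and clear it once settled so fresh calls do fresh work.
  const p = fn().finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}

// Demo: two concurrent calls, one underlying execution.
let calls = 0;
const load = () => coalesce('user:1', async () => { calls++; return 'data'; });
const [a, b] = await Promise.all([load(), load()]);
console.log(a, b, calls); // data data 1
```

The library layers TTL, stale-while-revalidate, and pluggable storage on top of this core pattern.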
## Payload Limits

Set appropriate payload limits:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  bodyLimit: 1048576, // 1MB default
});

// Per-route limit for file uploads
app.post('/upload', {
  bodyLimit: 10485760, // 10MB for this route
}, uploadHandler);
```

## Compression

Use compression for responses:

```typescript
import fastifyCompress from '@fastify/compress';

app.register(fastifyCompress, {
  global: true,
  threshold: 1024, // Only compress responses > 1KB
  encodings: ['gzip', 'deflate'],
});

// Disable for specific route
app.get('/already-compressed', {
  compress: false,
}, handler);
```

## Connection Timeouts

Configure appropriate timeouts:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  connectionTimeout: 30000, // 30 seconds
  keepAliveTimeout: 5000, // 5 seconds
});

// Per-route timeout
app.get('/long-operation', {
  config: {
    timeout: 60000, // 60 seconds
  },
}, async (request) => {
  return longOperation();
});
```

## Disable Unnecessary Features

Disable features you don't need:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  disableRequestLogging: true, // If you don't need request logs
  trustProxy: false, // If not behind a proxy
  caseSensitive: true, // Enable for a slight performance gain
  ignoreDuplicateSlashes: false,
});
```

## Benchmarking

Use autocannon for load testing:

```bash
# Install
npm install -g autocannon

# Basic benchmark
autocannon http://localhost:3000/api/users

# With options
autocannon -c 100 -d 30 -p 10 http://localhost:3000/api/users
# -c: connections
# -d: duration in seconds
# -p: pipelining factor
```

```typescript
// Programmatic benchmarking
import autocannon from 'autocannon';

const result = await autocannon({
  url: 'http://localhost:3000/api/users',
  connections: 100,
  duration: 30,
  pipelining: 10,
});

console.log(autocannon.printResult(result));
```

## Profiling

Use `@platformatic/flame` for flame graph profiling:

```bash
npx @platformatic/flame app.js
```

This generates an interactive flame graph to identify performance bottlenecks.

## Memory Management

Monitor and optimize memory usage:

```typescript
// Add a health endpoint with memory info
app.get('/health', async () => {
  const memory = process.memoryUsage();
  return {
    status: 'ok',
    memory: {
      heapUsed: Math.round(memory.heapUsed / 1024 / 1024) + 'MB',
      heapTotal: Math.round(memory.heapTotal / 1024 / 1024) + 'MB',
      rss: Math.round(memory.rss / 1024 / 1024) + 'MB',
    },
  };
});

// Avoid memory leaks in closures
app.addHook('onRequest', async (request) => {
  // BAD - holding a reference to a large object
  const largeData = await loadLargeData();
  request.getData = () => largeData;

  // GOOD - load on demand
  request.getData = () => loadLargeData();
});
```
320
.agents/skills/fastify-best-practices/rules/plugins.md
Normal file
@ -0,0 +1,320 @@
---
name: plugins
description: Plugin development and encapsulation in Fastify
metadata:
  tags: plugins, encapsulation, modules, architecture
---

# Plugin Development and Encapsulation

## Understanding Encapsulation

Fastify's plugin system provides automatic encapsulation. Each plugin creates its own context, isolating decorators, hooks, and plugins registered within it:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// This plugin is encapsulated - its decorators are NOT available to siblings
app.register(async function childPlugin(fastify) {
  fastify.decorate('privateUtil', () => 'only available here');

  // This decorator is only available within this plugin and its children
  fastify.get('/child', async function (request, reply) {
    return this.privateUtil();
  });
});

// This route CANNOT access privateUtil - it's in a different context
app.get('/parent', async function (request, reply) {
  // this.privateUtil is undefined here
  return { status: 'ok' };
});
```

## Breaking Encapsulation with fastify-plugin

Use `fastify-plugin` when you need to share decorators, hooks, or plugins with the parent context:

```typescript
import fp from 'fastify-plugin';

// This plugin's decorators will be available to the parent and siblings
export default fp(async function databasePlugin(fastify, options) {
  const db = await createConnection(options.connectionString);

  fastify.decorate('db', db);

  fastify.addHook('onClose', async () => {
    await db.close();
  });
}, {
  name: 'database-plugin',
  dependencies: [], // List plugin dependencies
});
```

## Plugin Registration Order

Plugins are registered in order, but loading is asynchronous. Use `after()` for sequential dependencies:

```typescript
import Fastify from 'fastify';
import databasePlugin from './plugins/database.js';
import authPlugin from './plugins/auth.js';
import routesPlugin from './routes/index.js';

const app = Fastify();

// Database must be ready before auth
app.register(databasePlugin);

// Auth depends on database
app.register(authPlugin);

// Routes depend on both
app.register(routesPlugin);

// Or use after() for explicit sequencing
app.register(databasePlugin).after(() => {
  app.register(authPlugin).after(() => {
    app.register(routesPlugin);
  });
});

await app.ready();
```

## Plugin Options

Always validate and document plugin options:

```typescript
import fp from 'fastify-plugin';

interface CachePluginOptions {
  ttl: number;
  maxSize?: number;
  prefix?: string;
}

export default fp<CachePluginOptions>(async function cachePlugin(fastify, options) {
  const { ttl, maxSize = 1000, prefix = 'cache:' } = options;

  if (typeof ttl !== 'number' || ttl <= 0) {
    throw new Error('Cache plugin requires a positive ttl option');
  }

  const cache = new Map<string, { value: unknown; expires: number }>();

  fastify.decorate('cache', {
    get(key: string): unknown | undefined {
      const item = cache.get(prefix + key);
      if (!item) return undefined;
      if (Date.now() > item.expires) {
        cache.delete(prefix + key);
        return undefined;
      }
      return item.value;
    },
    set(key: string, value: unknown): void {
      if (cache.size >= maxSize) {
        // Evict the oldest entry (Map preserves insertion order)
        const firstKey = cache.keys().next().value;
        if (firstKey !== undefined) cache.delete(firstKey);
      }
      cache.set(prefix + key, { value, expires: Date.now() + ttl });
    },
  });
}, {
  name: 'cache-plugin',
});
```
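The TTL-and-eviction logic above can be exercised without Fastify at all; a minimal framework-free sketch (the `TtlCache` name is illustrative, not part of any library):

```typescript
// Standalone sketch of the cache behavior used by the plugin above
class TtlCache {
  private cache = new Map<string, { value: unknown; expires: number }>();
  constructor(private ttl: number, private maxSize = 1000) {}

  get(key: string): unknown | undefined {
    const item = this.cache.get(key);
    if (!item) return undefined;
    if (Date.now() > item.expires) {
      // Expired entries are removed lazily on read
      this.cache.delete(key);
      return undefined;
    }
    return item.value;
  }

  set(key: string, value: unknown): void {
    if (this.cache.size >= this.maxSize) {
      // Map iterates in insertion order, so the first key is the oldest
      const firstKey = this.cache.keys().next().value;
      if (firstKey !== undefined) this.cache.delete(firstKey);
    }
    this.cache.set(key, { value, expires: Date.now() + this.ttl });
  }
}

const cache = new TtlCache(1000, 2);
cache.set('a', 1);
cache.set('b', 2);
cache.set('c', 3); // evicts 'a' (oldest)
console.log(cache.get('a')); // undefined
console.log(cache.get('b')); // 2
```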

## Plugin Factory Pattern

Create configurable plugins using factory functions:

```typescript
import fp from 'fastify-plugin';

interface RateLimitOptions {
  max: number;
  timeWindow: number;
}

function createRateLimiter(defaults: Partial<RateLimitOptions> = {}) {
  return fp<RateLimitOptions>(async function rateLimitPlugin(fastify, options) {
    const config = { ...defaults, ...options };

    // Implementation
    fastify.decorate('rateLimit', config);
  }, {
    name: 'rate-limiter',
  });
}

// Usage
app.register(createRateLimiter({ max: 100 }), { timeWindow: 60000 });
```

## Plugin Dependencies

Declare dependencies to ensure proper load order:

```typescript
import fp from 'fastify-plugin';

export default fp(async function authPlugin(fastify) {
  // This plugin requires 'database-plugin' to be loaded first
  if (!fastify.hasDecorator('db')) {
    throw new Error('Auth plugin requires the database plugin');
  }

  fastify.decorate('authenticate', async (request) => {
    const user = await fastify.db.users.findByToken(request.headers.authorization);
    return user;
  });
}, {
  name: 'auth-plugin',
  dependencies: ['database-plugin'],
});
```
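Declared dependencies simply constrain load order; conceptually this is a topological sort. A framework-free sketch of the idea (illustrative only, not fastify-plugin's actual implementation):

```typescript
// Illustrative: order plugins so each loads after its declared dependencies
type PluginSpec = { name: string; dependencies: string[] };

function loadOrder(plugins: PluginSpec[]): string[] {
  const byName = new Map(plugins.map((p) => [p.name, p]));
  const ordered: string[] = [];
  const visiting = new Set<string>();

  function visit(name: string) {
    if (ordered.includes(name)) return;
    if (visiting.has(name)) throw new Error(`Circular dependency at ${name}`);
    visiting.add(name);
    const spec = byName.get(name);
    if (!spec) throw new Error(`Unknown dependency: ${name}`);
    for (const dep of spec.dependencies) visit(dep); // dependencies first
    visiting.delete(name);
    ordered.push(name);
  }

  for (const p of plugins) visit(p.name);
  return ordered;
}

const order = loadOrder([
  { name: 'auth-plugin', dependencies: ['database-plugin'] },
  { name: 'database-plugin', dependencies: [] },
]);
console.log(order); // ['database-plugin', 'auth-plugin']
```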

## Scoped Plugins for Route Groups

Use encapsulation to scope plugins to specific routes:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Public routes - no auth required
app.register(async function publicRoutes(fastify) {
  fastify.get('/health', async () => ({ status: 'ok' }));
  fastify.get('/docs', async () => ({ version: '1.0.0' }));
});

// Protected routes - auth required
app.register(async function protectedRoutes(fastify) {
  // Auth hook only applies to routes in this plugin
  fastify.addHook('onRequest', async (request, reply) => {
    const token = request.headers.authorization;
    if (!token) {
      reply.code(401).send({ error: 'Unauthorized' });
      return;
    }
    request.user = await verifyToken(token);
  });

  fastify.get('/profile', async (request) => {
    return { user: request.user };
  });

  fastify.get('/settings', async (request) => {
    return { settings: await getSettings(request.user.id) };
  });
});
```

## Prefix Routes with Register

Use the `prefix` option to namespace routes:

```typescript
app.register(import('./routes/users.js'), { prefix: '/api/v1/users' });
app.register(import('./routes/posts.js'), { prefix: '/api/v1/posts' });

// In routes/users.js
export default async function userRoutes(fastify) {
  // Becomes /api/v1/users
  fastify.get('/', async () => {
    return { users: [] };
  });

  // Becomes /api/v1/users/:id
  fastify.get('/:id', async (request) => {
    return { user: { id: request.params.id } };
  });
}
```

## Plugin Metadata

Add metadata for documentation and tooling:

```typescript
import fp from 'fastify-plugin';

async function metricsPlugin(fastify) {
  // Implementation
}

export default fp(metricsPlugin, {
  name: 'metrics-plugin',
  fastify: '5.x', // Fastify version compatibility
  dependencies: ['pino-plugin'],
  decorators: {
    fastify: ['db'], // Required decorators
    request: [],
    reply: [],
  },
});
```

## Autoload Plugins

Use `@fastify/autoload` for automatic plugin loading:

```typescript
import Fastify from 'fastify';
import autoload from '@fastify/autoload';
import { fileURLToPath } from 'node:url';
import { dirname, join } from 'node:path';

const __dirname = dirname(fileURLToPath(import.meta.url));

const app = Fastify();

// Load all plugins from the plugins directory
app.register(autoload, {
  dir: join(__dirname, 'plugins'),
  options: { prefix: '/api' },
});

// Load all routes from the routes directory
app.register(autoload, {
  dir: join(__dirname, 'routes'),
  options: { prefix: '/api' },
});
```

## Testing Plugins in Isolation

Test plugins independently:

```typescript
import { describe, it, before, after } from 'node:test';
import Fastify from 'fastify';
import myPlugin from './my-plugin.js';

describe('MyPlugin', () => {
  let app;

  before(async () => {
    app = Fastify();
    app.register(myPlugin, { option: 'value' });
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  it('should decorate fastify instance', (t) => {
    t.assert.ok(app.hasDecorator('myDecorator'));
  });
});
```
.agents/skills/fastify-best-practices/rules/routes.md (new file, 467 lines)
---
name: routes
description: Route organization and handlers in Fastify
metadata:
  tags: routes, handlers, http, rest, api
---

# Route Organization and Handlers

## Basic Route Definition

Define routes with the shorthand methods or the full route method:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Shorthand methods
app.get('/users', async (request, reply) => {
  return { users: [] };
});

app.post('/users', async (request, reply) => {
  return { created: true };
});

// Full route method with all options
app.route({
  method: 'GET',
  url: '/users/:id',
  schema: {
    params: {
      type: 'object',
      properties: {
        id: { type: 'string' },
      },
      required: ['id'],
    },
  },
  handler: async (request, reply) => {
    return { id: request.params.id };
  },
});
```

## Route Parameters

Access URL parameters through `request.params`:

```typescript
// Single parameter
app.get('/users/:id', async (request) => {
  const { id } = request.params as { id: string };
  return { userId: id };
});

// Multiple parameters
app.get('/users/:userId/posts/:postId', async (request) => {
  const { userId, postId } = request.params as { userId: string; postId: string };
  return { userId, postId };
});

// Wildcard parameter (captures everything after)
app.get('/files/*', async (request) => {
  const path = (request.params as { '*': string })['*'];
  return { filePath: path };
});

// Regex parameters (Fastify uses find-my-way)
app.get('/orders/:id(\\d+)', async (request) => {
  // Only matches numeric IDs
  const { id } = request.params as { id: string };
  return { orderId: parseInt(id, 10) };
});
```

## Query String Parameters

Access query parameters through `request.query`:

```typescript
app.get('/search', {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        q: { type: 'string' },
        page: { type: 'integer', default: 1 },
        limit: { type: 'integer', default: 10, maximum: 100 },
      },
      required: ['q'],
    },
  },
  handler: async (request) => {
    const { q, page, limit } = request.query as {
      q: string;
      page: number;
      limit: number;
    };
    return { query: q, page, limit };
  },
});
```

## Request Body

Access the request body through `request.body`:

```typescript
app.post('/users', {
  schema: {
    body: {
      type: 'object',
      properties: {
        name: { type: 'string', minLength: 1 },
        email: { type: 'string', format: 'email' },
        age: { type: 'integer', minimum: 0 },
      },
      required: ['name', 'email'],
    },
  },
  handler: async (request, reply) => {
    const user = request.body as { name: string; email: string; age?: number };
    // Create user...
    reply.code(201);
    return { user };
  },
});
```

## Headers

Access request headers through `request.headers`:

```typescript
app.get('/protected', {
  schema: {
    headers: {
      type: 'object',
      properties: {
        authorization: { type: 'string' },
      },
      required: ['authorization'],
    },
  },
  handler: async (request) => {
    const token = request.headers.authorization;
    return { authenticated: true };
  },
});
```

## Reply Methods

Use reply methods to control the response:

```typescript
import fs from 'node:fs';

app.get('/examples', async (request, reply) => {
  // Set status code
  reply.code(201);

  // Set headers
  reply.header('X-Custom-Header', 'value');
  reply.headers({ 'X-Another': 'value', 'X-Third': 'value' });

  // Set content type
  reply.type('application/json');

  // Redirect (in Fastify v5 the status code is the second argument)
  // reply.redirect('/other-url');
  // reply.redirect('/permanent-redirect', 301);

  // Return response (automatic serialization)
  return { status: 'ok' };
});

// Explicit send (useful in non-async handlers)
app.get('/explicit', (request, reply) => {
  reply.send({ status: 'ok' });
});

// Stream response
app.get('/stream', async (request, reply) => {
  const stream = fs.createReadStream('./large-file.txt');
  reply.type('text/plain');
  return reply.send(stream);
});
```

## Route Organization by Feature

Organize routes by feature/domain in separate files:

```
src/
  routes/
    users/
      index.ts      # Route definitions
      handlers.ts   # Handler functions
      schemas.ts    # JSON schemas
    posts/
      index.ts
      handlers.ts
      schemas.ts
```

```typescript
// routes/users/schemas.ts
export const userSchema = {
  type: 'object',
  properties: {
    id: { type: 'string', format: 'uuid' },
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
  },
};

export const createUserSchema = {
  body: {
    type: 'object',
    properties: {
      name: { type: 'string', minLength: 1 },
      email: { type: 'string', format: 'email' },
    },
    required: ['name', 'email'],
  },
  response: {
    201: userSchema,
  },
};

// routes/users/handlers.ts
import type { FastifyRequest, FastifyReply } from 'fastify';

export async function createUser(
  request: FastifyRequest<{ Body: { name: string; email: string } }>,
  reply: FastifyReply,
) {
  const { name, email } = request.body;
  const user = await request.server.db.users.create({ name, email });
  reply.code(201);
  return user;
}

export async function getUsers(request: FastifyRequest) {
  return request.server.db.users.findAll();
}

// routes/users/index.ts
import type { FastifyInstance } from 'fastify';
import { createUser, getUsers } from './handlers.js';
import { createUserSchema } from './schemas.js';

export default async function userRoutes(fastify: FastifyInstance) {
  fastify.get('/', getUsers);
  fastify.post('/', { schema: createUserSchema }, createUser);
}
```

## Route Constraints

Add constraints to routes for versioning or host-based routing:

```typescript
// Version constraint
app.get('/users', {
  constraints: { version: '1.0.0' },
  handler: async () => ({ version: '1.0.0', users: [] }),
});

app.get('/users', {
  constraints: { version: '2.0.0' },
  handler: async () => ({ version: '2.0.0', data: { users: [] } }),
});

// Client sends: Accept-Version: 1.0.0

// Host constraint
app.get('/', {
  constraints: { host: 'api.example.com' },
  handler: async () => ({ api: true }),
});

app.get('/', {
  constraints: { host: 'www.example.com' },
  handler: async () => ({ web: true }),
});
```

## Route Prefixing

Use prefixes to namespace routes:

```typescript
// Using register
app.register(async function (fastify) {
  fastify.get('/list', async () => ({ users: [] }));
  fastify.get('/:id', async (request) => ({ id: request.params.id }));
}, { prefix: '/users' });

// Results in:
// GET /users/list
// GET /users/:id
```

## Multiple Methods

Handle multiple HTTP methods with one handler:

```typescript
app.route({
  method: ['GET', 'HEAD'],
  url: '/resource',
  handler: async (request) => {
    return { data: 'resource' };
  },
});
```

## 404 Handler

Customize the not found handler:

```typescript
app.setNotFoundHandler({
  preValidation: async (request, reply) => {
    // Optional pre-validation hook
  },
  preHandler: async (request, reply) => {
    // Optional pre-handler hook
  },
}, async (request, reply) => {
  reply.code(404);
  return {
    error: 'Not Found',
    message: `Route ${request.method} ${request.url} not found`,
    statusCode: 404,
  };
});
```

## Method Not Allowed

Handle method not allowed responses:

```typescript
// Fastify doesn't have built-in 405 handling
// Implement with a custom not found handler that checks allowed methods
app.setNotFoundHandler(async (request, reply) => {
  // Check if the URL exists with a different method
  const route = app.hasRoute({
    url: request.url,
    method: 'GET', // Check other methods
  });

  if (route) {
    reply.code(405);
    return { error: 'Method Not Allowed' };
  }

  reply.code(404);
  return { error: 'Not Found' };
});
```
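The "check other methods" step above generalizes naturally to computing an `Allow` header. A framework-free sketch of that lookup (the route table and function name are illustrative):

```typescript
// Illustrative: derive the allowed methods for a URL from a route table
type Route = { method: string; url: string };

function allowedMethods(routes: Route[], url: string): string[] {
  return routes.filter((r) => r.url === url).map((r) => r.method);
}

const routes: Route[] = [
  { method: 'GET', url: '/users' },
  { method: 'POST', url: '/users' },
  { method: 'GET', url: '/health' },
];

// Suitable for an Allow header; empty means 404,
// non-empty without the request's method means 405
console.log(allowedMethods(routes, '/users').join(', ')); // "GET, POST"
console.log(allowedMethods(routes, '/missing')); // []
```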

## Route-Level Configuration

Apply configuration to specific routes:

```typescript
app.get('/slow-operation', {
  config: {
    rateLimit: { max: 10, timeWindow: '1 minute' },
  },
  handler: async (request) => {
    return { result: await slowOperation() };
  },
});

// Access config in hooks
app.addHook('onRequest', async (request, reply) => {
  const config = request.routeOptions.config;
  if (config.rateLimit) {
    // Apply rate limiting
  }
});
```

## Async Route Registration

Register routes from async sources:

```typescript
app.register(async function (fastify) {
  const routeConfigs = await loadRoutesFromDatabase();

  for (const config of routeConfigs) {
    fastify.route({
      method: config.method,
      url: config.path,
      handler: createDynamicHandler(config),
    });
  }
});
```

## Auto-loading Routes with @fastify/autoload

Use `@fastify/autoload` to automatically load routes from a directory structure:

```typescript
import Fastify from 'fastify';
import autoload from '@fastify/autoload';
import { join } from 'node:path';

const app = Fastify({ logger: true });

// Auto-load plugins
app.register(autoload, {
  dir: join(import.meta.dirname, 'plugins'),
  options: { prefix: '' },
});

// Auto-load routes
app.register(autoload, {
  dir: join(import.meta.dirname, 'routes'),
  options: { prefix: '/api' },
});

await app.listen({ port: 3000 });
```

Directory structure:

```
src/
  plugins/
    database.ts      # Loaded automatically
    auth.ts          # Loaded automatically
  routes/
    users/
      index.ts       # GET/POST /api/users
      _id/
        index.ts     # GET/PUT/DELETE /api/users/:id
    posts/
      index.ts       # GET/POST /api/posts
```

Route file example:

```typescript
// routes/users/index.ts
import type { FastifyPluginAsync } from 'fastify';

const users: FastifyPluginAsync = async (fastify) => {
  fastify.get('/', async () => {
    return fastify.repositories.users.findAll();
  });

  fastify.post('/', async (request) => {
    return fastify.repositories.users.create(request.body);
  });
};

export default users;
```
.agents/skills/fastify-best-practices/rules/schemas.md (new file, 585 lines)
---
|
||||
name: schemas
|
||||
description: JSON Schema validation in Fastify with TypeBox
|
||||
metadata:
|
||||
tags: validation, json-schema, schemas, ajv, typebox
|
||||
---
|
||||
|
||||
# JSON Schema Validation
|
||||
|
||||
## Use TypeBox for Type-Safe Schemas
|
||||
|
||||
**Prefer TypeBox for defining schemas.** It provides TypeScript types automatically and compiles to JSON Schema:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
import { Type, type Static } from '@sinclair/typebox';
|
||||
|
||||
const app = Fastify();
|
||||
|
||||
// Define schema with TypeBox - get TypeScript types for free
|
||||
const CreateUserBody = Type.Object({
|
||||
name: Type.String({ minLength: 1, maxLength: 100 }),
|
||||
email: Type.String({ format: 'email' }),
|
||||
age: Type.Optional(Type.Integer({ minimum: 0, maximum: 150 })),
|
||||
});
|
||||
|
||||
const UserResponse = Type.Object({
|
||||
id: Type.String({ format: 'uuid' }),
|
||||
name: Type.String(),
|
||||
email: Type.String(),
|
||||
createdAt: Type.String({ format: 'date-time' }),
|
||||
});
|
||||
|
||||
// TypeScript types are derived automatically
|
||||
type CreateUserBodyType = Static<typeof CreateUserBody>;
|
||||
type UserResponseType = Static<typeof UserResponse>;
|
||||
|
||||
app.post<{
|
||||
Body: CreateUserBodyType;
|
||||
Reply: UserResponseType;
|
||||
}>('/users', {
|
||||
schema: {
|
||||
body: CreateUserBody,
|
||||
response: {
|
||||
201: UserResponse,
|
||||
},
|
||||
},
|
||||
}, async (request, reply) => {
|
||||
// request.body is fully typed as CreateUserBodyType
|
||||
const user = await createUser(request.body);
|
||||
reply.code(201);
|
||||
return user;
|
||||
});
|
||||
```
|
||||
|
||||
## TypeBox Common Patterns
|
||||
|
||||
```typescript
|
||||
import { Type, type Static } from '@sinclair/typebox';
|
||||
|
||||
// Enums
|
||||
const Status = Type.Union([
|
||||
Type.Literal('active'),
|
||||
Type.Literal('inactive'),
|
||||
Type.Literal('pending'),
|
||||
]);
|
||||
|
||||
// Arrays
|
||||
const Tags = Type.Array(Type.String(), { minItems: 1, maxItems: 10 });
|
||||
|
||||
// Nested objects
|
||||
const Address = Type.Object({
|
||||
street: Type.String(),
|
||||
city: Type.String(),
|
||||
country: Type.String(),
|
||||
zip: Type.Optional(Type.String()),
|
||||
});
|
||||
|
||||
// References (reusable schemas)
|
||||
const User = Type.Object({
|
||||
id: Type.String({ format: 'uuid' }),
|
||||
name: Type.String(),
|
||||
address: Address,
|
||||
tags: Tags,
|
||||
status: Status,
|
||||
});
|
||||
|
||||
// Nullable
|
||||
const NullableString = Type.Union([Type.String(), Type.Null()]);
|
||||
|
||||
// Record/Map
|
||||
const Metadata = Type.Record(Type.String(), Type.Unknown());
|
||||
```
|
||||
|
||||
## Register TypeBox Schemas Globally
|
||||
|
||||
```typescript
|
||||
import { Type, type Static } from '@sinclair/typebox';
|
||||
|
||||
// Define shared schemas
|
||||
const ErrorResponse = Type.Object({
|
||||
error: Type.String(),
|
||||
message: Type.String(),
|
||||
statusCode: Type.Integer(),
|
||||
});
|
||||
|
||||
const PaginationQuery = Type.Object({
|
||||
page: Type.Integer({ minimum: 1, default: 1 }),
|
||||
limit: Type.Integer({ minimum: 1, maximum: 100, default: 20 }),
|
||||
});
|
||||
|
||||
// Register globally
|
||||
app.addSchema(Type.Object({ $id: 'ErrorResponse', ...ErrorResponse }));
|
||||
app.addSchema(Type.Object({ $id: 'PaginationQuery', ...PaginationQuery }));
|
||||
|
||||
// Reference in routes
|
||||
app.get('/items', {
|
||||
schema: {
|
||||
querystring: { $ref: 'PaginationQuery#' },
|
||||
response: {
|
||||
400: { $ref: 'ErrorResponse#' },
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
```
|
||||
|
||||
## Plain JSON Schema (Alternative)
|
||||
|
||||
You can also use plain JSON Schema directly:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
|
||||
const app = Fastify();
|
||||
|
||||
const createUserSchema = {
|
||||
body: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string', minLength: 1, maxLength: 100 },
|
||||
email: { type: 'string', format: 'email' },
|
||||
age: { type: 'integer', minimum: 0, maximum: 150 },
|
||||
},
|
||||
required: ['name', 'email'],
|
||||
additionalProperties: false,
|
||||
},
|
||||
response: {
|
||||
201: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string', format: 'uuid' },
|
||||
name: { type: 'string' },
|
||||
email: { type: 'string' },
|
||||
createdAt: { type: 'string', format: 'date-time' },
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
app.post('/users', { schema: createUserSchema }, async (request, reply) => {
|
||||
const user = await createUser(request.body);
|
||||
reply.code(201);
|
||||
return user;
|
||||
});
|
||||
```
|
||||
|
||||
## Request Validation Parts
|
||||
|
||||
Validate different parts of the request:
|
||||
|
||||
```typescript
|
||||
const fullRequestSchema = {
|
||||
// URL parameters
|
||||
params: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string', format: 'uuid' },
|
||||
},
|
||||
required: ['id'],
|
||||
},
|
||||
|
||||
// Query string
|
||||
querystring: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
include: { type: 'string', enum: ['posts', 'comments', 'all'] },
|
||||
limit: { type: 'integer', minimum: 1, maximum: 100, default: 10 },
|
||||
},
|
||||
},
|
||||
|
||||
// Request headers
|
||||
headers: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
'x-api-key': { type: 'string', minLength: 32 },
|
||||
},
|
||||
required: ['x-api-key'],
|
||||
},
|
||||
|
||||
// Request body
|
||||
body: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
data: { type: 'object' },
|
||||
},
|
||||
required: ['data'],
|
||||
},
|
||||
};
|
||||
|
||||
app.put('/resources/:id', { schema: fullRequestSchema }, handler);
|
||||
```
|
||||
|
||||
## Shared Schemas with $id
|
||||
|
||||
Define reusable schemas with `$id` and reference them with `$ref`:
|
||||
|
||||
```typescript
|
||||
// Add shared schemas to Fastify
|
||||
app.addSchema({
|
||||
$id: 'user',
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string', format: 'uuid' },
|
||||
name: { type: 'string' },
|
||||
email: { type: 'string', format: 'email' },
|
||||
createdAt: { type: 'string', format: 'date-time' },
|
||||
},
|
||||
required: ['id', 'name', 'email'],
|
||||
});
|
||||
|
||||
app.addSchema({
|
||||
$id: 'userCreate',
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string', minLength: 1 },
|
||||
email: { type: 'string', format: 'email' },
|
||||
},
|
||||
required: ['name', 'email'],
|
||||
additionalProperties: false,
|
||||
});
|
||||
|
||||
app.addSchema({
|
||||
$id: 'error',
|
||||
type: 'object',
|
||||
properties: {
|
||||
statusCode: { type: 'integer' },
|
||||
error: { type: 'string' },
|
||||
message: { type: 'string' },
|
||||
},
|
||||
});
|
||||
|
||||
// Reference shared schemas
|
||||
app.post('/users', {
|
||||
schema: {
|
||||
body: { $ref: 'userCreate#' },
|
||||
response: {
|
||||
201: { $ref: 'user#' },
|
||||
400: { $ref: 'error#' },
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
|
||||
app.get('/users/:id', {
|
||||
schema: {
|
||||
params: {
|
||||
type: 'object',
|
||||
properties: { id: { type: 'string', format: 'uuid' } },
|
||||
required: ['id'],
|
||||
},
|
||||
response: {
|
||||
200: { $ref: 'user#' },
|
||||
404: { $ref: 'error#' },
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
```
|
||||
|
||||
## Array Schemas
|
||||
|
||||
Define schemas for array responses:
|
||||
|
||||
```typescript
|
||||
app.addSchema({
|
||||
$id: 'userList',
|
||||
type: 'object',
|
||||
properties: {
|
||||
users: {
|
||||
type: 'array',
|
||||
items: { $ref: 'user#' },
|
||||
},
|
||||
total: { type: 'integer' },
|
||||
page: { type: 'integer' },
|
||||
pageSize: { type: 'integer' },
|
||||
},
|
||||
});
|
||||
|
||||
app.get('/users', {
|
||||
schema: {
|
||||
querystring: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
page: { type: 'integer', minimum: 1, default: 1 },
|
||||
pageSize: { type: 'integer', minimum: 1, maximum: 100, default: 20 },
|
||||
},
|
||||
},
|
||||
response: {
|
||||
200: { $ref: 'userList#' },
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
```
|
||||
|
||||
## Custom Formats
|
||||
|
||||
Add custom validation formats:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
|
||||
const app = Fastify({
|
||||
ajv: {
|
||||
customOptions: {
|
||||
formats: {
|
||||
'iso-country': /^[A-Z]{2}$/,
|
||||
'phone': /^\+?[1-9]\d{1,14}$/,
|
||||
},
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
// Or add formats dynamically
|
||||
app.addSchema({
|
||||
$id: 'address',
|
||||
type: 'object',
|
||||
properties: {
|
||||
street: { type: 'string' },
|
||||
country: { type: 'string', format: 'iso-country' },
|
||||
phone: { type: 'string', format: 'phone' },
|
||||
},
|
||||
});
|
||||
```

## Custom Keywords

Add custom validation keywords:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  ajv: {
    customOptions: {
      keywords: [
        {
          keyword: 'isEven',
          type: 'number',
          validate: (schema: boolean, data: number) => {
            if (schema) {
              return data % 2 === 0;
            }
            return true;
          },
          errors: false,
        },
      ],
    },
  },
});

// Use custom keyword
app.post('/numbers', {
  schema: {
    body: {
      type: 'object',
      properties: {
        value: { type: 'integer', isEven: true },
      },
    },
  },
}, handler);
```

## Coercion

Fastify coerces types by default for query strings and params:

```typescript
// Query string "?page=5&active=true" becomes:
// { page: 5, active: true } (number and boolean, not strings)

app.get('/items', {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        page: { type: 'integer' },   // "5" -> 5
        active: { type: 'boolean' }, // "true" -> true
        tags: {
          type: 'array',
          items: { type: 'string' }, // "?tags=a&tags=b" -> ["a", "b"]
        },
      },
    },
  },
}, handler);
```
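The coercion rules can be pictured with a tiny hand-rolled sketch. This is illustrative only; the real coercion is performed by Ajv inside Fastify, and `coerce` here is a hypothetical helper:

```typescript
// Illustrative sketch of the coercion Ajv applies with coerceTypes enabled.
function coerce(value: string, type: 'integer' | 'boolean' | 'string'): number | boolean | string {
  switch (type) {
    case 'integer':
      return Number.parseInt(value, 10); // "5" -> 5
    case 'boolean':
      return value === 'true'; // "true" -> true
    default:
      return value;
  }
}

console.log(coerce('5', 'integer'));    // 5
console.log(coerce('true', 'boolean')); // true
```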

## Validation Error Handling

Customize validation error responses:

```typescript
app.setErrorHandler((error, request, reply) => {
  if (error.validation) {
    reply.code(400).send({
      error: 'Validation Error',
      message: 'Request validation failed',
      details: error.validation.map((err) => ({
        field: err.instancePath || err.params?.missingProperty,
        message: err.message,
        keyword: err.keyword,
      })),
    });
    return;
  }

  // Handle other errors
  reply.code(error.statusCode || 500).send({
    error: error.name,
    message: error.message,
  });
});
```

## Schema Compiler Options

Configure the Ajv schema compiler:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  ajv: {
    customOptions: {
      removeAdditional: 'all', // Remove extra properties
      useDefaults: true,       // Apply default values
      coerceTypes: true,       // Coerce types
      allErrors: true,         // Report all errors, not just first
    },
    plugins: [
      require('ajv-formats'), // Add format validators
    ],
  },
});
```

## Nullable Fields

Handle nullable fields properly:

```typescript
app.addSchema({
  $id: 'profile',
  type: 'object',
  properties: {
    name: { type: 'string' },
    bio: { type: ['string', 'null'] }, // Can be string or null
    avatar: {
      oneOf: [
        { type: 'string', format: 'uri' },
        { type: 'null' },
      ],
    },
  },
});
```

## Conditional Validation

Use if/then/else for conditional validation:

```typescript
app.addSchema({
  $id: 'payment',
  type: 'object',
  properties: {
    method: { type: 'string', enum: ['card', 'bank'] },
    cardNumber: { type: 'string' },
    bankAccount: { type: 'string' },
  },
  required: ['method'],
  if: {
    properties: { method: { const: 'card' } },
  },
  then: {
    required: ['cardNumber'],
  },
  else: {
    required: ['bankAccount'],
  },
});
```
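Read procedurally, the if/then/else schema amounts to the following check. This is a hypothetical helper for illustration; Ajv evaluates the schema itself:

```typescript
// Mirrors the payment schema: card payments must carry cardNumber,
// anything else must carry bankAccount.
interface Payment {
  method: 'card' | 'bank';
  cardNumber?: string;
  bankAccount?: string;
}

function isValidPayment(p: Payment): boolean {
  if (p.method === 'card') {
    return typeof p.cardNumber === 'string';
  }
  return typeof p.bankAccount === 'string';
}

console.log(isValidPayment({ method: 'card', cardNumber: '4242424242424242' })); // true
console.log(isValidPayment({ method: 'bank' })); // false
```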

## Schema Organization

Organize schemas in a dedicated file:

```typescript
// schemas/index.ts
export const schemas = [
  {
    $id: 'user',
    type: 'object',
    properties: {
      id: { type: 'string', format: 'uuid' },
      name: { type: 'string' },
      email: { type: 'string', format: 'email' },
    },
  },
  {
    $id: 'error',
    type: 'object',
    properties: {
      statusCode: { type: 'integer' },
      error: { type: 'string' },
      message: { type: 'string' },
    },
  },
];

// app.ts
import { schemas } from './schemas/index.js';

for (const schema of schemas) {
  app.addSchema(schema);
}
```

## OpenAPI/Swagger Integration

Schemas work directly with @fastify/swagger:

```typescript
import fastifySwagger from '@fastify/swagger';
import fastifySwaggerUi from '@fastify/swagger-ui';

app.register(fastifySwagger, {
  openapi: {
    info: {
      title: 'My API',
      version: '1.0.0',
    },
  },
});

app.register(fastifySwaggerUi, {
  routePrefix: '/docs',
});

// Schemas are automatically converted to OpenAPI definitions
```

## Performance Considerations

Response schemas enable fast-json-stringify for serialization:

```typescript
// With response schema - uses fast-json-stringify (faster)
app.get('/users', {
  schema: {
    response: {
      200: {
        type: 'array',
        items: { $ref: 'user#' },
      },
    },
  },
}, handler);

// Without response schema - uses JSON.stringify (slower)
app.get('/users-slow', handler);
```

Always define response schemas for production APIs to benefit from optimized serialization.
475 .agents/skills/fastify-best-practices/rules/serialization.md (new file)

@@ -0,0 +1,475 @@
---
name: serialization
description: Response serialization in Fastify with TypeBox
metadata:
  tags: serialization, response, json, fast-json-stringify, typebox
---

# Response Serialization

## Use TypeBox for Type-Safe Response Schemas

Define response schemas with TypeBox for automatic TypeScript types and fast serialization:

```typescript
import Fastify from 'fastify';
import { Type, type Static } from '@sinclair/typebox';

const app = Fastify();

// Define response schema with TypeBox
const UserResponse = Type.Object({
  id: Type.String(),
  name: Type.String(),
  email: Type.String(),
});

const UsersResponse = Type.Array(UserResponse);

type UserResponseType = Static<typeof UserResponse>;

// With TypeBox schema - uses fast-json-stringify (faster) + TypeScript types
app.get<{ Reply: Static<typeof UsersResponse> }>('/users', {
  schema: {
    response: {
      200: UsersResponse,
    },
  },
}, async () => {
  return db.users.findAll();
});

// Without schema - uses JSON.stringify (slower), no type safety
app.get('/users-slow', async () => {
  return db.users.findAll();
});
```

## Fast JSON Stringify

Fastify uses `fast-json-stringify` when response schemas are defined. This provides:

1. **Performance**: 2-3x faster serialization than JSON.stringify
2. **Security**: Only defined properties are serialized (strips sensitive data)
3. **Type coercion**: Ensures output matches the schema
4. **TypeScript**: Full type inference with TypeBox
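The security point — undeclared properties never reach the wire — can be sketched with a toy serializer. This is a conceptual illustration, not fast-json-stringify's actual implementation:

```typescript
// Toy illustration of schema-driven serialization: only keys declared in the
// schema's `properties` are emitted, so extra fields like `password` are dropped.
interface ToySchema {
  properties: Record<string, unknown>;
}

function toySerialize(schema: ToySchema, data: Record<string, unknown>): string {
  const out: Record<string, unknown> = {};
  for (const key of Object.keys(schema.properties)) {
    if (key in data) out[key] = data[key];
  }
  return JSON.stringify(out);
}

const userSchema: ToySchema = { properties: { id: {}, name: {} } };
const row = { id: '1', name: 'John', password: 'hunter2' };

console.log(toySerialize(userSchema, row)); // {"id":"1","name":"John"}
```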
|
||||
|
||||
## Response Schema Benefits
|
||||
|
||||
1. **Performance**: 2-3x faster serialization
|
||||
2. **Security**: Only defined properties are included
|
||||
3. **Documentation**: OpenAPI/Swagger integration
|
||||
4. **Type coercion**: Ensures correct output types
|
||||
|
||||
```typescript
|
||||
app.get('/user/:id', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' },
|
||||
name: { type: 'string' },
|
||||
// password is NOT in schema, so it's stripped
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async (request) => {
|
||||
const user = await db.users.findById(request.params.id);
|
||||
// Even if user has password field, it won't be serialized
|
||||
return user;
|
||||
});
|
||||
```
|
||||
|
||||
## Multiple Status Code Schemas
|
||||
|
||||
Define schemas for different response codes:
|
||||
|
||||
```typescript
|
||||
app.get('/users/:id', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' },
|
||||
name: { type: 'string' },
|
||||
email: { type: 'string' },
|
||||
},
|
||||
},
|
||||
404: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
statusCode: { type: 'integer' },
|
||||
error: { type: 'string' },
|
||||
message: { type: 'string' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async (request, reply) => {
|
||||
const user = await db.users.findById(request.params.id);
|
||||
|
||||
if (!user) {
|
||||
reply.code(404);
|
||||
return { statusCode: 404, error: 'Not Found', message: 'User not found' };
|
||||
}
|
||||
|
||||
return user;
|
||||
});
|
||||
```
|
||||
|
||||
## Default Response Schema
|
||||
|
||||
Use status-code ranges such as '4xx' and '5xx' (or the 'default' key) for shared error responses:
|
||||
|
||||
```typescript
|
||||
app.get('/resource', {
|
||||
schema: {
|
||||
response: {
|
||||
200: { $ref: 'resource#' },
|
||||
'4xx': {
|
||||
type: 'object',
|
||||
properties: {
|
||||
statusCode: { type: 'integer' },
|
||||
error: { type: 'string' },
|
||||
message: { type: 'string' },
|
||||
},
|
||||
},
|
||||
'5xx': {
|
||||
type: 'object',
|
||||
properties: {
|
||||
statusCode: { type: 'integer' },
|
||||
error: { type: 'string' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
```
|
||||
|
||||
## Custom Serializers
|
||||
|
||||
Create custom serialization functions:
|
||||
|
||||
```typescript
|
||||
// Per-route serializer
|
||||
app.get('/custom', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
value: { type: 'string' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
serializerCompiler: ({ schema }) => {
|
||||
return (data) => {
|
||||
// Custom serialization logic
|
||||
return JSON.stringify({
|
||||
value: String(data.value).toUpperCase(),
|
||||
serializedAt: new Date().toISOString(),
|
||||
});
|
||||
};
|
||||
},
|
||||
}, async () => {
|
||||
return { value: 'hello' };
|
||||
});
|
||||
```
|
||||
|
||||
## Shared Serializers
|
||||
|
||||
Use the global serializer compiler:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
import fastJson from 'fast-json-stringify';
|
||||
|
||||
const app = Fastify({
|
||||
serializerCompiler: ({ schema, method, url, httpStatus }) => {
|
||||
// Custom compilation logic
|
||||
const stringify = fastJson(schema);
|
||||
return (data) => stringify(data);
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Serialization with Type Coercion
|
||||
|
||||
fast-json-stringify coerces types:
|
||||
|
||||
```typescript
|
||||
app.get('/data', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
count: { type: 'integer' }, // '5' -> 5
|
||||
active: { type: 'boolean' }, // 'true' -> true
|
||||
tags: {
|
||||
type: 'array',
|
||||
items: { type: 'string' }, // [1, 2] -> ['1', '2']
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
return {
|
||||
count: '5', // Coerced to integer
|
||||
active: 'true', // Coerced to boolean
|
||||
tags: [1, 2, 3], // Coerced to strings
|
||||
};
|
||||
});
|
||||
```
|
||||
|
||||
## Nullable Fields
|
||||
|
||||
Handle nullable fields properly:
|
||||
|
||||
```typescript
|
||||
app.get('/profile', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string' },
|
||||
bio: { type: ['string', 'null'] },
|
||||
avatar: {
|
||||
oneOf: [
|
||||
{ type: 'string', format: 'uri' },
|
||||
{ type: 'null' },
|
||||
],
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
return {
|
||||
name: 'John',
|
||||
bio: null,
|
||||
avatar: null,
|
||||
};
|
||||
});
|
||||
```
|
||||
|
||||
## Additional Properties
|
||||
|
||||
Control extra properties in response:
|
||||
|
||||
```typescript
|
||||
// Strip additional properties (default)
|
||||
app.get('/strict', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' },
|
||||
name: { type: 'string' },
|
||||
},
|
||||
additionalProperties: false,
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
return { id: '1', name: 'John', secret: 'hidden' };
|
||||
// Output: { "id": "1", "name": "John" }
|
||||
});
|
||||
|
||||
// Allow additional properties
|
||||
app.get('/flexible', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' },
|
||||
},
|
||||
additionalProperties: true,
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
return { id: '1', extra: 'included' };
|
||||
// Output: { "id": "1", "extra": "included" }
|
||||
});
|
||||
```
|
||||
|
||||
## Nested Objects
|
||||
|
||||
Serialize nested structures:
|
||||
|
||||
```typescript
|
||||
app.addSchema({
|
||||
$id: 'address',
|
||||
type: 'object',
|
||||
properties: {
|
||||
street: { type: 'string' },
|
||||
city: { type: 'string' },
|
||||
country: { type: 'string' },
|
||||
},
|
||||
});
|
||||
|
||||
app.get('/user', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string' },
|
||||
address: { $ref: 'address#' },
|
||||
contacts: {
|
||||
type: 'array',
|
||||
items: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
type: { type: 'string' },
|
||||
value: { type: 'string' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
return {
|
||||
name: 'John',
|
||||
address: { street: '123 Main', city: 'Boston', country: 'USA' },
|
||||
contacts: [
|
||||
{ type: 'email', value: 'john@example.com' },
|
||||
{ type: 'phone', value: '+1234567890' },
|
||||
],
|
||||
};
|
||||
});
|
||||
```
|
||||
|
||||
## Date Serialization
|
||||
|
||||
Handle dates consistently:
|
||||
|
||||
```typescript
|
||||
app.get('/events', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'array',
|
||||
items: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string' },
|
||||
date: { type: 'string', format: 'date-time' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
const events = await db.events.findAll();
|
||||
|
||||
// Convert Date objects to ISO strings
|
||||
return events.map((e) => ({
|
||||
...e,
|
||||
date: e.date.toISOString(),
|
||||
}));
|
||||
});
|
||||
```
|
||||
|
||||
## BigInt Serialization
|
||||
|
||||
Handle BigInt values:
|
||||
|
||||
```typescript
|
||||
// BigInt is not JSON serializable by default
|
||||
app.get('/large-number', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' }, // Serialize as string
|
||||
count: { type: 'integer' },
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}, async () => {
|
||||
const bigValue = 9007199254740993n;
|
||||
|
||||
return {
|
||||
id: bigValue.toString(), // Convert to string
|
||||
count: Number(bigValue), // Or number if safe
|
||||
};
|
||||
});
|
||||
```
|
||||
|
||||
## Stream Responses
|
||||
|
||||
Stream responses bypass serialization:
|
||||
|
||||
```typescript
|
||||
import { createReadStream } from 'node:fs';
|
||||
|
||||
app.get('/file', async (request, reply) => {
|
||||
const stream = createReadStream('./data.json');
|
||||
reply.type('application/json');
|
||||
return reply.send(stream);
|
||||
});
|
||||
|
||||
// Streaming JSON array
|
||||
app.get('/stream', async (request, reply) => {
|
||||
reply.type('application/json');
|
||||
|
||||
const cursor = db.users.findCursor();
|
||||
|
||||
reply.raw.write('[');
|
||||
let first = true;
|
||||
|
||||
for await (const user of cursor) {
|
||||
if (!first) reply.raw.write(',');
|
||||
reply.raw.write(JSON.stringify(user));
|
||||
first = false;
|
||||
}
|
||||
|
||||
reply.raw.write(']');
|
||||
reply.raw.end();
|
||||
});
|
||||
```
|
||||
|
||||
## Pre-Serialization Hook
|
||||
|
||||
Modify data before serialization:
|
||||
|
||||
```typescript
|
||||
app.addHook('preSerialization', async (request, reply, payload) => {
|
||||
// Add metadata to responses
|
||||
if (payload && typeof payload === 'object' && !Array.isArray(payload)) {
|
||||
return {
|
||||
...payload,
|
||||
_links: {
|
||||
self: request.url,
|
||||
},
|
||||
};
|
||||
}
|
||||
return payload;
|
||||
});
|
||||
```
|
||||
|
||||
## Disable Serialization
|
||||
|
||||
Skip serialization for specific routes:
|
||||
|
||||
```typescript
|
||||
app.get('/raw', async (request, reply) => {
|
||||
const data = JSON.stringify({ raw: true });
|
||||
reply.type('application/json');
|
||||
reply.serializer((payload) => payload); // Pass through
|
||||
return data;
|
||||
});
|
||||
```
|
||||
536 .agents/skills/fastify-best-practices/rules/testing.md (new file)

@@ -0,0 +1,536 @@
---
|
||||
name: testing
|
||||
description: Testing Fastify applications with inject()
|
||||
metadata:
|
||||
tags: testing, inject, node-test, integration, unit
|
||||
---
|
||||
|
||||
# Testing Fastify Applications
|
||||
|
||||
## Using inject() for Request Testing
|
||||
|
||||
Fastify's `inject()` method simulates HTTP requests without network overhead:
|
||||
|
||||
```typescript
|
||||
import { describe, it, before, after } from 'node:test';
|
||||
import Fastify from 'fastify';
|
||||
import { buildApp } from './app.js';
|
||||
|
||||
describe('User API', () => {
|
||||
let app;
|
||||
|
||||
before(async () => {
|
||||
app = await buildApp();
|
||||
await app.ready();
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await app.close();
|
||||
});
|
||||
|
||||
it('should return users list', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/users',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.equal(response.headers['content-type'], 'application/json; charset=utf-8');
|
||||
|
||||
const body = response.json();
|
||||
t.assert.ok(Array.isArray(body.users));
|
||||
});
|
||||
|
||||
it('should create a user', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/users',
|
||||
payload: {
|
||||
name: 'John Doe',
|
||||
email: 'john@example.com',
|
||||
},
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 201);
|
||||
|
||||
const body = response.json();
|
||||
t.assert.equal(body.name, 'John Doe');
|
||||
t.assert.ok(body.id);
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Testing with Headers and Authentication
|
||||
|
||||
Test authenticated endpoints:
|
||||
|
||||
```typescript
|
||||
describe('Protected Routes', () => {
|
||||
let app;
|
||||
let authToken;
|
||||
|
||||
before(async () => {
|
||||
app = await buildApp();
|
||||
await app.ready();
|
||||
|
||||
// Get auth token
|
||||
const loginResponse = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/auth/login',
|
||||
payload: {
|
||||
email: 'test@example.com',
|
||||
password: 'password123',
|
||||
},
|
||||
});
|
||||
|
||||
authToken = loginResponse.json().token;
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await app.close();
|
||||
});
|
||||
|
||||
it('should reject unauthenticated requests', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/profile',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 401);
|
||||
});
|
||||
|
||||
it('should return profile for authenticated user', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/profile',
|
||||
headers: {
|
||||
authorization: `Bearer ${authToken}`,
|
||||
},
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.equal(response.json().email, 'test@example.com');
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Testing Query Parameters
|
||||
|
||||
Test routes with query strings:
|
||||
|
||||
```typescript
|
||||
it('should filter users by status', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/users',
|
||||
query: {
|
||||
status: 'active',
|
||||
page: '1',
|
||||
limit: '10',
|
||||
},
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
const body = response.json();
|
||||
t.assert.ok(body.users.every((u) => u.status === 'active'));
|
||||
});
|
||||
|
||||
// Or use URL with query string
|
||||
it('should search users', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/users?q=john&sort=name',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
});
|
||||
```
|
||||
|
||||
## Testing URL Parameters
|
||||
|
||||
Test routes with path parameters:
|
||||
|
||||
```typescript
|
||||
it('should return user by id', async (t) => {
|
||||
const userId = 'user-123';
|
||||
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: `/users/${userId}`,
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.equal(response.json().id, userId);
|
||||
});
|
||||
|
||||
it('should return 404 for non-existent user', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/users/non-existent',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 404);
|
||||
});
|
||||
```
|
||||
|
||||
## Testing Validation Errors
|
||||
|
||||
Test schema validation:
|
||||
|
||||
```typescript
|
||||
describe('Validation', () => {
|
||||
it('should reject invalid email', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/users',
|
||||
payload: {
|
||||
name: 'John',
|
||||
email: 'not-an-email',
|
||||
},
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 400);
|
||||
const body = response.json();
|
||||
t.assert.ok(body.message.includes('email'));
|
||||
});
|
||||
|
||||
it('should reject missing required fields', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/users',
|
||||
payload: {
|
||||
name: 'John',
|
||||
// missing email
|
||||
},
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 400);
|
||||
});
|
||||
|
||||
it('should coerce query parameters', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/items?limit=10&active=true',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
// limit is coerced to number, active to boolean
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Testing File Uploads
|
||||
|
||||
Test multipart form data:
|
||||
|
||||
```typescript
|
||||
import { createReadStream } from 'node:fs';
|
||||
import FormData from 'form-data';
|
||||
|
||||
it('should upload file', async (t) => {
|
||||
const form = new FormData();
|
||||
form.append('file', createReadStream('./test/fixtures/test.pdf'));
|
||||
form.append('name', 'test-document');
|
||||
|
||||
const response = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/upload',
|
||||
payload: form,
|
||||
headers: form.getHeaders(),
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.ok(response.json().fileId);
|
||||
});
|
||||
```
|
||||
|
||||
## Testing Streams
|
||||
|
||||
Test streaming responses:
|
||||
|
||||
```typescript
|
||||
it('should stream large file', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/files/large-file',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.ok(response.rawPayload.length > 0);
|
||||
});
|
||||
```
|
||||
|
||||
## Mocking Dependencies
|
||||
|
||||
Mock external services and databases:
|
||||
|
||||
```typescript
|
||||
import { describe, it, before, after, mock } from 'node:test';
|
||||
|
||||
describe('User Service', () => {
|
||||
let app;
|
||||
|
||||
before(async () => {
|
||||
// Create app with mocked dependencies
|
||||
const mockDb = {
|
||||
users: {
|
||||
findAll: mock.fn(async () => [
|
||||
{ id: '1', name: 'User 1' },
|
||||
{ id: '2', name: 'User 2' },
|
||||
]),
|
||||
findById: mock.fn(async (id) => {
|
||||
if (id === '1') return { id: '1', name: 'User 1' };
|
||||
return null;
|
||||
}),
|
||||
create: mock.fn(async (data) => ({ id: 'new-id', ...data })),
|
||||
},
|
||||
};
|
||||
|
||||
app = Fastify();
|
||||
app.decorate('db', mockDb);
|
||||
app.register(import('./routes/users.js'));
|
||||
await app.ready();
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await app.close();
|
||||
});
|
||||
|
||||
it('should call findAll', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/users',
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
t.assert.equal(app.db.users.findAll.mock.calls.length, 1);
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Testing Plugins in Isolation
|
||||
|
||||
Test plugins independently:
|
||||
|
||||
```typescript
|
||||
import { describe, it, before, after } from 'node:test';
|
||||
import Fastify from 'fastify';
|
||||
import cachePlugin from './plugins/cache.js';
|
||||
|
||||
describe('Cache Plugin', () => {
|
||||
let app;
|
||||
|
||||
before(async () => {
|
||||
app = Fastify();
|
||||
app.register(cachePlugin, { ttl: 1000 });
|
||||
await app.ready();
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await app.close();
|
||||
});
|
||||
|
||||
it('should decorate fastify with cache', (t) => {
|
||||
t.assert.ok(app.hasDecorator('cache'));
|
||||
t.assert.equal(typeof app.cache.get, 'function');
|
||||
t.assert.equal(typeof app.cache.set, 'function');
|
||||
});
|
||||
|
||||
it('should cache and retrieve values', (t) => {
|
||||
app.cache.set('key', 'value');
|
||||
t.assert.equal(app.cache.get('key'), 'value');
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Testing Hooks
|
||||
|
||||
Test hook behavior:
|
||||
|
||||
```typescript
|
||||
describe('Hooks', () => {
|
||||
it('should add request id header', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'GET',
|
||||
url: '/health',
|
||||
});
|
||||
|
||||
t.assert.ok(response.headers['x-request-id']);
|
||||
});
|
||||
|
||||
it('should log request timing', async (t) => {
|
||||
const logs = [];
|
||||
const app = Fastify({
|
||||
logger: {
|
||||
level: 'info',
|
||||
stream: {
|
||||
write: (msg) => logs.push(JSON.parse(msg)),
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
app.register(import('./app.js'));
|
||||
await app.ready();
|
||||
|
||||
await app.inject({ method: 'GET', url: '/health' });
|
||||
|
||||
const responseLog = logs.find((l) => l.msg?.includes('completed'));
|
||||
t.assert.ok(responseLog);
|
||||
t.assert.ok(responseLog.responseTime);
|
||||
|
||||
await app.close();
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Test Factory Pattern
|
||||
|
||||
Create a reusable test app builder:
|
||||
|
||||
```typescript
|
||||
// test/helper.ts
|
||||
import Fastify from 'fastify';
|
||||
import type { FastifyInstance } from 'fastify';
|
||||
|
||||
interface TestContext {
|
||||
app: FastifyInstance;
|
||||
inject: FastifyInstance['inject'];
|
||||
}
|
||||
|
||||
export async function buildTestApp(options = {}): Promise<TestContext> {
|
||||
const app = Fastify({
|
||||
logger: false, // Disable logging in tests
|
||||
...options,
|
||||
});
|
||||
|
||||
// Register plugins
|
||||
app.register(import('../src/plugins/database.js'), {
|
||||
connectionString: process.env.TEST_DATABASE_URL,
|
||||
});
|
||||
app.register(import('../src/routes/index.js'));
|
||||
|
||||
await app.ready();
|
||||
|
||||
return {
|
||||
app,
|
||||
inject: app.inject.bind(app),
|
||||
};
|
||||
}
|
||||
|
||||
// Usage in tests
|
||||
describe('API Tests', () => {
|
||||
let ctx: TestContext;
|
||||
|
||||
before(async () => {
|
||||
ctx = await buildTestApp();
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await ctx.app.close();
|
||||
});
|
||||
|
||||
it('should work', async (t) => {
|
||||
const response = await ctx.inject({
|
||||
method: 'GET',
|
||||
url: '/health',
|
||||
});
|
||||
t.assert.equal(response.statusCode, 200);
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Database Testing with Transactions
|
||||
|
||||
Use transactions for test isolation:
|
||||
|
||||
```typescript
|
||||
describe('Database Integration', () => {
|
||||
let app;
|
||||
let transaction;
|
||||
|
||||
before(async () => {
|
||||
app = await buildApp();
|
||||
await app.ready();
|
||||
});
|
||||
|
||||
after(async () => {
|
||||
await app.close();
|
||||
});
|
||||
|
||||
beforeEach(async () => {
|
||||
transaction = await app.db.beginTransaction();
|
||||
app.db.setTransaction(transaction);
|
||||
});
|
||||
|
||||
afterEach(async () => {
|
||||
await transaction.rollback();
|
||||
});
|
||||
|
||||
it('should create user', async (t) => {
|
||||
const response = await app.inject({
|
||||
method: 'POST',
|
||||
url: '/users',
|
||||
payload: { name: 'Test', email: 'test@example.com' },
|
||||
});
|
||||
|
||||
t.assert.equal(response.statusCode, 201);
|
||||
// Transaction is rolled back after test
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Parallel Test Execution
|
||||
|
||||
Structure tests for parallel execution:
|
||||
|
||||
```typescript
|
||||
// Tests run in parallel by default with node:test
|
||||
// Use separate app instances or proper isolation
|
||||
|
||||
import { describe, it, after } from 'node:test';
|
||||
|
||||
describe('User API', async () => {
|
||||
// Each test suite gets its own app instance
|
||||
const app = await buildTestApp();
|
||||
|
||||
it('test 1', async (t) => {
|
||||
// ...
|
||||
});
|
||||
|
||||
it('test 2', async (t) => {
|
||||
// ...
|
||||
});
|
||||
|
||||
// Cleanup after all tests in this suite
|
||||
after(() => app.close());
|
||||
});
|
||||
|
||||
describe('Post API', async () => {
|
||||
const app = await buildTestApp();
|
||||
|
||||
it('test 1', async (t) => {
|
||||
// ...
|
||||
});
|
||||
|
||||
after(() => app.close());
|
||||
});
|
||||
```
|
||||
|
||||
## Running Tests
|
||||
|
||||
```bash
|
||||
# Run all tests
|
||||
node --test
|
||||
|
||||
# Run with TypeScript
|
||||
node --test src/**/*.test.ts
|
||||
|
||||
# Run specific file
|
||||
node --test src/routes/users.test.ts
|
||||
|
||||
# With coverage
|
||||
node --test --experimental-test-coverage
|
||||
|
||||
# Watch mode
|
||||
node --test --watch
|
||||
```
|
||||
458 .agents/skills/fastify-best-practices/rules/typescript.md (new file)

@@ -0,0 +1,458 @@
---
|
||||
name: typescript
|
||||
description: TypeScript integration with Fastify
|
||||
metadata:
|
||||
tags: typescript, types, generics, type-safety
|
||||
---
|
||||
|
||||
# TypeScript Integration
|
||||
|
||||
## Type Stripping with Node.js
|
||||
|
||||
Use Node.js built-in type stripping (Node.js 22.6+):
|
||||
|
||||
```bash
|
||||
# Run TypeScript directly
|
||||
node --experimental-strip-types app.ts
|
||||
|
||||
# In Node.js 23+
|
||||
node app.ts
|
||||
```
|
||||
|
||||
```json
|
||||
// package.json
|
||||
{
|
||||
"type": "module",
|
||||
"scripts": {
|
||||
"start": "node app.ts",
|
||||
"dev": "node --watch app.ts"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
```typescript
|
||||
// tsconfig.json for type stripping
|
||||
{
|
||||
"compilerOptions": {
|
||||
"target": "ESNext",
|
||||
"module": "NodeNext",
|
||||
"moduleResolution": "NodeNext",
|
||||
"verbatimModuleSyntax": true,
|
||||
"erasableSyntaxOnly": true,
|
||||
"noEmit": true,
|
||||
"strict": true
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Basic Type Safety

Type your Fastify application:

```typescript
import Fastify, { type FastifyInstance, type FastifyRequest, type FastifyReply } from 'fastify';

const app: FastifyInstance = Fastify({ logger: true });

app.get('/health', async (request: FastifyRequest, reply: FastifyReply) => {
  return { status: 'ok' };
});

await app.listen({ port: 3000 });
```
## Typing Route Handlers

Use generics to type request parts:

```typescript
interface CreateUserBody {
  name: string;
  email: string;
}

interface UserParams {
  id: string;
}

interface UserQuery {
  include?: string;
}

// Type the request with generics
app.post<{
  Body: CreateUserBody;
}>('/users', async (request, reply) => {
  const { name, email } = request.body; // Fully typed
  return { name, email };
});

app.get<{
  Params: UserParams;
  Querystring: UserQuery;
}>('/users/:id', async (request) => {
  const { id } = request.params; // string
  const { include } = request.query; // string | undefined
  return { id, include };
});

// Full route options typing
app.route<{
  Params: UserParams;
  Querystring: UserQuery;
  Body: CreateUserBody;
  Reply: { user: { id: string; name: string } };
}>({
  method: 'PUT',
  url: '/users/:id',
  handler: async (request, reply) => {
    return { user: { id: request.params.id, name: request.body.name } };
  },
});
```
## Type Providers

Use @fastify/type-provider-typebox for runtime + compile-time safety:

```typescript
import Fastify from 'fastify';
import { TypeBoxTypeProvider } from '@fastify/type-provider-typebox';
import { Type } from '@sinclair/typebox';

const app = Fastify().withTypeProvider<TypeBoxTypeProvider>();

const UserSchema = Type.Object({
  id: Type.String(),
  name: Type.String(),
  email: Type.String({ format: 'email' }),
});

const CreateUserSchema = Type.Object({
  name: Type.String({ minLength: 1 }),
  email: Type.String({ format: 'email' }),
});

app.post('/users', {
  schema: {
    body: CreateUserSchema,
    response: {
      201: UserSchema,
    },
  },
}, async (request, reply) => {
  // request.body is typed as { name: string; email: string }
  const { name, email } = request.body;

  reply.code(201);
  return { id: 'generated', name, email };
});
```
## Typing Decorators

Extend Fastify types with declaration merging:

```typescript
import Fastify from 'fastify';

// Declare types for decorators
declare module 'fastify' {
  interface FastifyInstance {
    config: {
      port: number;
      host: string;
    };
    db: Database;
  }

  interface FastifyRequest {
    user?: {
      id: string;
      email: string;
      role: string;
    };
    startTime: number;
  }

  interface FastifyReply {
    sendSuccess: (data: unknown) => void;
  }
}

const app = Fastify();

// Add decorators
app.decorate('config', { port: 3000, host: 'localhost' });
app.decorate('db', new Database());

app.decorateRequest('user', null);
app.decorateRequest('startTime', 0);

app.decorateReply('sendSuccess', function (data: unknown) {
  this.send({ success: true, data });
});

// Now fully typed
app.get('/profile', async (request, reply) => {
  const user = request.user; // { id: string; email: string; role: string } | undefined
  const config = app.config; // { port: number; host: string }

  reply.sendSuccess({ user });
});
```
## Typing Plugins

Type plugin options and exports:

```typescript
import fp from 'fastify-plugin';
import type { FastifyPluginAsync } from 'fastify';

interface DatabasePluginOptions {
  connectionString: string;
  poolSize?: number;
}

declare module 'fastify' {
  interface FastifyInstance {
    db: {
      query: (sql: string, params?: unknown[]) => Promise<unknown[]>;
      close: () => Promise<void>;
    };
  }
}

const databasePlugin: FastifyPluginAsync<DatabasePluginOptions> = async (
  fastify,
  options,
) => {
  const { connectionString, poolSize = 10 } = options;

  const db = await createConnection(connectionString, poolSize);

  fastify.decorate('db', {
    query: (sql: string, params?: unknown[]) => db.query(sql, params),
    close: () => db.end(),
  });

  fastify.addHook('onClose', async () => {
    await db.end();
  });
};

export default fp(databasePlugin, {
  name: 'database',
});
```
## Typing Hooks

Type hook functions:

```typescript
import type {
  FastifyRequest,
  FastifyReply,
  onRequestHookHandler,
  preHandlerHookHandler,
} from 'fastify';

const authHook: preHandlerHookHandler = async (
  request: FastifyRequest,
  reply: FastifyReply,
) => {
  const token = request.headers.authorization;
  if (!token) {
    reply.code(401).send({ error: 'Unauthorized' });
    return;
  }
  request.user = await verifyToken(token);
};

const timingHook: onRequestHookHandler = async (request) => {
  request.startTime = Date.now();
};

app.addHook('onRequest', timingHook);
app.addHook('preHandler', authHook);
```
## Typing Schema Objects

Create reusable typed schemas:

```typescript
import type { JSONSchema7 } from 'json-schema';

// Define the schema with a const assertion for type inference
const userSchema = {
  type: 'object',
  properties: {
    id: { type: 'string' },
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
  },
  required: ['id', 'name', 'email'],
} as const satisfies JSONSchema7;

// Matching TypeScript type, kept in sync with the schema by hand
// (a tool such as json-schema-to-ts can derive it automatically)
type User = {
  id: string;
  name: string;
  email: string;
};

app.get<{ Reply: User }>('/users/:id', {
  schema: {
    response: {
      200: userSchema,
    },
  },
}, async (request) => {
  return { id: '1', name: 'John', email: 'john@example.com' };
});
```
## Shared Types

Organize types in dedicated files:

```typescript
// types/index.ts
export interface User {
  id: string;
  name: string;
  email: string;
  role: 'admin' | 'user';
}

export interface CreateUserInput {
  name: string;
  email: string;
}

export interface PaginationQuery {
  page?: number;
  limit?: number;
  sort?: string;
}

// routes/users.ts
import type { FastifyInstance } from 'fastify';
import type { User, CreateUserInput, PaginationQuery } from '../types/index.js';

export default async function userRoutes(fastify: FastifyInstance) {
  fastify.get<{
    Querystring: PaginationQuery;
    Reply: { users: User[]; total: number };
  }>('/', async (request) => {
    const { page = 1, limit = 10 } = request.query;
    // ...
  });

  fastify.post<{
    Body: CreateUserInput;
    Reply: User;
  }>('/', async (request, reply) => {
    reply.code(201);
    // ...
  });
}
```
## Type-Safe Route Registration

Create typed route factories:

```typescript
import type { FastifyInstance } from 'fastify';

function createCrudRoutes<T extends { id: string }>(
  fastify: FastifyInstance,
  options: {
    prefix: string;
    schema: {
      item: object;
      create: object;
      update: object;
    };
    handlers: {
      list: () => Promise<T[]>;
      get: (id: string) => Promise<T | null>;
      create: (data: unknown) => Promise<T>;
      update: (id: string, data: unknown) => Promise<T>;
      delete: (id: string) => Promise<void>;
    };
  },
) {
  const { prefix, schema, handlers } = options;

  fastify.get(`${prefix}`, {
    schema: { response: { 200: { type: 'array', items: schema.item } } },
  }, async () => handlers.list());

  fastify.get(`${prefix}/:id`, {
    schema: { response: { 200: schema.item } },
  }, async (request) => {
    const item = await handlers.get((request.params as { id: string }).id);
    if (!item) throw { statusCode: 404, message: 'Not found' };
    return item;
  });

  // ... more routes
}
```
## Avoiding Type Gymnastics

Keep types simple and practical:

```typescript
// GOOD - simple, readable types
interface UserRequest {
  Params: { id: string };
  Body: { name: string };
}

app.put<UserRequest>('/users/:id', handler);

// AVOID - overly complex generic types
type DeepPartial<T> = T extends object ? {
  [P in keyof T]?: DeepPartial<T[P]>;
} : T;

// AVOID - excessive type inference
type InferSchemaType<T> = T extends { properties: infer P }
  ? { [K in keyof P]: InferPropertyType<P[K]> }
  : never;
```
## Type Checking Without Compilation

Use TypeScript for type checking only:

```bash
# Type check without emitting
npx tsc --noEmit

# Watch mode
npx tsc --noEmit --watch

# In CI
npm run typecheck
```

```json
// package.json
{
  "scripts": {
    "start": "node app.ts",
    "typecheck": "tsc --noEmit",
    "test": "npm run typecheck && node --test"
  }
}
```
421
.agents/skills/fastify-best-practices/rules/websockets.md
Normal file
---
name: websockets
description: WebSocket support in Fastify
metadata:
  tags: websockets, realtime, ws, socket
---

# WebSocket Support
## Using @fastify/websocket

Add WebSocket support to Fastify:

```typescript
import Fastify from 'fastify';
import websocket from '@fastify/websocket';

const app = Fastify();

app.register(websocket);

app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('message', (message) => {
    const data = message.toString();
    console.log('Received:', data);

    // Echo back
    socket.send(`Echo: ${data}`);
  });

  socket.on('close', () => {
    console.log('Client disconnected');
  });

  socket.on('error', (error) => {
    console.error('WebSocket error:', error);
  });
});

await app.listen({ port: 3000 });
```
## WebSocket with Hooks

Use Fastify hooks with WebSocket routes:

```typescript
app.register(async function wsRoutes(fastify) {
  // This hook runs before the WebSocket upgrade
  fastify.addHook('preValidation', async (request, reply) => {
    const token = request.headers.authorization;
    if (!token) {
      reply.code(401).send({ error: 'Unauthorized' });
      return;
    }
    request.user = await verifyToken(token);
  });

  fastify.get('/ws', { websocket: true }, (socket, request) => {
    console.log('Connected user:', request.user.id);

    socket.on('message', (message) => {
      // Handle authenticated messages
    });
  });
});
```
## Connection Options

Configure WebSocket server options:

```typescript
app.register(websocket, {
  options: {
    maxPayload: 1048576, // 1MB max message size
    clientTracking: true,
    perMessageDeflate: {
      zlibDeflateOptions: {
        chunkSize: 1024,
        memLevel: 7,
        level: 3,
      },
      zlibInflateOptions: {
        chunkSize: 10 * 1024,
      },
    },
  },
});
```
## Broadcast to All Clients

Broadcast messages to connected clients:

```typescript
const clients = new Set<WebSocket>();

app.get('/ws', { websocket: true }, (socket, request) => {
  clients.add(socket);

  socket.on('close', () => {
    clients.delete(socket);
  });

  socket.on('message', (message) => {
    // Broadcast to all other clients
    for (const client of clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    }
  });
});

// Broadcast from an HTTP route
app.post('/broadcast', async (request) => {
  const { message } = request.body as { message: string };

  for (const client of clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify({ type: 'broadcast', message }));
    }
  }

  return { sent: clients.size };
});
```
## Rooms/Channels Pattern

Organize connections into rooms:

```typescript
const rooms = new Map<string, Set<WebSocket>>();

function joinRoom(socket: WebSocket, roomId: string) {
  if (!rooms.has(roomId)) {
    rooms.set(roomId, new Set());
  }
  rooms.get(roomId)!.add(socket);
}

function leaveRoom(socket: WebSocket, roomId: string) {
  rooms.get(roomId)?.delete(socket);
  if (rooms.get(roomId)?.size === 0) {
    rooms.delete(roomId);
  }
}

function broadcastToRoom(roomId: string, message: string, exclude?: WebSocket) {
  const room = rooms.get(roomId);
  if (!room) return;

  for (const client of room) {
    if (client !== exclude && client.readyState === WebSocket.OPEN) {
      client.send(message);
    }
  }
}

app.get('/ws/:roomId', { websocket: true }, (socket, request) => {
  const { roomId } = request.params as { roomId: string };

  joinRoom(socket, roomId);

  socket.on('message', (message) => {
    broadcastToRoom(roomId, message.toString(), socket);
  });

  socket.on('close', () => {
    leaveRoom(socket, roomId);
  });
});
```
## Structured Message Protocol

Use JSON for structured messages:

```typescript
interface WSMessage {
  type: string;
  payload?: unknown;
  id?: string;
}

app.get('/ws', { websocket: true }, (socket, request) => {
  function send(message: WSMessage) {
    socket.send(JSON.stringify(message));
  }

  socket.on('message', (raw) => {
    let message: WSMessage;

    try {
      message = JSON.parse(raw.toString());
    } catch {
      send({ type: 'error', payload: 'Invalid JSON' });
      return;
    }

    switch (message.type) {
      case 'ping':
        send({ type: 'pong', id: message.id });
        break;

      case 'subscribe':
        handleSubscribe(socket, message.payload);
        send({ type: 'subscribed', payload: message.payload, id: message.id });
        break;

      case 'message':
        handleMessage(socket, message.payload);
        break;

      default:
        send({ type: 'error', payload: 'Unknown message type' });
    }
  });
});
```
## Heartbeat/Ping-Pong

Keep connections alive:

```typescript
const HEARTBEAT_INTERVAL = 30000;
const clients = new Map<WebSocket, { isAlive: boolean }>();

app.get('/ws', { websocket: true }, (socket, request) => {
  clients.set(socket, { isAlive: true });

  socket.on('pong', () => {
    const client = clients.get(socket);
    if (client) client.isAlive = true;
  });

  socket.on('close', () => {
    clients.delete(socket);
  });
});

// Heartbeat interval
setInterval(() => {
  for (const [socket, state] of clients) {
    if (!state.isAlive) {
      socket.terminate();
      clients.delete(socket);
      continue;
    }

    state.isAlive = false;
    socket.ping();
  }
}, HEARTBEAT_INTERVAL);
```
## Authentication

Authenticate WebSocket connections:

```typescript
app.get('/ws', {
  websocket: true,
  preValidation: async (request, reply) => {
    // Authenticate via query parameter or header
    const { token: queryToken } = request.query as { token?: string };
    const token = queryToken || request.headers.authorization?.replace('Bearer ', '');

    if (!token) {
      reply.code(401).send({ error: 'Token required' });
      return;
    }

    try {
      request.user = await verifyToken(token);
    } catch {
      reply.code(401).send({ error: 'Invalid token' });
    }
  },
}, (socket, request) => {
  console.log('Authenticated user:', request.user);

  socket.on('message', (message) => {
    // Handle authenticated messages
  });
});
```
## Error Handling

Handle WebSocket errors properly:

```typescript
app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('error', (error) => {
    request.log.error({ err: error }, 'WebSocket error');
  });

  socket.on('message', async (raw) => {
    try {
      const message = JSON.parse(raw.toString());
      const result = await processMessage(message);
      socket.send(JSON.stringify({ success: true, result }));
    } catch (error) {
      request.log.error({ err: error }, 'Message processing error');
      socket.send(JSON.stringify({
        success: false,
        error: error instanceof Error ? error.message : 'Unknown error',
      }));
    }
  });
});
```
## Rate Limiting WebSocket Messages

Limit message frequency:

```typescript
const rateLimits = new Map<WebSocket, { count: number; resetAt: number }>();

function checkRateLimit(socket: WebSocket, limit: number, window: number): boolean {
  const now = Date.now();
  let state = rateLimits.get(socket);

  if (!state || now > state.resetAt) {
    state = { count: 0, resetAt: now + window };
    rateLimits.set(socket, state);
  }

  state.count++;

  if (state.count > limit) {
    return false;
  }

  return true;
}

app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('message', (message) => {
    if (!checkRateLimit(socket, 100, 60000)) {
      socket.send(JSON.stringify({ error: 'Rate limit exceeded' }));
      return;
    }

    // Process message
  });

  socket.on('close', () => {
    rateLimits.delete(socket);
  });
});
```
## Graceful Shutdown

Close WebSocket connections on shutdown:

```typescript
import closeWithGrace from 'close-with-grace';

const connections = new Set<WebSocket>();

app.get('/ws', { websocket: true }, (socket, request) => {
  connections.add(socket);

  socket.on('close', () => {
    connections.delete(socket);
  });
});

closeWithGrace({ delay: 5000 }, async ({ signal }) => {
  // Notify clients
  for (const socket of connections) {
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ type: 'shutdown', message: 'Server is shutting down' }));
      socket.close(1001, 'Server shutdown');
    }
  }

  await app.close();
});
```
## Full-Duplex Stream Pattern

Use WebSocket for streaming data:

```typescript
app.get('/ws/stream', { websocket: true }, async (socket, request) => {
  const stream = createDataStream();

  stream.on('data', (data) => {
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ type: 'data', payload: data }));
    }
  });

  stream.on('end', () => {
    socket.send(JSON.stringify({ type: 'end' }));
    socket.close();
  });

  socket.on('message', (message) => {
    const { type, payload } = JSON.parse(message.toString());

    if (type === 'pause') {
      stream.pause();
    } else if (type === 'resume') {
      stream.resume();
    }
  });

  socket.on('close', () => {
    stream.destroy();
  });
});
```
11
.agents/skills/fastify-best-practices/tile.json
Normal file
{
  "name": "mcollina/fastify-best-practices",
  "version": "0.1.0",
  "private": false,
  "summary": "Guides development of Fastify Node.js backend servers and REST APIs using TypeScript or JavaScript. Use when building, configuring, or debugging a Fastify application — including defining routes, implementing plugins, setting up JSON Schema validation, handling errors, optimising performance, managing authentication, configuring CORS and security headers, integrating databases, working with WebSockets, and deploying to production. Covers the full Fastify request lifecycle (hooks, serialization, logging with Pino) and TypeScript integration via strip types. Trigger terms: Fastify, Node.js server, REST API, API routes, backend framework, fastify.config, server.ts, app.ts.",
  "skills": {
    "fastify-best-practices": {
      "path": "SKILL.md"
    }
  }
}
244
.agents/skills/fastify-typescript/SKILL.md
Normal file
---
name: fastify-typescript
description: Guidelines for building high-performance APIs with Fastify and TypeScript, covering validation, Prisma integration, and testing best practices
---

# Fastify TypeScript Development

You are an expert in Fastify and TypeScript development with deep knowledge of building high-performance, type-safe APIs.
## TypeScript General Guidelines

### Basic Principles

- Use English for all code and documentation
- Always declare types for variables and functions (parameters and return values)
- Avoid the `any` type - create the necessary types instead
- Use JSDoc to document public classes and methods
- Write concise, maintainable, and technically accurate code
- Use functional and declarative programming patterns; avoid classes
- Prefer iteration and modularization to adhere to DRY principles

### Nomenclature

- Use PascalCase for types and interfaces
- Use camelCase for variables, functions, and methods
- Use kebab-case for file and directory names
- Use UPPERCASE for environment variables
- Use descriptive variable names with auxiliary verbs: `isLoading`, `hasError`, `canDelete`
- Start each function name with a verb

### Functions

- Write short functions with a single purpose
- Use arrow functions for simple operations
- Use async/await consistently throughout the codebase
- Use the RO-RO pattern (Receive an Object, Return an Object) for multiple parameters
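The RO-RO bullet above can be sketched with a small, hypothetical `createPost` helper (all names here are illustrative, not part of any real API):

```typescript
// RO-RO: receive a single object parameter, return an object.
// Adding a new optional field later does not break existing call sites.
interface CreatePostInput {
  title: string;
  authorId: string;
  draft?: boolean;
}

interface CreatePostResult {
  id: string;
  title: string;
  draft: boolean;
}

function createPost({ title, authorId, draft = false }: CreatePostInput): CreatePostResult {
  // Illustrative id generation only
  return { id: `${authorId}-post`, title, draft };
}

const post = createPost({ title: 'Hello', authorId: 'u42' });
console.log(post); // → { id: 'u42-post', title: 'Hello', draft: false }
```

Compared to positional parameters, call sites stay self-documenting (`createPost({ title, authorId })` rather than `createPost('Hello', 'u42')`).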
### Types and Interfaces

- Prefer interfaces over types for object shapes
- Avoid enums; use maps or const objects instead
- Use Zod for runtime validation with inferred types
- Use `readonly` for immutable properties
- Use `import type` for type-only imports
## Fastify-Specific Guidelines

### Project Structure

```
src/
  routes/
    {resource}/
      index.ts
      handlers.ts
      schemas.ts
  plugins/
    auth.ts
    database.ts
    cors.ts
  services/
    {domain}Service.ts
  repositories/
    {entity}Repository.ts
  types/
    index.ts
  utils/
  config/
  app.ts
  server.ts
```
### Route Organization

- Organize routes by resource/domain
- Use route plugins for modular registration
- Define schemas alongside route handlers
- Use route prefixes for API versioning

```typescript
import type { FastifyPluginAsync } from 'fastify';

const usersRoutes: FastifyPluginAsync = async (fastify) => {
  fastify.get('/', { schema: listUsersSchema }, listUsersHandler);
  fastify.get('/:id', { schema: getUserSchema }, getUserHandler);
  fastify.post('/', { schema: createUserSchema }, createUserHandler);
  fastify.put('/:id', { schema: updateUserSchema }, updateUserHandler);
  fastify.delete('/:id', { schema: deleteUserSchema }, deleteUserHandler);
};

export default usersRoutes;
```
### Schema Validation with JSON Schema / Ajv

- Define JSON schemas for all request/response validation
- Use @sinclair/typebox for type-safe schema definitions
- Leverage Fastify's built-in Ajv integration

```typescript
import { Type, Static } from '@sinclair/typebox';

const UserSchema = Type.Object({
  id: Type.String({ format: 'uuid' }),
  name: Type.String({ minLength: 1 }),
  email: Type.String({ format: 'email' }),
  createdAt: Type.String({ format: 'date-time' }),
});

type User = Static<typeof UserSchema>;

const createUserSchema = {
  body: Type.Object({
    name: Type.String({ minLength: 1 }),
    email: Type.String({ format: 'email' }),
  }),
  response: {
    201: UserSchema,
    400: ErrorSchema,
  },
};
```
### Plugins and Decorators

- Use plugins for shared functionality
- Decorate the Fastify instance with services and utilities
- Register plugins with proper encapsulation

```typescript
import fp from 'fastify-plugin';
import { PrismaClient } from '@prisma/client';

const databasePlugin = fp(async (fastify) => {
  const prisma = new PrismaClient();

  await prisma.$connect();

  fastify.decorate('prisma', prisma);

  fastify.addHook('onClose', async () => {
    await prisma.$disconnect();
  });
});

export default databasePlugin;
```
### Prisma Integration

- Use Prisma as the ORM for database operations
- Create repository classes for data access
- Use transactions for complex operations

```typescript
class UserRepository {
  constructor(private prisma: PrismaClient) {}

  async findById(id: string): Promise<User | null> {
    return this.prisma.user.findUnique({ where: { id } });
  }

  async create(data: CreateUserInput): Promise<User> {
    return this.prisma.user.create({ data });
  }
}
```
### Error Handling

- Use Fastify's built-in error handling
- Create custom error classes for domain errors
- Return consistent error responses

```typescript
import { FastifyError } from 'fastify';

class NotFoundError extends Error implements FastifyError {
  code = 'NOT_FOUND';
  statusCode = 404;

  constructor(resource: string, id: string) {
    super(`${resource} with id ${id} not found`);
    this.name = 'NotFoundError';
  }
}

// Global error handler
fastify.setErrorHandler((error, request, reply) => {
  const statusCode = error.statusCode || 500;

  reply.status(statusCode).send({
    error: error.name,
    message: error.message,
    statusCode,
  });
});
```
### Testing with Jest

- Write unit tests for services and handlers
- Use integration tests for routes
- Mock external dependencies

```typescript
import type { FastifyInstance } from 'fastify';
import { build } from '../app';

describe('Users API', () => {
  let app: FastifyInstance;

  beforeAll(async () => {
    app = await build();
  });

  afterAll(async () => {
    await app.close();
  });

  it('should list users', async () => {
    const response = await app.inject({
      method: 'GET',
      url: '/api/users',
    });

    expect(response.statusCode).toBe(200);
    expect(JSON.parse(response.payload)).toBeInstanceOf(Array);
  });
});
```
### Performance

- Fastify is one of the fastest Node.js frameworks
- Use schema validation for automatic serialization optimization
- Enable verbose logging only when needed in production
- Use connection pooling for database connections
### Security

- Use @fastify/helmet for security headers
- Implement rate limiting with @fastify/rate-limit
- Use @fastify/cors for CORS configuration
- Validate all inputs with JSON Schema
- Use JWT for authentication with @fastify/jwt
1
.bg-shell/manifest.json
Normal file
[]
20
.claude/settings.local.json
Normal file
{
  "permissions": {
    "allow": [
      "Bash(MSYS_NO_PATHCONV=1 docker build -t tubearr-test:latest .)",
      "Bash(MSYS_NO_PATHCONV=1 docker compose up -d)",
      "Bash(MSYS_NO_PATHCONV=1 docker inspect --format='{{.State.Health.Status}}' tubearr 2>&1)",
      "Bash(MSYS_NO_PATHCONV=1 docker logs tubearr)",
      "Bash(MSYS_NO_PATHCONV=1 docker run --rm --entrypoint sh tubearr:latest -c \"ls /app/dist/config/ 2>&1\")",
      "Bash(MSYS_NO_PATHCONV=1 docker run --rm --entrypoint sh tubearr:latest -c \"cat /app/package.json | head -10\")",
      "Bash(MSYS_NO_PATHCONV=1 docker compose -f \"W:/programming/Projects/Tubearr/.gsd/worktrees/M001/docker-compose.yml\" down 2>&1)",
      "Bash(MSYS_NO_PATHCONV=1 docker compose build --no-cache)",
      "Bash(MSYS_NO_PATHCONV=1 docker compose build)",
      "Bash(MSYS_NO_PATHCONV=1 docker inspect --format='{{.State.Health.Status}}' tubearr)",
      "Bash(MSYS_NO_PATHCONV=1 docker inspect --format='{{json .State.Health}}' tubearr 2>&1)",
      "Bash(curl:*)",
      "Bash(python -c \"import sys,json; d=json.load\\(sys.stdin\\); print\\(f''''Channels: {len\\(d\\)}''''\\); [print\\(f'''' - {c[name]} \\({c[platform]}\\) monitoring={c.get\\(monitoringEnabled,?\\)} mode={c.get\\(monitoringMode,?\\)}''''\\) for c in d]\")",
      "Bash(python -c \":*)"
    ]
  }
}
518
.claude/skills/drizzle-migrations/SKILL.md
Normal file

@ -0,0 +1,518 @@
---
name: drizzle-migrations
description: "Migration-first database development workflow using Drizzle ORM for TypeScript/J..."
version: 1.0.0
tags: []
progressive_disclosure:
  entry_point:
    summary: "Migration-first database development workflow using Drizzle ORM for TypeScript/J..."
    when_to_use: "When working with drizzle-migrations or related functionality."
    quick_start: "1. Review the core concepts below. 2. Apply patterns to your use case. 3. Follow best practices for implementation."
---
# Drizzle ORM Database Migrations (TypeScript)

Migration-first database development workflow using Drizzle ORM for TypeScript/JavaScript projects.

## When to Use This Skill

Use this skill when:
- Working with Drizzle ORM in TypeScript/JavaScript projects
- Need to create or modify database schema
- Want migration-first development workflow
- Setting up new database tables or columns
- Need to ensure schema consistency across environments

## Core Principle: Migration-First Development

**Critical Rule**: Schema changes ALWAYS start with migrations, never code-first.

### Why Migration-First?
- ✅ SQL migrations are the single source of truth
- ✅ Prevents schema drift between environments
- ✅ Enables rollback and versioning
- ✅ Forces explicit schema design decisions
- ✅ TypeScript types generated from migrations
- ✅ CI/CD can validate schema changes

### Anti-Pattern (Code-First)
❌ **WRONG**: Writing TypeScript schema first
```typescript
// DON'T DO THIS FIRST
export const users = pgTable('users', {
  id: uuid('id').primaryKey(),
  email: text('email').notNull(),
});
```

### Correct Pattern (Migration-First)
✅ **CORRECT**: Write SQL migration first
```sql
-- drizzle/0001_add_users_table.sql
CREATE TABLE users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email TEXT NOT NULL UNIQUE,
  created_at TIMESTAMP DEFAULT NOW()
);
```

## Complete Migration Workflow

### Step 1: Design Schema in SQL Migration

Create a descriptive SQL migration file:

```sql
-- drizzle/0001_create_school_calendars.sql
CREATE TABLE school_calendars (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  school_id UUID NOT NULL REFERENCES schools(id) ON DELETE CASCADE,
  start_date DATE NOT NULL,
  end_date DATE NOT NULL,
  academic_year TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT NOW(),
  updated_at TIMESTAMP DEFAULT NOW()
);

-- Add indexes for query performance
CREATE INDEX idx_school_calendars_school_id ON school_calendars(school_id);
CREATE INDEX idx_school_calendars_academic_year ON school_calendars(academic_year);

-- Add constraints
ALTER TABLE school_calendars
  ADD CONSTRAINT check_date_range
  CHECK (end_date > start_date);
```

**Naming Convention**:
- Use sequential numbers: `0001_`, `0002_`, etc.
- Descriptive names: `create_school_calendars`, `add_user_roles`
- Format: `XXXX_descriptive_name.sql`
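As a quick illustration of that convention, a tiny helper can format the next filename. This is a hypothetical sketch for illustration only — Drizzle Kit does not ship such a function:

```typescript
// Hypothetical helper illustrating the XXXX_descriptive_name.sql convention.
function migrationFileName(seq: number, name: string): string {
  const prefix = String(seq).padStart(4, "0"); // 2 -> "0002"
  // Lowercase the description and collapse non-alphanumeric runs to "_".
  const slug = name.trim().toLowerCase().replace(/[^a-z0-9]+/g, "_");
  return `${prefix}_${slug}.sql`;
}

console.log(migrationFileName(2, "add user roles")); // 0002_add_user_roles.sql
```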
### Step 2: Generate TypeScript Definitions

Drizzle Kit generates TypeScript types from SQL:

```bash
# Generate TypeScript schema and snapshots
pnpm drizzle-kit generate

# Or using npm
npm run db:generate
```

**What This Creates**:
1. TypeScript schema files (if using `drizzle-kit push`)
2. Snapshot files in `drizzle/meta/XXXX_snapshot.json`
3. Migration metadata

### Step 3: Create Schema Snapshot

Snapshots enable schema drift detection:

```json
// drizzle/meta/0001_snapshot.json (auto-generated)
{
  "version": "5",
  "dialect": "postgresql",
  "tables": {
    "school_calendars": {
      "name": "school_calendars",
      "columns": {
        "id": {
          "name": "id",
          "type": "uuid",
          "primaryKey": true,
          "notNull": true,
          "default": "gen_random_uuid()"
        },
        "school_id": {
          "name": "school_id",
          "type": "uuid",
          "notNull": true
        }
      }
    }
  }
}
```

**Snapshots in Version Control**:
- ✅ Commit snapshots to git
- ✅ Enables drift detection in CI
- ✅ Documents schema history

### Step 4: Implement TypeScript Schema

Now write the TypeScript schema that mirrors the SQL migration:

```typescript
// src/lib/db/schema/school/calendar.ts
import { pgTable, uuid, date, text, timestamp } from 'drizzle-orm/pg-core';
import { schools } from './school';

export const schoolCalendars = pgTable('school_calendars', {
  id: uuid('id').primaryKey().defaultRandom(),
  schoolId: uuid('school_id')
    .notNull()
    .references(() => schools.id, { onDelete: 'cascade' }),
  startDate: date('start_date').notNull(),
  endDate: date('end_date').notNull(),
  academicYear: text('academic_year').notNull(),
  createdAt: timestamp('created_at').defaultNow(),
  updatedAt: timestamp('updated_at').defaultNow(),
});

// Type inference
export type SchoolCalendar = typeof schoolCalendars.$inferSelect;
export type NewSchoolCalendar = typeof schoolCalendars.$inferInsert;
```

**Key Points**:
- Column names match SQL exactly: `school_id` → `'school_id'`
- TypeScript property names use camelCase: `schoolId`
- Constraints and indexes are defined in SQL, not TypeScript
- Foreign keys reference other tables
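The snake_case → camelCase pairing above is a convention you apply by hand — Drizzle does not rename columns for you. A throwaway sketch of the mapping, for illustration only:

```typescript
// Illustrative only: shows the SQL-name -> TypeScript-property convention
// used above (e.g. 'school_id' -> schoolId). You still write both explicitly.
function toCamelCase(sqlName: string): string {
  return sqlName.replace(/_([a-z])/g, (_m, c: string) => c.toUpperCase());
}

console.log(toCamelCase("school_id"));     // schoolId
console.log(toCamelCase("academic_year")); // academicYear
```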
### Step 5: Organize Schemas by Domain

Structure schemas for maintainability:

```
src/lib/db/schema/
├── index.ts              # Export all schemas
├── school/
│   ├── index.ts
│   ├── district.ts
│   ├── holiday.ts
│   ├── school.ts
│   └── calendar.ts
├── providers.ts
├── cart.ts
└── users.ts
```

**index.ts** (export all):
```typescript
// src/lib/db/schema/index.ts
export * from './school';
export * from './providers';
export * from './cart';
export * from './users';
```

**school/index.ts**:
```typescript
// src/lib/db/schema/school/index.ts
export * from './district';
export * from './holiday';
export * from './school';
export * from './calendar';
```

### Step 6: Add Quality Check to CI

Validate schema consistency in CI/CD:

```yaml
# .github/workflows/quality.yml
name: Quality Checks

on:
  pull_request:
    branches: [main, develop]
  push:
    branches: [main]

jobs:
  quality:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'pnpm'

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Check database schema drift
        run: pnpm drizzle-kit check

      - name: Verify migrations (dry-run)
        run: pnpm drizzle-kit push --dry-run
        env:
          DATABASE_URL: ${{ secrets.STAGING_DATABASE_URL }}

      - name: Run type checking
        run: pnpm tsc --noEmit

      - name: Lint code
        run: pnpm lint
```

**CI Checks Explained**:
- `drizzle-kit check`: Validates snapshots match schema
- `drizzle-kit push --dry-run`: Tests migration without applying
- Type checking: Ensures TypeScript compiles
- Linting: Enforces code style

### Step 7: Test on Staging

Before production, test the migration on staging:

```bash
# 1. Run migration on staging
STAGING_DATABASE_URL="..." pnpm drizzle-kit push

# 2. Verify schema
pnpm drizzle-kit check

# 3. Test affected API routes
curl https://staging.example.com/api/schools/calendars

# 4. Check for data integrity issues
# Run queries to verify data looks correct

# 5. Monitor logs for errors
# Check application logs for migration-related errors
```

**Staging Checklist**:
- [ ] Migration runs without errors
- [ ] Schema drift check passes
- [ ] API routes using new schema work correctly
- [ ] No data integrity issues
- [ ] Application logs show no errors
- [ ] Query performance acceptable
## Common Migration Patterns

### Adding a Column

```sql
-- drizzle/0005_add_user_phone.sql
ALTER TABLE users
  ADD COLUMN phone TEXT;

-- Add index if querying by phone
CREATE INDEX idx_users_phone ON users(phone);
```

TypeScript:
```typescript
export const users = pgTable('users', {
  id: uuid('id').primaryKey(),
  email: text('email').notNull(),
  phone: text('phone'), // New column
});
```

### Creating a Junction Table

```sql
-- drizzle/0006_create_provider_specialties.sql
CREATE TABLE provider_specialties (
  provider_id UUID NOT NULL REFERENCES providers(id) ON DELETE CASCADE,
  specialty_id UUID NOT NULL REFERENCES specialties(id) ON DELETE CASCADE,
  PRIMARY KEY (provider_id, specialty_id)
);

CREATE INDEX idx_provider_specialties_provider ON provider_specialties(provider_id);
CREATE INDEX idx_provider_specialties_specialty ON provider_specialties(specialty_id);
```

TypeScript:
```typescript
import { pgTable, uuid, primaryKey } from 'drizzle-orm/pg-core';

export const providerSpecialties = pgTable('provider_specialties', {
  providerId: uuid('provider_id')
    .notNull()
    .references(() => providers.id, { onDelete: 'cascade' }),
  specialtyId: uuid('specialty_id')
    .notNull()
    .references(() => specialties.id, { onDelete: 'cascade' }),
}, (table) => ({
  pk: primaryKey({ columns: [table.providerId, table.specialtyId] }),
}));
```

### Modifying Column Type

```sql
-- drizzle/0007_change_price_to_decimal.sql
ALTER TABLE services
  ALTER COLUMN price TYPE DECIMAL(10, 2);
```

TypeScript:
```typescript
import { decimal } from 'drizzle-orm/pg-core';

export const services = pgTable('services', {
  id: uuid('id').primaryKey(),
  name: text('name').notNull(),
  price: decimal('price', { precision: 10, scale: 2 }).notNull(),
});
```

### Adding Constraints

```sql
-- drizzle/0008_add_email_constraint.sql
ALTER TABLE users
  ADD CONSTRAINT users_email_unique UNIQUE (email);

ALTER TABLE users
  ADD CONSTRAINT users_email_format CHECK (email ~* '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}$');
```
## Configuration

### drizzle.config.ts

```typescript
import type { Config } from 'drizzle-kit';

export default {
  schema: './src/lib/db/schema/index.ts',
  out: './drizzle',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

### package.json Scripts

```json
{
  "scripts": {
    "db:generate": "drizzle-kit generate",
    "db:push": "drizzle-kit push",
    "db:studio": "drizzle-kit studio",
    "db:check": "drizzle-kit check",
    "db:up": "drizzle-kit up"
  }
}
```
## Migration Testing Workflow

### Local Testing

```bash
# 1. Create migration
echo "CREATE TABLE test (...)" > drizzle/0009_test.sql

# 2. Generate TypeScript
pnpm db:generate

# 3. Push to local database
pnpm db:push

# 4. Verify schema
pnpm db:check

# 5. Test in application
pnpm dev
# Manually test affected features

# 6. Run tests
pnpm test
```

### Rollback Strategy

```sql
-- drizzle/0010_add_feature.sql (up migration)
CREATE TABLE new_feature (...);

-- drizzle/0010_add_feature_down.sql (down migration)
DROP TABLE new_feature;
```

Apply rollback:
```bash
# Manually run the down migration
psql $DATABASE_URL -f drizzle/0010_add_feature_down.sql
```
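The up/down pairing is purely a filename convention; a small helper can derive the down file from its up migration. This is an illustrative sketch, not a Drizzle Kit feature:

```typescript
// Hypothetical helper: derive the down-migration filename from an up
// migration, following the "<name>_down.sql" convention used above.
function downFileFor(upFile: string): string {
  if (!upFile.endsWith(".sql")) throw new Error("not a .sql migration");
  return upFile.replace(/\.sql$/, "_down.sql");
}

console.log(downFileFor("drizzle/0010_add_feature.sql"));
// drizzle/0010_add_feature_down.sql
```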
## Best Practices

### Do's
- ✅ Write SQL migrations first
- ✅ Use descriptive migration names
- ✅ Add indexes for foreign keys
- ✅ Include constraints in migrations
- ✅ Test migrations on staging before production
- ✅ Commit snapshots to version control
- ✅ Organize schemas by domain
- ✅ Use `drizzle-kit check` in CI

### Don'ts
- ❌ Never write TypeScript schema before SQL migration
- ❌ Don't skip staging testing
- ❌ Don't modify old migrations (create new ones)
- ❌ Don't forget to add indexes
- ❌ Don't use `drizzle-kit push` in production (use proper migrations)
- ❌ Don't commit generated files without snapshots
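The "don't modify old migrations" rule can be enforced mechanically. A sketch of a CI guard (hypothetical helper, not part of Drizzle Kit) that checks migration files form a contiguous, append-only sequence:

```typescript
// Hypothetical CI guard: verify migration files form a contiguous
// 0001_, 0002_, ... sequence, so old files can only be appended to.
function checkSequence(files: string[]): boolean {
  const seqs = files
    .map((f) => /^(\d{4})_/.exec(f)?.[1])       // extract the numeric prefix
    .filter((s): s is string => s !== undefined)
    .map(Number)
    .sort((a, b) => a - b);
  return seqs.every((n, i) => n === i + 1);      // 1, 2, 3, ... with no gaps
}

console.log(checkSequence(["0001_init.sql", "0002_add_users.sql"])); // true
console.log(checkSequence(["0001_init.sql", "0003_oops.sql"]));      // false
```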
## Troubleshooting

### Schema Drift Detected
**Error**: `Schema drift detected`

**Solution**:
```bash
# Check what changed
pnpm drizzle-kit check

# Regenerate snapshots
pnpm drizzle-kit generate

# Review changes and commit
git add drizzle/meta/
git commit -m "Update schema snapshots"
```

### Migration Fails on Staging
**Error**: Migration fails with a data constraint violation

**Solution**:
1. Roll back the migration
2. Create a data migration script
3. Run the data migration first
4. Then run the schema migration

```sql
-- First: Migrate data
UPDATE users SET status = 'active' WHERE status IS NULL;

-- Then: Add constraint
ALTER TABLE users
  ALTER COLUMN status SET NOT NULL;
```

### TypeScript Types Out of Sync
**Error**: TypeScript types don't match the database

**Solution**:
```bash
# Regenerate everything
pnpm db:generate
pnpm tsc --noEmit

# If still broken, check schema files
# Ensure column names match SQL exactly
```

## Related Skills

- `universal-data-database-migration` - Universal migration patterns
- `toolchains-typescript-data-drizzle` - Drizzle ORM usage patterns
- `toolchains-typescript-core` - TypeScript best practices
- `universal-debugging-verification-before-completion` - Verification workflows
396
.claude/skills/drizzle-orm/SKILL.md
Normal file

@ -0,0 +1,396 @@
---
name: drizzle-orm
description: "Type-safe SQL ORM for TypeScript with zero runtime overhead"
progressive_disclosure:
  entry_point:
    summary: "Type-safe SQL ORM for TypeScript with zero runtime overhead"
    when_to_use: "When working with drizzle-orm or related functionality."
    quick_start: "1. Review the core concepts below. 2. Apply patterns to your use case. 3. Follow best practices for implementation."
references:
  - advanced-schemas.md
  - performance.md
  - query-patterns.md
  - vs-prisma.md
---
# Drizzle ORM

Modern TypeScript-first ORM with zero dependencies, compile-time type safety, and SQL-like syntax. Optimized for edge runtimes and serverless environments.

## Quick Start

### Installation

```bash
# Core ORM
npm install drizzle-orm

# Database driver (choose one)
npm install pg              # PostgreSQL
npm install mysql2          # MySQL
npm install better-sqlite3  # SQLite

# Drizzle Kit (migrations)
npm install -D drizzle-kit
```

### Basic Setup

```typescript
// db/schema.ts
import { pgTable, serial, text, timestamp } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull().unique(),
  name: text('name').notNull(),
  createdAt: timestamp('created_at').defaultNow(),
});

// db/client.ts
import { drizzle } from 'drizzle-orm/node-postgres';
import { Pool } from 'pg';
import * as schema from './schema';

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
export const db = drizzle(pool, { schema });
```

### First Query

```typescript
import { db } from './db/client';
import { users } from './db/schema';
import { eq } from 'drizzle-orm';

// Insert
const newUser = await db.insert(users).values({
  email: 'user@example.com',
  name: 'John Doe',
}).returning();

// Select
const allUsers = await db.select().from(users);

// Where
const user = await db.select().from(users).where(eq(users.id, 1));

// Update
await db.update(users).set({ name: 'Jane Doe' }).where(eq(users.id, 1));

// Delete
await db.delete(users).where(eq(users.id, 1));
```
## Schema Definition

### Column Types Reference

| PostgreSQL | MySQL | SQLite | TypeScript |
|------------|-------|--------|------------|
| `serial()` | `serial()` | `integer()` | `number` |
| `text()` | `text()` | `text()` | `string` |
| `integer()` | `int()` | `integer()` | `number` |
| `boolean()` | `boolean()` | `integer()` | `boolean` |
| `timestamp()` | `datetime()` | `integer()` | `Date` |
| `json()` | `json()` | `text()` | `unknown` |
| `uuid()` | `varchar(36)` | `text()` | `string` |

### Common Schema Patterns

```typescript
import { pgTable, serial, text, varchar, integer, boolean, timestamp, json, unique } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }).notNull().unique(),
  passwordHash: varchar('password_hash', { length: 255 }).notNull(),
  role: text('role', { enum: ['admin', 'user', 'guest'] }).default('user'),
  metadata: json('metadata').$type<{ theme: string; locale: string }>(),
  isActive: boolean('is_active').default(true),
  createdAt: timestamp('created_at').defaultNow().notNull(),
  updatedAt: timestamp('updated_at').defaultNow().notNull(),
}, (table) => ({
  emailIdx: unique('email_unique_idx').on(table.email),
}));

// Infer TypeScript types
type User = typeof users.$inferSelect;
type NewUser = typeof users.$inferInsert;
```
## Relations

### One-to-Many

```typescript
import { pgTable, serial, text, integer } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const authors = pgTable('authors', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  authorId: integer('author_id').notNull().references(() => authors.id),
});

export const authorsRelations = relations(authors, ({ many }) => ({
  posts: many(posts),
}));

export const postsRelations = relations(posts, ({ one }) => ({
  author: one(authors, {
    fields: [posts.authorId],
    references: [authors.id],
  }),
}));

// Query with relations
const authorsWithPosts = await db.query.authors.findMany({
  with: { posts: true },
});
```

### Many-to-Many

```typescript
import { pgTable, serial, text, integer, primaryKey } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const groups = pgTable('groups', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
});

export const usersToGroups = pgTable('users_to_groups', {
  userId: integer('user_id').notNull().references(() => users.id),
  groupId: integer('group_id').notNull().references(() => groups.id),
}, (table) => ({
  pk: primaryKey({ columns: [table.userId, table.groupId] }),
}));

export const usersRelations = relations(users, ({ many }) => ({
  groups: many(usersToGroups),
}));

export const groupsRelations = relations(groups, ({ many }) => ({
  users: many(usersToGroups),
}));

export const usersToGroupsRelations = relations(usersToGroups, ({ one }) => ({
  user: one(users, { fields: [usersToGroups.userId], references: [users.id] }),
  group: one(groups, { fields: [usersToGroups.groupId], references: [groups.id] }),
}));
```
## Queries

### Filtering

```typescript
import { eq, ne, gt, gte, lt, lte, like, ilike, inArray, isNull, isNotNull, and, or, between } from 'drizzle-orm';

// Equality
await db.select().from(users).where(eq(users.email, 'user@example.com'));

// Comparison
await db.select().from(users).where(gt(users.id, 10));

// Pattern matching
await db.select().from(users).where(like(users.name, '%John%'));

// Multiple conditions
await db.select().from(users).where(
  and(
    eq(users.role, 'admin'),
    gt(users.createdAt, new Date('2024-01-01'))
  )
);

// IN clause
await db.select().from(users).where(inArray(users.id, [1, 2, 3]));

// NULL checks
await db.select().from(users).where(isNull(users.deletedAt));
```

### Joins

```typescript
import { eq, count } from 'drizzle-orm';

// Inner join
const innerJoined = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .innerJoin(posts, eq(users.id, posts.authorId));

// Left join
const leftJoined = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId));

// Multiple joins with aggregation
const postCounts = await db
  .select({
    authorName: authors.name,
    postCount: count(posts.id),
  })
  .from(authors)
  .leftJoin(posts, eq(authors.id, posts.authorId))
  .groupBy(authors.id);
```

### Pagination & Sorting

```typescript
import { desc, asc } from 'drizzle-orm';

// Order by
await db.select().from(users).orderBy(desc(users.createdAt));

// Limit & offset
await db.select().from(users).limit(10).offset(20);

// Pagination helper (page is zero-based)
function paginate(page: number, pageSize: number = 10) {
  return db.select().from(users)
    .limit(pageSize)
    .offset(page * pageSize);
}
```
## Transactions

```typescript
// Auto-rollback on error
await db.transaction(async (tx) => {
  await tx.insert(users).values({ email: 'user@example.com', name: 'John' });
  await tx.insert(posts).values({ title: 'First Post', authorId: 1 });
  // If any query fails, the entire transaction rolls back
});

// Manual control
await db.transaction(async (tx) => {
  const [user] = await tx.insert(users).values({ ... }).returning();

  if (!user) {
    tx.rollback();
    return;
  }

  await tx.insert(posts).values({ title: 'First Post', authorId: user.id });
});
```
## Migrations

### Drizzle Kit Configuration

```typescript
// drizzle.config.ts
import type { Config } from 'drizzle-kit';

export default {
  schema: './db/schema.ts',
  out: './drizzle',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

### Migration Workflow

```bash
# Generate migration
npx drizzle-kit generate

# View SQL
cat drizzle/0000_migration.sql

# Apply migration
npx drizzle-kit migrate

# Introspect existing database
npx drizzle-kit introspect

# Drizzle Studio (database GUI)
npx drizzle-kit studio
```

### Example Migration

```sql
-- drizzle/0000_initial.sql
CREATE TABLE IF NOT EXISTS "users" (
  "id" serial PRIMARY KEY NOT NULL,
  "email" varchar(255) NOT NULL,
  "name" text NOT NULL,
  "created_at" timestamp DEFAULT now() NOT NULL,
  CONSTRAINT "users_email_unique" UNIQUE("email")
);
```
## Navigation

### Detailed References

- **[🏗️ Advanced Schemas](./references/advanced-schemas.md)** - Custom types, composite keys, indexes, constraints, multi-tenant patterns. Load when designing complex database schemas.

- **[🔍 Query Patterns](./references/query-patterns.md)** - Subqueries, CTEs, raw SQL, prepared statements, batch operations. Load when optimizing queries or handling complex filtering.

- **[⚡ Performance](./references/performance.md)** - Connection pooling, query optimization, N+1 prevention, prepared statements, edge runtime integration. Load when scaling or optimizing database performance.

- **[🔄 vs Prisma](./references/vs-prisma.md)** - Feature comparison, migration guide, when to choose Drizzle over Prisma. Load when evaluating ORMs or migrating from Prisma.

## Red Flags

**Stop and reconsider if:**
- Using `any` or `unknown` for JSON columns without type annotation
- Building raw SQL strings without using `sql` template (SQL injection risk)
- Not using transactions for multi-step data modifications
- Fetching all rows without pagination in production queries
- Missing indexes on foreign keys or frequently queried columns
- Using `select()` without specifying columns for large tables
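Why the `sql` tagged template avoids injection: values interpolated into a tagged template never become raw SQL text — they are captured separately and bound as parameters. A conceptual sketch of the mechanism (NOT Drizzle's actual implementation):

```typescript
// Conceptual sketch: a tagged template that turns interpolations into
// bound parameters ($1, $2, ...) instead of splicing them into the SQL.
function sqlTag(strings: TemplateStringsArray, ...values: unknown[]) {
  const text = strings.reduce(
    (acc, s, i) => acc + s + (i < values.length ? `$${i + 1}` : ""),
    ""
  );
  return { text, params: values };
}

const email = "x' OR '1'='1";
const q = sqlTag`SELECT * FROM users WHERE email = ${email}`;
console.log(q.text);   // SELECT * FROM users WHERE email = $1
console.log(q.params); // the attacker string stays a bound parameter
```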
## Performance Benefits vs Prisma

| Metric | Drizzle | Prisma |
|--------|---------|--------|
| **Bundle Size** | ~35KB | ~230KB |
| **Cold Start** | ~10ms | ~250ms |
| **Query Speed** | Baseline | ~2-3x slower |
| **Memory** | ~10MB | ~50MB |
| **Type Generation** | Runtime inference | Build-time generation |

## Integration

- **typescript-core**: Type-safe schema inference with `satisfies`
- **nextjs-core**: Server Actions, Route Handlers, Middleware integration
- **Database Migration**: Safe schema evolution patterns

## Related Skills

When using Drizzle, these skills enhance your workflow:
- **prisma**: Alternative ORM comparison: Drizzle vs Prisma trade-offs
- **typescript**: Advanced TypeScript patterns for type-safe queries
- **nextjs**: Drizzle with Next.js Server Actions and API routes
- **sqlalchemy**: SQLAlchemy patterns for Python developers learning Drizzle

[Full documentation available in these skills if deployed in your bundle]
380
.claude/skills/drizzle-orm/references/advanced-schemas.md
Normal file

@ -0,0 +1,380 @@
# Advanced Schemas

Deep dive into complex schema patterns, custom types, and database-specific features in Drizzle ORM.

## Custom Column Types

### Enums

```typescript
import { pgEnum, pgTable, serial } from 'drizzle-orm/pg-core';

// PostgreSQL native enum
export const roleEnum = pgEnum('role', ['admin', 'user', 'guest']);

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  role: roleEnum('role').default('user'),
});

// MySQL/SQLite: Use text with constraints
import { mysqlTable, text } from 'drizzle-orm/mysql-core';

export const mysqlUsers = mysqlTable('users', {
  role: text('role', { enum: ['admin', 'user', 'guest'] }).default('user'),
});
```
### Custom JSON Types

```typescript
import { pgTable, serial, json } from 'drizzle-orm/pg-core';
import { eq } from 'drizzle-orm';
import { z } from 'zod';

// Type-safe JSON with Zod
const MetadataSchema = z.object({
  theme: z.enum(['light', 'dark']),
  locale: z.string(),
  notifications: z.boolean(),
});

type Metadata = z.infer<typeof MetadataSchema>;

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  metadata: json('metadata').$type<Metadata>(),
});

// Runtime validation
async function updateMetadata(userId: number, metadata: unknown) {
  const validated = MetadataSchema.parse(metadata);
  await db.update(users).set({ metadata: validated }).where(eq(users.id, userId));
}
```

### Arrays

```typescript
import { pgTable, serial, text } from 'drizzle-orm/pg-core';

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  tags: text('tags').array(),
});

// Query array columns
import { arrayContains, arrayContained } from 'drizzle-orm';

await db.select().from(posts).where(arrayContains(posts.tags, ['typescript', 'drizzle']));
```
|
||||
|
||||
## Indexes
|
||||
|
||||
### Basic Indexes
|
||||
|
||||
```typescript
|
||||
import { pgTable, serial, text, varchar, index, uniqueIndex } from 'drizzle-orm/pg-core';
|
||||
|
||||
export const users = pgTable('users', {
|
||||
id: serial('id').primaryKey(),
|
||||
email: varchar('email', { length: 255 }).notNull(),
|
||||
name: text('name'),
|
||||
city: text('city'),
|
||||
}, (table) => ({
|
||||
emailIdx: uniqueIndex('email_idx').on(table.email),
|
||||
nameIdx: index('name_idx').on(table.name),
|
||||
cityNameIdx: index('city_name_idx').on(table.city, table.name),
|
||||
}));
|
||||
```
|
||||
|
||||
### Partial Indexes

```typescript
import { pgTable, serial, varchar, timestamp, uniqueIndex } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }),
  deletedAt: timestamp('deleted_at'),
}, (table) => ({
  // Unique only among non-deleted rows
  activeEmailIdx: uniqueIndex('active_email_idx')
    .on(table.email)
    .where(sql`${table.deletedAt} IS NULL`),
}));
```

### Full-Text Search

```typescript
import { pgTable, serial, text, index } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  content: text('content').notNull(),
}, (table) => ({
  searchIdx: index('search_idx').using(
    'gin',
    sql`to_tsvector('english', ${table.title} || ' ' || ${table.content})`
  ),
}));

// Full-text search query
const results = await db.select().from(posts).where(
  sql`to_tsvector('english', ${posts.title} || ' ' || ${posts.content}) @@ plainto_tsquery('english', 'typescript orm')`
);
```

## Composite Keys

```typescript
import { pgTable, text, integer, primaryKey } from 'drizzle-orm/pg-core';

export const userPreferences = pgTable('user_preferences', {
  userId: integer('user_id').notNull(),
  key: text('key').notNull(),
  value: text('value').notNull(),
}, (table) => ({
  pk: primaryKey({ columns: [table.userId, table.key] }),
}));
```

## Check Constraints

```typescript
import { pgTable, serial, integer, check } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const products = pgTable('products', {
  id: serial('id').primaryKey(),
  price: integer('price').notNull(),
  discountPrice: integer('discount_price'),
}, (table) => ({
  priceCheck: check('price_check', sql`${table.price} > 0`),
  discountCheck: check('discount_check', sql`${table.discountPrice} < ${table.price}`),
}));
```

## Generated Columns

```typescript
import { pgTable, serial, text } from 'drizzle-orm/pg-core';
import { sql, SQL } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  firstName: text('first_name').notNull(),
  lastName: text('last_name').notNull(),
  fullName: text('full_name').generatedAlwaysAs(
    (): SQL => sql`${users.firstName} || ' ' || ${users.lastName}`,
    { mode: 'stored' }
  ),
});
```

## Multi-Tenant Patterns

### Row-Level Security (PostgreSQL)

```typescript
import { pgTable, serial, text, uuid } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const tenants = pgTable('tenants', {
  id: uuid('id').defaultRandom().primaryKey(),
  name: text('name').notNull(),
});

export const documents = pgTable('documents', {
  id: serial('id').primaryKey(),
  tenantId: uuid('tenant_id').notNull().references(() => tenants.id),
  title: text('title').notNull(),
  content: text('content'),
});

// Apply RLS policy (via migration SQL)
/*
ALTER TABLE documents ENABLE ROW LEVEL SECURITY;

CREATE POLICY tenant_isolation ON documents
  USING (tenant_id = current_setting('app.current_tenant_id')::uuid);
*/

// Set tenant context (SET cannot take bind parameters, so use set_config)
await db.execute(sql`SELECT set_config('app.current_tenant_id', ${tenantId}, false)`);
```

### Schema-Per-Tenant

```typescript
import { drizzle } from 'drizzle-orm/node-postgres';

// Create a schema-aware connection per tenant
function getTenantDb(tenantId: string) {
  const schemaName = `tenant_${tenantId}`;

  return drizzle(pool, {
    schema: {
      ...schema,
    },
    schemaPrefix: schemaName,
  });
}

// Use tenant-specific DB
const tenantDb = getTenantDb('tenant123');
await tenantDb.select().from(users);
```

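Whichever per-tenant approach is used, the schema name ends up interpolated into SQL (identifiers cannot be bound as parameters), so the tenant id must be validated first. A minimal sketch, assuming tenant ids are lowercase alphanumeric (the helper name is illustrative, not a drizzle API):

```typescript
// Validate the tenant id before building an identifier from it,
// since schema names are interpolated into SQL rather than bound.
function tenantSchemaName(tenantId: string): string {
  if (!/^[a-z0-9_]{1,48}$/.test(tenantId)) {
    throw new Error(`Invalid tenant id: ${tenantId}`);
  }
  return `tenant_${tenantId}`;
}
```

Reject-by-default validation like this keeps a hostile tenant id from ever reaching the SQL text.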
## Database-Specific Features

### PostgreSQL: JSONB Operations

```typescript
import { pgTable, serial, jsonb } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const settings = pgTable('settings', {
  id: serial('id').primaryKey(),
  config: jsonb('config').$type<Record<string, unknown>>(),
});

// JSONB operators
await db.select().from(settings).where(
  sql`${settings.config}->>'theme' = 'dark'`
);

// JSONB containment query
await db.select().from(settings).where(
  sql`${settings.config} @> '{"notifications": {"email": true}}'::jsonb`
);
```

### MySQL: Spatial Types

```typescript
import { mysqlTable, serial, geometry } from 'drizzle-orm/mysql-core';
import { sql } from 'drizzle-orm';

export const locations = mysqlTable('locations', {
  id: serial('id').primaryKey(),
  point: geometry('point', { type: 'point', srid: 4326 }),
});

// Spatial query: locations within 1km of a point (lat/lng from the request)
await db.select().from(locations).where(
  sql`ST_Distance_Sphere(${locations.point}, POINT(${lng}, ${lat})) < 1000`
);
```

### SQLite: FTS5

```typescript
import { sqliteTable, text } from 'drizzle-orm/sqlite-core';

export const documents = sqliteTable('documents', {
  title: text('title'),
  content: text('content'),
});

// Create FTS5 virtual table (via migration)
/*
CREATE VIRTUAL TABLE documents_fts USING fts5(title, content, content='documents');
*/
```

## Schema Versioning

### Migration Strategy

```typescript
// db/schema.ts
import { pgTable, serial, timestamp } from 'drizzle-orm/pg-core';
import { desc } from 'drizzle-orm';

export const schemaVersion = pgTable('schema_version', {
  version: serial('version').primaryKey(),
  appliedAt: timestamp('applied_at').defaultNow(),
});

// Track migrations
await db.insert(schemaVersion).values({ version: 1 });

// Check current version
const [currentVersion] = await db
  .select()
  .from(schemaVersion)
  .orderBy(desc(schemaVersion.version))
  .limit(1);
```

## Type Inference Helpers

```typescript
import { InferSelectModel, InferInsertModel } from 'drizzle-orm';
import { pgTable, serial, text } from 'drizzle-orm/pg-core';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull(),
  name: text('name'),
});

// Generate types from the schema
export type User = InferSelectModel<typeof users>;
export type NewUser = InferInsertModel<typeof users>;

// Partial updates
export type UserUpdate = Partial<NewUser>;

// Nested relation types
export type UserWithPosts = User & {
  posts: Post[];
};
```

## Best Practices

### Schema Organization

```typescript
// db/schema/users.ts
export const users = pgTable('users', { ... });
export const userRelations = relations(users, { ... });

// db/schema/posts.ts
export const posts = pgTable('posts', { ... });
export const postRelations = relations(posts, { ... });

// db/schema/index.ts
export * from './users';
export * from './posts';

// db/client.ts
import * as schema from './schema';
export const db = drizzle(pool, { schema });
```

### Naming Conventions

```typescript
// ✅ Good: consistent naming (camelCase in TS, snake_case in SQL)
export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  firstName: text('first_name'),
  createdAt: timestamp('created_at'),
});

// ❌ Bad: inconsistent naming
export const Users = pgTable('user', {
  ID: serial('userId').primaryKey(),
  first_name: text('firstname'),
});
```

### Default Values

```typescript
import { pgTable, serial, text, integer, timestamp, uuid } from 'drizzle-orm/pg-core';
import { sql } from 'drizzle-orm';

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  slug: text('slug').notNull(),
  views: integer('views').default(0),
  createdAt: timestamp('created_at').defaultNow(),
  updatedAt: timestamp('updated_at').default(sql`CURRENT_TIMESTAMP`),
  uuid: uuid('uuid').defaultRandom(),
});
```

`.claude/skills/drizzle-orm/references/performance.md` (new file, 594 lines)

# Performance Optimization

Connection pooling, query optimization, edge runtime integration, and performance best practices.

## Connection Pooling

### PostgreSQL (node-postgres)

```typescript
import { Pool } from 'pg';
import { drizzle } from 'drizzle-orm/node-postgres';

const pool = new Pool({
  host: process.env.DB_HOST,
  port: parseInt(process.env.DB_PORT || '5432'),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  max: 20,                        // Maximum pool size
  idleTimeoutMillis: 30000,       // Close idle clients after 30s
  connectionTimeoutMillis: 2000,  // Timeout for new connection attempts
});

export const db = drizzle(pool);

// Graceful shutdown
process.on('SIGTERM', async () => {
  await pool.end();
});
```

### MySQL (mysql2)

```typescript
import mysql from 'mysql2/promise';
import { drizzle } from 'drizzle-orm/mysql2';

const poolConnection = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  waitForConnections: true,
  connectionLimit: 10,
  maxIdle: 10,
  idleTimeout: 60000,
  queueLimit: 0,
  enableKeepAlive: true,
  keepAliveInitialDelay: 0,
});

export const db = drizzle(poolConnection);
```

### SQLite (better-sqlite3)

```typescript
import Database from 'better-sqlite3';
import { drizzle } from 'drizzle-orm/better-sqlite3';

const sqlite = new Database('sqlite.db', {
  readonly: false,
  fileMustExist: false,
  timeout: 5000,
  verbose: console.log, // Remove in production
});

// Performance pragmas
sqlite.pragma('journal_mode = WAL');
sqlite.pragma('synchronous = normal');
sqlite.pragma('cache_size = -64000'); // 64MB cache
sqlite.pragma('temp_store = memory');

export const db = drizzle(sqlite);

process.on('exit', () => sqlite.close());
```

## Query Optimization

### Select Only Needed Columns

```typescript
// ❌ Bad: fetch all columns
const allUsers = await db.select().from(users);

// ✅ Good: fetch only needed columns
const userSummaries = await db.select({
  id: users.id,
  email: users.email,
  name: users.name,
}).from(users);
```

### Use Indexes Effectively

```typescript
import { pgTable, serial, text, varchar, index } from 'drizzle-orm/pg-core';
import { and, eq } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: varchar('email', { length: 255 }).notNull(),
  city: text('city'),
  status: text('status'),
}, (table) => ({
  // Index frequently queried columns
  emailIdx: index('email_idx').on(table.email),

  // Composite index for common query patterns
  cityStatusIdx: index('city_status_idx').on(table.city, table.status),
}));

// This query can use the composite index
const activeUsersInNYC = await db.select()
  .from(users)
  .where(and(
    eq(users.city, 'NYC'),
    eq(users.status, 'active')
  ));
```

### Analyze Query Plans

```typescript
import { sql } from 'drizzle-orm';

// PostgreSQL EXPLAIN
const plan = await db.execute(
  sql`EXPLAIN ANALYZE SELECT * FROM ${users} WHERE ${users.email} = 'user@example.com'`
);

console.log(plan.rows);

// Check for:
// - "Seq Scan" (bad) vs "Index Scan" (good)
// - Actual time vs estimated time
// - Rows removed by filter
```

### Pagination Performance

```typescript
import { gt, lt, asc, desc } from 'drizzle-orm';

// ❌ Bad: OFFSET on large datasets (gets slower as the offset grows)
const offsetPage = await db.select()
  .from(users)
  .limit(20)
  .offset(10000); // Scans 10,020 rows!

// ✅ Good: cursor-based pagination (constant time)
const cursorPage = await db.select()
  .from(users)
  .where(gt(users.id, lastSeenId))
  .orderBy(asc(users.id))
  .limit(20);

// ✅ Good: seek method for timestamp-based pagination
const olderPosts = await db.select()
  .from(posts)
  .where(lt(posts.createdAt, lastSeenTimestamp))
  .orderBy(desc(posts.createdAt))
  .limit(20);
```

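Cursor-based pagination usually hands clients an opaque token encoding the last-seen sort key rather than a raw id, so the key's structure can change without breaking callers. A minimal sketch of such a token codec (helper names and the cursor shape are illustrative, not drizzle APIs):

```typescript
// Opaque cursor tokens for keyset pagination: base64url-encode the
// last-seen sort key so clients cannot depend on or tamper with it casually.
interface Cursor {
  id: number;
  createdAt: string; // ISO timestamp of the last row on the previous page
}

function encodeCursor(cursor: Cursor): string {
  return Buffer.from(JSON.stringify(cursor)).toString('base64url');
}

function decodeCursor(token: string): Cursor {
  const parsed = JSON.parse(Buffer.from(token, 'base64url').toString('utf8'));
  if (typeof parsed.id !== 'number' || typeof parsed.createdAt !== 'string') {
    throw new Error('Malformed cursor token');
  }
  return parsed;
}
```

The decoded `id`/`createdAt` then feed directly into the `gt`/`lt` conditions shown above.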
## Edge Runtime Integration

### Cloudflare Workers (D1)

```typescript
import { drizzle } from 'drizzle-orm/d1';
import { users } from './schema'; // table definitions live in the app's schema module

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const db = drizzle(env.DB);

    const rows = await db.select().from(users).limit(10);

    return Response.json(rows);
  },
};
```

### Vercel Edge (Neon)

```typescript
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';
import { users } from './schema';

export const runtime = 'edge';

export async function GET() {
  const sql = neon(process.env.DATABASE_URL!);
  const db = drizzle(sql);

  const rows = await db.select().from(users);

  return Response.json(rows);
}
```

### Supabase Edge Functions

```typescript
import { drizzle } from 'drizzle-orm/postgres-js';
import postgres from 'postgres';
import { users } from './schema.ts';

Deno.serve(async (req) => {
  const client = postgres(Deno.env.get('DATABASE_URL')!);
  const db = drizzle(client);

  const data = await db.select().from(users);

  return new Response(JSON.stringify(data), {
    headers: { 'Content-Type': 'application/json' },
  });
});
```

## Caching Strategies

### In-Memory Cache

```typescript
import { LRUCache } from 'lru-cache';
import { eq } from 'drizzle-orm';

const cache = new LRUCache<string, any>({
  max: 500,
  ttl: 1000 * 60 * 5, // 5 minutes
});

async function getCachedUser(id: number) {
  const key = `user:${id}`;
  const cached = cache.get(key);

  if (cached) return cached;

  const [user] = await db.select().from(users).where(eq(users.id, id));
  cache.set(key, user);

  return user;
}
```

### Redis Cache Layer

```typescript
import { Redis } from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

async function getCachedData<T>(
  key: string,
  fetcher: () => Promise<T>,
  ttl: number = 300
): Promise<T> {
  // Try cache first
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);

  // Fetch from database
  const data = await fetcher();

  // Store in cache
  await redis.setex(key, ttl, JSON.stringify(data));

  return data;
}

// Usage
const allUsers = await getCachedData(
  'users:all',
  () => db.select().from(users),
  600
);
```

### Materialized Views (PostgreSQL)

```typescript
import { pgMaterializedView } from 'drizzle-orm/pg-core';
import { eq, sql } from 'drizzle-orm';

// Create materialized view (via migration)
/*
CREATE MATERIALIZED VIEW user_stats AS
SELECT
  u.id,
  u.name,
  COUNT(p.id) AS post_count,
  COUNT(c.id) AS comment_count
FROM users u
LEFT JOIN posts p ON p.author_id = u.id
LEFT JOIN comments c ON c.user_id = u.id
GROUP BY u.id;

CREATE UNIQUE INDEX ON user_stats (id);
*/

// Define schema
export const userStats = pgMaterializedView('user_stats').as((qb) =>
  qb.select({
    id: users.id,
    name: users.name,
    postCount: sql<number>`COUNT(${posts.id})`,
    commentCount: sql<number>`COUNT(${comments.id})`,
  })
    .from(users)
    .leftJoin(posts, eq(posts.authorId, users.id))
    .leftJoin(comments, eq(comments.userId, users.id))
    .groupBy(users.id)
);

// Refresh materialized view
await db.execute(sql`REFRESH MATERIALIZED VIEW CONCURRENTLY user_stats`);

// Query materialized view (fast!)
const stats = await db.select().from(userStats);
```

## Batch Operations Optimization

### Batch Insert with COPY (PostgreSQL)

```typescript
import { from as copyFrom } from 'pg-copy-streams';
import { pipeline } from 'stream/promises';
import { Readable } from 'stream';

// ~10x faster than batched INSERTs for large datasets
async function bulkInsert(data: { email: string; name: string }[]) {
  const client = await pool.connect();

  try {
    const stream = client.query(
      copyFrom(`COPY users (email, name) FROM STDIN WITH (FORMAT csv)`)
    );

    // Note: naive CSV encoding; escape commas/quotes in real data
    const input = Readable.from(
      data.map(row => `${row.email},${row.name}\n`)
    );

    await pipeline(input, stream);
  } finally {
    client.release();
  }
}
```

### Chunk Processing

```typescript
import { eq } from 'drizzle-orm';

function* chunked<T>(array: T[], size: number) {
  for (let i = 0; i < array.length; i += size) {
    yield array.slice(i, i + size);
  }
}

async function bulkUpdate(updates: { id: number; name: string }[]) {
  for (const chunk of chunked(updates, 100)) {
    await db.transaction(async (tx) => {
      for (const update of chunk) {
        await tx.update(users)
          .set({ name: update.name })
          .where(eq(users.id, update.id));
      }
    });
  }
}
```

## Connection Management

### Serverless Optimization

```typescript
// ❌ Bad: new pool per invocation
export async function handler() {
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });
  const db = drizzle(pool);

  const rows = await db.select().from(users);

  await pool.end();
  return rows;
}

// ✅ Good: reuse the connection across warm starts
let cachedDb: ReturnType<typeof drizzle> | null = null;

export async function handler() {
  if (!cachedDb) {
    const pool = new Pool({
      connectionString: process.env.DATABASE_URL,
      max: 1, // Serverless: single connection per instance
    });
    cachedDb = drizzle(pool);
  }

  const rows = await cachedDb.select().from(users);
  return rows;
}
```

### HTTP-based Databases (Neon, Turso)

```typescript
// No connection pooling needed - uses HTTP
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';

const sql = neon(process.env.DATABASE_URL!);
const db = drizzle(sql);

// Each query is a single HTTP request
const rows = await db.select().from(users);
```

## Read Replicas

```typescript
import { Pool } from 'pg';
import { drizzle } from 'drizzle-orm/node-postgres';

// Primary (writes)
const primaryPool = new Pool({ connectionString: process.env.PRIMARY_DB_URL });
const primaryDb = drizzle(primaryPool);

// Replica (reads)
const replicaPool = new Pool({ connectionString: process.env.REPLICA_DB_URL });
const replicaDb = drizzle(replicaPool);

// Route queries appropriately
async function getUsers() {
  return replicaDb.select().from(users); // Read from replica
}

async function createUser(data: NewUser) {
  return primaryDb.insert(users).values(data).returning(); // Write to primary
}
```

## Monitoring & Profiling

### Query Logging

```typescript
import { drizzle } from 'drizzle-orm/node-postgres';

const db = drizzle(pool, {
  logger: {
    logQuery(query: string, params: unknown[]) {
      console.log('Query:', query);
      console.log('Params:', params);
    },
  },
});

// Custom logger with metrics
class MetricsLogger {
  private queries: Map<string, { count: number; totalTime: number }> = new Map();

  logQuery(query: string) {
    const start = Date.now();

    // Call the returned function when the query completes
    return () => {
      const duration = Date.now() - start;
      const stats = this.queries.get(query) || { count: 0, totalTime: 0 };

      this.queries.set(query, {
        count: stats.count + 1,
        totalTime: stats.totalTime + duration,
      });

      if (duration > 1000) {
        console.warn(`Slow query (${duration}ms):`, query);
      }
    };
  }

  getStats() {
    return Array.from(this.queries.entries()).map(([query, stats]) => ({
      query,
      count: stats.count,
      avgTime: stats.totalTime / stats.count,
    }));
  }
}
```

### Performance Monitoring

```typescript
import { performance } from 'perf_hooks';

async function measureQuery<T>(
  name: string,
  query: Promise<T>
): Promise<T> {
  const start = performance.now();

  try {
    const result = await query;
    const duration = performance.now() - start;

    console.log(`[${name}] completed in ${duration.toFixed(2)}ms`);

    return result;
  } catch (error) {
    const duration = performance.now() - start;
    console.error(`[${name}] failed after ${duration.toFixed(2)}ms`, error);
    throw error;
  }
}

// Usage
const result = await measureQuery(
  'fetchUsers',
  db.select().from(users).limit(100)
);
```

||||
## Database-Specific Optimizations
|
||||
|
||||
### PostgreSQL
|
||||
|
||||
```typescript
|
||||
// Connection optimization
|
||||
const pool = new Pool({
|
||||
max: 20,
|
||||
application_name: 'myapp',
|
||||
statement_timeout: 30000, // 30s query timeout
|
||||
query_timeout: 30000,
|
||||
connectionTimeoutMillis: 5000,
|
||||
idle_in_transaction_session_timeout: 10000,
|
||||
});
|
||||
|
||||
// Session optimization
|
||||
await db.execute(sql`SET work_mem = '256MB'`);
|
||||
await db.execute(sql`SET maintenance_work_mem = '512MB'`);
|
||||
await db.execute(sql`SET effective_cache_size = '4GB'`);
|
||||
```
|
||||
|
||||
### MySQL

```typescript
const pool = mysql.createPool({
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  enableKeepAlive: true,
  keepAliveInitialDelay: 0,
  dateStrings: false,
  supportBigNumbers: true,
  bigNumberStrings: false,
  multipleStatements: false, // Security: prevent stacked queries
  timezone: 'Z', // UTC
});
```

### SQLite

```typescript
// WAL mode for concurrent reads
sqlite.pragma('journal_mode = WAL');

// Optimize for performance
sqlite.pragma('synchronous = NORMAL');
sqlite.pragma('cache_size = -64000'); // 64MB
sqlite.pragma('temp_store = MEMORY');
sqlite.pragma('mmap_size = 30000000000'); // 30GB mmap

// Use a single transaction for bulk inserts
const stmt = sqlite.prepare('INSERT INTO users (email, name) VALUES (?, ?)');

const insertMany = sqlite.transaction((rows) => {
  for (const row of rows) {
    stmt.run(row.email, row.name);
  }
});

insertMany(users); // 100x faster than individual inserts
```

## Best Practices Summary

1. **Always use connection pooling** in long-running processes
2. **Select only needed columns** to reduce network transfer
3. **Add indexes** on frequently queried columns and foreign keys
4. **Use cursor-based pagination** instead of OFFSET for large datasets
5. **Batch operations** when inserting/updating multiple records
6. **Cache expensive queries** with appropriate TTL
7. **Monitor slow queries** and optimize with EXPLAIN ANALYZE
8. **Use prepared statements** for frequently executed queries
9. **Implement read replicas** for high-traffic read operations
10. **Use HTTP-based databases** (Neon, Turso) for edge/serverless

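For point 5, note that PostgreSQL caps a single statement at 65535 bound parameters, so the safe batch size for a multi-row INSERT depends on the column count. A small sketch of computing it (helper names are illustrative, not drizzle APIs):

```typescript
// A multi-row INSERT of R rows with C bound columns uses R * C parameters,
// which must stay at or below PostgreSQL's 65535-parameter statement limit.
const PG_MAX_PARAMS = 65535;

function maxRowsPerBatch(maxParams: number, columnsPerRow: number): number {
  if (columnsPerRow <= 0) throw new Error('columnsPerRow must be positive');
  return Math.floor(maxParams / columnsPerRow);
}

function toBatches<T>(rows: T[], columnsPerRow: number): T[][] {
  const size = maxRowsPerBatch(PG_MAX_PARAMS, columnsPerRow);
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```

Each batch can then be handed to a single `db.insert(...).values(batch)` call without tripping the parameter limit.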
`.claude/skills/drizzle-orm/references/query-patterns.md` (new file, 577 lines)

# Query Patterns

Advanced querying techniques, subqueries, CTEs, and raw SQL in Drizzle ORM.

## Subqueries

### SELECT Subqueries

```typescript
import { sql, eq, gt, avg } from 'drizzle-orm';

// Scalar subquery: products priced above the average
const avgPrice = db.select({ value: avg(products.price) }).from(products);

const expensiveProducts = await db
  .select()
  .from(products)
  .where(gt(products.price, avgPrice));

// Correlated subquery
const authorsWithPostCount = await db
  .select({
    author: authors,
    postCount: sql<number>`(
      SELECT COUNT(*)
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`,
  })
  .from(authors);
```

### EXISTS Subqueries

```typescript
import { sql } from 'drizzle-orm';

// Find authors with posts
const authorsWithPosts = await db
  .select()
  .from(authors)
  .where(
    sql`EXISTS (
      SELECT 1
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`
  );

// Find authors without posts
const authorsWithoutPosts = await db
  .select()
  .from(authors)
  .where(
    sql`NOT EXISTS (
      SELECT 1
      FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
    )`
  );
```

### IN Subqueries

```typescript
import { sql } from 'drizzle-orm';

// Find users who have commented
const usersWhoCommented = await db
  .select()
  .from(users)
  .where(
    sql`${users.id} IN (
      SELECT DISTINCT ${comments.userId}
      FROM ${comments}
    )`
  );
```

## Common Table Expressions (CTEs)

### Basic CTE

```typescript
import { sql, eq } from 'drizzle-orm';

const topAuthors = db.$with('top_authors').as(
  db.select({
    id: authors.id,
    name: authors.name,
    postCount: sql<number>`COUNT(${posts.id})`.as('post_count'),
  })
    .from(authors)
    .leftJoin(posts, eq(authors.id, posts.authorId))
    .groupBy(authors.id)
    .having(sql`COUNT(${posts.id}) > 10`)
);

const result = await db
  .with(topAuthors)
  .select()
  .from(topAuthors);
```

### Recursive CTE

```typescript
import { pgTable, serial, text, integer, AnyPgColumn } from 'drizzle-orm/pg-core';
import { sql, isNull } from 'drizzle-orm';

// Organizational hierarchy
export const employees = pgTable('employees', {
  id: serial('id').primaryKey(),
  name: text('name').notNull(),
  managerId: integer('manager_id').references((): AnyPgColumn => employees.id),
});

const employeeHierarchy = db.$with('employee_hierarchy').as(
  db.select({
    id: employees.id,
    name: employees.name,
    managerId: employees.managerId,
    level: sql<number>`1`.as('level'),
  })
    .from(employees)
    .where(isNull(employees.managerId))
    .unionAll(
      db.select({
        id: employees.id,
        name: employees.name,
        managerId: employees.managerId,
        level: sql<number>`employee_hierarchy.level + 1`,
      })
        .from(employees)
        .innerJoin(
          sql`employee_hierarchy`,
          sql`${employees.managerId} = employee_hierarchy.id`
        )
    )
);

const hierarchy = await db
  .with(employeeHierarchy)
  .select()
  .from(employeeHierarchy);
```

### Multiple CTEs

```typescript
import { eq, gt, sql } from 'drizzle-orm';

const activeUsers = db.$with('active_users').as(
  db.select().from(users).where(eq(users.isActive, true))
);

const recentPosts = db.$with('recent_posts').as(
  db.select().from(posts).where(gt(posts.createdAt, sql`NOW() - INTERVAL '30 days'`))
);

const result = await db
  .with(activeUsers, recentPosts)
  .select({
    user: activeUsers,
    post: recentPosts,
  })
  .from(activeUsers)
  .leftJoin(recentPosts, eq(activeUsers.id, recentPosts.authorId));
```

## Raw SQL

### Safe Raw Queries

```typescript
import { sql } from 'drizzle-orm';

// Parameterized query (safe from SQL injection)
const userId = 123;
const user = await db.execute(
  sql`SELECT * FROM ${users} WHERE ${users.id} = ${userId}`
);

// Raw SQL with a typed result shape
const result = await db.execute<{ count: number }>(
  sql`SELECT COUNT(*) as count FROM ${users}`
);
```

### SQL Template Composition

```typescript
import { sql } from 'drizzle-orm';

// Reusable SQL fragments
function whereActive() {
  return sql`${users.isActive} = true`;
}

function whereRole(role: string) {
  return sql`${users.role} = ${role}`;
}

// Compose fragments
const admins = await db
  .select()
  .from(users)
  .where(sql`${whereActive()} AND ${whereRole('admin')}`);
```

### Dynamic WHERE Clauses

```typescript
import { and, eq, like, SQL } from 'drizzle-orm';

interface Filters {
  name?: string;
  role?: string;
  isActive?: boolean;
}

function buildFilters(filters: Filters): SQL | undefined {
  const conditions: SQL[] = [];

  if (filters.name) {
    conditions.push(like(users.name, `%${filters.name}%`));
  }

  if (filters.role) {
    conditions.push(eq(users.role, filters.role));
  }

  if (filters.isActive !== undefined) {
    conditions.push(eq(users.isActive, filters.isActive));
  }

  return conditions.length > 0 ? and(...conditions) : undefined;
}

// Usage
const filters: Filters = { name: 'John', isActive: true };
const matchingUsers = await db
  .select()
  .from(users)
  .where(buildFilters(filters));
```

## Aggregations

### Basic Aggregates

```typescript
import { count, sum, avg, min, max, sql } from 'drizzle-orm';

// Count
const userCount = await db.select({ count: count() }).from(users);

// Sum
const totalRevenue = await db.select({ total: sum(orders.amount) }).from(orders);

// Average
const avgPrice = await db.select({ avg: avg(products.price) }).from(products);

// Multiple aggregates
const stats = await db
  .select({
    count: count(),
    total: sum(orders.amount),
    avg: avg(orders.amount),
    min: min(orders.amount),
    max: max(orders.amount),
  })
  .from(orders);
```

### GROUP BY with HAVING

```typescript
// Authors with more than 5 posts
const prolificAuthors = await db
  .select({
    author: authors.name,
    postCount: count(posts.id),
  })
  .from(authors)
  .leftJoin(posts, eq(authors.id, posts.authorId))
  .groupBy(authors.id)
  .having(sql`COUNT(${posts.id}) > 5`);
```
### Window Functions

```typescript
// Rank products by price within category
const rankedProducts = await db
  .select({
    product: products,
    priceRank: sql<number>`RANK() OVER (PARTITION BY ${products.categoryId} ORDER BY ${products.price} DESC)`,
  })
  .from(products);

// Running total
const ordersWithRunningTotal = await db
  .select({
    order: orders,
    runningTotal: sql<number>`SUM(${orders.amount}) OVER (ORDER BY ${orders.createdAt})`,
  })
  .from(orders);

// Row number
const numberedUsers = await db
  .select({
    user: users,
    rowNum: sql<number>`ROW_NUMBER() OVER (ORDER BY ${users.createdAt})`,
  })
  .from(users);
```
## Prepared Statements

### Reusable Queries

```typescript
// Prepare once, execute many times
const getUserById = db
  .select()
  .from(users)
  .where(eq(users.id, sql.placeholder('id')))
  .prepare('get_user_by_id');

// Execute with different parameters
const user1 = await getUserById.execute({ id: 1 });
const user2 = await getUserById.execute({ id: 2 });

// Complex prepared statement
const searchUsers = db
  .select()
  .from(users)
  .where(
    and(
      like(users.name, sql.placeholder('name')),
      eq(users.role, sql.placeholder('role'))
    )
  )
  .prepare('search_users');

const admins = await searchUsers.execute({ name: '%John%', role: 'admin' });
```
## Batch Operations

### Batch Insert

```typescript
// Insert multiple rows
const newUsers = await db.insert(users).values([
  { email: 'user1@example.com', name: 'User 1' },
  { email: 'user2@example.com', name: 'User 2' },
  { email: 'user3@example.com', name: 'User 3' },
]).returning();

// Batch with onConflictDoNothing
await db.insert(users).values(bulkUsers).onConflictDoNothing();

// Batch with onConflictDoUpdate (upsert)
await db.insert(users)
  .values(bulkUsers)
  .onConflictDoUpdate({
    target: users.email,
    set: { name: sql`EXCLUDED.name` },
  });
```

### Batch Update

```typescript
// Update multiple specific rows
await db.transaction(async (tx) => {
  for (const update of updates) {
    await tx.update(users)
      .set({ name: update.name })
      .where(eq(users.id, update.id));
  }
});

// Bulk update with CASE
await db.execute(sql`
  UPDATE ${users}
  SET ${users.role} = CASE ${users.id}
    ${sql.join(
      updates.map((u) => sql`WHEN ${u.id} THEN ${u.role}`),
      sql.raw(' ')
    )}
  END
  WHERE ${users.id} IN (${sql.join(updates.map((u) => u.id), sql.raw(', '))})
`);
```

### Batch Delete

```typescript
// Delete multiple IDs
await db.delete(users).where(inArray(users.id, [1, 2, 3, 4, 5]));

// Conditional batch delete
await db.delete(posts).where(
  and(
    lt(posts.createdAt, new Date('2023-01-01')),
    eq(posts.isDraft, true)
  )
);
```
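Very large batches can also exceed the driver's bind-parameter limit (PostgreSQL caps a single statement at 65535 parameters, i.e. rows × columns). A small chunking helper is a common companion to the batch inserts above; the helper below is a sketch, and the chunk size in the commented usage is an illustrative assumption:

```typescript
// Split an array into fixed-size chunks so each INSERT stays under
// the driver's bind-parameter limit.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Hypothetical usage with the bulkUsers array from above:
// for (const batch of chunk(bulkUsers, 1000)) {
//   await db.insert(users).values(batch).onConflictDoNothing();
// }
```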
## LATERAL Joins

```typescript
// Get top 3 posts for each author
const authorsWithTopPosts = await db
  .select({
    author: authors,
    post: posts,
  })
  .from(authors)
  .leftJoin(
    sql`LATERAL (
      SELECT * FROM ${posts}
      WHERE ${posts.authorId} = ${authors.id}
      ORDER BY ${posts.views} DESC
      LIMIT 3
    ) AS ${posts}`,
    sql`true`
  );
```

## UNION Queries

```typescript
// Combine results from multiple queries
const allContent = await db
  .select({ id: posts.id, title: posts.title, type: sql<string>`'post'` })
  .from(posts)
  .union(
    db.select({ id: articles.id, title: articles.title, type: sql<string>`'article'` })
      .from(articles)
  );

// UNION ALL (includes duplicates)
const allItems = await db
  .select({ id: products.id, name: products.name })
  .from(products)
  .unionAll(
    db.select({ id: services.id, name: services.name }).from(services)
  );
```

## Distinct Queries

```typescript
// DISTINCT
const uniqueRoles = await db.selectDistinct({ role: users.role }).from(users);

// DISTINCT ON (PostgreSQL)
const latestPostPerAuthor = await db
  .selectDistinctOn([posts.authorId], {
    post: posts,
  })
  .from(posts)
  .orderBy(posts.authorId, desc(posts.createdAt));
```
## Locking Strategies

```typescript
// FOR UPDATE (pessimistic locking)
await db.transaction(async (tx) => {
  const [user] = await tx
    .select()
    .from(users)
    .where(eq(users.id, userId))
    .for('update');

  // Critical section - user row is locked
  await tx.update(users)
    .set({ balance: user.balance - amount })
    .where(eq(users.id, userId));
});

// FOR SHARE (shared lock) - select() returns an array, so destructure
const [user] = await db
  .select()
  .from(users)
  .where(eq(users.id, userId))
  .for('share');

// SKIP LOCKED
const [availableTask] = await db
  .select()
  .from(tasks)
  .where(eq(tasks.status, 'pending'))
  .limit(1)
  .for('update', { skipLocked: true });
```
## Query Builder Patterns

### Type-Safe Query Builder

```typescript
class UserQueryBuilder {
  // Accumulate clauses and build once in execute(); reassigning a Drizzle
  // query after each .where() call does not type-check without .$dynamic(),
  // and repeated .where() calls would not combine conditions
  private conditions: SQL[] = [];
  private orderings: SQL[] = [];

  whereRole(role: string) {
    this.conditions.push(eq(users.role, role));
    return this;
  }

  whereActive() {
    this.conditions.push(eq(users.isActive, true));
    return this;
  }

  orderByCreated() {
    this.orderings.push(desc(users.createdAt));
    return this;
  }

  async execute() {
    let query = db.select().from(users).$dynamic();
    if (this.conditions.length > 0) {
      query = query.where(and(...this.conditions));
    }
    if (this.orderings.length > 0) {
      query = query.orderBy(...this.orderings);
    }
    return await query;
  }
}

// Usage
const admins = await new UserQueryBuilder()
  .whereRole('admin')
  .whereActive()
  .orderByCreated()
  .execute();
```
## Best Practices

### Avoid N+1 Queries

```typescript
// ❌ Bad: N+1 query (one query per author; also avoid shadowing the table name)
const allAuthors = await db.select().from(authors);
for (const author of allAuthors) {
  const authorPosts = await db.select().from(posts).where(eq(posts.authorId, author.id));
}

// ✅ Good: Single query with join
const authorsWithPosts = await db.query.authors.findMany({
  with: { posts: true },
});

// ✅ Good: Dataloader pattern
import DataLoader from 'dataloader';

const postLoader = new DataLoader(async (authorIds: readonly number[]) => {
  const allPosts = await db.select().from(posts).where(inArray(posts.authorId, [...authorIds]));

  const grouped = authorIds.map(id =>
    allPosts.filter(post => post.authorId === id)
  );

  return grouped;
});
```
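One refinement to the DataLoader batch function above: the filter-per-id grouping scans the full result set once per author, which is O(N×M). A single-pass Map keyed by `authorId` does the same grouping in O(N+M). A sketch, with a plain object type standing in for the post rows:

```typescript
interface PostRow { id: number; authorId: number }

// Group rows by authorId in one pass; DataLoader requires the result
// array to be in the same order as the input keys, with [] for misses
function groupByAuthor(authorIds: number[], rows: PostRow[]): PostRow[][] {
  const byAuthor = new Map<number, PostRow[]>();
  for (const row of rows) {
    const bucket = byAuthor.get(row.authorId);
    if (bucket) bucket.push(row);
    else byAuthor.set(row.authorId, [row]);
  }
  return authorIds.map((id) => byAuthor.get(id) ?? []);
}
```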
### Query Timeouts

```typescript
// PostgreSQL statement timeout
await db.execute(sql`SET statement_timeout = '5s'`);

// Per-query timeout; the timer is cleared so it cannot keep the process alive
const withTimeout = async <T>(promise: Promise<T>, ms: number): Promise<T> => {
  let timer: NodeJS.Timeout;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('Query timeout')), ms);
  });
  try {
    return await Promise.race([promise, timeout]);
  } finally {
    clearTimeout(timer!);
  }
};

const result = await withTimeout(
  db.select().from(users),
  5000
);
```
503	.claude/skills/drizzle-orm/references/vs-prisma.md	Normal file
@ -0,0 +1,503 @@
# Drizzle vs Prisma Comparison

Feature comparison, migration guide, and decision framework for choosing between Drizzle and Prisma.

## Quick Comparison

| Feature | Drizzle ORM | Prisma |
|---------|-------------|--------|
| **Type Safety** | ✅ Compile-time inference | ✅ Generated types |
| **Bundle Size** | **~35KB** | ~230KB |
| **Runtime** | **Zero dependencies** | Heavy runtime |
| **Cold Start** | **~10ms** | ~250ms |
| **Query Performance** | **Faster (native SQL)** | Slower (translation layer) |
| **Learning Curve** | Moderate (SQL knowledge helpful) | Easier (abstracted) |
| **Migrations** | SQL-based | Declarative schema |
| **Raw SQL** | **First-class support** | Limited support |
| **Edge Runtime** | **Fully compatible** | Limited support |
| **Ecosystem** | Growing | Mature |
| **Studio (GUI)** | ✅ Drizzle Studio | ✅ Prisma Studio |

## When to Choose Drizzle

### ✅ Choose Drizzle if you need:

1. **Performance-critical applications**
   - Microservices with tight latency requirements
   - High-throughput APIs (>10K req/s)
   - Serverless/edge functions with cold start concerns

2. **Minimal bundle size**
   - Client-side database (SQLite in browser)
   - Edge runtime deployments
   - Mobile applications with bundle constraints

3. **SQL control**
   - Complex queries with CTEs, window functions
   - Raw SQL for specific database features
   - Database-specific optimizations

4. **Type inference over generation**
   - No build step for type generation
   - Immediate TypeScript feedback
   - Schema changes reflected instantly

### Example: Edge Function with Drizzle

```typescript
import { neon } from '@neondatabase/serverless';
import { drizzle } from 'drizzle-orm/neon-http';

export const runtime = 'edge';

export async function GET() {
  const sql = neon(process.env.DATABASE_URL!);
  const db = drizzle(sql); // ~35KB bundle, <10ms cold start

  const rows = await db.select().from(users);
  return Response.json(rows);
}
```
## When to Choose Prisma

### ✅ Choose Prisma if you need:

1. **Rapid prototyping**
   - Quick schema iterations
   - Automatic migrations
   - Less SQL knowledge required

2. **Team with varied SQL experience**
   - Abstracted query interface
   - Declarative migrations
   - Generated documentation

3. **Mature ecosystem**
   - Extensive community resources
   - Third-party integrations (Nexus, tRPC)
   - Enterprise support options

4. **Rich developer experience**
   - Prisma Studio (GUI)
   - VS Code extension
   - Comprehensive documentation

### Example: Next.js App with Prisma

```prisma
// schema.prisma
model User {
  id    Int    @id @default(autoincrement())
  email String @unique
  posts Post[]
}

model Post {
  id       Int    @id @default(autoincrement())
  title    String
  authorId Int
  author   User   @relation(fields: [authorId], references: [id])
}
```

```typescript
// app/api/users/route.ts
import { prisma } from '@/lib/prisma';

export async function GET() {
  const users = await prisma.user.findMany({
    include: { posts: true },
  });
  return Response.json(users);
}
```
## Feature Comparison

### Schema Definition

**Drizzle** (TypeScript-first):
```typescript
import { pgTable, serial, text, integer } from 'drizzle-orm/pg-core';
import { relations } from 'drizzle-orm';

export const users = pgTable('users', {
  id: serial('id').primaryKey(),
  email: text('email').notNull().unique(),
});

export const posts = pgTable('posts', {
  id: serial('id').primaryKey(),
  title: text('title').notNull(),
  authorId: integer('author_id').notNull().references(() => users.id),
});

export const usersRelations = relations(users, ({ many }) => ({
  posts: many(posts),
}));
```

**Prisma** (Schema DSL):
```prisma
model User {
  id    Int    @id @default(autoincrement())
  email String @unique
  posts Post[]
}

model Post {
  id       Int    @id @default(autoincrement())
  title    String
  authorId Int
  author   User   @relation(fields: [authorId], references: [id])
}
```

### Querying

**Drizzle** (SQL-like):
```typescript
import { eq, like, and, gt } from 'drizzle-orm';

// Simple query
const user = await db.select().from(users).where(eq(users.id, 1));

// Complex filtering
const results = await db.select()
  .from(users)
  .where(
    and(
      like(users.email, '%@example.com'),
      gt(users.createdAt, new Date('2024-01-01'))
    )
  );

// Joins
const usersWithPosts = await db
  .select({
    user: users,
    post: posts,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId));
```

**Prisma** (Fluent API):
```typescript
// Simple query
const user = await prisma.user.findUnique({ where: { id: 1 } });

// Complex filtering
const results = await prisma.user.findMany({
  where: {
    email: { endsWith: '@example.com' },
    createdAt: { gt: new Date('2024-01-01') },
  },
});

// Relations
const usersWithPosts = await prisma.user.findMany({
  include: { posts: true },
});
```
### Migrations

**Drizzle** (SQL-based):
```bash
# Generate migration
npx drizzle-kit generate

# Output: drizzle/0000_migration.sql
# CREATE TABLE "users" (
#   "id" serial PRIMARY KEY,
#   "email" text NOT NULL UNIQUE
# );

# Apply migration
npx drizzle-kit migrate
```

**Prisma** (Declarative):
```bash
# Generate and apply migration
npx prisma migrate dev --name add_users

# Prisma compares schema.prisma to database
# Generates SQL automatically
# Applies migration
```

### Type Generation

**Drizzle** (Inferred):
```typescript
// Types are inferred at compile time
type User = typeof users.$inferSelect;
type NewUser = typeof users.$inferInsert;

// Immediate feedback in IDE (select() always returns an array)
const result: User[] = await db.select().from(users);
```

**Prisma** (Generated):
```typescript
// Types generated after schema change
// Run: npx prisma generate

import { User, Post } from '@prisma/client';

// findUnique may return null when no row matches
const user: User | null = await prisma.user.findUnique({ where: { id: 1 } });
```
### Raw SQL

**Drizzle** (First-class):
```typescript
import { sql } from 'drizzle-orm';

// Tagged template with type safety
const result = await db.execute(
  sql`SELECT * FROM ${users} WHERE ${users.email} = ${email}`
);

// Mix ORM and raw SQL
const customQuery = await db
  .select({
    user: users,
    postCount: sql<number>`COUNT(${posts.id})`,
  })
  .from(users)
  .leftJoin(posts, eq(users.id, posts.authorId))
  .groupBy(users.id);
```

**Prisma** (Limited):
```typescript
// Raw query (loses type safety)
const result = await prisma.$queryRaw`
  SELECT * FROM users WHERE email = ${email}
`;

// Typed raw query (manual type annotation)
const typedUsers = await prisma.$queryRaw<User[]>`
  SELECT * FROM users
`;
```

## Performance Benchmarks

### Query Execution Time (1000 queries)

| Operation | Drizzle | Prisma | Difference |
|-----------|---------|--------|------------|
| findUnique | 1.2s | 3.1s | **2.6x faster** |
| findMany (10 rows) | 1.5s | 3.8s | **2.5x faster** |
| findMany (100 rows) | 2.1s | 5.2s | **2.5x faster** |
| create | 1.8s | 4.1s | **2.3x faster** |
| update | 1.7s | 3.9s | **2.3x faster** |
### Bundle Size Impact

```bash
# Next.js production build

# With Drizzle
├─ Client (First Load JS)
│  └─ pages/index.js: 85 KB (+35KB Drizzle)

# With Prisma
├─ Client (First Load JS)
│  └─ pages/index.js: 280 KB (+230KB Prisma)
```

### Cold Start Times (AWS Lambda)

| Database | Drizzle | Prisma |
|----------|---------|--------|
| PostgreSQL | ~50ms | ~300ms |
| MySQL | ~45ms | ~280ms |
| SQLite | ~10ms | ~150ms |
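Numbers like the ones above vary with driver, network, and hardware, so they are best reproduced on your own stack. A minimal timing harness in the spirit of the "1000 queries" table; the iteration count and the query passed in the commented usage are placeholders:

```typescript
// Run an async operation `iterations` times and report the total wall time
async function bench(
  name: string,
  iterations: number,
  fn: () => Promise<unknown>
): Promise<number> {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    await fn();
  }
  const elapsed = performance.now() - start;
  console.log(`${name}: ${elapsed.toFixed(1)}ms for ${iterations} iterations`);
  return elapsed;
}

// Hypothetical usage:
// await bench('drizzle findUnique', 1000, () =>
//   db.select().from(users).where(eq(users.id, 1)));
```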
## Migration from Prisma to Drizzle

### Step 1: Install Drizzle

```bash
npm install drizzle-orm
npm install -D drizzle-kit

# Keep Prisma temporarily
# npm uninstall prisma @prisma/client
```

### Step 2: Introspect Existing Database

```typescript
// drizzle.config.ts
import type { Config } from 'drizzle-kit';

export default {
  schema: './db/schema.ts',
  out: './drizzle',
  dialect: 'postgresql',
  dbCredentials: {
    url: process.env.DATABASE_URL!,
  },
} satisfies Config;
```

```bash
# Generate Drizzle schema from existing database
npx drizzle-kit introspect
```
### Step 3: Convert Queries

**Prisma**:
```typescript
// Before (Prisma)
const users = await prisma.user.findMany({
  where: { email: { contains: 'example.com' } },
  include: { posts: true },
  orderBy: { createdAt: 'desc' },
  take: 10,
});
```

**Drizzle**:
```typescript
// After (Drizzle) - name the result something other than the `users` table
import { like, desc } from 'drizzle-orm';

const matched = await db.query.users.findMany({
  where: like(users.email, '%example.com%'),
  with: { posts: true },
  orderBy: [desc(users.createdAt)],
  limit: 10,
});

// Or SQL-style
const matchedSql = await db
  .select()
  .from(users)
  .where(like(users.email, '%example.com%'))
  .orderBy(desc(users.createdAt))
  .limit(10);
```
### Step 4: Conversion Patterns

```typescript
// Prisma → Drizzle mapping

// findUnique
await prisma.user.findUnique({ where: { id: 1 } });
await db.select().from(users).where(eq(users.id, 1));

// findMany with filters
await prisma.user.findMany({ where: { role: 'admin' } });
await db.select().from(users).where(eq(users.role, 'admin'));

// create
await prisma.user.create({ data: { email: 'user@example.com' } });
await db.insert(users).values({ email: 'user@example.com' }).returning();

// update
await prisma.user.update({ where: { id: 1 }, data: { name: 'John' } });
await db.update(users).set({ name: 'John' }).where(eq(users.id, 1));

// delete
await prisma.user.delete({ where: { id: 1 } });
await db.delete(users).where(eq(users.id, 1));

// count
await prisma.user.count();
await db.select({ count: count() }).from(users);

// aggregate
await prisma.post.aggregate({ _avg: { views: true } });
await db.select({ avg: avg(posts.views) }).from(posts);
```

### Step 5: Test & Remove Prisma

```bash
# Run tests with Drizzle
npm test

# Remove Prisma when confident
npm uninstall prisma @prisma/client
rm -rf prisma/
```
## Decision Matrix

| Requirement | Drizzle | Prisma |
|-------------|---------|--------|
| Need minimal bundle size | ✅ | ❌ |
| Edge runtime deployment | ✅ | ⚠️ |
| Team unfamiliar with SQL | ❌ | ✅ |
| Complex raw SQL queries | ✅ | ❌ |
| Rapid prototyping | ⚠️ | ✅ |
| Type-safe migrations | ✅ | ✅ |
| Performance critical | ✅ | ❌ |
| Mature ecosystem | ⚠️ | ✅ |
| First-class TypeScript | ✅ | ✅ |
| Zero dependencies | ✅ | ❌ |

## Hybrid Approach

You can use both in the same project:

```typescript
// Use Drizzle for performance-critical paths
import { db as drizzleDb } from './lib/drizzle';

export async function GET() {
  const rows = await drizzleDb.select().from(users);
  return Response.json(rows);
}

// Use Prisma for admin dashboards (less performance-critical)
import { prisma } from './lib/prisma';

export async function getStaticProps() {
  // Prisma aggregates operate on scalar fields; use _count for relations
  const stats = await prisma.user.aggregate({
    _count: true,
  });
  return { props: { stats } };
}
```
## Community & Resources

### Drizzle
- Docs: [orm.drizzle.team](https://orm.drizzle.team)
- Discord: [drizzle.team/discord](https://drizzle.team/discord)
- GitHub: [drizzle-team/drizzle-orm](https://github.com/drizzle-team/drizzle-orm)

### Prisma
- Docs: [prisma.io/docs](https://prisma.io/docs)
- Discord: [pris.ly/discord](https://pris.ly/discord)
- GitHub: [prisma/prisma](https://github.com/prisma/prisma)

## Final Recommendation

**Choose Drizzle for:**
- Greenfield projects prioritizing performance
- Edge/serverless applications
- Teams comfortable with SQL
- Minimal bundle size requirements

**Choose Prisma for:**
- Established teams with Prisma experience
- Rapid MVP development
- Teams new to databases
- Reliance on the Prisma ecosystem (Nexus, etc.)

**Consider migration when:**
- Performance becomes a bottleneck
- Bundle size impacts user experience
- Edge runtime deployment is needed
- Team SQL proficiency increases
75	.claude/skills/fastify-best-practices/SKILL.md	Normal file
@ -0,0 +1,75 @@
---
name: fastify-best-practices
description: "Guides development of Fastify Node.js backend servers and REST APIs using TypeScript or JavaScript. Use when building, configuring, or debugging a Fastify application — including defining routes, implementing plugins, setting up JSON Schema validation, handling errors, optimising performance, managing authentication, configuring CORS and security headers, integrating databases, working with WebSockets, and deploying to production. Covers the full Fastify request lifecycle (hooks, serialization, logging with Pino) and TypeScript integration via strip types. Trigger terms: Fastify, Node.js server, REST API, API routes, backend framework, fastify.config, server.ts, app.ts."
metadata:
  tags: fastify, nodejs, typescript, backend, api, server, http
---

## When to use

Use this skill when you need to:
- Develop backend applications using Fastify
- Implement Fastify plugins and route handlers
- Get guidance on Fastify architecture and patterns
- Use TypeScript with Fastify (strip types)
- Implement testing with Fastify's inject method
- Configure validation, serialization, and error handling

## Quick Start

A minimal, runnable Fastify server to get started immediately:

```ts
import Fastify from 'fastify'

const app = Fastify({ logger: true })

app.get('/health', async (request, reply) => {
  return { status: 'ok' }
})

const start = async () => {
  await app.listen({ port: 3000, host: '0.0.0.0' })
}
start()
```

## Recommended Reading Order for Common Scenarios

- **New to Fastify?** Start with `plugins.md` → `routes.md` → `schemas.md`
- **Adding authentication:** `plugins.md` → `hooks.md` → `authentication.md`
- **Improving performance:** `schemas.md` → `serialization.md` → `performance.md`
- **Setting up testing:** `routes.md` → `testing.md`
- **Going to production:** `logging.md` → `configuration.md` → `deployment.md`

## How to use

Read individual rule files for detailed explanations and code examples:

- [rules/plugins.md](rules/plugins.md) - Plugin development and encapsulation
- [rules/routes.md](rules/routes.md) - Route organization and handlers
- [rules/schemas.md](rules/schemas.md) - JSON Schema validation
- [rules/error-handling.md](rules/error-handling.md) - Error handling patterns
- [rules/hooks.md](rules/hooks.md) - Hooks and request lifecycle
- [rules/authentication.md](rules/authentication.md) - Authentication and authorization
- [rules/testing.md](rules/testing.md) - Testing with inject()
- [rules/performance.md](rules/performance.md) - Performance optimization
- [rules/logging.md](rules/logging.md) - Logging with Pino
- [rules/typescript.md](rules/typescript.md) - TypeScript integration
- [rules/decorators.md](rules/decorators.md) - Decorators and extensions
- [rules/content-type.md](rules/content-type.md) - Content type parsing
- [rules/serialization.md](rules/serialization.md) - Response serialization
- [rules/cors-security.md](rules/cors-security.md) - CORS and security headers
- [rules/websockets.md](rules/websockets.md) - WebSocket support
- [rules/database.md](rules/database.md) - Database integration patterns
- [rules/configuration.md](rules/configuration.md) - Application configuration
- [rules/deployment.md](rules/deployment.md) - Production deployment
- [rules/http-proxy.md](rules/http-proxy.md) - HTTP proxying and reply.from()

## Core Principles

- **Encapsulation**: Fastify's plugin system provides automatic encapsulation
- **Schema-first**: Define schemas for validation and serialization
- **Performance**: Fastify is optimized for speed; use its features correctly
- **Async/await**: All handlers and hooks support async functions
- **Minimal dependencies**: Prefer Fastify's built-in features and official plugins
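The "schema-first" principle is what makes Fastify's serialization fast: when the response shape is declared up front, Fastify precompiles a serializer (via fast-json-stringify) instead of calling generic `JSON.stringify` on an unknown object. This is a toy, hand-written sketch of that idea only, not Fastify's real generated code:

```typescript
// Toy "compiled" serializer for a fixed schema: { status: string }.
// fast-json-stringify generates code like this from a JSON Schema;
// knowing the field list ahead of time avoids runtime shape discovery.
interface HealthPayload { status: string }

function serializeHealth(payload: HealthPayload): string {
  // Only the declared field is emitted; extra properties are dropped,
  // which is also how Fastify's response schemas behave
  return `{"status":${JSON.stringify(payload.status)}}`;
}
```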
521	.claude/skills/fastify-best-practices/rules/authentication.md	Normal file
@ -0,0 +1,521 @@
---
name: authentication
description: Authentication and authorization patterns in Fastify
metadata:
  tags: auth, jwt, session, oauth, security, authorization
---

# Authentication and Authorization

## JWT Authentication with @fastify/jwt

Use `@fastify/jwt` for JSON Web Token authentication:

```typescript
import Fastify from 'fastify';
import fastifyJwt from '@fastify/jwt';

const app = Fastify();

app.register(fastifyJwt, {
  secret: process.env.JWT_SECRET!,
  sign: {
    expiresIn: '1h',
  },
});

// Decorate request with authentication method
app.decorate('authenticate', async function (request, reply) {
  try {
    await request.jwtVerify();
  } catch (err) {
    reply.code(401).send({ error: 'Unauthorized' });
  }
});

// Login route
app.post('/login', {
  schema: {
    body: {
      type: 'object',
      properties: {
        email: { type: 'string', format: 'email' },
        password: { type: 'string' },
      },
      required: ['email', 'password'],
    },
  },
}, async (request, reply) => {
  const { email, password } = request.body;
  const user = await validateCredentials(email, password);

  if (!user) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  const token = app.jwt.sign({
    id: user.id,
    email: user.email,
    role: user.role,
  });

  return { token };
});

// Protected route
app.get('/profile', {
  onRequest: [app.authenticate],
}, async (request) => {
  return { user: request.user };
});
```
## Refresh Tokens

Implement refresh token rotation:

```typescript
import fastifyJwt from '@fastify/jwt';
import { randomBytes } from 'node:crypto';

app.register(fastifyJwt, {
  secret: process.env.JWT_SECRET,
  sign: {
    expiresIn: '15m', // Short-lived access tokens
  },
});

// Store refresh tokens (use Redis in production)
const refreshTokens = new Map<string, { userId: string; expires: number }>();

app.post('/auth/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await validateCredentials(email, password);

  if (!user) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  const accessToken = app.jwt.sign({ id: user.id, role: user.role });
  const refreshToken = randomBytes(32).toString('hex');

  refreshTokens.set(refreshToken, {
    userId: user.id,
    expires: Date.now() + 7 * 24 * 60 * 60 * 1000, // 7 days
  });

  return { accessToken, refreshToken };
});

app.post('/auth/refresh', async (request, reply) => {
  const { refreshToken } = request.body;
  const stored = refreshTokens.get(refreshToken);

  if (!stored || stored.expires < Date.now()) {
    refreshTokens.delete(refreshToken);
    return reply.code(401).send({ error: 'Invalid refresh token' });
  }

  // Delete old token (rotation)
  refreshTokens.delete(refreshToken);

  const user = await db.users.findById(stored.userId);
  const accessToken = app.jwt.sign({ id: user.id, role: user.role });
  const newRefreshToken = randomBytes(32).toString('hex');

  refreshTokens.set(newRefreshToken, {
    userId: user.id,
    expires: Date.now() + 7 * 24 * 60 * 60 * 1000,
  });

  return { accessToken, refreshToken: newRefreshToken };
});

app.post('/auth/logout', async (request, reply) => {
  const { refreshToken } = request.body;
  refreshTokens.delete(refreshToken);
  return { success: true };
});
```

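The rotation contract shown above (a refresh token is single-use, and an expired token is rejected) can be exercised outside Fastify. This is a minimal sketch with an in-memory store and an injectable clock; the `RotatingTokenStore` name is illustrative, not part of any library:

```typescript
import { randomBytes } from 'node:crypto';

// Minimal refresh-token store demonstrating rotation and expiry.
class RotatingTokenStore {
  private tokens = new Map<string, { userId: string; expires: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  issue(userId: string): string {
    const token = randomBytes(32).toString('hex');
    this.tokens.set(token, { userId, expires: this.now() + this.ttlMs });
    return token;
  }

  // Returns a fresh token and invalidates the old one; null if invalid/expired.
  rotate(token: string): { userId: string; token: string } | null {
    const stored = this.tokens.get(token);
    this.tokens.delete(token); // the old token is consumed either way
    if (!stored || stored.expires < this.now()) return null;
    return { userId: stored.userId, token: this.issue(stored.userId) };
  }
}

let clock = 0;
const store = new RotatingTokenStore(1000, () => clock);
const t1 = store.issue('user-1');
const r1 = store.rotate(t1);
console.log(r1?.userId);              // 'user-1'
console.log(store.rotate(t1));        // null — old token was consumed
clock = 2000;
console.log(store.rotate(r1!.token)); // null — expired before use
```

The injectable clock is only there to make expiry testable; in the route handlers above, `Date.now()` plays that role.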
## Role-Based Access Control

Implement RBAC with decorators:

```typescript
type Role = 'admin' | 'user' | 'moderator';

// Create authorization decorator
app.decorate('authorize', function (...allowedRoles: Role[]) {
  return async (request, reply) => {
    await request.jwtVerify();

    const userRole = request.user.role as Role;
    if (!allowedRoles.includes(userRole)) {
      return reply.code(403).send({
        error: 'Forbidden',
        message: `Role '${userRole}' is not authorized for this resource`,
      });
    }
  };
});

// Admin only route
app.get('/admin/users', {
  onRequest: [app.authorize('admin')],
}, async (request) => {
  return db.users.findAll();
});

// Admin or moderator
app.delete('/posts/:id', {
  onRequest: [app.authorize('admin', 'moderator')],
}, async (request) => {
  await db.posts.delete(request.params.id);
  return { deleted: true };
});
```

## Permission-Based Authorization

Fine-grained permission checks:

```typescript
interface Permission {
  resource: string;
  action: 'create' | 'read' | 'update' | 'delete';
}

const rolePermissions: Record<string, Permission[]> = {
  admin: [
    { resource: '*', action: 'create' },
    { resource: '*', action: 'read' },
    { resource: '*', action: 'update' },
    { resource: '*', action: 'delete' },
  ],
  user: [
    { resource: 'posts', action: 'create' },
    { resource: 'posts', action: 'read' },
    { resource: 'comments', action: 'create' },
    { resource: 'comments', action: 'read' },
  ],
};

function hasPermission(role: string, resource: string, action: string): boolean {
  const permissions = rolePermissions[role] || [];
  return permissions.some(
    (p) =>
      (p.resource === '*' || p.resource === resource) &&
      p.action === action
  );
}

app.decorate('checkPermission', function (resource: string, action: string) {
  return async (request, reply) => {
    await request.jwtVerify();

    if (!hasPermission(request.user.role, resource, action)) {
      return reply.code(403).send({
        error: 'Forbidden',
        message: `Not allowed to ${action} ${resource}`,
      });
    }
  };
});

// Usage
app.post('/posts', {
  onRequest: [app.checkPermission('posts', 'create')],
}, createPostHandler);

app.delete('/posts/:id', {
  onRequest: [app.checkPermission('posts', 'delete')],
}, deletePostHandler);
```

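Because `hasPermission` is a pure function over the permission table, its wildcard semantics are easy to check in isolation. This sketch restates the function outside Fastify with a trimmed-down table:

```typescript
interface Permission {
  resource: string;
  action: 'create' | 'read' | 'update' | 'delete';
}

const rolePermissions: Record<string, Permission[]> = {
  admin: [{ resource: '*', action: 'delete' }],
  user: [
    { resource: 'posts', action: 'create' },
    { resource: 'posts', action: 'read' },
  ],
};

function hasPermission(role: string, resource: string, action: string): boolean {
  const permissions = rolePermissions[role] || [];
  return permissions.some(
    (p) => (p.resource === '*' || p.resource === resource) && p.action === action,
  );
}

console.log(hasPermission('admin', 'posts', 'delete')); // true — wildcard resource
console.log(hasPermission('user', 'posts', 'read'));    // true — exact match
console.log(hasPermission('user', 'posts', 'delete'));  // false — action not granted
console.log(hasPermission('ghost', 'posts', 'read'));   // false — unknown role
```

Note that the wildcard applies only to `resource`; each action must still be granted explicitly.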
## API Key / Bearer Token Authentication

Use `@fastify/bearer-auth` for API key and bearer token authentication:

```typescript
import bearerAuth from '@fastify/bearer-auth';

const validKeys = new Set([process.env.API_KEY]);

app.register(bearerAuth, {
  keys: validKeys,
  errorResponse: (err) => ({
    error: 'Unauthorized',
    message: 'Invalid API key',
  }),
});

// All routes are now protected
app.get('/api/data', async (request) => {
  return { data: [] };
});
```

For database-backed API keys with custom validation:

```typescript
import bearerAuth from '@fastify/bearer-auth';

app.register(bearerAuth, {
  auth: async (key, request) => {
    const apiKey = await db.apiKeys.findByKey(key);

    if (!apiKey || !apiKey.active) {
      return false;
    }

    // Track usage (fire and forget)
    db.apiKeys.recordUsage(apiKey.id, {
      ip: request.ip,
      timestamp: new Date(),
    });

    request.apiKey = apiKey;
    return true;
  },
  errorResponse: (err) => ({
    error: 'Unauthorized',
    message: 'Invalid API key',
  }),
});
```

## OAuth 2.0 Integration

Integrate with OAuth providers using `@fastify/oauth2`:

```typescript
import fastifyOauth2 from '@fastify/oauth2';

app.register(fastifyOauth2, {
  name: 'googleOAuth2',
  scope: ['profile', 'email'],
  credentials: {
    client: {
      id: process.env.GOOGLE_CLIENT_ID,
      secret: process.env.GOOGLE_CLIENT_SECRET,
    },
  },
  startRedirectPath: '/auth/google',
  callbackUri: 'http://localhost:3000/auth/google/callback',
  discovery: {
    issuer: 'https://accounts.google.com',
  },
});

app.get('/auth/google/callback', async (request, reply) => {
  const { token } = await app.googleOAuth2.getAccessTokenFromAuthorizationCodeFlow(request);

  // Fetch user info from Google
  const userInfo = await fetch('https://www.googleapis.com/oauth2/v2/userinfo', {
    headers: { Authorization: `Bearer ${token.access_token}` },
  }).then((r) => r.json());

  // Find or create user
  let user = await db.users.findByEmail(userInfo.email);
  if (!user) {
    user = await db.users.create({
      email: userInfo.email,
      name: userInfo.name,
      provider: 'google',
      providerId: userInfo.id,
    });
  }

  // Generate JWT
  const jwt = app.jwt.sign({ id: user.id, role: user.role });

  // Redirect to frontend with token
  return reply.redirect(`/auth/success?token=${jwt}`);
});
```

## Session-Based Authentication

Use `@fastify/session` for session management:

```typescript
import fastifyCookie from '@fastify/cookie';
import fastifySession from '@fastify/session';
import RedisStore from 'connect-redis';
import { createClient } from 'redis';

const redisClient = createClient({ url: process.env.REDIS_URL });
await redisClient.connect();

app.register(fastifyCookie);
app.register(fastifySession, {
  secret: process.env.SESSION_SECRET,
  store: new RedisStore({ client: redisClient }),
  cookie: {
    secure: process.env.NODE_ENV === 'production',
    httpOnly: true,
    maxAge: 24 * 60 * 60 * 1000, // 1 day
  },
});

app.post('/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await validateCredentials(email, password);

  if (!user) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  request.session.userId = user.id;
  request.session.role = user.role;

  return { success: true };
});

app.decorate('requireSession', async function (request, reply) {
  if (!request.session.userId) {
    return reply.code(401).send({ error: 'Not authenticated' });
  }
});

app.get('/profile', {
  onRequest: [app.requireSession],
}, async (request) => {
  const user = await db.users.findById(request.session.userId);
  return { user };
});

app.post('/logout', async (request, reply) => {
  await request.session.destroy();
  return { success: true };
});
```

## Resource-Based Authorization

Check ownership of resources:

```typescript
app.decorate('checkOwnership', function (
  getResourceOwnerId: (request) => Promise<string | undefined>,
) {
  return async (request, reply) => {
    const ownerId = await getResourceOwnerId(request);

    // A missing resource yields undefined, which also fails the ownership check
    if (ownerId !== request.user.id && request.user.role !== 'admin') {
      return reply.code(403).send({
        error: 'Forbidden',
        message: 'You do not own this resource',
      });
    }
  };
});

// Check post ownership
app.put('/posts/:id', {
  onRequest: [
    app.authenticate,
    app.checkOwnership(async (request) => {
      const post = await db.posts.findById(request.params.id);
      return post?.authorId;
    }),
  ],
}, updatePostHandler);

// Alternative: inline check
app.put('/posts/:id', {
  onRequest: [app.authenticate],
}, async (request, reply) => {
  const post = await db.posts.findById(request.params.id);

  if (!post) {
    return reply.code(404).send({ error: 'Post not found' });
  }

  if (post.authorId !== request.user.id && request.user.role !== 'admin') {
    return reply.code(403).send({ error: 'Forbidden' });
  }

  return db.posts.update(post.id, request.body);
});
```

## Password Hashing

Use secure password hashing with argon2:

```typescript
import { hash, verify } from '@node-rs/argon2';

async function hashPassword(password: string): Promise<string> {
  return hash(password, {
    memoryCost: 65536,
    timeCost: 3,
    parallelism: 4,
  });
}

async function verifyPassword(hashed: string, password: string): Promise<boolean> {
  return verify(hashed, password);
}

app.post('/register', async (request, reply) => {
  const { email, password } = request.body;

  const hashedPassword = await hashPassword(password);
  const user = await db.users.create({
    email,
    password: hashedPassword,
  });

  reply.code(201);
  return { id: user.id, email: user.email };
});

app.post('/login', async (request, reply) => {
  const { email, password } = request.body;
  const user = await db.users.findByEmail(email);

  if (!user || !(await verifyPassword(user.password, password))) {
    return reply.code(401).send({ error: 'Invalid credentials' });
  }

  const token = app.jwt.sign({ id: user.id, role: user.role });
  return { token };
});
```

## Rate Limiting for Auth Endpoints

Protect auth endpoints from brute force. **IMPORTANT: For production security, you MUST configure rate limiting with a Redis backend.** In-memory rate limiting is not safe for distributed deployments and can be bypassed.

```typescript
import fastifyRateLimit from '@fastify/rate-limit';
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

// Global rate limit with Redis backend
app.register(fastifyRateLimit, {
  max: 100,
  timeWindow: '1 minute',
  redis, // REQUIRED for production - ensures rate limiting works across all instances
});

// Stricter limit for auth endpoints
app.register(async function authRoutes(fastify) {
  await fastify.register(fastifyRateLimit, {
    max: 5,
    timeWindow: '1 minute',
    redis, // REQUIRED for production
    keyGenerator: (request) => {
      // Rate limit by IP + email combination
      const email = request.body?.email || '';
      return `${request.ip}:${email}`;
    },
  });

  fastify.post('/login', loginHandler);
  fastify.post('/register', registerHandler);
  fastify.post('/forgot-password', forgotPasswordHandler);
}, { prefix: '/auth' });
```

.claude/skills/fastify-best-practices/rules/configuration.md — 217 lines (new file)

---
name: configuration
description: Application configuration in Fastify using env-schema
metadata:
  tags: configuration, environment, env, settings, env-schema
---

# Application Configuration

## Use env-schema for Configuration

**Always use `env-schema` for configuration validation.** It provides JSON Schema validation for environment variables with sensible defaults.

```typescript
import Fastify from 'fastify';
import envSchema from 'env-schema';
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  PORT: Type.Number({ default: 3000 }),
  HOST: Type.String({ default: '0.0.0.0' }),
  DATABASE_URL: Type.String(),
  JWT_SECRET: Type.String({ minLength: 32 }),
  LOG_LEVEL: Type.Union([
    Type.Literal('trace'),
    Type.Literal('debug'),
    Type.Literal('info'),
    Type.Literal('warn'),
    Type.Literal('error'),
    Type.Literal('fatal'),
  ], { default: 'info' }),
});

type Config = Static<typeof schema>;

const config = envSchema<Config>({
  schema,
  dotenv: true, // Load from .env file
});

const app = Fastify({
  logger: { level: config.LOG_LEVEL },
});

app.decorate('config', config);

declare module 'fastify' {
  interface FastifyInstance {
    config: Config;
  }
}

await app.listen({ port: config.PORT, host: config.HOST });
```

## Configuration as Plugin

Encapsulate configuration in a plugin for reuse:

```typescript
import fp from 'fastify-plugin';
import envSchema from 'env-schema';
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  PORT: Type.Number({ default: 3000 }),
  HOST: Type.String({ default: '0.0.0.0' }),
  DATABASE_URL: Type.String(),
  JWT_SECRET: Type.String({ minLength: 32 }),
  LOG_LEVEL: Type.String({ default: 'info' }),
});

type Config = Static<typeof schema>;

declare module 'fastify' {
  interface FastifyInstance {
    config: Config;
  }
}

export default fp(async function configPlugin(fastify) {
  const config = envSchema<Config>({
    schema,
    dotenv: true,
  });

  fastify.decorate('config', config);
}, {
  name: 'config',
});
```

## Secrets Management

Handle secrets securely:

```typescript
// Never log secrets
const app = Fastify({
  logger: {
    level: config.LOG_LEVEL,
    redact: ['req.headers.authorization', '*.password', '*.secret', '*.apiKey'],
  },
});

// For production, use secret managers (AWS Secrets Manager, Vault, etc.)
// Pass secrets through environment variables - never commit them
```

## Feature Flags

Implement feature flags via environment variables:

```typescript
import { Type, type Static } from '@sinclair/typebox';

const schema = Type.Object({
  // ... other config
  FEATURE_NEW_DASHBOARD: Type.Boolean({ default: false }),
  FEATURE_BETA_API: Type.Boolean({ default: false }),
});

type Config = Static<typeof schema>;

const config = envSchema<Config>({ schema, dotenv: true });

// Use in routes
app.get('/dashboard', async (request) => {
  if (app.config.FEATURE_NEW_DASHBOARD) {
    return { version: 'v2', data: await getNewDashboardData() };
  }
  return { version: 'v1', data: await getOldDashboardData() };
});
```

## Anti-Patterns to Avoid

### NEVER use configuration files

```typescript
// ❌ NEVER DO THIS - configuration files are an antipattern
import config from './config/production.json';

// ❌ NEVER DO THIS - per-environment config files
const env = process.env.NODE_ENV || 'development';
const config = await import(`./config/${env}.js`);
```

Configuration files lead to:
- Security risks (secrets in files)
- Deployment complexity
- Environment drift
- Difficult secret rotation

### NEVER use per-environment configuration

```typescript
// ❌ NEVER DO THIS
const configs = {
  development: { logLevel: 'debug' },
  production: { logLevel: 'info' },
  test: { logLevel: 'silent' },
};
const config = configs[process.env.NODE_ENV];
```

Instead, use a single configuration source (environment variables) with sensible defaults. The environment controls the values, not conditional code.

### Use specific environment variables, not NODE_ENV

```typescript
// ❌ AVOID checking NODE_ENV
if (process.env.NODE_ENV === 'production') {
  // do something
}

// ✅ BETTER - use explicit feature flags or configuration
if (app.config.ENABLE_DETAILED_LOGGING) {
  // do something
}
```

## Dynamic Configuration

For configuration that needs to change without restart, fetch from an external service:

```typescript
interface DynamicConfig {
  rateLimit: number;
  maintenanceMode: boolean;
}

let dynamicConfig: DynamicConfig = {
  rateLimit: 100,
  maintenanceMode: false,
};

async function refreshConfig() {
  try {
    const newConfig = await fetchConfigFromService();
    dynamicConfig = newConfig;
    app.log.info('Configuration refreshed');
  } catch (error) {
    app.log.error({ err: error }, 'Failed to refresh configuration');
  }
}

// Refresh periodically
setInterval(refreshConfig, 60000);

// Use in hooks
app.addHook('onRequest', async (request, reply) => {
  if (dynamicConfig.maintenanceMode && !request.url.startsWith('/health')) {
    reply.code(503).send({ error: 'Service under maintenance' });
  }
});
```

.claude/skills/fastify-best-practices/rules/content-type.md — 387 lines (new file)

---
name: content-type
description: Content type parsing in Fastify
metadata:
  tags: content-type, parsing, body, multipart, json
---

# Content Type Parsing

## Default Content Type Parsers

Fastify includes parsers for common content types:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Built-in parsers:
// - application/json
// - text/plain

app.post('/json', async (request) => {
  // request.body is parsed JSON object
  return { received: request.body };
});

app.post('/text', async (request) => {
  // request.body is string for text/plain
  return { text: request.body };
});
```

## Custom Content Type Parsers

Add parsers for additional content types:

```typescript
// Parse application/x-www-form-urlencoded (callback style)
app.addContentTypeParser(
  'application/x-www-form-urlencoded',
  { parseAs: 'string' },
  (request, body, done) => {
    const parsed = new URLSearchParams(body);
    done(null, Object.fromEntries(parsed));
  },
);

// Equivalent async parser — register one style or the other; adding a second
// parser for the same content type throws an error
app.addContentTypeParser(
  'application/x-www-form-urlencoded',
  { parseAs: 'string' },
  async (request, body) => {
    const parsed = new URLSearchParams(body);
    return Object.fromEntries(parsed);
  },
);
```

## XML Parsing

Parse XML content:

```typescript
import { XMLParser } from 'fast-xml-parser';

const xmlParser = new XMLParser({
  ignoreAttributes: false,
  attributeNamePrefix: '@_',
});

app.addContentTypeParser(
  'application/xml',
  { parseAs: 'string' },
  async (request, body) => {
    return xmlParser.parse(body);
  },
);

app.addContentTypeParser(
  'text/xml',
  { parseAs: 'string' },
  async (request, body) => {
    return xmlParser.parse(body);
  },
);

app.post('/xml', async (request) => {
  // request.body is parsed XML as JavaScript object
  return { data: request.body };
});
```

## Multipart Form Data

Use `@fastify/multipart` for file uploads. **Configure these critical options:**

```typescript
import fastifyMultipart from '@fastify/multipart';

app.register(fastifyMultipart, {
  // CRITICAL: Always set explicit limits
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 1024 * 1024, // Max field value size (1MB)
    fields: 10, // Max number of non-file fields
    fileSize: 10 * 1024 * 1024, // Max file size (10MB)
    files: 5, // Max number of files
    headerPairs: 2000, // Max number of header pairs
    parts: 1000, // Max number of parts (fields + files)
  },
  // IMPORTANT: Throw on limit exceeded (default is to truncate silently!)
  throwFileSizeLimit: true,
  // attachFieldsToBody: true attaches all fields to request.body for easier
  // access — leave it off when using the request.file()/request.files() APIs below
  // Only accept specific file types (security!)
  // onFile: async (part) => {
  //   if (!['image/jpeg', 'image/png'].includes(part.mimetype)) {
  //     throw new Error('Invalid file type');
  //   }
  // },
});

// Handle file upload
app.post('/upload', async (request, reply) => {
  const data = await request.file();

  if (!data) {
    return reply.code(400).send({ error: 'No file uploaded' });
  }

  // data.file is a stream
  const buffer = await data.toBuffer();

  return {
    filename: data.filename,
    mimetype: data.mimetype,
    size: buffer.length,
  };
});

// Handle multiple files
app.post('/upload-multiple', async (request) => {
  const files = [];

  for await (const part of request.files()) {
    const buffer = await part.toBuffer();
    files.push({
      filename: part.filename,
      mimetype: part.mimetype,
      size: buffer.length,
    });
  }

  return { files };
});

// Handle mixed form data
app.post('/form', async (request) => {
  const parts = request.parts();
  const fields: Record<string, string> = {};
  const files: Array<{ name: string; size: number }> = [];

  for await (const part of parts) {
    if (part.type === 'file') {
      const buffer = await part.toBuffer();
      files.push({ name: part.filename, size: buffer.length });
    } else {
      fields[part.fieldname] = part.value as string;
    }
  }

  return { fields, files };
});
```

## Stream Processing

Process body as stream for large payloads:

```typescript
import { pipeline } from 'node:stream/promises';
import { createWriteStream } from 'node:fs';

// Add parser that returns stream
app.addContentTypeParser(
  'application/octet-stream',
  async (request, payload) => {
    return payload; // Return stream directly
  },
);

app.post('/upload-stream', async (request, reply) => {
  const destination = createWriteStream('./upload.bin');

  await pipeline(request.body, destination);

  return { success: true };
});
```

## Custom JSON Parser

Replace the default JSON parser:

```typescript
// Remove default parser
app.removeContentTypeParser('application/json');

// Add custom parser with error handling
app.addContentTypeParser(
  'application/json',
  { parseAs: 'string' },
  async (request, body) => {
    try {
      return JSON.parse(body);
    } catch {
      // Throw a real Error carrying statusCode so Fastify maps it to a 400
      const err = new Error('Invalid JSON payload') as Error & {
        statusCode: number;
        code: string;
      };
      err.statusCode = 400;
      err.code = 'INVALID_JSON';
      throw err;
    }
  },
);
```

## Content Type with Parameters

Handle content types with parameters:

```typescript
// Match an exact content type including its charset parameter
app.addContentTypeParser(
  'application/json; charset=utf-8',
  { parseAs: 'string' },
  async (request, body) => {
    return JSON.parse(body);
  },
);

// Use regex for flexible matching
app.addContentTypeParser(
  /^application\/.*\+json$/,
  { parseAs: 'string' },
  async (request, body) => {
    return JSON.parse(body);
  },
);
```

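The regex form is easy to get subtly wrong, so it is worth sanity-checking on its own. The pattern above targets vendor media types with a `+json` suffix (such as `application/ld+json`), and deliberately does not match plain `application/json`:

```typescript
// The vendor-suffix pattern from the parser above, exercised directly.
const jsonLike = /^application\/.*\+json$/;

console.log(jsonLike.test('application/ld+json'));      // true
console.log(jsonLike.test('application/problem+json')); // true
console.log(jsonLike.test('application/json'));         // false — no "+json" suffix
console.log(jsonLike.test('text/plain'));               // false
```

Plain `application/json` stays covered by the built-in (or custom) JSON parser, so the two registrations do not overlap.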
## Catch-All Parser

Handle unknown content types:

```typescript
app.addContentTypeParser('*', async (request, payload) => {
  const chunks: Buffer[] = [];

  for await (const chunk of payload) {
    chunks.push(chunk);
  }

  const buffer = Buffer.concat(chunks);

  // Try to determine content type
  const contentType = request.headers['content-type'];

  if (contentType?.includes('json')) {
    return JSON.parse(buffer.toString('utf-8'));
  }

  if (contentType?.includes('text')) {
    return buffer.toString('utf-8');
  }

  return buffer;
});
```

## Body Limit Configuration

Configure body size limits:

```typescript
// Global limit
const app = Fastify({
  bodyLimit: 1048576, // 1MB
});

// Per-route limit
app.post('/large-upload', {
  bodyLimit: 52428800, // 50MB for this route
}, async (request) => {
  return { size: JSON.stringify(request.body).length };
});

// Per content type limit
app.addContentTypeParser('application/json', {
  parseAs: 'string',
  bodyLimit: 2097152, // 2MB for JSON
}, async (request, body) => {
  return JSON.parse(body);
});
```

## Protocol Buffers

Parse protobuf content:

```typescript
import protobuf from 'protobufjs';

const root = await protobuf.load('./schema.proto');
const MessageType = root.lookupType('package.MessageType');

app.addContentTypeParser(
  'application/x-protobuf',
  { parseAs: 'buffer' },
  async (request, body) => {
    const message = MessageType.decode(body);
    return MessageType.toObject(message);
  },
);
```

## Form Data with @fastify/formbody

Simple form parsing:

```typescript
import formbody from '@fastify/formbody';

app.register(formbody);

app.post('/form', async (request) => {
  // request.body is parsed form data
  const { name, email } = request.body as { name: string; email: string };
  return { name, email };
});
```

## Content Negotiation

Handle different request formats:

```typescript
app.post('/data', async (request, reply) => {
  // Body is already parsed by the appropriate content type parser
  const data = request.body;

  // Respond based on Accept header
  const accept = request.headers.accept;

  if (accept?.includes('application/xml')) {
    reply.type('application/xml');
    return `<data>${JSON.stringify(data)}</data>`;
  }

  reply.type('application/json');
  return data;
});
```

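The Accept-header branch above can be pulled out into a small pure function and tested without a server. This is a deliberately minimal sketch — real content negotiation should also handle q-values and wildcards, which this does not:

```typescript
// Minimal Accept-header dispatch mirroring the route above (illustrative only).
function pickResponseType(
  accept: string | undefined,
): 'application/xml' | 'application/json' {
  // JSON is the default; XML only when explicitly requested
  return accept?.includes('application/xml') ? 'application/xml' : 'application/json';
}

console.log(pickResponseType('application/xml'));             // 'application/xml'
console.log(pickResponseType('application/json, text/html')); // 'application/json'
console.log(pickResponseType(undefined));                     // 'application/json'
```

Factoring the decision out like this keeps the route handler focused on serialization and makes the fallback behavior explicit.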
## Validation After Parsing

Validate parsed content:

```typescript
app.post('/users', {
  schema: {
    body: {
      type: 'object',
      properties: {
        name: { type: 'string', minLength: 1 },
        email: { type: 'string', format: 'email' },
      },
      required: ['name', 'email'],
    },
  },
}, async (request) => {
  // Body is parsed AND validated
  return request.body;
});
```

.claude/skills/fastify-best-practices/rules/cors-security.md — 445 lines (new file)

---
name: cors-security
description: CORS and security headers in Fastify
metadata:
  tags: cors, security, headers, helmet, csrf
---

# CORS and Security

## CORS with @fastify/cors

Enable Cross-Origin Resource Sharing:

```typescript
import Fastify from 'fastify';
import cors from '@fastify/cors';

const app = Fastify();

// Simple CORS - allow all origins
app.register(cors);

// Configured CORS
app.register(cors, {
  origin: ['https://example.com', 'https://app.example.com'],
  methods: ['GET', 'POST', 'PUT', 'DELETE'],
  allowedHeaders: ['Content-Type', 'Authorization'],
  exposedHeaders: ['X-Total-Count'],
  credentials: true,
  maxAge: 86400, // 24 hours
});
```

## Dynamic CORS Origin

Validate origins dynamically:

```typescript
app.register(cors, {
  origin: (origin, callback) => {
    // Allow requests with no origin (mobile apps, curl, etc.)
    if (!origin) {
      return callback(null, true);
    }

    // Check against allowed origins
    const allowedOrigins = [
      'https://example.com',
      'https://app.example.com',
      /\.example\.com$/,
    ];

    const isAllowed = allowedOrigins.some((allowed) => {
      if (allowed instanceof RegExp) {
        return allowed.test(origin);
      }
      return allowed === origin;
    });

    if (isAllowed) {
      callback(null, true);
    } else {
      callback(new Error('Not allowed by CORS'), false);
    }
  },
  credentials: true,
});
```

## Per-Route CORS
|
||||
|
||||
Configure CORS for specific routes:
|
||||
|
||||
```typescript
|
||||
app.register(cors, {
|
||||
origin: true, // Reflect request origin
|
||||
credentials: true,
|
||||
});
|
||||
|
||||
// Or disable CORS for specific routes
|
||||
app.route({
|
||||
method: 'GET',
|
||||
url: '/internal',
|
||||
config: {
|
||||
cors: false,
|
||||
},
|
||||
handler: async () => {
|
||||
return { internal: true };
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Security Headers with @fastify/helmet

Add security headers:

```typescript
import helmet from '@fastify/helmet';

app.register(helmet, {
  contentSecurityPolicy: {
    directives: {
      defaultSrc: ["'self'"],
      scriptSrc: ["'self'", "'unsafe-inline'"],
      styleSrc: ["'self'", "'unsafe-inline'"],
      imgSrc: ["'self'", 'data:', 'https:'],
      connectSrc: ["'self'", 'https://api.example.com'],
    },
  },
  crossOriginEmbedderPolicy: false, // Disable if embedding external resources
});
```

## Configure Individual Headers

Fine-tune security headers:

```typescript
app.register(helmet, {
  // Strict-Transport-Security
  hsts: {
    maxAge: 31536000, // 1 year
    includeSubDomains: true,
    preload: true,
  },

  // Content-Security-Policy
  contentSecurityPolicy: {
    useDefaults: true,
    directives: {
      'script-src': ["'self'", 'https://trusted-cdn.com'],
    },
  },

  // X-Frame-Options
  frameguard: {
    action: 'deny', // or 'sameorigin'
  },

  // X-Content-Type-Options
  noSniff: true,

  // X-XSS-Protection (legacy)
  xssFilter: true,

  // Referrer-Policy
  referrerPolicy: {
    policy: 'strict-origin-when-cross-origin',
  },

  // X-Permitted-Cross-Domain-Policies
  permittedCrossDomainPolicies: false,

  // X-DNS-Prefetch-Control
  dnsPrefetchControl: {
    allow: false,
  },
});
```

## Rate Limiting

Protect against abuse:

```typescript
import rateLimit from '@fastify/rate-limit';

app.register(rateLimit, {
  max: 100,
  timeWindow: '1 minute',
  errorResponseBuilder: (request, context) => ({
    statusCode: 429,
    error: 'Too Many Requests',
    message: `Rate limit exceeded. Retry in ${context.after}`,
    retryAfter: context.after,
  }),
});

// Per-route rate limit
app.get('/expensive', {
  config: {
    rateLimit: {
      max: 10,
      timeWindow: '1 minute',
    },
  },
}, handler);

// Skip rate limiting for certain routes
app.get('/health', {
  config: {
    rateLimit: false,
  },
}, () => ({ status: 'ok' }));
```

## Redis-Based Rate Limiting

Use Redis for distributed rate limiting:

```typescript
import rateLimit from '@fastify/rate-limit';
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

app.register(rateLimit, {
  max: 100,
  timeWindow: '1 minute',
  redis,
  nameSpace: 'rate-limit:',
  keyGenerator: (request) => {
    // Rate limit by user ID if authenticated, otherwise by IP
    return request.user?.id || request.ip;
  },
});
```

## CSRF Protection

Protect against Cross-Site Request Forgery:

```typescript
import fastifyCsrf from '@fastify/csrf-protection';
import fastifyCookie from '@fastify/cookie';

// A secret is required for the signed CSRF cookie below
app.register(fastifyCookie, { secret: process.env.COOKIE_SECRET });
app.register(fastifyCsrf, {
  cookieOpts: {
    signed: true,
    httpOnly: true,
    sameSite: 'strict',
  },
});

// Generate token
app.get('/csrf-token', async (request, reply) => {
  const token = reply.generateCsrf();
  return { token };
});

// Protected route
app.post('/transfer', {
  preHandler: app.csrfProtection,
}, async (request) => {
  // CSRF token validated
  return { success: true };
});
```

## Custom Security Headers

Add custom headers:

```typescript
app.addHook('onSend', async (request, reply) => {
  // Custom security headers
  reply.header('X-Request-ID', request.id);
  reply.header('X-Content-Type-Options', 'nosniff');
  reply.header('X-Frame-Options', 'DENY');
  reply.header('Permissions-Policy', 'geolocation=(), camera=()');
});

// Per-route headers
app.get('/download', async (request, reply) => {
  reply.header('Content-Disposition', 'attachment; filename="file.pdf"');
  reply.header('X-Download-Options', 'noopen');
  return reply.send(fileStream); // fileStream: a Readable for the served file
});
```

## Secure Cookies

Configure secure cookies:

```typescript
import cookie from '@fastify/cookie';

app.register(cookie, {
  secret: process.env.COOKIE_SECRET,
  parseOptions: {
    httpOnly: true,
    secure: process.env.NODE_ENV === 'production',
    sameSite: 'strict',
    path: '/',
    maxAge: 3600, // 1 hour
  },
});

// Set secure cookie
app.post('/login', async (request, reply) => {
  const token = await createSession(request.body);

  reply.setCookie('session', token, {
    httpOnly: true,
    secure: true,
    sameSite: 'strict',
    path: '/',
    maxAge: 86400,
    signed: true,
  });

  return { success: true };
});

// Read signed cookie
app.get('/profile', async (request) => {
  const session = request.cookies.session;
  if (!session) {
    throw { statusCode: 401, message: 'Missing session' };
  }

  const unsigned = request.unsignCookie(session);
  if (!unsigned.valid) {
    throw { statusCode: 401, message: 'Invalid session' };
  }

  return { sessionId: unsigned.value };
});
```

## Request Validation Security

Validate and sanitize input:

```typescript
// Schema-based validation protects against injection
app.post('/users', {
  schema: {
    body: {
      type: 'object',
      properties: {
        email: {
          type: 'string',
          format: 'email',
          maxLength: 254,
        },
        name: {
          type: 'string',
          minLength: 1,
          maxLength: 100,
          pattern: '^[a-zA-Z\\s]+$', // Only letters and spaces
        },
      },
      required: ['email', 'name'],
      additionalProperties: false,
    },
  },
}, handler);
```

## IP Filtering

Restrict access by IP:

```typescript
const allowedIps = new Set([
  '192.168.1.0/24',
  '10.0.0.0/8',
]);

app.addHook('onRequest', async (request, reply) => {
  if (request.url.startsWith('/admin')) {
    const clientIp = request.ip;

    if (!isIpAllowed(clientIp, allowedIps)) {
      return reply.code(403).send({ error: 'Forbidden' });
    }
  }
});

function isIpAllowed(ip: string, allowed: Set<string>): boolean {
  // Implement IP/CIDR matching
  for (const range of allowed) {
    if (ipInRange(ip, range)) return true;
  }
  return false;
}
```
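
The `ipInRange` helper above is left unimplemented. For plain IPv4 CIDR ranges it can be sketched with bitmask arithmetic (a minimal sketch with no external dependencies; IPv6 and malformed input are not handled):

```typescript
// Convert a dotted-quad IPv4 address to an unsigned 32-bit integer
function ipToInt(ip: string): number {
  return ip.split('.').reduce((acc, octet) => (acc << 8) + parseInt(octet, 10), 0) >>> 0;
}

// True if `ip` falls inside `range` ("10.0.0.0/8", or a bare address for exact match)
function ipInRange(ip: string, range: string): boolean {
  const [base, prefixStr] = range.split('/');
  const prefix = prefixStr === undefined ? 32 : parseInt(prefixStr, 10);
  if (prefix === 0) return true; // "/0" matches everything
  const mask = (~0 << (32 - prefix)) >>> 0;
  return (ipToInt(ip) & mask) === (ipToInt(base) & mask);
}
```

A production setup would more likely lean on a tested library for this, along with IPv6-aware parsing.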

## Trust Proxy

Configure for reverse proxy environments:

```typescript
const app = Fastify({
  trustProxy: true, // Trust X-Forwarded-* headers
});

// Or specific proxy configuration
const app = Fastify({
  trustProxy: ['127.0.0.1', '10.0.0.0/8'],
});

// Now request.ip returns the real client IP
app.get('/ip', async (request) => {
  return {
    ip: request.ip,
    ips: request.ips, // Array of all IPs in the chain
  };
});
```

## HTTPS Redirect

Force HTTPS in production:

```typescript
app.addHook('onRequest', async (request, reply) => {
  if (
    process.env.NODE_ENV === 'production' &&
    request.headers['x-forwarded-proto'] !== 'https'
  ) {
    const httpsUrl = `https://${request.hostname}${request.url}`;
    return reply.redirect(301, httpsUrl);
  }
});
```

## Security Best Practices Summary

```typescript
import Fastify from 'fastify';
import cors from '@fastify/cors';
import helmet from '@fastify/helmet';
import rateLimit from '@fastify/rate-limit';

const app = Fastify({
  trustProxy: true,
  bodyLimit: 1048576, // 1MB max body
});

// Security plugins
app.register(helmet);
app.register(cors, {
  origin: process.env.ALLOWED_ORIGINS?.split(','),
  credentials: true,
});
app.register(rateLimit, {
  max: 100,
  timeWindow: '1 minute',
});

// Validate all input with schemas
// Never expose internal errors in production
// Use parameterized queries for database access
// Keep dependencies updated
```

320
.claude/skills/fastify-best-practices/rules/database.md
Normal file

@@ -0,0 +1,320 @@
---
name: database
description: Database integration with Fastify using official adapters
metadata:
  tags: database, postgres, mysql, mongodb, redis, sql
---

# Database Integration

## Use Official Fastify Database Adapters

Always use the official Fastify database plugins from the `@fastify` organization. They provide proper connection pooling, encapsulation, and integration with Fastify's lifecycle.

## PostgreSQL with @fastify/postgres

```typescript
import Fastify from 'fastify';
import fastifyPostgres from '@fastify/postgres';

const app = Fastify({ logger: true });

app.register(fastifyPostgres, {
  connectionString: process.env.DATABASE_URL,
});

// Use in routes
app.get('/users', async (request) => {
  const client = await app.pg.connect();
  try {
    const { rows } = await client.query('SELECT * FROM users');
    return rows;
  } finally {
    client.release();
  }
});

// Or use the pool directly for simple queries
app.get('/users/:id', async (request) => {
  const { id } = request.params;
  const { rows } = await app.pg.query(
    'SELECT * FROM users WHERE id = $1',
    [id],
  );
  return rows[0];
});

// Transactions
app.post('/transfer', async (request) => {
  const { fromId, toId, amount } = request.body;
  const client = await app.pg.connect();

  try {
    await client.query('BEGIN');
    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId],
    );
    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId],
    );
    await client.query('COMMIT');
    return { success: true };
  } catch (error) {
    await client.query('ROLLBACK');
    throw error;
  } finally {
    client.release();
  }
});
```

## MySQL with @fastify/mysql

```typescript
import Fastify from 'fastify';
import fastifyMysql from '@fastify/mysql';

const app = Fastify({ logger: true });

app.register(fastifyMysql, {
  promise: true,
  connectionString: process.env.MYSQL_URL,
});

app.get('/users', async (request) => {
  const connection = await app.mysql.getConnection();
  try {
    const [rows] = await connection.query('SELECT * FROM users');
    return rows;
  } finally {
    connection.release();
  }
});
```

## MongoDB with @fastify/mongodb

```typescript
import Fastify from 'fastify';
import fastifyMongo from '@fastify/mongodb';

const app = Fastify({ logger: true });

app.register(fastifyMongo, {
  url: process.env.MONGODB_URL,
});

app.get('/users', async (request) => {
  const users = await app.mongo.db
    .collection('users')
    .find({})
    .toArray();
  return users;
});

app.get('/users/:id', async (request) => {
  const { id } = request.params;
  const user = await app.mongo.db
    .collection('users')
    .findOne({ _id: new app.mongo.ObjectId(id) });
  return user;
});

app.post('/users', async (request) => {
  const result = await app.mongo.db
    .collection('users')
    .insertOne(request.body);
  return { id: result.insertedId };
});
```

## Redis with @fastify/redis

```typescript
import Fastify from 'fastify';
import fastifyRedis from '@fastify/redis';

const app = Fastify({ logger: true });

app.register(fastifyRedis, {
  url: process.env.REDIS_URL,
});

// Caching example
app.get('/data/:key', async (request) => {
  const { key } = request.params;

  // Try cache first
  const cached = await app.redis.get(`cache:${key}`);
  if (cached) {
    return JSON.parse(cached);
  }

  // Fetch from database
  const data = await fetchFromDatabase(key);

  // Cache for 5 minutes
  await app.redis.setex(`cache:${key}`, 300, JSON.stringify(data));

  return data;
});
```

## Database as Plugin

Encapsulate database access in a plugin:

```typescript
// plugins/database.ts
import fp from 'fastify-plugin';
import fastifyPostgres from '@fastify/postgres';

export default fp(async function databasePlugin(fastify) {
  await fastify.register(fastifyPostgres, {
    connectionString: fastify.config.DATABASE_URL,
  });

  // Add health check
  fastify.decorate('checkDatabaseHealth', async () => {
    try {
      await fastify.pg.query('SELECT 1');
      return true;
    } catch {
      return false;
    }
  });
}, {
  name: 'database',
  dependencies: ['config'],
});
```

## Repository Pattern

Abstract database access with repositories:

```typescript
// repositories/user.repository.ts
import type { FastifyInstance } from 'fastify';

export interface User {
  id: string;
  email: string;
  name: string;
}

export function createUserRepository(app: FastifyInstance) {
  return {
    async findById(id: string): Promise<User | null> {
      const { rows } = await app.pg.query(
        'SELECT * FROM users WHERE id = $1',
        [id],
      );
      return rows[0] || null;
    },

    async findByEmail(email: string): Promise<User | null> {
      const { rows } = await app.pg.query(
        'SELECT * FROM users WHERE email = $1',
        [email],
      );
      return rows[0] || null;
    },

    async create(data: Omit<User, 'id'>): Promise<User> {
      const { rows } = await app.pg.query(
        'INSERT INTO users (email, name) VALUES ($1, $2) RETURNING *',
        [data.email, data.name],
      );
      return rows[0];
    },

    async update(id: string, data: Partial<User>): Promise<User | null> {
      // Field names come from the typed Partial<User>, not raw user input;
      // never interpolate user-controlled keys into SQL
      const fields = Object.keys(data);
      const values = Object.values(data);
      const setClause = fields
        .map((f, i) => `${f} = $${i + 2}`)
        .join(', ');

      const { rows } = await app.pg.query(
        `UPDATE users SET ${setClause} WHERE id = $1 RETURNING *`,
        [id, ...values],
      );
      return rows[0] || null;
    },

    async delete(id: string): Promise<boolean> {
      const { rowCount } = await app.pg.query(
        'DELETE FROM users WHERE id = $1',
        [id],
      );
      return rowCount > 0;
    },
  };
}

// Usage in a plugin
import fp from 'fastify-plugin';
import { createUserRepository } from './repositories/user.repository.js';

export default fp(async function repositoriesPlugin(fastify) {
  fastify.decorate('repositories', {
    users: createUserRepository(fastify),
  });
}, {
  name: 'repositories',
  dependencies: ['database'],
});
```

## Testing with Database

Use transactions for test isolation:

```typescript
import { describe, it, beforeEach, afterEach } from 'node:test';
import { build } from './app.js';

describe('User API', () => {
  let app;
  let client;

  beforeEach(async () => {
    app = await build();
    client = await app.pg.connect();
    await client.query('BEGIN');
  });

  afterEach(async () => {
    await client.query('ROLLBACK');
    client.release();
    await app.close();
  });

  it('should create a user', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: { email: 'test@example.com', name: 'Test' },
    });

    t.assert.equal(response.statusCode, 201);
  });
});
```

Note that the rollback only covers queries issued through `client`; for it to isolate the application's own queries, the test transaction's client must be shared with the app (for example by decorating the instance with it).

## Connection Pool Configuration

Configure connection pools appropriately:

```typescript
app.register(fastifyPostgres, {
  connectionString: process.env.DATABASE_URL,
  // Pool configuration
  max: 20, // Maximum pool size
  idleTimeoutMillis: 30000, // Close idle clients after 30s
  connectionTimeoutMillis: 5000, // Timeout for new connections
});
```

416
.claude/skills/fastify-best-practices/rules/decorators.md
Normal file

@@ -0,0 +1,416 @@
---
name: decorators
description: Decorators and request/reply extensions in Fastify
metadata:
  tags: decorators, extensions, customization, utilities
---

# Decorators and Extensions

## Understanding Decorators

Decorators add custom properties and methods to Fastify instances, requests, and replies:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Decorate the Fastify instance
app.decorate('utility', {
  formatDate: (date: Date) => date.toISOString(),
  generateId: () => crypto.randomUUID(),
});

// Use in routes
app.get('/example', async function (request, reply) {
  const id = this.utility.generateId();
  return { id, timestamp: this.utility.formatDate(new Date()) };
});
```

## Decorator Types

Three types of decorators for different contexts:

```typescript
// Instance decorator - available on the fastify instance
app.decorate('config', { apiVersion: '1.0.0' });
app.decorate('db', databaseConnection);
app.decorate('cache', cacheClient);

// Request decorator - available on each request
app.decorateRequest('user', null); // Object property
app.decorateRequest('startTime', 0); // Primitive
app.decorateRequest('getData', function () { // Method
  return this.body;
});

// Reply decorator - available on each reply
app.decorateReply('sendError', function (code: number, message: string) {
  return this.code(code).send({ error: message });
});
app.decorateReply('success', function (data: unknown) {
  return this.send({ success: true, data });
});
```

## TypeScript Declaration Merging

Extend Fastify types for type safety:

```typescript
// Declare custom properties
declare module 'fastify' {
  interface FastifyInstance {
    config: {
      apiVersion: string;
      environment: string;
    };
    db: DatabaseClient;
    cache: CacheClient;
  }

  interface FastifyRequest {
    user: {
      id: string;
      email: string;
      roles: string[];
    } | null;
    startTime: number;
    requestId: string;
  }

  interface FastifyReply {
    sendError: (code: number, message: string) => void;
    success: (data: unknown) => void;
  }
}

// Register decorators
app.decorate('config', {
  apiVersion: '1.0.0',
  environment: process.env.NODE_ENV ?? 'development',
});

app.decorateRequest('user', null);
app.decorateRequest('startTime', 0);

app.decorateReply('sendError', function (code: number, message: string) {
  this.code(code).send({ error: message });
});
```

## Decorator Initialization

Initialize request/reply decorators in hooks:

```typescript
// Decorators with primitive defaults are copied per request
app.decorateRequest('startTime', 0);

// Initialize in a hook
app.addHook('onRequest', async (request) => {
  request.startTime = Date.now();
});

// Reference-type decorators should default to null and be assigned
// per-request in a hook, so requests never share a mutable object
app.decorateRequest('context', null);

app.addHook('onRequest', async (request) => {
  request.context = {
    traceId: request.headers['x-trace-id'] || crypto.randomUUID(),
    clientIp: request.ip,
    userAgent: request.headers['user-agent'],
  };
});
```

## Dependency Injection with Decorators

Use decorators for dependency injection:

```typescript
import fp from 'fastify-plugin';

// Database plugin (plugins/database.ts)
export default fp(async function databasePlugin(fastify, options) {
  const db = await createDatabaseConnection(options.connectionString);

  fastify.decorate('db', db);

  fastify.addHook('onClose', async () => {
    await db.close();
  });
}, {
  name: 'database-plugin', // named so other plugins can declare it as a dependency
});

// User service plugin (plugins/user-service.ts)
export default fp(async function userServicePlugin(fastify) {
  // Depends on the db decorator
  if (!fastify.hasDecorator('db')) {
    throw new Error('Database plugin must be registered first');
  }

  const userService = {
    findById: (id: string) => fastify.db.query('SELECT * FROM users WHERE id = $1', [id]),
    create: (data: CreateUserInput) => fastify.db.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [data.name, data.email]
    ),
  };

  fastify.decorate('userService', userService);
}, {
  dependencies: ['database-plugin'],
});

// Use in routes
app.get('/users/:id', async function (request) {
  const user = await this.userService.findById(request.params.id);
  return user;
});
```

## Request Context Pattern

Build rich request context:

```typescript
interface RequestContext {
  traceId: string;
  user: User | null;
  permissions: Set<string>;
  startTime: number;
  metadata: Map<string, unknown>;
}

declare module 'fastify' {
  interface FastifyRequest {
    ctx: RequestContext;
  }
}

app.decorateRequest('ctx', null);

app.addHook('onRequest', async (request) => {
  request.ctx = {
    traceId: request.headers['x-trace-id']?.toString() || crypto.randomUUID(),
    user: null,
    permissions: new Set(),
    startTime: Date.now(),
    metadata: new Map(),
  };
});

// Auth hook populates the user
app.addHook('preHandler', async (request) => {
  const token = request.headers.authorization;
  if (token) {
    const user = await verifyToken(token);
    request.ctx.user = user;
    request.ctx.permissions = new Set(user.permissions);
  }
});

// Use in handlers
app.get('/profile', async (request, reply) => {
  if (!request.ctx.user) {
    return reply.code(401).send({ error: 'Unauthorized' });
  }

  if (!request.ctx.permissions.has('read:profile')) {
    return reply.code(403).send({ error: 'Forbidden' });
  }

  return request.ctx.user;
});
```

## Reply Helpers

Create consistent response methods:

```typescript
declare module 'fastify' {
  interface FastifyReply {
    ok: (data?: unknown) => void;
    created: (data: unknown) => void;
    noContent: () => void;
    badRequest: (message: string, details?: unknown) => void;
    unauthorized: (message?: string) => void;
    forbidden: (message?: string) => void;
    notFound: (resource?: string) => void;
    conflict: (message: string) => void;
    serverError: (message?: string) => void;
  }
}

app.decorateReply('ok', function (data?: unknown) {
  this.code(200).send(data ?? { success: true });
});

app.decorateReply('created', function (data: unknown) {
  this.code(201).send(data);
});

app.decorateReply('noContent', function () {
  this.code(204).send();
});

app.decorateReply('badRequest', function (message: string, details?: unknown) {
  this.code(400).send({
    statusCode: 400,
    error: 'Bad Request',
    message,
    details,
  });
});

app.decorateReply('unauthorized', function (message = 'Authentication required') {
  this.code(401).send({
    statusCode: 401,
    error: 'Unauthorized',
    message,
  });
});

app.decorateReply('notFound', function (resource = 'Resource') {
  this.code(404).send({
    statusCode: 404,
    error: 'Not Found',
    message: `${resource} not found`,
  });
});

// Usage
app.get('/users/:id', async (request, reply) => {
  const user = await db.users.findById(request.params.id);
  if (!user) {
    return reply.notFound('User');
  }
  return reply.ok(user);
});

app.post('/users', async (request, reply) => {
  const user = await db.users.create(request.body);
  return reply.created(user);
});
```

## Checking Decorators

Check if decorators exist before using them:

```typescript
// Check at registration time
app.register(async function (fastify) {
  if (!fastify.hasDecorator('db')) {
    throw new Error('Database decorator required');
  }

  if (!fastify.hasRequestDecorator('user')) {
    throw new Error('User request decorator required');
  }

  if (!fastify.hasReplyDecorator('sendError')) {
    throw new Error('sendError reply decorator required');
  }

  // Safe to use decorators
});
```

## Decorator Encapsulation

Decorators respect encapsulation by default:

```typescript
app.register(async function pluginA(fastify) {
  fastify.decorate('pluginAUtil', () => 'A');

  fastify.get('/a', async function () {
    return this.pluginAUtil(); // Works
  });
});

app.register(async function pluginB(fastify) {
  // this.pluginAUtil is NOT available here (encapsulated)

  fastify.get('/b', async function () {
    // this.pluginAUtil() would be undefined
  });
});
```

Use `fastify-plugin` to share decorators:

```typescript
import fp from 'fastify-plugin';

export default fp(async function sharedDecorator(fastify) {
  fastify.decorate('sharedUtil', () => 'shared');
});

// Now available to parent and sibling plugins
```

## Functional Decorators

Create decorators that return functions:

```typescript
import Ajv from 'ajv';

const ajv = new Ajv();

declare module 'fastify' {
  interface FastifyInstance {
    createValidator: <T>(schema: object) => (data: unknown) => T;
    createRateLimiter: (options: RateLimitOptions) => RateLimiter;
  }
}

app.decorate('createValidator', function <T>(schema: object) {
  const validate = ajv.compile(schema);
  return (data: unknown): T => {
    if (!validate(data)) {
      // ValidationError: an app-defined error type wrapping validate.errors
      throw new ValidationError(validate.errors);
    }
    return data as T;
  };
});

// Usage
const validateUser = app.createValidator<User>(userSchema);

app.post('/users', async (request) => {
  const user = validateUser(request.body);
  return db.users.create(user);
});
```

## Async Decorator Initialization

Handle async initialization properly:

```typescript
import fp from 'fastify-plugin';

export default fp(async function asyncPlugin(fastify) {
  // Async initialization
  const connection = await createAsyncConnection();
  const cache = await initializeCache();

  fastify.decorate('asyncService', {
    connection,
    cache,
    query: async (sql: string) => connection.query(sql),
  });

  fastify.addHook('onClose', async () => {
    await connection.close();
    await cache.disconnect();
  });
});

// The plugin is fully initialized before routes execute
app.get('/data', async function () {
  return this.asyncService.query('SELECT * FROM data');
});
```

425
.claude/skills/fastify-best-practices/rules/deployment.md
Normal file

@@ -0,0 +1,425 @@
---
name: deployment
description: Production deployment for Fastify applications
metadata:
  tags: deployment, production, docker, kubernetes, scaling
---

# Production Deployment

## Graceful Shutdown with close-with-grace

Use `close-with-grace` for proper shutdown handling:

```typescript
import Fastify from 'fastify';
import closeWithGrace from 'close-with-grace';

const app = Fastify({ logger: true });

// Register plugins and routes
await app.register(import('./plugins/index.js'));
await app.register(import('./routes/index.js'));

// Graceful shutdown handler
closeWithGrace({ delay: 10000 }, async ({ signal, err }) => {
  if (err) {
    app.log.error({ err }, 'Server closing due to error');
  } else {
    app.log.info({ signal }, 'Server closing due to signal');
  }

  await app.close();
});

// Start server; listen() resolves with the bound address string
const address = await app.listen({
  port: parseInt(process.env.PORT || '3000', 10),
  host: '0.0.0.0',
});

app.log.info(`Server listening at ${address}`);
```
|
||||
|
||||
## Health Check Endpoints

Implement comprehensive health checks:

```typescript
app.get('/health', async () => {
  return { status: 'ok', timestamp: new Date().toISOString() };
});

app.get('/health/live', async () => {
  return { status: 'ok' };
});

app.get('/health/ready', async (request, reply) => {
  const checks = {
    database: false,
    cache: false,
  };

  try {
    await app.db`SELECT 1`;
    checks.database = true;
  } catch {
    // Database not ready
  }

  try {
    await app.cache.ping();
    checks.cache = true;
  } catch {
    // Cache not ready
  }

  const allHealthy = Object.values(checks).every(Boolean);

  if (!allHealthy) {
    reply.code(503);
  }

  return {
    status: allHealthy ? 'ok' : 'degraded',
    checks,
    timestamp: new Date().toISOString(),
  };
});

// Detailed health for monitoring
app.get('/health/details', {
  preHandler: [app.authenticate, app.requireAdmin],
}, async () => {
  const memory = process.memoryUsage();

  return {
    status: 'ok',
    uptime: process.uptime(),
    memory: {
      heapUsed: Math.round(memory.heapUsed / 1024 / 1024),
      heapTotal: Math.round(memory.heapTotal / 1024 / 1024),
      rss: Math.round(memory.rss / 1024 / 1024),
    },
    version: process.env.APP_VERSION,
    nodeVersion: process.version,
  };
});
```
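The readiness aggregation in `/health/ready` boils down to a pure function; a minimal sketch (the helper name and return shape are illustrative, not part of the skill file):

```typescript
// Sketch: aggregate named readiness checks into an overall status,
// mirroring the allHealthy logic of the /health/ready handler.
type Checks = Record<string, boolean>;

function readinessStatus(checks: Checks): { status: 'ok' | 'degraded'; httpCode: number } {
  const allHealthy = Object.values(checks).every(Boolean);
  return {
    status: allHealthy ? 'ok' : 'degraded',
    httpCode: allHealthy ? 200 : 503,
  };
}

console.log(readinessStatus({ database: true, cache: false }));
// → { status: 'degraded', httpCode: 503 }
```

Keeping the aggregation separate from the probing makes the 200-vs-503 decision trivially unit-testable.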
## Docker Configuration

Create an optimized Dockerfile:

```dockerfile
# Build stage
FROM node:22-alpine AS builder

WORKDIR /app

COPY package*.json ./
RUN npm ci --omit=dev

COPY . .

# Production stage
FROM node:22-alpine

WORKDIR /app

# Run as non-root user
RUN addgroup -g 1001 -S nodejs && \
    adduser -S nodejs -u 1001

# Copy from builder
COPY --from=builder --chown=nodejs:nodejs /app/node_modules ./node_modules
COPY --from=builder --chown=nodejs:nodejs /app/src ./src
COPY --from=builder --chown=nodejs:nodejs /app/package.json ./

USER nodejs

EXPOSE 3000

ENV NODE_ENV=production
ENV PORT=3000

# Health check
HEALTHCHECK --interval=30s --timeout=5s --start-period=5s --retries=3 \
  CMD wget --no-verbose --tries=1 --spider http://localhost:3000/health || exit 1

# Node 22.6+ can strip types at runtime; for older runtimes, compile to JS first
CMD ["node", "--experimental-strip-types", "src/app.ts"]
```

```yaml
# docker-compose.yml
services:
  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - DATABASE_URL=postgres://user:pass@db:5432/app
      - JWT_SECRET=${JWT_SECRET}
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped

  db:
    image: postgres:16-alpine
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=app
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d app"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  pgdata:
```
## Kubernetes Deployment

Deploy to Kubernetes:

```yaml
# deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fastify-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: fastify-api
  template:
    metadata:
      labels:
        app: fastify-api
    spec:
      containers:
        - name: api
          image: my-registry/fastify-api:latest
          ports:
            - containerPort: 3000
          env:
            - name: NODE_ENV
              value: "production"
            - name: DATABASE_URL
              valueFrom:
                secretKeyRef:
                  name: api-secrets
                  key: database-url
          resources:
            requests:
              memory: "256Mi"
              cpu: "100m"
            limits:
              memory: "512Mi"
              cpu: "500m"
          livenessProbe:
            httpGet:
              path: /health/live
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 10
          readinessProbe:
            httpGet:
              path: /health/ready
              port: 3000
            initialDelaySeconds: 5
            periodSeconds: 5
          lifecycle:
            preStop:
              exec:
                command: ["/bin/sh", "-c", "sleep 5"]
---
apiVersion: v1
kind: Service
metadata:
  name: fastify-api
spec:
  selector:
    app: fastify-api
  ports:
    - port: 80
      targetPort: 3000
  type: ClusterIP
```
## Production Logger Configuration

Configure logging for production:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: {
    level: process.env.LOG_LEVEL || 'info',
    // JSON output for log aggregation
    formatters: {
      level: (label) => ({ level: label }),
      bindings: (bindings) => ({
        pid: bindings.pid,
        hostname: bindings.hostname,
        service: 'fastify-api',
        version: process.env.APP_VERSION,
      }),
    },
    timestamp: () => `,"time":"${new Date().toISOString()}"`,
    // Redact sensitive data
    redact: {
      paths: [
        'req.headers.authorization',
        'req.headers.cookie',
        '*.password',
        '*.token',
        '*.secret',
      ],
      censor: '[REDACTED]',
    },
  },
});
```
## Request Timeouts

Configure appropriate timeouts:

```typescript
const app = Fastify({
  connectionTimeout: 30000, // 30s connection timeout
  keepAliveTimeout: 72000,  // 72s keep-alive (longer than the ALB's 60s idle timeout)
  requestTimeout: 30000,    // 30s request timeout
  bodyLimit: 1048576,       // 1MB body limit
});

// Per-route timeout
app.get('/long-operation', {
  config: {
    timeout: 60000, // 60s for this route
  },
}, longOperationHandler);
```
## Trust Proxy Settings

Configure for load balancers (the options below are mutually exclusive alternatives):

```typescript
const app = Fastify({
  // Trust all proxies (e.g. a load balancer in front of every request)
  trustProxy: true,

  // Or trust specific proxies:
  // trustProxy: ['127.0.0.1', '10.0.0.0/8'],

  // Or a number of proxy hops to trust:
  // trustProxy: 1,
});

// Now request.ip returns the real client IP
```
## Static File Serving

Serve static files efficiently. **Always use `import.meta.dirname` as the base path**, never `process.cwd()` — the working directory varies with how the process is launched:

```typescript
import fastifyStatic from '@fastify/static';
import { join } from 'node:path';

app.register(fastifyStatic, {
  root: join(import.meta.dirname, '..', 'public'),
  prefix: '/static/',
  maxAge: '1d',
  immutable: true,
  etag: true,
  lastModified: true,
});
```
## Compression

Enable response compression:

```typescript
import fastifyCompress from '@fastify/compress';

app.register(fastifyCompress, {
  global: true,
  threshold: 1024, // Only compress responses > 1KB
  encodings: ['gzip', 'deflate'],
});
```
## Metrics and Monitoring

Expose Prometheus metrics:

```typescript
import { register, collectDefaultMetrics, Counter, Histogram } from 'prom-client';

collectDefaultMetrics();

const httpRequestDuration = new Histogram({
  name: 'http_request_duration_seconds',
  help: 'Duration of HTTP requests in seconds',
  labelNames: ['method', 'route', 'status'],
  buckets: [0.01, 0.05, 0.1, 0.5, 1, 5],
});

const httpRequestTotal = new Counter({
  name: 'http_requests_total',
  help: 'Total number of HTTP requests',
  labelNames: ['method', 'route', 'status'],
});

app.addHook('onResponse', (request, reply, done) => {
  const route = request.routeOptions.url || request.url;
  const labels = {
    method: request.method,
    route,
    status: reply.statusCode,
  };

  httpRequestDuration.observe(labels, reply.elapsedTime / 1000);
  httpRequestTotal.inc(labels);
  done();
});

app.get('/metrics', async (request, reply) => {
  reply.header('Content-Type', register.contentType);
  return register.metrics();
});
```
## Zero-Downtime Deployments

Support rolling updates:

```typescript
import closeWithGrace from 'close-with-grace';

// Stop accepting new connections gracefully
closeWithGrace({ delay: 30000 }, async ({ signal }) => {
  app.log.info({ signal }, 'Received shutdown signal');

  // Stop accepting new connections;
  // existing connections continue to be served.

  // Wait for in-flight requests (handled by the close-with-grace delay)
  await app.close();

  app.log.info('Server closed');
});
```
412  .claude/skills/fastify-best-practices/rules/error-handling.md  Normal file
@ -0,0 +1,412 @@
---
name: error-handling
description: Error handling patterns in Fastify
metadata:
  tags: errors, exceptions, error-handler, validation
---

# Error Handling in Fastify

## Default Error Handler

Fastify has a built-in error handler. Thrown errors automatically become HTTP responses:

```typescript
import Fastify from 'fastify';

const app = Fastify({ logger: true });

app.get('/users/:id', async (request) => {
  const user = await findUser(request.params.id);
  if (!user) {
    // Throwing an error with statusCode sets the response status
    const error = new Error('User not found');
    error.statusCode = 404;
    throw error;
  }
  return user;
});
```
## Custom Error Classes

Use `@fastify/error` to create typed errors:

```typescript
import createError from '@fastify/error';

const NotFoundError = createError('NOT_FOUND', '%s not found', 404);
const UnauthorizedError = createError('UNAUTHORIZED', 'Authentication required', 401);
const ForbiddenError = createError('FORBIDDEN', 'Access denied: %s', 403);
const ValidationError = createError('VALIDATION_ERROR', '%s', 400);
const ConflictError = createError('CONFLICT', '%s already exists', 409);

// Usage
app.get('/users/:id', async (request) => {
  const user = await findUser(request.params.id);
  if (!user) {
    throw new NotFoundError('User');
  }
  return user;
});

app.post('/users', async (request) => {
  const exists = await userExists(request.body.email);
  if (exists) {
    throw new ConflictError('Email');
  }
  return createUser(request.body);
});
```
## Custom Error Handler

Implement a centralized error handler:

```typescript
import Fastify from 'fastify';
import type { FastifyError, FastifyRequest, FastifyReply } from 'fastify';

const app = Fastify({ logger: true });

app.setErrorHandler((error: FastifyError, request: FastifyRequest, reply: FastifyReply) => {
  // Log the error
  request.log.error({ err: error }, 'Request error');

  // Handle validation errors
  if (error.validation) {
    return reply.code(400).send({
      statusCode: 400,
      error: 'Bad Request',
      message: 'Validation failed',
      details: error.validation,
    });
  }

  // Handle known errors with status codes
  const statusCode = error.statusCode ?? 500;
  const code = error.code ?? 'INTERNAL_ERROR';

  // Don't expose internal error details in production
  const message = statusCode >= 500 && process.env.NODE_ENV === 'production'
    ? 'Internal Server Error'
    : error.message;

  return reply.code(statusCode).send({
    statusCode,
    error: code,
    message,
  });
});
```
## Error Response Schema

Define consistent error response schemas:

```typescript
app.addSchema({
  $id: 'httpError',
  type: 'object',
  properties: {
    statusCode: { type: 'integer' },
    error: { type: 'string' },
    message: { type: 'string' },
    details: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          field: { type: 'string' },
          message: { type: 'string' },
        },
      },
    },
  },
  required: ['statusCode', 'error', 'message'],
});

// Use in route schemas
app.get('/users/:id', {
  schema: {
    params: {
      type: 'object',
      properties: { id: { type: 'string' } },
      required: ['id'],
    },
    response: {
      200: { $ref: 'user#' },
      404: { $ref: 'httpError#' },
      500: { $ref: 'httpError#' },
    },
  },
}, handler);
```
## Reply Helpers with @fastify/sensible

Use `@fastify/sensible` for standard HTTP errors:

```typescript
import fastifySensible from '@fastify/sensible';

app.register(fastifySensible);

app.get('/users/:id', async (request, reply) => {
  const user = await findUser(request.params.id);
  if (!user) {
    return reply.notFound('User not found');
  }
  if (!hasAccess(request.user, user)) {
    return reply.forbidden('You cannot access this user');
  }
  return user;
});

// Available methods:
// reply.badRequest(message?)
// reply.unauthorized(message?)
// reply.forbidden(message?)
// reply.notFound(message?)
// reply.methodNotAllowed(message?)
// reply.conflict(message?)
// reply.gone(message?)
// reply.unprocessableEntity(message?)
// reply.tooManyRequests(message?)
// reply.internalServerError(message?)
// reply.notImplemented(message?)
// reply.badGateway(message?)
// reply.serviceUnavailable(message?)
// reply.gatewayTimeout(message?)
```
## Async Error Handling

Errors thrown in async handlers are automatically caught:

```typescript
// Errors are automatically caught and passed to the error handler
app.get('/users', async (request) => {
  const users = await db.users.findAll(); // If this throws, the error handler catches it
  return users;
});

// Explicit error handling for custom logic
app.get('/users/:id', async (request, reply) => {
  try {
    const user = await db.users.findById(request.params.id);
    if (!user) {
      return reply.code(404).send({ error: 'User not found' });
    }
    return user;
  } catch (error) {
    // Transform database errors
    if (error.code === 'CONNECTION_ERROR') {
      request.log.error({ err: error }, 'Database connection failed');
      return reply.code(503).send({ error: 'Service temporarily unavailable' });
    }
    throw error; // Re-throw for the error handler
  }
});
```
## Hook Error Handling

Errors in hooks are handled the same way:

```typescript
app.addHook('onRequest', async (request, reply) => {
  const token = request.headers.authorization;
  if (!token) {
    // This error goes to the error handler
    throw new UnauthorizedError();
  }

  try {
    request.user = await verifyToken(token);
  } catch (error) {
    throw new UnauthorizedError();
  }
});

// Or use reply to send the response directly
app.addHook('onRequest', async (request, reply) => {
  if (!request.headers.authorization) {
    reply.code(401).send({ error: 'Unauthorized' });
    return; // Must return to stop processing
  }
});
```
## Not Found Handler

Customize the 404 response:

```typescript
app.setNotFoundHandler(async (request, reply) => {
  return reply.code(404).send({
    statusCode: 404,
    error: 'Not Found',
    message: `Route ${request.method} ${request.url} not found`,
  });
});

// With hook options
app.setNotFoundHandler({
  preValidation: async (request, reply) => {
    // Pre-validation hook for the 404 handler
  },
}, async (request, reply) => {
  return reply.code(404).send({ error: 'Not Found' });
});
```
## Error Wrapping

Wrap external errors with context:

```typescript
import createError from '@fastify/error';

const DatabaseError = createError('DATABASE_ERROR', 'Database operation failed: %s', 500);
const ExternalServiceError = createError('EXTERNAL_SERVICE_ERROR', 'External service failed: %s', 502);

app.get('/users/:id', async (request) => {
  try {
    return await db.users.findById(request.params.id);
  } catch (error) {
    throw new DatabaseError(error.message, { cause: error });
  }
});

app.get('/weather', async (request) => {
  try {
    return await weatherApi.fetch(request.query.city);
  } catch (error) {
    throw new ExternalServiceError(error.message, { cause: error });
  }
});
```
## Validation Error Customization

Customize the validation error format:

```typescript
app.setErrorHandler((error, request, reply) => {
  if (error.validation) {
    const details = error.validation.map((err) => {
      const field = err.instancePath
        ? err.instancePath.slice(1).replace(/\//g, '.')
        : err.params?.missingProperty || 'unknown';

      return {
        field,
        message: err.message,
        value: err.data,
      };
    });

    return reply.code(400).send({
      statusCode: 400,
      error: 'Validation Error',
      message: `Invalid ${error.validationContext}: ${details.map(d => d.field).join(', ')}`,
      details,
    });
  }

  // Handle other errors...
  throw error;
});
```
## Error Cause Chain

Preserve error chains for debugging:

```typescript
app.get('/complex-operation', async (request) => {
  try {
    await step1();
  } catch (error) {
    const wrapped = new Error('Step 1 failed', { cause: error });
    wrapped.statusCode = 500;
    throw wrapped;
  }
});

// In the error handler, log the full chain
app.setErrorHandler((error, request, reply) => {
  // Log the error with its cause chain
  let current = error;
  const chain = [];
  while (current) {
    chain.push({
      message: current.message,
      code: current.code,
      stack: current.stack,
    });
    current = current.cause;
  }

  request.log.error({ errorChain: chain }, 'Request failed');

  reply.code(error.statusCode || 500).send({
    error: error.message,
  });
});
```
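The cause-walking loop in the error handler can be lifted into a small reusable helper; a minimal sketch (the helper name and `ChainEntry` shape are illustrative, not part of the skill file):

```typescript
// Sketch: flatten an Error cause chain (Node 16.9+ `cause`) into an array,
// outermost error first, mirroring the error handler's while loop.
interface ChainEntry {
  message: string;
  code?: string;
}

function flattenCauseChain(error: unknown): ChainEntry[] {
  const chain: ChainEntry[] = [];
  let current: unknown = error;
  while (current instanceof Error) {
    chain.push({
      message: current.message,
      code: (current as Error & { code?: string }).code,
    });
    current = (current as Error & { cause?: unknown }).cause;
  }
  return chain;
}

const root = new Error('connection refused');
const wrapped = new Error('Step 1 failed');
(wrapped as Error & { cause?: unknown }).cause = root;

console.log(flattenCauseChain(wrapped).map((e) => e.message));
// → [ 'Step 1 failed', 'connection refused' ]
```

The `instanceof Error` guard also terminates the loop safely if a `cause` is a non-Error value.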
## Plugin-Scoped Error Handlers

Set error handlers at the plugin level:

```typescript
app.register(async function apiRoutes(fastify) {
  // This error handler only applies to routes in this plugin
  fastify.setErrorHandler((error, request, reply) => {
    request.log.error({ err: error }, 'API error');

    reply.code(error.statusCode || 500).send({
      error: {
        code: error.code || 'API_ERROR',
        message: error.message,
      },
    });
  });

  fastify.get('/data', async () => {
    throw new Error('API-specific error');
  });
}, { prefix: '/api' });
```
## Graceful Error Recovery

Handle errors gracefully without crashing:

```typescript
app.get('/resilient', async (request, reply) => {
  const results = await Promise.allSettled([
    fetchPrimaryData(),
    fetchSecondaryData(),
    fetchOptionalData(),
  ]);

  const [primary, secondary, optional] = results;

  if (primary.status === 'rejected') {
    // Primary data is required
    throw new Error('Primary data unavailable');
  }

  return {
    data: primary.value,
    secondary: secondary.status === 'fulfilled' ? secondary.value : null,
    optional: optional.status === 'fulfilled' ? optional.value : null,
    warnings: results
      .filter((r) => r.status === 'rejected')
      .map((r) => r.reason.message),
  };
});
```
464  .claude/skills/fastify-best-practices/rules/hooks.md  Normal file
@ -0,0 +1,464 @@
---
name: hooks
description: Hooks and request lifecycle in Fastify
metadata:
  tags: hooks, lifecycle, middleware, onRequest, preHandler
---

# Hooks and Request Lifecycle

## Request Lifecycle Overview

Fastify executes hooks in a specific order:

```
Incoming Request
  |
onRequest
  |
preParsing
  |
preValidation
  |
preHandler
  |
Handler
  |
preSerialization
  |
onSend
  |
onResponse
```
## onRequest Hook

The first hook to execute, before body parsing. Use it for authentication and request ID setup:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Global onRequest hook
app.addHook('onRequest', async (request, reply) => {
  request.startTime = Date.now();
  request.log.info({ url: request.url, method: request.method }, 'Request started');
});

// Authentication check
app.addHook('onRequest', async (request, reply) => {
  // Skip auth for public routes
  if (request.url.startsWith('/public')) {
    return;
  }

  const token = request.headers.authorization?.replace('Bearer ', '');
  if (!token) {
    reply.code(401).send({ error: 'Unauthorized' });
    return; // Stop processing
  }

  try {
    request.user = await verifyToken(token);
  } catch {
    reply.code(401).send({ error: 'Invalid token' });
  }
});
```
## preParsing Hook

Executes before body parsing and can modify the payload stream:

```typescript
import zlib from 'node:zlib';

app.addHook('preParsing', async (request, reply, payload) => {
  // Log raw payload size
  request.log.debug({ contentLength: request.headers['content-length'] }, 'Parsing body');

  // Return a modified payload stream if needed
  return payload;
});

// Decompress incoming data
app.addHook('preParsing', async (request, reply, payload) => {
  if (request.headers['content-encoding'] === 'gzip') {
    return payload.pipe(zlib.createGunzip());
  }
  return payload;
});
```
## preValidation Hook

Executes after parsing, before schema validation:

```typescript
app.addHook('preValidation', async (request, reply) => {
  // Modify the body before validation
  if (request.body && typeof request.body === 'object') {
    // Normalize data
    request.body.email = request.body.email?.toLowerCase().trim();
  }
});

// Rate limiting check
app.addHook('preValidation', async (request, reply) => {
  const key = request.ip;
  const count = await redis.incr(`ratelimit:${key}`);

  if (count === 1) {
    await redis.expire(`ratelimit:${key}`, 60);
  }

  if (count > 100) {
    reply.code(429).send({ error: 'Too many requests' });
  }
});
```
## preHandler Hook

The most common hook; executes after validation, before the handler:

```typescript
// Authorization check
app.addHook('preHandler', async (request, reply) => {
  const { userId } = request.params as { userId: string };

  if (request.user.id !== userId && !request.user.isAdmin) {
    reply.code(403).send({ error: 'Forbidden' });
  }
});

// Load related data
app.addHook('preHandler', async (request, reply) => {
  if (request.params?.projectId) {
    request.project = await db.projects.findById(request.params.projectId);
    if (!request.project) {
      reply.code(404).send({ error: 'Project not found' });
    }
  }
});

// Transaction wrapper
app.addHook('preHandler', async (request) => {
  request.transaction = await db.beginTransaction();
});

app.addHook('onResponse', async (request) => {
  if (request.transaction) {
    await request.transaction.commit();
  }
});

app.addHook('onError', async (request, reply, error) => {
  if (request.transaction) {
    await request.transaction.rollback();
  }
});
```
## preSerialization Hook

Modify the payload before serialization:

```typescript
app.addHook('preSerialization', async (request, reply, payload) => {
  // Add metadata to all responses
  if (payload && typeof payload === 'object') {
    return {
      ...payload,
      _meta: {
        requestId: request.id,
        timestamp: new Date().toISOString(),
      },
    };
  }
  return payload;
});

// Remove sensitive fields
app.addHook('preSerialization', async (request, reply, payload) => {
  if (payload?.user?.password) {
    const { password, ...user } = payload.user;
    return { ...payload, user };
  }
  return payload;
});
```
## onSend Hook

Modify the serialized response before it is sent:

```typescript
import { promisify } from 'node:util';
import zlib from 'node:zlib';

const gzip = promisify(zlib.gzip);

app.addHook('onSend', async (request, reply, payload) => {
  // Add response headers
  reply.header('X-Response-Time', Date.now() - request.startTime);

  // Compress large responses
  if (payload && payload.length > 1024) {
    const compressed = await gzip(payload);
    reply.header('Content-Encoding', 'gzip');
    return compressed;
  }

  return payload;
});

// Transform JSON string responses
app.addHook('onSend', async (request, reply, payload) => {
  if (reply.getHeader('content-type')?.includes('application/json')) {
    // payload is already a string at this point
    return payload;
  }
  return payload;
});
```
## onResponse Hook

Executes after the response is sent; the response can no longer be modified:

```typescript
app.addHook('onResponse', async (request, reply) => {
  // Log response time
  const responseTime = Date.now() - request.startTime;
  request.log.info({
    method: request.method,
    url: request.url,
    statusCode: reply.statusCode,
    responseTime,
  }, 'Request completed');

  // Track metrics
  metrics.histogram('http_request_duration', responseTime, {
    method: request.method,
    route: request.routeOptions.url,
    status: reply.statusCode,
  });
});
```
## onError Hook

Executes when an error is thrown:

```typescript
app.addHook('onError', async (request, reply, error) => {
  // Log error details
  request.log.error({
    err: error,
    url: request.url,
    method: request.method,
    body: request.body,
  }, 'Request error');

  // Track error metrics
  metrics.increment('http_errors', {
    error: error.code || 'UNKNOWN',
    route: request.routeOptions.url,
  });

  // Clean up resources
  if (request.tempFile) {
    await fs.unlink(request.tempFile).catch(() => {});
  }
});
```
## onTimeout Hook

Executes when a request times out:

```typescript
const app = Fastify({
  connectionTimeout: 30000, // 30 seconds
});

app.addHook('onTimeout', async (request, reply) => {
  request.log.warn({
    url: request.url,
    method: request.method,
  }, 'Request timeout');

  // Cleanup
  if (request.abortController) {
    request.abortController.abort();
  }
});
```
## onRequestAbort Hook

Executes when the client closes the connection:

```typescript
app.addHook('onRequestAbort', async (request) => {
  request.log.info('Client aborted request');

  // Cancel ongoing operations
  if (request.abortController) {
    request.abortController.abort();
  }

  // Clean up uploaded files
  if (request.uploadedFiles) {
    for (const file of request.uploadedFiles) {
      await fs.unlink(file.path).catch(() => {});
    }
  }
});
```
## Application Lifecycle Hooks

Hooks that run at application startup and shutdown:

```typescript
// After all plugins are loaded
app.addHook('onReady', async function () {
  this.log.info('Server is ready');

  // Initialize connections
  await this.db.connect();
  await this.redis.connect();

  // Warm caches
  await this.cache.warmup();
});

// When the server is closing
app.addHook('onClose', async function () {
  this.log.info('Server is closing');

  // Clean up connections
  await this.db.close();
  await this.redis.disconnect();
});

// When a route is registered
app.addHook('onRoute', (routeOptions) => {
  console.log(`Route registered: ${routeOptions.method} ${routeOptions.url}`);

  // Track all routes
  routes.push({
    method: routeOptions.method,
    url: routeOptions.url,
    schema: routeOptions.schema,
  });
});

// When a plugin is registered
app.addHook('onRegister', (instance, options) => {
  console.log(`Plugin registered with prefix: ${options.prefix}`);
});
```
## Scoped Hooks

Hooks are scoped to their encapsulation context:

```typescript
app.addHook('onRequest', async (request) => {
  // Runs for ALL routes
  request.log.info('Global hook');
});

app.register(async function adminRoutes(fastify) {
  // Only runs for routes in this plugin
  fastify.addHook('onRequest', async (request, reply) => {
    if (!request.user?.isAdmin) {
      reply.code(403).send({ error: 'Admin only' });
    }
  });

  fastify.get('/admin/users', async () => {
    return { users: [] };
  });
}, { prefix: '/admin' });
```
## Hook Execution Order
|
||||
|
||||
Multiple hooks of the same type execute in registration order:
|
||||
|
||||
```typescript
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('First');
|
||||
});
|
||||
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('Second');
|
||||
});
|
||||
|
||||
app.addHook('onRequest', async () => {
|
||||
console.log('Third');
|
||||
});
|
||||
|
||||
// Output: First, Second, Third
|
||||
```
|
||||

## Stopping Hook Execution

Send a reply and return early from a hook to stop processing:

```typescript
app.addHook('preHandler', async (request, reply) => {
  if (!request.user) {
    // Send the response and return to stop further processing
    reply.code(401).send({ error: 'Unauthorized' });
    return;
  }
  // Continue to the next hook and the handler
});
```

## Route-Level Hooks

Add hooks to specific routes:

```typescript
const adminOnlyHook = async (request, reply) => {
  if (!request.user?.isAdmin) {
    reply.code(403).send({ error: 'Forbidden' });
  }
};

app.get('/admin/settings', {
  preHandler: [adminOnlyHook],
  handler: async (request) => {
    return { settings: {} };
  },
});

// Multiple hooks
app.post('/orders', {
  preValidation: [validateApiKey],
  preHandler: [loadUser, checkQuota, logOrder],
  handler: createOrderHandler,
});
```

## Async Hook Patterns

Always use async/await in hooks:

```typescript
// GOOD - async hook
app.addHook('preHandler', async (request, reply) => {
  const user = await loadUser(request.headers.authorization);
  request.user = user;
});

// AVOID - legacy callback style
app.addHook('preHandler', (request, reply, done) => {
  loadUser(request.headers.authorization)
    .then((user) => {
      request.user = user;
      done();
    })
    .catch(done);
});
```
247
.claude/skills/fastify-best-practices/rules/http-proxy.md
Normal file
---
name: http-proxy
description: HTTP proxying and reply.from() in Fastify
metadata:
  tags: proxy, gateway, reverse-proxy, microservices
---

# HTTP Proxy and Reply.from()

## @fastify/http-proxy

Use `@fastify/http-proxy` for simple reverse proxy scenarios:

```typescript
import Fastify from 'fastify';
import httpProxy from '@fastify/http-proxy';

const app = Fastify({ logger: true });

// Proxy all requests to /api/* to another service
app.register(httpProxy, {
  upstream: 'http://backend-service:3001',
  prefix: '/api',
  rewritePrefix: '/v1',
  http2: false,
});

// With authentication
app.register(httpProxy, {
  upstream: 'http://internal-api:3002',
  prefix: '/internal',
  preHandler: async (request, reply) => {
    // Verify authentication before proxying
    if (!request.headers.authorization) {
      reply.code(401).send({ error: 'Unauthorized' });
    }
  },
});

await app.listen({ port: 3000 });
```

## @fastify/reply-from

For more control over proxying, use `@fastify/reply-from` with `reply.from()`:

```typescript
import Fastify from 'fastify';
import replyFrom from '@fastify/reply-from';

const app = Fastify({ logger: true });

app.register(replyFrom, {
  base: 'http://backend-service:3001',
  http2: false,
});

// Proxy with request/response manipulation
app.get('/users/:id', async (request, reply) => {
  const { id } = request.params;

  return reply.from(`/api/users/${id}`, {
    // Modify the request before forwarding
    rewriteRequestHeaders: (originalReq, headers) => ({
      ...headers,
      'x-request-id': request.id,
      'x-forwarded-for': request.ip,
    }),
    // Modify the response before sending
    onResponse: (request, reply, res) => {
      reply.header('x-proxy', 'fastify');
      reply.send(res);
    },
  });
});

// Conditional routing
app.all('/api/*', async (request, reply) => {
  const upstream = selectUpstream(request);

  return reply.from(request.url, {
    base: upstream,
  });
});

function selectUpstream(request) {
  // Route to different backends based on the request
  if (request.headers['x-beta']) {
    return 'http://beta-backend:3001';
  }
  return 'http://stable-backend:3001';
}
```

## API Gateway Pattern

Build an API gateway with multiple backends:

```typescript
import Fastify from 'fastify';
import replyFrom from '@fastify/reply-from';

const app = Fastify({ logger: true });

// Configure multiple upstreams
const services = {
  users: 'http://users-service:3001',
  orders: 'http://orders-service:3002',
  products: 'http://products-service:3003',
};

app.register(replyFrom);

// Route to the users service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/users', ''), {
      base: services.users,
    });
  });
}, { prefix: '/users' });

// Route to the orders service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/orders', ''), {
      base: services.orders,
    });
  });
}, { prefix: '/orders' });

// Route to the products service
app.register(async function (fastify) {
  fastify.all('/*', async (request, reply) => {
    return reply.from(request.url.replace('/products', ''), {
      base: services.products,
    });
  });
}, { prefix: '/products' });
```

## Request Body Handling

Handle request bodies when proxying:

```typescript
app.post('/api/data', async (request, reply) => {
  return reply.from('/data', {
    body: request.body,
    contentType: request.headers['content-type'],
  });
});

// Stream large bodies
app.post('/upload', async (request, reply) => {
  return reply.from('/upload', {
    body: request.raw,
    contentType: request.headers['content-type'],
  });
});
```

## Error Handling

Handle upstream errors gracefully:

```typescript
app.register(replyFrom, {
  base: 'http://backend:3001',
  // Called when the upstream returns an error
  onError: (reply, error) => {
    reply.log.error({ err: error }, 'Proxy error');
    reply.code(502).send({
      error: 'Bad Gateway',
      message: 'Upstream service unavailable',
    });
  },
});

// Custom error handling per route
app.get('/data', async (request, reply) => {
  try {
    return await reply.from('/data');
  } catch (error) {
    request.log.error({ err: error }, 'Failed to proxy request');
    return reply.code(503).send({
      error: 'Service Unavailable',
      retryAfter: 30,
    });
  }
});
```

## WebSocket Proxying

Proxy WebSocket connections:

```typescript
import Fastify from 'fastify';
import httpProxy from '@fastify/http-proxy';

const app = Fastify({ logger: true });

app.register(httpProxy, {
  upstream: 'http://ws-backend:3001',
  prefix: '/ws',
  websocket: true,
});
```

## Timeout Configuration

Configure proxy timeouts:

```typescript
app.register(replyFrom, {
  base: 'http://backend:3001',
  http: {
    requestOptions: {
      timeout: 30000, // 30 seconds
    },
  },
});
```

## Caching Proxied Responses

Add caching to proxied responses:

```typescript
import { createCache } from 'async-cache-dedupe';

const cache = createCache({
  ttl: 60,
  storage: { type: 'memory' },
});

cache.define('proxyGet', async (url: string) => {
  const response = await fetch(`http://backend:3001${url}`);
  return response.json();
});

app.get('/cached/*', async (request, reply) => {
  const data = await cache.proxyGet(request.url);
  return data;
});
```
402
.claude/skills/fastify-best-practices/rules/logging.md
Normal file
---
name: logging
description: Logging with Pino in Fastify
metadata:
  tags: logging, pino, debugging, observability
---

# Logging with Pino

## Built-in Pino Integration

Fastify uses Pino for high-performance logging:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: true, // Enable default logging
});

// Or with configuration
const app = Fastify({
  logger: {
    level: 'info',
    transport: {
      target: 'pino-pretty',
      options: {
        colorize: true,
      },
    },
  },
});
```

## Log Levels

Available log levels, from least to most severe:

```typescript
app.log.trace('Detailed debugging');
app.log.debug('Debugging information');
app.log.info('General information');
app.log.warn('Warning messages');
app.log.error('Error messages');
app.log.fatal('Fatal errors');
```

## Request-Scoped Logging

Each request has its own logger with request context:

```typescript
app.get('/users/:id', async (request) => {
  // Logs include the request ID automatically
  request.log.info('Fetching user');

  const user = await db.users.findById(request.params.id);

  if (!user) {
    request.log.warn({ userId: request.params.id }, 'User not found');
    return { error: 'Not found' };
  }

  request.log.info({ userId: user.id }, 'User fetched');
  return user;
});
```

## Structured Logging

Always use structured logging with objects:

```typescript
// GOOD - structured, searchable
request.log.info({
  action: 'user_created',
  userId: user.id,
  email: user.email,
}, 'User created successfully');

request.log.error({
  err: error,
  userId: request.params.id,
  operation: 'fetch_user',
}, 'Failed to fetch user');

// BAD - unstructured, hard to parse
request.log.info(`User ${user.id} created with email ${user.email}`);
request.log.error(`Failed to fetch user: ${error.message}`);
```

## Logging Configuration by Environment

```typescript
function getLoggerConfig() {
  if (process.env.NODE_ENV === 'production') {
    return {
      level: 'info',
      // JSON output for log aggregation
    };
  }

  if (process.env.NODE_ENV === 'test') {
    return false; // Disable logging in tests
  }

  // Development
  return {
    level: 'debug',
    transport: {
      target: 'pino-pretty',
      options: {
        colorize: true,
        translateTime: 'HH:MM:ss Z',
        ignore: 'pid,hostname',
      },
    },
  };
}

const app = Fastify({
  logger: getLoggerConfig(),
});
```

## Custom Serializers

Customize how objects are serialized:

```typescript
const app = Fastify({
  logger: {
    level: 'info',
    serializers: {
      // Customize request serialization
      req: (request) => ({
        method: request.method,
        url: request.url,
        headers: {
          host: request.headers.host,
          'user-agent': request.headers['user-agent'],
        },
        remoteAddress: request.ip,
      }),

      // Customize response serialization
      res: (response) => ({
        statusCode: response.statusCode,
      }),

      // Custom serializer for users
      user: (user) => ({
        id: user.id,
        email: user.email,
        // Exclude sensitive fields
      }),
    },
  },
});

// Use the custom serializer
request.log.info({ user: request.user }, 'User action');
```

## Redacting Sensitive Data

Prevent logging sensitive information:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  logger: {
    level: 'info',
    redact: {
      paths: [
        'req.headers.authorization',
        'req.headers.cookie',
        'body.password',
        'body.creditCard',
        '*.password',
        '*.secret',
        '*.token',
      ],
      censor: '[REDACTED]',
    },
  },
});
```
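Conceptually, redaction walks the listed paths in each log object and replaces the matched values with the censor string before serialization. A simplified dependency-free sketch of the idea, handling exact dot-separated paths only (the `redactPaths` helper below is illustrative, not Pino's API; Pino's real implementation also supports wildcards):

```typescript
// Replace the value at each exact dot-separated path with a censor string.
function redactPaths(obj: any, paths: string[], censor = '[REDACTED]'): any {
  const clone = structuredClone(obj);
  for (const path of paths) {
    const keys = path.split('.');
    let node: any = clone;
    for (const key of keys.slice(0, -1)) {
      if (node == null || typeof node !== 'object') { node = undefined; break; }
      node = node[key];
    }
    const last = keys[keys.length - 1];
    if (node && typeof node === 'object' && last in node) {
      node[last] = censor;
    }
  }
  return clone;
}

const entry = { req: { headers: { authorization: 'Bearer abc' } }, body: { password: 'hunter2', name: 'Ada' } };
console.log(redactPaths(entry, ['req.headers.authorization', 'body.password']));
// { req: { headers: { authorization: '[REDACTED]' } }, body: { password: '[REDACTED]', name: 'Ada' } }
```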

## Child Loggers

Create child loggers with additional context:

```typescript
app.addHook('onRequest', async (request) => {
  // Add user context to all logs for this request
  if (request.user) {
    request.log = request.log.child({
      userId: request.user.id,
      userRole: request.user.role,
    });
  }
});

// Service-level child logger
const userService = {
  log: app.log.child({ service: 'UserService' }),

  async create(data) {
    this.log.info({ email: data.email }, 'Creating user');
    // ...
  },
};
```

## Request Logging Configuration

Customize automatic request logging:

```typescript
const app = Fastify({
  logger: true,
  disableRequestLogging: true, // Disable default request/response logs
});

// Custom request logging
app.addHook('onRequest', async (request) => {
  request.log.info({
    method: request.method,
    url: request.url,
    query: request.query,
  }, 'Request received');
});

app.addHook('onResponse', async (request, reply) => {
  request.log.info({
    statusCode: reply.statusCode,
    responseTime: reply.elapsedTime,
  }, 'Request completed');
});
```

## Logging Errors

Properly log errors with stack traces:

```typescript
app.setErrorHandler((error, request, reply) => {
  // Log the error with full details
  request.log.error({
    err: error, // Pino serializes error objects properly
    url: request.url,
    method: request.method,
    body: request.body,
    query: request.query,
  }, 'Request error');

  reply.code(error.statusCode || 500).send({
    error: error.message,
  });
});

// In handlers
app.get('/data', async (request) => {
  try {
    return await fetchData();
  } catch (error) {
    request.log.error({ err: error }, 'Failed to fetch data');
    throw error;
  }
});
```

## Log Destinations

Configure where logs are sent:

```typescript
import { createWriteStream } from 'node:fs';

// File output
const app = Fastify({
  logger: {
    level: 'info',
    stream: createWriteStream('./app.log'),
  },
});

// Multiple destinations with pino.multistream
import pino from 'pino';

const streams = [
  { stream: process.stdout },
  { stream: createWriteStream('./app.log') },
  { level: 'error', stream: createWriteStream('./error.log') },
];

const app = Fastify({
  logger: pino({ level: 'info' }, pino.multistream(streams)),
});
```

## Log Rotation

Use pino-roll for log rotation:

```bash
node app.js | pino-roll --frequency daily --extension .log
```

Or configure programmatically:

```typescript
import { createStream } from 'rotating-file-stream';

const stream = createStream('app.log', {
  size: '10M', // Rotate every 10MB
  interval: '1d', // Rotate daily
  compress: 'gzip',
  path: './logs',
});

const app = Fastify({
  logger: {
    level: 'info',
    stream,
  },
});
```

## Log Aggregation

Format logs for aggregation services:

```typescript
// For ELK Stack, Datadog, etc. - use the default JSON format
const app = Fastify({
  logger: {
    level: 'info',
    // Default JSON output works with most log aggregators
  },
});

// Add service metadata
const app = Fastify({
  logger: {
    level: 'info',
    base: {
      service: 'user-api',
      version: process.env.APP_VERSION,
      environment: process.env.NODE_ENV,
    },
  },
});
```

## Request ID Tracking

Use request IDs for distributed tracing:

```typescript
const app = Fastify({
  logger: true,
  requestIdHeader: 'x-request-id', // Use the incoming header
  genReqId: (request) => {
    // Generate an ID if none was provided
    return request.headers['x-request-id'] || crypto.randomUUID();
  },
});

// Forward the request ID to downstream services
app.addHook('onRequest', async (request) => {
  request.requestId = request.id;
});

// Include it in outgoing requests
const response = await fetch('http://other-service/api', {
  headers: {
    'x-request-id': request.id,
  },
});
```

## Performance Considerations

Pino is fast, but consider:

```typescript
// Avoid string concatenation in log calls
// BAD
request.log.info('User ' + user.id + ' did ' + action);

// GOOD
request.log.info({ userId: user.id, action }, 'User action');

// Use appropriate log levels
// Don't log at info level in hot paths
if (app.log.isLevelEnabled('debug')) {
  request.log.debug({ details: expensiveToCompute() }, 'Debug info');
}
```
425
.claude/skills/fastify-best-practices/rules/performance.md
Normal file
---
name: performance
description: Performance optimization for Fastify applications
metadata:
  tags: performance, optimization, speed, benchmarking
---

# Performance Optimization

## Fastify is Fast by Default

Fastify is designed for performance. Key optimizations are built-in:

- Fast JSON serialization with `fast-json-stringify`
- Efficient routing with `find-my-way`
- Schema-based validation with `ajv` (compiled validators)
- Low overhead request/response handling
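The serialization speedup comes from compiling a schema into a specialized stringify function ahead of time, instead of letting `JSON.stringify` rediscover the object's shape on every call. A toy sketch of the idea (the real `fast-json-stringify` also handles nesting, non-string types, and escaping far more carefully):

```typescript
// Build a serializer for a flat object with known string-valued keys.
function compileSerializer(keys: string[]) {
  // The key order and quoting are fixed once, at "compile" time.
  return (obj: Record<string, string>) =>
    '{' + keys.map((k) => JSON.stringify(k) + ':' + JSON.stringify(obj[k])).join(',') + '}';
}

const serializeUser = compileSerializer(['id', 'name', 'email']);
const user = { id: '1', name: 'Ada', email: 'ada@example.com' };

console.log(serializeUser(user) === JSON.stringify(user)); // true (same key order)
```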

## Use @fastify/under-pressure for Load Shedding

Protect your application from overload with `@fastify/under-pressure`:

```typescript
import underPressure from '@fastify/under-pressure';

app.register(underPressure, {
  maxEventLoopDelay: 1000, // Max event loop delay in ms
  maxHeapUsedBytes: 1000000000, // Max heap used (~1GB)
  maxRssBytes: 1500000000, // Max RSS (~1.5GB)
  maxEventLoopUtilization: 0.98, // Max event loop utilization
  pressureHandler: (request, reply, type, value) => {
    reply.code(503).send({
      error: 'Service Unavailable',
      message: `Server under pressure: ${type}`,
    });
  },
});

// Health check that respects pressure
app.get('/health', async (request, reply) => {
  return { status: 'ok' };
});
```

## Always Define Response Schemas

Response schemas enable `fast-json-stringify`, which is significantly faster than `JSON.stringify`:

```typescript
// FAST - uses fast-json-stringify
app.get('/users', {
  schema: {
    response: {
      200: {
        type: 'array',
        items: {
          type: 'object',
          properties: {
            id: { type: 'string' },
            name: { type: 'string' },
            email: { type: 'string' },
          },
        },
      },
    },
  },
}, async () => {
  return db.users.findAll();
});

// SLOW - uses JSON.stringify
app.get('/users-slow', async () => {
  return db.users.findAll();
});
```

## Avoid Dynamic Schema Compilation

Add schemas at startup, not at request time:

```typescript
// GOOD - schemas compiled at startup
app.addSchema({ $id: 'user', ... });

app.get('/users', {
  schema: { response: { 200: { $ref: 'user#' } } },
}, handler);

// BAD - schema compiled per request
app.get('/users', async (request, reply) => {
  const schema = getSchemaForUser(request.user);
  // This is slow!
});
```

## Use Logger Wisely

Pino is fast, but excessive logging has overhead:

```typescript
import Fastify from 'fastify';

// Set the log level via an environment variable
const app = Fastify({
  logger: {
    level: process.env.LOG_LEVEL || 'info',
  },
});

// Avoid logging large objects
app.get('/data', async (request) => {
  // BAD - logs the entire payload
  request.log.info({ data: largeObject }, 'Processing');

  // GOOD - log only what's needed
  request.log.info({ id: largeObject.id }, 'Processing');

  return largeObject;
});
```

## Connection Pooling

Use connection pools for databases:

```typescript
import postgres from 'postgres';

// Create the pool at startup
const sql = postgres(process.env.DATABASE_URL, {
  max: 20, // Maximum pool size
  idle_timeout: 20,
  connect_timeout: 10,
});

app.decorate('db', sql);

// Connections are reused
app.get('/users', async () => {
  return app.db`SELECT * FROM users LIMIT 100`;
});
```

## Avoid Blocking the Event Loop

Use `piscina` for CPU-intensive operations. It provides a robust worker thread pool:

```typescript
import Piscina from 'piscina';
import { join } from 'node:path';

const piscina = new Piscina({
  filename: join(import.meta.dirname, 'workers', 'compute.js'),
});

app.post('/compute', async (request) => {
  const result = await piscina.run(request.body);
  return result;
});
```

```typescript
// workers/compute.js
export default function compute(data) {
  // CPU-intensive work here
  return processedResult;
}
```

## Stream Large Responses

Stream large payloads instead of buffering:

```typescript
import { createReadStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

// GOOD - stream the file
app.get('/large-file', async (request, reply) => {
  const stream = createReadStream('./large-file.json');
  reply.type('application/json');
  return reply.send(stream);
});

// BAD - load the entire file into memory
app.get('/large-file-bad', async () => {
  const content = await fs.readFile('./large-file.json', 'utf-8');
  return JSON.parse(content);
});

// Stream database results
app.get('/export', async (request, reply) => {
  reply.type('application/json');

  const cursor = db.users.findCursor();
  reply.raw.write('[');

  let first = true;
  for await (const user of cursor) {
    if (!first) reply.raw.write(',');
    reply.raw.write(JSON.stringify(user));
    first = false;
  }

  reply.raw.write(']');
  reply.raw.end();
});
```
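The comma bookkeeping in the `/export` route above is easy to get wrong. That chunk-emitting logic can be isolated into a small generator that is independently testable (a sketch; `jsonArrayChunks` is a hypothetical helper, not a Fastify API):

```typescript
// Yield the chunks of a JSON array for an iterable of rows,
// without ever holding the whole payload in memory.
function* jsonArrayChunks<T>(rows: Iterable<T>): Generator<string> {
  yield '[';
  let first = true;
  for (const row of rows) {
    if (!first) yield ',';
    yield JSON.stringify(row);
    first = false;
  }
  yield ']';
}

const chunks = [...jsonArrayChunks([{ id: 1 }, { id: 2 }])];
console.log(chunks.join('')); // [{"id":1},{"id":2}]
```

In the route, each yielded chunk would be written with `reply.raw.write(chunk)`.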

## Caching Strategies

Implement caching for expensive operations:

```typescript
import { LRUCache } from 'lru-cache';

const cache = new LRUCache<string, unknown>({
  max: 1000,
  ttl: 60000, // 1 minute
});

app.get('/expensive/:id', async (request) => {
  const { id } = request.params;
  const cacheKey = `expensive:${id}`;

  const cached = cache.get(cacheKey);
  if (cached) {
    return cached;
  }

  const result = await expensiveOperation(id);
  cache.set(cacheKey, result);

  return result;
});

// Cache control headers
app.get('/static-data', async (request, reply) => {
  reply.header('Cache-Control', 'public, max-age=3600');
  return { data: 'static' };
});
```

## Request Coalescing with async-cache-dedupe

Use `async-cache-dedupe` to deduplicate concurrent identical requests and cache their results:

```typescript
import { createCache } from 'async-cache-dedupe';

const cache = createCache({
  ttl: 60, // seconds
  stale: 5, // serve stale while revalidating
  storage: { type: 'memory' },
});

cache.define('fetchData', async (id: string) => {
  return db.findById(id);
});

app.get('/data/:id', async (request) => {
  const { id } = request.params;
  // Automatically deduplicates concurrent requests for the same id
  // and caches the result
  return cache.fetchData(id);
});
```

For distributed caching, use Redis storage:

```typescript
import { createCache } from 'async-cache-dedupe';
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

const cache = createCache({
  ttl: 60,
  storage: { type: 'redis', options: { client: redis } },
});
```
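The deduplication half of this can be pictured as a map of in-flight promises keyed by argument: concurrent callers with the same key share one underlying call. A minimal sketch of that coalescing idea (illustrative only; the real library adds TTL caching, stale-while-revalidate, and storage backends):

```typescript
// Coalesce concurrent calls with the same key into a single promise.
function dedupe<T>(fn: (key: string) => Promise<T>) {
  const inflight = new Map<string, Promise<T>>();
  return (key: string): Promise<T> => {
    let p = inflight.get(key);
    if (!p) {
      p = fn(key).finally(() => inflight.delete(key));
      inflight.set(key, p);
    }
    return p;
  };
}

let calls = 0;
const fetchOnce = dedupe(async (id: string) => {
  calls += 1;
  return `data:${id}`;
});

// Two concurrent calls, one underlying execution:
Promise.all([fetchOnce('42'), fetchOnce('42')]).then(([a, b]) => {
  console.log(a === b, calls); // true 1
});
```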

## Payload Limits

Set appropriate payload limits:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  bodyLimit: 1048576, // 1MB default
});

// Per-route limit for file uploads
app.post('/upload', {
  bodyLimit: 10485760, // 10MB for this route
}, uploadHandler);
```

## Compression

Use compression for responses:

```typescript
import fastifyCompress from '@fastify/compress';

app.register(fastifyCompress, {
  global: true,
  threshold: 1024, // Only compress responses > 1KB
  encodings: ['gzip', 'deflate'],
});

// Disable for a specific route
app.get('/already-compressed', {
  compress: false,
}, handler);
```

## Connection Timeouts

Configure appropriate timeouts:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  connectionTimeout: 30000, // 30 seconds
  keepAliveTimeout: 5000, // 5 seconds
});

// Per-route timeout
app.get('/long-operation', {
  config: {
    timeout: 60000, // 60 seconds
  },
}, async (request) => {
  return longOperation();
});
```

## Disable Unnecessary Features

Disable features you don't need:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  disableRequestLogging: true, // If you don't need request logs
  trustProxy: false, // If not behind a proxy
  caseSensitive: true, // Enable for a slight performance gain
  ignoreDuplicateSlashes: false,
});
```

## Benchmarking

Use autocannon for load testing:

```bash
# Install
npm install -g autocannon

# Basic benchmark
autocannon http://localhost:3000/api/users

# With options
autocannon -c 100 -d 30 -p 10 http://localhost:3000/api/users
# -c: connections
# -d: duration in seconds
# -p: pipelining factor
```

```typescript
// Programmatic benchmarking
import autocannon from 'autocannon';

const result = await autocannon({
  url: 'http://localhost:3000/api/users',
  connections: 100,
  duration: 30,
  pipelining: 10,
});

console.log(autocannon.printResult(result));
```

## Profiling

Use `@platformatic/flame` for flame graph profiling:

```bash
npx @platformatic/flame app.js
```

This generates an interactive flame graph to identify performance bottlenecks.

## Memory Management

Monitor and optimize memory usage:

```typescript
// Add a health endpoint with memory info
app.get('/health', async () => {
  const memory = process.memoryUsage();
  return {
    status: 'ok',
    memory: {
      heapUsed: Math.round(memory.heapUsed / 1024 / 1024) + 'MB',
      heapTotal: Math.round(memory.heapTotal / 1024 / 1024) + 'MB',
      rss: Math.round(memory.rss / 1024 / 1024) + 'MB',
    },
  };
});

// Avoid memory leaks in closures
app.addHook('onRequest', async (request) => {
  // BAD - holds a reference to a large object
  const largeData = await loadLargeData();
  request.getData = () => largeData;

  // GOOD - load on demand
  request.getData = () => loadLargeData();
});
```
320
.claude/skills/fastify-best-practices/rules/plugins.md
Normal file
---
name: plugins
description: Plugin development and encapsulation in Fastify
metadata:
  tags: plugins, encapsulation, modules, architecture
---

# Plugin Development and Encapsulation

## Understanding Encapsulation

Fastify's plugin system provides automatic encapsulation. Each plugin creates its own context, isolating decorators, hooks, and plugins registered within it:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// This plugin is encapsulated - its decorators are NOT available to siblings
app.register(async function childPlugin(fastify) {
  fastify.decorate('privateUtil', () => 'only available here');

  // This decorator is only available within this plugin and its children
  fastify.get('/child', async function (request, reply) {
    return this.privateUtil();
  });
});

// This route CANNOT access privateUtil - it's in a different context
app.get('/parent', async function (request, reply) {
  // this.privateUtil is undefined here
  return { status: 'ok' };
});
```

## Breaking Encapsulation with fastify-plugin

Use `fastify-plugin` when you need to share decorators, hooks, or plugins with the parent context:

```typescript
import fp from 'fastify-plugin';

// This plugin's decorators will be available to the parent and siblings
export default fp(async function databasePlugin(fastify, options) {
  const db = await createConnection(options.connectionString);

  fastify.decorate('db', db);

  fastify.addHook('onClose', async () => {
    await db.close();
  });
}, {
  name: 'database-plugin',
  dependencies: [], // List plugin dependencies
});
```

## Plugin Registration Order

Plugins are registered in order, but loading is asynchronous. Use `after()` for sequential dependencies:

```typescript
import Fastify from 'fastify';
import databasePlugin from './plugins/database.js';
import authPlugin from './plugins/auth.js';
import routesPlugin from './routes/index.js';

const app = Fastify();

// Database must be ready before auth
app.register(databasePlugin);

// Auth depends on database
app.register(authPlugin);

// Routes depend on both
app.register(routesPlugin);

// Or use after() for explicit sequencing
app.register(databasePlugin).after(() => {
  app.register(authPlugin).after(() => {
    app.register(routesPlugin);
  });
});

await app.ready();
```

## Plugin Options

Always validate and document plugin options:

```typescript
import fp from 'fastify-plugin';

interface CachePluginOptions {
  ttl: number;
  maxSize?: number;
  prefix?: string;
}

export default fp<CachePluginOptions>(async function cachePlugin(fastify, options) {
  const { ttl, maxSize = 1000, prefix = 'cache:' } = options;

  if (typeof ttl !== 'number' || ttl <= 0) {
    throw new Error('Cache plugin requires a positive ttl option');
  }

  const cache = new Map<string, { value: unknown; expires: number }>();

  fastify.decorate('cache', {
    get(key: string): unknown | undefined {
      const item = cache.get(prefix + key);
      if (!item) return undefined;
      if (Date.now() > item.expires) {
        cache.delete(prefix + key);
        return undefined;
      }
      return item.value;
    },
    set(key: string, value: unknown): void {
      if (cache.size >= maxSize) {
        // Evict the oldest entry (Map preserves insertion order)
        const firstKey = cache.keys().next().value;
        if (firstKey !== undefined) cache.delete(firstKey);
      }
      cache.set(prefix + key, { value, expires: Date.now() + ttl });
    },
  });
}, {
  name: 'cache-plugin',
});
```

## Plugin Factory Pattern

Create configurable plugins using factory functions:

```typescript
import fp from 'fastify-plugin';

interface RateLimitOptions {
  max: number;
  timeWindow: number;
}

function createRateLimiter(defaults: Partial<RateLimitOptions> = {}) {
  return fp<RateLimitOptions>(async function rateLimitPlugin(fastify, options) {
    const config = { ...defaults, ...options };

    // Implementation
    fastify.decorate('rateLimit', config);
  }, {
    name: 'rate-limiter',
  });
}

// Usage
app.register(createRateLimiter({ max: 100 }), { timeWindow: 60000 });
```

## Plugin Dependencies

Declare dependencies to ensure proper load order:

```typescript
import fp from 'fastify-plugin';

export default fp(async function authPlugin(fastify) {
  // This plugin requires 'database-plugin' to be loaded first
  if (!fastify.hasDecorator('db')) {
    throw new Error('Auth plugin requires database plugin');
  }

  fastify.decorate('authenticate', async (request) => {
    const user = await fastify.db.users.findByToken(request.headers.authorization);
    return user;
  });
}, {
  name: 'auth-plugin',
  dependencies: ['database-plugin'],
});
```

## Scoped Plugins for Route Groups

Use encapsulation to scope plugins to specific routes:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Public routes - no auth required
app.register(async function publicRoutes(fastify) {
  fastify.get('/health', async () => ({ status: 'ok' }));
  fastify.get('/docs', async () => ({ version: '1.0.0' }));
});

// Protected routes - auth required
app.register(async function protectedRoutes(fastify) {
  // Auth hook only applies to routes in this plugin
  fastify.addHook('onRequest', async (request, reply) => {
    const token = request.headers.authorization;
    if (!token) {
      reply.code(401).send({ error: 'Unauthorized' });
      return;
    }
    request.user = await verifyToken(token);
  });

  fastify.get('/profile', async (request) => {
    return { user: request.user };
  });

  fastify.get('/settings', async (request) => {
    return { settings: await getSettings(request.user.id) };
  });
});
```

## Prefix Routes with Register

Use the `prefix` option to namespace routes:

```typescript
app.register(import('./routes/users.js'), { prefix: '/api/v1/users' });
app.register(import('./routes/posts.js'), { prefix: '/api/v1/posts' });

// In routes/users.js
export default async function userRoutes(fastify) {
  // Becomes /api/v1/users
  fastify.get('/', async () => {
    return { users: [] };
  });

  // Becomes /api/v1/users/:id
  fastify.get('/:id', async (request) => {
    return { user: { id: request.params.id } };
  });
}
```

## Plugin Metadata

Add metadata for documentation and tooling:

```typescript
import fp from 'fastify-plugin';

async function metricsPlugin(fastify) {
  // Implementation
}

export default fp(metricsPlugin, {
  name: 'metrics-plugin',
  fastify: '5.x', // Fastify version compatibility
  dependencies: ['pino-plugin'],
  decorators: {
    fastify: ['db'], // Required decorators
    request: [],
    reply: [],
  },
});
```

## Autoload Plugins

Use `@fastify/autoload` for automatic plugin loading:

```typescript
import Fastify from 'fastify';
import autoload from '@fastify/autoload';
import { fileURLToPath } from 'node:url';
import { dirname, join } from 'node:path';

const __dirname = dirname(fileURLToPath(import.meta.url));

const app = Fastify();

// Load all plugins from the plugins directory
app.register(autoload, {
  dir: join(__dirname, 'plugins'),
  options: { prefix: '/api' },
});

// Load all routes from the routes directory
app.register(autoload, {
  dir: join(__dirname, 'routes'),
  options: { prefix: '/api' },
});
```

## Testing Plugins in Isolation

Test plugins independently:

```typescript
import { describe, it, before, after } from 'node:test';
import Fastify, { type FastifyInstance } from 'fastify';
import myPlugin from './my-plugin.js';

describe('MyPlugin', () => {
  let app: FastifyInstance;

  before(async () => {
    app = Fastify();
    app.register(myPlugin, { option: 'value' });
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  it('should decorate fastify instance', (t) => {
    t.assert.ok(app.hasDecorator('myDecorator'));
  });
});
```
467
.claude/skills/fastify-best-practices/rules/routes.md
Normal file

@@ -0,0 +1,467 @@
---
name: routes
description: Route organization and handlers in Fastify
metadata:
  tags: routes, handlers, http, rest, api
---

# Route Organization and Handlers

## Basic Route Definition

Define routes with the shorthand methods or the full route method:

```typescript
import Fastify from 'fastify';

const app = Fastify();

// Shorthand methods
app.get('/users', async (request, reply) => {
  return { users: [] };
});

app.post('/users', async (request, reply) => {
  return { created: true };
});

// Full route method with all options
app.route({
  method: 'GET',
  url: '/users/:id',
  schema: {
    params: {
      type: 'object',
      properties: {
        id: { type: 'string' },
      },
      required: ['id'],
    },
  },
  handler: async (request, reply) => {
    return { id: request.params.id };
  },
});
```

## Route Parameters

Access URL parameters through `request.params`:

```typescript
// Single parameter
app.get('/users/:id', async (request) => {
  const { id } = request.params as { id: string };
  return { userId: id };
});

// Multiple parameters
app.get('/users/:userId/posts/:postId', async (request) => {
  const { userId, postId } = request.params as { userId: string; postId: string };
  return { userId, postId };
});

// Wildcard parameter (captures everything after)
app.get('/files/*', async (request) => {
  const path = (request.params as { '*': string })['*'];
  return { filePath: path };
});

// Regex parameters (Fastify uses find-my-way)
app.get('/orders/:id(\\d+)', async (request) => {
  // Only matches numeric IDs
  const { id } = request.params as { id: string };
  return { orderId: parseInt(id, 10) };
});
```

## Query String Parameters

Access query parameters through `request.query`:

```typescript
app.get('/search', {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        q: { type: 'string' },
        page: { type: 'integer', default: 1 },
        limit: { type: 'integer', default: 10, maximum: 100 },
      },
      required: ['q'],
    },
  },
  handler: async (request) => {
    const { q, page, limit } = request.query as {
      q: string;
      page: number;
      limit: number;
    };
    return { query: q, page, limit };
  },
});
```

## Request Body

Access the request body through `request.body`:

```typescript
app.post('/users', {
  schema: {
    body: {
      type: 'object',
      properties: {
        name: { type: 'string', minLength: 1 },
        email: { type: 'string', format: 'email' },
        age: { type: 'integer', minimum: 0 },
      },
      required: ['name', 'email'],
    },
  },
  handler: async (request, reply) => {
    const user = request.body as { name: string; email: string; age?: number };
    // Create user...
    reply.code(201);
    return { user };
  },
});
```

## Headers

Access request headers through `request.headers`:

```typescript
app.get('/protected', {
  schema: {
    headers: {
      type: 'object',
      properties: {
        authorization: { type: 'string' },
      },
      required: ['authorization'],
    },
  },
  handler: async (request) => {
    const token = request.headers.authorization;
    return { authenticated: true };
  },
});
```

## Reply Methods

Use reply methods to control the response:

```typescript
import fs from 'node:fs';

app.get('/examples', async (request, reply) => {
  // Set status code
  reply.code(201);

  // Set headers
  reply.header('X-Custom-Header', 'value');
  reply.headers({ 'X-Another': 'value', 'X-Third': 'value' });

  // Set content type
  reply.type('application/json');

  // Redirect
  // reply.redirect('/other-url');
  // reply.redirect('/permanent-redirect', 301);

  // Return response (automatic serialization)
  return { status: 'ok' };
});

// Explicit send (useful in non-async handlers)
app.get('/explicit', (request, reply) => {
  reply.send({ status: 'ok' });
});

// Stream response
app.get('/stream', async (request, reply) => {
  const stream = fs.createReadStream('./large-file.txt');
  reply.type('text/plain');
  return reply.send(stream);
});
```

## Route Organization by Feature

Organize routes by feature/domain in separate files:

```
src/
  routes/
    users/
      index.ts      # Route definitions
      handlers.ts   # Handler functions
      schemas.ts    # JSON schemas
    posts/
      index.ts
      handlers.ts
      schemas.ts
```

```typescript
// routes/users/schemas.ts
export const userSchema = {
  type: 'object',
  properties: {
    id: { type: 'string', format: 'uuid' },
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
  },
};

export const createUserSchema = {
  body: {
    type: 'object',
    properties: {
      name: { type: 'string', minLength: 1 },
      email: { type: 'string', format: 'email' },
    },
    required: ['name', 'email'],
  },
  response: {
    201: userSchema,
  },
};

// routes/users/handlers.ts
import type { FastifyRequest, FastifyReply } from 'fastify';

export async function createUser(
  request: FastifyRequest<{ Body: { name: string; email: string } }>,
  reply: FastifyReply,
) {
  const { name, email } = request.body;
  const user = await request.server.db.users.create({ name, email });
  reply.code(201);
  return user;
}

export async function getUsers(request: FastifyRequest) {
  return request.server.db.users.findAll();
}

// routes/users/index.ts
import type { FastifyInstance } from 'fastify';
import { createUser, getUsers } from './handlers.js';
import { createUserSchema } from './schemas.js';

export default async function userRoutes(fastify: FastifyInstance) {
  fastify.get('/', getUsers);
  fastify.post('/', { schema: createUserSchema }, createUser);
}
```

## Route Constraints

Add constraints to routes for versioning or host-based routing:

```typescript
// Version constraint
app.get('/users', {
  constraints: { version: '1.0.0' },
  handler: async () => ({ version: '1.0.0', users: [] }),
});

app.get('/users', {
  constraints: { version: '2.0.0' },
  handler: async () => ({ version: '2.0.0', data: { users: [] } }),
});

// Client sends: Accept-Version: 1.0.0

// Host constraint
app.get('/', {
  constraints: { host: 'api.example.com' },
  handler: async () => ({ api: true }),
});

app.get('/', {
  constraints: { host: 'www.example.com' },
  handler: async () => ({ web: true }),
});
```
## Route Prefixing

Use prefixes to namespace routes:

```typescript
// Using register
app.register(async function (fastify) {
  fastify.get('/list', async () => ({ users: [] }));
  fastify.get('/:id', async (request) => ({ id: request.params.id }));
}, { prefix: '/users' });

// Results in:
// GET /users/list
// GET /users/:id
```

## Multiple Methods

Handle multiple HTTP methods with one handler:

```typescript
app.route({
  method: ['GET', 'HEAD'],
  url: '/resource',
  handler: async (request) => {
    return { data: 'resource' };
  },
});
```

## 404 Handler

Customize the not found handler:

```typescript
app.setNotFoundHandler({
  preValidation: async (request, reply) => {
    // Optional pre-validation hook
  },
  preHandler: async (request, reply) => {
    // Optional pre-handler hook
  },
}, async (request, reply) => {
  reply.code(404);
  return {
    error: 'Not Found',
    message: `Route ${request.method} ${request.url} not found`,
    statusCode: 404,
  };
});
```

## Method Not Allowed

Handle method not allowed responses:

```typescript
// Fastify doesn't have built-in 405 handling
// Implement with a custom not found handler that checks allowed methods
app.setNotFoundHandler(async (request, reply) => {
  // Check if the URL exists under a different method
  const methods = ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'];
  const allowed = methods.filter(
    (method) => method !== request.method && app.hasRoute({ url: request.url, method }),
  );

  if (allowed.length > 0) {
    reply.code(405).header('Allow', allowed.join(', '));
    return { error: 'Method Not Allowed' };
  }

  reply.code(404);
  return { error: 'Not Found' };
});
```

## Route-Level Configuration

Apply configuration to specific routes:

```typescript
app.get('/slow-operation', {
  config: {
    rateLimit: { max: 10, timeWindow: '1 minute' },
  },
  handler: async (request) => {
    return { result: await slowOperation() };
  },
});

// Access config in hooks
app.addHook('onRequest', async (request, reply) => {
  const config = request.routeOptions.config;
  if (config.rateLimit) {
    // Apply rate limiting
  }
});
```

## Async Route Registration

Register routes from async sources:

```typescript
app.register(async function (fastify) {
  const routeConfigs = await loadRoutesFromDatabase();

  for (const config of routeConfigs) {
    fastify.route({
      method: config.method,
      url: config.path,
      handler: createDynamicHandler(config),
    });
  }
});
```

## Auto-loading Routes with @fastify/autoload

Use `@fastify/autoload` to automatically load routes from a directory structure:

```typescript
import Fastify from 'fastify';
import autoload from '@fastify/autoload';
import { join } from 'node:path';

const app = Fastify({ logger: true });

// Auto-load plugins
app.register(autoload, {
  dir: join(import.meta.dirname, 'plugins'),
  options: { prefix: '' },
});

// Auto-load routes
app.register(autoload, {
  dir: join(import.meta.dirname, 'routes'),
  options: { prefix: '/api' },
});

await app.listen({ port: 3000 });
```

Directory structure:

```
src/
  plugins/
    database.ts      # Loaded automatically
    auth.ts          # Loaded automatically
  routes/
    users/
      index.ts       # GET/POST /api/users
      _id/
        index.ts     # GET/PUT/DELETE /api/users/:id
    posts/
      index.ts       # GET/POST /api/posts
```

Route file example:

```typescript
// routes/users/index.ts
import type { FastifyPluginAsync } from 'fastify';

const users: FastifyPluginAsync = async (fastify) => {
  fastify.get('/', async () => {
    return fastify.repositories.users.findAll();
  });

  fastify.post('/', async (request) => {
    return fastify.repositories.users.create(request.body);
  });
};

export default users;
```
585
.claude/skills/fastify-best-practices/rules/schemas.md
Normal file

@@ -0,0 +1,585 @@
---
name: schemas
description: JSON Schema validation in Fastify with TypeBox
metadata:
  tags: validation, json-schema, schemas, ajv, typebox
---

# JSON Schema Validation

## Use TypeBox for Type-Safe Schemas

**Prefer TypeBox for defining schemas.** It provides TypeScript types automatically and compiles to JSON Schema:

```typescript
import Fastify from 'fastify';
import { Type, type Static } from '@sinclair/typebox';

const app = Fastify();

// Define schema with TypeBox - get TypeScript types for free
const CreateUserBody = Type.Object({
  name: Type.String({ minLength: 1, maxLength: 100 }),
  email: Type.String({ format: 'email' }),
  age: Type.Optional(Type.Integer({ minimum: 0, maximum: 150 })),
});

const UserResponse = Type.Object({
  id: Type.String({ format: 'uuid' }),
  name: Type.String(),
  email: Type.String(),
  createdAt: Type.String({ format: 'date-time' }),
});

// TypeScript types are derived automatically
type CreateUserBodyType = Static<typeof CreateUserBody>;
type UserResponseType = Static<typeof UserResponse>;

app.post<{
  Body: CreateUserBodyType;
  Reply: UserResponseType;
}>('/users', {
  schema: {
    body: CreateUserBody,
    response: {
      201: UserResponse,
    },
  },
}, async (request, reply) => {
  // request.body is fully typed as CreateUserBodyType
  const user = await createUser(request.body);
  reply.code(201);
  return user;
});
```

## TypeBox Common Patterns

```typescript
import { Type, type Static } from '@sinclair/typebox';

// Enums
const Status = Type.Union([
  Type.Literal('active'),
  Type.Literal('inactive'),
  Type.Literal('pending'),
]);

// Arrays
const Tags = Type.Array(Type.String(), { minItems: 1, maxItems: 10 });

// Nested objects
const Address = Type.Object({
  street: Type.String(),
  city: Type.String(),
  country: Type.String(),
  zip: Type.Optional(Type.String()),
});

// References (reusable schemas)
const User = Type.Object({
  id: Type.String({ format: 'uuid' }),
  name: Type.String(),
  address: Address,
  tags: Tags,
  status: Status,
});

// Nullable
const NullableString = Type.Union([Type.String(), Type.Null()]);

// Record/Map
const Metadata = Type.Record(Type.String(), Type.Unknown());
```

## Register TypeBox Schemas Globally

```typescript
import { Type, type Static } from '@sinclair/typebox';

// Define shared schemas
const ErrorResponse = Type.Object({
  error: Type.String(),
  message: Type.String(),
  statusCode: Type.Integer(),
});

const PaginationQuery = Type.Object({
  page: Type.Integer({ minimum: 1, default: 1 }),
  limit: Type.Integer({ minimum: 1, maximum: 100, default: 20 }),
});

// Register globally - attach an $id so routes can reference the schema
app.addSchema({ ...ErrorResponse, $id: 'ErrorResponse' });
app.addSchema({ ...PaginationQuery, $id: 'PaginationQuery' });

// Reference in routes
app.get('/items', {
  schema: {
    querystring: { $ref: 'PaginationQuery#' },
    response: {
      400: { $ref: 'ErrorResponse#' },
    },
  },
}, handler);
```

## Plain JSON Schema (Alternative)

You can also use plain JSON Schema directly:

```typescript
import Fastify from 'fastify';

const app = Fastify();

const createUserSchema = {
  body: {
    type: 'object',
    properties: {
      name: { type: 'string', minLength: 1, maxLength: 100 },
      email: { type: 'string', format: 'email' },
      age: { type: 'integer', minimum: 0, maximum: 150 },
    },
    required: ['name', 'email'],
    additionalProperties: false,
  },
  response: {
    201: {
      type: 'object',
      properties: {
        id: { type: 'string', format: 'uuid' },
        name: { type: 'string' },
        email: { type: 'string' },
        createdAt: { type: 'string', format: 'date-time' },
      },
    },
  },
};

app.post('/users', { schema: createUserSchema }, async (request, reply) => {
  const user = await createUser(request.body);
  reply.code(201);
  return user;
});
```

## Request Validation Parts

Validate different parts of the request:

```typescript
const fullRequestSchema = {
  // URL parameters
  params: {
    type: 'object',
    properties: {
      id: { type: 'string', format: 'uuid' },
    },
    required: ['id'],
  },

  // Query string
  querystring: {
    type: 'object',
    properties: {
      include: { type: 'string', enum: ['posts', 'comments', 'all'] },
      limit: { type: 'integer', minimum: 1, maximum: 100, default: 10 },
    },
  },

  // Request headers
  headers: {
    type: 'object',
    properties: {
      'x-api-key': { type: 'string', minLength: 32 },
    },
    required: ['x-api-key'],
  },

  // Request body
  body: {
    type: 'object',
    properties: {
      data: { type: 'object' },
    },
    required: ['data'],
  },
};

app.put('/resources/:id', { schema: fullRequestSchema }, handler);
```

## Shared Schemas with $id

Define reusable schemas with `$id` and reference them with `$ref`:

```typescript
// Add shared schemas to Fastify
app.addSchema({
  $id: 'user',
  type: 'object',
  properties: {
    id: { type: 'string', format: 'uuid' },
    name: { type: 'string' },
    email: { type: 'string', format: 'email' },
    createdAt: { type: 'string', format: 'date-time' },
  },
  required: ['id', 'name', 'email'],
});

app.addSchema({
  $id: 'userCreate',
  type: 'object',
  properties: {
    name: { type: 'string', minLength: 1 },
    email: { type: 'string', format: 'email' },
  },
  required: ['name', 'email'],
  additionalProperties: false,
});

app.addSchema({
  $id: 'error',
  type: 'object',
  properties: {
    statusCode: { type: 'integer' },
    error: { type: 'string' },
    message: { type: 'string' },
  },
});

// Reference shared schemas
app.post('/users', {
  schema: {
    body: { $ref: 'userCreate#' },
    response: {
      201: { $ref: 'user#' },
      400: { $ref: 'error#' },
    },
  },
}, handler);

app.get('/users/:id', {
  schema: {
    params: {
      type: 'object',
      properties: { id: { type: 'string', format: 'uuid' } },
      required: ['id'],
    },
    response: {
      200: { $ref: 'user#' },
      404: { $ref: 'error#' },
    },
  },
}, handler);
```

## Array Schemas

Define schemas for array responses:

```typescript
app.addSchema({
  $id: 'userList',
  type: 'object',
  properties: {
    users: {
      type: 'array',
      items: { $ref: 'user#' },
    },
    total: { type: 'integer' },
    page: { type: 'integer' },
    pageSize: { type: 'integer' },
  },
});

app.get('/users', {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        page: { type: 'integer', minimum: 1, default: 1 },
        pageSize: { type: 'integer', minimum: 1, maximum: 100, default: 20 },
      },
    },
    response: {
      200: { $ref: 'userList#' },
    },
  },
}, handler);
```

## Custom Formats

Add custom validation formats:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  ajv: {
    customOptions: {
      formats: {
        'iso-country': /^[A-Z]{2}$/,
        'phone': /^\+?[1-9]\d{1,14}$/,
      },
    },
  },
});

// Then reference the custom formats in schemas
app.addSchema({
  $id: 'address',
  type: 'object',
  properties: {
    street: { type: 'string' },
    country: { type: 'string', format: 'iso-country' },
    phone: { type: 'string', format: 'phone' },
  },
});
```

## Custom Keywords

Add custom validation keywords:

```typescript
import Fastify from 'fastify';

const app = Fastify({
  ajv: {
    customOptions: {
      keywords: [
        {
          keyword: 'isEven',
          type: 'number',
          validate: (schema: boolean, data: number) => {
            if (schema) {
              return data % 2 === 0;
            }
            return true;
          },
          errors: false,
        },
      ],
    },
  },
});

// Use custom keyword
app.post('/numbers', {
  schema: {
    body: {
      type: 'object',
      properties: {
        value: { type: 'integer', isEven: true },
      },
    },
  },
}, handler);
```

## Coercion

Fastify coerces types by default for query strings and params:

```typescript
// Query string "?page=5&active=true" becomes:
// { page: 5, active: true } (number and boolean, not strings)

app.get('/items', {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        page: { type: 'integer' }, // "5" -> 5
        active: { type: 'boolean' }, // "true" -> true
        tags: {
          type: 'array',
          items: { type: 'string' }, // ?tags=a&tags=b -> ["a", "b"]
        },
      },
    },
  },
}, handler);
```
|
||||
## Validation Error Handling
|
||||
|
||||
Customize validation error responses:
|
||||
|
||||
```typescript
|
||||
app.setErrorHandler((error, request, reply) => {
|
||||
if (error.validation) {
|
||||
reply.code(400).send({
|
||||
error: 'Validation Error',
|
||||
message: 'Request validation failed',
|
||||
details: error.validation.map((err) => ({
|
||||
field: err.instancePath || err.params?.missingProperty,
|
||||
message: err.message,
|
||||
keyword: err.keyword,
|
||||
})),
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// Handle other errors
|
||||
reply.code(error.statusCode || 500).send({
|
||||
error: error.name,
|
||||
message: error.message,
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
## Schema Compiler Options
|
||||
|
||||
Configure the Ajv schema compiler:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
|
||||
const app = Fastify({
|
||||
ajv: {
|
||||
customOptions: {
|
||||
removeAdditional: 'all', // Remove extra properties
|
||||
useDefaults: true, // Apply default values
|
||||
coerceTypes: true, // Coerce types
|
||||
allErrors: true, // Report all errors, not just first
|
||||
},
|
||||
plugins: [
|
||||
require('ajv-formats'), // Add format validators
|
||||
],
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Nullable Fields
|
||||
|
||||
Handle nullable fields properly:
|
||||
|
||||
```typescript
|
||||
app.addSchema({
|
||||
$id: 'profile',
|
||||
type: 'object',
|
||||
properties: {
|
||||
name: { type: 'string' },
|
||||
bio: { type: ['string', 'null'] }, // Can be string or null
|
||||
avatar: {
|
||||
oneOf: [
|
||||
{ type: 'string', format: 'uri' },
|
||||
{ type: 'null' },
|
||||
],
|
||||
},
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Conditional Validation
|
||||
|
||||
Use if/then/else for conditional validation:
|
||||
|
||||
```typescript
|
||||
app.addSchema({
|
||||
$id: 'payment',
|
||||
type: 'object',
|
||||
properties: {
|
||||
method: { type: 'string', enum: ['card', 'bank'] },
|
||||
cardNumber: { type: 'string' },
|
||||
bankAccount: { type: 'string' },
|
||||
},
|
||||
required: ['method'],
|
||||
if: {
|
||||
properties: { method: { const: 'card' } },
|
||||
},
|
||||
then: {
|
||||
required: ['cardNumber'],
|
||||
},
|
||||
else: {
|
||||
required: ['bankAccount'],
|
||||
},
|
||||
});
|
||||
```
|
||||
|
||||
## Schema Organization
|
||||
|
||||
Organize schemas in a dedicated file:
|
||||
|
||||
```typescript
|
||||
// schemas/index.ts
|
||||
export const schemas = [
|
||||
{
|
||||
$id: 'user',
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string', format: 'uuid' },
|
||||
name: { type: 'string' },
|
||||
email: { type: 'string', format: 'email' },
|
||||
},
|
||||
},
|
||||
{
|
||||
$id: 'error',
|
||||
type: 'object',
|
||||
properties: {
|
||||
statusCode: { type: 'integer' },
|
||||
error: { type: 'string' },
|
||||
message: { type: 'string' },
|
||||
},
|
||||
},
|
||||
];
|
||||
|
||||
// app.ts
|
||||
import { schemas } from './schemas/index.js';
|
||||
|
||||
for (const schema of schemas) {
|
||||
app.addSchema(schema);
|
||||
}
|
||||
```
|
||||
|
||||
## OpenAPI/Swagger Integration
|
||||
|
||||
Schemas work directly with @fastify/swagger:
|
||||
|
||||
```typescript
|
||||
import fastifySwagger from '@fastify/swagger';
|
||||
import fastifySwaggerUi from '@fastify/swagger-ui';
|
||||
|
||||
app.register(fastifySwagger, {
|
||||
openapi: {
|
||||
info: {
|
||||
title: 'My API',
|
||||
version: '1.0.0',
|
||||
},
|
||||
},
|
||||
});
|
||||
|
||||
app.register(fastifySwaggerUi, {
|
||||
routePrefix: '/docs',
|
||||
});
|
||||
|
||||
// Schemas are automatically converted to OpenAPI definitions
|
||||
```
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
Response schemas enable fast-json-stringify for serialization:
|
||||
|
||||
```typescript
|
||||
// With response schema - uses fast-json-stringify (faster)
|
||||
app.get('/users', {
|
||||
schema: {
|
||||
response: {
|
||||
200: {
|
||||
type: 'array',
|
||||
items: { $ref: 'user#' },
|
||||
},
|
||||
},
|
||||
},
|
||||
}, handler);
|
||||
|
||||
// Without response schema - uses JSON.stringify (slower)
|
||||
app.get('/users-slow', handler);
|
||||
```
|
||||
|
||||
Always define response schemas for production APIs to benefit from optimized serialization.
|
||||
475	.claude/skills/fastify-best-practices/rules/serialization.md	Normal file
@@ -0,0 +1,475 @@
---
name: serialization
description: Response serialization in Fastify with TypeBox
metadata:
  tags: serialization, response, json, fast-json-stringify, typebox
---

# Response Serialization

## Use TypeBox for Type-Safe Response Schemas

Define response schemas with TypeBox for automatic TypeScript types and fast serialization:

```typescript
import Fastify from 'fastify';
import { Type, type Static } from '@sinclair/typebox';

const app = Fastify();

// Define response schema with TypeBox
const UserResponse = Type.Object({
  id: Type.String(),
  name: Type.String(),
  email: Type.String(),
});

const UsersResponse = Type.Array(UserResponse);

type UserResponseType = Static<typeof UserResponse>;

// With TypeBox schema - uses fast-json-stringify (faster) + TypeScript types
app.get<{ Reply: Static<typeof UsersResponse> }>('/users', {
  schema: {
    response: {
      200: UsersResponse,
    },
  },
}, async () => {
  return db.users.findAll();
});

// Without schema - uses JSON.stringify (slower), no type safety
app.get('/users-slow', async () => {
  return db.users.findAll();
});
```

## Fast JSON Stringify

Fastify uses `fast-json-stringify` when response schemas are defined. This provides:

1. **Performance**: 2-3x faster serialization than JSON.stringify
2. **Security**: Only defined properties are serialized (strips sensitive data)
3. **Type coercion**: Ensures output matches the schema
4. **TypeScript**: Full type inference with TypeBox
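The security benefit comes from the serializer emitting only schema-declared properties. A minimal sketch of that idea (a hypothetical illustration, not the actual fast-json-stringify implementation, which compiles much faster specialized code):

```javascript
// Hypothetical illustration: serialize only the properties declared in a schema.
function compileSerializer(schema) {
  const keys = Object.keys(schema.properties);
  return (data) => {
    const out = {};
    for (const key of keys) {
      if (data[key] !== undefined) out[key] = data[key];
    }
    return JSON.stringify(out);
  };
}

const stringifyUser = compileSerializer({
  type: 'object',
  properties: { id: {}, name: {} },
});

// The password field is silently dropped, mirroring Fastify's behavior.
console.log(stringifyUser({ id: '1', name: 'John', password: 'hunter2' }));
// → {"id":"1","name":"John"}
```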
## Response Schema Benefits

1. **Performance**: 2-3x faster serialization
2. **Security**: Only defined properties are included
3. **Documentation**: OpenAPI/Swagger integration
4. **Type coercion**: Ensures correct output types

```typescript
app.get('/user/:id', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          id: { type: 'string' },
          name: { type: 'string' },
          // password is NOT in schema, so it's stripped
        },
      },
    },
  },
}, async (request) => {
  const user = await db.users.findById(request.params.id);
  // Even if user has a password field, it won't be serialized
  return user;
});
```

## Multiple Status Code Schemas

Define schemas for different response codes:

```typescript
app.get('/users/:id', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          id: { type: 'string' },
          name: { type: 'string' },
          email: { type: 'string' },
        },
      },
      404: {
        type: 'object',
        properties: {
          statusCode: { type: 'integer' },
          error: { type: 'string' },
          message: { type: 'string' },
        },
      },
    },
  },
}, async (request, reply) => {
  const user = await db.users.findById(request.params.id);

  if (!user) {
    reply.code(404);
    return { statusCode: 404, error: 'Not Found', message: 'User not found' };
  }

  return user;
});
```
## Default Response Schema

Use status-code ranges like '4xx'/'5xx' (or 'default') for common error responses:

```typescript
app.get('/resource', {
  schema: {
    response: {
      200: { $ref: 'resource#' },
      '4xx': {
        type: 'object',
        properties: {
          statusCode: { type: 'integer' },
          error: { type: 'string' },
          message: { type: 'string' },
        },
      },
      '5xx': {
        type: 'object',
        properties: {
          statusCode: { type: 'integer' },
          error: { type: 'string' },
        },
      },
    },
  },
}, handler);
```

## Custom Serializers

Create custom serialization functions:

```typescript
// Per-route serializer
app.get('/custom', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          value: { type: 'string' },
        },
      },
    },
  },
  serializerCompiler: ({ schema }) => {
    return (data) => {
      // Custom serialization logic
      return JSON.stringify({
        value: String(data.value).toUpperCase(),
        serializedAt: new Date().toISOString(),
      });
    };
  },
}, async () => {
  return { value: 'hello' };
});
```

## Shared Serializers

Set the serializer compiler for the whole instance:

```typescript
import Fastify from 'fastify';
import fastJson from 'fast-json-stringify';

const app = Fastify();

app.setSerializerCompiler(({ schema, method, url, httpStatus }) => {
  // Custom compilation logic
  const stringify = fastJson(schema);
  return (data) => stringify(data);
});
```
## Serialization with Type Coercion

fast-json-stringify coerces types:

```typescript
app.get('/data', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          count: { type: 'integer' }, // '5' -> 5
          active: { type: 'boolean' }, // 'true' -> true
          tags: {
            type: 'array',
            items: { type: 'string' }, // [1, 2] -> ['1', '2']
          },
        },
      },
    },
  },
}, async () => {
  return {
    count: '5', // Coerced to integer
    active: 'true', // Coerced to boolean
    tags: [1, 2, 3], // Coerced to strings
  };
});
```

## Nullable Fields

Handle nullable fields properly:

```typescript
app.get('/profile', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          bio: { type: ['string', 'null'] },
          avatar: {
            oneOf: [
              { type: 'string', format: 'uri' },
              { type: 'null' },
            ],
          },
        },
      },
    },
  },
}, async () => {
  return {
    name: 'John',
    bio: null,
    avatar: null,
  };
});
```

## Additional Properties

Control extra properties in the response:

```typescript
// Strip additional properties (default)
app.get('/strict', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          id: { type: 'string' },
          name: { type: 'string' },
        },
        additionalProperties: false,
      },
    },
  },
}, async () => {
  return { id: '1', name: 'John', secret: 'hidden' };
  // Output: { "id": "1", "name": "John" }
});

// Allow additional properties
app.get('/flexible', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          id: { type: 'string' },
        },
        additionalProperties: true,
      },
    },
  },
}, async () => {
  return { id: '1', extra: 'included' };
  // Output: { "id": "1", "extra": "included" }
});
```
## Nested Objects

Serialize nested structures:

```typescript
app.addSchema({
  $id: 'address',
  type: 'object',
  properties: {
    street: { type: 'string' },
    city: { type: 'string' },
    country: { type: 'string' },
  },
});

app.get('/user', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          address: { $ref: 'address#' },
          contacts: {
            type: 'array',
            items: {
              type: 'object',
              properties: {
                type: { type: 'string' },
                value: { type: 'string' },
              },
            },
          },
        },
      },
    },
  },
}, async () => {
  return {
    name: 'John',
    address: { street: '123 Main', city: 'Boston', country: 'USA' },
    contacts: [
      { type: 'email', value: 'john@example.com' },
      { type: 'phone', value: '+1234567890' },
    ],
  };
});
```

## Date Serialization

Handle dates consistently:

```typescript
app.get('/events', {
  schema: {
    response: {
      200: {
        type: 'array',
        items: {
          type: 'object',
          properties: {
            name: { type: 'string' },
            date: { type: 'string', format: 'date-time' },
          },
        },
      },
    },
  },
}, async () => {
  const events = await db.events.findAll();

  // Convert Date objects to ISO strings
  return events.map((e) => ({
    ...e,
    date: e.date.toISOString(),
  }));
});
```
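When dates are nested at arbitrary depth, a small recursive helper can normalize them before returning. This is a hypothetical utility sketch, not part of Fastify:

```javascript
// Hypothetical helper: recursively convert Date instances to ISO-8601 strings.
function toSerializable(value) {
  if (value instanceof Date) return value.toISOString();
  if (Array.isArray(value)) return value.map(toSerializable);
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([key, v]) => [key, toSerializable(v)]),
    );
  }
  return value;
}

const event = { name: 'Launch', date: new Date(0), meta: { updated: new Date(0) } };
console.log(toSerializable(event));
// → { name: 'Launch', date: '1970-01-01T00:00:00.000Z', meta: { updated: '1970-01-01T00:00:00.000Z' } }
```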
## BigInt Serialization

Handle BigInt values:

```typescript
// BigInt is not JSON serializable by default
app.get('/large-number', {
  schema: {
    response: {
      200: {
        type: 'object',
        properties: {
          id: { type: 'string' }, // Serialize as string
          count: { type: 'integer' },
        },
      },
    },
  },
}, async () => {
  const bigValue = 9007199254740993n;

  return {
    id: bigValue.toString(), // Convert to string
    count: Number(bigValue), // Or a number — only safe within Number.MAX_SAFE_INTEGER
  };
});
```
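Outside of schema-based serialization, one common workaround is a `JSON.stringify` replacer that converts BigInt values to strings (an illustrative sketch, not Fastify-specific):

```javascript
// Without a replacer, JSON.stringify throws a TypeError on BigInt values.
const data = { id: 9007199254740993n, name: 'widget' };

const json = JSON.stringify(data, (_key, value) =>
  typeof value === 'bigint' ? value.toString() : value,
);

console.log(json);
// → {"id":"9007199254740993","name":"widget"}
```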
## Stream Responses

Stream responses bypass serialization:

```typescript
import { createReadStream } from 'node:fs';

app.get('/file', async (request, reply) => {
  const stream = createReadStream('./data.json');
  reply.type('application/json');
  return reply.send(stream);
});

// Streaming a JSON array
app.get('/stream', async (request, reply) => {
  reply.type('application/json');

  const cursor = db.users.findCursor();

  reply.raw.write('[');
  let first = true;

  for await (const user of cursor) {
    if (!first) reply.raw.write(',');
    reply.raw.write(JSON.stringify(user));
    first = false;
  }

  reply.raw.write(']');
  reply.raw.end();
});
```

## Pre-Serialization Hook

Modify data before serialization:

```typescript
app.addHook('preSerialization', async (request, reply, payload) => {
  // Add metadata to responses
  if (payload && typeof payload === 'object' && !Array.isArray(payload)) {
    return {
      ...payload,
      _links: {
        self: request.url,
      },
    };
  }
  return payload;
});
```

## Disable Serialization

Skip serialization for specific routes:

```typescript
app.get('/raw', async (request, reply) => {
  const data = JSON.stringify({ raw: true });
  reply.type('application/json');
  reply.serializer((payload) => payload); // Pass through
  return data;
});
```
536	.claude/skills/fastify-best-practices/rules/testing.md	Normal file
@@ -0,0 +1,536 @@
---
name: testing
description: Testing Fastify applications with inject()
metadata:
  tags: testing, inject, node-test, integration, unit
---

# Testing Fastify Applications

## Using inject() for Request Testing

Fastify's `inject()` method simulates HTTP requests without network overhead:

```typescript
import { describe, it, before, after } from 'node:test';
import { buildApp } from './app.js';

describe('User API', () => {
  let app;

  before(async () => {
    app = await buildApp();
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  it('should return users list', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/users',
    });

    t.assert.equal(response.statusCode, 200);
    t.assert.equal(response.headers['content-type'], 'application/json; charset=utf-8');

    const body = response.json();
    t.assert.ok(Array.isArray(body.users));
  });

  it('should create a user', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: {
        name: 'John Doe',
        email: 'john@example.com',
      },
    });

    t.assert.equal(response.statusCode, 201);

    const body = response.json();
    t.assert.equal(body.name, 'John Doe');
    t.assert.ok(body.id);
  });
});
```

## Testing with Headers and Authentication

Test authenticated endpoints:

```typescript
describe('Protected Routes', () => {
  let app;
  let authToken;

  before(async () => {
    app = await buildApp();
    await app.ready();

    // Get an auth token
    const loginResponse = await app.inject({
      method: 'POST',
      url: '/auth/login',
      payload: {
        email: 'test@example.com',
        password: 'password123',
      },
    });

    authToken = loginResponse.json().token;
  });

  after(async () => {
    await app.close();
  });

  it('should reject unauthenticated requests', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/profile',
    });

    t.assert.equal(response.statusCode, 401);
  });

  it('should return profile for authenticated user', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/profile',
      headers: {
        authorization: `Bearer ${authToken}`,
      },
    });

    t.assert.equal(response.statusCode, 200);
    t.assert.equal(response.json().email, 'test@example.com');
  });
});
```
## Testing Query Parameters

Test routes with query strings:

```typescript
it('should filter users by status', async (t) => {
  const response = await app.inject({
    method: 'GET',
    url: '/users',
    query: {
      status: 'active',
      page: '1',
      limit: '10',
    },
  });

  t.assert.equal(response.statusCode, 200);
  const body = response.json();
  t.assert.ok(body.users.every((u) => u.status === 'active'));
});

// Or use a URL with a query string
it('should search users', async (t) => {
  const response = await app.inject({
    method: 'GET',
    url: '/users?q=john&sort=name',
  });

  t.assert.equal(response.statusCode, 200);
});
```

## Testing URL Parameters

Test routes with path parameters:

```typescript
it('should return user by id', async (t) => {
  const userId = 'user-123';

  const response = await app.inject({
    method: 'GET',
    url: `/users/${userId}`,
  });

  t.assert.equal(response.statusCode, 200);
  t.assert.equal(response.json().id, userId);
});

it('should return 404 for non-existent user', async (t) => {
  const response = await app.inject({
    method: 'GET',
    url: '/users/non-existent',
  });

  t.assert.equal(response.statusCode, 404);
});
```

## Testing Validation Errors

Test schema validation:

```typescript
describe('Validation', () => {
  it('should reject invalid email', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: {
        name: 'John',
        email: 'not-an-email',
      },
    });

    t.assert.equal(response.statusCode, 400);
    const body = response.json();
    t.assert.ok(body.message.includes('email'));
  });

  it('should reject missing required fields', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: {
        name: 'John',
        // missing email
      },
    });

    t.assert.equal(response.statusCode, 400);
  });

  it('should coerce query parameters', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/items?limit=10&active=true',
    });

    t.assert.equal(response.statusCode, 200);
    // limit is coerced to a number, active to a boolean
  });
});
```
## Testing File Uploads

Test multipart form data:

```typescript
import { createReadStream } from 'node:fs';
import FormData from 'form-data';

it('should upload file', async (t) => {
  const form = new FormData();
  form.append('file', createReadStream('./test/fixtures/test.pdf'));
  form.append('name', 'test-document');

  const response = await app.inject({
    method: 'POST',
    url: '/upload',
    payload: form,
    headers: form.getHeaders(),
  });

  t.assert.equal(response.statusCode, 200);
  t.assert.ok(response.json().fileId);
});
```

## Testing Streams

Test streaming responses:

```typescript
it('should stream large file', async (t) => {
  const response = await app.inject({
    method: 'GET',
    url: '/files/large-file',
  });

  t.assert.equal(response.statusCode, 200);
  t.assert.ok(response.rawPayload.length > 0);
});
```

## Mocking Dependencies

Mock external services and databases:

```typescript
import { describe, it, before, after, mock } from 'node:test';
import Fastify from 'fastify';

describe('User Service', () => {
  let app;

  before(async () => {
    // Create an app with mocked dependencies
    const mockDb = {
      users: {
        findAll: mock.fn(async () => [
          { id: '1', name: 'User 1' },
          { id: '2', name: 'User 2' },
        ]),
        findById: mock.fn(async (id) => {
          if (id === '1') return { id: '1', name: 'User 1' };
          return null;
        }),
        create: mock.fn(async (data) => ({ id: 'new-id', ...data })),
      },
    };

    app = Fastify();
    app.decorate('db', mockDb);
    app.register(import('./routes/users.js'));
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  it('should call findAll', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/users',
    });

    t.assert.equal(response.statusCode, 200);
    t.assert.equal(app.db.users.findAll.mock.calls.length, 1);
  });
});
```
## Testing Plugins in Isolation

Test plugins independently:

```typescript
import { describe, it, before, after } from 'node:test';
import Fastify from 'fastify';
import cachePlugin from './plugins/cache.js';

describe('Cache Plugin', () => {
  let app;

  before(async () => {
    app = Fastify();
    app.register(cachePlugin, { ttl: 1000 });
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  it('should decorate fastify with cache', (t) => {
    t.assert.ok(app.hasDecorator('cache'));
    t.assert.equal(typeof app.cache.get, 'function');
    t.assert.equal(typeof app.cache.set, 'function');
  });

  it('should cache and retrieve values', (t) => {
    app.cache.set('key', 'value');
    t.assert.equal(app.cache.get('key'), 'value');
  });
});
```

## Testing Hooks

Test hook behavior:

```typescript
describe('Hooks', () => {
  it('should add request id header', async (t) => {
    const response = await app.inject({
      method: 'GET',
      url: '/health',
    });

    t.assert.ok(response.headers['x-request-id']);
  });

  it('should log request timing', async (t) => {
    const logs = [];
    const app = Fastify({
      logger: {
        level: 'info',
        stream: {
          write: (msg) => logs.push(JSON.parse(msg)),
        },
      },
    });

    app.register(import('./app.js'));
    await app.ready();

    await app.inject({ method: 'GET', url: '/health' });

    const responseLog = logs.find((l) => l.msg?.includes('completed'));
    t.assert.ok(responseLog);
    t.assert.ok(responseLog.responseTime);

    await app.close();
  });
});
```
## Test Factory Pattern

Create a reusable test app builder:

```typescript
// test/helper.ts
import Fastify from 'fastify';
import type { FastifyInstance } from 'fastify';

interface TestContext {
  app: FastifyInstance;
  inject: FastifyInstance['inject'];
}

export async function buildTestApp(options = {}): Promise<TestContext> {
  const app = Fastify({
    logger: false, // Disable logging in tests
    ...options,
  });

  // Register plugins
  app.register(import('../src/plugins/database.js'), {
    connectionString: process.env.TEST_DATABASE_URL,
  });
  app.register(import('../src/routes/index.js'));

  await app.ready();

  return {
    app,
    inject: app.inject.bind(app),
  };
}

// Usage in tests
describe('API Tests', () => {
  let ctx: TestContext;

  before(async () => {
    ctx = await buildTestApp();
  });

  after(async () => {
    await ctx.app.close();
  });

  it('should work', async (t) => {
    const response = await ctx.inject({
      method: 'GET',
      url: '/health',
    });
    t.assert.equal(response.statusCode, 200);
  });
});
```
## Database Testing with Transactions

Use transactions for test isolation:

```typescript
describe('Database Integration', () => {
  let app;
  let transaction;

  before(async () => {
    app = await buildApp();
    await app.ready();
  });

  after(async () => {
    await app.close();
  });

  beforeEach(async () => {
    transaction = await app.db.beginTransaction();
    app.db.setTransaction(transaction);
  });

  afterEach(async () => {
    await transaction.rollback();
  });

  it('should create user', async (t) => {
    const response = await app.inject({
      method: 'POST',
      url: '/users',
      payload: { name: 'Test', email: 'test@example.com' },
    });

    t.assert.equal(response.statusCode, 201);
    // The transaction is rolled back after the test
  });
});
```

## Parallel Test Execution

Structure tests for parallel execution:

```typescript
// Tests run in parallel by default with node:test
// Use separate app instances or proper isolation

import { describe, it, after } from 'node:test';

describe('User API', async () => {
  // Each test suite gets its own app instance
  const app = await buildTestApp();

  it('test 1', async (t) => {
    // ...
  });

  it('test 2', async (t) => {
    // ...
  });

  // Cleanup after all tests in this suite
  after(() => app.close());
});

describe('Post API', async () => {
  const app = await buildTestApp();

  it('test 1', async (t) => {
    // ...
  });

  after(() => app.close());
});
```

## Running Tests

```bash
# Run all tests
node --test

# Run with TypeScript
node --test src/**/*.test.ts

# Run a specific file
node --test src/routes/users.test.ts

# With coverage
node --test --experimental-test-coverage

# Watch mode
node --test --watch
```
458	.claude/skills/fastify-best-practices/rules/typescript.md	Normal file
@@ -0,0 +1,458 @@
---
name: typescript
description: TypeScript integration with Fastify
metadata:
  tags: typescript, types, generics, type-safety
---

# TypeScript Integration

## Type Stripping with Node.js

Use Node.js built-in type stripping (Node.js 22.6+):

```bash
# Run TypeScript directly
node --experimental-strip-types app.ts

# In Node.js 23+, type stripping is enabled by default
node app.ts
```

```json
// package.json
{
  "type": "module",
  "scripts": {
    "start": "node app.ts",
    "dev": "node --watch app.ts"
  }
}
```

```jsonc
// tsconfig.json for type stripping
{
  "compilerOptions": {
    "target": "ESNext",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "verbatimModuleSyntax": true,
    "erasableSyntaxOnly": true,
    "noEmit": true,
    "strict": true
  }
}
```
## Basic Type Safety

Type your Fastify application:

```typescript
import Fastify, { type FastifyInstance, type FastifyRequest, type FastifyReply } from 'fastify';

const app: FastifyInstance = Fastify({ logger: true });

app.get('/health', async (request: FastifyRequest, reply: FastifyReply) => {
  return { status: 'ok' };
});

await app.listen({ port: 3000 });
```

## Typing Route Handlers

Use generics to type request parts:

```typescript
import type { FastifyRequest, FastifyReply } from 'fastify';

interface CreateUserBody {
  name: string;
  email: string;
}

interface UserParams {
  id: string;
}

interface UserQuery {
  include?: string;
}

// Type the request with generics
app.post<{
  Body: CreateUserBody;
}>('/users', async (request, reply) => {
  const { name, email } = request.body; // Fully typed
  return { name, email };
});

app.get<{
  Params: UserParams;
  Querystring: UserQuery;
}>('/users/:id', async (request) => {
  const { id } = request.params; // string
  const { include } = request.query; // string | undefined
  return { id, include };
});

// Full route options typing
app.route<{
  Params: UserParams;
  Querystring: UserQuery;
  Body: CreateUserBody;
  Reply: { user: { id: string; name: string } };
}>({
  method: 'PUT',
  url: '/users/:id',
  handler: async (request, reply) => {
    return { user: { id: request.params.id, name: request.body.name } };
  },
});
```
## Type Providers
|
||||
|
||||
Use @fastify/type-provider-typebox for runtime + compile-time safety:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
import { TypeBoxTypeProvider } from '@fastify/type-provider-typebox';
|
||||
import { Type } from '@sinclair/typebox';
|
||||
|
||||
const app = Fastify().withTypeProvider<TypeBoxTypeProvider>();
|
||||
|
||||
const UserSchema = Type.Object({
|
||||
id: Type.String(),
|
||||
name: Type.String(),
|
||||
email: Type.String({ format: 'email' }),
|
||||
});
|
||||
|
||||
const CreateUserSchema = Type.Object({
|
||||
name: Type.String({ minLength: 1 }),
|
||||
email: Type.String({ format: 'email' }),
|
||||
});
|
||||
|
||||
app.post('/users', {
|
||||
schema: {
|
||||
body: CreateUserSchema,
|
||||
response: {
|
||||
201: UserSchema,
|
||||
},
|
||||
},
|
||||
}, async (request, reply) => {
|
||||
// request.body is typed as { name: string; email: string }
|
||||
const { name, email } = request.body;
|
||||
|
||||
reply.code(201);
|
||||
return { id: 'generated', name, email };
|
||||
});
|
||||
```
|
||||
|
||||
## Typing Decorators
|
||||
|
||||
Extend Fastify types with declaration merging:
|
||||
|
||||
```typescript
|
||||
import Fastify from 'fastify';
|
||||
|
||||
// Declare types for decorators
|
||||
declare module 'fastify' {
|
||||
interface FastifyInstance {
|
||||
config: {
|
||||
port: number;
|
||||
host: string;
|
||||
};
|
||||
db: Database;
|
||||
}
|
||||
|
||||
interface FastifyRequest {
|
||||
user?: {
|
||||
id: string;
|
||||
email: string;
|
||||
role: string;
|
||||
};
|
||||
startTime: number;
|
||||
}
|
||||
|
||||
interface FastifyReply {
|
||||
sendSuccess: (data: unknown) => void;
|
||||
}
|
||||
}
|
||||
|
||||
const app = Fastify();
|
||||
|
||||
// Add decorators
|
||||
app.decorate('config', { port: 3000, host: 'localhost' });
|
||||
app.decorate('db', new Database());
|
||||
|
||||
app.decorateRequest('user', null);
|
||||
app.decorateRequest('startTime', 0);
|
||||
|
||||
app.decorateReply('sendSuccess', function (data: unknown) {
|
||||
this.send({ success: true, data });
|
||||
});
|
||||
|
||||
// Now fully typed
|
||||
app.get('/profile', async (request, reply) => {
|
||||
const user = request.user; // { id: string; email: string; role: string } | undefined
|
||||
const config = app.config; // { port: number; host: string }
|
||||
|
||||
reply.sendSuccess({ user });
|
||||
});
|
||||
```
|
||||
|
||||
## Typing Plugins
|
||||
|
||||
Type plugin options and exports:
|
||||
|
||||
```typescript
|
||||
import fp from 'fastify-plugin';
|
||||
import type { FastifyPluginAsync } from 'fastify';
|
||||
|
||||
interface DatabasePluginOptions {
|
||||
connectionString: string;
|
||||
poolSize?: number;
|
||||
}
|
||||
|
||||
declare module 'fastify' {
|
||||
interface FastifyInstance {
|
||||
db: {
|
||||
query: (sql: string, params?: unknown[]) => Promise<unknown[]>;
|
||||
close: () => Promise<void>;
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
const databasePlugin: FastifyPluginAsync<DatabasePluginOptions> = async (
|
||||
fastify,
|
||||
options,
|
||||
) => {
|
||||
const { connectionString, poolSize = 10 } = options;
|
||||
|
||||
const db = await createConnection(connectionString, poolSize);
|
||||
|
||||
fastify.decorate('db', {
|
||||
query: (sql: string, params?: unknown[]) => db.query(sql, params),
|
||||
close: () => db.end(),
|
||||
});
|
||||
|
||||
fastify.addHook('onClose', async () => {
|
||||
await db.end();
|
||||
});
|
||||
};
|
||||
|
||||
export default fp(databasePlugin, {
|
||||
name: 'database',
|
||||
});
|
||||
```
|
||||
|
||||
## Typing Hooks
|
||||
|
||||
Type hook functions:
|
||||
|
||||
```typescript
|
||||
import type {
|
||||
FastifyRequest,
|
||||
FastifyReply,
|
||||
onRequestHookHandler,
|
||||
preHandlerHookHandler,
|
||||
} from 'fastify';
|
||||
|
||||
const authHook: preHandlerHookHandler = async (
|
||||
request: FastifyRequest,
|
||||
reply: FastifyReply,
|
||||
) => {
|
||||
const token = request.headers.authorization;
|
||||
if (!token) {
|
||||
reply.code(401).send({ error: 'Unauthorized' });
|
||||
return;
|
||||
}
|
||||
request.user = await verifyToken(token);
|
||||
};
|
||||
|
||||
const timingHook: onRequestHookHandler = async (request) => {
|
||||
request.startTime = Date.now();
|
||||
};
|
||||
|
||||
app.addHook('onRequest', timingHook);
|
||||
app.addHook('preHandler', authHook);
|
||||
```
|
||||
|
||||
## Typing Schema Objects
|
||||
|
||||
Create reusable typed schemas:
|
||||
|
||||
```typescript
|
||||
import type { JSONSchema7 } from 'json-schema';
|
||||
|
||||
// Define schema with const assertion for type inference
|
||||
const userSchema = {
|
||||
type: 'object',
|
||||
properties: {
|
||||
id: { type: 'string' },
|
||||
name: { type: 'string' },
|
||||
email: { type: 'string', format: 'email' },
|
||||
},
|
||||
required: ['id', 'name', 'email'],
|
||||
} as const satisfies JSONSchema7;
|
||||
|
||||
// Infer TypeScript type from schema
|
||||
type User = {
|
||||
id: string;
|
||||
name: string;
|
||||
email: string;
|
||||
};
|
||||
|
||||
app.get<{ Reply: User }>('/users/:id', {
|
||||
schema: {
|
||||
response: {
|
||||
200: userSchema,
|
||||
},
|
||||
},
|
||||
}, async (request) => {
|
||||
return { id: '1', name: 'John', email: 'john@example.com' };
|
||||
});
|
||||
```
|
||||
|
||||
## Shared Types
|
||||
|
||||
Organize types in dedicated files:
|
||||
|
||||
```typescript
|
||||
// types/index.ts
|
||||
export interface User {
|
||||
id: string;
|
||||
name: string;
|
||||
email: string;
|
||||
role: 'admin' | 'user';
|
||||
}
|
||||
|
||||
export interface CreateUserInput {
|
||||
name: string;
|
||||
email: string;
|
||||
}
|
||||
|
||||
export interface PaginationQuery {
|
||||
page?: number;
|
||||
limit?: number;
|
||||
sort?: string;
|
||||
}
|
||||
|
||||
// routes/users.ts
|
||||
import type { FastifyInstance } from 'fastify';
|
||||
import type { User, CreateUserInput, PaginationQuery } from '../types/index.js';
|
||||
|
||||
export default async function userRoutes(fastify: FastifyInstance) {
|
||||
fastify.get<{
|
||||
Querystring: PaginationQuery;
|
||||
Reply: { users: User[]; total: number };
|
||||
}>('/', async (request) => {
|
||||
const { page = 1, limit = 10 } = request.query;
|
||||
// ...
|
||||
});
|
||||
|
||||
fastify.post<{
|
||||
Body: CreateUserInput;
|
||||
Reply: User;
|
||||
}>('/', async (request, reply) => {
|
||||
reply.code(201);
|
||||
// ...
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
## Type-Safe Route Registration
|
||||
|
||||
Create typed route factories:
|
||||
|
||||
```typescript
|
||||
import type { FastifyInstance, RouteOptions } from 'fastify';
|
||||
|
||||
function createCrudRoutes<T extends { id: string }>(
|
||||
fastify: FastifyInstance,
|
||||
options: {
|
||||
prefix: string;
|
||||
schema: {
|
||||
item: object;
|
||||
create: object;
|
||||
update: object;
|
||||
};
|
||||
handlers: {
|
||||
list: () => Promise<T[]>;
|
||||
get: (id: string) => Promise<T | null>;
|
||||
create: (data: unknown) => Promise<T>;
|
||||
update: (id: string, data: unknown) => Promise<T>;
|
||||
delete: (id: string) => Promise<void>;
|
||||
};
|
||||
},
|
||||
) {
|
||||
const { prefix, schema, handlers } = options;
|
||||
|
||||
fastify.get(`${prefix}`, {
|
||||
schema: { response: { 200: { type: 'array', items: schema.item } } },
|
||||
}, async () => handlers.list());
|
||||
|
||||
fastify.get(`${prefix}/:id`, {
|
||||
schema: { response: { 200: schema.item } },
|
||||
}, async (request) => {
|
||||
const item = await handlers.get((request.params as { id: string }).id);
|
||||
if (!item) throw { statusCode: 404, message: 'Not found' };
|
||||
return item;
|
||||
});
|
||||
|
||||
// ... more routes
|
||||
}
|
||||
```
|
||||
|
||||
## Avoiding Type Gymnastics
|
||||
|
||||
Keep types simple and practical:
|
||||
|
||||
```typescript
|
||||
// GOOD - simple, readable types
|
||||
interface UserRequest {
|
||||
Params: { id: string };
|
||||
Body: { name: string };
|
||||
}
|
||||
|
||||
app.put<UserRequest>('/users/:id', handler);
|
||||
|
||||
// AVOID - overly complex generic types
|
||||
type DeepPartial<T> = T extends object ? {
|
||||
[P in keyof T]?: DeepPartial<T[P]>;
|
||||
} : T;
|
||||
|
||||
// AVOID - excessive type inference
|
||||
type InferSchemaType<T> = T extends { properties: infer P }
|
||||
? { [K in keyof P]: InferPropertyType<P[K]> }
|
||||
: never;
|
||||
```
|
||||
|
||||
## Type Checking Without Compilation
|
||||
|
||||
Use TypeScript for type checking only:
|
||||
|
||||
```bash
|
||||
# Type check without emitting
|
||||
npx tsc --noEmit
|
||||
|
||||
# Watch mode
|
||||
npx tsc --noEmit --watch
|
||||
|
||||
# In CI
|
||||
npm run typecheck
|
||||
```
|
||||
|
||||
```json
|
||||
// package.json
|
||||
{
|
||||
"scripts": {
|
||||
"start": "node app.ts",
|
||||
"typecheck": "tsc --noEmit",
|
||||
"test": "npm run typecheck && node --test"
|
||||
}
|
||||
}
|
||||
```
|
||||
421
.claude/skills/fastify-best-practices/rules/websockets.md
Normal file

@ -0,0 +1,421 @@
---
name: websockets
description: WebSocket support in Fastify
metadata:
  tags: websockets, realtime, ws, socket
---

# WebSocket Support

## Using @fastify/websocket

Add WebSocket support to Fastify:

```typescript
import Fastify from 'fastify';
import websocket from '@fastify/websocket';

const app = Fastify();

app.register(websocket);

app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('message', (message) => {
    const data = message.toString();
    console.log('Received:', data);

    // Echo back
    socket.send(`Echo: ${data}`);
  });

  socket.on('close', () => {
    console.log('Client disconnected');
  });

  socket.on('error', (error) => {
    console.error('WebSocket error:', error);
  });
});

await app.listen({ port: 3000 });
```

## WebSocket with Hooks

Use Fastify hooks with WebSocket routes:

```typescript
app.register(async function wsRoutes(fastify) {
  // This hook runs before WebSocket upgrade
  fastify.addHook('preValidation', async (request, reply) => {
    const token = request.headers.authorization;
    if (!token) {
      reply.code(401).send({ error: 'Unauthorized' });
      return;
    }
    request.user = await verifyToken(token);
  });

  fastify.get('/ws', { websocket: true }, (socket, request) => {
    console.log('Connected user:', request.user.id);

    socket.on('message', (message) => {
      // Handle authenticated messages
    });
  });
});
```

## Connection Options

Configure WebSocket server options:

```typescript
app.register(websocket, {
  options: {
    maxPayload: 1048576, // 1MB max message size
    clientTracking: true,
    perMessageDeflate: {
      zlibDeflateOptions: {
        chunkSize: 1024,
        memLevel: 7,
        level: 3,
      },
      zlibInflateOptions: {
        chunkSize: 10 * 1024,
      },
    },
  },
});
```

## Broadcast to All Clients

Broadcast messages to connected clients:

```typescript
const clients = new Set<WebSocket>();

app.get('/ws', { websocket: true }, (socket, request) => {
  clients.add(socket);

  socket.on('close', () => {
    clients.delete(socket);
  });

  socket.on('message', (message) => {
    // Broadcast to all other clients
    for (const client of clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message);
      }
    }
  });
});

// Broadcast from HTTP route
app.post('/broadcast', async (request) => {
  const { message } = request.body;

  for (const client of clients) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify({ type: 'broadcast', message }));
    }
  }

  return { sent: clients.size };
});
```

## Rooms/Channels Pattern

Organize connections into rooms:

```typescript
const rooms = new Map<string, Set<WebSocket>>();

function joinRoom(socket: WebSocket, roomId: string) {
  if (!rooms.has(roomId)) {
    rooms.set(roomId, new Set());
  }
  rooms.get(roomId)!.add(socket);
}

function leaveRoom(socket: WebSocket, roomId: string) {
  rooms.get(roomId)?.delete(socket);
  if (rooms.get(roomId)?.size === 0) {
    rooms.delete(roomId);
  }
}

function broadcastToRoom(roomId: string, message: string, exclude?: WebSocket) {
  const room = rooms.get(roomId);
  if (!room) return;

  for (const client of room) {
    if (client !== exclude && client.readyState === WebSocket.OPEN) {
      client.send(message);
    }
  }
}

app.get('/ws/:roomId', { websocket: true }, (socket, request) => {
  const { roomId } = request.params as { roomId: string };

  joinRoom(socket, roomId);

  socket.on('message', (message) => {
    broadcastToRoom(roomId, message.toString(), socket);
  });

  socket.on('close', () => {
    leaveRoom(socket, roomId);
  });
});
```

## Structured Message Protocol

Use JSON for structured messages:

```typescript
interface WSMessage {
  type: string;
  payload?: unknown;
  id?: string;
}

app.get('/ws', { websocket: true }, (socket, request) => {
  function send(message: WSMessage) {
    socket.send(JSON.stringify(message));
  }

  socket.on('message', (raw) => {
    let message: WSMessage;

    try {
      message = JSON.parse(raw.toString());
    } catch {
      send({ type: 'error', payload: 'Invalid JSON' });
      return;
    }

    switch (message.type) {
      case 'ping':
        send({ type: 'pong', id: message.id });
        break;

      case 'subscribe':
        handleSubscribe(socket, message.payload);
        send({ type: 'subscribed', payload: message.payload, id: message.id });
        break;

      case 'message':
        handleMessage(socket, message.payload);
        break;

      default:
        send({ type: 'error', payload: 'Unknown message type' });
    }
  });
});
```

## Heartbeat/Ping-Pong

Keep connections alive:

```typescript
const HEARTBEAT_INTERVAL = 30000;
const clients = new Map<WebSocket, { isAlive: boolean }>();

app.get('/ws', { websocket: true }, (socket, request) => {
  clients.set(socket, { isAlive: true });

  socket.on('pong', () => {
    const client = clients.get(socket);
    if (client) client.isAlive = true;
  });

  socket.on('close', () => {
    clients.delete(socket);
  });
});

// Heartbeat interval
setInterval(() => {
  for (const [socket, state] of clients) {
    if (!state.isAlive) {
      socket.terminate();
      clients.delete(socket);
      continue;
    }

    state.isAlive = false;
    socket.ping();
  }
}, HEARTBEAT_INTERVAL);
```

## Authentication

Authenticate WebSocket connections:

```typescript
app.get('/ws', {
  websocket: true,
  preValidation: async (request, reply) => {
    // Authenticate via query parameter or header
    const token = request.query.token || request.headers.authorization?.replace('Bearer ', '');

    if (!token) {
      reply.code(401).send({ error: 'Token required' });
      return;
    }

    try {
      request.user = await verifyToken(token);
    } catch {
      reply.code(401).send({ error: 'Invalid token' });
    }
  },
}, (socket, request) => {
  console.log('Authenticated user:', request.user);

  socket.on('message', (message) => {
    // Handle authenticated messages
  });
});
```

## Error Handling

Handle WebSocket errors properly:

```typescript
app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('error', (error) => {
    request.log.error({ err: error }, 'WebSocket error');
  });

  socket.on('message', async (raw) => {
    try {
      const message = JSON.parse(raw.toString());
      const result = await processMessage(message);
      socket.send(JSON.stringify({ success: true, result }));
    } catch (error) {
      request.log.error({ err: error }, 'Message processing error');
      socket.send(JSON.stringify({
        success: false,
        error: error.message,
      }));
    }
  });
});
```

## Rate Limiting WebSocket Messages

Limit message frequency:

```typescript
const rateLimits = new Map<WebSocket, { count: number; resetAt: number }>();

function checkRateLimit(socket: WebSocket, limit: number, window: number): boolean {
  const now = Date.now();
  let state = rateLimits.get(socket);

  if (!state || now > state.resetAt) {
    state = { count: 0, resetAt: now + window };
    rateLimits.set(socket, state);
  }

  state.count++;

  if (state.count > limit) {
    return false;
  }

  return true;
}

app.get('/ws', { websocket: true }, (socket, request) => {
  socket.on('message', (message) => {
    if (!checkRateLimit(socket, 100, 60000)) {
      socket.send(JSON.stringify({ error: 'Rate limit exceeded' }));
      return;
    }

    // Process message
  });

  socket.on('close', () => {
    rateLimits.delete(socket);
  });
});
```

## Graceful Shutdown

Close WebSocket connections on shutdown:

```typescript
import closeWithGrace from 'close-with-grace';

const connections = new Set<WebSocket>();

app.get('/ws', { websocket: true }, (socket, request) => {
  connections.add(socket);

  socket.on('close', () => {
    connections.delete(socket);
  });
});

closeWithGrace({ delay: 5000 }, async ({ signal }) => {
  // Notify clients
  for (const socket of connections) {
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ type: 'shutdown', message: 'Server is shutting down' }));
      socket.close(1001, 'Server shutdown');
    }
  }

  await app.close();
});
```

## Full-Duplex Stream Pattern

Use WebSocket for streaming data:

```typescript
app.get('/ws/stream', { websocket: true }, async (socket, request) => {
  const stream = createDataStream();

  stream.on('data', (data) => {
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ type: 'data', payload: data }));
    }
  });

  stream.on('end', () => {
    socket.send(JSON.stringify({ type: 'end' }));
    socket.close();
  });

  socket.on('message', (message) => {
    const { type, payload } = JSON.parse(message.toString());

    if (type === 'pause') {
      stream.pause();
    } else if (type === 'resume') {
      stream.resume();
    }
  });

  socket.on('close', () => {
    stream.destroy();
  });
});
```
11
.claude/skills/fastify-best-practices/tile.json
Normal file

@ -0,0 +1,11 @@
{
  "name": "mcollina/fastify-best-practices",
  "version": "0.1.0",
  "private": false,
  "summary": "Guides development of Fastify Node.js backend servers and REST APIs using TypeScript or JavaScript. Use when building, configuring, or debugging a Fastify application — including defining routes, implementing plugins, setting up JSON Schema validation, handling errors, optimising performance, managing authentication, configuring CORS and security headers, integrating databases, working with WebSockets, and deploying to production. Covers the full Fastify request lifecycle (hooks, serialization, logging with Pino) and TypeScript integration via strip types. Trigger terms: Fastify, Node.js server, REST API, API routes, backend framework, fastify.config, server.ts, app.ts.",
  "skills": {
    "fastify-best-practices": {
      "path": "SKILL.md"
    }
  }
}
244
.claude/skills/fastify-typescript/SKILL.md
Normal file

@ -0,0 +1,244 @@
---
name: fastify-typescript
description: Guidelines for building high-performance APIs with Fastify and TypeScript, covering validation, Prisma integration, and testing best practices
---

# Fastify TypeScript Development

You are an expert in Fastify and TypeScript development with deep knowledge of building high-performance, type-safe APIs.

## TypeScript General Guidelines

### Basic Principles

- Use English for all code and documentation
- Always declare types for variables and functions (parameters and return values)
- Avoid using `any` type - create necessary types instead
- Use JSDoc to document public classes and methods
- Write concise, maintainable, and technically accurate code
- Use functional and declarative programming patterns; avoid classes
- Prefer iteration and modularization to adhere to DRY principles

### Nomenclature

- Use PascalCase for types and interfaces
- Use camelCase for variables, functions, and methods
- Use kebab-case for file and directory names
- Use UPPERCASE for environment variables
- Use descriptive variable names with auxiliary verbs: `isLoading`, `hasError`, `canDelete`
- Start each function with a verb

### Functions

- Write short functions with a single purpose
- Use arrow functions for simple operations
- Use async/await consistently throughout the codebase
- Use the RO-RO pattern (Receive an Object, Return an Object) for multiple parameters
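The RO-RO bullet above can be sketched as follows. This is an illustrative example, not code from the project; the `findUsers` name and its fields are invented for the sketch:

```typescript
// RO-RO: receive one options object, return one named result object.
// Callers pass named fields, so call sites stay readable as options grow.
interface FindUsersOptions {
  readonly role: 'admin' | 'user';
  readonly limit?: number;
}

interface FindUsersResult {
  names: string[];
  total: number;
}

function findUsers({ role, limit = 10 }: FindUsersOptions): FindUsersResult {
  const all = [
    { name: 'Ada', role: 'admin' },
    { name: 'Linus', role: 'user' },
  ];
  const names = all
    .filter((u) => u.role === role)
    .slice(0, limit)
    .map((u) => u.name);
  return { names, total: names.length };
}

console.log(findUsers({ role: 'admin' })); // named result object
```

Adding a new option later (say, `sort`) changes no existing call sites, which is the main payoff of the pattern.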
### Types and Interfaces

- Prefer interfaces over types for object shapes
- Avoid enums; use maps or const objects instead
- Use Zod for runtime validation with inferred types
- Use `readonly` for immutable properties
- Use `import type` for type-only imports
## Fastify-Specific Guidelines

### Project Structure

```
src/
  routes/
    {resource}/
      index.ts
      handlers.ts
      schemas.ts
  plugins/
    auth.ts
    database.ts
    cors.ts
  services/
    {domain}Service.ts
  repositories/
    {entity}Repository.ts
  types/
    index.ts
  utils/
  config/
  app.ts
  server.ts
```

### Route Organization

- Organize routes by resource/domain
- Use route plugins for modular registration
- Define schemas alongside route handlers
- Use route prefixes for API versioning

```typescript
import { FastifyPluginAsync } from 'fastify';

const usersRoutes: FastifyPluginAsync = async (fastify) => {
  fastify.get('/', { schema: listUsersSchema }, listUsersHandler);
  fastify.get('/:id', { schema: getUserSchema }, getUserHandler);
  fastify.post('/', { schema: createUserSchema }, createUserHandler);
  fastify.put('/:id', { schema: updateUserSchema }, updateUserHandler);
  fastify.delete('/:id', { schema: deleteUserSchema }, deleteUserHandler);
};

export default usersRoutes;
```

### Schema Validation with JSON Schema / Ajv

- Define JSON schemas for all request/response validation
- Use @sinclair/typebox for type-safe schema definitions
- Leverage Fastify's built-in Ajv integration

```typescript
import { Type, Static } from '@sinclair/typebox';

const UserSchema = Type.Object({
  id: Type.String({ format: 'uuid' }),
  name: Type.String({ minLength: 1 }),
  email: Type.String({ format: 'email' }),
  createdAt: Type.String({ format: 'date-time' }),
});

type User = Static<typeof UserSchema>;

const createUserSchema = {
  body: Type.Object({
    name: Type.String({ minLength: 1 }),
    email: Type.String({ format: 'email' }),
  }),
  response: {
    201: UserSchema,
    400: ErrorSchema,
  },
};
```

### Plugins and Decorators

- Use plugins for shared functionality
- Decorate Fastify instance with services and utilities
- Register plugins with proper encapsulation

```typescript
import fp from 'fastify-plugin';

const databasePlugin = fp(async (fastify) => {
  const prisma = new PrismaClient();

  await prisma.$connect();

  fastify.decorate('prisma', prisma);

  fastify.addHook('onClose', async () => {
    await prisma.$disconnect();
  });
});

export default databasePlugin;
```

### Prisma Integration

- Use Prisma as the ORM for database operations
- Create repository classes for data access
- Use transactions for complex operations

```typescript
class UserRepository {
  constructor(private prisma: PrismaClient) {}

  async findById(id: string): Promise<User | null> {
    return this.prisma.user.findUnique({ where: { id } });
  }

  async create(data: CreateUserInput): Promise<User> {
    return this.prisma.user.create({ data });
  }
}
```

### Error Handling

- Use Fastify's built-in error handling
- Create custom error classes for domain errors
- Return consistent error responses

```typescript
import { FastifyError } from 'fastify';

class NotFoundError extends Error implements FastifyError {
  code = 'NOT_FOUND';
  statusCode = 404;

  constructor(resource: string, id: string) {
    super(`${resource} with id ${id} not found`);
    this.name = 'NotFoundError';
  }
}

// Global error handler
fastify.setErrorHandler((error, request, reply) => {
  const statusCode = error.statusCode || 500;

  reply.status(statusCode).send({
    error: error.name,
    message: error.message,
    statusCode,
  });
});
```

### Testing with Jest

- Write unit tests for services and handlers
- Use integration tests for routes
- Mock external dependencies

```typescript
import { build } from '../app';

describe('Users API', () => {
  let app: FastifyInstance;

  beforeAll(async () => {
    app = await build();
  });

  afterAll(async () => {
    await app.close();
  });

  it('should list users', async () => {
    const response = await app.inject({
      method: 'GET',
      url: '/api/users',
    });

    expect(response.statusCode).toBe(200);
    expect(JSON.parse(response.payload)).toBeInstanceOf(Array);
  });
});
```

### Performance

- Fastify is one of the fastest Node.js frameworks
- Use schema validation for automatic serialization optimization
- Enable logging only when needed in production
- Use connection pooling for database connections
### Security

- Use @fastify/helmet for security headers
- Implement rate limiting with @fastify/rate-limit
- Use @fastify/cors for CORS configuration
- Validate all inputs with JSON Schema
- Use JWT for authentication with @fastify/jwt
```
46
.dockerignore
Normal file

@ -0,0 +1,46 @@
# Dependencies (reinstalled inside container)
node_modules/

# Build output (rebuilt inside container)
dist/

# Version control
.git/

# GSD agent artifacts
.gsd/

# Runtime data (mounted as volumes)
data/
media/

# Environment files (secrets — not baked into image)
.env
.env.*
!.env.example

# Test coverage
coverage/

# Logs
*.log

# Temp / cache
tmp/
.cache/

# IDE
.idea/
.vscode/
*.code-workspace

# OS
.DS_Store
Thumbs.db

# Keep these (needed in build):
# - drizzle/ (migration SQL files, copied into runtime image)
# - .env.example (re-included above via the ! negation)
# - package.json (dependency manifest)
# - package-lock.json (lockfile for deterministic installs)
# - src/ (compiled during build stage)
17
.env.example
Normal file

@ -0,0 +1,17 @@
# Tubearr Environment Configuration
|
||||
# Copy this file to .env and customize as needed
|
||||
|
||||
# Server port (default: 8989)
|
||||
TUBEARR_PORT=8989
|
||||
|
||||
# Database file path (default: ./data/tubearr.db)
|
||||
TUBEARR_DB_PATH=./data/tubearr.db
|
||||
|
||||
# Log level: trace, debug, info, warn, error, fatal (default: info)
|
||||
TUBEARR_LOG_LEVEL=info
|
||||
|
||||
# API key for authentication (optional — auto-generated on first run if not set)
|
||||
# TUBEARR_API_KEY=
|
||||
|
||||
# Node environment
|
||||
NODE_ENV=development
|
||||
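Every variable above has a documented fallback, so the app should boot from an empty environment. A minimal sketch of that resolution logic (the variable names and defaults come from the file above; the `load_config` helper itself is hypothetical, not the app's actual loader in src/):

```python
import os

def load_config(env=os.environ):
    """Resolve Tubearr settings from the environment, falling back to the
    defaults documented in .env.example. Sketch only, not the real loader."""
    return {
        "port": int(env.get("TUBEARR_PORT", "8989")),
        "db_path": env.get("TUBEARR_DB_PATH", "./data/tubearr.db"),
        "log_level": env.get("TUBEARR_LOG_LEVEL", "info"),
        # None signals "unset"; the app auto-generates a key on first run.
        "api_key": env.get("TUBEARR_API_KEY"),
    }

cfg = load_config(env={})
print(cfg["port"])  # 8989, since nothing overrides the default
```

Passing `env` explicitly keeps the helper testable without mutating the real process environment.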
53 .gitignore vendored Normal file
@@ -0,0 +1,53 @@
# Dependencies
node_modules/

# Build output
dist/

# Environment variables
.env
.env.*
!.env.example

# Database
data/

# Drizzle migrations: keep everything under drizzle/.
# The SQL files are the source of truth, and the generated meta/ directory
# is kept as well because it tracks migration state.

# IDE
.idea/
.vscode/
*.code-workspace

# OS
.DS_Store
Thumbs.db

# Logs
*.log

# Test coverage
coverage/

# Temp
tmp/
.cache/

# GSD
.gsd/

# ── GSD baseline (auto-generated) ──
.gsd
*.swp
*.swo
*~
.next/
build/
__pycache__/
*.pyc
.venv/
venv/
target/
vendor/
66 Dockerfile Normal file
@@ -0,0 +1,66 @@
# ============================================================
# Tubearr — Multi-stage Docker build
# ============================================================
# Stage 1: Install all dependencies (including devDependencies)
# Stage 2: Compile TypeScript backend + Vite frontend
# Stage 3: Slim Alpine runtime with Node 22, yt-dlp, ffmpeg
# ============================================================

# ── Stage 1: Dependencies ──────────────────────────────────
FROM node:22-alpine AS deps

WORKDIR /app

COPY package.json package-lock.json ./
RUN npm ci

# ── Stage 2: Build ─────────────────────────────────────────
FROM deps AS build

# Copy source and config files needed for compilation
COPY src/ ./src/
COPY tsconfig.json ./
COPY drizzle/ ./drizzle/

# Compile TypeScript backend (outputs to dist/)
RUN npm run build

# Build Vite frontend SPA (outputs to dist/frontend/)
RUN npm run build:frontend

# ── Stage 3: Runtime ───────────────────────────────────────
FROM node:22-alpine AS runtime

# Install yt-dlp and ffmpeg — the core download/transcode tools
RUN apk add --no-cache python3 py3-pip ffmpeg \
    && pip install --no-cache-dir --break-system-packages yt-dlp

WORKDIR /app

# Copy only what the runtime needs from the build stage
COPY --from=build /app/dist/ ./dist/
COPY --from=build /app/drizzle/ ./drizzle/
COPY package.json package-lock.json ./

# Install production-only dependencies (no devDependencies)
RUN npm ci --omit=dev

# Add tsx for ESM-compatible execution (handles extensionless imports that
# tsc emits but Node's native ESM loader rejects).
# Installed locally alongside production deps so Node's --import can resolve it.
RUN npm install tsx

# Create default directories following *arr family conventions
# /config — DB, logs, cookies, settings (like Radarr/Sonarr /config)
# /media  — downloaded/organized media files
RUN mkdir -p /config /media

# Runtime environment defaults
ENV NODE_ENV=production
ENV TUBEARR_DB_PATH=/config/tubearr.db
ENV TUBEARR_MEDIA_PATH=/media
ENV TUBEARR_COOKIE_PATH=/config/cookies

EXPOSE 8989

CMD ["node", "--import", "tsx/esm", "dist/index.js"]
20 docker-compose.yml Normal file
@@ -0,0 +1,20 @@
services:
  tubearr:
    build:
      context: .
    container_name: tubearr
    ports:
      - "8989:8989"
    volumes:
      - ./config:/config
      - ./media:/media
    environment:
      - NODE_ENV=production
      - TUBEARR_PORT=8989
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://127.0.0.1:8989/ping"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 15s
    restart: unless-stopped
7 drizzle.config.ts Normal file
@@ -0,0 +1,7 @@
import { defineConfig } from 'drizzle-kit';

export default defineConfig({
  schema: './src/db/schema/*.ts',
  out: './drizzle',
  dialect: 'sqlite',
});
90 drizzle/0000_colossal_jubilee.sql Normal file
@@ -0,0 +1,90 @@
CREATE TABLE `content_items` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`creator_id` integer NOT NULL,
	`title` text NOT NULL,
	`platform_content_id` text NOT NULL,
	`url` text NOT NULL,
	`content_type` text NOT NULL,
	`duration` integer,
	`file_path` text,
	`file_size` integer,
	`format` text,
	`quality_metadata` text,
	`status` text DEFAULT 'monitored' NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL,
	FOREIGN KEY (`creator_id`) REFERENCES `creators`(`id`) ON UPDATE no action ON DELETE cascade
);
--> statement-breakpoint
CREATE TABLE `format_profiles` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`name` text NOT NULL,
	`video_resolution` text,
	`audio_codec` text,
	`audio_bitrate` text,
	`container_format` text,
	`is_default` integer DEFAULT false NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL
);
--> statement-breakpoint
CREATE TABLE `creators` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`name` text NOT NULL,
	`platform` text NOT NULL,
	`platform_id` text NOT NULL,
	`url` text NOT NULL,
	`monitoring_enabled` integer DEFAULT true NOT NULL,
	`check_interval` integer DEFAULT 360 NOT NULL,
	`image_url` text,
	`metadata` text,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL
);
--> statement-breakpoint
CREATE TABLE `download_history` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`content_item_id` integer,
	`creator_id` integer,
	`event_type` text NOT NULL,
	`status` text NOT NULL,
	`details` text,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	FOREIGN KEY (`content_item_id`) REFERENCES `content_items`(`id`) ON UPDATE no action ON DELETE set null,
	FOREIGN KEY (`creator_id`) REFERENCES `creators`(`id`) ON UPDATE no action ON DELETE set null
);
--> statement-breakpoint
CREATE TABLE `notification_settings` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`type` text NOT NULL,
	`name` text NOT NULL,
	`enabled` integer DEFAULT true NOT NULL,
	`config` text NOT NULL,
	`on_grab` integer DEFAULT true NOT NULL,
	`on_download` integer DEFAULT true NOT NULL,
	`on_failure` integer DEFAULT true NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL
);
--> statement-breakpoint
CREATE TABLE `queue_items` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`content_item_id` integer NOT NULL,
	`status` text DEFAULT 'pending' NOT NULL,
	`priority` integer DEFAULT 0 NOT NULL,
	`attempts` integer DEFAULT 0 NOT NULL,
	`max_attempts` integer DEFAULT 3 NOT NULL,
	`error` text,
	`started_at` text,
	`completed_at` text,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL,
	FOREIGN KEY (`content_item_id`) REFERENCES `content_items`(`id`) ON UPDATE no action ON DELETE cascade
);
--> statement-breakpoint
CREATE TABLE `system_config` (
	`key` text PRIMARY KEY NOT NULL,
	`value` text NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL
);
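The boolean and audit columns above lean on SQLite's type coercion: `true`/`false` keyword defaults are stored as integers 1/0, and `(datetime('now'))` fills the timestamp columns on insert. A quick self-contained check with the stdlib driver (the DDL is trimmed to a few columns of `creators`; this is not the app's runner):

```python
import sqlite3

# Trimmed-down version of the `creators` DDL from migration 0000, just enough
# to exercise the integer-boolean and datetime('now') defaults.
ddl = """
CREATE TABLE creators (
    id integer PRIMARY KEY AUTOINCREMENT NOT NULL,
    name text NOT NULL,
    monitoring_enabled integer DEFAULT true NOT NULL,
    check_interval integer DEFAULT 360 NOT NULL,
    created_at text DEFAULT (datetime('now')) NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.execute("INSERT INTO creators (name) VALUES ('Example Channel')")
row = conn.execute(
    "SELECT monitoring_enabled, check_interval, created_at FROM creators"
).fetchone()
print(row)  # e.g. (1, 360, '2025-01-01 12:00:00') — `true` lands as integer 1
```

Note the `true`/`false` keywords require SQLite 3.23+, which any current Python build ships with.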
2 drizzle/0001_natural_toad_men.sql Normal file
@@ -0,0 +1,2 @@
ALTER TABLE `creators` ADD `last_checked_at` text;--> statement-breakpoint
ALTER TABLE `creators` ADD `last_check_status` text;
1 drizzle/0002_lonely_nico_minoru.sql Normal file
@@ -0,0 +1 @@
ALTER TABLE `creators` ADD `format_profile_id` integer REFERENCES format_profiles(id);
3 drizzle/0003_moaning_vertigo.sql Normal file
@@ -0,0 +1,3 @@
ALTER TABLE `content_items` ADD `thumbnail_url` text;--> statement-breakpoint
ALTER TABLE `format_profiles` ADD `subtitle_languages` text;--> statement-breakpoint
ALTER TABLE `format_profiles` ADD `embed_subtitles` integer DEFAULT false NOT NULL;
11 drizzle/0004_platform_settings.sql Normal file
@@ -0,0 +1,11 @@
CREATE TABLE `platform_settings` (
	`platform` text PRIMARY KEY NOT NULL,
	`default_format_profile_id` integer REFERENCES `format_profiles`(`id`) ON DELETE SET NULL,
	`check_interval` integer DEFAULT 360,
	`concurrency_limit` integer DEFAULT 2,
	`subtitle_languages` text,
	`grab_all_enabled` integer DEFAULT false NOT NULL,
	`grab_all_order` text DEFAULT 'newest' NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL
);
23 drizzle/0005_monitoring_playlists.sql Normal file
@@ -0,0 +1,23 @@
CREATE TABLE `content_playlist` (
	`content_item_id` integer NOT NULL,
	`playlist_id` integer NOT NULL,
	PRIMARY KEY(`content_item_id`, `playlist_id`),
	FOREIGN KEY (`content_item_id`) REFERENCES `content_items`(`id`) ON UPDATE no action ON DELETE cascade,
	FOREIGN KEY (`playlist_id`) REFERENCES `playlists`(`id`) ON UPDATE no action ON DELETE cascade
);
--> statement-breakpoint
CREATE TABLE `playlists` (
	`id` integer PRIMARY KEY AUTOINCREMENT NOT NULL,
	`creator_id` integer NOT NULL,
	`platform_playlist_id` text NOT NULL,
	`title` text NOT NULL,
	`position` integer DEFAULT 0 NOT NULL,
	`created_at` text DEFAULT (datetime('now')) NOT NULL,
	`updated_at` text DEFAULT (datetime('now')) NOT NULL,
	FOREIGN KEY (`creator_id`) REFERENCES `creators`(`id`) ON UPDATE no action ON DELETE cascade
);
--> statement-breakpoint
ALTER TABLE `content_items` ADD `published_at` text;--> statement-breakpoint
ALTER TABLE `content_items` ADD `downloaded_at` text;--> statement-breakpoint
ALTER TABLE `content_items` ADD `monitored` integer DEFAULT true NOT NULL;--> statement-breakpoint
ALTER TABLE `creators` ADD `monitoring_mode` text DEFAULT 'all' NOT NULL;
4 drizzle/0006_rename_creators_to_channels.sql Normal file
@@ -0,0 +1,4 @@
ALTER TABLE creators RENAME TO channels;--> statement-breakpoint
ALTER TABLE content_items RENAME COLUMN creator_id TO channel_id;--> statement-breakpoint
ALTER TABLE download_history RENAME COLUMN creator_id TO channel_id;--> statement-breakpoint
ALTER TABLE playlists RENAME COLUMN creator_id TO channel_id;
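This rename migration depends on `ALTER TABLE ... RENAME COLUMN`, which SQLite only supports from version 3.25.0. A minimal reproduction of the same two-step shape with the stdlib driver (tables trimmed to a couple of columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE creators (id integer PRIMARY KEY)")
conn.execute("CREATE TABLE content_items (id integer PRIMARY KEY, creator_id integer)")

# Same shape as migration 0006: rename the table, then the referencing column.
conn.execute("ALTER TABLE creators RENAME TO channels")
conn.execute("ALTER TABLE content_items RENAME COLUMN creator_id TO channel_id")

cols = [row[1] for row in conn.execute("PRAGMA table_info(content_items)")]
print(cols)  # ['id', 'channel_id']
```

Since renames rewrite only metadata, existing rows and foreign-key values survive unchanged.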
2 drizzle/0007_steep_the_watchers.sql Normal file
@@ -0,0 +1,2 @@
ALTER TABLE `platform_settings` ADD `scan_limit` integer DEFAULT 100;--> statement-breakpoint
ALTER TABLE `platform_settings` ADD `rate_limit_delay` integer DEFAULT 1000;
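At runtime these numbered files are applied in order, with each file split on the `--> statement-breakpoint` marker that Drizzle emits between statements. The core of that loop can be sketched as follows (simplified: Drizzle's real `migrate()` also consults the drizzle/meta/ journal to skip already-applied files, and the inline SQL here is a stand-in for reading the files from disk):

```python
import sqlite3

# The marker Drizzle emits between statements in each migration file.
BREAKPOINT = "--> statement-breakpoint"

# Stand-ins for the numbered files under drizzle/ (contents abbreviated).
migrations = [
    ("0004_platform_settings.sql",
     "CREATE TABLE platform_settings (platform text PRIMARY KEY NOT NULL);"),
    ("0007_steep_the_watchers.sql",
     "ALTER TABLE platform_settings ADD scan_limit integer DEFAULT 100;"
     + BREAKPOINT +
     "ALTER TABLE platform_settings ADD rate_limit_delay integer DEFAULT 1000;"),
]

conn = sqlite3.connect(":memory:")
for name, sql in sorted(migrations):      # lexicographic order == numeric order
    for stmt in sql.split(BREAKPOINT):    # one execute per statement
        conn.execute(stmt)

cols = [row[1] for row in conn.execute("PRAGMA table_info(platform_settings)")]
print(cols)  # ['platform', 'scan_limit', 'rate_limit_delay']
```

The zero-padded filename prefixes are what make plain lexicographic sorting equal to migration order.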
630 drizzle/meta/0000_snapshot.json Normal file
@@ -0,0 +1,630 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "ee333905-695c-4855-ae88-77eed7c0ac4d",
  "prevId": "00000000-0000-0000-0000-000000000000",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "creator_id": { "name": "creator_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "title": { "name": "title", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform_content_id": { "name": "platform_content_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "url": { "name": "url", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "content_type": { "name": "content_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "duration": { "name": "duration", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "file_path": { "name": "file_path", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "file_size": { "name": "file_size", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "format": { "name": "format", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "quality_metadata": { "name": "quality_metadata", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "status": { "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'monitored'" },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_creator_id_creators_id_fk": { "name": "content_items_creator_id_creators_id_fk", "tableFrom": "content_items", "tableTo": "creators", "columnsFrom": ["creator_id"], "columnsTo": ["id"], "onDelete": "cascade", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "video_resolution": { "name": "video_resolution", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "audio_codec": { "name": "audio_codec", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "audio_bitrate": { "name": "audio_bitrate", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "container_format": { "name": "container_format", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "is_default": { "name": "is_default", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "creators": {
      "name": "creators",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform": { "name": "platform", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform_id": { "name": "platform_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "url": { "name": "url", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "monitoring_enabled": { "name": "monitoring_enabled", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "check_interval": { "name": "check_interval", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 360 },
        "image_url": { "name": "image_url", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "metadata": { "name": "metadata", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "download_history": {
      "name": "download_history",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "content_item_id": { "name": "content_item_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "creator_id": { "name": "creator_id", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "event_type": { "name": "event_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "status": { "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "details": { "name": "details", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {
        "download_history_content_item_id_content_items_id_fk": { "name": "download_history_content_item_id_content_items_id_fk", "tableFrom": "download_history", "tableTo": "content_items", "columnsFrom": ["content_item_id"], "columnsTo": ["id"], "onDelete": "set null", "onUpdate": "no action" },
        "download_history_creator_id_creators_id_fk": { "name": "download_history_creator_id_creators_id_fk", "tableFrom": "download_history", "tableTo": "creators", "columnsFrom": ["creator_id"], "columnsTo": ["id"], "onDelete": "set null", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "notification_settings": {
      "name": "notification_settings",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "type": { "name": "type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "enabled": { "name": "enabled", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "config": { "name": "config", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "on_grab": { "name": "on_grab", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "on_download": { "name": "on_download", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "on_failure": { "name": "on_failure", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "queue_items": {
      "name": "queue_items",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "content_item_id": { "name": "content_item_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "status": { "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'pending'" },
        "priority": { "name": "priority", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "attempts": { "name": "attempts", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 0 },
        "max_attempts": { "name": "max_attempts", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": 3 },
        "error": { "name": "error", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "started_at": { "name": "started_at", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "completed_at": { "name": "completed_at", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {
        "queue_items_content_item_id_content_items_id_fk": { "name": "queue_items_content_item_id_content_items_id_fk", "tableFrom": "queue_items", "tableTo": "content_items", "columnsFrom": ["content_item_id"], "columnsTo": ["id"], "onDelete": "cascade", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "system_config": {
      "name": "system_config",
      "columns": {
        "key": { "name": "key", "type": "text", "primaryKey": true, "notNull": true, "autoincrement": false },
        "value": { "name": "value", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    }
  },
  "views": {},
  "enums": {},
  "_meta": { "schemas": {}, "tables": {}, "columns": {} },
  "internal": { "indexes": {} }
}
644 drizzle/meta/0001_snapshot.json Normal file
@@ -0,0 +1,644 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "f9f9642b-8498-4158-aebb-814eb2a363d0",
  "prevId": "ee333905-695c-4855-ae88-77eed7c0ac4d",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "creator_id": { "name": "creator_id", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false },
        "title": { "name": "title", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform_content_id": { "name": "platform_content_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "url": { "name": "url", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "content_type": { "name": "content_type", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "duration": { "name": "duration", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "file_path": { "name": "file_path", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "file_size": { "name": "file_size", "type": "integer", "primaryKey": false, "notNull": false, "autoincrement": false },
        "format": { "name": "format", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "quality_metadata": { "name": "quality_metadata", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "status": { "name": "status", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "'monitored'" },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_creator_id_creators_id_fk": { "name": "content_items_creator_id_creators_id_fk", "tableFrom": "content_items", "tableTo": "creators", "columnsFrom": ["creator_id"], "columnsTo": ["id"], "onDelete": "cascade", "onUpdate": "no action" }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "video_resolution": { "name": "video_resolution", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "audio_codec": { "name": "audio_codec", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "audio_bitrate": { "name": "audio_bitrate", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "container_format": { "name": "container_format", "type": "text", "primaryKey": false, "notNull": false, "autoincrement": false },
        "is_default": { "name": "is_default", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": false },
        "created_at": { "name": "created_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" },
        "updated_at": { "name": "updated_at", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false, "default": "(datetime('now'))" }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "creators": {
      "name": "creators",
      "columns": {
        "id": { "name": "id", "type": "integer", "primaryKey": true, "notNull": true, "autoincrement": true },
        "name": { "name": "name", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform": { "name": "platform", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "platform_id": { "name": "platform_id", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "url": { "name": "url", "type": "text", "primaryKey": false, "notNull": true, "autoincrement": false },
        "monitoring_enabled": { "name": "monitoring_enabled", "type": "integer", "primaryKey": false, "notNull": true, "autoincrement": false, "default": true },
        "check_interval": { "name": "check_interval", "type": "integer", "primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
|
||||
"image_url": {
|
||||
"name": "image_url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"metadata": {
|
||||
"name": "metadata",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"last_checked_at": {
|
||||
"name": "last_checked_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"last_check_status": {
|
||||
"name": "last_check_status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"download_history": {
|
||||
"name": "download_history",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"creator_id": {
|
||||
"name": "creator_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"event_type": {
|
||||
"name": "event_type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"details": {
|
||||
"name": "details",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"download_history_content_item_id_content_items_id_fk": {
|
||||
"name": "download_history_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
},
|
||||
"download_history_creator_id_creators_id_fk": {
|
||||
"name": "download_history_creator_id_creators_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "creators",
|
||||
"columnsFrom": [
|
||||
"creator_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"notification_settings": {
|
||||
"name": "notification_settings",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"type": {
|
||||
"name": "type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"enabled": {
|
||||
"name": "enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"config": {
|
||||
"name": "config",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"on_grab": {
|
||||
"name": "on_grab",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_download": {
|
||||
"name": "on_download",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_failure": {
|
||||
"name": "on_failure",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"queue_items": {
|
||||
"name": "queue_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'pending'"
|
||||
},
|
||||
"priority": {
|
||||
"name": "priority",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"attempts": {
|
||||
"name": "attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"max_attempts": {
|
||||
"name": "max_attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 3
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"queue_items_content_item_id_content_items_id_fk": {
|
||||
"name": "queue_items_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "queue_items",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"system_config": {
|
||||
"name": "system_config",
|
||||
"columns": {
|
||||
"key": {
|
||||
"name": "key",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"value": {
|
||||
"name": "value",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {},
|
||||
"columns": {}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
||||
665 drizzle/meta/0002_snapshot.json Normal file
@@ -0,0 +1,665 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "9c1ae9c3-7499-4269-b4a2-21d5ec040367",
  "prevId": "f9f9642b-8498-4158-aebb-814eb2a363d0",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_content_id": {
          "name": "platform_content_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "content_type": {
          "name": "content_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "duration": {
          "name": "duration",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_path": {
          "name": "file_path",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_size": {
          "name": "file_size",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format": {
          "name": "format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "quality_metadata": {
          "name": "quality_metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'monitored'"
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_creator_id_creators_id_fk": {
          "name": "content_items_creator_id_creators_id_fk",
          "tableFrom": "content_items",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "video_resolution": {
          "name": "video_resolution",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_codec": {
          "name": "audio_codec",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_bitrate": {
          "name": "audio_bitrate",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "container_format": {
          "name": "container_format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "is_default": {
          "name": "is_default",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "creators": {
      "name": "creators",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_id": {
          "name": "platform_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "monitoring_enabled": {
          "name": "monitoring_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "check_interval": {
          "name": "check_interval",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 360
        },
        "image_url": {
          "name": "image_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "metadata": {
          "name": "metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format_profile_id": {
          "name": "format_profile_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "last_checked_at": {
          "name": "last_checked_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "last_check_status": {
          "name": "last_check_status",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        }
      },
      "indexes": {},
      "foreignKeys": {
        "creators_format_profile_id_format_profiles_id_fk": {
          "name": "creators_format_profile_id_format_profiles_id_fk",
          "tableFrom": "creators",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "download_history": {
      "name": "download_history",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "event_type": {
          "name": "event_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "details": {
          "name": "details",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "download_history_content_item_id_content_items_id_fk": {
          "name": "download_history_content_item_id_content_items_id_fk",
          "tableFrom": "download_history",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        },
        "download_history_creator_id_creators_id_fk": {
          "name": "download_history_creator_id_creators_id_fk",
          "tableFrom": "download_history",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "notification_settings": {
      "name": "notification_settings",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "type": {
          "name": "type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "enabled": {
          "name": "enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "config": {
          "name": "config",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "on_grab": {
          "name": "on_grab",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_download": {
          "name": "on_download",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_failure": {
          "name": "on_failure",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "queue_items": {
      "name": "queue_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'pending'"
        },
        "priority": {
          "name": "priority",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 0
        },
        "attempts": {
          "name": "attempts",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 0
        },
        "max_attempts": {
          "name": "max_attempts",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 3
        },
        "error": {
          "name": "error",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "started_at": {
          "name": "started_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "completed_at": {
          "name": "completed_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "queue_items_content_item_id_content_items_id_fk": {
          "name": "queue_items_content_item_id_content_items_id_fk",
          "tableFrom": "queue_items",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "system_config": {
      "name": "system_config",
      "columns": {
        "key": {
          "name": "key",
          "type": "text",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": false
        },
        "value": {
          "name": "value",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    }
  },
  "views": {},
  "enums": {},
  "_meta": {
    "schemas": {},
    "tables": {},
    "columns": {}
  },
  "internal": {
    "indexes": {}
  }
}
687 drizzle/meta/0003_snapshot.json Normal file
@@ -0,0 +1,687 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "389cb6a3-b0dc-4103-b786-3014023f5ed2",
  "prevId": "9c1ae9c3-7499-4269-b4a2-21d5ec040367",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_content_id": {
          "name": "platform_content_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "content_type": {
          "name": "content_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "duration": {
          "name": "duration",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_path": {
          "name": "file_path",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_size": {
          "name": "file_size",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format": {
          "name": "format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "quality_metadata": {
          "name": "quality_metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'monitored'"
        },
        "thumbnail_url": {
          "name": "thumbnail_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_creator_id_creators_id_fk": {
          "name": "content_items_creator_id_creators_id_fk",
          "tableFrom": "content_items",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "video_resolution": {
          "name": "video_resolution",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_codec": {
          "name": "audio_codec",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_bitrate": {
          "name": "audio_bitrate",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "container_format": {
          "name": "container_format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "is_default": {
          "name": "is_default",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "embed_subtitles": {
          "name": "embed_subtitles",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "creators": {
      "name": "creators",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform_id": {
|
||||
"name": "platform_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"url": {
|
||||
"name": "url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"monitoring_enabled": {
|
||||
"name": "monitoring_enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"check_interval": {
|
||||
"name": "check_interval",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
|
||||
"image_url": {
|
||||
"name": "image_url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"metadata": {
|
||||
"name": "metadata",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"format_profile_id": {
|
||||
"name": "format_profile_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"last_checked_at": {
|
||||
"name": "last_checked_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"last_check_status": {
|
||||
"name": "last_check_status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"creators_format_profile_id_format_profiles_id_fk": {
|
||||
"name": "creators_format_profile_id_format_profiles_id_fk",
|
||||
"tableFrom": "creators",
|
||||
"tableTo": "format_profiles",
|
||||
"columnsFrom": [
|
||||
"format_profile_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"download_history": {
|
||||
"name": "download_history",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"creator_id": {
|
||||
"name": "creator_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"event_type": {
|
||||
"name": "event_type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"details": {
|
||||
"name": "details",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"download_history_content_item_id_content_items_id_fk": {
|
||||
"name": "download_history_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
},
|
||||
"download_history_creator_id_creators_id_fk": {
|
||||
"name": "download_history_creator_id_creators_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "creators",
|
||||
"columnsFrom": [
|
||||
"creator_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"notification_settings": {
|
||||
"name": "notification_settings",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"type": {
|
||||
"name": "type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"enabled": {
|
||||
"name": "enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"config": {
|
||||
"name": "config",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"on_grab": {
|
||||
"name": "on_grab",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_download": {
|
||||
"name": "on_download",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_failure": {
|
||||
"name": "on_failure",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"queue_items": {
|
||||
"name": "queue_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'pending'"
|
||||
},
|
||||
"priority": {
|
||||
"name": "priority",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"attempts": {
|
||||
"name": "attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"max_attempts": {
|
||||
"name": "max_attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 3
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"queue_items_content_item_id_content_items_id_fk": {
|
||||
"name": "queue_items_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "queue_items",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"system_config": {
|
||||
"name": "system_config",
|
||||
"columns": {
|
||||
"key": {
|
||||
"name": "key",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"value": {
|
||||
"name": "value",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {},
|
||||
"columns": {}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
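These snapshots are generated by drizzle-kit from the project's TypeScript schema. As a hedged sketch, the `queue_items` table in the snapshot above would correspond to a Drizzle schema definition roughly like the following. Column names, types, defaults, and the cascade rule are taken from the snapshot JSON; the export names and the elided `contentItems` columns are assumptions, not the project's actual schema file.

```typescript
// Hypothetical reconstruction of queue_items from the snapshot above.
// Names and defaults come from the snapshot; export names are assumptions.
import { sql } from "drizzle-orm";
import { integer, sqliteTable, text } from "drizzle-orm/sqlite-core";

export const contentItems = sqliteTable("content_items", {
  id: integer("id").primaryKey({ autoIncrement: true }),
  // ...remaining columns elided; see the snapshot for the full list
});

export const queueItems = sqliteTable("queue_items", {
  id: integer("id").primaryKey({ autoIncrement: true }),
  contentItemId: integer("content_item_id")
    .notNull()
    .references(() => contentItems.id, { onDelete: "cascade" }), // "onDelete": "cascade" in the snapshot
  status: text("status").notNull().default("pending"),
  priority: integer("priority").notNull().default(0),
  attempts: integer("attempts").notNull().default(0),
  maxAttempts: integer("max_attempts").notNull().default(3),
  error: text("error"),
  startedAt: text("started_at"),
  completedAt: text("completed_at"),
  // Timestamps stored as TEXT with a SQLite datetime('now') default
  createdAt: text("created_at").notNull().default(sql`(datetime('now'))`),
  updatedAt: text("updated_at").notNull().default(sql`(datetime('now'))`),
});
```

Running `drizzle-kit generate` against a schema like this is what produces the snapshot and the paired SQL migration.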
780 drizzle/meta/0004_snapshot.json Normal file
@@ -0,0 +1,780 @@
{
|
||||
"version": "6",
|
||||
"dialect": "sqlite",
|
||||
"id": "f54a9d63-7807-4753-894f-6822f36471f2",
|
||||
"prevId": "389cb6a3-b0dc-4103-b786-3014023f5ed2",
|
||||
"tables": {
|
||||
"content_items": {
|
||||
"name": "content_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"creator_id": {
|
||||
"name": "creator_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"title": {
|
||||
"name": "title",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform_content_id": {
|
||||
"name": "platform_content_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"url": {
|
||||
"name": "url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"content_type": {
|
||||
"name": "content_type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"duration": {
|
||||
"name": "duration",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"file_path": {
|
||||
"name": "file_path",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"file_size": {
|
||||
"name": "file_size",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"format": {
|
||||
"name": "format",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"quality_metadata": {
|
||||
"name": "quality_metadata",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'monitored'"
|
||||
},
|
||||
"thumbnail_url": {
|
||||
"name": "thumbnail_url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"content_items_creator_id_creators_id_fk": {
|
||||
"name": "content_items_creator_id_creators_id_fk",
|
||||
"tableFrom": "content_items",
|
||||
"tableTo": "creators",
|
||||
"columnsFrom": [
|
||||
"creator_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"format_profiles": {
|
||||
"name": "format_profiles",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"video_resolution": {
|
||||
"name": "video_resolution",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"audio_codec": {
|
||||
"name": "audio_codec",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"audio_bitrate": {
|
||||
"name": "audio_bitrate",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"container_format": {
|
||||
"name": "container_format",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"is_default": {
|
||||
"name": "is_default",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": false
|
||||
},
|
||||
"subtitle_languages": {
|
||||
"name": "subtitle_languages",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"embed_subtitles": {
|
||||
"name": "embed_subtitles",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"creators": {
|
||||
"name": "creators",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform": {
|
||||
"name": "platform",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform_id": {
|
||||
"name": "platform_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"url": {
|
||||
"name": "url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"monitoring_enabled": {
|
||||
"name": "monitoring_enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"check_interval": {
|
||||
"name": "check_interval",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
|
||||
"image_url": {
|
||||
"name": "image_url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"metadata": {
|
||||
"name": "metadata",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"format_profile_id": {
|
||||
"name": "format_profile_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"last_checked_at": {
|
||||
"name": "last_checked_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"last_check_status": {
|
||||
"name": "last_check_status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"creators_format_profile_id_format_profiles_id_fk": {
|
||||
"name": "creators_format_profile_id_format_profiles_id_fk",
|
||||
"tableFrom": "creators",
|
||||
"tableTo": "format_profiles",
|
||||
"columnsFrom": [
|
||||
"format_profile_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"download_history": {
|
||||
"name": "download_history",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"creator_id": {
|
||||
"name": "creator_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"event_type": {
|
||||
"name": "event_type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"details": {
|
||||
"name": "details",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"download_history_content_item_id_content_items_id_fk": {
|
||||
"name": "download_history_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
},
|
||||
"download_history_creator_id_creators_id_fk": {
|
||||
"name": "download_history_creator_id_creators_id_fk",
|
||||
"tableFrom": "download_history",
|
||||
"tableTo": "creators",
|
||||
"columnsFrom": [
|
||||
"creator_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"notification_settings": {
|
||||
"name": "notification_settings",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"type": {
|
||||
"name": "type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"enabled": {
|
||||
"name": "enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"config": {
|
||||
"name": "config",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"on_grab": {
|
||||
"name": "on_grab",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_download": {
|
||||
"name": "on_download",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_failure": {
|
||||
"name": "on_failure",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"queue_items": {
|
||||
"name": "queue_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'pending'"
|
||||
},
|
||||
"priority": {
|
||||
"name": "priority",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"attempts": {
|
||||
"name": "attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"max_attempts": {
|
||||
"name": "max_attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 3
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"queue_items_content_item_id_content_items_id_fk": {
|
||||
"name": "queue_items_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "queue_items",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"system_config": {
|
||||
"name": "system_config",
|
||||
"columns": {
|
||||
"key": {
|
||||
"name": "key",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"value": {
|
||||
"name": "value",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"platform_settings": {
|
||||
"name": "platform_settings",
|
||||
"columns": {
|
||||
"platform": {
|
||||
"name": "platform",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"default_format_profile_id": {
|
||||
"name": "default_format_profile_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"check_interval": {
|
||||
"name": "check_interval",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
        "concurrency_limit": {
          "name": "concurrency_limit",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 2
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "grab_all_enabled": {
          "name": "grab_all_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "grab_all_order": {
          "name": "grab_all_order",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'newest'"
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "platform_settings_default_format_profile_id_format_profiles_id_fk": {
          "name": "platform_settings_default_format_profile_id_format_profiles_id_fk",
          "tableFrom": "platform_settings",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "default_format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    }
  },
  "views": {},
  "enums": {},
  "_meta": {
    "schemas": {},
    "tables": {},
    "columns": {}
  },
  "internal": {
    "indexes": {}
  }
}
945 drizzle/meta/0005_snapshot.json Normal file

@@ -0,0 +1,945 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "8270642c-9deb-4bba-b798-bb3739dbe29f",
  "prevId": "f54a9d63-7807-4753-894f-6822f36471f2",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_content_id": {
          "name": "platform_content_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "content_type": {
          "name": "content_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "duration": {
          "name": "duration",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_path": {
          "name": "file_path",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_size": {
          "name": "file_size",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format": {
          "name": "format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "quality_metadata": {
          "name": "quality_metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'monitored'"
        },
        "thumbnail_url": {
          "name": "thumbnail_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "published_at": {
          "name": "published_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "downloaded_at": {
          "name": "downloaded_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "monitored": {
          "name": "monitored",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_creator_id_creators_id_fk": {
          "name": "content_items_creator_id_creators_id_fk",
          "tableFrom": "content_items",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "video_resolution": {
          "name": "video_resolution",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_codec": {
          "name": "audio_codec",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_bitrate": {
          "name": "audio_bitrate",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "container_format": {
          "name": "container_format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "is_default": {
          "name": "is_default",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "embed_subtitles": {
          "name": "embed_subtitles",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "creators": {
      "name": "creators",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_id": {
          "name": "platform_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "monitoring_enabled": {
          "name": "monitoring_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "check_interval": {
          "name": "check_interval",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 360
        },
        "image_url": {
          "name": "image_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "metadata": {
          "name": "metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format_profile_id": {
          "name": "format_profile_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "last_checked_at": {
          "name": "last_checked_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "last_check_status": {
          "name": "last_check_status",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "monitoring_mode": {
          "name": "monitoring_mode",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'all'"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "creators_format_profile_id_format_profiles_id_fk": {
          "name": "creators_format_profile_id_format_profiles_id_fk",
          "tableFrom": "creators",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "download_history": {
      "name": "download_history",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "event_type": {
          "name": "event_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "details": {
          "name": "details",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "download_history_content_item_id_content_items_id_fk": {
          "name": "download_history_content_item_id_content_items_id_fk",
          "tableFrom": "download_history",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        },
        "download_history_creator_id_creators_id_fk": {
          "name": "download_history_creator_id_creators_id_fk",
          "tableFrom": "download_history",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "content_playlist": {
      "name": "content_playlist",
      "columns": {
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "playlist_id": {
          "name": "playlist_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_playlist_content_item_id_content_items_id_fk": {
          "name": "content_playlist_content_item_id_content_items_id_fk",
          "tableFrom": "content_playlist",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        },
        "content_playlist_playlist_id_playlists_id_fk": {
          "name": "content_playlist_playlist_id_playlists_id_fk",
          "tableFrom": "content_playlist",
          "tableTo": "playlists",
          "columnsFrom": [
            "playlist_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {
        "content_playlist_content_item_id_playlist_id_pk": {
          "columns": [
            "content_item_id",
            "playlist_id"
          ],
          "name": "content_playlist_content_item_id_playlist_id_pk"
        }
      },
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "notification_settings": {
      "name": "notification_settings",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "type": {
          "name": "type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "enabled": {
          "name": "enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "config": {
          "name": "config",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "on_grab": {
          "name": "on_grab",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_download": {
          "name": "on_download",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_failure": {
          "name": "on_failure",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "platform_settings": {
      "name": "platform_settings",
      "columns": {
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": false
        },
        "default_format_profile_id": {
          "name": "default_format_profile_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "check_interval": {
          "name": "check_interval",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 360
        },
        "concurrency_limit": {
          "name": "concurrency_limit",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 2
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "grab_all_enabled": {
          "name": "grab_all_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "grab_all_order": {
          "name": "grab_all_order",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'newest'"
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "platform_settings_default_format_profile_id_format_profiles_id_fk": {
          "name": "platform_settings_default_format_profile_id_format_profiles_id_fk",
          "tableFrom": "platform_settings",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "default_format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "playlists": {
      "name": "playlists",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "creator_id": {
          "name": "creator_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_playlist_id": {
          "name": "platform_playlist_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "position": {
          "name": "position",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 0
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "playlists_creator_id_creators_id_fk": {
          "name": "playlists_creator_id_creators_id_fk",
          "tableFrom": "playlists",
          "tableTo": "creators",
          "columnsFrom": [
            "creator_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "queue_items": {
      "name": "queue_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'pending'"
        },
        "priority": {
          "name": "priority",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 0
        },
        "attempts": {
          "name": "attempts",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 0
        },
        "max_attempts": {
          "name": "max_attempts",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 3
        },
        "error": {
          "name": "error",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "started_at": {
          "name": "started_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "completed_at": {
          "name": "completed_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "queue_items_content_item_id_content_items_id_fk": {
          "name": "queue_items_content_item_id_content_items_id_fk",
          "tableFrom": "queue_items",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "system_config": {
      "name": "system_config",
      "columns": {
        "key": {
          "name": "key",
          "type": "text",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": false
        },
        "value": {
          "name": "value",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    }
  },
  "views": {},
  "enums": {},
  "_meta": {
    "schemas": {},
    "tables": {},
    "columns": {}
  },
  "internal": {
    "indexes": {}
  }
}
951 drizzle/meta/0006_snapshot.json Normal file

@@ -0,0 +1,951 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "prevId": "8270642c-9deb-4bba-b798-bb3739dbe29f",
  "tables": {
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_content_id": {
          "name": "platform_content_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "content_type": {
          "name": "content_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "duration": {
          "name": "duration",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_path": {
          "name": "file_path",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_size": {
          "name": "file_size",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format": {
          "name": "format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "quality_metadata": {
          "name": "quality_metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'monitored'"
        },
        "thumbnail_url": {
          "name": "thumbnail_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "published_at": {
          "name": "published_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "downloaded_at": {
          "name": "downloaded_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "monitored": {
          "name": "monitored",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "channel_id": {
          "name": "channel_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_channel_id_channels_id_fk": {
          "name": "content_items_channel_id_channels_id_fk",
          "tableFrom": "content_items",
          "tableTo": "channels",
          "columnsFrom": [
            "channel_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "video_resolution": {
          "name": "video_resolution",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_codec": {
          "name": "audio_codec",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_bitrate": {
          "name": "audio_bitrate",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "container_format": {
          "name": "container_format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "is_default": {
          "name": "is_default",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "embed_subtitles": {
          "name": "embed_subtitles",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "download_history": {
      "name": "download_history",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "event_type": {
          "name": "event_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "details": {
          "name": "details",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "channel_id": {
          "name": "channel_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        }
      },
      "indexes": {},
      "foreignKeys": {
        "download_history_content_item_id_content_items_id_fk": {
          "name": "download_history_content_item_id_content_items_id_fk",
          "tableFrom": "download_history",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        },
        "download_history_channel_id_channels_id_fk": {
          "name": "download_history_channel_id_channels_id_fk",
          "tableFrom": "download_history",
          "tableTo": "channels",
          "columnsFrom": [
            "channel_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
"content_playlist": {
|
||||
"name": "content_playlist",
|
||||
"columns": {
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"playlist_id": {
|
||||
"name": "playlist_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"content_playlist_content_item_id_content_items_id_fk": {
|
||||
"name": "content_playlist_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "content_playlist",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
},
|
||||
"content_playlist_playlist_id_playlists_id_fk": {
|
||||
"name": "content_playlist_playlist_id_playlists_id_fk",
|
||||
"tableFrom": "content_playlist",
|
||||
"tableTo": "playlists",
|
||||
"columnsFrom": [
|
||||
"playlist_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {
|
||||
"content_playlist_content_item_id_playlist_id_pk": {
|
||||
"columns": [
|
||||
"content_item_id",
|
||||
"playlist_id"
|
||||
],
|
||||
"name": "content_playlist_content_item_id_playlist_id_pk"
|
||||
}
|
||||
},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"notification_settings": {
|
||||
"name": "notification_settings",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"type": {
|
||||
"name": "type",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"enabled": {
|
||||
"name": "enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"config": {
|
||||
"name": "config",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"on_grab": {
|
||||
"name": "on_grab",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_download": {
|
||||
"name": "on_download",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"on_failure": {
|
||||
"name": "on_failure",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"platform_settings": {
|
||||
"name": "platform_settings",
|
||||
"columns": {
|
||||
"platform": {
|
||||
"name": "platform",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"default_format_profile_id": {
|
||||
"name": "default_format_profile_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"check_interval": {
|
||||
"name": "check_interval",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
|
||||
"concurrency_limit": {
|
||||
"name": "concurrency_limit",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false,
|
||||
"default": 2
|
||||
},
|
||||
"subtitle_languages": {
|
||||
"name": "subtitle_languages",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"grab_all_enabled": {
|
||||
"name": "grab_all_enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": false
|
||||
},
|
||||
"grab_all_order": {
|
||||
"name": "grab_all_order",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'newest'"
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"platform_settings_default_format_profile_id_format_profiles_id_fk": {
|
||||
"name": "platform_settings_default_format_profile_id_format_profiles_id_fk",
|
||||
"tableFrom": "platform_settings",
|
||||
"tableTo": "format_profiles",
|
||||
"columnsFrom": [
|
||||
"default_format_profile_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"playlists": {
|
||||
"name": "playlists",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"platform_playlist_id": {
|
||||
"name": "platform_playlist_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"title": {
|
||||
"name": "title",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"position": {
|
||||
"name": "position",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"channel_id": {
|
||||
"name": "channel_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"playlists_channel_id_channels_id_fk": {
|
||||
"name": "playlists_channel_id_channels_id_fk",
|
||||
"tableFrom": "playlists",
|
||||
"tableTo": "channels",
|
||||
"columnsFrom": [
|
||||
"channel_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"queue_items": {
|
||||
"name": "queue_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'pending'"
|
||||
},
|
||||
"priority": {
|
||||
"name": "priority",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"attempts": {
|
||||
"name": "attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"max_attempts": {
|
||||
"name": "max_attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 3
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"queue_items_content_item_id_content_items_id_fk": {
|
||||
"name": "queue_items_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "queue_items",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"system_config": {
|
||||
"name": "system_config",
|
||||
"columns": {
|
||||
"key": {
|
||||
"name": "key",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"value": {
|
||||
"name": "value",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"channels": {
|
||||
"name": "channels",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"name": {
|
||||
"name": "name",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform": {
|
||||
"name": "platform",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"platform_id": {
|
||||
"name": "platform_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"url": {
|
||||
"name": "url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"monitoring_enabled": {
|
||||
"name": "monitoring_enabled",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": true
|
||||
},
|
||||
"check_interval": {
|
||||
"name": "check_interval",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 360
|
||||
},
|
||||
"image_url": {
|
||||
"name": "image_url",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"metadata": {
|
||||
"name": "metadata",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"format_profile_id": {
|
||||
"name": "format_profile_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"last_checked_at": {
|
||||
"name": "last_checked_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"last_check_status": {
|
||||
"name": "last_check_status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"monitoring_mode": {
|
||||
"name": "monitoring_mode",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'all'"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"channels_format_profile_id_format_profiles_id_fk": {
|
||||
"name": "channels_format_profile_id_format_profiles_id_fk",
|
||||
"tableFrom": "channels",
|
||||
"tableTo": "format_profiles",
|
||||
"columnsFrom": [
|
||||
"format_profile_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "set null",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {
|
||||
"\"creators\"": "\"channels\""
|
||||
},
|
||||
"columns": {
|
||||
"\"content_items\".\"creator_id\"": "\"content_items\".\"channel_id\"",
|
||||
"\"download_history\".\"creator_id\"": "\"download_history\".\"channel_id\"",
|
||||
"\"playlists\".\"creator_id\"": "\"playlists\".\"channel_id\""
|
||||
}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
961 drizzle/meta/0007_snapshot.json Normal file

@@ -0,0 +1,961 @@
{
  "version": "6",
  "dialect": "sqlite",
  "id": "24769c36-328d-4e54-8ba1-e74a2681bef5",
  "prevId": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "tables": {
    "channels": {
      "name": "channels",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_id": {
          "name": "platform_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "monitoring_enabled": {
          "name": "monitoring_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "check_interval": {
          "name": "check_interval",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": 360
        },
        "image_url": {
          "name": "image_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "metadata": {
          "name": "metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format_profile_id": {
          "name": "format_profile_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "last_checked_at": {
          "name": "last_checked_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "last_check_status": {
          "name": "last_check_status",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "monitoring_mode": {
          "name": "monitoring_mode",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'all'"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "channels_format_profile_id_format_profiles_id_fk": {
          "name": "channels_format_profile_id_format_profiles_id_fk",
          "tableFrom": "channels",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "content_items": {
      "name": "content_items",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "channel_id": {
          "name": "channel_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "title": {
          "name": "title",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "platform_content_id": {
          "name": "platform_content_id",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "url": {
          "name": "url",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "content_type": {
          "name": "content_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "duration": {
          "name": "duration",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_path": {
          "name": "file_path",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "file_size": {
          "name": "file_size",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "format": {
          "name": "format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "quality_metadata": {
          "name": "quality_metadata",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'monitored'"
        },
        "thumbnail_url": {
          "name": "thumbnail_url",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "published_at": {
          "name": "published_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "downloaded_at": {
          "name": "downloaded_at",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "monitored": {
          "name": "monitored",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_items_channel_id_channels_id_fk": {
          "name": "content_items_channel_id_channels_id_fk",
          "tableFrom": "content_items",
          "tableTo": "channels",
          "columnsFrom": [
            "channel_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "format_profiles": {
      "name": "format_profiles",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "video_resolution": {
          "name": "video_resolution",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_codec": {
          "name": "audio_codec",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "audio_bitrate": {
          "name": "audio_bitrate",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "container_format": {
          "name": "container_format",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "is_default": {
          "name": "is_default",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "embed_subtitles": {
          "name": "embed_subtitles",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "download_history": {
      "name": "download_history",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "channel_id": {
          "name": "channel_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "event_type": {
          "name": "event_type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "status": {
          "name": "status",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "details": {
          "name": "details",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "download_history_content_item_id_content_items_id_fk": {
          "name": "download_history_content_item_id_content_items_id_fk",
          "tableFrom": "download_history",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        },
        "download_history_channel_id_channels_id_fk": {
          "name": "download_history_channel_id_channels_id_fk",
          "tableFrom": "download_history",
          "tableTo": "channels",
          "columnsFrom": [
            "channel_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "content_playlist": {
      "name": "content_playlist",
      "columns": {
        "content_item_id": {
          "name": "content_item_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "playlist_id": {
          "name": "playlist_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        }
      },
      "indexes": {},
      "foreignKeys": {
        "content_playlist_content_item_id_content_items_id_fk": {
          "name": "content_playlist_content_item_id_content_items_id_fk",
          "tableFrom": "content_playlist",
          "tableTo": "content_items",
          "columnsFrom": [
            "content_item_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        },
        "content_playlist_playlist_id_playlists_id_fk": {
          "name": "content_playlist_playlist_id_playlists_id_fk",
          "tableFrom": "content_playlist",
          "tableTo": "playlists",
          "columnsFrom": [
            "playlist_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "cascade",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {
        "content_playlist_content_item_id_playlist_id_pk": {
          "columns": [
            "content_item_id",
            "playlist_id"
          ],
          "name": "content_playlist_content_item_id_playlist_id_pk"
        }
      },
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "notification_settings": {
      "name": "notification_settings",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "type": {
          "name": "type",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "name": {
          "name": "name",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "enabled": {
          "name": "enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "config": {
          "name": "config",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
        },
        "on_grab": {
          "name": "on_grab",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_download": {
          "name": "on_download",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "on_failure": {
          "name": "on_failure",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": true
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {},
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "platform_settings": {
      "name": "platform_settings",
      "columns": {
        "platform": {
          "name": "platform",
          "type": "text",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": false
        },
        "default_format_profile_id": {
          "name": "default_format_profile_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "check_interval": {
          "name": "check_interval",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 360
        },
        "concurrency_limit": {
          "name": "concurrency_limit",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 2
        },
        "subtitle_languages": {
          "name": "subtitle_languages",
          "type": "text",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false
        },
        "grab_all_enabled": {
          "name": "grab_all_enabled",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": false
        },
        "grab_all_order": {
          "name": "grab_all_order",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "'newest'"
        },
        "scan_limit": {
          "name": "scan_limit",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 100
        },
        "rate_limit_delay": {
          "name": "rate_limit_delay",
          "type": "integer",
          "primaryKey": false,
          "notNull": false,
          "autoincrement": false,
          "default": 1000
        },
        "created_at": {
          "name": "created_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        },
        "updated_at": {
          "name": "updated_at",
          "type": "text",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false,
          "default": "(datetime('now'))"
        }
      },
      "indexes": {},
      "foreignKeys": {
        "platform_settings_default_format_profile_id_format_profiles_id_fk": {
          "name": "platform_settings_default_format_profile_id_format_profiles_id_fk",
          "tableFrom": "platform_settings",
          "tableTo": "format_profiles",
          "columnsFrom": [
            "default_format_profile_id"
          ],
          "columnsTo": [
            "id"
          ],
          "onDelete": "set null",
          "onUpdate": "no action"
        }
      },
      "compositePrimaryKeys": {},
      "uniqueConstraints": {},
      "checkConstraints": {}
    },
    "playlists": {
      "name": "playlists",
      "columns": {
        "id": {
          "name": "id",
          "type": "integer",
          "primaryKey": true,
          "notNull": true,
          "autoincrement": true
        },
        "channel_id": {
          "name": "channel_id",
          "type": "integer",
          "primaryKey": false,
          "notNull": true,
          "autoincrement": false
},
|
||||
"platform_playlist_id": {
|
||||
"name": "platform_playlist_id",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"title": {
|
||||
"name": "title",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"position": {
|
||||
"name": "position",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"playlists_channel_id_channels_id_fk": {
|
||||
"name": "playlists_channel_id_channels_id_fk",
|
||||
"tableFrom": "playlists",
|
||||
"tableTo": "channels",
|
||||
"columnsFrom": [
|
||||
"channel_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"queue_items": {
|
||||
"name": "queue_items",
|
||||
"columns": {
|
||||
"id": {
|
||||
"name": "id",
|
||||
"type": "integer",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": true
|
||||
},
|
||||
"content_item_id": {
|
||||
"name": "content_item_id",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"status": {
|
||||
"name": "status",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "'pending'"
|
||||
},
|
||||
"priority": {
|
||||
"name": "priority",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"attempts": {
|
||||
"name": "attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 0
|
||||
},
|
||||
"max_attempts": {
|
||||
"name": "max_attempts",
|
||||
"type": "integer",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": 3
|
||||
},
|
||||
"error": {
|
||||
"name": "error",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"started_at": {
|
||||
"name": "started_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"completed_at": {
|
||||
"name": "completed_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": false,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {
|
||||
"queue_items_content_item_id_content_items_id_fk": {
|
||||
"name": "queue_items_content_item_id_content_items_id_fk",
|
||||
"tableFrom": "queue_items",
|
||||
"tableTo": "content_items",
|
||||
"columnsFrom": [
|
||||
"content_item_id"
|
||||
],
|
||||
"columnsTo": [
|
||||
"id"
|
||||
],
|
||||
"onDelete": "cascade",
|
||||
"onUpdate": "no action"
|
||||
}
|
||||
},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
},
|
||||
"system_config": {
|
||||
"name": "system_config",
|
||||
"columns": {
|
||||
"key": {
|
||||
"name": "key",
|
||||
"type": "text",
|
||||
"primaryKey": true,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"value": {
|
||||
"name": "value",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false
|
||||
},
|
||||
"created_at": {
|
||||
"name": "created_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
},
|
||||
"updated_at": {
|
||||
"name": "updated_at",
|
||||
"type": "text",
|
||||
"primaryKey": false,
|
||||
"notNull": true,
|
||||
"autoincrement": false,
|
||||
"default": "(datetime('now'))"
|
||||
}
|
||||
},
|
||||
"indexes": {},
|
||||
"foreignKeys": {},
|
||||
"compositePrimaryKeys": {},
|
||||
"uniqueConstraints": {},
|
||||
"checkConstraints": {}
|
||||
}
|
||||
},
|
||||
"views": {},
|
||||
"enums": {},
|
||||
"_meta": {
|
||||
"schemas": {},
|
||||
"tables": {},
|
||||
"columns": {}
|
||||
},
|
||||
"internal": {
|
||||
"indexes": {}
|
||||
}
|
||||
}
|
||||
62 drizzle/meta/_journal.json Normal file
@@ -0,0 +1,62 @@
{
  "version": "7",
  "dialect": "sqlite",
  "entries": [
    { "idx": 0, "version": "6", "when": 1774243438376, "tag": "0000_colossal_jubilee", "breakpoints": true },
    { "idx": 1, "version": "6", "when": 1774245174383, "tag": "0001_natural_toad_men", "breakpoints": true },
    { "idx": 2, "version": "6", "when": 1774246365287, "tag": "0002_lonely_nico_minoru", "breakpoints": true },
    { "idx": 3, "version": "6", "when": 1774310330436, "tag": "0003_moaning_vertigo", "breakpoints": true },
    { "idx": 4, "version": "6", "when": 1774312153698, "tag": "0004_platform_settings", "breakpoints": true },
    { "idx": 5, "version": "6", "when": 1774325204862, "tag": "0005_monitoring_playlists", "breakpoints": true },
    { "idx": 6, "version": "6", "when": 1774656000000, "tag": "0006_rename_creators_to_channels", "breakpoints": true },
    { "idx": 7, "version": "6", "when": 1774396066443, "tag": "0007_steep_the_watchers", "breakpoints": true }
  ]
}
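The journal above is Drizzle's record of generated migrations: `drizzle-kit generate` appends one entry per migration. As a minimal sketch of how a consumer might compute which entries remain to apply, assuming only the journal shape shown here — `pendingMigrations` and `appliedCount` are illustrative names, not Drizzle's actual migrator API, whose bookkeeping lives in the database:

```typescript
// Shape of one entry in drizzle/meta/_journal.json (as shown above).
interface JournalEntry {
  idx: number;
  version: string;
  when: number; // generation timestamp (ms); ordering is by idx
  tag: string; // migration file name without extension
  breakpoints: boolean;
}

// Hypothetical helper: given how many migrations are already applied,
// return the tags still pending, in application (idx) order.
function pendingMigrations(entries: JournalEntry[], appliedCount: number): string[] {
  return [...entries]
    .sort((a, b) => a.idx - b.idx)
    .slice(appliedCount)
    .map((e) => e.tag);
}

// Abbreviated copy of the journal above, for demonstration.
const journal: JournalEntry[] = [
  { idx: 0, version: '6', when: 1774243438376, tag: '0000_colossal_jubilee', breakpoints: true },
  { idx: 1, version: '6', when: 1774245174383, tag: '0001_natural_toad_men', breakpoints: true },
  { idx: 2, version: '6', when: 1774246365287, tag: '0002_lonely_nico_minoru', breakpoints: true },
];

console.log(pendingMigrations(journal, 1));
// → ['0001_natural_toad_men', '0002_lonely_nico_minoru']
```

In this repository the real apply step is `npm run db:migrate` (see package.json below); the sketch only illustrates the journal's role.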
5266 package-lock.json generated Normal file
File diff suppressed because it is too large
47 package.json Normal file
@@ -0,0 +1,47 @@
{
  "name": "tubearr",
  "version": "0.1.0",
  "description": "Self-hosted content archival and monitoring application in the *arr family style",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "tsx watch src/index.ts",
    "dev:frontend": "vite --config src/frontend/vite.config.ts",
    "build": "tsc",
    "build:frontend": "vite build --config src/frontend/vite.config.ts",
    "start": "node dist/index.js",
    "test": "vitest run",
    "db:generate": "drizzle-kit generate",
    "db:migrate": "tsx src/db/migrate.ts"
  },
  "engines": {
    "node": ">=18.0.0"
  },
  "dependencies": {
    "@fastify/cors": "^11.0.0",
    "@fastify/rate-limit": "^10.2.1",
    "@fastify/static": "^9.0.0",
    "@libsql/client": "^0.14.0",
    "@tanstack/react-query": "^5.95.0",
    "croner": "^10.0.1",
    "dotenv": "^16.4.7",
    "drizzle-orm": "^0.38.4",
    "fastify": "^5.2.1",
    "fastify-plugin": "^5.1.0",
    "lucide-react": "^0.577.0",
    "react": "^19.2.4",
    "react-dom": "^19.2.4",
    "react-router": "^7.13.1",
    "react-router-dom": "^7.13.1"
  },
  "devDependencies": {
    "@types/node": "^22.12.0",
    "@types/react": "^19.2.14",
    "@types/react-dom": "^19.2.3",
    "@vitejs/plugin-react": "^4.7.0",
    "drizzle-kit": "^0.30.4",
    "tsx": "^4.19.2",
    "typescript": "^5.7.3",
    "vitest": "^3.0.5"
  }
}
239 scripts/docker-smoke-test.sh Normal file
@@ -0,0 +1,239 @@
#!/usr/bin/env bash
# ============================================================
# Docker Smoke Test — Tubearr
#
# Builds the Docker image, starts a container, and verifies
# core endpoints work end-to-end. Tests restart persistence.
#
# Usage: bash scripts/docker-smoke-test.sh
# ============================================================

set -euo pipefail

# ── Configuration ──

IMAGE_NAME="tubearr"
# Container name must match docker-compose.yml container_name
CONTAINER_NAME="tubearr"
PORT=8989
HEALTH_TIMEOUT=90 # seconds to wait for healthy status
COMPOSE_FILE="docker-compose.yml"

# ── Color output helpers ──

RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

pass() { echo -e "${GREEN}✓ $1${NC}"; }
fail() { echo -e "${RED}✗ $1${NC}"; }
info() { echo -e "${YELLOW}→ $1${NC}"; }

# ── Cleanup trap ──

cleanup() {
  info "Cleaning up..."
  docker-compose -f "$COMPOSE_FILE" down --volumes --remove-orphans 2>/dev/null || true
  # Remove any leftover config/media dirs created by compose bind mounts
  rm -rf ./config ./media 2>/dev/null || true
}
trap cleanup EXIT

# ── Pre-check: Ensure port is not already in use ──

info "Checking port $PORT availability"
if curl -sf "http://localhost:${PORT}/ping" >/dev/null 2>&1; then
  fail "Port $PORT is already in use — another service is running. Stop it before running this test."
  exit 1
fi
pass "Port $PORT is available"

# ── Step 1: Build Docker image ──

info "Building Docker image: $IMAGE_NAME"
docker build -t "$IMAGE_NAME" . || {
  fail "Docker build failed"
  exit 1
}
pass "Docker image built successfully"

# ── Step 2: Start container via docker-compose ──

info "Starting container via docker-compose"
docker-compose -f "$COMPOSE_FILE" up -d || {
  fail "docker-compose up failed"
  exit 1
}
pass "Container started"

# ── Step 3: Wait for healthy ──

info "Waiting for container to become healthy (timeout: ${HEALTH_TIMEOUT}s)"
elapsed=0
while [ $elapsed -lt $HEALTH_TIMEOUT ]; do
  status=$(docker inspect --format='{{.State.Health.Status}}' "$CONTAINER_NAME" 2>/dev/null || echo "unknown")
  if [ "$status" = "healthy" ]; then
    break
  fi
  if [ "$status" = "unhealthy" ]; then
    fail "Container became unhealthy"
    echo "Container logs:"
    docker logs "$CONTAINER_NAME" 2>&1 | tail -30
    exit 1
  fi
  sleep 2
  elapsed=$((elapsed + 2))
done

if [ "$status" != "healthy" ]; then
  fail "Container did not become healthy within ${HEALTH_TIMEOUT}s (status: $status)"
  echo "Container logs:"
  docker logs "$CONTAINER_NAME" 2>&1 | tail -30
  exit 1
fi
pass "Container is healthy"

# ── Step 4: Verify /ping (unauthenticated) ──

info "Testing GET /ping"
PING_RESPONSE=$(curl -sf "http://localhost:${PORT}/ping" 2>&1) || {
  fail "GET /ping failed"
  exit 1
}

if echo "$PING_RESPONSE" | grep -q '"status":"ok"'; then
  pass "GET /ping returns {\"status\":\"ok\"}"
else
  fail "GET /ping unexpected response: $PING_RESPONSE"
  exit 1
fi

# ── Step 5: Extract API key from container logs ──

info "Extracting API key from container logs"
# The auth plugin logs the generated key in a banner like:
#   API Key generated (save this — it will not be shown again):
#   <uuid>
# We look for a UUID-like string on a line by itself after the banner text.
# On restart with persisted state, the key won't be in logs — use TUBEARR_API_KEY env var as fallback.
API_KEY=$(docker logs "$CONTAINER_NAME" 2>&1 | grep -oE '[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}' | head -1 || true)

if [ -z "$API_KEY" ]; then
  fail "Could not extract API key from container logs"
  echo "Container logs:"
  docker logs "$CONTAINER_NAME" 2>&1 | tail -20
  exit 1
fi
pass "API key extracted: ${API_KEY:0:8}...${API_KEY: -4}"

# ── Step 6: Verify /api/v1/health (authenticated) ──

info "Testing GET /api/v1/health"
HEALTH_RESPONSE=$(curl -sf -H "X-Api-Key: $API_KEY" "http://localhost:${PORT}/api/v1/health" 2>&1) || {
  fail "GET /api/v1/health failed"
  exit 1
}

if echo "$HEALTH_RESPONSE" | grep -q '"status"'; then
  pass "GET /api/v1/health returns health status"
else
  fail "GET /api/v1/health unexpected response: $HEALTH_RESPONSE"
  exit 1
fi

# ── Step 7: Verify /api/v1/system/status (authenticated) ──

info "Testing GET /api/v1/system/status"
STATUS_RESPONSE=$(curl -sf -H "X-Api-Key: $API_KEY" "http://localhost:${PORT}/api/v1/system/status" 2>&1) || {
  fail "GET /api/v1/system/status failed"
  exit 1
}

if echo "$STATUS_RESPONSE" | grep -q '"appName":"Tubearr"'; then
  pass "GET /api/v1/system/status returns appName=Tubearr"
else
  fail "GET /api/v1/system/status unexpected response: $STATUS_RESPONSE"
  exit 1
fi

# ── Step 8: Verify auth rejection ──

info "Testing auth rejection (no API key)"
AUTH_CODE=$(curl -s -o /dev/null -w '%{http_code}' "http://localhost:${PORT}/api/v1/system/status" 2>&1)

if [ "$AUTH_CODE" = "401" ]; then
  pass "Unauthenticated request correctly returns 401"
else
  fail "Expected 401, got $AUTH_CODE"
  exit 1
fi

# ── Step 9: Test restart persistence ──

info "Testing container restart persistence"

# Record the API key before restart
PRE_RESTART_KEY="$API_KEY"

# Restart the container
docker-compose -f "$COMPOSE_FILE" restart || {
  fail "docker-compose restart failed"
  exit 1
}

# Wait for healthy again
info "Waiting for container to become healthy after restart"
elapsed=0
while [ $elapsed -lt $HEALTH_TIMEOUT ]; do
  status=$(docker inspect --format='{{.State.Health.Status}}' "$CONTAINER_NAME" 2>/dev/null || echo "unknown")
  if [ "$status" = "healthy" ]; then
    break
  fi
  if [ "$status" = "unhealthy" ]; then
    fail "Container became unhealthy after restart"
    docker logs "$CONTAINER_NAME" 2>&1 | tail -20
    exit 1
  fi
  sleep 2
  elapsed=$((elapsed + 2))
done

if [ "$status" != "healthy" ]; then
  fail "Container did not become healthy after restart within ${HEALTH_TIMEOUT}s"
  docker logs "$CONTAINER_NAME" 2>&1 | tail -20
  exit 1
fi
pass "Container healthy after restart"

# Verify /ping still works
PING_AFTER=$(curl -sf "http://localhost:${PORT}/ping" 2>&1) || {
  fail "GET /ping failed after restart"
  exit 1
}
if echo "$PING_AFTER" | grep -q '"status":"ok"'; then
  pass "GET /ping works after restart"
else
  fail "GET /ping unexpected response after restart: $PING_AFTER"
  exit 1
fi

# Verify the same API key works (state persisted via volume)
HEALTH_AFTER=$(curl -sf -H "X-Api-Key: $PRE_RESTART_KEY" "http://localhost:${PORT}/api/v1/health" 2>&1) || {
  fail "GET /api/v1/health failed after restart with pre-restart API key"
  exit 1
}
if echo "$HEALTH_AFTER" | grep -q '"status"'; then
  pass "Pre-restart API key still works — state persisted"
else
  fail "API key state not preserved across restart"
  exit 1
fi

# ── Done ──

echo ""
echo -e "${GREEN}═══════════════════════════════════════════${NC}"
echo -e "${GREEN} SMOKE TEST PASSED${NC}"
echo -e "${GREEN}═══════════════════════════════════════════${NC}"
echo ""
25 skills-lock.json Normal file
@@ -0,0 +1,25 @@
{
  "version": 1,
  "skills": {
    "drizzle-migrations": {
      "source": "bobmatnyc/claude-mpm-skills",
      "sourceType": "github",
      "computedHash": "b5e3d1249589aebf83c8308f8feacb6da6e38a21995946ea5a6a6522da898507"
    },
    "drizzle-orm": {
      "source": "bobmatnyc/claude-mpm-skills",
      "sourceType": "github",
      "computedHash": "c5132317134698624d023cbbbb612b99d2b20f532f6d1417639607cf401f03cd"
    },
    "fastify-best-practices": {
      "source": "mcollina/skills",
      "sourceType": "github",
      "computedHash": "b3a771fa66bc5d8dac0af14e99e51fca2ff4a9add56f09d986778528bdf72c4c"
    },
    "fastify-typescript": {
      "source": "mindrally/skills",
      "sourceType": "github",
      "computedHash": "d133948c2f5af7fbed1a559cd21a1cbb6abfb5ef25b90c7e3fd3064a422af993"
    }
  }
}
252 src/__tests__/auth-model.test.ts Normal file
@@ -0,0 +1,252 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';

/**
 * Auth model tests: verify dual-mode authentication.
 *
 * The Tubearr auth model (matching Sonarr/Radarr):
 * - Same-origin browser requests (Origin/Referer matching server host) are trusted
 * - External requests require a valid API key via header or query param
 * - API key management endpoints allow reading and regenerating the key
 */

describe('Auth model — dual-mode authentication', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-auth-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read the generated API key from the database
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows
    }
  });

  // ── Same-origin bypass ──

  describe('Same-origin bypass (trusted browser requests)', () => {
    it('allows request with matching Origin header — no API key needed', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: {
          origin: 'http://localhost:3000',
        },
      });

      expect(res.statusCode).toBe(200);
      expect(res.json()).toHaveProperty('appName', 'Tubearr');
    });

    it('allows request with matching Referer header — no API key needed', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: {
          referer: 'http://localhost:8989/settings',
        },
      });

      expect(res.statusCode).toBe(200);
      expect(res.json()).toHaveProperty('appName', 'Tubearr');
    });

    it('rejects cross-origin request (different hostname) without API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: {
          origin: 'http://evil.example.com:8989',
        },
      });

      expect(res.statusCode).toBe(401);
      expect(res.json().message).toContain('API key');
    });
  });

  // ── External API key authentication ──

  describe('External API key authentication', () => {
    it('rejects external request without API key (no Origin/Referer)', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
      });

      expect(res.statusCode).toBe(401);
      const body = res.json();
      expect(body.error).toBe('Unauthorized');
      expect(body.message).toContain('API key');
    });

    it('allows external request with valid API key via X-Api-Key header', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      expect(res.json()).toHaveProperty('appName', 'Tubearr');
    });

    it('allows external request with valid API key via apikey query param', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/system/status?apikey=${apiKey}`,
      });

      expect(res.statusCode).toBe(200);
      expect(res.json()).toHaveProperty('appName', 'Tubearr');
    });

    it('rejects external request with invalid API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: { 'x-api-key': 'totally-wrong-key' },
      });

      expect(res.statusCode).toBe(401);
      expect(res.json().message).toBe('Invalid API key');
    });
  });

  // ── API key management endpoints ──

  describe('GET /api/v1/system/apikey', () => {
    it('returns the current API key for same-origin requests', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/apikey',
        headers: {
          origin: 'http://localhost:8989',
        },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body).toHaveProperty('apiKey');
      expect(body.apiKey).toBe(apiKey);
    });

    it('returns the current API key for API-key-authenticated requests', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/apikey',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      expect(res.json().apiKey).toBe(apiKey);
    });

    it('rejects unauthenticated external requests', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/apikey',
      });

      expect(res.statusCode).toBe(401);
    });
  });

  describe('POST /api/v1/system/apikey/regenerate', () => {
    it('regenerates the API key and returns the new one', async () => {
      const oldKey = apiKey;

      // Regenerate using same-origin auth
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/system/apikey/regenerate',
        headers: {
          origin: 'http://localhost:8989',
        },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body).toHaveProperty('apiKey');
      expect(body.apiKey).not.toBe(oldKey);
      expect(body.apiKey).toMatch(
        /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/
      );

      // Update our local reference for subsequent tests
      apiKey = body.apiKey;
    });

    it('old API key no longer works for external requests after regeneration', async () => {
      // The previous test regenerated the key, so the original key should be invalid
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: { 'x-api-key': 'the-original-key-is-gone' },
      });

      expect(res.statusCode).toBe(401);
    });

    it('new API key works for external requests after regeneration', async () => {
      // Read the current key from the DB to be sure we have the right one
      const rows = await db
        .select()
        .from(systemConfig)
        .where(eq(systemConfig.key, 'api_key'))
        .limit(1);
      const currentKey = rows[0]?.value ?? '';

      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: { 'x-api-key': currentKey },
      });

      expect(res.statusCode).toBe(200);
    });

    it('rejects unauthenticated external regeneration requests', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/system/apikey/regenerate',
      });

      expect(res.statusCode).toBe(401);
    });
  });
});
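The dual-mode policy exercised above can be reduced to a pure decision function. This is an illustrative sketch under the assumptions stated in the file's doc comment (same-origin requests trusted by hostname, everything else needs the API key via header or query param); `isAuthorized` and its input shape are hypothetical names for this sketch, not the actual plugin's API:

```typescript
// Illustrative sketch of the dual-mode auth decision, not the real plugin.
interface AuthInput {
  origin?: string;    // Origin header, if present
  referer?: string;   // Referer header, if present
  headerKey?: string; // X-Api-Key header value
  queryKey?: string;  // ?apikey= query parameter
}

function isAuthorized(req: AuthInput, serverHost: string, apiKey: string): boolean {
  // Same-origin bypass: trust requests whose Origin or Referer hostname
  // matches the server's hostname (port is intentionally ignored, matching
  // the "different hostname" rejection test above).
  const sameOrigin = (url?: string): boolean => {
    if (!url) return false;
    try {
      return new URL(url).hostname === serverHost;
    } catch {
      return false; // malformed header: fall through to API key check
    }
  };
  if (sameOrigin(req.origin) || sameOrigin(req.referer)) return true;

  // External requests: require the API key, header taking precedence.
  const provided = req.headerKey ?? req.queryKey;
  return provided !== undefined && provided === apiKey;
}

// Mirrors the cases in the test file:
console.log(isAuthorized({ origin: 'http://localhost:3000' }, 'localhost', 'k1'));          // true
console.log(isAuthorized({ origin: 'http://evil.example.com:8989' }, 'localhost', 'k1'));   // false
console.log(isAuthorized({ headerKey: 'k1' }, 'localhost', 'k1'));                          // true
console.log(isAuthorized({}, 'localhost', 'k1'));                                           // false
```

The real implementation additionally needs timing-safe key comparison and per-route exemptions (e.g. `/ping`), which the sketch omits.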
452
src/__tests__/back-catalog-import.test.ts
Normal file
452
src/__tests__/back-catalog-import.test.ts
Normal file
|
|
@ -0,0 +1,452 @@
|
|||
import { describe, it, expect, vi, beforeAll, afterAll, beforeEach } from 'vitest';
|
||||
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
|
||||
import { join } from 'node:path';
|
||||
import { tmpdir } from 'node:os';
|
||||
import { type FastifyInstance } from 'fastify';
|
||||
import { initDatabaseAsync, closeDatabase } from '../db/index';
|
||||
import { runMigrations } from '../db/migrate';
|
||||
import { buildServer } from '../server/index';
|
||||
import { systemConfig } from '../db/schema/index';
|
||||
import { eq } from 'drizzle-orm';
|
||||
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
|
||||
import type * as schema from '../db/schema/index';
|
||||
import { createChannel } from '../db/repositories/channel-repository';
|
||||
import {
|
||||
createContentItem,
|
||||
getContentByChannelId,
|
||||
} from '../db/repositories/content-repository';
|
||||
import {
|
||||
getQueueItemByContentItemId,
|
||||
getPendingQueueItems,
|
||||
} from '../db/repositories/queue-repository';
|
||||
import { BackCatalogImportService } from '../services/back-catalog-import';
|
||||
import { PlatformRegistry } from '../sources/platform-source';
|
||||
import { YouTubeSource } from '../sources/youtube';
|
||||
import { SoundCloudSource } from '../sources/soundcloud';
|
||||
import { QueueService } from '../services/queue';
|
||||
import type { Channel, PlatformContentMetadata } from '../types/index';
|
||||
|
||||
// ── Mock yt-dlp ──
|
||||
|
||||
const { execYtDlpMock } = vi.hoisted(() => ({
|
||||
execYtDlpMock: vi.fn(),
|
||||
}));
|
||||
|
||||
vi.mock('../sources/yt-dlp', async (importOriginal) => {
|
||||
const actual = await importOriginal<typeof import('../sources/yt-dlp')>();
|
||||
return {
|
||||
...actual,
|
||||
execYtDlp: execYtDlpMock,
|
||||
};
|
||||
});
|
||||
|
||||
// ── Canned data ──
|
||||
|
||||
function makeYtDlpEntry(id: string, title: string) {
|
||||
return {
|
||||
id,
|
||||
title,
|
||||
url: `https://www.youtube.com/watch?v=${id}`,
|
||||
webpage_url: `https://www.youtube.com/watch?v=${id}`,
|
||||
duration: 600,
|
||||
thumbnail: `https://i.ytimg.com/vi/${id}/maxresdefault.jpg`,
|
||||
live_status: 'not_live',
|
||||
};
|
||||
}
|
||||
|
||||
/** Generate yt-dlp NDJSON output for N entries */
|
||||
function makeNdjsonOutput(count: number, prefix = 'vid'): string {
|
||||
return Array.from({ length: count }, (_, i) =>
|
||||
JSON.stringify(makeYtDlpEntry(`${prefix}_${i + 1}`, `Video ${i + 1}`))
|
||||
).join('\n');
|
||||
}
|
||||
|
||||
const YOUTUBE_CHANNEL_JSON = JSON.stringify({
|
||||
channel: 'Import Channel',
|
||||
channel_id: 'UC_IMPORT_TEST',
|
||||
channel_url: 'https://www.youtube.com/@ImportChannel',
|
||||
uploader: 'Import Channel',
|
||||
uploader_url: 'https://www.youtube.com/@ImportChannel',
|
||||
thumbnails: [{ url: 'https://yt.com/thumb.jpg' }],
|
||||
});
|
||||
|
||||
// ── Test setup ──

describe('BackCatalogImportService', () => {
  let db: LibSQLDatabase<typeof schema>;
  let tmpDir: string;
  let testChannel: Channel;
  let registry: PlatformRegistry;
  let mockDownloadService: { downloadItem: ReturnType<typeof vi.fn> };

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-import-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
  });

  afterAll(() => {
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Windows cleanup best-effort
    }
  });

  beforeEach(() => {
    execYtDlpMock.mockReset();
    mockDownloadService = { downloadItem: vi.fn().mockResolvedValue(undefined) };
  });

  async function createTestChannel(suffix = ''): Promise<Channel> {
    return createChannel(db, {
      name: `Import Test Channel${suffix}`,
      platform: 'youtube',
      platformId: `UC_IMPORT_TEST${suffix}`,
      url: `https://www.youtube.com/@ImportChannel${suffix}`,
      imageUrl: null,
      formatProfileId: null,
      monitoringEnabled: true,
      checkInterval: 360,
      metadata: null,
    });
  }

  function createImportService(concurrency = 0): {
    importService: BackCatalogImportService;
    queueService: QueueService;
  } {
    const queueService = new QueueService(db, mockDownloadService as any, concurrency);
    queueService.stop(); // Prevent auto-processing during tests
    registry = new PlatformRegistry();
    registry.register('youtube' as any, new YouTubeSource());
    registry.register('soundcloud' as any, new SoundCloudSource());
    const importService = new BackCatalogImportService(db, registry, queueService);
    return { importService, queueService };
  }
  // ── Import fetches and inserts ──

  describe('importChannel', () => {
    it('fetches all content and inserts new items', async () => {
      const channel = await createTestChannel('_fetch');

      // Mock fetchAllContent → yt-dlp returns 5 entries
      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(5, 'fetch'),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      const result = await importService.importChannel(channel.id, 'newest');

      expect(result.found).toBe(5);
      expect(result.imported).toBe(5);
      expect(result.skipped).toBe(0);

      // Verify content items were created in DB
      const content = await getContentByChannelId(db, channel.id);
      expect(content.length).toBe(5);
    });

    it('deduplicates — second import inserts 0 new items', async () => {
      const channel = await createTestChannel('_dedup');

      // First import
      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, 'dedup'),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      const result1 = await importService.importChannel(channel.id, 'newest');
      expect(result1.imported).toBe(3);
      expect(result1.skipped).toBe(0);

      // Second import — same content IDs
      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, 'dedup'),
        stderr: '',
        exitCode: 0,
      });

      const result2 = await importService.importChannel(channel.id, 'newest');
      expect(result2.found).toBe(3);
      expect(result2.imported).toBe(0);
      expect(result2.skipped).toBe(3);
    });

    it('enqueues imported items at priority -10', async () => {
      const channel = await createTestChannel('_priority');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(2, 'prio'),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      await importService.importChannel(channel.id, 'newest');

      // Verify content was created and enqueued
      const content = await getContentByChannelId(db, channel.id);
      expect(content.length).toBe(2);

      for (const item of content) {
        const queueItem = await getQueueItemByContentItemId(db, item.id);
        expect(queueItem).not.toBeNull();
        expect(queueItem!.priority).toBe(-10);
        expect(queueItem!.status).toBe('pending');
      }
    });

    it("'oldest' order reverses insertion order", async () => {
      const channel = await createTestChannel('_order');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(4, 'ord'),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      await importService.importChannel(channel.id, 'oldest');

      // Content items should be inserted in reversed order.
      // yt-dlp returns [ord_1, ord_2, ord_3, ord_4]. With 'oldest' order,
      // the array is reversed to [ord_4, ord_3, ord_2, ord_1], so ord_4
      // is inserted first (lowest id) and ord_1 last (highest id).
      const content = await getContentByChannelId(db, channel.id);
      expect(content.length).toBe(4);

      // getContentByChannelId returns newest first (ORDER BY createdAt DESC, id DESC implied).
      // Since all items have the same createdAt (same second), we verify via ID ordering.
      // ord_4 was inserted first → lowest DB id; ord_1 was inserted last → highest DB id
      const sortedById = [...content].sort((a, b) => a.id - b.id);
      expect(sortedById[0].platformContentId).toBe('ord_4'); // First inserted (lowest id)
      expect(sortedById[sortedById.length - 1].platformContentId).toBe('ord_1'); // Last inserted (highest id)
    });

    it('handles missing channel gracefully', async () => {
      const { importService } = createImportService();

      await expect(importService.importChannel(99999, 'newest')).rejects.toThrow(
        /Channel 99999 not found/
      );
    });

    it('handles platform source fetch errors gracefully', async () => {
      const channel = await createTestChannel('_fetcherr');

      // Mock yt-dlp to fail
      const { YtDlpError } = await import('../sources/yt-dlp');
      execYtDlpMock.mockRejectedValueOnce(
        new YtDlpError('yt-dlp exited with code 1', 'network error', 1)
      );

      const { importService } = createImportService();

      await expect(importService.importChannel(channel.id, 'newest')).rejects.toThrow(
        /yt-dlp exited/
      );
    });

    it('individual enqueue failures do not abort the import', async () => {
      const channel = await createTestChannel('_enqfail');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, 'enqfail'),
        stderr: '',
        exitCode: 0,
      });

      // Create a service where the queue throws on the second enqueue
      const queueService = new QueueService(db, mockDownloadService as any, 0);
      queueService.stop();

      // Spy on enqueue to make it fail on the second call
      let enqueueCallCount = 0;
      const originalEnqueue = queueService.enqueue.bind(queueService);
      vi.spyOn(queueService, 'enqueue').mockImplementation(async (contentItemId, priority) => {
        enqueueCallCount++;
        if (enqueueCallCount === 2) {
          throw new Error('Simulated enqueue failure');
        }
        return originalEnqueue(contentItemId, priority);
      });

      registry = new PlatformRegistry();
      registry.register('youtube' as any, new YouTubeSource());
      const importService = new BackCatalogImportService(db, registry, queueService);

      const result = await importService.importChannel(channel.id, 'newest');

      // All 3 items should be imported (content created), even though one enqueue failed
      expect(result.found).toBe(3);
      expect(result.imported).toBe(3);
      expect(result.skipped).toBe(0);
    });
    // ── monitoringMode-aware import tests ──

    it("imports items with monitored=false when channel monitoringMode is 'future'", async () => {
      const channel = await createTestChannel('_mode_future');

      // Update channel's monitoringMode to 'future' via direct DB creation
      // (createTestChannel defaults to 'all', so we need a custom one)
      const futureChannel = await createChannel(db, {
        name: 'Future Mode Channel',
        platform: 'youtube',
        platformId: `UC_FUTURE_MODE_${Date.now()}`,
        url: 'https://www.youtube.com/@FutureMode',
        imageUrl: null,
        formatProfileId: null,
        monitoringEnabled: true,
        checkInterval: 360,
        metadata: null,
        monitoringMode: 'future',
      });

      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, `mode_future_${futureChannel.id}`),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      const result = await importService.importChannel(futureChannel.id, 'newest');

      expect(result.found).toBe(3);
      expect(result.imported).toBe(3);

      // Back-catalog is *existing* content, mode 'future' → monitored=false
      const content = await getContentByChannelId(db, futureChannel.id);
      expect(content.length).toBe(3);
      for (const item of content) {
        expect(item.monitored).toBe(false);
      }
    });

    it("imports items with monitored=true when channel monitoringMode is 'existing'", async () => {
      const existingChannel = await createChannel(db, {
        name: 'Existing Mode Channel',
        platform: 'youtube',
        platformId: `UC_EXISTING_MODE_${Date.now()}`,
        url: 'https://www.youtube.com/@ExistingMode',
        imageUrl: null,
        formatProfileId: null,
        monitoringEnabled: true,
        checkInterval: 360,
        metadata: null,
        monitoringMode: 'existing',
      });

      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, `mode_existing_${existingChannel.id}`),
        stderr: '',
        exitCode: 0,
      });

      const { importService } = createImportService();
      const result = await importService.importChannel(existingChannel.id, 'newest');

      expect(result.found).toBe(3);
      expect(result.imported).toBe(3);

      // Back-catalog is *existing* content, mode 'existing' → monitored=true
      const content = await getContentByChannelId(db, existingChannel.id);
      expect(content.length).toBe(3);
      for (const item of content) {
        expect(item.monitored).toBe(true);
      }
    });
  });
  // ── Integration via channel POST endpoint ──

  describe('Channel POST endpoint with grabAll', () => {
    let server: FastifyInstance;
    let apiKey: string;

    beforeAll(async () => {
      server = await buildServer({ db });
      await server.ready();

      // Read auto-generated API key
      const rows = await db
        .select()
        .from(systemConfig)
        .where(eq(systemConfig.key, 'api_key'))
        .limit(1);
      apiKey = rows[0]?.value ?? '';
      expect(apiKey).toBeTruthy();
    });

    afterAll(async () => {
      await server.close();
    });

    it('returns 201 and triggers import when grabAll is true', async () => {
      // Mock resolveChannel
      execYtDlpMock.mockResolvedValueOnce({
        stdout: JSON.stringify({
          channel: 'Grab All Channel',
          channel_id: 'UC_GRABALL_TEST',
          channel_url: 'https://www.youtube.com/@GrabAll',
          uploader: 'Grab All Channel',
          thumbnails: [{ url: 'https://yt.com/thumb.jpg' }],
        }),
        stderr: '',
        exitCode: 0,
      });

      // Mock fetchAllContent (will be called async by the fire-and-forget import)
      execYtDlpMock.mockResolvedValueOnce({
        stdout: makeNdjsonOutput(3, 'graball'),
        stderr: '',
        exitCode: 0,
      });

      // Need queueService on the server for import to run
      const qs = new QueueService(db, mockDownloadService as any, 0);
      qs.stop();
      server.queueService = qs;

      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: {
          url: 'https://www.youtube.com/@GrabAll',
          grabAll: true,
          grabAllOrder: 'newest',
        },
      });

      expect(res.statusCode).toBe(201);
      const body = res.json();
      expect(body.name).toBe('Grab All Channel');
      expect(body.platform).toBe('youtube');

      // Give the fire-and-forget import a moment to complete
      await new Promise((resolve) => setTimeout(resolve, 300));

      // Verify content items were created by the async import
      const content = await getContentByChannelId(db, body.id);
      expect(content.length).toBe(3);

      // Verify queue items exist at priority -10
      for (const item of content) {
        const queueItem = await getQueueItemByContentItemId(db, item.id);
        expect(queueItem).not.toBeNull();
        expect(queueItem!.priority).toBe(-10);
      }
    });
  });
});
299
src/__tests__/channel-counts.test.ts
Normal file

@@ -0,0 +1,299 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';
import { createChannel } from '../db/repositories/channel-repository';
import { createContentItem } from '../db/repositories/content-repository';
import type { Channel, ContentItem } from '../types/index';
import type { ContentCounts } from '../types/api';

/**
 * Integration tests for GET /api/v1/channel with contentCounts.
 *
 * Verifies that the channel list endpoint returns per-channel aggregated
 * content counts (total, monitored, downloaded) as part of each channel object.
 */
describe('Channel contentCounts API', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;
  let channelA: Channel;
  let channelB: Channel;
  let channelEmpty: Channel;
  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-channel-counts-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read auto-generated API key
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();

    // ── Seed channels ──

    // Channel A: will have mixed content
    channelA = await createChannel(db, {
      name: 'Channel Alpha',
      platform: 'youtube',
      platformId: 'UC_ALPHA_COUNTS',
      url: 'https://www.youtube.com/channel/UC_ALPHA_COUNTS',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    // Channel B: will have independent content
    channelB = await createChannel(db, {
      name: 'Channel Beta',
      platform: 'youtube',
      platformId: 'UC_BETA_COUNTS',
      url: 'https://www.youtube.com/channel/UC_BETA_COUNTS',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    // Channel Empty: no content items
    channelEmpty = await createChannel(db, {
      name: 'Channel Empty',
      platform: 'soundcloud',
      platformId: 'sc_empty_counts',
      url: 'https://soundcloud.com/empty-counts',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    // ── Seed content for Channel A (5 items: 3 monitored, 2 not; 2 downloaded) ──

    await createContentItem(db, {
      channelId: channelA.id,
      title: 'Alpha Vid 1',
      platformContentId: 'cnt_a1',
      url: 'https://youtube.com/watch?v=cnt_a1',
      contentType: 'video',
      duration: 300,
      monitored: true,
      status: 'downloaded',
    });
    await createContentItem(db, {
      channelId: channelA.id,
      title: 'Alpha Vid 2',
      platformContentId: 'cnt_a2',
      url: 'https://youtube.com/watch?v=cnt_a2',
      contentType: 'video',
      duration: 300,
      monitored: true,
      status: 'downloaded',
    });
    await createContentItem(db, {
      channelId: channelA.id,
      title: 'Alpha Vid 3',
      platformContentId: 'cnt_a3',
      url: 'https://youtube.com/watch?v=cnt_a3',
      contentType: 'video',
      duration: 300,
      monitored: true,
      status: 'monitored',
    });
    await createContentItem(db, {
      channelId: channelA.id,
      title: 'Alpha Vid 4',
      platformContentId: 'cnt_a4',
      url: 'https://youtube.com/watch?v=cnt_a4',
      contentType: 'video',
      duration: 300,
      monitored: false,
      status: 'ignored',
    });
    await createContentItem(db, {
      channelId: channelA.id,
      title: 'Alpha Vid 5',
      platformContentId: 'cnt_a5',
      url: 'https://youtube.com/watch?v=cnt_a5',
      contentType: 'video',
      duration: 300,
      monitored: false,
      status: 'monitored',
    });

    // ── Seed content for Channel B (2 items: 2 monitored, 1 downloaded) ──

    await createContentItem(db, {
      channelId: channelB.id,
      title: 'Beta Vid 1',
      platformContentId: 'cnt_b1',
      url: 'https://youtube.com/watch?v=cnt_b1',
      contentType: 'video',
      duration: 600,
      monitored: true,
      status: 'monitored',
    });
    await createContentItem(db, {
      channelId: channelB.id,
      title: 'Beta Vid 2',
      platformContentId: 'cnt_b2',
      url: 'https://youtube.com/watch?v=cnt_b2',
      contentType: 'video',
      duration: 600,
      monitored: true,
      status: 'downloaded',
    });
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows (K004)
    }
  });
  // ── Helpers ──

  function getChannelCounts(body: Array<{ id: number; contentCounts: ContentCounts }>, id: number) {
    const channel = body.find((c) => c.id === id);
    expect(channel).toBeDefined();
    return channel!.contentCounts;
  }

  // ── Tests ──

  describe('GET /api/v1/channel with contentCounts', () => {
    it('returns contentCounts on every channel object', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(Array.isArray(body)).toBe(true);
      expect(body.length).toBeGreaterThanOrEqual(3);

      // Every channel must have contentCounts
      for (const channel of body) {
        expect(channel).toHaveProperty('contentCounts');
        expect(channel.contentCounts).toHaveProperty('total');
        expect(channel.contentCounts).toHaveProperty('monitored');
        expect(channel.contentCounts).toHaveProperty('downloaded');
        expect(typeof channel.contentCounts.total).toBe('number');
        expect(typeof channel.contentCounts.monitored).toBe('number');
        expect(typeof channel.contentCounts.downloaded).toBe('number');
      }
    });

    it('channel with no content has zero counts', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      const counts = getChannelCounts(res.json(), channelEmpty.id);
      expect(counts).toEqual({ total: 0, monitored: 0, downloaded: 0 });
    });

    it('channel with mixed content has correct counts', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      // Channel A: 5 total, 3 monitored, 2 downloaded
      const counts = getChannelCounts(res.json(), channelA.id);
      expect(counts.total).toBe(5);
      expect(counts.monitored).toBe(3);
      expect(counts.downloaded).toBe(2);
    });

    it('multiple channels have independent counts', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      const body = res.json();
      const countsA = getChannelCounts(body, channelA.id);
      const countsB = getChannelCounts(body, channelB.id);

      // Channel A: 5 total, 3 monitored, 2 downloaded
      expect(countsA).toEqual({ total: 5, monitored: 3, downloaded: 2 });

      // Channel B: 2 total, 2 monitored, 1 downloaded
      expect(countsB).toEqual({ total: 2, monitored: 2, downloaded: 1 });
    });

    it('counts update after toggling monitored state', async () => {
      // Toggle monitored on Channel A's first content item (cnt_a1) from true to false
      // Need to find the item ID first
      const contentRes = await server.inject({
        method: 'GET',
        url: '/api/v1/content',
        headers: { 'x-api-key': apiKey },
        query: { channelId: String(channelA.id) },
      });

      const contentBody = contentRes.json();
      const items = contentBody.data ?? contentBody;
      const targetItem = (Array.isArray(items) ? items : []).find(
        (i: ContentItem) => i.platformContentId === 'cnt_a1'
      );
      expect(targetItem).toBeDefined();

      // Toggle monitored off
      const toggleRes = await server.inject({
        method: 'PATCH',
        url: `/api/v1/content/${targetItem!.id}/monitored`,
        headers: { 'x-api-key': apiKey },
        payload: { monitored: false },
      });
      expect(toggleRes.statusCode).toBe(200);

      // Re-fetch channel list and check updated counts
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      const counts = getChannelCounts(res.json(), channelA.id);
      // Was: 5 total, 3 monitored, 2 downloaded
      // Now: 5 total, 2 monitored, 2 downloaded (one less monitored)
      expect(counts.total).toBe(5);
      expect(counts.monitored).toBe(2);
      expect(counts.downloaded).toBe(2);
    });
  });
});
451
src/__tests__/channel.test.ts
Normal file

@@ -0,0 +1,451 @@
import { describe, it, expect, beforeAll, afterAll, vi } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';

// ── Mock yt-dlp to avoid real subprocess calls ──

// Mock the yt-dlp module so resolveChannel doesn't invoke the real binary.
// We intercept at the yt-dlp wrapper level — platform sources call execYtDlp
// which we replace with a function returning canned JSON.
const { execYtDlpMock } = vi.hoisted(() => ({
  execYtDlpMock: vi.fn(),
}));

vi.mock('../sources/yt-dlp', async (importOriginal) => {
  const actual = await importOriginal<typeof import('../sources/yt-dlp')>();
  return {
    ...actual,
    execYtDlp: execYtDlpMock,
  };
});

// ── Canned yt-dlp Responses ──

const YOUTUBE_CHANNEL_JSON = JSON.stringify({
  channel: 'Tech Channel',
  channel_id: 'UC_YOUTUBE_123',
  channel_url: 'https://www.youtube.com/@TechChannel',
  uploader: 'Tech Channel',
  uploader_url: 'https://www.youtube.com/@TechChannel',
  thumbnails: [
    { url: 'https://yt.com/thumb_small.jpg' },
    { url: 'https://yt.com/thumb_large.jpg' },
  ],
});

const SOUNDCLOUD_ARTIST_JSON = JSON.stringify({
  uploader: 'Beat Artist',
  uploader_id: 'beat-artist',
  uploader_url: 'https://soundcloud.com/beat-artist',
  channel: null,
  channel_url: null,
  thumbnails: [{ url: 'https://sc.com/avatar.jpg' }],
});

/**
 * Integration tests for channel CRUD API.
 *
 * Pattern follows server.integration.test.ts: temp DB, migrations,
 * buildServer, inject(). yt-dlp is mocked to avoid subprocess dependency.
 */
describe('Channel API', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-channel-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read auto-generated API key
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows (K004)
    }
  });
  // ── Auth ──

  describe('Authentication', () => {
    it('returns 401 on POST without API key', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        payload: { url: 'https://www.youtube.com/@Test' },
      });
      expect(res.statusCode).toBe(401);
    });

    it('returns 401 on GET without API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
      });
      expect(res.statusCode).toBe(401);
    });

    it('returns 401 on GET /:id without API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel/1',
      });
      expect(res.statusCode).toBe(401);
    });

    it('returns 401 on PUT without API key', async () => {
      const res = await server.inject({
        method: 'PUT',
        url: '/api/v1/channel/1',
        payload: { name: 'Updated' },
      });
      expect(res.statusCode).toBe(401);
    });

    it('returns 401 on DELETE without API key', async () => {
      const res = await server.inject({
        method: 'DELETE',
        url: '/api/v1/channel/1',
      });
      expect(res.statusCode).toBe(401);
    });
  });
  // ── POST /api/v1/channel ──

  describe('POST /api/v1/channel', () => {
    it('creates a YouTube channel and returns 201 with resolved metadata', async () => {
      execYtDlpMock.mockResolvedValueOnce({
        stdout: YOUTUBE_CHANNEL_JSON,
        stderr: '',
        exitCode: 0,
      });

      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: {
          url: 'https://www.youtube.com/@TechChannel',
          checkInterval: 120,
          monitoringEnabled: true,
        },
      });

      expect(res.statusCode).toBe(201);
      const body = res.json();
      expect(body.name).toBe('Tech Channel');
      expect(body.platform).toBe('youtube');
      expect(body.platformId).toBe('UC_YOUTUBE_123');
      expect(body.url).toBe('https://www.youtube.com/@TechChannel');
      expect(body.checkInterval).toBe(120);
      expect(body.monitoringEnabled).toBe(true);
      expect(body.imageUrl).toBe('https://yt.com/thumb_large.jpg');
      expect(body.id).toBeTypeOf('number');
      expect(body.lastCheckedAt).toBeNull();
      expect(body.lastCheckStatus).toBeNull();
    });

    it('creates a SoundCloud channel with correct platform field', async () => {
      execYtDlpMock.mockResolvedValueOnce({
        stdout: SOUNDCLOUD_ARTIST_JSON,
        stderr: '',
        exitCode: 0,
      });

      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: { url: 'https://soundcloud.com/beat-artist' },
      });

      expect(res.statusCode).toBe(201);
      const body = res.json();
      expect(body.name).toBe('Beat Artist');
      expect(body.platform).toBe('soundcloud');
      expect(body.platformId).toBe('beat-artist');
      expect(body.monitoringEnabled).toBe(true); // default
      expect(body.checkInterval).toBe(360); // default
    });

    it('returns 422 for unsupported URL', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: { url: 'https://www.example.com/not-a-platform' },
      });

      expect(res.statusCode).toBe(422);
      const body = res.json();
      expect(body.message).toContain('Unsupported URL');
    });

    it('returns 409 Conflict for duplicate channel (same platformId)', async () => {
      // The YouTube channel from the first test already exists with platformId UC_YOUTUBE_123
      execYtDlpMock.mockResolvedValueOnce({
        stdout: YOUTUBE_CHANNEL_JSON,
        stderr: '',
        exitCode: 0,
      });

      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: { url: 'https://www.youtube.com/@TechChannel' },
      });

      expect(res.statusCode).toBe(409);
      const body = res.json();
      expect(body.error).toBe('Conflict');
      expect(body.message).toContain('already exists');
    });

    it('returns 502 when yt-dlp fails', async () => {
      // Import YtDlpError to throw from mock
      const { YtDlpError } = await import('../sources/yt-dlp');
      execYtDlpMock.mockRejectedValueOnce(
        new YtDlpError('yt-dlp exited with code 1: network error', 'network error', 1)
      );

      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: { url: 'https://www.youtube.com/@BrokenChannel' },
      });

      expect(res.statusCode).toBe(502);
      const body = res.json();
      expect(body.error).toBe('Bad Gateway');
    });

    it('returns 400 when body is missing url field', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
        payload: {},
      });

      expect(res.statusCode).toBe(400);
    });
  });
// ── GET /api/v1/channel ──
|
||||
|
||||
describe('GET /api/v1/channel', () => {
|
||||
it('returns all created channels', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json();
|
||||
expect(Array.isArray(body)).toBe(true);
|
||||
// We created YouTube + SoundCloud channels earlier
|
||||
expect(body.length).toBeGreaterThanOrEqual(2);
|
||||
|
||||
// Verify ordering by name (Beat Artist before Tech Channel)
|
||||
const names = body.map((c: { name: string }) => c.name);
|
||||
expect(names).toContain('Tech Channel');
|
||||
expect(names).toContain('Beat Artist');
|
||||
});
|
||||
});
|
||||
|
||||
// ── GET /api/v1/channel/:id ──
|
||||
|
||||
describe('GET /api/v1/channel/:id', () => {
|
||||
it('returns 200 with the correct channel', async () => {
|
||||
// Get the list first to find a valid ID
|
||||
const listRes = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
const channels = listRes.json();
|
||||
const channelId = channels[0].id;
|
||||
|
||||
const res = await server.inject({
|
||||
method: 'GET',
|
||||
url: `/api/v1/channel/${channelId}`,
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json();
|
||||
expect(body.id).toBe(channelId);
|
||||
expect(body.name).toBeTruthy();
|
||||
expect(body.platform).toBeTruthy();
|
||||
});
|
||||
|
||||
it('returns 404 for non-existent ID', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel/99999',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(404);
|
||||
expect(res.json().error).toBe('Not Found');
|
||||
});
|
||||
|
||||
it('returns 400 for non-numeric ID', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel/abc',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(400);
|
||||
});
|
||||
});
|
||||
|
||||
// ── PUT /api/v1/channel/:id ──
|
||||
|
||||
describe('PUT /api/v1/channel/:id', () => {
|
||||
it('updates checkInterval and returns the updated channel', async () => {
|
||||
// Find the YouTube channel
|
||||
const listRes = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
const ytChannel = listRes.json().find((c: { platform: string }) => c.platform === 'youtube');
|
||||
|
||||
const res = await server.inject({
|
||||
method: 'PUT',
|
||||
url: `/api/v1/channel/${ytChannel.id}`,
|
||||
headers: { 'x-api-key': apiKey },
|
||||
payload: { checkInterval: 60 },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
const body = res.json();
|
||||
expect(body.checkInterval).toBe(60);
|
||||
expect(body.id).toBe(ytChannel.id);
|
||||
});
|
||||
|
||||
it('updates monitoringEnabled', async () => {
|
||||
const listRes = await server.inject({
|
||||
method: 'GET',
|
||||
url: '/api/v1/channel',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
const channel = listRes.json()[0];
|
||||
|
||||
const res = await server.inject({
|
||||
method: 'PUT',
|
||||
url: `/api/v1/channel/${channel.id}`,
|
||||
headers: { 'x-api-key': apiKey },
|
||||
payload: { monitoringEnabled: false },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(200);
|
||||
expect(res.json().monitoringEnabled).toBe(false);
|
||||
});
|
||||
|
||||
it('returns 404 for non-existent ID', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'PUT',
|
||||
url: '/api/v1/channel/99999',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
payload: { name: 'Ghost' },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(404);
|
||||
});
|
||||
});
|
||||
|
||||
// ── DELETE /api/v1/channel/:id ──
|
||||
|
||||
describe('DELETE /api/v1/channel/:id', () => {
|
||||
let deletableId: number;
|
||||
|
||||
beforeAll(async () => {
|
||||
// Create a channel specifically for deletion testing
|
||||
execYtDlpMock.mockResolvedValueOnce({
|
||||
stdout: JSON.stringify({
|
||||
channel: 'Delete Me',
|
||||
channel_id: 'UC_DELETE_ME',
|
||||
channel_url: 'https://www.youtube.com/@DeleteMe',
|
||||
uploader: 'Delete Me',
|
||||
thumbnails: [],
|
||||
}),
|
||||
stderr: '',
|
||||
exitCode: 0,
|
||||
});
|
||||
|
||||
const res = await server.inject({
|
||||
method: 'POST',
|
||||
url: '/api/v1/channel',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
payload: { url: 'https://www.youtube.com/@DeleteMe' },
|
||||
});
|
||||
|
||||
deletableId = res.json().id;
|
||||
});
|
||||
|
||||
it('returns 204 on successful delete', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'DELETE',
|
||||
url: `/api/v1/channel/${deletableId}`,
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(204);
|
||||
});
|
||||
|
||||
it('returns 404 when trying to GET the deleted channel', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'GET',
|
||||
url: `/api/v1/channel/${deletableId}`,
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(404);
|
||||
});
|
||||
|
||||
it('returns 404 when deleting non-existent ID', async () => {
|
||||
const res = await server.inject({
|
||||
method: 'DELETE',
|
||||
url: '/api/v1/channel/99999',
|
||||
headers: { 'x-api-key': apiKey },
|
||||
});
|
||||
|
||||
expect(res.statusCode).toBe(404);
|
||||
});
|
||||
});
|
||||
});
|
||||
309 src/__tests__/content-api.test.ts Normal file
@@ -0,0 +1,309 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';
import { createChannel } from '../db/repositories/channel-repository';
import { createContentItem } from '../db/repositories/content-repository';
import type { Channel, ContentItem } from '../types/index';

/**
 * Integration tests for content listing API endpoints.
 */

describe('content-api', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;
  let channelA: Channel;
  let channelB: Channel;
  const contentItems: ContentItem[] = [];

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-content-api-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read API key
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();

    // Create two channels for filtering tests
    channelA = await createChannel(db, {
      name: 'Channel Alpha',
      platform: 'youtube',
      platformId: 'UC_alpha',
      url: 'https://www.youtube.com/channel/UC_alpha',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    channelB = await createChannel(db, {
      name: 'Channel Beta',
      platform: 'soundcloud',
      platformId: 'beta_artist',
      url: 'https://soundcloud.com/beta_artist',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    // Create varied content items for filter/search/pagination tests
    const items = [
      { channelId: channelA.id, title: 'Alpha Video One', platformContentId: 'a_v1', url: 'https://youtube.com/watch?v=a_v1', contentType: 'video' as const, duration: 600, status: 'monitored' as const },
      { channelId: channelA.id, title: 'Alpha Video Two', platformContentId: 'a_v2', url: 'https://youtube.com/watch?v=a_v2', contentType: 'video' as const, duration: 300, status: 'downloaded' as const },
      { channelId: channelA.id, title: 'Alpha Livestream Special', platformContentId: 'a_ls1', url: 'https://youtube.com/watch?v=a_ls1', contentType: 'livestream' as const, duration: 7200, status: 'monitored' as const },
      { channelId: channelB.id, title: 'Beta Audio Track', platformContentId: 'b_a1', url: 'https://soundcloud.com/beta/track1', contentType: 'audio' as const, duration: 240, status: 'monitored' as const },
      { channelId: channelB.id, title: 'Beta Audio Mix', platformContentId: 'b_a2', url: 'https://soundcloud.com/beta/mix1', contentType: 'audio' as const, duration: 3600, status: 'failed' as const },
    ];

    for (const item of items) {
      const created = await createContentItem(db, item);
      if (created) contentItems.push(created);
    }

    expect(contentItems.length).toBe(5);
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows (K004)
    }
  });

  // ── GET /api/v1/content ──

  describe('GET /api/v1/content', () => {
    it('returns paginated results with all items', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data).toHaveLength(5);
      expect(body.pagination).toBeDefined();
      expect(body.pagination.page).toBe(1);
      expect(body.pagination.pageSize).toBe(20);
      expect(body.pagination.totalItems).toBe(5);
      expect(body.pagination.totalPages).toBe(1);
    });

    it('respects page and pageSize parameters', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?page=1&pageSize=2',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(2);
      expect(body.pagination.page).toBe(1);
      expect(body.pagination.pageSize).toBe(2);
      expect(body.pagination.totalItems).toBe(5);
      expect(body.pagination.totalPages).toBe(3);
    });
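    // Illustrative only, a hypothetical extra case that is not part of the
    // original suite: the pagination fields asserted above follow plain
    // ceiling division, totalPages = ceil(totalItems / pageSize).
    it('pagination math sketch: 5 items at pageSize 2 give 3 pages', () => {
      const totalItems = 5;
      const pageSize = 2;
      expect(Math.ceil(totalItems / pageSize)).toBe(3);
      // page 2 starts at offset (page - 1) * pageSize = 2
      expect((2 - 1) * pageSize).toBe(2);
    });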
    it('returns second page correctly', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?page=2&pageSize=2',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(2);
      expect(body.pagination.page).toBe(2);
    });

    it('returns empty data for page beyond range', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?page=100&pageSize=20',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(0);
      expect(body.pagination.totalItems).toBe(5);
    });

    it('filters by status', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?status=downloaded',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(1);
      expect(body.data[0].status).toBe('downloaded');
      expect(body.data[0].title).toBe('Alpha Video Two');
      expect(body.pagination.totalItems).toBe(1);
    });

    it('filters by contentType', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?contentType=audio',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(2);
      expect(body.data.every((item: { contentType: string }) => item.contentType === 'audio')).toBe(true);
      expect(body.pagination.totalItems).toBe(2);
    });

    it('filters by channelId', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/content?channelId=${channelA.id}`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(3);
      expect(body.data.every((item: { channelId: number }) => item.channelId === channelA.id)).toBe(true);
      expect(body.pagination.totalItems).toBe(3);
    });

    it('searches by title substring', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content?search=Livestream',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(1);
      expect(body.data[0].title).toContain('Livestream');
      expect(body.pagination.totalItems).toBe(1);
    });

    it('combines multiple filters', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/content?channelId=${channelB.id}&status=failed`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(1);
      expect(body.data[0].title).toBe('Beta Audio Mix');
      expect(body.data[0].status).toBe('failed');
      expect(body.pagination.totalItems).toBe(1);
    });

    it('returns 401 without API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/content',
      });

      expect(res.statusCode).toBe(401);
    });
  });

  // ── GET /api/v1/channel/:id/content ──

  describe('GET /api/v1/channel/:id/content', () => {
    it('returns content for a specific channel', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/channel/${channelA.id}/content`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data).toHaveLength(3);
      expect(body.data.every((item: { channelId: number }) => item.channelId === channelA.id)).toBe(true);
    });

    it('returns empty array for channel with no content', async () => {
      const noContentChannel = await createChannel(db, {
        name: 'Empty Channel',
        platform: 'youtube',
        platformId: 'UC_empty',
        url: 'https://www.youtube.com/channel/UC_empty',
        monitoringEnabled: true,
        checkInterval: 360,
        imageUrl: null,
        metadata: null,
        formatProfileId: null,
      });

      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/channel/${noContentChannel.id}/content`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data).toHaveLength(0);
    });

    it('returns 400 for invalid channel ID', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel/notanumber/content',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(400);
      const body = res.json();
      expect(body.error).toBe('Bad Request');
    });

    it('returns 401 without API key', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/channel/${channelA.id}/content`,
      });

      expect(res.statusCode).toBe(401);
    });
  });
});
229 src/__tests__/cookie-manager.test.ts Normal file
@@ -0,0 +1,229 @@
import { describe, it, expect, afterEach } from 'vitest';
import { mkdtempSync, rmSync, existsSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { CookieManager } from '../services/cookie-manager';
import { Platform } from '../types/index';

const VALID_COOKIE_CONTENT = `# Netscape HTTP Cookie File
# https://curl.se/docs/http-cookies.html
.youtube.com\tTRUE\t/\tTRUE\t0\tSID\tabc123
.youtube.com\tTRUE\t/\tTRUE\t0\tHSID\txyz789
`;

const ALT_VALID_HEADER = `# HTTP Cookie File
.soundcloud.com\tTRUE\t/\tFALSE\t0\tsc_token\tdef456
`;
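// Illustrative only, a hypothetical helper that is not part of CookieManager:
// the fixtures above use the Netscape format, seven tab-separated fields per
// line: domain, includeSubdomains, path, secure, expires, name, value.
function parseNetscapeCookieLine(line: string): { domain: string; name: string; value: string } {
  const [domain, , , , , name, value] = line.split('\t');
  return { domain, name, value };
}
// parseNetscapeCookieLine('.youtube.com\tTRUE\t/\tTRUE\t0\tSID\tabc123')
//   yields { domain: '.youtube.com', name: 'SID', value: 'abc123' }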
let tmpDir: string;
let sourceDir: string;

function makeTmpDirs(): { cookiePath: string; sourcePath: string } {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-cm-test-'));
  sourceDir = mkdtempSync(join(tmpdir(), 'tubearr-cm-source-'));
  const cookiePath = join(tmpDir, 'cookies');
  return { cookiePath, sourcePath: sourceDir };
}

afterEach(() => {
  if (tmpDir && existsSync(tmpDir)) {
    rmSync(tmpDir, { recursive: true, force: true });
  }
  if (sourceDir && existsSync(sourceDir)) {
    rmSync(sourceDir, { recursive: true, force: true });
  }
});

describe('CookieManager', () => {
  describe('importCookieFile', () => {
    it('imports a valid Netscape cookie file to the expected path', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, VALID_COOKIE_CONTENT);

      await cm.importCookieFile(Platform.YouTube, sourceFile);

      const expectedPath = join(cookiePath, 'youtube_cookies.txt');
      expect(existsSync(expectedPath)).toBe(true);
    });

    it('accepts the alternative "# HTTP Cookie File" header', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, ALT_VALID_HEADER);

      await cm.importCookieFile(Platform.SoundCloud, sourceFile);

      const expectedPath = join(cookiePath, 'soundcloud_cookies.txt');
      expect(existsSync(expectedPath)).toBe(true);
    });

    it('throws on file without Netscape header', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'bad.txt');
      writeFileSync(sourceFile, 'not a cookie file\njust random text\n');

      await expect(
        cm.importCookieFile(Platform.YouTube, sourceFile)
      ).rejects.toThrow('Invalid cookie file format');
    });

    it('throws on empty cookie file', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'empty.txt');
      writeFileSync(sourceFile, '');

      await expect(
        cm.importCookieFile(Platform.YouTube, sourceFile)
      ).rejects.toThrow('Cookie file is empty');
    });

    it('throws when source file does not exist', async () => {
      const { cookiePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      await expect(
        cm.importCookieFile(Platform.YouTube, '/nonexistent/path.txt')
      ).rejects.toThrow('Source cookie file not found');
    });

    it('creates cookie directory if it does not exist', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const nestedCookiePath = join(cookiePath, 'nested', 'deep');
      const cm = new CookieManager(nestedCookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, VALID_COOKIE_CONTENT);

      await cm.importCookieFile(Platform.YouTube, sourceFile);

      expect(existsSync(nestedCookiePath)).toBe(true);
    });

    it('accepts files with leading blank lines before header', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, '\n\n# Netscape HTTP Cookie File\n.example.com\tTRUE\t/\tFALSE\t0\ttest\tval\n');

      await cm.importCookieFile(Platform.YouTube, sourceFile);
      expect(existsSync(join(cookiePath, 'youtube_cookies.txt'))).toBe(true);
    });
  });

  describe('hasCookies', () => {
    it('returns false when no cookie file exists', () => {
      const { cookiePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      expect(cm.hasCookies(Platform.YouTube)).toBe(false);
    });

    it('returns true after importing a cookie file', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, VALID_COOKIE_CONTENT);
      await cm.importCookieFile(Platform.YouTube, sourceFile);

      expect(cm.hasCookies(Platform.YouTube)).toBe(true);
    });
  });

  describe('getCookieFilePath', () => {
    it('returns null when no cookie file exists', () => {
      const { cookiePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      expect(cm.getCookieFilePath(Platform.YouTube)).toBeNull();
    });

    it('returns path when cookie file exists', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, VALID_COOKIE_CONTENT);
      await cm.importCookieFile(Platform.YouTube, sourceFile);

      const result = cm.getCookieFilePath(Platform.YouTube);
      expect(result).toBe(join(cookiePath, 'youtube_cookies.txt'));
    });
  });

  describe('deleteCookieFile', () => {
    it('removes the cookie file for a platform', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const sourceFile = join(sourcePath, 'cookies.txt');
      writeFileSync(sourceFile, VALID_COOKIE_CONTENT);
      await cm.importCookieFile(Platform.YouTube, sourceFile);

      expect(cm.hasCookies(Platform.YouTube)).toBe(true);
      await cm.deleteCookieFile(Platform.YouTube);
      expect(cm.hasCookies(Platform.YouTube)).toBe(false);
    });

    it('does not throw when cookie file does not exist', async () => {
      const { cookiePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      // Should not throw
      await cm.deleteCookieFile(Platform.YouTube);
    });
  });

  describe('multiple platforms', () => {
    it('stores independent cookie files per platform', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const ytSource = join(sourcePath, 'yt.txt');
      const scSource = join(sourcePath, 'sc.txt');
      writeFileSync(ytSource, VALID_COOKIE_CONTENT);
      writeFileSync(scSource, ALT_VALID_HEADER);

      await cm.importCookieFile(Platform.YouTube, ytSource);
      await cm.importCookieFile(Platform.SoundCloud, scSource);

      expect(cm.hasCookies(Platform.YouTube)).toBe(true);
      expect(cm.hasCookies(Platform.SoundCloud)).toBe(true);

      // Deleting one doesn't affect the other
      await cm.deleteCookieFile(Platform.YouTube);
      expect(cm.hasCookies(Platform.YouTube)).toBe(false);
      expect(cm.hasCookies(Platform.SoundCloud)).toBe(true);
    });

    it('getCookieFilePath returns different paths per platform', async () => {
      const { cookiePath, sourcePath } = makeTmpDirs();
      const cm = new CookieManager(cookiePath);

      const ytSource = join(sourcePath, 'yt.txt');
      const scSource = join(sourcePath, 'sc.txt');
      writeFileSync(ytSource, VALID_COOKIE_CONTENT);
      writeFileSync(scSource, ALT_VALID_HEADER);

      await cm.importCookieFile(Platform.YouTube, ytSource);
      await cm.importCookieFile(Platform.SoundCloud, scSource);

      const ytPath = cm.getCookieFilePath(Platform.YouTube);
      const scPath = cm.getCookieFilePath(Platform.SoundCloud);

      expect(ytPath).not.toBe(scPath);
      expect(ytPath).toContain('youtube_cookies.txt');
      expect(scPath).toContain('soundcloud_cookies.txt');
    });
  });
});
157 src/__tests__/database.test.ts Normal file
@@ -0,0 +1,157 @@
import { describe, it, expect, afterEach } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { initDatabaseAsync, closeDatabase, getRawClient } from '../db/index';
import { runMigrations } from '../db/migrate';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';

/**
 * Integration tests for the database layer.
 *
 * Each test creates a temporary directory + database file, initializes
 * the connection, runs migrations, then tears down. The database module
 * uses a module-level singleton, so closeDatabase() must be called in
 * afterEach to reset state for the next test.
 */

const EXPECTED_TABLES = [
  'system_config',
  'channels',
  'content_items',
  'format_profiles',
  'queue_items',
  'download_history',
  'notification_settings',
  'platform_settings',
  'playlists',
  'content_playlist',
];

let tmpDir: string;

function freshDbPath(): string {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-test-'));
  return join(tmpDir, 'test.db');
}

function cleanup(): void {
  closeDatabase();
  // On Windows, SQLite WAL/SHM files may still be locked briefly after
  // closeDatabase(). Use try/catch to avoid EPERM failures in cleanup —
  // the OS temp directory is cleaned automatically.
  try {
    if (tmpDir && existsSync(tmpDir)) {
      rmSync(tmpDir, { recursive: true, force: true });
    }
  } catch {
    // Temp dir cleanup is best-effort on Windows
  }
}
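// Hypothetical alternative, not the code above: Node's rmSync also accepts
// built-in retry options, which can reduce (though not eliminate) EPERM
// failures from briefly-locked WAL/SHM files on Windows:
//
//   rmSync(tmpDir, { recursive: true, force: true, maxRetries: 5, retryDelay: 100 });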
describe('Database initialization', () => {
  afterEach(cleanup);

  it('creates the database file when initializing with a new path', async () => {
    const dbPath = freshDbPath();
    expect(existsSync(dbPath)).toBe(false);

    await initDatabaseAsync(dbPath);

    // libsql creates the file on first connection
    expect(existsSync(dbPath)).toBe(true);
  });

  it('enables WAL journal mode', async () => {
    const dbPath = freshDbPath();
    await initDatabaseAsync(dbPath);

    const client = getRawClient();
    const result = await client.execute('PRAGMA journal_mode');
    const mode = result.rows[0]?.journal_mode;

    expect(mode).toBe('wal');
  });
});

describe('Database migrations', () => {
  afterEach(cleanup);

  it('creates all expected tables after migration', async () => {
    const dbPath = freshDbPath();
    await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const client = getRawClient();
    const result = await client.execute(
      "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    );
    const tableNames = result.rows.map((r) => r.name as string);

    for (const table of EXPECTED_TABLES) {
      expect(tableNames, `expected table "${table}" to exist`).toContain(table);
    }
  });

  it('is idempotent — running migrations twice does not error', async () => {
    const dbPath = freshDbPath();
    await initDatabaseAsync(dbPath);

    // First run
    await runMigrations(dbPath);
    // Second run — should not throw
    await runMigrations(dbPath);

    // Verify tables still exist
    const client = getRawClient();
    const result = await client.execute(
      "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    );
    const tableNames = result.rows.map((r) => r.name as string);
    expect(tableNames).toContain('system_config');
  });
});

describe('System config CRUD', () => {
  afterEach(cleanup);

  it('supports insert and read of key/value pairs', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    // Insert a key-value pair
    await db.insert(systemConfig).values({
      key: 'test_key',
      value: 'test_value',
    });

    // Read it back
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'test_key'))
      .limit(1);

    expect(rows).toHaveLength(1);
    expect(rows[0].key).toBe('test_key');
    expect(rows[0].value).toBe('test_value');
    expect(rows[0].createdAt).toBeTruthy();
    expect(rows[0].updatedAt).toBeTruthy();
  });

  it('returns empty array for non-existent keys', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'nonexistent'))
      .limit(1);

    expect(rows).toHaveLength(0);
  });
});
227
src/__tests__/download-api.test.ts
Normal file
@@ -0,0 +1,227 @@
import { describe, it, expect, beforeAll, afterAll, vi } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';
import { createChannel } from '../db/repositories/channel-repository';
import { createContentItem, updateContentItem } from '../db/repositories/content-repository';
import { QueueService } from '../services/queue';
import type { DownloadService } from '../services/download';
import type { ContentItem, Channel } from '../types/index';

/**
 * Integration tests for the download trigger API endpoint.
 *
 * The download route now enqueues via QueueService instead of calling
 * DownloadService directly. It returns 202 Accepted with the queue item.
 */

describe('Download API', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;
  let testChannel: Channel;
  let queueService: QueueService;
  let mockDownloadService: {
    downloadItem: ReturnType<typeof vi.fn>;
  };

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-dl-api-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });

    // Create mock download service and queue service
    mockDownloadService = {
      downloadItem: vi.fn().mockResolvedValue(undefined),
    };
    queueService = new QueueService(
      db,
      mockDownloadService as unknown as DownloadService,
      2
    );
    // Stop auto-processing so tests stay deterministic
    queueService.stop();

    (server as { downloadService: DownloadService | null }).downloadService =
      mockDownloadService as unknown as DownloadService;
    (server as { queueService: QueueService | null }).queueService = queueService;

    await server.ready();

    // Read API key
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();

    // Create a test channel
    testChannel = await createChannel(db, {
      name: 'Download Test Channel',
      platform: 'youtube',
      platformId: 'UC_dl_test',
      url: 'https://www.youtube.com/channel/UC_dl_test',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });
  });

  afterAll(async () => {
    queueService.stop();
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows
    }
  });

  // ── Helpers ──

  function authed(opts: Record<string, unknown>) {
    return {
      ...opts,
      headers: { 'x-api-key': apiKey, ...(opts.headers as Record<string, string> | undefined) },
    };
  }

  let contentCounter = 0;
  async function createTestContentItem(
    overrides: { status?: string; platformContentId?: string } = {}
  ): Promise<ContentItem> {
    contentCounter++;
    const item = await createContentItem(db, {
      channelId: testChannel.id,
      title: 'Test Download Video',
      platformContentId: overrides.platformContentId ?? `vid_dl_${Date.now()}_${contentCounter}`,
      url: 'https://www.youtube.com/watch?v=test123',
      contentType: 'video',
      duration: 300,
      status: (overrides.status ?? 'monitored') as 'monitored',
    });
    return item!;
  }

  // ── Auth gating ──

  describe('Authentication', () => {
    it('returns 401 when no API key is provided', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/download/1',
      });
      expect(res.statusCode).toBe(401);
    });
  });

  // ── 404 handling ──

  describe('Not found', () => {
    it('returns 404 for non-existent content item', async () => {
      const res = await server.inject(
        authed({ method: 'POST', url: '/api/v1/download/99999' })
      );
      expect(res.statusCode).toBe(404);
      expect(res.json().message).toContain('99999');
    });

    it('returns 400 for non-numeric content item ID', async () => {
      const res = await server.inject(
        authed({ method: 'POST', url: '/api/v1/download/abc' })
      );
      expect(res.statusCode).toBe(400);
    });
  });

  // ── 409 Conflict ──

  describe('Conflict handling', () => {
    it('returns 409 when content item is already downloading', async () => {
      const item = await createTestContentItem();
      await updateContentItem(db, item.id, { status: 'downloading' });

      const res = await server.inject(
        authed({ method: 'POST', url: `/api/v1/download/${item.id}` })
      );
      expect(res.statusCode).toBe(409);
      expect(res.json().message).toContain('downloading');
    });

    it('returns 409 when content item is already downloaded', async () => {
      const item = await createTestContentItem();
      await updateContentItem(db, item.id, { status: 'downloaded' });

      const res = await server.inject(
        authed({ method: 'POST', url: `/api/v1/download/${item.id}` })
      );
      expect(res.statusCode).toBe(409);
      expect(res.json().message).toContain('downloaded');
    });

    it('returns 409 when content item is already queued', async () => {
      const item = await createTestContentItem();
      // Enqueue once
      await queueService.enqueue(item.id);

      // Try to enqueue again via the download endpoint
      const res = await server.inject(
        authed({ method: 'POST', url: `/api/v1/download/${item.id}` })
      );
      expect(res.statusCode).toBe(409);
      expect(res.json().message).toContain('already in the queue');
    });
  });

  // ── Successful enqueue ──

  describe('Successful enqueue via download endpoint', () => {
    it('returns 202 Accepted with queue item', async () => {
      const item = await createTestContentItem();

      const res = await server.inject(
        authed({ method: 'POST', url: `/api/v1/download/${item.id}` })
      );

      expect(res.statusCode).toBe(202);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.contentItemId).toBe(item.id);
      expect(body.data.status).toBe('pending');
      expect(body.data.id).toBeDefined();
    });

    it('re-allows enqueue of failed items', async () => {
      const item = await createTestContentItem();
      await updateContentItem(db, item.id, { status: 'failed' });

      const res = await server.inject(
        authed({ method: 'POST', url: `/api/v1/download/${item.id}` })
      );

      // Failed items can be re-enqueued
      expect(res.statusCode).toBe(202);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.status).toBe('pending');
    });
  });
});
764
src/__tests__/download.test.ts
Normal file
@@ -0,0 +1,764 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { mkdtempSync, rmSync, existsSync, writeFileSync, mkdirSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { createChannel } from '../db/repositories/channel-repository';
import {
  createContentItem,
  getContentItemById,
} from '../db/repositories/content-repository';
import { DownloadService } from '../services/download';
import { QualityAnalyzer } from '../services/quality-analyzer';
import { FileOrganizer } from '../services/file-organizer';
import { CookieManager } from '../services/cookie-manager';
import { RateLimiter } from '../services/rate-limiter';
import { YtDlpError } from '../sources/yt-dlp';
import type { ContentItem, Channel, FormatProfile, QualityInfo } from '../types/index';

// ── Mocks ──

// Mock execYtDlp from yt-dlp module
const execYtDlpMock = vi.fn();
vi.mock('../sources/yt-dlp', async (importOriginal) => {
  const actual = await importOriginal() as Record<string, unknown>;
  return {
    ...actual,
    execYtDlp: (...args: unknown[]) => execYtDlpMock(...args),
  };
});

// Mock fs.stat for file size
const statMock = vi.fn();
vi.mock('node:fs/promises', async (importOriginal) => {
  const actual = await importOriginal() as Record<string, unknown>;
  return {
    ...actual,
    stat: (...args: unknown[]) => statMock(...args),
  };
});

// ── Test Helpers ──

let tmpDir: string;
let db: Awaited<ReturnType<typeof initDatabaseAsync>>;
let testChannel: Channel;
let testContentItem: ContentItem;

async function setupDb(): Promise<void> {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-dl-test-'));
  const dbPath = join(tmpDir, 'test.db');
  db = await initDatabaseAsync(dbPath);
  await runMigrations(dbPath);

  // Create a test channel
  testChannel = await createChannel(db, {
    name: 'Test Channel',
    platform: 'youtube',
    platformId: 'UC_test123',
    url: 'https://www.youtube.com/channel/UC_test123',
    imageUrl: null,
    formatProfileId: null,
    monitoringEnabled: true,
    checkInterval: 360,
    metadata: null,
  });

  // Create a test content item in 'monitored' status
  testContentItem = (await createContentItem(db, {
    channelId: testChannel.id,
    title: 'Test Video Title',
    platformContentId: 'vid_abc123',
    url: 'https://www.youtube.com/watch?v=abc123',
    contentType: 'video',
    duration: 600,
    status: 'monitored',
  }))!;
}

function cleanup(): void {
  closeDatabase();
  try {
    if (tmpDir && existsSync(tmpDir)) {
      rmSync(tmpDir, { recursive: true, force: true });
    }
  } catch {
    // Windows cleanup best-effort
  }
}

function createMockDeps() {
  const mediaPath = join(tmpDir, 'media');
  const cookiePath = join(tmpDir, 'cookies');
  mkdirSync(mediaPath, { recursive: true });
  mkdirSync(cookiePath, { recursive: true });

  const rateLimiter = new RateLimiter({
    youtube: { minIntervalMs: 0 },
    soundcloud: { minIntervalMs: 0 },
  });
  const fileOrganizer = new FileOrganizer(mediaPath);
  const qualityAnalyzer = new QualityAnalyzer();
  const cookieManager = new CookieManager(cookiePath);

  // Spy on rate limiter methods
  vi.spyOn(rateLimiter, 'acquire');
  vi.spyOn(rateLimiter, 'reportSuccess');
  vi.spyOn(rateLimiter, 'reportError');

  // Spy on quality analyzer
  vi.spyOn(qualityAnalyzer, 'analyze').mockResolvedValue({
    actualResolution: '1920x1080',
    actualCodec: 'h264',
    actualBitrate: '5.0 Mbps',
    containerFormat: 'mp4',
    qualityWarnings: [],
  });

  return { rateLimiter, fileOrganizer, qualityAnalyzer, cookieManager };
}

// ── Tests ──

describe('DownloadService', () => {
  beforeEach(async () => {
    vi.clearAllMocks();
    await setupDb();
  });

  afterEach(cleanup);

  describe('downloadItem — successful download', () => {
    it('transitions content item from monitored → downloading → downloaded', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      // Mock yt-dlp returning a filepath on stdout
      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'fake video data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });

      statMock.mockResolvedValueOnce({ size: 15_000_000 });

      const result = await service.downloadItem(testContentItem, testChannel);

      expect(result.status).toBe('downloaded');
      expect(result.filePath).toBe(outputPath);
      expect(result.fileSize).toBe(15_000_000);
      expect(result.format).toBe('mp4');
      expect(result.qualityMetadata).toBeDefined();
      expect(result.qualityMetadata?.actualResolution).toBe('1920x1080');

      // Verify DB state
      const dbItem = await getContentItemById(db, testContentItem.id);
      expect(dbItem?.status).toBe('downloaded');
      expect(dbItem?.filePath).toBe(outputPath);
    });

    it('populates filePath, fileSize, format, and qualityMetadata', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.webm');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'fake data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });

      statMock.mockResolvedValueOnce({ size: 8_500_000 });

      const qualityInfo: QualityInfo = {
        actualResolution: '1280x720',
        actualCodec: 'vp9',
        actualBitrate: '2.5 Mbps',
        containerFormat: 'webm',
        qualityWarnings: [],
      };
      (deps.qualityAnalyzer.analyze as ReturnType<typeof vi.fn>).mockResolvedValueOnce(qualityInfo);

      const result = await service.downloadItem(testContentItem, testChannel);

      expect(result.filePath).toBe(outputPath);
      expect(result.fileSize).toBe(8_500_000);
      expect(result.format).toBe('webm');
      expect(result.qualityMetadata).toEqual(qualityInfo);
    });

    it('sets downloadedAt on successful download', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'fake data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });

      statMock.mockResolvedValueOnce({ size: 10_000_000 });

      const before = new Date().toISOString();
      const result = await service.downloadItem(testContentItem, testChannel);
      const after = new Date().toISOString();

      // downloadedAt should be set to a valid ISO datetime
      expect(result.downloadedAt).toBeTruthy();
      expect(result.downloadedAt).toMatch(/^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/);
      // Should be between before and after timestamps
      expect(result.downloadedAt! >= before).toBe(true);
      expect(result.downloadedAt! <= after).toBe(true);

      // Verify in DB as well
      const dbItem = await getContentItemById(db, testContentItem.id);
      expect(dbItem?.downloadedAt).toBe(result.downloadedAt);
    });
  });

  describe('downloadItem — failed download', () => {
    it('transitions content item to failed on yt-dlp error', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      execYtDlpMock.mockRejectedValueOnce(
        new YtDlpError('yt-dlp exited with code 1: ERROR: Video not found', 'ERROR: Video not found', 1)
      );

      await expect(
        service.downloadItem(testContentItem, testChannel)
      ).rejects.toThrow(YtDlpError);

      // Verify status is 'failed' in DB
      const dbItem = await getContentItemById(db, testContentItem.id);
      expect(dbItem?.status).toBe('failed');
    });
  });

  describe('downloadItem — rate limiter integration', () => {
    it('calls acquire() before download and reportSuccess() after', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      expect(deps.rateLimiter.acquire).toHaveBeenCalledWith('youtube');
      expect(deps.rateLimiter.reportSuccess).toHaveBeenCalledWith('youtube');
      expect(deps.rateLimiter.reportError).not.toHaveBeenCalled();
    });

    it('calls reportError() on download failure', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      execYtDlpMock.mockRejectedValueOnce(
        new YtDlpError('download error', 'stderr content', 1)
      );

      await expect(
        service.downloadItem(testContentItem, testChannel)
      ).rejects.toThrow();

      expect(deps.rateLimiter.acquire).toHaveBeenCalledWith('youtube');
      expect(deps.rateLimiter.reportError).toHaveBeenCalledWith('youtube');
      expect(deps.rateLimiter.reportSuccess).not.toHaveBeenCalled();
    });
  });

  describe('downloadItem — format profile', () => {
    it('applies video resolution format profile with correct -f flag', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mkv');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      const profile: FormatProfile = {
        id: 1,
        name: 'High Quality',
        videoResolution: '1080p',
        audioCodec: null,
        audioBitrate: null,
        containerFormat: 'mkv',
        isDefault: false,
        subtitleLanguages: null,
        embedSubtitles: false,
        createdAt: '',
        updatedAt: '',
      };

      await service.downloadItem(testContentItem, testChannel, profile);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('-f');
      const fIdx = args.indexOf('-f');
      expect(args[fIdx + 1]).toBe('bestvideo[height<=1080]+bestaudio/best[height<=1080]');
      expect(args).toContain('--merge-output-format');
      const moIdx = args.indexOf('--merge-output-format');
      expect(args[moIdx + 1]).toBe('mkv');
    });

    it('applies audio codec/bitrate format profile for audio content', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      // Create an audio content item
      const audioItem = (await createContentItem(db, {
        channelId: testChannel.id,
        title: 'Test Audio Track',
        platformContentId: 'audio_xyz',
        url: 'https://soundcloud.com/test/track',
        contentType: 'audio',
        duration: 300,
        status: 'monitored',
      }))!;

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Audio Track.opus');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 5_000_000 });

      const profile: FormatProfile = {
        id: 2,
        name: 'Audio HQ',
        videoResolution: null,
        audioCodec: 'opus',
        audioBitrate: '320k',
        containerFormat: null,
        isDefault: false,
        subtitleLanguages: null,
        embedSubtitles: false,
        createdAt: '',
        updatedAt: '',
      };

      await service.downloadItem(audioItem, testChannel, profile);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('-f');
      const fIdx = args.indexOf('-f');
      expect(args[fIdx + 1]).toBe('bestaudio');
      expect(args).toContain('--extract-audio');
      expect(args).toContain('--audio-format');
      const afIdx = args.indexOf('--audio-format');
      expect(args[afIdx + 1]).toBe('opus');
      expect(args).toContain('--audio-quality');
      const aqIdx = args.indexOf('--audio-quality');
      expect(args[aqIdx + 1]).toBe('320k');
    });

    it('falls back to -f "best" for video when no format profile', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('-f');
      const fIdx = args.indexOf('-f');
      expect(args[fIdx + 1]).toBe('best');
    });

    it('falls back to -f "bestaudio" for audio when no format profile', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const audioItem = (await createContentItem(db, {
        channelId: testChannel.id,
        title: 'No Profile Audio',
        platformContentId: 'audio_np',
        url: 'https://soundcloud.com/test/no-profile',
        contentType: 'audio',
        duration: 200,
        status: 'monitored',
      }))!;

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'No Profile Audio.mp3');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 3_000_000 });

      await service.downloadItem(audioItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('-f');
      const fIdx = args.indexOf('-f');
      expect(args[fIdx + 1]).toBe('bestaudio');
      // No --extract-audio when no profile
      expect(args).not.toContain('--extract-audio');
    });
  });

  describe('downloadItem — cookie support', () => {
    it('includes --cookies flag when cookies exist for platform', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      // Import a cookie file for youtube
      const cookieSource = join(tmpDir, 'source_cookies.txt');
      writeFileSync(cookieSource, '# Netscape HTTP Cookie File\n.youtube.com\tTRUE\t/\tFALSE\t0\tSID\tabc123\n');
      await deps.cookieManager.importCookieFile('youtube', cookieSource);

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('--cookies');
      const cookieIdx = args.indexOf('--cookies');
      expect(args[cookieIdx + 1]).toContain('youtube_cookies.txt');
    });

    it('does not include --cookies flag when no cookies exist', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).not.toContain('--cookies');
    });
  });

  describe('downloadItem — timeout', () => {
    it('uses 30-minute timeout for yt-dlp download calls', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      // Check that execYtDlp was called with 30-minute timeout
      expect(execYtDlpMock).toHaveBeenCalledWith(
        expect.any(Array),
        { timeout: 1_800_000 }
      );
    });
  });

  describe('downloadItem — common args', () => {
    it('always includes --no-playlist and --print after_move:filepath', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('--no-playlist');
      expect(args).toContain('--print');
      const printIdx = args.indexOf('--print');
      expect(args[printIdx + 1]).toBe('after_move:filepath');
    });

    it('includes -o output template and URL', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      await service.downloadItem(testContentItem, testChannel);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      expect(args).toContain('-o');
      // URL should be the last arg
      expect(args[args.length - 1]).toBe('https://www.youtube.com/watch?v=abc123');
    });
  });

  describe('Best format option', () => {
    it('emits bestvideo+bestaudio/best for video with videoResolution "Best"', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mp4');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      const profile: FormatProfile = {
        id: 10,
        name: 'Best Quality',
        videoResolution: 'Best',
        audioCodec: null,
        audioBitrate: null,
        containerFormat: null,
        isDefault: false,
        subtitleLanguages: null,
        embedSubtitles: false,
        createdAt: '',
        updatedAt: '',
      };

      await service.downloadItem(testContentItem, testChannel, profile);

      const args = execYtDlpMock.mock.calls[0][0] as string[];
      const fIdx = args.indexOf('-f');
      expect(fIdx).toBeGreaterThanOrEqual(0);
      expect(args[fIdx + 1]).toBe('bestvideo+bestaudio/best');
      // Should default to mp4 merge format when containerFormat is null
      expect(args).toContain('--merge-output-format');
      const moIdx = args.indexOf('--merge-output-format');
      expect(args[moIdx + 1]).toBe('mp4');
    });

    it('uses specified container format with "Best" resolution', async () => {
      const deps = createMockDeps();
      const service = new DownloadService(
        db, deps.rateLimiter, deps.fileOrganizer,
        deps.qualityAnalyzer, deps.cookieManager
      );

      const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Test Video Title.mkv');
      mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
      writeFileSync(outputPath, 'data');

      execYtDlpMock.mockResolvedValueOnce({
        stdout: outputPath,
        stderr: '',
        exitCode: 0,
      });
      statMock.mockResolvedValueOnce({ size: 1000 });

      const profile: FormatProfile = {
        id: 11,
        name: 'Best MKV',
        videoResolution: 'Best',
        audioCodec: null,
        audioBitrate: null,
        containerFormat: 'mkv',
|
||||
isDefault: false,
|
||||
subtitleLanguages: null,
|
||||
embedSubtitles: false,
|
||||
createdAt: '',
|
||||
updatedAt: '',
|
||||
};
|
||||
|
||||
await service.downloadItem(testContentItem, testChannel, profile);
|
||||
|
||||
const args = execYtDlpMock.mock.calls[0][0] as string[];
|
||||
const fIdx = args.indexOf('-f');
|
||||
expect(args[fIdx + 1]).toBe('bestvideo+bestaudio/best');
|
||||
const moIdx = args.indexOf('--merge-output-format');
|
||||
expect(args[moIdx + 1]).toBe('mkv');
|
||||
});
|
||||
|
||||
it('omits --audio-quality when audioBitrate is "Best"', async () => {
|
||||
const deps = createMockDeps();
|
||||
const service = new DownloadService(
|
||||
db, deps.rateLimiter, deps.fileOrganizer,
|
||||
deps.qualityAnalyzer, deps.cookieManager
|
||||
);
|
||||
|
||||
// Create an audio content item
|
||||
const audioItem = (await createContentItem(db, {
|
||||
channelId: testChannel.id,
|
||||
title: 'Best Audio Track',
|
||||
platformContentId: 'audio_best',
|
||||
url: 'https://soundcloud.com/test/best-track',
|
||||
contentType: 'audio',
|
||||
duration: 240,
|
||||
status: 'monitored',
|
||||
}))!;
|
||||
|
||||
const outputPath = join(tmpDir, 'media', 'youtube', 'Test Channel', 'Best Audio Track.mp3');
|
||||
mkdirSync(join(tmpDir, 'media', 'youtube', 'Test Channel'), { recursive: true });
|
||||
writeFileSync(outputPath, 'data');
|
||||
|
||||
execYtDlpMock.mockResolvedValueOnce({
|
||||
stdout: outputPath,
|
||||
stderr: '',
|
||||
exitCode: 0,
|
||||
});
|
||||
statMock.mockResolvedValueOnce({ size: 2_000_000 });
|
||||
|
||||
const profile: FormatProfile = {
|
||||
id: 12,
|
||||
name: 'Best Audio',
|
||||
videoResolution: null,
|
||||
audioCodec: 'mp3',
|
||||
audioBitrate: 'Best',
|
||||
containerFormat: null,
|
||||
isDefault: false,
|
||||
subtitleLanguages: null,
|
||||
embedSubtitles: false,
|
||||
createdAt: '',
|
||||
updatedAt: '',
|
||||
};
|
||||
|
||||
await service.downloadItem(audioItem, testChannel, profile);
|
||||
|
||||
const args = execYtDlpMock.mock.calls[0][0] as string[];
|
||||
|
||||
// Should have -f bestaudio
|
||||
const fIdx = args.indexOf('-f');
|
||||
expect(args[fIdx + 1]).toBe('bestaudio');
|
||||
|
||||
// Should have --extract-audio and --audio-format
|
||||
expect(args).toContain('--extract-audio');
|
||||
expect(args).toContain('--audio-format');
|
||||
const afIdx = args.indexOf('--audio-format');
|
||||
expect(args[afIdx + 1]).toBe('mp3');
|
||||
|
||||
// Must NOT have --audio-quality when bitrate is "Best"
|
||||
expect(args).not.toContain('--audio-quality');
|
||||
});
|
||||
});
|
||||
});
|
||||
407 src/__tests__/e2e-flow.test.ts Normal file

@@ -0,0 +1,407 @@
import { describe, it, expect, beforeAll, afterAll, vi } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { createChannel } from '../db/repositories/channel-repository';
import { createContentItem } from '../db/repositories/content-repository';
import { updateQueueItemStatus } from '../db/repositories/queue-repository';
import { QueueService } from '../services/queue';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';

/**
 * End-to-end integration test exercising the full application flow:
 * channel CRUD → content listing → download enqueue → queue state →
 * history records → health check → system status.
 *
 * Uses a real SQLite database with migrations and Fastify inject() for
 * fast HTTP testing without binding ports. The download service is mocked
 * so yt-dlp is not required.
 */

describe('End-to-end flow', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;

  // IDs populated during test flow
  let channelId: number;
  let contentItemId: number;
  let queueItemId: number;

  // Mock download service — simulates successful downloads
  const mockDownloadService = {
    downloadItem: vi.fn().mockResolvedValue(undefined),
  };

  beforeAll(async () => {
    // Create isolated temp database
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-e2e-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    // Build server with real database
    server = await buildServer({ db });

    // Attach a QueueService with mock download service so enqueue works
    const queueService = new QueueService(db, mockDownloadService as any, {
      concurrency: 1,
    });
    // Stop auto-processing so we control when downloads run
    queueService.stop();
    server.queueService = queueService;

    await server.ready();

    // Read auto-generated API key from database
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();
  });

  afterAll(async () => {
    server.queueService?.stop();
    await server.close();
    closeDatabase();
    // Windows: SQLite WAL/SHM files may be locked briefly (K004)
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows
    }
  });

  // ── Step 1: Create a channel (via repository — bypasses yt-dlp resolution) ──

  describe('Step 1: Channel creation and retrieval', () => {
    it('creates a channel in the database', async () => {
      const channel = await createChannel(db, {
        name: 'E2E Test Channel',
        platform: 'youtube',
        platformId: 'UC_e2e_test_channel',
        url: 'https://www.youtube.com/channel/UC_e2e_test_channel',
        imageUrl: 'https://example.com/thumb.jpg',
        formatProfileId: null,
        monitoringEnabled: true,
        checkInterval: 360,
        metadata: null,
      });

      expect(channel.id).toBeGreaterThan(0);
      channelId = channel.id;
    });

    it('GET /api/v1/channel/:id returns the channel', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/channel/${channelId}`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.name).toBe('E2E Test Channel');
      expect(body.platform).toBe('youtube');
      expect(body.platformId).toBe('UC_e2e_test_channel');
    });

    it('GET /api/v1/channel lists channels including ours', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const channels = res.json();
      expect(Array.isArray(channels)).toBe(true);
      expect(channels.some((c: { id: number }) => c.id === channelId)).toBe(true);
    });
  });

  // ── Step 2: Create content item (via repository — simulates scheduler detection) ──

  describe('Step 2: Content creation and listing', () => {
    it('creates a content item for the channel', async () => {
      const item = await createContentItem(db, {
        channelId,
        title: 'E2E Test Video — How to Build a Media Server',
        platformContentId: 'e2e_test_video_001',
        url: 'https://www.youtube.com/watch?v=e2e_test_001',
        contentType: 'video',
        duration: 600,
        status: 'monitored',
      });

      expect(item).not.toBeNull();
      contentItemId = item!.id;
    });

    it('GET /api/v1/content?channelId=:id shows the content item', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/content?channelId=${channelId}`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(1);
      expect(body.data.some((c: { id: number }) => c.id === contentItemId)).toBe(true);
    });

    it('GET /api/v1/channel/:id/content returns channel-specific content', async () => {
      const res = await server.inject({
        method: 'GET',
        url: `/api/v1/channel/${channelId}/content`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(1);
      const item = body.data.find((c: { id: number }) => c.id === contentItemId);
      expect(item).toBeDefined();
      expect(item.title).toBe('E2E Test Video — How to Build a Media Server');
    });
  });

  // ── Step 3: Enqueue download and check queue state ──

  describe('Step 3: Download enqueue and queue management', () => {
    it('POST /api/v1/download/:contentItemId enqueues the item', async () => {
      const res = await server.inject({
        method: 'POST',
        url: `/api/v1/download/${contentItemId}`,
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(202);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data).toHaveProperty('id');
      expect(body.data.contentItemId).toBe(contentItemId);
      expect(body.data.status).toBe('pending');
      queueItemId = body.data.id;
    });

    it('GET /api/v1/queue shows the queued item', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/queue',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(1);
      const item = body.data.find((q: { id: number }) => q.id === queueItemId);
      expect(item).toBeDefined();
      expect(item.status).toBe('pending');
    });

    it('GET /api/v1/queue?status=pending filters correctly', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/queue?status=pending',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.every((q: { status: string }) => q.status === 'pending')).toBe(true);
    });

    it('POST /api/v1/download/:contentItemId rejects duplicate enqueue', async () => {
      const res = await server.inject({
        method: 'POST',
        url: `/api/v1/download/${contentItemId}`,
        headers: { 'x-api-key': apiKey },
      });

      // Content status is now 'queued', so this should return 409
      expect(res.statusCode).toBe(409);
    });
  });

  // ── Step 4: Simulate download completion and verify history ──

  describe('Step 4: Download completion and history', () => {
    it('simulating download completion creates history records', async () => {
      // Manually transition the queue item to completed to simulate
      // what the QueueService would do after a successful download
      await updateQueueItemStatus(db, queueItemId, 'completed', {
        completedAt: new Date().toISOString(),
      });

      // Verify queue item is now completed
      const queueRes = await server.inject({
        method: 'GET',
        url: `/api/v1/queue/${queueItemId}`,
        headers: { 'x-api-key': apiKey },
      });

      expect(queueRes.statusCode).toBe(200);
      expect(queueRes.json().data.status).toBe('completed');
    });

    it('GET /api/v1/history shows history events', async () => {
      // The enqueue operation created a 'grabbed' history event
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/history',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(1);

      // At minimum, we should have a 'grabbed' event from enqueue
      const grabbedEvent = body.data.find(
        (e: { eventType: string; contentItemId: number | null }) =>
          e.eventType === 'grabbed' && e.contentItemId === contentItemId
      );
      expect(grabbedEvent).toBeDefined();
    });

    it('GET /api/v1/history?eventType=grabbed filters by event type', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/history?eventType=grabbed',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.every((e: { eventType: string }) => e.eventType === 'grabbed')).toBe(true);
    });

    it('GET /api/v1/activity returns recent activity', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/activity',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(1);
    });
  });

  // ── Step 5: Health and System Status ──

  describe('Step 5: Health and system status', () => {
    it('GET /ping returns ok (unauthenticated)', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/ping',
      });

      expect(res.statusCode).toBe(200);
      expect(res.json()).toEqual({ status: 'ok' });
    });

    it('GET /api/v1/health returns healthy status', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/health',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.status).toBeDefined();
      expect(body.components).toBeDefined();
      expect(Array.isArray(body.components)).toBe(true);

      // Database component should be healthy
      const dbComponent = body.components.find(
        (c: { name: string }) => c.name === 'database'
      );
      expect(dbComponent).toBeDefined();
      expect(dbComponent.status).toBe('healthy');
    });

    it('GET /api/v1/system/status returns system information', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body).toHaveProperty('appName');
      expect(body.appName).toBe('Tubearr');
      expect(body).toHaveProperty('version');
      expect(body).toHaveProperty('uptime');
      expect(body).toHaveProperty('platform');
      expect(body).toHaveProperty('nodeVersion');
      expect(typeof body.uptime).toBe('number');
    });
  });

  // ── Step 6: Error handling and edge cases ──

  describe('Step 6: Error handling', () => {
    it('returns 401 for missing API key on protected routes', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/system/status',
      });

      expect(res.statusCode).toBe(401);
    });

    it('returns 404 for unknown API routes', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/nonexistent-route',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(404);
    });

    it('returns 404 for non-existent channel', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/channel/99999',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(404);
    });

    it('returns 404 for non-existent content item download', async () => {
      const res = await server.inject({
        method: 'POST',
        url: '/api/v1/download/99999',
        headers: { 'x-api-key': apiKey },
      });

      expect(res.statusCode).toBe(404);
    });
  });
});
216 src/__tests__/file-organizer.test.ts Normal file

@@ -0,0 +1,216 @@
import { describe, it, expect, afterEach } from 'vitest';
import { mkdtempSync, rmSync, writeFileSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { FileOrganizer } from '../services/file-organizer';

let tmpDir: string;

function makeTmpDir(): string {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-fo-test-'));
  return tmpDir;
}

afterEach(() => {
  if (tmpDir && existsSync(tmpDir)) {
    rmSync(tmpDir, { recursive: true, force: true });
  }
});

describe('FileOrganizer', () => {
  describe('buildOutputPath', () => {
    it('produces {mediaPath}/{platform}/{channel}/{title}.{ext} paths', () => {
      const mediaPath = join('media', 'downloads');
      const fo = new FileOrganizer(mediaPath);

      const result = fo.buildOutputPath('youtube', 'TechChannel', 'My Video', 'mp4');
      // Use path.join behavior — just verify the segments are present
      expect(result).toContain('youtube');
      expect(result).toContain('TechChannel');
      expect(result).toContain('My Video.mp4');
      expect(result.startsWith(mediaPath)).toBe(true);
    });

    it('sanitizes channelName and title in the path', () => {
      const fo = new FileOrganizer('/media');

      const result = fo.buildOutputPath(
        'youtube',
        'Bad:Channel*Name',
        'Title "With" <Special> Chars',
        'mkv'
      );

      expect(result).not.toContain(':');
      expect(result).not.toContain('*');
      expect(result).not.toContain('"');
      expect(result).not.toContain('<');
      expect(result).not.toContain('>');
      expect(result).toContain('BadChannelName');
      expect(result).toContain('Title With Special Chars.mkv');
    });

    it('handles extension with or without leading dot', () => {
      const fo = new FileOrganizer('/media');

      const withDot = fo.buildOutputPath('youtube', 'Ch', 'Vid', '.mp4');
      const withoutDot = fo.buildOutputPath('youtube', 'Ch', 'Vid', 'mp4');

      // Both should produce the same filename
      expect(withDot).toContain('Vid.mp4');
      expect(withoutDot).toContain('Vid.mp4');
    });
  });

  describe('sanitizeFilename', () => {
    it('strips forbidden characters / \\ : * ? " < > |', () => {
      const fo = new FileOrganizer('/media');

      const result = fo.sanitizeFilename('a/b\\c:d*e?f"g<h>i|j');
      expect(result).toBe('abcdefghij');
    });

    it('handles Unicode characters intact', () => {
      const fo = new FileOrganizer('/media');

      expect(fo.sanitizeFilename('日本語テスト')).toBe('日本語テスト');
      expect(fo.sanitizeFilename('Ünîcödé Fïlé')).toBe('Ünîcödé Fïlé');
      expect(fo.sanitizeFilename('🎵 Music 🎶')).toBe('🎵 Music 🎶');
    });

    it('collapses multiple spaces and underscores', () => {
      const fo = new FileOrganizer('/media');

      expect(fo.sanitizeFilename('too   many   spaces')).toBe('too many spaces');
      expect(fo.sanitizeFilename('too___many___underscores')).toBe('too_many_underscores');
    });

    it('handles empty and dot-only names', () => {
      const fo = new FileOrganizer('/media');

      expect(fo.sanitizeFilename('')).toBe('_unnamed');
      expect(fo.sanitizeFilename('...')).toBe('_unnamed');
      expect(fo.sanitizeFilename(' ')).toBe('_unnamed');
      expect(fo.sanitizeFilename('***')).toBe('_unnamed');
    });

    it('trims leading and trailing dots and spaces', () => {
      const fo = new FileOrganizer('/media');

      expect(fo.sanitizeFilename(' hello ')).toBe('hello');
      expect(fo.sanitizeFilename('..hello..')).toBe('hello');
      expect(fo.sanitizeFilename('. .hello. .')).toBe('hello');
    });

    it('replaces control characters', () => {
      const fo = new FileOrganizer('/media');

      const withControls = 'hello\x00world\x1f!';
      expect(fo.sanitizeFilename(withControls)).toBe('helloworld!');
    });

    it('respects max filename length of 200 characters', () => {
      const fo = new FileOrganizer('/media');

      const longName = 'A'.repeat(300);
      const result = fo.sanitizeFilename(longName);
      expect(result.length).toBeLessThanOrEqual(200);
      expect(result.length).toBe(200);
    });

    it('truncates without breaking multi-byte codepoints', () => {
      const fo = new FileOrganizer('/media');

      // Each emoji is 2 code units but 1 codepoint — use 201 single-codepoint chars
      const emojiName = '🎵'.repeat(201);
      const result = fo.sanitizeFilename(emojiName);
      expect(result.length).toBeLessThanOrEqual(200 * 2); // String.length counts UTF-16 code units
      expect([...result].length).toBeLessThanOrEqual(200); // Spread counts codepoints
    });
  });

  describe('ensureDirectory', () => {
    it('creates parent directories recursively', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'youtube', 'channel', 'video.mp4');
      await fo.ensureDirectory(filePath);

      expect(existsSync(join(base, 'youtube', 'channel'))).toBe(true);
    });

    it('succeeds when directory already exists', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'youtube', 'channel', 'video.mp4');
      await fo.ensureDirectory(filePath);
      // Second call should not throw
      await fo.ensureDirectory(filePath);

      expect(existsSync(join(base, 'youtube', 'channel'))).toBe(true);
    });
  });

  describe('resolveUniquePath', () => {
    it('returns original path when file does not exist', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'nonexistent.mp4');
      const result = await fo.resolveUniquePath(filePath);
      expect(result).toBe(filePath);
    });

    it('appends (2) suffix when original file exists', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'video.mp4');
      writeFileSync(filePath, 'data');

      const result = await fo.resolveUniquePath(filePath);
      expect(result).toBe(join(base, 'video (2).mp4'));
    });

    it('increments suffix until a free name is found', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'video.mp4');
      writeFileSync(filePath, 'data');
      writeFileSync(join(base, 'video (2).mp4'), 'data');
      writeFileSync(join(base, 'video (3).mp4'), 'data');

      const result = await fo.resolveUniquePath(filePath);
      expect(result).toBe(join(base, 'video (4).mp4'));
    });

    it('preserves extension when adding suffix', async () => {
      const base = makeTmpDir();
      const fo = new FileOrganizer(base);

      const filePath = join(base, 'song.flac');
      writeFileSync(filePath, 'data');

      const result = await fo.resolveUniquePath(filePath);
      expect(result).toContain('.flac');
      expect(result).toContain('(2)');
    });
  });

  describe('cross-platform paths', () => {
    it('uses path.join (no hardcoded separators)', () => {
      const fo = new FileOrganizer('/base/media');

      const result = fo.buildOutputPath('soundcloud', 'Artist', 'Track', 'mp3');

      // The path should be well-formed for the current OS
      // On Windows path.join uses \, on Unix it uses /
      // Just verify it doesn't contain double separators
      expect(result).not.toContain('//');
      expect(result).not.toMatch(/\\{2,}/);
    });
  });
});
357 src/__tests__/format-profile-api.test.ts Normal file

@@ -0,0 +1,357 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';

/**
 * Integration tests for format profile CRUD API endpoints.
 * Uses Fastify inject — no real HTTP ports.
 */

describe('Format Profile API', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-fp-api-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read API key from database (generated by auth plugin)
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows
    }
  });

  // ── Helpers ──

  function authed(opts: Record<string, unknown>) {
    return {
      ...opts,
      headers: { 'x-api-key': apiKey, ...(opts.headers as Record<string, string> | undefined) },
    };
  }

  // ── Auth gating ──

  describe('Authentication', () => {
    it('returns 401 when no API key is provided', async () => {
      const res = await server.inject({
        method: 'GET',
        url: '/api/v1/format-profile',
      });
      expect(res.statusCode).toBe(401);
    });
  });

  // ── CRUD lifecycle ──

  describe('CRUD lifecycle', () => {
    let profileId: number;

    it('POST creates a format profile', async () => {
      const res = await server.inject(
        authed({
          method: 'POST',
          url: '/api/v1/format-profile',
          payload: {
            name: 'HD Video',
            videoResolution: '1080p',
            audioCodec: 'aac',
            audioBitrate: '192k',
            containerFormat: 'mp4',
            isDefault: false,
          },
        })
      );

      expect(res.statusCode).toBe(201);
      const body = res.json();
      expect(body.name).toBe('HD Video');
      expect(body.videoResolution).toBe('1080p');
      expect(body.audioCodec).toBe('aac');
      expect(body.audioBitrate).toBe('192k');
      expect(body.containerFormat).toBe('mp4');
      expect(body.isDefault).toBe(false);
      expect(body.id).toBeDefined();
      profileId = body.id;
    });

    it('GET / lists all profiles', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/format-profile' })
      );

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(Array.isArray(body)).toBe(true);
      expect(body.length).toBeGreaterThanOrEqual(1);
      expect(body.some((p: { id: number }) => p.id === profileId)).toBe(true);
    });

    it('GET /:id returns a single profile', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: `/api/v1/format-profile/${profileId}` })
      );

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.id).toBe(profileId);
      expect(body.name).toBe('HD Video');
    });

    it('PUT /:id updates profile fields', async () => {
      const res = await server.inject(
        authed({
          method: 'PUT',
          url: `/api/v1/format-profile/${profileId}`,
          payload: { name: 'Full HD', videoResolution: '1080p' },
        })
      );

      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.id).toBe(profileId);
      expect(body.name).toBe('Full HD');
    });

    it('DELETE /:id removes the profile', async () => {
      const res = await server.inject(
        authed({ method: 'DELETE', url: `/api/v1/format-profile/${profileId}` })
      );

      expect(res.statusCode).toBe(204);

      // Verify it's gone
      const getRes = await server.inject(
        authed({ method: 'GET', url: `/api/v1/format-profile/${profileId}` })
      );
      expect(getRes.statusCode).toBe(404);
    });
  });

  // ── 404 handling ──

  describe('Not found handling', () => {
    it('GET /:id returns 404 for non-existent profile', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/format-profile/99999' })
      );
      expect(res.statusCode).toBe(404);
      expect(res.json().error).toBe('Not Found');
    });

    it('PUT /:id returns 404 for non-existent profile', async () => {
      const res = await server.inject(
        authed({
          method: 'PUT',
          url: '/api/v1/format-profile/99999',
          payload: { name: 'Nope' },
        })
      );
      expect(res.statusCode).toBe(404);
    });

    it('DELETE /:id returns 404 for non-existent profile', async () => {
      const res = await server.inject(
        authed({ method: 'DELETE', url: '/api/v1/format-profile/99999' })
      );
      expect(res.statusCode).toBe(404);
    });
  });

  // ── Validation errors ──

  describe('Validation', () => {
    it('POST rejects body missing required name', async () => {
      const res = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { videoResolution: '720p' },
|
||||
})
|
||||
);
|
||||
expect(res.statusCode).toBe(400);
|
||||
});
|
||||
|
||||
it('POST rejects body with empty name', async () => {
|
||||
const res = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: '' },
|
||||
})
|
||||
);
|
||||
expect(res.statusCode).toBe(400);
|
||||
});
|
||||
|
||||
it('GET /:id returns 400 for non-numeric ID', async () => {
|
||||
const res = await server.inject(
|
||||
authed({ method: 'GET', url: '/api/v1/format-profile/abc' })
|
||||
);
|
||||
expect(res.statusCode).toBe(400);
|
||||
});
|
||||
});
|
||||
|
||||
// ── Default profile management ──
|
||||
|
||||
describe('Default profile', () => {
|
||||
it('setting isDefault on one profile clears it from others', async () => {
|
||||
// Create first profile as default
|
||||
const res1 = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Default A', isDefault: true },
|
||||
})
|
||||
);
|
||||
expect(res1.statusCode).toBe(201);
|
||||
const profileA = res1.json();
|
||||
expect(profileA.isDefault).toBe(true);
|
||||
|
||||
// Create second profile as default — should clear the first
|
||||
const res2 = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Default B', isDefault: true },
|
||||
})
|
||||
);
|
||||
expect(res2.statusCode).toBe(201);
|
||||
const profileB = res2.json();
|
||||
expect(profileB.isDefault).toBe(true);
|
||||
|
||||
// Verify first profile is no longer default
|
||||
const resA = await server.inject(
|
||||
authed({ method: 'GET', url: `/api/v1/format-profile/${profileA.id}` })
|
||||
);
|
||||
expect(resA.json().isDefault).toBe(false);
|
||||
|
||||
// Clean up — profileA is not default so it's deletable.
|
||||
// profileB is default and protected — leave it (shared test DB, no conflict).
|
||||
await server.inject(
|
||||
authed({ method: 'DELETE', url: `/api/v1/format-profile/${profileA.id}` })
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
// ── Default profile protection ──
|
||||
|
||||
describe('Default profile protection', () => {
|
||||
it('DELETE default profile returns 403', async () => {
|
||||
const createRes = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Protected Default', isDefault: true },
|
||||
})
|
||||
);
|
||||
expect(createRes.statusCode).toBe(201);
|
||||
const profile = createRes.json();
|
||||
|
||||
const deleteRes = await server.inject(
|
||||
authed({ method: 'DELETE', url: `/api/v1/format-profile/${profile.id}` })
|
||||
);
|
||||
expect(deleteRes.statusCode).toBe(403);
|
||||
expect(deleteRes.json().message).toBe('Cannot delete the default format profile');
|
||||
|
||||
// Profile remains in DB (default, protected) — no cleanup needed for test isolation
|
||||
});
|
||||
|
||||
it('DELETE non-default profile still works', async () => {
|
||||
const createRes = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Deletable Profile', isDefault: false },
|
||||
})
|
||||
);
|
||||
expect(createRes.statusCode).toBe(201);
|
||||
const profile = createRes.json();
|
||||
|
||||
const deleteRes = await server.inject(
|
||||
authed({ method: 'DELETE', url: `/api/v1/format-profile/${profile.id}` })
|
||||
);
|
||||
expect(deleteRes.statusCode).toBe(204);
|
||||
});
|
||||
|
||||
it('PUT default profile with isDefault: false returns 400', async () => {
|
||||
const createRes = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Default No Unset', isDefault: true },
|
||||
})
|
||||
);
|
||||
expect(createRes.statusCode).toBe(201);
|
||||
const profile = createRes.json();
|
||||
|
||||
const putRes = await server.inject(
|
||||
authed({
|
||||
method: 'PUT',
|
||||
url: `/api/v1/format-profile/${profile.id}`,
|
||||
payload: { isDefault: false },
|
||||
})
|
||||
);
|
||||
expect(putRes.statusCode).toBe(400);
|
||||
expect(putRes.json().message).toBe('Cannot unset isDefault on the default format profile');
|
||||
|
||||
// Clean up — force unset via direct DB or just leave (fresh DB per suite)
|
||||
// We can't unset via API (that's what we're testing), so just leave it
|
||||
});
|
||||
|
||||
it('PUT default profile with other fields works', async () => {
|
||||
const createRes = await server.inject(
|
||||
authed({
|
||||
method: 'POST',
|
||||
url: '/api/v1/format-profile',
|
||||
payload: { name: 'Renameable Default', isDefault: true },
|
||||
})
|
||||
);
|
||||
expect(createRes.statusCode).toBe(201);
|
||||
const profile = createRes.json();
|
||||
|
||||
const putRes = await server.inject(
|
||||
authed({
|
||||
method: 'PUT',
|
||||
url: `/api/v1/format-profile/${profile.id}`,
|
||||
payload: { name: 'Renamed Default', videoResolution: '1080p' },
|
||||
})
|
||||
);
|
||||
expect(putRes.statusCode).toBe(200);
|
||||
const updated = putRes.json();
|
||||
expect(updated.name).toBe('Renamed Default');
|
||||
expect(updated.videoResolution).toBe('1080p');
|
||||
expect(updated.isDefault).toBe(true);
|
||||
});
|
||||
});
|
||||
});
|
||||
544
src/__tests__/format-profile.test.ts
Normal file

@@ -0,0 +1,544 @@
import { describe, it, expect, afterEach } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import {
  createFormatProfile,
  getFormatProfileById,
  getAllFormatProfiles,
  getDefaultFormatProfile,
  updateFormatProfile,
  deleteFormatProfile,
  ensureDefaultFormatProfile,
} from '../db/repositories/format-profile-repository';
import {
  createChannel,
  getChannelById,
} from '../db/repositories/channel-repository';
import {
  createContentItem,
  getContentItemById,
  updateContentItem,
  getContentItemsByStatus,
} from '../db/repositories/content-repository';
import type { Platform } from '../types/index';

let tmpDir: string;

function freshDbPath(): string {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-fp-test-'));
  return join(tmpDir, 'test.db');
}

function cleanup(): void {
  closeDatabase();
  try {
    if (tmpDir && existsSync(tmpDir)) {
      rmSync(tmpDir, { recursive: true, force: true });
    }
  } catch {
    // Windows cleanup best-effort (K004)
  }
}

// ── Format Profile CRUD ──

describe('Format Profile Repository', () => {
  afterEach(cleanup);

  it('creates a format profile and reads it back', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await createFormatProfile(db, {
      name: 'High Quality',
      videoResolution: '1080p',
      audioCodec: 'opus',
      audioBitrate: '320k',
      containerFormat: 'mkv',
      isDefault: false,
    });

    expect(profile.id).toBeGreaterThan(0);
    expect(profile.name).toBe('High Quality');
    expect(profile.videoResolution).toBe('1080p');
    expect(profile.audioCodec).toBe('opus');
    expect(profile.audioBitrate).toBe('320k');
    expect(profile.containerFormat).toBe('mkv');
    expect(profile.isDefault).toBe(false);
    expect(profile.createdAt).toBeTruthy();
    expect(profile.updatedAt).toBeTruthy();

    const fetched = await getFormatProfileById(db, profile.id);
    expect(fetched).toEqual(profile);
  });

  it('returns null for non-existent profile ID', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const result = await getFormatProfileById(db, 999);
    expect(result).toBeNull();
  });

  it('lists all profiles ordered by name', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    await createFormatProfile(db, { name: 'Zebra' });
    await createFormatProfile(db, { name: 'Alpha' });
    await createFormatProfile(db, { name: 'Middle' });

    const all = await getAllFormatProfiles(db);
    expect(all).toHaveLength(3);
    expect(all[0].name).toBe('Alpha');
    expect(all[1].name).toBe('Middle');
    expect(all[2].name).toBe('Zebra');
  });

  it('updates a format profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await createFormatProfile(db, {
      name: 'Original',
      videoResolution: '720p',
    });

    const updated = await updateFormatProfile(db, profile.id, {
      name: 'Updated',
      videoResolution: '1080p',
      audioCodec: 'aac',
    });

    expect(updated).not.toBeNull();
    expect(updated!.name).toBe('Updated');
    expect(updated!.videoResolution).toBe('1080p');
    expect(updated!.audioCodec).toBe('aac');
    // Verify updatedAt is set (may match createdAt if within same second)
    expect(updated!.updatedAt).toBeTruthy();
  });

  it('returns null when updating non-existent profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const result = await updateFormatProfile(db, 999, { name: 'Nope' });
    expect(result).toBeNull();
  });

  it('deletes a format profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await createFormatProfile(db, { name: 'Delete Me' });
    const deleted = await deleteFormatProfile(db, profile.id);
    expect(deleted).toBe(true);

    const after = await getFormatProfileById(db, profile.id);
    expect(after).toBeNull();
  });

  it('returns false when deleting non-existent profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const deleted = await deleteFormatProfile(db, 999);
    expect(deleted).toBe(false);
  });
});

// ── Default Profile Logic ──

describe('Default Format Profile', () => {
  afterEach(cleanup);

  it('returns null when no default profile exists', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const result = await getDefaultFormatProfile(db);
    expect(result).toBeNull();
  });

  it('marks a profile as default on create', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await createFormatProfile(db, {
      name: 'Default Profile',
      isDefault: true,
    });

    expect(profile.isDefault).toBe(true);

    const fetched = await getDefaultFormatProfile(db);
    expect(fetched).not.toBeNull();
    expect(fetched!.id).toBe(profile.id);
  });

  it('clears previous default when creating a new default profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const first = await createFormatProfile(db, {
      name: 'First Default',
      isDefault: true,
    });

    const second = await createFormatProfile(db, {
      name: 'Second Default',
      isDefault: true,
    });

    // Second should be default
    const defaultProfile = await getDefaultFormatProfile(db);
    expect(defaultProfile!.id).toBe(second.id);

    // First should no longer be default
    const firstUpdated = await getFormatProfileById(db, first.id);
    expect(firstUpdated!.isDefault).toBe(false);
  });

  it('clears previous default when updating a profile to be default', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const first = await createFormatProfile(db, {
      name: 'Current Default',
      isDefault: true,
    });

    const second = await createFormatProfile(db, {
      name: 'Will Become Default',
      isDefault: false,
    });

    await updateFormatProfile(db, second.id, { isDefault: true });

    const defaultProfile = await getDefaultFormatProfile(db);
    expect(defaultProfile!.id).toBe(second.id);

    const firstUpdated = await getFormatProfileById(db, first.id);
    expect(firstUpdated!.isDefault).toBe(false);
  });
});

// ── Ensure Default Profile (Seed) ──

describe('ensureDefaultFormatProfile', () => {
  afterEach(cleanup);

  it('creates a default profile when none exists', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await ensureDefaultFormatProfile(db);

    expect(profile.name).toBe('Default');
    expect(profile.isDefault).toBe(true);
    expect(profile.videoResolution).toBeNull();
    expect(profile.audioCodec).toBeNull();
    expect(profile.audioBitrate).toBeNull();
    expect(profile.containerFormat).toBeNull();

    // Verify it's retrievable
    const fetched = await getDefaultFormatProfile(db);
    expect(fetched).not.toBeNull();
    expect(fetched!.id).toBe(profile.id);
  });

  it('is idempotent — does not create a duplicate', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const first = await ensureDefaultFormatProfile(db);
    const second = await ensureDefaultFormatProfile(db);

    expect(first.id).toBe(second.id);

    const all = await getAllFormatProfiles(db);
    expect(all).toHaveLength(1);
  });

  it('does not overwrite an existing default profile', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    // Create a custom default profile first
    const custom = await createFormatProfile(db, {
      name: 'Custom Default',
      isDefault: true,
      videoResolution: '1080p',
    });

    const result = await ensureDefaultFormatProfile(db);

    // Should return the existing one, not create a new one
    expect(result.id).toBe(custom.id);
    expect(result.name).toBe('Custom Default');
    expect(result.videoResolution).toBe('1080p');

    const all = await getAllFormatProfiles(db);
    expect(all).toHaveLength(1);
  });
});

// ── Channel ↔ Format Profile FK ──

describe('Channel-FormatProfile FK relationship', () => {
  afterEach(cleanup);

  it('creates a channel with a format profile ID', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const profile = await createFormatProfile(db, { name: 'Test Profile' });

    const channel = await createChannel(db, {
      name: 'Test Channel',
      platform: 'youtube' as Platform,
      platformId: 'UC123',
      url: 'https://www.youtube.com/@Test',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: profile.id,
    });

    expect(channel.formatProfileId).toBe(profile.id);
  });

  it('creates a channel with null format profile ID', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const channel = await createChannel(db, {
      name: 'No Profile Channel',
      platform: 'youtube' as Platform,
      platformId: 'UC456',
      url: 'https://www.youtube.com/@NoProfile',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    expect(channel.formatProfileId).toBeNull();
  });
});

// ── Content Item Update Functions ──

describe('Content Item Update & Query Functions', () => {
  afterEach(cleanup);

  async function setupChannelWithItem(
    db: Awaited<ReturnType<typeof initDatabaseAsync>>
  ) {
    const channel = await createChannel(db, {
      name: 'Content Channel',
      platform: 'youtube' as Platform,
      platformId: 'UC_CONTENT_TEST',
      url: 'https://www.youtube.com/@ContentChannel',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    const item = await createContentItem(db, {
      channelId: channel.id,
      title: 'Test Video',
      platformContentId: 'vid123',
      url: 'https://www.youtube.com/watch?v=vid123',
      contentType: 'video' as const,
      duration: 600,
    });

    return { channel, item: item! };
  }

  it('gets a content item by ID', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    const { item } = await setupChannelWithItem(db);

    const fetched = await getContentItemById(db, item.id);
    expect(fetched).not.toBeNull();
    expect(fetched!.id).toBe(item.id);
    expect(fetched!.title).toBe('Test Video');
    expect(fetched!.status).toBe('monitored');
  });

  it('returns null for non-existent content item ID', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const result = await getContentItemById(db, 999);
    expect(result).toBeNull();
  });

  it('updates content item with download results', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    const { item } = await setupChannelWithItem(db);

    const qualityMetadata = {
      actualResolution: '1920x1080',
      actualCodec: 'h264',
      actualBitrate: '5000kbps',
      containerFormat: 'mp4',
      qualityWarnings: [],
    };

    const updated = await updateContentItem(db, item.id, {
      filePath: '/media/youtube/channel/test-video.mp4',
      fileSize: 52428800,
      format: 'mp4',
      qualityMetadata,
      status: 'downloaded',
    });

    expect(updated).not.toBeNull();
    expect(updated!.filePath).toBe('/media/youtube/channel/test-video.mp4');
    expect(updated!.fileSize).toBe(52428800);
    expect(updated!.format).toBe('mp4');
    expect(updated!.qualityMetadata).toEqual(qualityMetadata);
    expect(updated!.status).toBe('downloaded');
    // Verify updatedAt is set (may match original if within same second)
    expect(updated!.updatedAt).toBeTruthy();
  });

  it('returns null when updating non-existent content item', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const result = await updateContentItem(db, 999, { status: 'failed' });
    expect(result).toBeNull();
  });

  it('gets content items by status', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const channel = await createChannel(db, {
      name: 'Multi Channel',
      platform: 'youtube' as Platform,
      platformId: 'UC_MULTI',
      url: 'https://www.youtube.com/@Multi',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    // Create items with different statuses
    await createContentItem(db, {
      channelId: channel.id,
      title: 'Item 1',
      platformContentId: 'v1',
      url: 'https://youtube.com/watch?v=v1',
      contentType: 'video' as const,
      duration: null,
      status: 'monitored',
    });

    const item2 = await createContentItem(db, {
      channelId: channel.id,
      title: 'Item 2',
      platformContentId: 'v2',
      url: 'https://youtube.com/watch?v=v2',
      contentType: 'video' as const,
      duration: null,
      status: 'monitored',
    });

    await createContentItem(db, {
      channelId: channel.id,
      title: 'Item 3',
      platformContentId: 'v3',
      url: 'https://youtube.com/watch?v=v3',
      contentType: 'audio' as const,
      duration: null,
      status: 'downloaded',
    });

    const monitored = await getContentItemsByStatus(db, 'monitored');
    expect(monitored).toHaveLength(2);

    const downloaded = await getContentItemsByStatus(db, 'downloaded');
    expect(downloaded).toHaveLength(1);
    expect(downloaded[0].title).toBe('Item 3');
  });

  it('respects limit parameter on getContentItemsByStatus', async () => {
    const dbPath = freshDbPath();
    const db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);

    const channel = await createChannel(db, {
      name: 'Limit Channel',
      platform: 'youtube' as Platform,
      platformId: 'UC_LIMIT',
      url: 'https://www.youtube.com/@Limit',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    for (let i = 0; i < 5; i++) {
      await createContentItem(db, {
        channelId: channel.id,
        title: `Item ${i}`,
        platformContentId: `vid_${i}`,
        url: `https://youtube.com/watch?v=vid_${i}`,
        contentType: 'video' as const,
        duration: null,
        status: 'monitored',
      });
    }

    const limited = await getContentItemsByStatus(db, 'monitored', 2);
    expect(limited).toHaveLength(2);
  });
});

// ── Config ──

describe('Config download fields', () => {
  it('has mediaPath, concurrentDownloads, and cookiePath with defaults', async () => {
    // Config is loaded at import time, so just verify the fields exist with defaults
    const { appConfig } = await import('../config/index');
    expect(appConfig.mediaPath).toBe('./media');
    expect(appConfig.concurrentDownloads).toBe(2);
    expect(appConfig.cookiePath).toBe('./data/cookies');
  });
});
326
src/__tests__/health-service.test.ts
Normal file

@@ -0,0 +1,326 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { createHistoryEvent } from '../db/repositories/history-repository';
import { HealthService } from '../services/health';
import type { SchedulerState } from '../services/scheduler';

// ── Mock yt-dlp ──
vi.mock('../sources/yt-dlp', () => ({
  getYtDlpVersion: vi.fn(),
}));

// ── Mock statfs ──
vi.mock('node:fs/promises', async (importOriginal) => {
  const actual = await importOriginal<typeof import('node:fs/promises')>();
  return {
    ...actual,
    statfs: vi.fn(),
  };
});

import { getYtDlpVersion } from '../sources/yt-dlp';
import { statfs } from 'node:fs/promises';

const mockGetYtDlpVersion = vi.mocked(getYtDlpVersion);
const mockStatfs = vi.mocked(statfs);

// ── Test Helpers ──

let tmpDir: string;
let db: Awaited<ReturnType<typeof initDatabaseAsync>>;

async function setupDb(): Promise<void> {
  tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-health-'));
  const dbPath = join(tmpDir, 'test.db');
  db = await initDatabaseAsync(dbPath);
  await runMigrations(dbPath);
}

function cleanup(): void {
  closeDatabase();
  try {
    if (tmpDir && existsSync(tmpDir)) {
      rmSync(tmpDir, { recursive: true, force: true });
    }
  } catch {
    // Windows cleanup best-effort
  }
}

function makeSchedulerState(overrides?: Partial<SchedulerState>): SchedulerState {
  return {
    running: true,
    channelCount: 3,
    channels: [],
    ...overrides,
  };
}

function makeStatfsResult(availableRatio: number, totalBlocks = 1000000) {
  // bsize=4096, total blocks = totalBlocks, available = totalBlocks * ratio
  const bsize = 4096;
  const bavail = Math.floor(totalBlocks * availableRatio);
  return {
    type: 0,
    bsize: BigInt(bsize),
    blocks: BigInt(totalBlocks),
    bfree: BigInt(bavail),
    bavail: BigInt(bavail),
    files: BigInt(0),
    ffree: BigInt(0),
  };
}

// ── Tests ──

describe('HealthService', () => {
  beforeEach(async () => {
    await setupDb();
    vi.clearAllMocks();
    // Default mocks
    mockGetYtDlpVersion.mockResolvedValue('2024.12.23');
    mockStatfs.mockResolvedValue(makeStatfsResult(0.5) as never); // 50% free
  });

  afterEach(() => {
    cleanup();
  });

  // ── Scheduler Component ──

  describe('scheduler component', () => {
    it('returns healthy with channel count when scheduler is running', async () => {
      const service = new HealthService(
        db,
        () => makeSchedulerState({ running: true, channelCount: 5 }),
        '/tmp/media'
      );

      const components = await service.getComponentHealth();
      const scheduler = components.find((c) => c.name === 'scheduler');

      expect(scheduler).toBeDefined();
      expect(scheduler!.status).toBe('healthy');
      expect(scheduler!.message).toBe('Running — 5 channel(s) monitored');
      expect(scheduler!.details).toEqual({ channelCount: 5 });
    });

    it('returns degraded when scheduler is disabled (null)', async () => {
      const service = new HealthService(db, () => null, '/tmp/media');

      const components = await service.getComponentHealth();
      const scheduler = components.find((c) => c.name === 'scheduler');

      expect(scheduler!.status).toBe('degraded');
      expect(scheduler!.message).toBe('Scheduler disabled');
    });

    it('returns unhealthy when scheduler is stopped', async () => {
      const service = new HealthService(
        db,
        () => makeSchedulerState({ running: false }),
        '/tmp/media'
      );

      const components = await service.getComponentHealth();
      const scheduler = components.find((c) => c.name === 'scheduler');

      expect(scheduler!.status).toBe('unhealthy');
      expect(scheduler!.message).toBe('Scheduler stopped');
    });
  });

  // ── yt-dlp Component ──

  describe('yt-dlp component', () => {
    it('returns healthy with version when yt-dlp is available', async () => {
      mockGetYtDlpVersion.mockResolvedValue('2024.12.23');

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const ytDlp = components.find((c) => c.name === 'ytDlp');

      expect(ytDlp!.status).toBe('healthy');
      expect(ytDlp!.message).toBe('yt-dlp 2024.12.23');
      expect(ytDlp!.details).toEqual({ version: '2024.12.23' });
    });

    it('returns unhealthy when yt-dlp is not available', async () => {
      mockGetYtDlpVersion.mockResolvedValue(null);

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const ytDlp = components.find((c) => c.name === 'ytDlp');

      expect(ytDlp!.status).toBe('unhealthy');
      expect(ytDlp!.message).toBe('yt-dlp not found');
    });
  });

  // ── Disk Space Component ──

  describe('disk space component', () => {
    it('returns healthy when >10% free', async () => {
      mockStatfs.mockResolvedValue(makeStatfsResult(0.5) as never); // 50% free

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const disk = components.find((c) => c.name === 'diskSpace');

      expect(disk!.status).toBe('healthy');
      expect(disk!.message).toMatch(/GB free of/);
      expect(disk!.message).toMatch(/50%/);
      expect(disk!.details).toHaveProperty('availableBytes');
      expect(disk!.details).toHaveProperty('totalBytes');
      expect(disk!.details).toHaveProperty('freePercent');
    });

    it('returns degraded when 5-10% free', async () => {
      mockStatfs.mockResolvedValue(makeStatfsResult(0.07) as never); // 7% free

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const disk = components.find((c) => c.name === 'diskSpace');

      expect(disk!.status).toBe('degraded');
    });

    it('returns unhealthy when <5% free', async () => {
      mockStatfs.mockResolvedValue(makeStatfsResult(0.03) as never); // 3% free

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const disk = components.find((c) => c.name === 'diskSpace');

      expect(disk!.status).toBe('unhealthy');
    });

    it('returns degraded on statfs error', async () => {
      mockStatfs.mockRejectedValue(new Error('ENOENT: no such file or directory'));

      const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
      const components = await service.getComponentHealth();
      const disk = components.find((c) => c.name === 'diskSpace');

      expect(disk!.status).toBe('degraded');
      expect(disk!.message).toMatch(/Disk check failed/);
      expect(disk!.message).toMatch(/ENOENT/);
    });
  });
|
||||
|
||||
// ── Recent Errors Component ──
|
||||
|
||||
describe('recent errors component', () => {
|
||||
it('returns healthy when no errors in 24h', async () => {
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
const components = await service.getComponentHealth();
|
||||
const errors = components.find((c) => c.name === 'recentErrors');
|
||||
|
||||
expect(errors!.status).toBe('healthy');
|
||||
expect(errors!.message).toBe('0 error(s) in the last 24 hours');
|
||||
});
|
||||
|
||||
it('returns degraded when 1-5 errors in 24h', async () => {
|
||||
// Insert 3 failed history events
|
||||
for (let i = 0; i < 3; i++) {
|
||||
await createHistoryEvent(db, {
|
||||
eventType: 'failed',
|
||||
status: 'failed',
|
||||
details: { error: `Test error ${i}` },
|
||||
});
|
||||
}
|
||||
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
const components = await service.getComponentHealth();
|
||||
const errors = components.find((c) => c.name === 'recentErrors');
|
||||
|
||||
expect(errors!.status).toBe('degraded');
|
||||
expect(errors!.message).toBe('3 error(s) in the last 24 hours');
|
||||
});
|
||||
|
||||
it('returns unhealthy when >5 errors in 24h', async () => {
|
||||
// Insert 7 failed events
|
||||
for (let i = 0; i < 7; i++) {
|
||||
await createHistoryEvent(db, {
|
||||
eventType: 'failed',
|
||||
status: 'failed',
|
||||
details: { error: `Test error ${i}` },
|
||||
});
|
||||
}
|
||||
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
const components = await service.getComponentHealth();
|
||||
const errors = components.find((c) => c.name === 'recentErrors');
|
||||
|
||||
expect(errors!.status).toBe('unhealthy');
|
||||
expect(errors!.message).toBe('7 error(s) in the last 24 hours');
|
||||
expect(errors!.details).toEqual({ errorCount: 7 });
|
||||
});
|
||||
|
||||
it('does not count non-failed events', async () => {
|
||||
// Insert a downloaded event — should not count
|
||||
await createHistoryEvent(db, {
|
||||
eventType: 'downloaded',
|
||||
status: 'success',
|
||||
});
|
||||
// Insert a failed event — should count
|
||||
await createHistoryEvent(db, {
|
||||
eventType: 'failed',
|
||||
status: 'failed',
|
||||
});
|
||||
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
const components = await service.getComponentHealth();
|
||||
const errors = components.find((c) => c.name === 'recentErrors');
|
||||
|
||||
expect(errors!.status).toBe('degraded');
|
||||
expect(errors!.message).toBe('1 error(s) in the last 24 hours');
|
||||
});
|
||||
});
|
||||
|
||||
// ── Caching ──
|
||||
|
||||
describe('caching', () => {
|
||||
it('caches yt-dlp version — second call does not invoke getYtDlpVersion', async () => {
|
||||
mockGetYtDlpVersion.mockResolvedValue('2024.12.23');
|
||||
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
await service.getComponentHealth();
|
||||
await service.getComponentHealth();
|
||||
|
||||
// getYtDlpVersion should only be called once due to caching
|
||||
expect(mockGetYtDlpVersion).toHaveBeenCalledTimes(1);
|
||||
});
|
||||
|
||||
it('caches disk space — second call does not invoke statfs', async () => {
|
||||
mockStatfs.mockResolvedValue(makeStatfsResult(0.5) as never);
|
||||
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
await service.getComponentHealth();
|
||||
await service.getComponentHealth();
|
||||
|
||||
// statfs should only be called once due to caching
|
||||
expect(mockStatfs).toHaveBeenCalledTimes(1);
|
||||
});
|
||||
});
|
||||
|
||||
// ── Full Response Shape ──
|
||||
|
||||
describe('full response', () => {
|
||||
it('returns all four components', async () => {
|
||||
const service = new HealthService(db, () => makeSchedulerState(), '/tmp/media');
|
||||
const components = await service.getComponentHealth();
|
||||
|
||||
expect(components).toHaveLength(4);
|
||||
const names = components.map((c) => c.name);
|
||||
expect(names).toContain('scheduler');
|
||||
expect(names).toContain('ytDlp');
|
||||
expect(names).toContain('diskSpace');
|
||||
expect(names).toContain('recentErrors');
|
||||
});
|
||||
});
|
||||
});
255
src/__tests__/history-api.test.ts
Normal file
@@ -0,0 +1,255 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest';
import { mkdtempSync, rmSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import { type FastifyInstance } from 'fastify';
import { initDatabaseAsync, closeDatabase } from '../db/index';
import { runMigrations } from '../db/migrate';
import { buildServer } from '../server/index';
import { systemConfig } from '../db/schema/index';
import { eq } from 'drizzle-orm';
import { type LibSQLDatabase } from 'drizzle-orm/libsql';
import type * as schema from '../db/schema/index';
import { createChannel } from '../db/repositories/channel-repository';
import { createContentItem } from '../db/repositories/content-repository';
import { createHistoryEvent } from '../db/repositories/history-repository';
import type { Channel, ContentItem } from '../types/index';

/**
 * Integration tests for history and activity API endpoints.
 */

describe('History API', () => {
  let server: FastifyInstance;
  let db: LibSQLDatabase<typeof schema>;
  let apiKey: string;
  let tmpDir: string;
  let testChannel: Channel;
  let testContent: ContentItem;

  beforeAll(async () => {
    tmpDir = mkdtempSync(join(tmpdir(), 'tubearr-history-api-'));
    const dbPath = join(tmpDir, 'test.db');
    db = await initDatabaseAsync(dbPath);
    await runMigrations(dbPath);
    server = await buildServer({ db });
    await server.ready();

    // Read API key
    const rows = await db
      .select()
      .from(systemConfig)
      .where(eq(systemConfig.key, 'api_key'))
      .limit(1);
    apiKey = rows[0]?.value ?? '';
    expect(apiKey).toBeTruthy();

    // Create test data
    testChannel = await createChannel(db, {
      name: 'History API Test Channel',
      platform: 'youtube',
      platformId: 'UC_history_api_test',
      url: 'https://www.youtube.com/channel/UC_history_api_test',
      monitoringEnabled: true,
      checkInterval: 360,
      imageUrl: null,
      metadata: null,
      formatProfileId: null,
    });

    testContent = (await createContentItem(db, {
      channelId: testChannel.id,
      title: 'History API Test Video',
      platformContentId: 'vid_hist_api_1',
      url: 'https://www.youtube.com/watch?v=hist_test',
      contentType: 'video',
      duration: 300,
      status: 'monitored',
    }))!;

    // Seed some history events
    await createHistoryEvent(db, {
      contentItemId: testContent.id,
      channelId: testChannel.id,
      eventType: 'grabbed',
      status: 'pending',
      details: { title: testContent.title },
    });
    await createHistoryEvent(db, {
      contentItemId: testContent.id,
      channelId: testChannel.id,
      eventType: 'downloaded',
      status: 'completed',
      details: { title: testContent.title },
    });
    await createHistoryEvent(db, {
      contentItemId: testContent.id,
      channelId: testChannel.id,
      eventType: 'failed',
      status: 'failed',
      details: { error: 'test error' },
    });
  });

  afterAll(async () => {
    await server.close();
    closeDatabase();
    try {
      if (tmpDir && existsSync(tmpDir)) {
        rmSync(tmpDir, { recursive: true, force: true });
      }
    } catch {
      // Temp dir cleanup is best-effort on Windows
    }
  });

  // ── Helpers ──

  function authed(opts: Record<string, unknown>) {
    return {
      ...opts,
      headers: {
        'x-api-key': apiKey,
        ...(opts.headers as Record<string, string> | undefined),
      },
    };
  }

  // ── Auth gating ──

  describe('Authentication', () => {
    it('GET /api/v1/history returns 401 without API key', async () => {
      const res = await server.inject({ method: 'GET', url: '/api/v1/history' });
      expect(res.statusCode).toBe(401);
    });

    it('GET /api/v1/activity returns 401 without API key', async () => {
      const res = await server.inject({ method: 'GET', url: '/api/v1/activity' });
      expect(res.statusCode).toBe(401);
    });
  });

  // ── GET /api/v1/history ──

  describe('GET /api/v1/history', () => {
    it('returns paginated history events', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/history' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(Array.isArray(body.data)).toBe(true);
      expect(body.pagination).toBeDefined();
      expect(body.pagination.page).toBe(1);
      expect(body.pagination.pageSize).toBe(20);
      expect(body.pagination.totalItems).toBeGreaterThanOrEqual(3);
      expect(body.pagination.totalPages).toBeGreaterThanOrEqual(1);
    });

    it('respects page and pageSize parameters', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/history?page=1&pageSize=2' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.length).toBeLessThanOrEqual(2);
      expect(body.pagination.pageSize).toBe(2);
    });

    it('filters by eventType', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/history?eventType=grabbed' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.every((e: { eventType: string }) => e.eventType === 'grabbed')).toBe(
        true
      );
    });

    it('filters by channelId', async () => {
      const res = await server.inject(
        authed({
          method: 'GET',
          url: `/api/v1/history?channelId=${testChannel.id}`,
        })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.length).toBeGreaterThanOrEqual(3);
      expect(
        body.data.every((e: { channelId: number }) => e.channelId === testChannel.id)
      ).toBe(true);
    });

    it('filters by contentItemId', async () => {
      const res = await server.inject(
        authed({
          method: 'GET',
          url: `/api/v1/history?contentItemId=${testContent.id}`,
        })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.length).toBeGreaterThanOrEqual(3);
    });

    it('returns empty data for unmatched filters', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/history?eventType=nonexistent' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data).toHaveLength(0);
      expect(body.pagination.totalItems).toBe(0);
    });
  });

  // ── GET /api/v1/activity ──

  describe('GET /api/v1/activity', () => {
    it('returns recent activity feed', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/activity' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.success).toBe(true);
      expect(Array.isArray(body.data)).toBe(true);
      expect(body.data.length).toBeGreaterThanOrEqual(3);
    });

    it('respects limit parameter', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/activity?limit=2' })
      );
      expect(res.statusCode).toBe(200);
      const body = res.json();
      expect(body.data.length).toBeLessThanOrEqual(2);
    });

    it('returns events in newest-first order', async () => {
      const res = await server.inject(
        authed({ method: 'GET', url: '/api/v1/activity' })
      );
      const body = res.json();
      const dates = body.data.map((e: { createdAt: string; id: number }) => ({
        createdAt: e.createdAt,
        id: e.id,
      }));

      // Events should be ordered by createdAt DESC, then ID DESC
      for (let i = 1; i < dates.length; i++) {
        const prev = dates[i - 1];
        const curr = dates[i];
        const prevTime = new Date(prev.createdAt).getTime();
        const currTime = new Date(curr.createdAt).getTime();
        expect(prevTime).toBeGreaterThanOrEqual(currTime);
        if (prevTime === currTime) {
          expect(prev.id).toBeGreaterThan(curr.id);
        }
      }
    });
  });
});
||||
Some files were not shown because too many files have changed in this diff Show more
Loading…
Add table
Reference in a new issue