M0: Foundation scaffold — Docker Compose, DB schema, FastAPI app, all service stubs

Track A (Infrastructure & Data Layer):
- docker-compose.yml with all 7 services (nginx, frontend, api, mcp, renderer, worker, postgres, redis)
- docker-compose.override.yml for local dev (hot reload, port exposure)
- PostgreSQL init.sql with full schema (15 tables, pgvector indexes, creator economy stubs)
- .env.example with all required environment variables

Track A+B (API Layer):
- FastAPI app with 10 routers (auth, shaders, feed, votes, generate, desires, users, payments, mcp_keys, health)
- SQLAlchemy ORM models for all 15 tables
- Pydantic schemas for all request/response types
- JWT auth middleware (access + refresh tokens, Redis blocklist)
- Redis rate limiting middleware
- Celery worker config with job stubs (render, embed, generate, feed cache, expire bounties)
- Alembic migration framework
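
The Redis rate-limiting middleware listed above typically follows the per-key window-counter pattern. A minimal in-memory sketch of that pattern (a stand-in for Redis `INCR` + `EXPIRE`; class and method names are illustrative, not from the codebase):

```python
import time

class RateLimiter:
    """In-memory stand-in for the Redis fixed-window pattern (INCR + EXPIRE per window)."""

    def __init__(self, limit: int, window_s: int = 60):
        self.limit = limit
        self.window_s = window_s
        self._counts: dict[tuple[str, int], int] = {}

    def allow(self, key: str) -> bool:
        # Bucket requests by (key, current window); Redis would expire old buckets.
        window = int(time.time()) // self.window_s
        bucket = (key, window)
        self._counts[bucket] = self._counts.get(bucket, 0) + 1
        return self._counts[bucket] <= self.limit
```

In production the counter lives in Redis so all API replicas share one view of each client's budget.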

Service stubs:
- MCP server (health endpoint, 501 for all tools)
- Renderer service (Express + Puppeteer scaffold, 501 for /render)
- Frontend (package.json with React/Vite/Three.js/TanStack/Tailwind deps)
- Nginx reverse proxy config (/, /api, /mcp, /renders)

Project:
- DECISIONS.md with 11 recorded architectural decisions
- README.md with architecture overview
- Sample shader seed data (plasma, fractal noise, raymarched sphere)

John Lightner, 2026-03-24 20:45:08 -05:00
commit 05d39fdda8 (parent 8cb2a50b6c)
46 changed files, 2931 insertions(+), 0 deletions(-)

.env.example (new file, +43 lines)

# Fractafrag Environment Variables
# Copy to .env and fill in values before running docker compose up
# ─── Database ───────────────────────────────────────────────
DB_PASS=changeme_use_a_real_password
POSTGRES_USER=fracta
POSTGRES_DB=fractafrag
# ─── Security ───────────────────────────────────────────────
JWT_SECRET=changeme_generate_with_openssl_rand_hex_64
JWT_ALGORITHM=HS256
JWT_ACCESS_TOKEN_EXPIRE_MINUTES=15
JWT_REFRESH_TOKEN_EXPIRE_DAYS=30
# ─── Cloudflare Turnstile ───────────────────────────────────
TURNSTILE_SITE_KEY=your_turnstile_site_key
TURNSTILE_SECRET=your_turnstile_secret_key
# ─── Stripe ─────────────────────────────────────────────────
STRIPE_SECRET_KEY=sk_test_...
STRIPE_PUBLISHABLE_KEY=pk_test_...
STRIPE_WEBHOOK_SECRET=whsec_...
# ─── AI Providers (platform keys for internal generation) ───
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
# ─── MCP Server ─────────────────────────────────────────────
MCP_API_KEY_SALT=changeme_random_salt
# ─── Renderer ───────────────────────────────────────────────
MAX_RENDER_DURATION=8
RENDER_OUTPUT_DIR=/renders
# ─── BYOK Encryption ────────────────────────────────────────
BYOK_MASTER_KEY=changeme_generate_with_openssl_rand_hex_32
# ─── Frontend (Vite) ────────────────────────────────────────
VITE_API_URL=http://localhost/api
VITE_MCP_URL=http://localhost/mcp
# ─── Redis ──────────────────────────────────────────────────
REDIS_URL=redis://redis:6379/0

.gitignore (vendored, new file, +43 lines)

# ─── Dependencies ─────────────────────────────────────────
node_modules/
__pycache__/
*.pyc
*.pyo
.venv/
venv/
# ─── Environment ──────────────────────────────────────────
.env
.env.local
.env.production
# ─── Build artifacts ──────────────────────────────────────
dist/
build/
*.egg-info/
.eggs/
# ─── Docker volumes (local) ──────────────────────────────
pgdata/
redisdata/
renders/
# ─── IDE / Editor ─────────────────────────────────────────
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store
# ─── GSD ──────────────────────────────────────────────────
.gsd/browser-state/
.gsd/browser-baselines/
.bg-shell/
# ─── SSL certs ────────────────────────────────────────────
services/nginx/certs/*.pem
services/nginx/certs/*.key
# ─── Alembic ──────────────────────────────────────────────
*.db

DECISIONS.md (new file, +67 lines)

# Fractafrag — Project Decisions
## D001 — Backend Language & Framework
- **Choice:** Python + FastAPI
- **Rationale:** AI/ML integrations (pgvector, LLM clients, embeddings) are Python-native. FastAPI gives async performance with Pydantic auto-generated OpenAPI docs. Celery + Redis is mature for job queues.
- **Made by:** Collaborative
- **Revisable:** No
## D002 — Frontend Stack
- **Choice:** React 18 + Vite + Three.js + TanStack Query + Zustand + Tailwind CSS
- **Rationale:** Three.js for 3D shader rendering, raw WebGL for feed thumbnails. React UI, TanStack Query for server state, Zustand for client state.
- **Made by:** Collaborative
- **Revisable:** No
## D003 — Database & Cache
- **Choice:** PostgreSQL 16 + pgvector + Redis 7
- **Rationale:** pgvector for taste/style/desire embeddings (ANN). Redis for sessions, feed cache, rate limiting, Celery broker.
- **Made by:** Collaborative
- **Revisable:** No
## D004 — Container Orchestration
- **Choice:** Single Docker Compose stack, self-hosted, no cloud dependencies
- **Rationale:** Self-contained with nginx reverse proxy. .env-driven config.
- **Made by:** Collaborative
- **Revisable:** No
## D005 — Media Storage (Q1)
- **Choice:** Docker volume initially, S3-compatible config flag for later migration
- **Rationale:** Volume is simplest for single-server. Add Minio/S3 when storage grows large.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D006 — Style Embedding Model (Q2)
- **Choice:** Heuristic classifier + LLM structured output for M1, fine-tune later
- **Rationale:** No training data yet for fine-tuning. Heuristic is fast/cheap, LLM fills accuracy gaps.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
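
D006's heuristic path can start as simple pattern-matching over the GLSL source for coarse style signals, with the LLM structured-output call refining the result. A hypothetical sketch (function name, keys, and the chaos metric are illustrative assumptions, not from the codebase):

```python
import re

def heuristic_style_metadata(glsl: str) -> dict:
    """Cheap first-pass style signals; an LLM structured-output pass refines these."""
    lower = glsl.lower()
    return {
        # SDF helper naming (sdSphere, sdBox, ...) or an explicit raymarch loop
        "is_raymarched": bool(re.search(r"\bsd[A-Z]\w*|raymarch", glsl)),
        "uses_noise": "noise" in lower or "fbm" in lower,
        "animated": "itime" in lower,
        # crude chaos proxy: density of trig/fract calls per source line
        "chaos_level": round(
            len(re.findall(r"\b(sin|cos|fract)\s*\(", glsl))
            / max(glsl.count("\n"), 1), 2),
    }
```
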
## D007 — Renderer Approach (Q3)
- **Choice:** Puppeteer + Headless Chromium
- **Rationale:** Accurate browser-equivalent rendering. Profile at M2 and optimize if needed.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D008 — Generation Status UX (Q4)
- **Choice:** Polling for M5, SSE upgrade later
- **Rationale:** Simpler to implement. Generation takes 5-30s, 2s polling is acceptable UX.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
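
The D008 polling flow can be sketched transport-agnostically by injecting the status call (the real client would hit a job-status endpoint; the 2 s interval matches the rationale above, everything else here is an assumption):

```python
import time

def poll_generation(get_status, interval_s: float = 2.0, timeout_s: float = 60.0) -> dict:
    """Poll a job-status callable until it reports a terminal state or times out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status["state"] in ("ready", "failed"):
            return status
        time.sleep(interval_s)
    return {"state": "timeout"}
```

Swapping this loop for SSE later only changes the transport; the terminal-state contract stays the same.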
## D009 — Comments Scope (Q6)
- **Choice:** Defer to post-M5 polish sprint
- **Rationale:** Schema is in place. Feature is not on critical path for core product loop.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D010 — Moderation Approach (Q7)
- **Choice:** Admin API endpoints only (/api/v1/admin/queue). No admin UI for M4.
- **Rationale:** Simple approve/reject actions via API. Admin panel deferred until scale demands it.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D011 — Creator Economy
- **Choice:** Deferred until organic traction (500 DAU, 1000 shaders, 20 active creators)
- **Rationale:** Build the hooks (schema stubs, engagement tracking), not the features. Monetization on a platform nobody uses is worthless.
- **Made by:** Collaborative (per spec Section 11)
- **Revisable:** Yes

README.md (new file, +79 lines)

# 🔥 Fractafrag
**A self-hosted GLSL shader platform — browse, create, generate, and share real-time GPU visuals.**
Fractafrag fuses three experiences:
- **TikTok-style adaptive feed** of living, animated shaders that learns your taste
- **Shadertoy-style code editor** for writing, forking, and publishing GLSL shaders
- **AI generation layer** where you describe what you want and the platform writes the shader
Plus a **desire queue / bounty board** where users express what they want to see, and human creators or AI agents fulfill those requests.
## Quick Start
```bash
# 1. Clone and configure
cp .env.example .env
# Edit .env with your secrets
# 2. Launch everything
docker compose up -d
# 3. Open
open http://localhost
```
## Architecture
```
nginx (reverse proxy)
├── / → React frontend (Vite)
├── /api/* → FastAPI backend
└── /mcp/* → MCP server (AI agent interface)
postgres (pgvector/pgvector:pg16) — primary datastore + vector similarity
redis (redis:7-alpine) — cache, rate limiting, job queue
renderer — headless Chromium shader renderer
worker — Celery job processor (render, embed, AI generate)
```
## Tech Stack
| Layer | Tech |
|-------|------|
| Frontend | React 18, Vite, Three.js, TanStack Query, Zustand, Tailwind CSS |
| Backend | Python, FastAPI, SQLAlchemy, Pydantic |
| Database | PostgreSQL 16 + pgvector, Redis 7 |
| Jobs | Celery + Redis |
| Renderer | Node.js + Puppeteer (Headless Chromium) |
| MCP | Python MCP SDK, HTTP+SSE transport |
| Payments | Stripe (subscriptions + Connect) |
| Container | Docker Compose, single-stack |
## Milestone Roadmap
| Milestone | Focus | Status |
|-----------|-------|--------|
| **M0** | Infrastructure + Auth | 🚧 In Progress |
| **M1** | Core Shader Loop (editor, submit, feed) | ⏳ |
| **M2** | Intelligence Layer (MCP, recommendations) | ⏳ |
| **M3** | Desire Economy (bounties, fulfillment) | ⏳ |
| **M4** | Monetization (Stripe, subscriptions) | ⏳ |
| **M5** | AI Generation (prompt → shader) | ⏳ |
## Development
```bash
# API direct access (dev mode)
http://localhost:8000/api/docs # Swagger UI
http://localhost:8000/health # Health check
# Services
http://localhost:5173 # Vite dev server
http://localhost:3200 # MCP server
http://localhost:3100 # Renderer
```
## License
Private — see DECISIONS.md for project governance.

db/init.sql (new file, +265 lines)

-- Fractafrag Database Bootstrap
-- Runs on first container start via docker-entrypoint-initdb.d
-- Enable required extensions
CREATE EXTENSION IF NOT EXISTS "uuid-ossp"; -- optional: gen_random_uuid() is built into PG 13+
CREATE EXTENSION IF NOT EXISTS "vector";
CREATE EXTENSION IF NOT EXISTS "pg_trgm"; -- for text search
-- ════════════════════════════════════════════════════════════
-- USERS
-- ════════════════════════════════════════════════════════════
CREATE TABLE users (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
username TEXT UNIQUE NOT NULL,
email TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
role TEXT NOT NULL DEFAULT 'user', -- user, moderator, admin
trust_tier TEXT NOT NULL DEFAULT 'standard', -- standard, creator, trusted_api
stripe_customer_id TEXT,
subscription_tier TEXT DEFAULT 'free', -- free, pro, studio
ai_credits_remaining INTEGER DEFAULT 0,
taste_vector vector(512), -- pgvector: learned taste embedding
-- Creator economy stubs (Section 11f — deferred, schema only)
is_verified_creator BOOLEAN DEFAULT FALSE,
verified_creator_at TIMESTAMPTZ,
stripe_connect_account_id TEXT,
-- Timestamps
created_at TIMESTAMPTZ DEFAULT NOW(),
last_active_at TIMESTAMPTZ
);
-- ════════════════════════════════════════════════════════════
-- SHADERS
-- ════════════════════════════════════════════════════════════
CREATE TABLE shaders (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
author_id UUID REFERENCES users(id) ON DELETE SET NULL,
title TEXT NOT NULL,
description TEXT,
glsl_code TEXT NOT NULL,
is_public BOOLEAN DEFAULT TRUE,
is_ai_generated BOOLEAN DEFAULT FALSE,
ai_provider TEXT, -- anthropic, openai, ollama, null
thumbnail_url TEXT,
preview_url TEXT,
render_status TEXT DEFAULT 'pending', -- pending, rendering, ready, failed
style_vector vector(512), -- pgvector: visual style embedding
style_metadata JSONB, -- { chaos_level, color_temp, motion_type, ... }
tags TEXT[],
shader_type TEXT DEFAULT '2d', -- 2d, 3d, audio-reactive
forked_from UUID REFERENCES shaders(id) ON DELETE SET NULL,
view_count INTEGER DEFAULT 0,
score FLOAT DEFAULT 0, -- cached hot score for feed ranking
-- Creator economy stubs (Section 11f)
access_tier TEXT DEFAULT 'open', -- open, source_locked, commercial
source_unlock_price_cents INTEGER,
commercial_license_price_cents INTEGER,
verified_creator_shader BOOLEAN DEFAULT FALSE,
-- Timestamps
created_at TIMESTAMPTZ DEFAULT NOW(),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
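
The `score` column above caches a hot score for feed ranking. One common formulation is a Reddit-style log/decay blend; the exact ranking function lands in a later milestone, so the shape below (and the 45-hour half-decay constant) is an assumption:

```python
import math

def hot_score(upvotes: int, downvotes: int, age_hours: float,
              decay_hours: float = 45.0) -> float:
    """Log-scaled net votes minus linear age decay; cached into shaders.score."""
    net = upvotes - downvotes
    magnitude = math.log10(max(abs(net), 1))   # 10 votes ~ 1.0, 100 votes ~ 2.0
    sign = (net > 0) - (net < 0)
    return round(sign * magnitude - age_hours / decay_hours, 4)
```

A Celery beat job would recompute this periodically and write it back, so the feed query stays a plain `ORDER BY score DESC`.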
-- ════════════════════════════════════════════════════════════
-- VOTES
-- ════════════════════════════════════════════════════════════
CREATE TABLE votes (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE CASCADE,
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
value SMALLINT NOT NULL CHECK (value IN (-1, 1)),
created_at TIMESTAMPTZ DEFAULT NOW(),
UNIQUE (user_id, shader_id)
);
-- ════════════════════════════════════════════════════════════
-- ENGAGEMENT EVENTS (dwell time, replays, shares)
-- ════════════════════════════════════════════════════════════
CREATE TABLE engagement_events (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE SET NULL, -- null for anonymous
session_id TEXT, -- anonymous session token
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
event_type TEXT NOT NULL, -- dwell, replay, share, generate_similar
dwell_secs FLOAT,
metadata JSONB,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- DESIRES / BOUNTIES
-- ════════════════════════════════════════════════════════════
CREATE TABLE desires (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
author_id UUID REFERENCES users(id) ON DELETE SET NULL,
prompt_text TEXT NOT NULL,
prompt_embedding vector(512), -- embedded for similarity grouping
style_hints JSONB, -- { chaos_level, color_temp, etc }
tip_amount_cents INTEGER DEFAULT 0,
status TEXT DEFAULT 'open', -- open, in_progress, fulfilled, expired
heat_score FLOAT DEFAULT 1, -- updated as similar desires accumulate
fulfilled_by_shader UUID REFERENCES shaders(id) ON DELETE SET NULL,
fulfilled_at TIMESTAMPTZ,
expires_at TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Similar desire grouping (many-to-many)
CREATE TABLE desire_clusters (
cluster_id UUID,
desire_id UUID REFERENCES desires(id) ON DELETE CASCADE,
similarity FLOAT,
PRIMARY KEY (cluster_id, desire_id)
);
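
Cluster membership in `desire_clusters` is driven by embedding similarity. A minimal cosine-similarity membership check (the 0.85 threshold is an assumption to be tuned against real prompts):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def belongs_to_cluster(embedding: list[float], centroid: list[float],
                       threshold: float = 0.85) -> bool:
    """Join an existing desire cluster when the prompt embedding is close enough."""
    return cosine_similarity(embedding, centroid) >= threshold
```

At scale the same comparison runs inside Postgres via pgvector's `<=>` operator rather than in Python.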
-- ════════════════════════════════════════════════════════════
-- BOUNTY TIPS
-- ════════════════════════════════════════════════════════════
CREATE TABLE bounty_tips (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
desire_id UUID REFERENCES desires(id) ON DELETE CASCADE,
tipper_id UUID REFERENCES users(id) ON DELETE SET NULL,
amount_cents INTEGER NOT NULL,
stripe_payment_intent_id TEXT,
status TEXT DEFAULT 'held', -- held, released, refunded
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- CREATOR PAYOUTS
-- ════════════════════════════════════════════════════════════
CREATE TABLE creator_payouts (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
creator_id UUID REFERENCES users(id) ON DELETE SET NULL,
desire_id UUID REFERENCES desires(id) ON DELETE SET NULL,
gross_amount_cents INTEGER,
platform_fee_cents INTEGER, -- 10%
net_amount_cents INTEGER, -- 90%
stripe_transfer_id TEXT,
status TEXT DEFAULT 'pending', -- pending, processing, completed, failed
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- API KEYS (for MCP clients)
-- ════════════════════════════════════════════════════════════
CREATE TABLE api_keys (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE CASCADE,
key_hash TEXT UNIQUE NOT NULL, -- bcrypt hash of the actual key
key_prefix TEXT NOT NULL, -- first 8 chars for display (ff_key_XXXXXXXX)
name TEXT, -- user-given label
trust_tier TEXT DEFAULT 'probation', -- probation, trusted, premium
submissions_approved INTEGER DEFAULT 0,
rate_limit_per_hour INTEGER DEFAULT 10,
last_used_at TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW(),
revoked_at TIMESTAMPTZ
);
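
API keys in this table are stored hashed, with only a display prefix retained. A sketch of issuance and verification (using SHA-256 here so the example has no dependencies; the schema comment above specifies bcrypt for the real hash):

```python
import hashlib
import secrets

def issue_api_key() -> tuple[str, str, str]:
    """Return (full_key, key_prefix, key_hash); only the hash and prefix are stored."""
    raw = secrets.token_urlsafe(32)
    full_key = f"ff_key_{raw}"
    key_prefix = full_key[:15]  # "ff_key_" + first 8 chars, for display only
    key_hash = hashlib.sha256(full_key.encode()).hexdigest()
    return full_key, key_prefix, key_hash

def verify_api_key(candidate: str, stored_hash: str) -> bool:
    """Constant-time comparison against the stored hash."""
    digest = hashlib.sha256(candidate.encode()).hexdigest()
    return secrets.compare_digest(digest, stored_hash)
```

The full key is shown to the user exactly once at creation; afterwards only the prefix is displayable.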
-- ════════════════════════════════════════════════════════════
-- AI GENERATION LOG
-- ════════════════════════════════════════════════════════════
CREATE TABLE generation_log (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE SET NULL,
shader_id UUID REFERENCES shaders(id) ON DELETE SET NULL,
provider TEXT NOT NULL,
prompt_text TEXT,
tokens_used INTEGER,
cost_cents INTEGER, -- platform cost for credit-based generations
success BOOLEAN,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- COMMENTS (schema in place, feature deferred to post-M5)
-- ════════════════════════════════════════════════════════════
CREATE TABLE comments (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
author_id UUID REFERENCES users(id) ON DELETE SET NULL,
body TEXT NOT NULL,
parent_id UUID REFERENCES comments(id) ON DELETE CASCADE,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- CREATOR ECONOMY STUBS (Section 11f — dormant until activated)
-- ════════════════════════════════════════════════════════════
CREATE TABLE source_unlocks (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
buyer_id UUID REFERENCES users(id) ON DELETE SET NULL,
license_type TEXT NOT NULL, -- personal, commercial
amount_cents INTEGER NOT NULL,
platform_fee_cents INTEGER NOT NULL,
stripe_payment_intent_id TEXT,
created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE creator_engagement_snapshots (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
creator_id UUID REFERENCES users(id) ON DELETE CASCADE,
month DATE NOT NULL,
total_score FLOAT NOT NULL,
pool_share FLOAT,
payout_cents INTEGER,
paid_at TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW()
);
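
The `pool_share` and `payout_cents` columns imply a monthly engagement pool split pro-rata by creator score. A hypothetical payout calculation (pool size and floor-rounding policy are assumptions; this feature is dormant per D011):

```python
def monthly_payouts(scores: dict[str, float], pool_cents: int) -> dict[str, int]:
    """Split a monthly pool pro-rata by engagement score, floored to whole cents."""
    total = sum(scores.values())
    if total <= 0:
        return {creator: 0 for creator in scores}
    return {creator: int(pool_cents * score / total)
            for creator, score in scores.items()}
```
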
-- ════════════════════════════════════════════════════════════
-- INDEXES
-- ════════════════════════════════════════════════════════════
-- Feed performance
CREATE INDEX idx_shaders_score ON shaders(score DESC) WHERE is_public = TRUE;
CREATE INDEX idx_shaders_created ON shaders(created_at DESC) WHERE is_public = TRUE;
CREATE INDEX idx_shaders_tags ON shaders USING GIN(tags);
CREATE INDEX idx_shaders_render_status ON shaders(render_status) WHERE render_status != 'ready';
-- Recommendation (pgvector ANN — ivfflat, will rebuild after data exists)
-- NOTE: ivfflat indexes require data in the table to build properly.
-- Run these AFTER seeding initial data:
-- CREATE INDEX idx_shaders_style_vector ON shaders
-- USING ivfflat (style_vector vector_cosine_ops) WITH (lists = 100);
-- CREATE INDEX idx_users_taste_vector ON users
-- USING ivfflat (taste_vector vector_cosine_ops) WITH (lists = 50);
-- CREATE INDEX idx_desires_embedding ON desires
-- USING ivfflat (prompt_embedding vector_cosine_ops) WITH (lists = 50);
-- For now, use HNSW (works on empty tables, better perf at small scale)
CREATE INDEX idx_shaders_style_vector ON shaders
USING hnsw (style_vector vector_cosine_ops) WITH (m = 16, ef_construction = 64);
CREATE INDEX idx_users_taste_vector ON users
USING hnsw (taste_vector vector_cosine_ops) WITH (m = 16, ef_construction = 64);
CREATE INDEX idx_desires_embedding ON desires
USING hnsw (prompt_embedding vector_cosine_ops) WITH (m = 16, ef_construction = 64);
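
With the HNSW indexes in place, feed candidates come from a cosine KNN query against the user's taste vector. A sketch of the SQL the API layer might issue (placeholders follow asyncpg's `$n` style; the exact query shape is an assumption):

```python
def taste_feed_query(limit: int = 50) -> str:
    """KNN over shaders.style_vector using pgvector's cosine-distance operator (<=>)."""
    return (
        "SELECT id, title, 1 - (style_vector <=> $1) AS similarity "
        "FROM shaders "
        "WHERE is_public = TRUE AND render_status = 'ready' "
        "ORDER BY style_vector <=> $1 "  # index-assisted nearest-neighbor ordering
        f"LIMIT {int(limit)}"
    )
```

Ordering by the raw `<=>` distance (rather than the derived similarity) is what lets the HNSW index drive the scan.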
-- Engagement
CREATE INDEX idx_engagement_user ON engagement_events(user_id, created_at DESC);
CREATE INDEX idx_engagement_shader ON engagement_events(shader_id, event_type);
CREATE INDEX idx_engagement_session ON engagement_events(session_id, created_at DESC)
WHERE session_id IS NOT NULL;
-- Desires / bounties
CREATE INDEX idx_desires_status ON desires(status, heat_score DESC);
CREATE INDEX idx_desires_author ON desires(author_id);
-- API keys
CREATE INDEX idx_api_keys_user ON api_keys(user_id) WHERE revoked_at IS NULL;
CREATE INDEX idx_api_keys_prefix ON api_keys(key_prefix);
-- Votes
CREATE INDEX idx_votes_shader ON votes(shader_id);
CREATE INDEX idx_votes_user ON votes(user_id);
-- Comments
CREATE INDEX idx_comments_shader ON comments(shader_id, created_at);
CREATE INDEX idx_comments_parent ON comments(parent_id);
-- Text search
CREATE INDEX idx_shaders_title_trgm ON shaders USING GIN(title gin_trgm_ops);
CREATE INDEX idx_desires_prompt_trgm ON desires USING GIN(prompt_text gin_trgm_ops);

docker-compose.override.yml (new file, +38 lines)

# docker-compose.override.yml — Local dev overrides
# This file is automatically picked up by docker compose
services:
api:
volumes:
- ./services/api:/app
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
ports:
- "8000:8000" # Direct access for debugging
frontend:
volumes:
- ./services/frontend:/app
- /app/node_modules
command: npm run dev -- --host 0.0.0.0
ports:
- "5173:5173" # Vite dev server direct access
mcp:
volumes:
- ./services/mcp:/app
ports:
- "3200:3200" # Direct MCP access
renderer:
ports:
- "3100:3100" # Direct renderer access
postgres:
ports:
- "5432:5432" # Direct DB access for dev tools
redis:
ports:
- "6379:6379" # Direct Redis access for dev tools

docker-compose.yml (new file, +144 lines)

services:
# ─── Reverse Proxy ──────────────────────────────────────────
nginx:
image: nginx:alpine
ports:
- "80:80"
- "443:443"
volumes:
- ./services/nginx/conf:/etc/nginx/conf.d:ro
- ./services/nginx/certs:/etc/ssl/certs:ro
depends_on:
api:
condition: service_healthy
frontend:
condition: service_started
restart: unless-stopped
# ─── Frontend (React + Vite) ────────────────────────────────
frontend:
build:
context: ./services/frontend
dockerfile: Dockerfile
environment:
- VITE_API_URL=${VITE_API_URL:-http://localhost/api}
- VITE_MCP_URL=${VITE_MCP_URL:-http://localhost/mcp}
restart: unless-stopped
# ─── API (FastAPI) ──────────────────────────────────────────
api:
build:
context: ./services/api
dockerfile: Dockerfile
environment:
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
- JWT_SECRET=${JWT_SECRET}
- JWT_ALGORITHM=${JWT_ALGORITHM:-HS256}
- JWT_ACCESS_TOKEN_EXPIRE_MINUTES=${JWT_ACCESS_TOKEN_EXPIRE_MINUTES:-15}
- JWT_REFRESH_TOKEN_EXPIRE_DAYS=${JWT_REFRESH_TOKEN_EXPIRE_DAYS:-30}
- TURNSTILE_SECRET=${TURNSTILE_SECRET}
- STRIPE_SECRET_KEY=${STRIPE_SECRET_KEY}
- STRIPE_WEBHOOK_SECRET=${STRIPE_WEBHOOK_SECRET}
- RENDERER_URL=http://renderer:3100
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY}
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 10s
timeout: 5s
retries: 5
restart: unless-stopped
# ─── MCP Server ─────────────────────────────────────────────
mcp:
build:
context: ./services/mcp
dockerfile: Dockerfile
environment:
- API_BASE_URL=http://api:8000
- MCP_API_KEY_SALT=${MCP_API_KEY_SALT}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
depends_on:
api:
condition: service_healthy
restart: unless-stopped
# ─── Renderer (Headless Chromium) ───────────────────────────
renderer:
build:
context: ./services/renderer
dockerfile: Dockerfile
shm_size: "512mb"
environment:
- MAX_RENDER_DURATION=${MAX_RENDER_DURATION:-8}
- OUTPUT_DIR=${RENDER_OUTPUT_DIR:-/renders}
volumes:
- renders:/renders
restart: unless-stopped
# ─── Worker (Celery) ────────────────────────────────────────
worker:
build:
context: ./services/api
dockerfile: Dockerfile
command: celery -A app.worker.celery_app worker --loglevel=info --concurrency=4
environment:
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
- RENDERER_URL=http://renderer:3100
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY}
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
renderer:
condition: service_started
restart: unless-stopped
# ─── PostgreSQL + pgvector ──────────────────────────────────
postgres:
image: pgvector/pgvector:pg16
environment:
- POSTGRES_USER=${POSTGRES_USER:-fracta}
- POSTGRES_PASSWORD=${DB_PASS}
- POSTGRES_DB=${POSTGRES_DB:-fractafrag}
volumes:
- pgdata:/var/lib/postgresql/data
- ./db/init.sql:/docker-entrypoint-initdb.d/01-init.sql:ro
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-fracta} -d ${POSTGRES_DB:-fractafrag}"]
interval: 5s
timeout: 5s
retries: 5
restart: unless-stopped
# ─── Redis ──────────────────────────────────────────────────
redis:
image: redis:7-alpine
command: redis-server --appendonly yes --maxmemory 256mb --maxmemory-policy allkeys-lru
volumes:
- redisdata:/data
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 5s
timeout: 5s
retries: 5
restart: unless-stopped
volumes:
pgdata:
redisdata:
renders:

scripts/seed.py (new file, +115 lines)

"""Fractafrag — Development seed data."""
# TODO: Implement seed script (Track A completion)
# This script will:
# 1. Create test users (admin, moderator, regular, pro, studio)
# 2. Insert sample shaders with known-good GLSL code
# 3. Create sample desires/bounties
# 4. Set up initial engagement data for recommendation testing
SAMPLE_SHADERS = [
{
"title": "Plasma Wave",
"glsl_code": """
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
vec2 uv = fragCoord / iResolution.xy;
float t = iTime;
float c = sin(uv.x * 10.0 + t) + sin(uv.y * 10.0 + t * 0.7);
c += sin((uv.x + uv.y) * 5.0 + t * 1.3);
c = c / 3.0 * 0.5 + 0.5;
fragColor = vec4(c, c * 0.5, 1.0 - c, 1.0);
}
""",
"tags": ["plasma", "colorful", "animated"],
"shader_type": "2d",
},
{
"title": "Fractal Noise",
"glsl_code": """
float hash(vec2 p) {
return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}
float noise(vec2 p) {
vec2 i = floor(p);
vec2 f = fract(p);
f = f * f * (3.0 - 2.0 * f);
return mix(
mix(hash(i), hash(i + vec2(1.0, 0.0)), f.x),
mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), f.x),
f.y
);
}
float fbm(vec2 p) {
float v = 0.0, a = 0.5;
for (int i = 0; i < 6; i++) {
v += a * noise(p);
p *= 2.0;
a *= 0.5;
}
return v;
}
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
vec2 uv = fragCoord / iResolution.xy;
float n = fbm(uv * 5.0 + iTime * 0.3);
fragColor = vec4(n * 0.3, n * 0.6, n, 1.0);
}
""",
"tags": ["noise", "fractal", "generative"],
"shader_type": "2d",
},
{
"title": "Ray March Sphere",
"glsl_code": """
float sdSphere(vec3 p, float r) { return length(p) - r; }
float map(vec3 p) {
return sdSphere(p - vec3(0.0, 0.0, 0.0), 1.0);
}
vec3 getNormal(vec3 p) {
vec2 e = vec2(0.001, 0.0);
return normalize(vec3(
map(p + e.xyy) - map(p - e.xyy),
map(p + e.yxy) - map(p - e.yxy),
map(p + e.yyx) - map(p - e.yyx)
));
}
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
vec2 uv = (fragCoord - 0.5 * iResolution.xy) / iResolution.y;
vec3 ro = vec3(0.0, 0.0, -3.0);
vec3 rd = normalize(vec3(uv, 1.0));
float t = 0.0;
for (int i = 0; i < 64; i++) {
vec3 p = ro + rd * t;
float d = map(p);
if (d < 0.001) break;
t += d;
if (t > 20.0) break;
}
vec3 col = vec3(0.05);
if (t < 20.0) {
vec3 p = ro + rd * t;
vec3 n = getNormal(p);
vec3 light = normalize(vec3(sin(iTime), 1.0, cos(iTime)));
float diff = max(dot(n, light), 0.0);
col = vec3(0.2, 0.5, 0.9) * diff + vec3(0.05);
}
fragColor = vec4(col, 1.0);
}
""",
"tags": ["raymarching", "3d", "sphere", "lighting"],
"shader_type": "3d",
},
]
if __name__ == "__main__":
print("Seed script — run with: python scripts/seed.py")
print(f"Sample shaders available: {len(SAMPLE_SHADERS)}")
# TODO: Connect to DB and insert seed data
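
Before inserting, the seed script could sanity-check each sample with a cheap structural lint. This is a stand-in for real GLSL compilation (which needs a GL context, i.e. the renderer service); the specific checks are assumptions:

```python
def lint_shader(glsl: str) -> list[str]:
    """Cheap structural checks on a Shadertoy-style fragment shader; [] means OK."""
    problems = []
    if "mainImage" not in glsl:
        problems.append("missing mainImage entry point")
    if glsl.count("{") != glsl.count("}"):
        problems.append("unbalanced braces")
    if glsl.count("(") != glsl.count(")"):
        problems.append("unbalanced parentheses")
    return problems
```

The renderer remains the ground truth: anything passing the lint still goes through a headless render before `render_status` flips to `ready`.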

services/api/Dockerfile (new file, +21 lines)

FROM python:3.12-slim
WORKDIR /app
# Install system deps
RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
build-essential \
&& rm -rf /var/lib/apt/lists/*
# Install Python deps
COPY pyproject.toml .
RUN pip install --no-cache-dir -e ".[dev]"
# Copy app code
COPY . .
EXPOSE 8000
# Default command (overridden in dev by docker-compose.override.yml)
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

services/api/alembic.ini (new file, +39 lines)

# Alembic configuration
[alembic]
script_location = migrations
# placeholder; migrations/env.py should override this from DATABASE_URL_SYNC at runtime
sqlalchemy.url = postgresql://fracta:changeme@localhost:5432/fractafrag
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

services/api/app/__init__.py (new file, +1 line)

"""App package."""

services/api/app/config.py (new file, +47 lines)

"""Fractafrag API — Application configuration."""
from pydantic_settings import BaseSettings
from functools import lru_cache
class Settings(BaseSettings):
"""Application settings loaded from environment variables."""
# ── Database ──────────────────────────────────────────────
database_url: str = "postgresql+asyncpg://fracta:changeme@postgres:5432/fractafrag"
database_url_sync: str = "postgresql://fracta:changeme@postgres:5432/fractafrag"
# ── Redis ─────────────────────────────────────────────────
redis_url: str = "redis://redis:6379/0"
# ── JWT ───────────────────────────────────────────────────
jwt_secret: str = "changeme"
jwt_algorithm: str = "HS256"
jwt_access_token_expire_minutes: int = 15
jwt_refresh_token_expire_days: int = 30
# ── Cloudflare Turnstile ──────────────────────────────────
turnstile_secret: str = ""
# ── Stripe ────────────────────────────────────────────────
stripe_secret_key: str = ""
stripe_webhook_secret: str = ""
# ── Renderer ──────────────────────────────────────────────
renderer_url: str = "http://renderer:3100"
# ── BYOK Encryption ──────────────────────────────────────
byok_master_key: str = "changeme"
# ── AI Providers ──────────────────────────────────────────
anthropic_api_key: str = ""
openai_api_key: str = ""
class Config:
env_file = ".env"
case_sensitive = False
@lru_cache
def get_settings() -> Settings:
return Settings()

services/api/app/database.py (new file, +36 lines)

"""Fractafrag API — Database engine and session management."""
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker, AsyncSession
from sqlalchemy.orm import DeclarativeBase
from app.config import get_settings
settings = get_settings()
engine = create_async_engine(
settings.database_url,
echo=False,
pool_size=20,
max_overflow=10,
pool_pre_ping=True,
)
async_session = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)
class Base(DeclarativeBase):
"""Base class for all SQLAlchemy ORM models."""
pass
async def get_db() -> AsyncSession:
"""FastAPI dependency: yields an async DB session."""
async with async_session() as session:
try:
yield session
await session.commit()
except Exception:
await session.rollback()
raise
finally:
await session.close()

services/api/app/main.py (new file, +51 lines)

"""Fractafrag API — Main application entrypoint."""
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.database import engine
from app.redis import close_redis
from app.routers import auth, shaders, feed, votes, generate, desires, users, payments, mcp_keys, health
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Application startup and shutdown lifecycle."""
# Startup
yield
# Shutdown
await engine.dispose()
await close_redis()
app = FastAPI(
title="Fractafrag API",
description="GLSL shader platform — browse, create, generate, and share real-time GPU visuals",
version="0.1.0",
lifespan=lifespan,
docs_url="/api/docs",
redoc_url="/api/redoc",
openapi_url="/api/openapi.json",
)
# CORS — permissive in dev, lock down in production
app.add_middleware(
CORSMiddleware,
allow_origins=["*"], # TODO: restrict in production
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# ── Mount Routers ─────────────────────────────────────────
app.include_router(health.router)
app.include_router(auth.router, prefix="/api/v1/auth", tags=["auth"])
app.include_router(shaders.router, prefix="/api/v1/shaders", tags=["shaders"])
app.include_router(feed.router, prefix="/api/v1/feed", tags=["feed"])
app.include_router(votes.router, prefix="/api/v1", tags=["votes"])
app.include_router(generate.router, prefix="/api/v1/generate", tags=["generate"])
app.include_router(desires.router, prefix="/api/v1/desires", tags=["desires"])
app.include_router(users.router, prefix="/api/v1", tags=["users"])
app.include_router(payments.router, prefix="/api/v1/payments", tags=["payments"])
app.include_router(mcp_keys.router, prefix="/api/v1/me/api-keys", tags=["api-keys"])

"""Middleware package."""

"""Fractafrag — JWT Authentication middleware and dependencies."""
from datetime import datetime, timedelta, timezone
from uuid import UUID
from typing import Optional
from fastapi import Depends, HTTPException, status, Request, Response
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from jose import jwt, JWTError
from passlib.context import CryptContext
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.config import get_settings
from app.database import get_db
from app.models import User
from app.redis import get_redis
settings = get_settings()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
bearer_scheme = HTTPBearer(auto_error=False)
# ── Password Hashing ──────────────────────────────────────
def hash_password(password: str) -> str:
return pwd_context.hash(password)
def verify_password(plain: str, hashed: str) -> bool:
return pwd_context.verify(plain, hashed)
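CryptContext wraps the standard salted-slow-hash pattern: random salt, an intentionally expensive digest, and a constant-time comparison. A dependency-free sketch of that same contract, with PBKDF2 standing in for bcrypt (the service itself uses bcrypt via passlib as above):

```python
import hashlib, hmac, os

def pbkdf2_hash(password: str, iterations: int = 200_000) -> str:
    """Salted PBKDF2-SHA256 hash, stored as scheme$iterations$salt$digest."""
    salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${dk.hex()}"

def pbkdf2_verify(plain: str, stored: str) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    _scheme, iters, salt_hex, dk_hex = stored.split("$")
    dk = hashlib.pbkdf2_hmac("sha256", plain.encode(), bytes.fromhex(salt_hex), int(iters))
    return hmac.compare_digest(dk.hex(), dk_hex)
```

Because the salt is stored alongside the digest, the same password hashes to a different string every time, yet still verifies.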
# ── JWT Token Management ──────────────────────────────────
def create_access_token(user_id: UUID, username: str, role: str, tier: str) -> str:
payload = {
"sub": str(user_id),
"username": username,
"role": role,
"tier": tier,
"iat": datetime.now(timezone.utc),
"exp": datetime.now(timezone.utc) + timedelta(minutes=settings.jwt_access_token_expire_minutes),
}
return jwt.encode(payload, settings.jwt_secret, algorithm=settings.jwt_algorithm)
def create_refresh_token(user_id: UUID) -> str:
payload = {
"sub": str(user_id),
"type": "refresh",
"iat": datetime.now(timezone.utc),
"exp": datetime.now(timezone.utc) + timedelta(days=settings.jwt_refresh_token_expire_days),
}
return jwt.encode(payload, settings.jwt_secret, algorithm=settings.jwt_algorithm)
def decode_token(token: str) -> dict:
try:
return jwt.decode(token, settings.jwt_secret, algorithms=[settings.jwt_algorithm])
except JWTError:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid or expired token")
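`decode_token` delegates the heavy lifting to python-jose, but the moving parts of an HS256 JWT are small enough to sketch with the standard library. The function names below are illustrative only; in production, keep using a maintained library:

```python
import base64, hashlib, hmac, json, time

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwt_encode_sketch(payload: dict, secret: str) -> str:
    """header.payload.signature, each segment base64url, signed with HMAC-SHA256."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def jwt_decode_sketch(token: str, secret: str) -> dict:
    """Verify the signature first, then check expiry, then trust the claims."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        raise ValueError("invalid signature")
    payload = json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
    if "exp" in payload and payload["exp"] < time.time():
        raise ValueError("token expired")
    return payload
```

The order matters: claims are only meaningful after the signature check, which is why `decode_token` maps any `JWTError` straight to a 401.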
# ── Refresh Token Blocklist (Redis) ───────────────────────
async def is_token_blocklisted(token: str) -> bool:
redis = await get_redis()
    return bool(await redis.exists(f"blocklist:{token}"))
async def blocklist_token(token: str, ttl_seconds: int):
redis = await get_redis()
await redis.setex(f"blocklist:{token}", ttl_seconds, "1")
# ── FastAPI Dependencies ──────────────────────────────────
async def get_current_user(
credentials: Optional[HTTPAuthorizationCredentials] = Depends(bearer_scheme),
db: AsyncSession = Depends(get_db),
) -> User:
"""Require authentication. Returns the current user."""
if credentials is None:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Not authenticated")
payload = decode_token(credentials.credentials)
if payload.get("type") == "refresh":
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Cannot use refresh token for API access")
user_id = payload.get("sub")
if not user_id:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token payload")
result = await db.execute(select(User).where(User.id == UUID(user_id)))
user = result.scalar_one_or_none()
if user is None:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="User not found")
return user
async def get_optional_user(
credentials: Optional[HTTPAuthorizationCredentials] = Depends(bearer_scheme),
db: AsyncSession = Depends(get_db),
) -> Optional[User]:
"""Optional authentication. Returns user or None for anonymous requests."""
if credentials is None:
return None
try:
payload = decode_token(credentials.credentials)
if payload.get("type") == "refresh":
return None
user_id = payload.get("sub")
if not user_id:
return None
result = await db.execute(select(User).where(User.id == UUID(user_id)))
return result.scalar_one_or_none()
except HTTPException:
return None
def require_role(*roles: str):
"""Dependency factory: require user to have one of the specified roles."""
async def check_role(user: User = Depends(get_current_user)) -> User:
if user.role not in roles:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Insufficient permissions")
return user
return check_role
def require_tier(*tiers: str):
"""Dependency factory: require user to have one of the specified subscription tiers."""
async def check_tier(user: User = Depends(get_current_user)) -> User:
if user.subscription_tier not in tiers:
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail=f"This feature requires one of: {', '.join(tiers)}"
)
return user
return check_tier

"""Fractafrag — Redis-backed rate limiting middleware."""
import time
from fastapi import Request, HTTPException, status
from app.redis import get_redis
async def check_rate_limit(
key: str,
max_requests: int,
window_seconds: int = 60,
):
"""
Check and enforce rate limit.
Args:
key: Unique identifier (e.g., "ip:1.2.3.4" or "user:uuid")
max_requests: Maximum requests allowed in the window
window_seconds: Time window in seconds
Raises:
HTTPException 429 if rate limit exceeded
"""
redis = await get_redis()
redis_key = f"ratelimit:{key}"
pipe = redis.pipeline()
now = time.time()
window_start = now - window_seconds
# Remove old entries outside the window
pipe.zremrangebyscore(redis_key, 0, window_start)
# Count current entries
pipe.zcard(redis_key)
# Add current request
pipe.zadd(redis_key, {str(now): now})
# Set TTL on the key
pipe.expire(redis_key, window_seconds)
results = await pipe.execute()
current_count = results[1]
if current_count >= max_requests:
raise HTTPException(
status_code=status.HTTP_429_TOO_MANY_REQUESTS,
detail=f"Rate limit exceeded. Max {max_requests} requests per {window_seconds}s.",
headers={"Retry-After": str(window_seconds)},
)
async def rate_limit_ip(request: Request, max_requests: int = 100):
"""Rate limit by IP address. Default: 100 req/min."""
ip = request.client.host if request.client else "unknown"
await check_rate_limit(f"ip:{ip}", max_requests)
async def rate_limit_user(user_id: str, max_requests: int = 300):
"""Rate limit by user ID. Default: 300 req/min."""
await check_rate_limit(f"user:{user_id}", max_requests)

"""Models package."""
from app.models.models import (
User, Shader, Vote, EngagementEvent, Desire, DesireCluster,
BountyTip, CreatorPayout, ApiKey, GenerationLog, Comment,
SourceUnlock, CreatorEngagementSnapshot,
)
__all__ = [
"User", "Shader", "Vote", "EngagementEvent", "Desire", "DesireCluster",
"BountyTip", "CreatorPayout", "ApiKey", "GenerationLog", "Comment",
"SourceUnlock", "CreatorEngagementSnapshot",
]

"""Fractafrag — SQLAlchemy ORM Models."""
import uuid
from datetime import datetime
from sqlalchemy import (
Column, String, Text, Boolean, Integer, Float, SmallInteger,
ForeignKey, DateTime, UniqueConstraint, Index, CheckConstraint,
)
from sqlalchemy.dialects.postgresql import UUID, JSONB, ARRAY
from pgvector.sqlalchemy import Vector
from sqlalchemy.orm import relationship
from app.database import Base
class User(Base):
__tablename__ = "users"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
username = Column(String, unique=True, nullable=False, index=True)
email = Column(String, unique=True, nullable=False, index=True)
password_hash = Column(String, nullable=False)
role = Column(String, nullable=False, default="user")
trust_tier = Column(String, nullable=False, default="standard")
stripe_customer_id = Column(String, nullable=True)
subscription_tier = Column(String, default="free")
ai_credits_remaining = Column(Integer, default=0)
taste_vector = Column(Vector(512), nullable=True)
# Creator economy stubs
is_verified_creator = Column(Boolean, default=False)
verified_creator_at = Column(DateTime(timezone=True), nullable=True)
stripe_connect_account_id = Column(String, nullable=True)
# Timestamps
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
last_active_at = Column(DateTime(timezone=True), nullable=True)
# Relationships
shaders = relationship("Shader", back_populates="author")
votes = relationship("Vote", back_populates="user")
api_keys = relationship("ApiKey", back_populates="user")
class Shader(Base):
__tablename__ = "shaders"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
author_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
title = Column(String, nullable=False)
description = Column(Text, nullable=True)
glsl_code = Column(Text, nullable=False)
is_public = Column(Boolean, default=True)
is_ai_generated = Column(Boolean, default=False)
ai_provider = Column(String, nullable=True)
thumbnail_url = Column(String, nullable=True)
preview_url = Column(String, nullable=True)
render_status = Column(String, default="pending")
style_vector = Column(Vector(512), nullable=True)
style_metadata = Column(JSONB, nullable=True)
tags = Column(ARRAY(String), default=list)
shader_type = Column(String, default="2d")
forked_from = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="SET NULL"), nullable=True)
view_count = Column(Integer, default=0)
score = Column(Float, default=0.0)
# Creator economy stubs
access_tier = Column(String, default="open")
source_unlock_price_cents = Column(Integer, nullable=True)
commercial_license_price_cents = Column(Integer, nullable=True)
verified_creator_shader = Column(Boolean, default=False)
# Timestamps
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
updated_at = Column(DateTime(timezone=True), default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
author = relationship("User", back_populates="shaders")
votes = relationship("Vote", back_populates="shader")
class Vote(Base):
__tablename__ = "votes"
__table_args__ = (UniqueConstraint("user_id", "shader_id"),)
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
user_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="CASCADE"), nullable=False)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
value = Column(SmallInteger, nullable=False)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
user = relationship("User", back_populates="votes")
shader = relationship("Shader", back_populates="votes")
class EngagementEvent(Base):
__tablename__ = "engagement_events"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
user_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
session_id = Column(String, nullable=True)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
event_type = Column(String, nullable=False)
dwell_secs = Column(Float, nullable=True)
    event_metadata = Column("metadata", JSONB, nullable=True)  # "metadata" is reserved on Declarative models
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class Desire(Base):
__tablename__ = "desires"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
author_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
prompt_text = Column(Text, nullable=False)
prompt_embedding = Column(Vector(512), nullable=True)
style_hints = Column(JSONB, nullable=True)
tip_amount_cents = Column(Integer, default=0)
status = Column(String, default="open")
heat_score = Column(Float, default=1.0)
fulfilled_by_shader = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="SET NULL"), nullable=True)
fulfilled_at = Column(DateTime(timezone=True), nullable=True)
expires_at = Column(DateTime(timezone=True), nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class DesireCluster(Base):
__tablename__ = "desire_clusters"
cluster_id = Column(UUID(as_uuid=True), primary_key=True)
desire_id = Column(UUID(as_uuid=True), ForeignKey("desires.id", ondelete="CASCADE"), primary_key=True)
similarity = Column(Float)
class BountyTip(Base):
__tablename__ = "bounty_tips"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
desire_id = Column(UUID(as_uuid=True), ForeignKey("desires.id", ondelete="CASCADE"), nullable=False)
tipper_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
amount_cents = Column(Integer, nullable=False)
stripe_payment_intent_id = Column(String, nullable=True)
status = Column(String, default="held")
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class CreatorPayout(Base):
__tablename__ = "creator_payouts"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
creator_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
desire_id = Column(UUID(as_uuid=True), ForeignKey("desires.id", ondelete="SET NULL"), nullable=True)
gross_amount_cents = Column(Integer)
platform_fee_cents = Column(Integer)
net_amount_cents = Column(Integer)
stripe_transfer_id = Column(String, nullable=True)
status = Column(String, default="pending")
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class ApiKey(Base):
__tablename__ = "api_keys"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
user_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="CASCADE"), nullable=False)
key_hash = Column(String, unique=True, nullable=False)
key_prefix = Column(String, nullable=False)
name = Column(String, nullable=True)
trust_tier = Column(String, default="probation")
submissions_approved = Column(Integer, default=0)
rate_limit_per_hour = Column(Integer, default=10)
last_used_at = Column(DateTime(timezone=True), nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
revoked_at = Column(DateTime(timezone=True), nullable=True)
user = relationship("User", back_populates="api_keys")
class GenerationLog(Base):
__tablename__ = "generation_log"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
user_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="SET NULL"), nullable=True)
provider = Column(String, nullable=False)
prompt_text = Column(Text, nullable=True)
tokens_used = Column(Integer, nullable=True)
cost_cents = Column(Integer, nullable=True)
success = Column(Boolean, nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class Comment(Base):
__tablename__ = "comments"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
author_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
body = Column(Text, nullable=False)
parent_id = Column(UUID(as_uuid=True), ForeignKey("comments.id", ondelete="CASCADE"), nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
# Creator economy stubs (dormant)
class SourceUnlock(Base):
__tablename__ = "source_unlocks"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
buyer_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL"), nullable=True)
license_type = Column(String, nullable=False)
amount_cents = Column(Integer, nullable=False)
platform_fee_cents = Column(Integer, nullable=False)
stripe_payment_intent_id = Column(String, nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
class CreatorEngagementSnapshot(Base):
__tablename__ = "creator_engagement_snapshots"
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
creator_id = Column(UUID(as_uuid=True), ForeignKey("users.id", ondelete="CASCADE"), nullable=False)
month = Column(DateTime, nullable=False)
total_score = Column(Float, nullable=False)
pool_share = Column(Float, nullable=True)
payout_cents = Column(Integer, nullable=True)
paid_at = Column(DateTime(timezone=True), nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)

services/api/app/redis.py

"""Fractafrag — Redis connection manager."""
import redis.asyncio as redis
from app.config import get_settings
settings = get_settings()
redis_client: redis.Redis | None = None
async def get_redis() -> redis.Redis:
"""Get or create the Redis connection."""
global redis_client
if redis_client is None:
redis_client = redis.from_url(
settings.redis_url,
encoding="utf-8",
decode_responses=True,
)
return redis_client
async def close_redis():
"""Close Redis connection on shutdown."""
global redis_client
if redis_client:
await redis_client.close()
redis_client = None

"""Routers package."""

"""Auth router — registration, login, refresh, logout."""
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, Response, Request, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
import httpx
from app.database import get_db
from app.config import get_settings
from app.models import User
from app.schemas import UserRegister, UserLogin, TokenResponse, UserMe
from app.middleware.auth import (
hash_password, verify_password,
create_access_token, create_refresh_token,
decode_token, blocklist_token, is_token_blocklisted,
get_current_user,
)
router = APIRouter()
settings = get_settings()
REFRESH_COOKIE_NAME = "fractafrag_refresh"
REFRESH_COOKIE_MAX_AGE = 30 * 24 * 60 * 60 # 30 days
async def verify_turnstile(token: str) -> bool:
"""Verify Cloudflare Turnstile token server-side."""
if not settings.turnstile_secret:
return True # Skip in dev if not configured
async with httpx.AsyncClient() as client:
resp = await client.post(
"https://challenges.cloudflare.com/turnstile/v0/siteverify",
data={"secret": settings.turnstile_secret, "response": token},
)
result = resp.json()
return result.get("success", False)
def set_refresh_cookie(response: Response, token: str):
response.set_cookie(
key=REFRESH_COOKIE_NAME,
value=token,
max_age=REFRESH_COOKIE_MAX_AGE,
httponly=True,
secure=True,
samesite="lax",
path="/api/v1/auth",
)
@router.post("/register", response_model=TokenResponse, status_code=status.HTTP_201_CREATED)
async def register(
body: UserRegister,
response: Response,
db: AsyncSession = Depends(get_db),
):
# Verify Turnstile
if not await verify_turnstile(body.turnstile_token):
raise HTTPException(status_code=400, detail="CAPTCHA verification failed")
# Check for existing user
existing = await db.execute(
select(User).where((User.email == body.email) | (User.username == body.username))
)
if existing.scalar_one_or_none():
raise HTTPException(status_code=409, detail="Username or email already taken")
# Create user
user = User(
username=body.username,
email=body.email,
password_hash=hash_password(body.password),
)
db.add(user)
await db.flush()
# Issue tokens
access = create_access_token(user.id, user.username, user.role, user.subscription_tier)
refresh = create_refresh_token(user.id)
set_refresh_cookie(response, refresh)
return TokenResponse(access_token=access)
@router.post("/login", response_model=TokenResponse)
async def login(
body: UserLogin,
response: Response,
db: AsyncSession = Depends(get_db),
):
if not await verify_turnstile(body.turnstile_token):
raise HTTPException(status_code=400, detail="CAPTCHA verification failed")
result = await db.execute(select(User).where(User.email == body.email))
user = result.scalar_one_or_none()
if not user or not verify_password(body.password, user.password_hash):
raise HTTPException(status_code=401, detail="Invalid email or password")
# Update last active
user.last_active_at = datetime.now(timezone.utc)
access = create_access_token(user.id, user.username, user.role, user.subscription_tier)
refresh = create_refresh_token(user.id)
set_refresh_cookie(response, refresh)
return TokenResponse(access_token=access)
@router.post("/refresh", response_model=TokenResponse)
async def refresh_token(
request: Request,
response: Response,
db: AsyncSession = Depends(get_db),
):
token = request.cookies.get(REFRESH_COOKIE_NAME)
if not token:
raise HTTPException(status_code=401, detail="No refresh token")
if await is_token_blocklisted(token):
raise HTTPException(status_code=401, detail="Token has been revoked")
payload = decode_token(token)
if payload.get("type") != "refresh":
raise HTTPException(status_code=401, detail="Not a refresh token")
    user_id = payload.get("sub")
    if not user_id:
        raise HTTPException(status_code=401, detail="Invalid token payload")
    from uuid import UUID
    result = await db.execute(select(User).where(User.id == UUID(user_id)))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=401, detail="User not found")
# Rotate: blocklist old refresh, issue new pair
ttl = settings.jwt_refresh_token_expire_days * 86400
await blocklist_token(token, ttl)
access = create_access_token(user.id, user.username, user.role, user.subscription_tier)
new_refresh = create_refresh_token(user.id)
set_refresh_cookie(response, new_refresh)
return TokenResponse(access_token=access)
@router.post("/logout", status_code=status.HTTP_204_NO_CONTENT)
async def logout(
request: Request,
response: Response,
user: User = Depends(get_current_user),
):
token = request.cookies.get(REFRESH_COOKIE_NAME)
if token:
ttl = settings.jwt_refresh_token_expire_days * 86400
await blocklist_token(token, ttl)
response.delete_cookie(REFRESH_COOKIE_NAME, path="/api/v1/auth")

"""Desires & Bounties router."""
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.database import get_db
from app.models import User, Desire
from app.schemas import DesireCreate, DesirePublic
from app.middleware.auth import get_current_user, require_tier
router = APIRouter()
@router.get("", response_model=list[DesirePublic])
async def list_desires(
status_filter: str | None = Query(None, alias="status"),
min_heat: float = Query(0, ge=0),
limit: int = Query(20, ge=1, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
):
query = select(Desire).where(Desire.heat_score >= min_heat)
if status_filter:
query = query.where(Desire.status == status_filter)
else:
query = query.where(Desire.status == "open")
query = query.order_by(Desire.heat_score.desc()).limit(limit).offset(offset)
result = await db.execute(query)
return result.scalars().all()
@router.get("/{desire_id}", response_model=DesirePublic)
async def get_desire(desire_id: UUID, db: AsyncSession = Depends(get_db)):
result = await db.execute(select(Desire).where(Desire.id == desire_id))
desire = result.scalar_one_or_none()
if not desire:
raise HTTPException(status_code=404, detail="Desire not found")
return desire
@router.post("", response_model=DesirePublic, status_code=status.HTTP_201_CREATED)
async def create_desire(
body: DesireCreate,
db: AsyncSession = Depends(get_db),
user: User = Depends(require_tier("pro", "studio")),
):
desire = Desire(
author_id=user.id,
prompt_text=body.prompt_text,
style_hints=body.style_hints,
)
db.add(desire)
await db.flush()
# TODO: Embed prompt text (Track G)
# TODO: Check similarity clustering (Track G)
# TODO: Enqueue process_desire worker job (Track G)
return desire
@router.post("/{desire_id}/fulfill", status_code=status.HTTP_200_OK)
async def fulfill_desire(
desire_id: UUID,
shader_id: UUID = Query(..., description="Shader that fulfills this desire"),
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""Mark a desire as fulfilled by a shader. (Track G)"""
desire = (await db.execute(select(Desire).where(Desire.id == desire_id))).scalar_one_or_none()
if not desire:
raise HTTPException(status_code=404, detail="Desire not found")
if desire.status != "open":
raise HTTPException(status_code=400, detail="Desire is not open")
from datetime import datetime, timezone
desire.status = "fulfilled"
desire.fulfilled_by_shader = shader_id
desire.fulfilled_at = datetime.now(timezone.utc)
return {"status": "fulfilled", "desire_id": desire_id, "shader_id": shader_id}
@router.post("/{desire_id}/tip")
async def tip_desire(
desire_id: UUID,
db: AsyncSession = Depends(get_db),
user: User = Depends(require_tier("pro", "studio")),
):
"""Add a tip to a bounty. (Track H — stub)"""
raise HTTPException(
status_code=status.HTTP_501_NOT_IMPLEMENTED,
detail="Bounty tipping coming in M4"
)

"""Feed router — personalized feed, trending, new."""
from fastapi import APIRouter, Depends, Query
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.database import get_db
from app.models import User, Shader
from app.schemas import ShaderFeedItem, DwellReport
from app.middleware.auth import get_optional_user, get_current_user
router = APIRouter()
@router.get("", response_model=list[ShaderFeedItem])
async def get_feed(
limit: int = Query(20, ge=1, le=50),
cursor: str | None = Query(None),
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
"""
Personalized feed for authenticated users (pgvector taste match).
Trending/new for anonymous users.
"""
# TODO: Implement full recommendation engine (Track F)
# For now: return newest public shaders
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.order_by(Shader.created_at.desc())
.limit(limit)
)
result = await db.execute(query)
return result.scalars().all()
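The Track F "taste match" the docstring promises amounts to nearest-neighbor search over style vectors. In production that would be a pgvector distance query pushed into Postgres (pgvector-sqlalchemy exposes comparators like `cosine_distance` on `Vector` columns), not Python, but the ranking it computes can be sketched directly. Names below are illustrative:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_by_taste(taste: list[float], shaders: list[tuple[str, list[float]]]):
    """Return (shader_id, style_vector) pairs, best taste match first."""
    return sorted(shaders, key=lambda s: cosine_similarity(taste, s[1]), reverse=True)
```

Keeping both `taste_vector` and `style_vector` at the same dimensionality (512 in the schema) is what makes this comparison well-defined.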
@router.get("/trending", response_model=list[ShaderFeedItem])
async def get_trending(
limit: int = Query(20, ge=1, le=50),
db: AsyncSession = Depends(get_db),
):
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.order_by(Shader.score.desc())
.limit(limit)
)
result = await db.execute(query)
return result.scalars().all()
@router.get("/new", response_model=list[ShaderFeedItem])
async def get_new(
limit: int = Query(20, ge=1, le=50),
db: AsyncSession = Depends(get_db),
):
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.order_by(Shader.created_at.desc())
.limit(limit)
)
result = await db.execute(query)
return result.scalars().all()
@router.post("/dwell", status_code=204)
async def report_dwell(
body: DwellReport,
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
"""Report dwell time signal for recommendation engine."""
from app.models import EngagementEvent
event = EngagementEvent(
user_id=user.id if user else None,
session_id=body.session_id,
shader_id=body.shader_id,
event_type="dwell",
dwell_secs=body.dwell_secs,
        event_metadata={"replayed": body.replayed},
)
db.add(event)
# TODO: Update user taste vector (Track F)
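One plausible shape for that Track F update is an exponential moving average: nudge the user's taste vector toward the shader's style vector, with longer dwell producing a stronger nudge. The weighting scheme here (30s saturation, 0.2 cap) is an assumption for illustration, not the Track F spec:

```python
def update_taste_vector(taste: list[float], style: list[float],
                        dwell_secs: float, alpha_max: float = 0.2) -> list[float]:
    """EMA nudge of the taste vector toward a shader's style vector.

    dwell_secs scales the learning rate linearly up to 30s, capped at alpha_max,
    so a long watch moves taste more than a quick scroll-past.
    """
    alpha = min(dwell_secs / 30.0, 1.0) * alpha_max
    return [(1 - alpha) * t + alpha * s for t, s in zip(taste, style)]
```

Because each update blends rather than overwrites, a single outlier view barely moves the vector while consistent behavior accumulates.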

"""AI Generation router — start generation, poll status, check credits."""
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.models import User
from app.schemas import GenerateRequest, GenerateStatusResponse
from app.middleware.auth import get_current_user
router = APIRouter()
@router.post("", response_model=GenerateStatusResponse)
async def start_generation(
body: GenerateRequest,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""Start an AI shader generation job. (Track I — stub)"""
# TODO: Implement in Track I
# - Credits check / BYOK validation
# - Enqueue ai_generate job
# - Return job_id for polling
raise HTTPException(
status_code=status.HTTP_501_NOT_IMPLEMENTED,
detail="AI generation coming in M5"
)
@router.get("/status/{job_id}", response_model=GenerateStatusResponse)
async def get_generation_status(
job_id: str,
user: User = Depends(get_current_user),
):
"""Poll AI generation job status. (Track I — stub)"""
raise HTTPException(
status_code=status.HTTP_501_NOT_IMPLEMENTED,
detail="AI generation coming in M5"
)
@router.get("/credits")
async def get_credits(user: User = Depends(get_current_user)):
"""Check remaining AI generation credits."""
return {
"credits_remaining": user.ai_credits_remaining,
"subscription_tier": user.subscription_tier,
}

"""Health check endpoint — outside /api/v1 prefix for Docker healthchecks."""
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import text
from app.database import get_db
from app.redis import get_redis
router = APIRouter()
@router.get("/health")
async def health_check(db: AsyncSession = Depends(get_db)):
"""Basic health check — verifies API, DB, and Redis are reachable."""
checks = {"api": "ok", "database": "error", "redis": "error"}
try:
await db.execute(text("SELECT 1"))
checks["database"] = "ok"
except Exception:
pass
try:
redis = await get_redis()
await redis.ping()
checks["redis"] = "ok"
except Exception:
pass
healthy = all(v == "ok" for v in checks.values())
return {"status": "healthy" if healthy else "degraded", "checks": checks}

"""MCP API Key management router."""
import secrets
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from passlib.context import CryptContext
from app.database import get_db
from app.models import User, ApiKey
from app.schemas import ApiKeyCreate, ApiKeyPublic, ApiKeyCreated
from app.middleware.auth import get_current_user, require_tier
router = APIRouter()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
def generate_api_key() -> tuple[str, str, str]:
    """Generate an API key. Returns (full_key, prefix, hash)."""
    import base64
    raw = secrets.token_bytes(32)
    # lowercase base32 keeps the key alphanumeric and unambiguous
    encoded = base64.b32encode(raw).decode().rstrip("=").lower()
    full_key = f"ff_key_{encoded}"
    prefix = full_key[:16]  # "ff_key_" + first 9 encoded chars
    key_hash = pwd_context.hash(full_key)
    return full_key, prefix, key_hash
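Only the hash is stored; the `key_prefix` exists so a later lookup can narrow candidates before the expensive hash verification. A dependency-free sketch of that flow, with SHA-256 standing in for the bcrypt hashing the service actually uses (function names are illustrative):

```python
import base64, hashlib, hmac, secrets

PREFIX_LEN = 16  # "ff_key_" plus the first 9 encoded characters

def generate_demo_key() -> tuple[str, str, str]:
    """Returns (full_key, prefix, hash); only prefix and hash would be stored."""
    raw = secrets.token_bytes(32)
    encoded = base64.b32encode(raw).decode().rstrip("=").lower()  # 52 chars
    full_key = f"ff_key_{encoded}"
    return full_key, full_key[:PREFIX_LEN], hashlib.sha256(full_key.encode()).hexdigest()

def verify_demo_key(candidate: str, stored_hash: str) -> bool:
    """Constant-time comparison against the stored digest."""
    digest = hashlib.sha256(candidate.encode()).hexdigest()
    return hmac.compare_digest(digest, stored_hash)
```

The full key is shown to the user exactly once at creation (the `ApiKeyCreated` response below); after that, only prefix matching plus hash verification can authenticate it.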
@router.get("", response_model=list[ApiKeyPublic])
async def list_api_keys(
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
result = await db.execute(
        select(ApiKey).where(ApiKey.user_id == user.id, ApiKey.revoked_at.is_(None))
)
return result.scalars().all()
@router.post("", response_model=ApiKeyCreated, status_code=status.HTTP_201_CREATED)
async def create_api_key(
body: ApiKeyCreate,
db: AsyncSession = Depends(get_db),
user: User = Depends(require_tier("pro", "studio")),
):
full_key, prefix, key_hash = generate_api_key()
api_key = ApiKey(
user_id=user.id,
key_hash=key_hash,
key_prefix=prefix,
name=body.name,
)
db.add(api_key)
await db.flush()
return ApiKeyCreated(
id=api_key.id,
key_prefix=prefix,
name=body.name,
trust_tier=api_key.trust_tier,
rate_limit_per_hour=api_key.rate_limit_per_hour,
last_used_at=None,
created_at=api_key.created_at,
full_key=full_key,
)
@router.delete("/{key_id}", status_code=status.HTTP_204_NO_CONTENT)
async def revoke_api_key(
key_id: UUID,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
result = await db.execute(
select(ApiKey).where(ApiKey.id == key_id, ApiKey.user_id == user.id)
)
api_key = result.scalar_one_or_none()
if not api_key:
raise HTTPException(status_code=404, detail="API key not found")
from datetime import datetime, timezone
api_key.revoked_at = datetime.now(timezone.utc)

"""Payments router — Stripe subscriptions, credits, webhooks. (Track H — stubs)"""
from fastapi import APIRouter, Depends, HTTPException, Request, status
from app.models import User
from app.middleware.auth import get_current_user
router = APIRouter()
@router.post("/checkout")
async def create_checkout(user: User = Depends(get_current_user)):
"""Create Stripe checkout session for subscription. (Track H)"""
raise HTTPException(status_code=501, detail="Payments coming in M4")
@router.post("/webhook")
async def stripe_webhook(request: Request):
"""Handle Stripe webhook events. (Track H)"""
raise HTTPException(status_code=501, detail="Payments coming in M4")
@router.get("/portal")
async def customer_portal(user: User = Depends(get_current_user)):
"""Get Stripe customer portal URL. (Track H)"""
raise HTTPException(status_code=501, detail="Payments coming in M4")
@router.post("/credits")
async def purchase_credits(user: User = Depends(get_current_user)):
"""Purchase AI credit pack. (Track H)"""
raise HTTPException(status_code=501, detail="Payments coming in M4")
@router.post("/connect/onboard")
async def connect_onboard(user: User = Depends(get_current_user)):
"""Start Stripe Connect creator onboarding. (Track H)"""
raise HTTPException(status_code=501, detail="Payments coming in M4")

@@ -0,0 +1,154 @@
"""Shaders router — CRUD, submit, fork, search."""
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func, or_
from app.database import get_db
from app.models import User, Shader
from app.schemas import ShaderCreate, ShaderUpdate, ShaderPublic
from app.middleware.auth import get_current_user, get_optional_user
router = APIRouter()
@router.get("", response_model=list[ShaderPublic])
async def list_shaders(
q: str | None = Query(None, description="Search query"),
tags: list[str] | None = Query(None, description="Filter by tags"),
shader_type: str | None = Query(None, description="Filter by type: 2d, 3d, audio-reactive"),
sort: str = Query("trending", description="Sort: trending, new, top"),
limit: int = Query(20, ge=1, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
):
query = select(Shader).where(Shader.is_public.is_(True), Shader.render_status == "ready")
if q:
query = query.where(Shader.title.ilike(f"%{q}%"))
if tags:
query = query.where(Shader.tags.overlap(tags))
if shader_type:
query = query.where(Shader.shader_type == shader_type)
if sort == "new":
query = query.order_by(Shader.created_at.desc())
elif sort == "top":
query = query.order_by(Shader.score.desc())
else: # trending
query = query.order_by(Shader.score.desc(), Shader.created_at.desc())
query = query.limit(limit).offset(offset)
result = await db.execute(query)
return result.scalars().all()
@router.get("/{shader_id}", response_model=ShaderPublic)
async def get_shader(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
result = await db.execute(select(Shader).where(Shader.id == shader_id))
shader = result.scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if not shader.is_public and (not user or user.id != shader.author_id):
raise HTTPException(status_code=404, detail="Shader not found")
# Increment view count
shader.view_count += 1
return shader
@router.post("", response_model=ShaderPublic, status_code=status.HTTP_201_CREATED)
async def create_shader(
body: ShaderCreate,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
# TODO: Turnstile verification for submit
# TODO: Rate limit check (free tier: 5/month)
# TODO: GLSL validation via glslang
# TODO: Enqueue render job
shader = Shader(
author_id=user.id,
title=body.title,
description=body.description,
glsl_code=body.glsl_code,
tags=body.tags,
shader_type=body.shader_type,
is_public=body.is_public,
style_metadata=body.style_metadata,
render_status="pending",
)
db.add(shader)
await db.flush()
return shader
@router.put("/{shader_id}", response_model=ShaderPublic)
async def update_shader(
shader_id: UUID,
body: ShaderUpdate,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
result = await db.execute(select(Shader).where(Shader.id == shader_id))
shader = result.scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if shader.author_id != user.id and user.role != "admin":
raise HTTPException(status_code=403, detail="Not the shader owner")
for field, value in body.model_dump(exclude_unset=True).items():
setattr(shader, field, value)
return shader
@router.delete("/{shader_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_shader(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
result = await db.execute(select(Shader).where(Shader.id == shader_id))
shader = result.scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if shader.author_id != user.id and user.role != "admin":
raise HTTPException(status_code=403, detail="Not the shader owner")
await db.delete(shader)
@router.post("/{shader_id}/fork", response_model=ShaderPublic, status_code=status.HTTP_201_CREATED)
async def fork_shader(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
result = await db.execute(select(Shader).where(Shader.id == shader_id))
original = result.scalar_one_or_none()
if not original:
raise HTTPException(status_code=404, detail="Shader not found")
if not original.is_public:
raise HTTPException(status_code=404, detail="Shader not found")
forked = Shader(
author_id=user.id,
title=f"Fork of {original.title}",
description=f"Forked from {original.title}",
glsl_code=original.glsl_code,
tags=original.tags,
shader_type=original.shader_type,
forked_from=original.id,
render_status="pending",
)
db.add(forked)
await db.flush()
return forked
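The TODOs in `create_shader` above defer validation to glslang. A cheap pre-check could still reject obviously broken submissions before a render job is enqueued; the heuristics below are illustrative assumptions, not a substitute for a real compiler pass:

```python
import re

def quick_glsl_check(glsl_code: str) -> list[str]:
    """Cheap sanity checks before the real glslang pass (heuristic only)."""
    problems = []
    if len(glsl_code) < 10:  # mirrors ShaderCreate's min_length=10
        problems.append("code too short")
    if not re.search(r"\bvoid\s+(main|mainImage)\s*\(", glsl_code):
        problems.append("no main/mainImage entry point")
    if glsl_code.count("{") != glsl_code.count("}"):
        problems.append("unbalanced braces")
    return problems
```

Returning a list of problems (empty when the code passes) lets the router surface all failures at once in a 422 response instead of stopping at the first.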

@@ -0,0 +1,68 @@
"""Users & Settings router."""
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.database import get_db
from app.models import User
from app.schemas import UserPublic, UserMe
from app.middleware.auth import get_current_user
router = APIRouter()
@router.get("/users/{username}", response_model=UserPublic)
async def get_user_profile(username: str, db: AsyncSession = Depends(get_db)):
result = await db.execute(select(User).where(User.username == username))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=404, detail="User not found")
return user
@router.get("/me", response_model=UserMe)
async def get_me(user: User = Depends(get_current_user)):
return user
@router.put("/me", response_model=UserMe)
async def update_me(
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""Update user settings. (Expanded in Track B)"""
# TODO: Accept settings updates (username, email, etc.)
return user
# ── Creator Economy Stubs (501) ─────────────────────────────
@router.get("/dashboard")
async def creator_dashboard(user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Creator dashboard coming in future release")
@router.get("/shaders/{shader_id}/unlock-status")
async def unlock_status(shader_id: str, user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Source unlock coming in future release")
@router.post("/shaders/{shader_id}/unlock")
async def unlock_source(shader_id: str, user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Source unlock coming in future release")
@router.post("/shaders/{shader_id}/commercial")
async def purchase_commercial(shader_id: str, user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Commercial licensing coming in future release")
@router.post("/me/creator/apply")
async def apply_verified(user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Verified creator program coming in future release")
@router.get("/me/creator/earnings")
async def creator_earnings(user: User = Depends(get_current_user)):
raise HTTPException(status_code=501, detail="Creator earnings coming in future release")

@@ -0,0 +1,68 @@
"""Votes & engagement router."""
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.database import get_db
from app.models import User, Shader, Vote, EngagementEvent
from app.schemas import VoteCreate
from app.middleware.auth import get_current_user, get_optional_user
router = APIRouter()
@router.post("/shaders/{shader_id}/vote", status_code=status.HTTP_200_OK)
async def vote_shader(
shader_id: UUID,
body: VoteCreate,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
# Verify shader exists
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
# Upsert vote
existing = (await db.execute(
select(Vote).where(Vote.user_id == user.id, Vote.shader_id == shader_id)
)).scalar_one_or_none()
if existing:
existing.value = body.value
else:
db.add(Vote(user_id=user.id, shader_id=shader_id, value=body.value))
# TODO: Recalculate hot score (Track F)
return {"status": "ok", "value": body.value}
@router.delete("/shaders/{shader_id}/vote", status_code=status.HTTP_204_NO_CONTENT)
async def remove_vote(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
existing = (await db.execute(
select(Vote).where(Vote.user_id == user.id, Vote.shader_id == shader_id)
)).scalar_one_or_none()
if existing:
await db.delete(existing)
# TODO: Recalculate hot score (Track F)
@router.post("/shaders/{shader_id}/replay", status_code=status.HTTP_204_NO_CONTENT)
async def report_replay(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
event = EngagementEvent(
user_id=user.id if user else None,
shader_id=shader_id,
event_type="replay",
)
db.add(event)
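The `Recalculate hot score (Track F)` TODOs leave the ranking formula open. One plausible shape, assumed here rather than taken from Track F, is a Reddit-style score that `sort=trending` could consume: log-scaled net votes plus a recency term so newer shaders can overtake older ones:

```python
import math
from datetime import datetime, timedelta, timezone

# Arbitrary site epoch (assumption); only differences from it matter.
EPOCH = datetime(2026, 1, 1, tzinfo=timezone.utc)

def hot_score(upvotes: int, downvotes: int, created_at: datetime) -> float:
    """Log-scaled net votes plus recency: +1 point of vote weight per ~12.5h of age."""
    net = upvotes - downvotes
    order = math.log10(max(abs(net), 1))
    sign = 1 if net > 0 else -1 if net < 0 else 0
    seconds = (created_at - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)
```

Because the recency term grows without bound, a shader needs roughly 10x the net votes to outrank one posted 12.5 hours later, which keeps the trending feed moving.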

@@ -0,0 +1,2 @@
"""Schemas package."""
from app.schemas.schemas import *

@@ -0,0 +1,204 @@
"""Fractafrag — Pydantic Request/Response Schemas."""
from __future__ import annotations
from datetime import datetime
from uuid import UUID
from typing import Optional
from pydantic import BaseModel, EmailStr, Field, ConfigDict
# ════════════════════════════════════════════════════════════
# AUTH
# ════════════════════════════════════════════════════════════
class UserRegister(BaseModel):
username: str = Field(..., min_length=3, max_length=30, pattern=r"^[a-zA-Z0-9_-]+$")
email: EmailStr
password: str = Field(..., min_length=8, max_length=128)
turnstile_token: str
class UserLogin(BaseModel):
email: EmailStr
password: str
turnstile_token: str
class TokenResponse(BaseModel):
access_token: str
token_type: str = "bearer"
class UserPublic(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: UUID
username: str
role: str
subscription_tier: str
is_verified_creator: bool
created_at: datetime
class UserMe(UserPublic):
email: str
ai_credits_remaining: int
trust_tier: str
last_active_at: Optional[datetime] = None
# ════════════════════════════════════════════════════════════
# SHADERS
# ════════════════════════════════════════════════════════════
class ShaderCreate(BaseModel):
title: str = Field(..., min_length=1, max_length=120)
description: Optional[str] = Field(None, max_length=1000)
glsl_code: str = Field(..., min_length=10)
tags: list[str] = Field(default_factory=list, max_length=10)
shader_type: str = Field(default="2d", pattern=r"^(2d|3d|audio-reactive)$")
is_public: bool = True
style_metadata: Optional[dict] = None
fulfills_desire_id: Optional[UUID] = None
class ShaderUpdate(BaseModel):
title: Optional[str] = Field(None, min_length=1, max_length=120)
description: Optional[str] = Field(None, max_length=1000)
glsl_code: Optional[str] = Field(None, min_length=10)
tags: Optional[list[str]] = None
is_public: Optional[bool] = None
class ShaderPublic(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: UUID
author_id: Optional[UUID]
title: str
description: Optional[str]
glsl_code: str
is_public: bool
is_ai_generated: bool
ai_provider: Optional[str]
thumbnail_url: Optional[str]
preview_url: Optional[str]
render_status: str
style_metadata: Optional[dict]
tags: list[str]
shader_type: str
forked_from: Optional[UUID]
view_count: int
score: float
created_at: datetime
updated_at: datetime
class ShaderFeedItem(BaseModel):
"""Lighter shader representation for feed responses."""
model_config = ConfigDict(from_attributes=True)
id: UUID
author_id: Optional[UUID]
title: str
thumbnail_url: Optional[str]
preview_url: Optional[str]
glsl_code: str
tags: list[str]
shader_type: str
score: float
view_count: int
is_ai_generated: bool
style_metadata: Optional[dict]
created_at: datetime
# ════════════════════════════════════════════════════════════
# VOTES & ENGAGEMENT
# ════════════════════════════════════════════════════════════
class VoteCreate(BaseModel):
value: int = Field(..., ge=-1, le=1)
class DwellReport(BaseModel):
shader_id: UUID
dwell_secs: float = Field(..., gt=0)
replayed: bool = False
session_id: Optional[str] = None
# ════════════════════════════════════════════════════════════
# DESIRES / BOUNTIES
# ════════════════════════════════════════════════════════════
class DesireCreate(BaseModel):
prompt_text: str = Field(..., min_length=5, max_length=500)
style_hints: Optional[dict] = None
class DesirePublic(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: UUID
author_id: Optional[UUID]
prompt_text: str
style_hints: Optional[dict]
tip_amount_cents: int
status: str
heat_score: float
fulfilled_by_shader: Optional[UUID]
fulfilled_at: Optional[datetime]
created_at: datetime
# ════════════════════════════════════════════════════════════
# AI GENERATION
# ════════════════════════════════════════════════════════════
class GenerateRequest(BaseModel):
prompt: str = Field(..., min_length=5, max_length=500)
provider: Optional[str] = None # anthropic, openai, ollama — auto-selected if None
style_metadata: Optional[dict] = None
class GenerateStatusResponse(BaseModel):
job_id: str
status: str # queued, generating, rendering, complete, failed
shader_id: Optional[UUID] = None
error: Optional[str] = None
# ════════════════════════════════════════════════════════════
# API KEYS
# ════════════════════════════════════════════════════════════
class ApiKeyCreate(BaseModel):
name: str = Field(..., min_length=1, max_length=100)
class ApiKeyPublic(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: UUID
key_prefix: str
name: Optional[str]
trust_tier: str
rate_limit_per_hour: int
last_used_at: Optional[datetime]
created_at: datetime
class ApiKeyCreated(ApiKeyPublic):
"""Returned only on creation — includes the full key (shown once)."""
full_key: str
# ════════════════════════════════════════════════════════════
# PAGINATION
# ════════════════════════════════════════════════════════════
class PaginatedResponse(BaseModel):
items: list
cursor: Optional[str] = None
has_more: bool = False
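`PaginatedResponse` exposes an opaque `cursor` without pinning down its encoding. A common keyset-pagination choice, assumed here for illustration, packs the last row's `(created_at, id)` into URL-safe base64 so the next page can resume with a `WHERE (created_at, id) < (...)` filter:

```python
import base64
import json
from datetime import datetime, timezone

def encode_cursor(created_at: datetime, item_id: str) -> str:
    """Pack the last item's sort key into an opaque, URL-safe cursor."""
    payload = json.dumps({"created_at": created_at.isoformat(), "id": item_id})
    return base64.urlsafe_b64encode(payload.encode()).decode()

def decode_cursor(cursor: str) -> tuple[datetime, str]:
    """Recover (created_at, id) to resume a keyset-paginated query."""
    payload = json.loads(base64.urlsafe_b64decode(cursor.encode()))
    return datetime.fromisoformat(payload["created_at"]), payload["id"]
```

Unlike `offset`, a keyset cursor stays stable when new rows are inserted ahead of the page, which matters for a feed sorted by `created_at`.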

@@ -0,0 +1 @@
"""Services package — business logic layer."""

@@ -0,0 +1,90 @@
"""Fractafrag — Celery worker configuration."""
from celery import Celery
import os
redis_url = os.environ.get("REDIS_URL", "redis://redis:6379/0")
celery_app = Celery(
"fractafrag",
broker=redis_url,
backend=redis_url,
)
celery_app.conf.update(
task_serializer="json",
accept_content=["json"],
result_serializer="json",
timezone="UTC",
enable_utc=True,
task_track_started=True,
task_time_limit=120, # hard kill after 2 min
task_soft_time_limit=90, # soft warning at 90s
worker_prefetch_multiplier=1,
worker_max_tasks_per_child=100,
)
# Auto-discover tasks from worker modules
celery_app.autodiscover_tasks(["app.worker"])
# ── Task Definitions ──────────────────────────────────────
@celery_app.task(name="render_shader", bind=True, max_retries=2)
def render_shader(self, shader_id: str):
"""Render a shader via the headless Chromium renderer. (Track C)"""
# TODO: Implement in Track C
# 1. Fetch shader GLSL from DB
# 2. POST to renderer service
# 3. Store thumbnail + preview URLs
# 4. Update shader render_status
pass
@celery_app.task(name="embed_shader", bind=True)
def embed_shader(self, shader_id: str):
"""Generate style embedding vector for a shader. (Track C/F)"""
# TODO: Implement in Track C/F
pass
@celery_app.task(name="process_desire", bind=True)
def process_desire(self, desire_id: str):
"""Process a new desire: embed, cluster, optionally auto-fulfill. (Track G)"""
# TODO: Implement in Track G
pass
@celery_app.task(name="ai_generate", bind=True, max_retries=3)
def ai_generate(self, job_id: str, prompt: str, provider: str, user_id: str):
"""AI shader generation: prompt → LLM → GLSL → validate → render. (Track I)"""
# TODO: Implement in Track I
pass
@celery_app.task(name="rebuild_feed_cache")
def rebuild_feed_cache():
"""Rebuild the anonymous feed cache (trending + new). Runs every 15 min. (Track F)"""
# TODO: Implement in Track F
pass
@celery_app.task(name="expire_bounties")
def expire_bounties():
"""Mark old unfulfilled bounties as expired. Runs daily. (Track G)"""
# TODO: Implement in Track G
pass
# ── Periodic Tasks (Celery Beat) ─────────────────────────
celery_app.conf.beat_schedule = {
"rebuild-feed-cache": {
"task": "rebuild_feed_cache",
"schedule": 900.0, # every 15 minutes
},
"expire-bounties": {
"task": "expire_bounties",
"schedule": 86400.0, # daily
},
}
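The numbered TODO in `render_shader` implies a request/response contract with the renderer service. A minimal sketch of those shapes, with plain dicts standing in for the ORM row and HTTP client; the field names beyond `glsl` mirror the renderer stub's `/render` handler, and the `failed` status value is an assumption (`ready` matches the filter used by the shaders router):

```python
def build_render_request(glsl_code: str, duration: int = 5,
                         width: int = 640, height: int = 360, fps: int = 30) -> dict:
    """Payload for POST /render on the renderer service (step 2 of the task)."""
    return {"glsl": glsl_code, "duration": duration,
            "width": width, "height": height, "fps": fps}

def apply_render_result(shader: dict, response: dict) -> dict:
    """Steps 3-4: store media URLs and flip render_status on the shader row."""
    if response.get("thumbnail_url") and response.get("preview_url"):
        shader.update(thumbnail_url=response["thumbnail_url"],
                      preview_url=response["preview_url"],
                      render_status="ready")
    else:
        shader["render_status"] = "failed"
    return shader
```

Keeping the payload builder and the result mapping as pure functions makes the eventual Track C task body a thin wrapper: fetch row, POST payload via httpx, apply result, commit.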

@@ -0,0 +1,47 @@
"""Alembic migrations environment."""
from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
import os
config = context.config
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# Override URL from env
db_url = os.environ.get("DATABASE_URL_SYNC")
if db_url:
config.set_main_option("sqlalchemy.url", db_url)
# Import models so Alembic can detect them
from app.database import Base
from app.models import * # noqa: F401, F403
target_metadata = Base.metadata
def run_migrations_offline() -> None:
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()

@@ -0,0 +1,24 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}

@@ -0,0 +1,33 @@
[project]
name = "fractafrag-api"
version = "0.1.0"
description = "Fractafrag API — GLSL shader platform backend"
requires-python = ">=3.12"
dependencies = [
"fastapi>=0.115.0",
"uvicorn[standard]>=0.32.0",
"sqlalchemy[asyncio]>=2.0.36",
"asyncpg>=0.30.0",
"psycopg2-binary>=2.9.10",
"alembic>=1.14.0",
"pydantic>=2.10.0",
"pydantic-settings>=2.7.0",
"pgvector>=0.3.6",
"redis>=5.2.0",
"celery[redis]>=5.4.0",
"passlib[bcrypt]>=1.7.4",
"python-jose[cryptography]>=3.3.0",
"httpx>=0.28.0",
"python-multipart>=0.0.12",
"stripe>=11.0.0",
"numpy>=2.0.0",
]
[project.optional-dependencies]
dev = [
"pytest>=8.0",
"pytest-asyncio>=0.24.0",
"httpx>=0.28.0",
"ruff>=0.8.0",
]

@@ -0,0 +1,17 @@
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
# Build for production (overridden in dev)
RUN npm run build
# Serve with a simple static server
RUN npm install -g serve
CMD ["serve", "-s", "dist", "-l", "5173"]
EXPOSE 5173

@@ -0,0 +1,31 @@
{
"name": "fractafrag-frontend",
"private": true,
"version": "0.1.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "tsc && vite build",
"preview": "vite preview"
},
"dependencies": {
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react-router-dom": "^6.28.0",
"@tanstack/react-query": "^5.62.0",
"zustand": "^5.0.0",
"three": "^0.170.0",
"axios": "^1.7.9"
},
"devDependencies": {
"@types/react": "^18.3.12",
"@types/react-dom": "^18.3.1",
"@types/three": "^0.170.0",
"@vitejs/plugin-react": "^4.3.4",
"autoprefixer": "^10.4.20",
"postcss": "^8.4.49",
"tailwindcss": "^3.4.15",
"typescript": "^5.6.3",
"vite": "^6.0.0"
}
}

services/mcp/Dockerfile
@@ -0,0 +1,11 @@
FROM python:3.12-slim
WORKDIR /app
RUN pip install --no-cache-dir mcp httpx redis
COPY . .
CMD ["python", "server.py"]
EXPOSE 3200

services/mcp/server.py
@@ -0,0 +1,33 @@
"""Fractafrag MCP Server — stub entrypoint.
Full implementation in Track E.
"""
import json
from http.server import HTTPServer, BaseHTTPRequestHandler
class MCPHandler(BaseHTTPRequestHandler):
def do_GET(self):
if self.path == "/health":
self.send_response(200)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"status": "ok", "service": "mcp"}).encode())
else:
self.send_response(501)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"error": "MCP server coming in M2"}).encode())
def do_POST(self):
self.send_response(501)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"error": "MCP server coming in M2"}).encode())
if __name__ == "__main__":
server = HTTPServer(("0.0.0.0", 3200), MCPHandler)
print("MCP server stub listening on :3200")
server.serve_forever()

@@ -0,0 +1,56 @@
server {
listen 80;
server_name _;
# Frontend SPA
location / {
proxy_pass http://frontend:5173;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
# API
location /api/ {
proxy_pass http://api:8000/api/;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_read_timeout 120s;
client_max_body_size 10M;
}
# MCP Server (SSE support)
location /mcp/ {
proxy_pass http://mcp:3200/;
proxy_http_version 1.1;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# SSE support
proxy_set_header Connection "";
proxy_buffering off;
proxy_cache off;
proxy_read_timeout 600s;
}
# Rendered media (thumbnails, preview videos)
location /renders/ {
alias /renders/;
expires 30d;
add_header Cache-Control "public, immutable";
}
# Health check
location /health {
access_log off;
return 200 "ok";
}
}

@@ -0,0 +1,24 @@
FROM node:20-slim
WORKDIR /app
# Install Chromium dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
chromium \
fonts-liberation \
libatk1.0-0 libatk-bridge2.0-0 libcups2 libxcomposite1 \
libxdamage1 libxrandr2 libgbm1 libpango-1.0-0 libcairo2 \
libasound2 libnspr4 libnss3 \
&& rm -rf /var/lib/apt/lists/*
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
COPY package*.json ./
RUN npm ci
COPY . .
CMD ["node", "server.js"]
EXPOSE 3100

@@ -0,0 +1,10 @@
{
"name": "fractafrag-renderer",
"private": true,
"version": "0.1.0",
"type": "module",
"dependencies": {
"express": "^4.21.1",
"puppeteer-core": "^23.6.0"
}
}

@@ -0,0 +1,58 @@
/**
* Fractafrag Renderer — headless Chromium shader render service.
*
* Accepts GLSL code via POST /render, renders in an isolated browser context,
* returns thumbnail + preview video.
*
* Full implementation in Track C.
*/
import express from 'express';
import { writeFileSync, mkdirSync, existsSync } from 'fs';
import path from 'path';
const app = express();
app.use(express.json({ limit: '1mb' }));
const PORT = 3100;
const OUTPUT_DIR = process.env.OUTPUT_DIR || '/renders';
const MAX_DURATION = parseInt(process.env.MAX_RENDER_DURATION || '8', 10);
// Ensure output directory exists
if (!existsSync(OUTPUT_DIR)) {
mkdirSync(OUTPUT_DIR, { recursive: true });
}
// Health check
app.get('/health', (req, res) => {
res.json({ status: 'ok', service: 'renderer' });
});
// Render endpoint (stub — Track C)
app.post('/render', async (req, res) => {
const { glsl, duration = 5, width = 640, height = 360, fps = 30 } = req.body;
if (!glsl) {
return res.status(400).json({ error: 'Missing glsl field' });
}
// TODO: Track C implementation
// 1. Launch Puppeteer page
// 2. Inject GLSL into shader template HTML
// 3. Capture frames for min(duration, MAX_DURATION) seconds
// 4. Encode to WebM/MP4 + extract thumbnail
// 5. Write to OUTPUT_DIR
// 6. Return URLs
res.status(501).json({
error: 'Renderer implementation coming in Track C',
thumbnail_url: null,
preview_url: null,
});
});
app.listen(PORT, '0.0.0.0', () => {
console.log(`Renderer service listening on :${PORT}`);
console.log(`Output dir: ${OUTPUT_DIR}`);
console.log(`Max render duration: ${MAX_DURATION}s`);
});