Compare commits

..

10 commits

Author SHA1 Message Date
John Lightner
e462c7c452 chore: gitignore Claude Code, GSD, and planning directories
Remove .claude/, .bg-shell/, .gsd/, .planning/, and CLAUDE.md
from git tracking. These are local tooling configs that should
not be in the repo.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-25 11:36:58 -05:00
John Lightner
5936ab167e feat(M001): Desire Economy
Completed slices:
- S01: Desire Embedding & Clustering
- S02: Fulfillment Flow & Frontend

Branch: milestone/M001
2026-03-25 02:22:50 -05:00
John Lightner
a5f0c0e093 Bootstrap GSD structure for M001-M003
GSD artifacts:
- .gsd/PROJECT.md — full project state, architecture, milestone sequence
- .gsd/REQUIREMENTS.md — 16 requirements (10 active, 3 validated, 3 deferred, 2 out-of-scope)
- .gsd/DECISIONS.md — 18 decisions migrated from project root + new M1/M2 decisions
- .gsd/STATE.md — active milestone M001, phase: planning
- .gsd/DISCUSSION-MANIFEST.json — all 3 gates completed

Milestone contexts:
- M001-CONTEXT.md — Desire Economy: embedding, clustering, heat, fulfillment
- M001-ROADMAP.md — 3 slices: embedding/clustering, fulfillment/frontend, integration
- M002-CONTEXT.md — Monetization: Stripe subscriptions, credits, Connect (depends M001)
- M003-CONTEXT.md — AI Generation: LLM pipeline, BYOK, retry (depends M001, M002)

Removed old DECISIONS.md from project root (migrated to .gsd/).
2026-03-25 00:45:33 -05:00
John Lightner
dc27435ca1 M2 complete: Recommendation engine + similar shaders + tag affinities
Feed ranking (anonymous users):
- score * 0.6 + recency * 0.3 + random * 0.1
- Recency uses 72-hour half-life decay
- 10% randomness prevents filter bubbles

Feed ranking (authenticated users):
- score * 0.5 + recency * 0.2 + tag_affinity * 0.2 + random * 0.1
- Tag affinity built from engagement history:
  - Upvoted shader tags: +1.0 per tag
  - Downvoted: -0.5 per tag
  - Dwell >10s: +0.3, >30s: +0.6
- Over-fetches 3x candidates, re-ranks with affinity, returns top N
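The two ranking formulas and the affinity weights above can be sketched as follows (an illustrative reconstruction from the commit message — function and field names are assumptions, not the actual service code):

```python
import math
import random

HALF_LIFE_HOURS = 72.0  # recency weight halves every 72 hours

def recency_weight(age_hours: float) -> float:
    # Exponential half-life decay: 1.0 at age 0, 0.5 at 72h, 0.25 at 144h
    return 0.5 ** (age_hours / HALF_LIFE_HOURS)

def rank_anonymous(score: float, age_hours: float) -> float:
    # Anonymous users have no taste signal: score + recency + 10% jitter
    return score * 0.6 + recency_weight(age_hours) * 0.3 + random.random() * 0.1

def rank_authenticated(score: float, age_hours: float, tag_affinity: float) -> float:
    return (score * 0.5 + recency_weight(age_hours) * 0.2
            + tag_affinity * 0.2 + random.random() * 0.1)

def build_affinity(events) -> dict[str, float]:
    """events: iterable of (tags, kind), kind in 'up'/'down'/'dwell10'/'dwell30'.
    Weights are the ones listed in the commit message."""
    weights = {"up": 1.0, "down": -0.5, "dwell10": 0.3, "dwell30": 0.6}
    affinity: dict[str, float] = {}
    for tags, kind in events:
        for tag in tags:
            affinity[tag] = affinity.get(tag, 0.0) + weights[kind]
    return affinity
```

The 3x over-fetch then re-ranks this larger candidate pool with `rank_authenticated` and keeps the top N.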

Similar shaders endpoint:
- GET /api/v1/feed/similar/{shader_id}
- Finds shaders with overlapping tags
- Ranks by tag overlap count, breaks ties by score
- MCP tool: get_similar_shaders
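The overlap ranking with score tie-breaking can be sketched in Python (illustrative only — the production version runs as a SQL query):

```python
def rank_similar(candidates: list[dict], source_tags: list[str]) -> list[dict]:
    """Rank candidate shaders by tag-overlap count with the source shader,
    breaking ties by cached hot score (both descending)."""
    src = set(source_tags)
    return sorted(
        candidates,
        key=lambda s: (len(src & set(s["tags"])), s["score"]),
        reverse=True,
    )
```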

Fix: PostgreSQL text[] && varchar[] type mismatch
- Used type_coerce() instead of cast() for ARRAY overlap operator
- Affects both shaders search-by-tags and similar-by-tags queries
2026-03-24 23:25:45 -05:00
John Lightner
cf591424a1 M2: MCP server live + hot score ranking
MCP Server (8 tools):
- browse_shaders: search by title, tags, type, sort (trending/new/top)
- get_shader: full details + GLSL source by ID
- get_shader_versions: version history with change notes
- get_shader_version_code: GLSL code from any specific version
- submit_shader: create new shader (published or draft)
- update_shader: push revisions with change notes, auto-versions
- get_trending: top-scored shaders
- get_desire_queue: open community requests

MCP resource: fractafrag://platform-info with shader format guide

Auth: Internal service token (Bearer internal:mcp-service) allows MCP
server to write to the API as the system user. No user API keys needed
for the MCP→API internal path.

Transport: Streamable HTTP on port 3200 via FastMCP SDK.
Stateless mode with JSON responses.

Hot Score Ranking:
- Wilson score lower bound with 48-hour time decay
- Recalculated on every vote (up/down/remove)
- Feed sorts by score for trending view
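A sketch of the score (the Wilson lower bound is the standard formula; the exact decay curve is an assumption — the commit only says "48-hour time decay"):

```python
import math

def wilson_lower_bound(ups: int, downs: int, z: float = 1.96) -> float:
    """Lower bound of the 95% Wilson score interval for the upvote ratio.
    Penalizes small sample sizes: 1 upvote scores far below 100 upvotes."""
    n = ups + downs
    if n == 0:
        return 0.0
    p = ups / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (centre - margin) / denom

def hot_score(ups: int, downs: int, age_hours: float) -> float:
    # Assumed decay shape: exponential with a 48-hour half-life
    return wilson_lower_bound(ups, downs) * 0.5 ** (age_hours / 48.0)
```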

Connection config for Claude Desktop:
{
  "mcpServers": {
    "fractafrag": {
      "url": "http://localhost:3200/mcp"
    }
  }
}
2026-03-24 22:56:03 -05:00
John Lightner
c9967a17a0 Fix ShaderCanvas scroll-back rendering via canvas element replacement
Root cause: Chromium limits ~16 simultaneous WebGL contexts. When scrolling
through a feed of 20+ shader cards, older contexts get silently evicted.
Once a context is lost on a canvas element, getContext('webgl2') returns null
on that same element forever — even after loseContext()/restore cycles.

Solution: The ShaderCanvas component now renders a container div and creates
canvas elements imperatively. When re-entering viewport:
1. Check if existing GL context is still alive (isContextLost)
2. If alive: just restart the animation loop
3. If dead: remove the old canvas, create a fresh DOM element, get a new
   context, recompile, and start rendering

This means scrolling down creates new contexts and scrolling back up
replaces dead canvases with fresh ones. At any given time only ~9 visible
canvases hold active contexts — well within Chrome's limit.

Also: 200px rootMargin on IntersectionObserver pre-compiles shaders
before cards enter viewport for smoother scroll experience.
2026-03-24 22:28:36 -05:00
John Lightner
164dda4760 Fix shader rendering: visibility-aware WebGL contexts, fix 2 GLSL shaders
ShaderCanvas rewrite:
- IntersectionObserver-driven rendering: WebGL context only created when canvas
  enters viewport, released when it leaves. Prevents context starvation when
  20+ shaders are in the feed simultaneously.
- Graceful fallback UI when WebGL context unavailable (hexagon + 'scroll to load')
- Context loss/restore event handlers
- powerPreference: 'low-power' for feed thumbnails
- Pause animation loop when off-screen (saves GPU even with context alive)
- Separate resize observer (no devicePixelRatio scaling for feed — saves memory)

Fixed shaders:
- Pixel Art Dither: replaced mat4 dynamic indexing with unrolled Bayer lookup
  (some WebGL drivers reject mat4[int_var][int_var])
- Wave Interference 2D: replaced C-style array element assignment with
  individual vec2 variables (GLSL ES 300 compatibility)
2026-03-24 22:12:58 -05:00
John Lightner
1047a1f5fe Versioning, drafts, resizable editor, My Shaders, 200 seed shaders
Architecture — Shader versioning & draft system:
- New shader_versions table: immutable snapshots of every edit
- Shaders now have status: draft, published, archived
- current_version counter tracks version number
- Every create/update creates a ShaderVersion record
- Restore-from-version endpoint creates new version (never destructive)
- Drafts are private, only visible to author
- Forks start as drafts
- Free tier rate limit applies only to published shaders (drafts unlimited)
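The non-destructive restore semantics can be sketched over plain dicts (the real implementation writes `ShaderVersion` rows; names here are illustrative):

```python
def restore_version(shader: dict, versions: dict[int, dict], n: int) -> dict:
    """Restore version n by appending a NEW version record that copies its
    code — history is never rewritten or deleted."""
    old = versions[n]
    shader["current_version"] += 1
    new = {
        "version_number": shader["current_version"],
        "glsl_code": old["glsl_code"],
        "change_note": f"Restored from v{n}",
    }
    versions[shader["current_version"]] = new
    shader["glsl_code"] = old["glsl_code"]
    return new
```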

Architecture — Platform identity:
- System account 'fractafrag' (UUID 00000000-...-000001) created in init.sql
- is_system flag on users and shaders
- system_label field: 'fractafrag-curated', future: 'fractafrag-generated'
- Feed/explore can filter by is_system
- System shaders display distinctly from user/AI content

API changes:
- GET /shaders/mine — user workspace (drafts, published, archived)
- GET /shaders/{id}/versions — version history
- GET /shaders/{id}/versions/{n} — specific version
- POST /shaders/{id}/versions/{n}/restore — restore old version
- POST /shaders accepts status: 'draft' | 'published'
- PUT /shaders/{id} accepts change_note for version descriptions
- PUT status transitions: draft→published, published→archived, archived→published
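The allowed transitions form a small whitelist, which can be sketched as:

```python
# The three transitions the PUT endpoint accepts; anything else is rejected
ALLOWED_TRANSITIONS = {
    ("draft", "published"),
    ("published", "archived"),
    ("archived", "published"),
}

def can_transition(current: str, target: str) -> bool:
    """True only for whitelisted status transitions (no published -> draft)."""
    return (current, target) in ALLOWED_TRANSITIONS
```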

Frontend — Editor improvements:
- Resizable split pane with drag handle (20-80% range, smooth col-resize cursor)
- Save Draft button (creates/updates as draft, no publish)
- Publish button (validates, publishes, redirects to shader page)
- Version badge shows current version number when editing existing
- Owner detection: editing own shader vs forking someone else's
- Saved status indicator ('Draft saved', 'Published')

Frontend — My Shaders workspace:
- /my-shaders route with status tabs (All, Draft, Published, Archived)
- Count badges per tab
- Status badges on shader cards (draft=yellow, published=green, archived=grey)
- Version badges (v1, v2, etc.)
- Quick actions: Edit, Publish, Archive, Restore, Delete per status
- Drafts link to editor, published link to detail page

Seed data — 200 fractafrag-curated shaders:
- 171 2D + 29 3D shaders
- 500 unique tags across all shaders
- All 200 titles are unique
- Covers: fractals (Mandelbrot, Julia sets), noise (fbm, Voronoi, Perlin),
  raymarching (metaballs, terrain, torus knots, metal/glass),
  effects (glitch, VHS, plasma, aurora, lightning, fireworks),
  patterns (circuit, hex grid, stained glass, herringbone, moiré),
  physics (wave interference, pendulum, caustics, gravity lens),
  minimal (single shapes, gradients, dot grids),
  nature (ink, watercolor, smoke, sand garden, coral, nebula),
  color theory (RGB separation, CMY overlap, hue wheel),
  domain warping (acid trip, lava rift, storm eye),
  particles (fireflies, snow, ember, bubbles)
- Each shader has style_metadata (chaos_level, color_temperature, motion_type)
- Distributed creation times over 30 days for feed ranking variety
- Random initial scores for algorithm testing
- All authored by 'fractafrag' system account, is_system=true
- system_label='fractafrag-curated' for clear provenance

Schema:
- shader_versions table with (shader_id, version_number) unique constraint
- HNSW vector indexes (replacing ivfflat, which needs existing data to build)
- B-tree index on (shader_id, version_number DESC) for version lookup
- System account indexes
- Status-aware feed indexes
2026-03-24 22:00:10 -05:00
John Lightner
365c033e0e Fix Docker Compose startup issues
- Rename EngagementEvent.metadata → event_metadata (SQLAlchemy reserved name)
- Replace passlib with direct bcrypt usage (passlib incompatible with bcrypt 5.0)
- Fix renderer Dockerfile: npm ci → npm install (no lockfile)
- Fix frontend Dockerfile: single-stage, skip tsc for builds
- Remove deprecated 'version' key from docker-compose.yml
- Add docker-compose.dev.yml for data-stores-only local dev
- Add start_period to API healthcheck for startup grace
2026-03-24 21:06:01 -05:00
John Lightner
c4b8c0fe38 Tracks B+C+D: Auth system, renderer, full frontend shell
Track B — Auth & User System (complete):
- User registration with bcrypt + Turnstile verification
- JWT access/refresh token flow with httpOnly cookie rotation
- Redis refresh token blocklist for logout
- User profile + settings update endpoints (username, email)
- API key generation with bcrypt hashing (ff_key_ prefix)
- BYOK key management with AES-256-GCM encryption at rest
- Free tier rate limiting (5 shaders/month)
- Tier-gated endpoints (Pro/Studio for BYOK, API keys, bounty posting)
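The BYOK at-rest encryption can be sketched with the `cryptography` package's AESGCM (the nonce layout — 12 random bytes prepended to ciphertext+tag — and the key-handling details are assumptions; the commit only names the cipher):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_byok_key(master_key: bytes, provider_api_key: str) -> bytes:
    # AES-256-GCM with a fresh random 12-byte nonce per encryption
    nonce = os.urandom(12)
    ct = AESGCM(master_key).encrypt(nonce, provider_api_key.encode(), None)
    return nonce + ct  # store nonce alongside ciphertext+tag

def decrypt_byok_key(master_key: bytes, blob: bytes) -> str:
    return AESGCM(master_key).decrypt(blob[:12], blob[12:], None).decode()
```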

Track C — Shader Submission & Renderer (complete):
- GLSL validator: entry point check, banned extensions, infinite loop detection,
  brace balancing, loop bound warnings, code length limits
- Puppeteer/headless Chromium renderer with Shadertoy-compatible uniform injection
  (iTime, iResolution, iMouse), WebGL2 with SwiftShader fallback
- Shader compilation error detection via page title signaling
- Thumbnail capture at t=1s, preview frame at t=duration
- Renderer client service for API→renderer HTTP communication
- Shader submission pipeline: validate GLSL → create record → enqueue render job
- Desire fulfillment linking on shader submit
- Re-validation and re-render on shader code update
- Fork endpoint copies code, tags, metadata, enqueues new render
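The validator's static checks can be sketched as follows (the heuristics and the ban list are illustrative, not the actual rules):

```python
import re

BANNED_EXTENSIONS = ("GL_EXT_", "GL_ARB_")  # illustrative ban list

def validate_glsl(code: str) -> list[str]:
    """Cheap static checks before a shader ever reaches the renderer."""
    errors = []
    # Entry point: Shadertoy-style mainImage or plain main
    if "void mainImage" not in code and "void main" not in code:
        errors.append("missing entry point (mainImage/main)")
    if code.count("{") != code.count("}"):
        errors.append("unbalanced braces")
    for ext in BANNED_EXTENSIONS:
        if ext in code:
            errors.append(f"banned extension: {ext}")
    # Crude infinite-loop heuristic: while(true) with no break anywhere
    if re.search(r"while\s*\(\s*true\s*\)", code) and "break" not in code:
        errors.append("possible infinite loop")
    return errors
```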

Track D — Frontend Shell (complete):
- React 18 + Vite + TypeScript + Tailwind CSS + TanStack Query + Zustand
- Dark theme with custom fracta color palette and surface tones
- Responsive layout with sticky navbar, gradient branding
- Auth: Login + Register pages with JWT token management
- API client with automatic 401 refresh interceptor
- ShaderCanvas: Full WebGL2 renderer component with Shadertoy uniforms,
  mouse tracking, ResizeObserver, debounced recompilation, error callbacks
- GLSL Editor: Split pane (code textarea + live preview), 400ms debounced
  preview, metadata panel (description, tags, type), GLSL validation errors,
  shader publish flow, fork-from-existing support
- Feed: Infinite scroll with IntersectionObserver sentinel, dwell time tracking,
  skeleton loading states, empty state with CTA
- Explore: Search + tag filter + sort tabs (trending/new/top), grid layout
- ShaderDetail: Full-screen preview, vote controls, view source toggle, fork button
- Bounties: Desire queue list sorted by heat score, status badges, tip display
- BountyDetail: Single desire view with style hints, fulfill CTA
- Profile: User header with avatar initial, shader grid
- Settings: Account info, API key management (create/revoke/copy), subscription tiers
- Generate: AI generation UI stub with prompt input, style controls, example prompts

76 files, ~5,700 lines of application code.
2026-03-24 20:56:42 -05:00
67 changed files with 8763 additions and 319 deletions

View file

@ -1 +0,0 @@
[]

22
.gitignore vendored
View file

@ -30,10 +30,14 @@ renders/
*~
.DS_Store
# ─── GSD ──────────────────────────────────────────────────
.gsd/browser-state/
.gsd/browser-baselines/
# ─── Claude Code ──────────────────────────────────────────
.claude/
CLAUDE.md
# ─── GSD / GSD v2 ────────────────────────────────────────
.gsd/
.bg-shell/
.planning/
# ─── SSL certs ────────────────────────────────────────────
services/nginx/certs/*.pem
@ -41,3 +45,15 @@ services/nginx/certs/*.key
# ─── Alembic ──────────────────────────────────────────────
*.db
Thumbs.db
*.code-workspace
.env.*
!.env.example
.next/
target/
vendor/
*.log
coverage/
.cache/
tmp/

View file

@ -1,67 +0,0 @@
# Fractafrag — Project Decisions
## D001 — Backend Language & Framework
- **Choice:** Python + FastAPI
- **Rationale:** AI/ML integrations (pgvector, LLM clients, embeddings) are Python-native. FastAPI gives async performance with Pydantic auto-generated OpenAPI docs. Celery + Redis is mature for job queues.
- **Made by:** Collaborative
- **Revisable:** No
## D002 — Frontend Stack
- **Choice:** React 18 + Vite + Three.js + TanStack Query + Zustand + Tailwind CSS
- **Rationale:** Three.js for 3D shader rendering, raw WebGL for feed thumbnails. React UI, TanStack Query for server state, Zustand for client state.
- **Made by:** Collaborative
- **Revisable:** No
## D003 — Database & Cache
- **Choice:** PostgreSQL 16 + pgvector + Redis 7
- **Rationale:** pgvector for taste/style/desire embeddings (ANN). Redis for sessions, feed cache, rate limiting, Celery broker.
- **Made by:** Collaborative
- **Revisable:** No
## D004 — Container Orchestration
- **Choice:** Single Docker Compose stack, self-hosted, no cloud dependencies
- **Rationale:** Self-contained with nginx reverse proxy. .env-driven config.
- **Made by:** Collaborative
- **Revisable:** No
## D005 — Media Storage (Q1)
- **Choice:** Docker volume initially, S3-compatible config flag for later migration
- **Rationale:** Volume is simplest for single-server. Add Minio/S3 when storage grows large.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D006 — Style Embedding Model (Q2)
- **Choice:** Heuristic classifier + LLM structured output for M1, fine-tune later
- **Rationale:** No training data yet for fine-tuning. Heuristic is fast/cheap, LLM fills accuracy gaps.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D007 — Renderer Approach (Q3)
- **Choice:** Puppeteer + Headless Chromium
- **Rationale:** Accurate browser-equivalent rendering. Profile at M2 and optimize if needed.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D008 — Generation Status UX (Q4)
- **Choice:** Polling for M5, SSE upgrade later
- **Rationale:** Simpler to implement. Generation takes 5-30s, 2s polling is acceptable UX.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D009 — Comments Scope (Q6)
- **Choice:** Defer to post-M5 polish sprint
- **Rationale:** Schema is in place. Feature is not on critical path for core product loop.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D010 — Moderation Approach (Q7)
- **Choice:** Admin API endpoints only (/api/v1/admin/queue). No admin UI for M4.
- **Rationale:** Simple approve/reject actions via API. Admin panel deferred until scale demands it.
- **Made by:** Agent (per spec recommendation)
- **Revisable:** Yes
## D011 — Creator Economy
- **Choice:** Deferred until organic traction (500 DAU, 1000 shaders, 20 active creators)
- **Rationale:** Build the hooks (schema stubs, engagement tracking), not the features. Monetization on a platform nobody uses is worthless.
- **Made by:** Collaborative (per spec Section 11)
- **Revisable:** Yes

28
Makefile Normal file
View file

@ -0,0 +1,28 @@
# Fractafrag — Docker Compose monorepo
# Common development commands
.PHONY: up down build logs test api-shell worker-shell db-shell
up:
docker compose up -d
down:
docker compose down
build:
docker compose build
logs:
docker compose logs -f
test:
docker compose exec api python -m pytest tests/ -v
api-shell:
docker compose exec api bash
worker-shell:
docker compose exec worker bash
db-shell:
docker compose exec postgres psql -U fracta -d fractafrag

View file

@ -16,6 +16,7 @@ CREATE TABLE users (
password_hash TEXT NOT NULL,
role TEXT NOT NULL DEFAULT 'user', -- user, moderator, admin
trust_tier TEXT NOT NULL DEFAULT 'standard', -- standard, creator, trusted_api
is_system BOOLEAN NOT NULL DEFAULT FALSE, -- platform system account (fractafrag)
stripe_customer_id TEXT,
subscription_tier TEXT DEFAULT 'free', -- free, pro, studio
ai_credits_remaining INTEGER DEFAULT 0,
@ -38,9 +39,12 @@ CREATE TABLE shaders (
title TEXT NOT NULL,
description TEXT,
glsl_code TEXT NOT NULL,
status TEXT NOT NULL DEFAULT 'published', -- draft, published, archived
is_public BOOLEAN DEFAULT TRUE,
is_ai_generated BOOLEAN DEFAULT FALSE,
is_system BOOLEAN DEFAULT FALSE, -- generated by fractafrag platform
ai_provider TEXT, -- anthropic, openai, ollama, null
system_label TEXT, -- e.g. 'fractafrag-curated', 'fractafrag-generated'
thumbnail_url TEXT,
preview_url TEXT,
render_status TEXT DEFAULT 'pending', -- pending, rendering, ready, failed
@ -49,10 +53,11 @@ CREATE TABLE shaders (
tags TEXT[],
shader_type TEXT DEFAULT '2d', -- 2d, 3d, audio-reactive
forked_from UUID REFERENCES shaders(id) ON DELETE SET NULL,
current_version INTEGER NOT NULL DEFAULT 1, -- current version number
view_count INTEGER DEFAULT 0,
score FLOAT DEFAULT 0, -- cached hot score for feed ranking
-- Creator economy stubs (Section 11f)
access_tier TEXT DEFAULT 'open', -- open, source_locked, commercial
access_tier TEXT DEFAULT 'open',
source_unlock_price_cents INTEGER,
commercial_license_price_cents INTEGER,
verified_creator_shader BOOLEAN DEFAULT FALSE,
@ -61,6 +66,24 @@ CREATE TABLE shaders (
updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- ════════════════════════════════════════════════════════════
-- SHADER VERSIONS — immutable snapshots of each edit
-- ════════════════════════════════════════════════════════════
CREATE TABLE shader_versions (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
shader_id UUID NOT NULL REFERENCES shaders(id) ON DELETE CASCADE,
version_number INTEGER NOT NULL,
glsl_code TEXT NOT NULL,
title TEXT NOT NULL,
description TEXT,
tags TEXT[],
style_metadata JSONB,
change_note TEXT, -- optional: "fixed the color bleeding", "added mouse interaction"
thumbnail_url TEXT,
created_at TIMESTAMPTZ DEFAULT NOW(),
UNIQUE (shader_id, version_number)
);
-- ════════════════════════════════════════════════════════════
-- VOTES
-- ════════════════════════════════════════════════════════════
@ -78,10 +101,10 @@ CREATE TABLE votes (
-- ════════════════════════════════════════════════════════════
CREATE TABLE engagement_events (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE SET NULL, -- null for anonymous
session_id TEXT, -- anonymous session token
user_id UUID REFERENCES users(id) ON DELETE SET NULL,
session_id TEXT,
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
event_type TEXT NOT NULL, -- dwell, replay, share, generate_similar
event_type TEXT NOT NULL,
dwell_secs FLOAT,
metadata JSONB,
created_at TIMESTAMPTZ DEFAULT NOW()
@ -94,18 +117,17 @@ CREATE TABLE desires (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
author_id UUID REFERENCES users(id) ON DELETE SET NULL,
prompt_text TEXT NOT NULL,
prompt_embedding vector(512), -- embedded for similarity grouping
style_hints JSONB, -- { chaos_level, color_temp, etc }
prompt_embedding vector(512),
style_hints JSONB,
tip_amount_cents INTEGER DEFAULT 0,
status TEXT DEFAULT 'open', -- open, in_progress, fulfilled, expired
heat_score FLOAT DEFAULT 1, -- updated as similar desires accumulate
status TEXT DEFAULT 'open',
heat_score FLOAT DEFAULT 1,
fulfilled_by_shader UUID REFERENCES shaders(id) ON DELETE SET NULL,
fulfilled_at TIMESTAMPTZ,
expires_at TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Similar desire grouping (many-to-many)
CREATE TABLE desire_clusters (
cluster_id UUID,
desire_id UUID REFERENCES desires(id) ON DELETE CASCADE,
@ -122,7 +144,7 @@ CREATE TABLE bounty_tips (
tipper_id UUID REFERENCES users(id) ON DELETE SET NULL,
amount_cents INTEGER NOT NULL,
stripe_payment_intent_id TEXT,
status TEXT DEFAULT 'held', -- held, released, refunded
status TEXT DEFAULT 'held',
created_at TIMESTAMPTZ DEFAULT NOW()
);
@ -134,10 +156,10 @@ CREATE TABLE creator_payouts (
creator_id UUID REFERENCES users(id) ON DELETE SET NULL,
desire_id UUID REFERENCES desires(id) ON DELETE SET NULL,
gross_amount_cents INTEGER,
platform_fee_cents INTEGER, -- 10%
net_amount_cents INTEGER, -- 90%
platform_fee_cents INTEGER,
net_amount_cents INTEGER,
stripe_transfer_id TEXT,
status TEXT DEFAULT 'pending', -- pending, processing, completed, failed
status TEXT DEFAULT 'pending',
created_at TIMESTAMPTZ DEFAULT NOW()
);
@ -147,10 +169,10 @@ CREATE TABLE creator_payouts (
CREATE TABLE api_keys (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID REFERENCES users(id) ON DELETE CASCADE,
key_hash TEXT UNIQUE NOT NULL, -- bcrypt hash of the actual key
key_prefix TEXT NOT NULL, -- first 8 chars for display (ff_key_XXXXXXXX)
name TEXT, -- user-given label
trust_tier TEXT DEFAULT 'probation', -- probation, trusted, premium
key_hash TEXT UNIQUE NOT NULL,
key_prefix TEXT NOT NULL,
name TEXT,
trust_tier TEXT DEFAULT 'probation',
submissions_approved INTEGER DEFAULT 0,
rate_limit_per_hour INTEGER DEFAULT 10,
last_used_at TIMESTAMPTZ,
@ -168,7 +190,7 @@ CREATE TABLE generation_log (
provider TEXT NOT NULL,
prompt_text TEXT,
tokens_used INTEGER,
cost_cents INTEGER, -- platform cost for credit-based generations
cost_cents INTEGER,
success BOOLEAN,
created_at TIMESTAMPTZ DEFAULT NOW()
);
@ -192,7 +214,7 @@ CREATE TABLE source_unlocks (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
shader_id UUID REFERENCES shaders(id) ON DELETE CASCADE,
buyer_id UUID REFERENCES users(id) ON DELETE SET NULL,
license_type TEXT NOT NULL, -- personal, commercial
license_type TEXT NOT NULL,
amount_cents INTEGER NOT NULL,
platform_fee_cents INTEGER NOT NULL,
stripe_payment_intent_id TEXT,
@ -215,22 +237,18 @@ CREATE TABLE creator_engagement_snapshots (
-- ════════════════════════════════════════════════════════════
-- Feed performance
CREATE INDEX idx_shaders_score ON shaders(score DESC) WHERE is_public = TRUE;
CREATE INDEX idx_shaders_created ON shaders(created_at DESC) WHERE is_public = TRUE;
CREATE INDEX idx_shaders_score ON shaders(score DESC) WHERE is_public = TRUE AND status = 'published';
CREATE INDEX idx_shaders_created ON shaders(created_at DESC) WHERE is_public = TRUE AND status = 'published';
CREATE INDEX idx_shaders_tags ON shaders USING GIN(tags);
CREATE INDEX idx_shaders_render_status ON shaders(render_status) WHERE render_status != 'ready';
CREATE INDEX idx_shaders_status ON shaders(status);
CREATE INDEX idx_shaders_author_status ON shaders(author_id, status, updated_at DESC);
CREATE INDEX idx_shaders_system ON shaders(is_system) WHERE is_system = TRUE;
-- Recommendation (pgvector ANN — ivfflat, will rebuild after data exists)
-- NOTE: ivfflat indexes require data in the table to build properly.
-- Run these AFTER seeding initial data:
-- CREATE INDEX idx_shaders_style_vector ON shaders
-- USING ivfflat (style_vector vector_cosine_ops) WITH (lists = 100);
-- CREATE INDEX idx_users_taste_vector ON users
-- USING ivfflat (taste_vector vector_cosine_ops) WITH (lists = 50);
-- CREATE INDEX idx_desires_embedding ON desires
-- USING ivfflat (prompt_embedding vector_cosine_ops) WITH (lists = 50);
-- Versioning
CREATE INDEX idx_shader_versions_shader ON shader_versions(shader_id, version_number DESC);
-- For now, use HNSW (works on empty tables, better perf at small scale)
-- Recommendation (pgvector HNSW — works on empty tables)
CREATE INDEX idx_shaders_style_vector ON shaders
USING hnsw (style_vector vector_cosine_ops) WITH (m = 16, ef_construction = 64);
CREATE INDEX idx_users_taste_vector ON users
@ -263,3 +281,20 @@ CREATE INDEX idx_comments_parent ON comments(parent_id);
-- Text search
CREATE INDEX idx_shaders_title_trgm ON shaders USING GIN(title gin_trgm_ops);
CREATE INDEX idx_desires_prompt_trgm ON desires USING GIN(prompt_text gin_trgm_ops);
-- ════════════════════════════════════════════════════════════
-- SYSTEM ACCOUNT: The "fractafrag" platform user
-- All system-generated/curated shaders are authored by this account
-- ════════════════════════════════════════════════════════════
INSERT INTO users (id, username, email, password_hash, role, trust_tier, is_system, subscription_tier, is_verified_creator)
VALUES (
'00000000-0000-0000-0000-000000000001',
'fractafrag',
'system@fractafrag.local',
'$2b$12$000000000000000000000000000000000000000000000000000000', -- not a valid login
'admin',
'trusted_api',
TRUE,
'studio',
TRUE
);

36
docker-compose.dev.yml Normal file
View file

@ -0,0 +1,36 @@
# Minimal compose for local dev — just the data stores
# Usage: docker compose -f docker-compose.dev.yml up -d
version: "3.9"
services:
postgres:
image: pgvector/pgvector:pg16
environment:
- POSTGRES_USER=fracta
- POSTGRES_PASSWORD=devpass
- POSTGRES_DB=fractafrag
volumes:
- pgdata:/var/lib/postgresql/data
- ./db/init.sql:/docker-entrypoint-initdb.d/01-init.sql:ro
ports:
- "5432:5432"
healthcheck:
test: ["CMD-SHELL", "pg_isready -U fracta -d fractafrag"]
interval: 5s
timeout: 5s
retries: 5
redis:
image: redis:7-alpine
command: redis-server --appendonly yes
ports:
- "6379:6379"
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 5s
timeout: 5s
retries: 5
volumes:
pgdata:

View file

@ -1,38 +1,36 @@
# docker-compose.override.yml — Local dev overrides
# This file is automatically picked up by docker compose
version: "3.9"
# Automatically picked up by `docker compose up`
services:
api:
volumes:
- ./services/api:/app
command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
command: ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
ports:
- "8000:8000" # Direct access for debugging
- "8000:8000"
frontend:
volumes:
- ./services/frontend:/app
- /app/node_modules
command: npm run dev -- --host 0.0.0.0
command: ["npx", "vite", "--host", "0.0.0.0"]
ports:
- "5173:5173" # Vite dev server direct access
- "5173:5173"
mcp:
volumes:
- ./services/mcp:/app
ports:
- "3200:3200" # Direct MCP access
- "3200:3200"
renderer:
ports:
- "3100:3100" # Direct renderer access
- "3100:3100"
postgres:
ports:
- "5432:5432" # Direct DB access for dev tools
- "5432:5432"
redis:
ports:
- "6379:6379" # Direct Redis access for dev tools
- "6379:6379"

View file

@ -1,5 +1,3 @@
version: "3.9"
services:
# ─── Reverse Proxy ──────────────────────────────────────────
@ -7,10 +5,9 @@ services:
image: nginx:alpine
ports:
- "80:80"
- "443:443"
volumes:
- ./services/nginx/conf:/etc/nginx/conf.d:ro
- ./services/nginx/certs:/etc/ssl/certs:ro
- renders:/renders:ro
depends_on:
api:
condition: service_healthy
@ -25,7 +22,6 @@ services:
dockerfile: Dockerfile
environment:
- VITE_API_URL=${VITE_API_URL:-http://localhost/api}
- VITE_MCP_URL=${VITE_MCP_URL:-http://localhost/mcp}
restart: unless-stopped
# ─── API (FastAPI) ──────────────────────────────────────────
@ -34,18 +30,18 @@ services:
context: ./services/api
dockerfile: Dockerfile
environment:
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS:-devpass}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS:-devpass}@postgres:5432/${POSTGRES_DB:-fractafrag}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
- JWT_SECRET=${JWT_SECRET}
- JWT_SECRET=${JWT_SECRET:-dev-secret-change-in-production}
- JWT_ALGORITHM=${JWT_ALGORITHM:-HS256}
- JWT_ACCESS_TOKEN_EXPIRE_MINUTES=${JWT_ACCESS_TOKEN_EXPIRE_MINUTES:-15}
- JWT_ACCESS_TOKEN_EXPIRE_MINUTES=${JWT_ACCESS_TOKEN_EXPIRE_MINUTES:-60}
- JWT_REFRESH_TOKEN_EXPIRE_DAYS=${JWT_REFRESH_TOKEN_EXPIRE_DAYS:-30}
- TURNSTILE_SECRET=${TURNSTILE_SECRET}
- STRIPE_SECRET_KEY=${STRIPE_SECRET_KEY}
- STRIPE_WEBHOOK_SECRET=${STRIPE_WEBHOOK_SECRET}
- TURNSTILE_SECRET=${TURNSTILE_SECRET:-}
- STRIPE_SECRET_KEY=${STRIPE_SECRET_KEY:-}
- STRIPE_WEBHOOK_SECRET=${STRIPE_WEBHOOK_SECRET:-}
- RENDERER_URL=http://renderer:3100
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY}
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY:-dev-byok-key}
depends_on:
postgres:
condition: service_healthy
@ -56,6 +52,7 @@ services:
interval: 10s
timeout: 5s
retries: 5
start_period: 15s
restart: unless-stopped
# ─── MCP Server ─────────────────────────────────────────────
@ -65,7 +62,7 @@ services:
dockerfile: Dockerfile
environment:
- API_BASE_URL=http://api:8000
- MCP_API_KEY_SALT=${MCP_API_KEY_SALT}
- MCP_API_KEY_SALT=${MCP_API_KEY_SALT:-dev-salt}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
depends_on:
api:
@ -80,7 +77,7 @@ services:
shm_size: "512mb"
environment:
- MAX_RENDER_DURATION=${MAX_RENDER_DURATION:-8}
- OUTPUT_DIR=${RENDER_OUTPUT_DIR:-/renders}
- OUTPUT_DIR=/renders
volumes:
- renders:/renders
restart: unless-stopped
@ -90,22 +87,20 @@ services:
build:
context: ./services/api
dockerfile: Dockerfile
command: celery -A app.worker.celery_app worker --loglevel=info --concurrency=4
command: ["python", "-m", "celery", "-A", "app.worker", "worker", "--loglevel=info", "--concurrency=2"]
environment:
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL=postgresql+asyncpg://${POSTGRES_USER:-fracta}:${DB_PASS:-devpass}@postgres:5432/${POSTGRES_DB:-fractafrag}
- DATABASE_URL_SYNC=postgresql://${POSTGRES_USER:-fracta}:${DB_PASS:-devpass}@postgres:5432/${POSTGRES_DB:-fractafrag}
- REDIS_URL=${REDIS_URL:-redis://redis:6379/0}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
- OPENAI_API_KEY=${OPENAI_API_KEY}
- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:-}
- OPENAI_API_KEY=${OPENAI_API_KEY:-}
- RENDERER_URL=http://renderer:3100
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY}
- BYOK_MASTER_KEY=${BYOK_MASTER_KEY:-dev-byok-key}
depends_on:
postgres:
condition: service_healthy
redis:
condition: service_healthy
renderer:
condition: service_started
restart: unless-stopped
# ─── PostgreSQL + pgvector ──────────────────────────────────
@ -113,7 +108,7 @@ services:
image: pgvector/pgvector:pg16
environment:
- POSTGRES_USER=${POSTGRES_USER:-fracta}
- POSTGRES_PASSWORD=${DB_PASS}
- POSTGRES_PASSWORD=${DB_PASS:-devpass}
- POSTGRES_DB=${POSTGRES_DB:-fractafrag}
volumes:
- pgdata:/var/lib/postgresql/data

2045
scripts/seed_shaders.py Normal file

File diff suppressed because it is too large

6
services/api/=0.20.0 Normal file
View file

@ -0,0 +1,6 @@
Defaulting to user installation because normal site-packages is not writeable
Collecting aiosqlite
Using cached aiosqlite-0.22.1-py3-none-any.whl.metadata (4.3 kB)
Using cached aiosqlite-0.22.1-py3-none-any.whl (17 kB)
Installing collected packages: aiosqlite
Successfully installed aiosqlite-0.22.1

View file

@ -2,7 +2,7 @@ FROM python:3.12-slim
WORKDIR /app
# Install system deps
# Install system deps (curl for healthcheck)
RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
build-essential \
@ -10,12 +10,12 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
# Install Python deps
COPY pyproject.toml .
RUN pip install --no-cache-dir -e ".[dev]"
RUN pip install --no-cache-dir ".[dev]"
# Copy app code
COPY . .
EXPOSE 8000
# Default command (overridden in dev by docker-compose.override.yml)
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
EXPOSE 8000

View file

@@ -7,7 +7,7 @@ from typing import Optional
from fastapi import Depends, HTTPException, status, Request, Response
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from jose import jwt, JWTError
from passlib.context import CryptContext
import bcrypt
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
@@ -17,18 +17,17 @@ from app.models import User
from app.redis import get_redis
settings = get_settings()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
bearer_scheme = HTTPBearer(auto_error=False)
# ── Password Hashing ──────────────────────────────────────
def hash_password(password: str) -> str:
return pwd_context.hash(password)
return bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt(rounds=12)).decode("utf-8")
def verify_password(plain: str, hashed: str) -> bool:
return pwd_context.verify(plain, hashed)
return bcrypt.checkpw(plain.encode("utf-8"), hashed.encode("utf-8"))
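Reviewer note: the passlib context is swapped for direct `bcrypt` calls, but the contract is unchanged: hashing embeds a random salt in the output string, and verification is a constant-time comparison. A stdlib stand-in (PBKDF2 instead of bcrypt, purely to illustrate the hash/verify round-trip, not the actual implementation):

```python
import base64
import hashlib
import hmac
import os

_ITERATIONS = 100_000  # stand-in work factor, analogous to bcrypt's rounds=12

def hash_password(password: str) -> str:
    # A fresh random salt is stored alongside the digest, as bcrypt does
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, _ITERATIONS)
    return base64.b64encode(salt + digest).decode("ascii")

def verify_password(plain: str, hashed: str) -> bool:
    raw = base64.b64decode(hashed.encode("ascii"))
    salt, digest = raw[:16], raw[16:]
    candidate = hashlib.pbkdf2_hmac("sha256", plain.encode("utf-8"), salt, _ITERATIONS)
    # Constant-time comparison, like bcrypt.checkpw
    return hmac.compare_digest(candidate, digest)
```

One behavioral point worth remembering when migrating off passlib: the bcrypt algorithm only uses the first 72 bytes of the password.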
# ── JWT Token Management ──────────────────────────────────
@@ -80,11 +79,27 @@ async def get_current_user(
credentials: Optional[HTTPAuthorizationCredentials] = Depends(bearer_scheme),
db: AsyncSession = Depends(get_db),
) -> User:
"""Require authentication. Returns the current user."""
"""Require authentication. Returns the current user.
Supports:
- JWT Bearer tokens (normal user auth)
- Internal service token: 'Bearer internal:<system-jwt>' from MCP/worker
"""
if credentials is None:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Not authenticated")
payload = decode_token(credentials.credentials)
token = credentials.credentials
# Internal service auth — MCP server and workers use this to act as the system account
if token.startswith("internal:"):
from app.models.models import SYSTEM_USER_ID
result = await db.execute(select(User).where(User.id == SYSTEM_USER_ID))
user = result.scalar_one_or_none()
if user:
return user
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="System user not found")
payload = decode_token(token)
if payload.get("type") == "refresh":
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Cannot use refresh token for API access")
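Reviewer note: the internal-service branch dispatches on a plain string prefix before any JWT decoding, so a malformed internal token can never fall through to the user-JWT path. A minimal sketch of that routing, with `classify_token` as a hypothetical helper name:

```python
def classify_token(token: str) -> tuple[str, str]:
    # "internal:<system-jwt>" routes to the system account; anything else is a user JWT
    prefix = "internal:"
    if token.startswith(prefix):
        return "system", token[len(prefix):]
    return "user", token
```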

View file

@@ -1,12 +1,14 @@
"""Models package."""
from app.models.models import (
User, Shader, Vote, EngagementEvent, Desire, DesireCluster,
SYSTEM_USER_ID,
User, Shader, ShaderVersion, Vote, EngagementEvent, Desire, DesireCluster,
BountyTip, CreatorPayout, ApiKey, GenerationLog, Comment,
SourceUnlock, CreatorEngagementSnapshot,
)
__all__ = [
"User", "Shader", "Vote", "EngagementEvent", "Desire", "DesireCluster",
"SYSTEM_USER_ID",
"User", "Shader", "ShaderVersion", "Vote", "EngagementEvent", "Desire", "DesireCluster",
"BountyTip", "CreatorPayout", "ApiKey", "GenerationLog", "Comment",
"SourceUnlock", "CreatorEngagementSnapshot",
]

View file

@@ -11,6 +11,9 @@ from pgvector.sqlalchemy import Vector
from sqlalchemy.orm import relationship
from app.database import Base
# System account UUID — the "fractafrag" platform user
SYSTEM_USER_ID = uuid.UUID("00000000-0000-0000-0000-000000000001")
class User(Base):
__tablename__ = "users"
@@ -21,19 +24,17 @@ class User(Base):
password_hash = Column(String, nullable=False)
role = Column(String, nullable=False, default="user")
trust_tier = Column(String, nullable=False, default="standard")
is_system = Column(Boolean, nullable=False, default=False)
stripe_customer_id = Column(String, nullable=True)
subscription_tier = Column(String, default="free")
ai_credits_remaining = Column(Integer, default=0)
taste_vector = Column(Vector(512), nullable=True)
# Creator economy stubs
is_verified_creator = Column(Boolean, default=False)
verified_creator_at = Column(DateTime(timezone=True), nullable=True)
stripe_connect_account_id = Column(String, nullable=True)
# Timestamps
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
last_active_at = Column(DateTime(timezone=True), nullable=True)
# Relationships
shaders = relationship("Shader", back_populates="author")
votes = relationship("Vote", back_populates="user")
api_keys = relationship("ApiKey", back_populates="user")
@@ -47,9 +48,12 @@ class Shader(Base):
title = Column(String, nullable=False)
description = Column(Text, nullable=True)
glsl_code = Column(Text, nullable=False)
status = Column(String, nullable=False, default="published") # draft, published, archived
is_public = Column(Boolean, default=True)
is_ai_generated = Column(Boolean, default=False)
is_system = Column(Boolean, default=False)
ai_provider = Column(String, nullable=True)
system_label = Column(String, nullable=True)
thumbnail_url = Column(String, nullable=True)
preview_url = Column(String, nullable=True)
render_status = Column(String, default="pending")
@@ -58,20 +62,38 @@
tags = Column(ARRAY(String), default=list)
shader_type = Column(String, default="2d")
forked_from = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="SET NULL"), nullable=True)
current_version = Column(Integer, nullable=False, default=1)
view_count = Column(Integer, default=0)
score = Column(Float, default=0.0)
# Creator economy stubs
access_tier = Column(String, default="open")
source_unlock_price_cents = Column(Integer, nullable=True)
commercial_license_price_cents = Column(Integer, nullable=True)
verified_creator_shader = Column(Boolean, default=False)
# Timestamps
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
updated_at = Column(DateTime(timezone=True), default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
author = relationship("User", back_populates="shaders")
votes = relationship("Vote", back_populates="shader")
versions = relationship("ShaderVersion", back_populates="shader", order_by="ShaderVersion.version_number.desc()")
class ShaderVersion(Base):
__tablename__ = "shader_versions"
__table_args__ = (UniqueConstraint("shader_id", "version_number"),)
id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
version_number = Column(Integer, nullable=False)
glsl_code = Column(Text, nullable=False)
title = Column(String, nullable=False)
description = Column(Text, nullable=True)
tags = Column(ARRAY(String), default=list)
style_metadata = Column(JSONB, nullable=True)
change_note = Column(Text, nullable=True)
thumbnail_url = Column(String, nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
shader = relationship("Shader", back_populates="versions")
class Vote(Base):
@@ -97,7 +119,7 @@ class EngagementEvent(Base):
shader_id = Column(UUID(as_uuid=True), ForeignKey("shaders.id", ondelete="CASCADE"), nullable=False)
event_type = Column(String, nullable=False)
dwell_secs = Column(Float, nullable=True)
metadata = Column(JSONB, nullable=True)
event_metadata = Column("metadata", JSONB, nullable=True)
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
@@ -195,7 +217,6 @@ class Comment(Base):
created_at = Column(DateTime(timezone=True), default=datetime.utcnow)
# Creator economy stubs (dormant)
class SourceUnlock(Base):
__tablename__ = "source_unlocks"

View file

@@ -3,10 +3,10 @@
from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from sqlalchemy import select, text
from app.database import get_db
from app.models import User, Desire
from app.models import User, Desire, Shader
from app.schemas import DesireCreate, DesirePublic
from app.middleware.auth import get_current_user, require_tier
@@ -29,7 +29,24 @@ async def list_desires(
query = query.order_by(Desire.heat_score.desc()).limit(limit).offset(offset)
result = await db.execute(query)
return result.scalars().all()
desires = list(result.scalars().all())
# Batch-annotate cluster_count to avoid N+1 queries
desire_ids = [d.id for d in desires]
if desire_ids:
cluster_query = text("""
SELECT dc1.desire_id, COUNT(dc2.desire_id) as cluster_count
FROM desire_clusters dc1
JOIN desire_clusters dc2 ON dc1.cluster_id = dc2.cluster_id
WHERE dc1.desire_id = ANY(:desire_ids)
GROUP BY dc1.desire_id
""")
cluster_result = await db.execute(cluster_query, {"desire_ids": desire_ids})
cluster_counts = {row[0]: row[1] for row in cluster_result}
for d in desires:
d.cluster_count = cluster_counts.get(d.id, 0)
return desires
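Reviewer note: the self-join above replaces a per-desire lookup with one batch query; each desire's `cluster_count` is the size of the cluster it belongs to (including itself), and unclustered desires get 0. The same aggregation in plain Python, over hypothetical `(desire_id, cluster_id)` membership rows:

```python
from collections import Counter

def batch_cluster_counts(memberships, desire_ids):
    # memberships: (desire_id, cluster_id) rows from desire_clusters
    sizes = Counter(cluster_id for _, cluster_id in memberships)
    cluster_of = dict(memberships)
    # Desires with no cluster row get 0, mirroring cluster_counts.get(d.id, 0)
    return {d: sizes.get(cluster_of.get(d), 0) for d in desire_ids}
```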
@router.get("/{desire_id}", response_model=DesirePublic)
@@ -38,6 +55,18 @@ async def get_desire(desire_id: UUID, db: AsyncSession = Depends(get_db)):
desire = result.scalar_one_or_none()
if not desire:
raise HTTPException(status_code=404, detail="Desire not found")
# Annotate cluster_count for single desire
cluster_query = text("""
SELECT COUNT(dc2.desire_id) as cluster_count
FROM desire_clusters dc1
JOIN desire_clusters dc2 ON dc1.cluster_id = dc2.cluster_id
WHERE dc1.desire_id = :desire_id
""")
cluster_result = await db.execute(cluster_query, {"desire_id": desire_id})
row = cluster_result.first()
desire.cluster_count = row[0] if row else 0
return desire
@@ -55,9 +84,9 @@ async def create_desire(
db.add(desire)
await db.flush()
# TODO: Embed prompt text (Track G)
# TODO: Check similarity clustering (Track G)
# TODO: Enqueue process_desire worker job (Track G)
# Fire-and-forget: enqueue embedding + clustering worker task
from app.worker import process_desire
process_desire.delay(str(desire.id))
return desire
@@ -76,6 +105,13 @@ async def fulfill_desire(
if desire.status != "open":
raise HTTPException(status_code=400, detail="Desire is not open")
# Validate shader exists and is published
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if shader.status != "published":
raise HTTPException(status_code=400, detail="Shader must be published to fulfill a desire")
from datetime import datetime, timezone
desire.status = "fulfilled"
desire.fulfilled_by_shader = shader_id

View file

@@ -1,38 +1,88 @@
"""Feed router — personalized feed, trending, new."""
"""Feed router — personalized feed, trending, new, similar.
Feed ranking strategy:
- Anonymous users: score * 0.6 + recency * 0.3 + random * 0.1
- Authenticated users: same base + tag affinity boost from engagement history
- Tag affinities are built from votes plus dwell events from the last 30 days
"""
import random as py_random
from uuid import UUID
from datetime import datetime, timezone, timedelta
from fastapi import APIRouter, Depends, Query
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from sqlalchemy import select, func, text, case, literal_column
from app.database import get_db
from app.models import User, Shader
from app.models import User, Shader, Vote, EngagementEvent
from app.schemas import ShaderFeedItem, DwellReport
from app.middleware.auth import get_optional_user, get_current_user
router = APIRouter()
_PUB = [Shader.is_public == True, Shader.status == "published"]
@router.get("", response_model=list[ShaderFeedItem])
async def get_feed(
limit: int = Query(20, ge=1, le=50),
cursor: str | None = Query(None),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
"""
Personalized feed for authenticated users (pgvector taste match).
Trending/new for anonymous users.
Main feed. For authenticated users, boosts shaders matching their
tag affinities (built from votes and dwell time). For anonymous users,
blends trending score with recency and a randomness factor.
"""
# TODO: Implement full recommendation engine (Track F)
# For now: return newest public shaders
if user:
# Build tag affinity from user's positive engagement
# (upvoted shaders + shaders with >10s dwell time)
affinity_tags = await _get_user_tag_affinities(db, user.id)
# Fetch candidate shaders
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.order_by(Shader.created_at.desc())
.limit(limit)
.where(*_PUB)
.order_by(Shader.score.desc(), Shader.created_at.desc())
.limit(limit * 3) # over-fetch for re-ranking
.offset(offset)
)
result = await db.execute(query)
return result.scalars().all()
candidates = list(result.scalars().all())
# Re-rank with tag affinity boost + randomness
scored = []
for s in candidates:
base = (s.score or 0) * 0.5
recency = _recency_score(s.created_at) * 0.2
tag_boost = _tag_affinity_score(s.tags or [], affinity_tags) * 0.2
chaos = py_random.random() * 0.1
scored.append((base + recency + tag_boost + chaos, s))
scored.sort(key=lambda x: x[0], reverse=True)
return [s for _, s in scored[:limit]]
else:
# Anonymous: trending + recency + chaos
query = (
select(Shader)
.where(*_PUB)
.order_by(Shader.score.desc(), Shader.created_at.desc())
.limit(limit * 2)
.offset(offset)
)
result = await db.execute(query)
candidates = list(result.scalars().all())
scored = []
for s in candidates:
base = (s.score or 0) * 0.6
recency = _recency_score(s.created_at) * 0.3
chaos = py_random.random() * 0.1
scored.append((base + recency + chaos, s))
scored.sort(key=lambda x: x[0], reverse=True)
return [s for _, s in scored[:limit]]
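Reviewer note: both branches share the same shape: a weighted blend of trending score, recency, (for signed-in users) tag affinity, and a small random term that keeps the feed from ossifying. A self-contained sketch of the anonymous blend (weights 0.6 / 0.3 / 0.1, as above), with candidate dicts standing in for Shader rows:

```python
import random
from datetime import datetime, timezone

def recency(created_at: datetime) -> float:
    age_hours = (datetime.now(timezone.utc) - created_at).total_seconds() / 3600
    return 1.0 / (1.0 + age_hours / 72.0)

def rank_anonymous(shaders: list[dict], limit: int, rng: random.Random) -> list[dict]:
    # score * 0.6 + recency * 0.3 + random * 0.1
    scored = [
        (s["score"] * 0.6 + recency(s["created_at"]) * 0.3 + rng.random() * 0.1, s)
        for s in shaders
    ]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [s for _, s in scored[:limit]]
```

Because the random term is capped at 0.1, it reorders near-ties but cannot outvote a large score gap.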
@router.get("/trending", response_model=list[ShaderFeedItem])
@@ -40,9 +90,10 @@ async def get_trending(
limit: int = Query(20, ge=1, le=50),
db: AsyncSession = Depends(get_db),
):
"""Pure score-ranked feed."""
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.where(*_PUB)
.order_by(Shader.score.desc())
.limit(limit)
)
@@ -55,9 +106,10 @@ async def get_new(
limit: int = Query(20, ge=1, le=50),
db: AsyncSession = Depends(get_db),
):
"""Chronological feed."""
query = (
select(Shader)
.where(Shader.is_public == True, Shader.render_status == "ready")
.where(*_PUB)
.order_by(Shader.created_at.desc())
.limit(limit)
)
@@ -65,22 +117,134 @@ async def get_new(
return result.scalars().all()
@router.get("/similar/{shader_id}", response_model=list[ShaderFeedItem])
async def get_similar(
shader_id: UUID,
limit: int = Query(10, ge=1, le=30),
db: AsyncSession = Depends(get_db),
):
"""Find shaders similar to a given shader by tag overlap."""
source = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not source or not source.tags:
return []
# Find shaders sharing the most tags
from sqlalchemy import type_coerce
from sqlalchemy.dialects.postgresql import ARRAY as PG_ARRAY
from sqlalchemy import Text
query = (
select(Shader)
.where(
*_PUB,
Shader.id != shader_id,
Shader.tags.overlap(type_coerce(source.tags, PG_ARRAY(Text)))
)
.order_by(Shader.score.desc())
.limit(limit * 2)
)
result = await db.execute(query)
candidates = list(result.scalars().all())
# Rank by tag overlap count
source_tags = set(source.tags)
scored = []
for s in candidates:
overlap = len(source_tags & set(s.tags or []))
scored.append((overlap, s.score or 0, s))
scored.sort(key=lambda x: (x[0], x[1]), reverse=True)
return [s for _, _, s in scored[:limit]]
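Reviewer note: the re-rank above orders candidates by shared-tag count first, trending score second. The same two-key sort in isolation (candidate dicts are illustrative):

```python
def rank_similar(source_tags: list[str], candidates: list[dict], limit: int) -> list[dict]:
    # Primary key: number of shared tags; tie-break: trending score
    src = set(source_tags)
    return sorted(
        candidates,
        key=lambda c: (len(src & set(c.get("tags") or [])), c.get("score") or 0.0),
        reverse=True,
    )[:limit]
```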
@router.post("/dwell", status_code=204)
async def report_dwell(
body: DwellReport,
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
"""Report dwell time signal for recommendation engine."""
from app.models import EngagementEvent
"""Report dwell time. Updates tag affinity for authenticated users."""
event = EngagementEvent(
user_id=user.id if user else None,
session_id=body.session_id,
shader_id=body.shader_id,
event_type="dwell",
dwell_secs=body.dwell_secs,
metadata={"replayed": body.replayed},
event_metadata={"replayed": body.replayed},
)
db.add(event)
# TODO: Update user taste vector (Track F)
# ── Helpers ───────────────────────────────────────────────
def _recency_score(created_at) -> float:
"""Score from 1.0 (just created) to ~0.0 (30+ days old)."""
if not created_at:
return 0.0
if created_at.tzinfo is None:
created_at = created_at.replace(tzinfo=timezone.utc)
age_hours = (datetime.now(timezone.utc) - created_at).total_seconds() / 3600
return 1.0 / (1.0 + age_hours / 72.0) # half-life ~3 days
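Reviewer note on the curve: `1 / (1 + age/72)` is hyperbolic, so it halves at exactly 72 hours but then decays more slowly than a true exponential half-life would (it reaches 0.25 at 9 days, not 6). Checking a few points:

```python
def recency_score(age_hours: float) -> float:
    # Same curve as _recency_score, parameterized by age directly
    return 1.0 / (1.0 + age_hours / 72.0)
```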
def _tag_affinity_score(shader_tags: list[str], affinity: dict[str, float]) -> float:
"""Score based on how well a shader's tags match the user's affinities."""
if not shader_tags or not affinity:
return 0.0
total = sum(affinity.get(tag, 0.0) for tag in shader_tags)
# Normalize by number of tags to avoid bias toward heavily-tagged shaders
return total / len(shader_tags)
async def _get_user_tag_affinities(db: AsyncSession, user_id: UUID) -> dict[str, float]:
"""Build a tag affinity map from user's engagement history.
Sources:
- Upvoted shaders: +1.0 per tag
- Downvoted shaders: -0.5 per tag
- Dwell > 10s: +0.3 per tag
- Dwell > 30s: +0.6 per tag
Returns: {tag: affinity_score}
"""
affinities: dict[str, float] = {}
# Votes
vote_query = (
select(Shader.tags, Vote.value)
.join(Vote, Vote.shader_id == Shader.id)
.where(Vote.user_id == user_id)
)
vote_result = await db.execute(vote_query)
for tags, value in vote_result:
if not tags:
continue
weight = 1.0 if value == 1 else -0.5
for tag in tags:
affinities[tag] = affinities.get(tag, 0.0) + weight
# Dwell events (last 30 days)
cutoff = datetime.now(timezone.utc) - timedelta(days=30)
dwell_query = (
select(Shader.tags, EngagementEvent.dwell_secs)
.join(EngagementEvent, EngagementEvent.shader_id == Shader.id)
.where(
EngagementEvent.user_id == user_id,
EngagementEvent.event_type == "dwell",
EngagementEvent.created_at >= cutoff,
)
)
dwell_result = await db.execute(dwell_query)
for tags, dwell in dwell_result:
if not tags or not dwell:
continue
if dwell > 30:
weight = 0.6
elif dwell > 10:
weight = 0.3
else:
continue # ignore short dwells
for tag in tags:
affinities[tag] = affinities.get(tag, 0.0) + weight
return affinities
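Reviewer note: the weights above (+1.0 per tag on upvote, -0.5 on downvote, +0.3 for >10s dwell, +0.6 for >30s) reduce to a simple fold over engagement rows. The same accumulation with plain tuples standing in for query results:

```python
def build_affinities(vote_rows, dwell_rows):
    # vote_rows: (tags, vote_value); dwell_rows: (tags, dwell_secs)
    aff: dict[str, float] = {}

    def bump(tags, weight):
        for tag in tags or []:
            aff[tag] = aff.get(tag, 0.0) + weight

    for tags, value in vote_rows:
        bump(tags, 1.0 if value == 1 else -0.5)
    for tags, secs in dwell_rows:
        if secs > 30:
            bump(tags, 0.6)
        elif secs > 10:
            bump(tags, 0.3)  # dwells of 10s or less are ignored
    return aff
```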

View file

@@ -5,7 +5,7 @@ from uuid import UUID
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from passlib.context import CryptContext
import bcrypt
from app.database import get_db
from app.models import User, ApiKey
@@ -13,18 +13,16 @@ from app.schemas import ApiKeyCreate, ApiKeyPublic, ApiKeyCreated
from app.middleware.auth import get_current_user, require_tier
router = APIRouter()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
def generate_api_key() -> tuple[str, str, str]:
"""Generate an API key. Returns (full_key, prefix, hash)."""
raw = secrets.token_bytes(32)
# base32 encoding: lowercase alphanumerics, padding stripped
import base64
encoded = base64.b32encode(raw).decode().rstrip("=").lower()
full_key = f"ff_key_{encoded}"
prefix = full_key[:16] # ff_key_ + 8 chars
key_hash = pwd_context.hash(full_key)
prefix = full_key[:16]
key_hash = bcrypt.hashpw(full_key.encode("utf-8"), bcrypt.gensalt(rounds=10)).decode("utf-8")
return full_key, prefix, key_hash
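Reviewer note: the key format is reproducible with the stdlib alone: 32 random bytes, base32-encoded without padding, lowercased, behind a fixed `ff_key_` prefix. The stored prefix is the first 16 characters, i.e. the 7-character literal prefix plus 9 encoded characters:

```python
import base64
import secrets

def make_key() -> tuple[str, str]:
    raw = secrets.token_bytes(32)
    # 32 bytes -> 52 base32 chars once padding is stripped
    encoded = base64.b32encode(raw).decode("ascii").rstrip("=").lower()
    full_key = f"ff_key_{encoded}"
    return full_key, full_key[:16]
```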

View file

@@ -1,42 +1,54 @@
"""Shaders router — CRUD, submit, fork, search."""
"""Shaders router — CRUD, versioning, drafts, fork, search."""
from uuid import UUID
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func, or_
from sqlalchemy import select, func
from app.database import get_db
from app.models import User, Shader
from app.schemas import ShaderCreate, ShaderUpdate, ShaderPublic
from app.models import User, Shader, ShaderVersion
from app.schemas import ShaderCreate, ShaderUpdate, ShaderPublic, ShaderVersionPublic
from app.middleware.auth import get_current_user, get_optional_user
from app.services.glsl_validator import validate_glsl
router = APIRouter()
# ── Public list / search ──────────────────────────────────
@router.get("", response_model=list[ShaderPublic])
async def list_shaders(
q: str | None = Query(None, description="Search query"),
tags: list[str] | None = Query(None, description="Filter by tags"),
shader_type: str | None = Query(None, description="Filter by type: 2d, 3d, audio-reactive"),
sort: str = Query("trending", description="Sort: trending, new, top"),
is_system: bool | None = Query(None, description="Filter to system/platform shaders"),
limit: int = Query(20, ge=1, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
):
query = select(Shader).where(Shader.is_public == True, Shader.render_status == "ready")
query = select(Shader).where(
Shader.is_public == True,
Shader.status == "published",
)
if q:
query = query.where(Shader.title.ilike(f"%{q}%"))
if tags:
query = query.where(Shader.tags.overlap(tags))
from sqlalchemy import type_coerce, Text
from sqlalchemy.dialects.postgresql import ARRAY as PG_ARRAY
query = query.where(Shader.tags.overlap(type_coerce(tags, PG_ARRAY(Text))))
if shader_type:
query = query.where(Shader.shader_type == shader_type)
if is_system is not None:
query = query.where(Shader.is_system == is_system)
if sort == "new":
query = query.order_by(Shader.created_at.desc())
elif sort == "top":
query = query.order_by(Shader.score.desc())
else: # trending
else:
query = query.order_by(Shader.score.desc(), Shader.created_at.desc())
query = query.limit(limit).offset(offset)
@@ -44,6 +56,27 @@ async def list_shaders(
return result.scalars().all()
# ── My shaders (workspace) ───────────────────────────────
@router.get("/mine", response_model=list[ShaderPublic])
async def my_shaders(
status_filter: str | None = Query(None, alias="status", description="draft, published, archived"),
limit: int = Query(50, ge=1, le=100),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""List the authenticated user's shaders — drafts, published, archived."""
query = select(Shader).where(Shader.author_id == user.id)
if status_filter:
query = query.where(Shader.status == status_filter)
query = query.order_by(Shader.updated_at.desc()).limit(limit).offset(offset)
result = await db.execute(query)
return result.scalars().all()
# ── Single shader ─────────────────────────────────────────
@router.get("/{shader_id}", response_model=ShaderPublic)
async def get_shader(
shader_id: UUID,
@@ -55,24 +88,87 @@ async def get_shader(
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
# Drafts are only visible to their author
if shader.status == "draft" and (not user or user.id != shader.author_id):
raise HTTPException(status_code=404, detail="Shader not found")
if not shader.is_public and (not user or user.id != shader.author_id):
raise HTTPException(status_code=404, detail="Shader not found")
# Increment view count
shader.view_count += 1
return shader
# ── Version history ───────────────────────────────────────
@router.get("/{shader_id}/versions", response_model=list[ShaderVersionPublic])
async def list_versions(
shader_id: UUID,
db: AsyncSession = Depends(get_db),
user: User | None = Depends(get_optional_user),
):
"""Get the version history of a shader."""
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if shader.status == "draft" and (not user or user.id != shader.author_id):
raise HTTPException(status_code=404, detail="Shader not found")
result = await db.execute(
select(ShaderVersion)
.where(ShaderVersion.shader_id == shader_id)
.order_by(ShaderVersion.version_number.desc())
)
return result.scalars().all()
@router.get("/{shader_id}/versions/{version_number}", response_model=ShaderVersionPublic)
async def get_version(
shader_id: UUID,
version_number: int,
db: AsyncSession = Depends(get_db),
):
result = await db.execute(
select(ShaderVersion).where(
ShaderVersion.shader_id == shader_id,
ShaderVersion.version_number == version_number,
)
)
version = result.scalar_one_or_none()
if not version:
raise HTTPException(status_code=404, detail="Version not found")
return version
# ── Create shader (draft or published) ───────────────────
@router.post("", response_model=ShaderPublic, status_code=status.HTTP_201_CREATED)
async def create_shader(
body: ShaderCreate,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
# TODO: Turnstile verification for submit
# TODO: Rate limit check (free tier: 5/month)
# TODO: GLSL validation via glslang
# TODO: Enqueue render job
# Rate limit published shaders for free tier (drafts are unlimited)
if body.status == "published" and user.subscription_tier == "free":
month_start = datetime.now(timezone.utc).replace(day=1, hour=0, minute=0, second=0, microsecond=0)
count_result = await db.execute(
select(func.count()).select_from(Shader).where(
Shader.author_id == user.id,
Shader.status == "published",
Shader.created_at >= month_start,
)
)
monthly_count = count_result.scalar()
if monthly_count >= 5:
raise HTTPException(status_code=429, detail="Free tier: 5 published shaders/month. Upgrade to Pro for unlimited.")
# Validate GLSL
validation = validate_glsl(body.glsl_code, body.shader_type)
if not validation.valid:
raise HTTPException(status_code=422, detail={
"message": "GLSL validation failed",
"errors": validation.errors,
"warnings": validation.warnings,
})
shader = Shader(
author_id=user.id,
@@ -81,15 +177,52 @@ async def create_shader(
glsl_code=body.glsl_code,
tags=body.tags,
shader_type=body.shader_type,
is_public=body.is_public,
is_public=body.is_public if body.status == "published" else False,
status=body.status,
style_metadata=body.style_metadata,
render_status="pending",
render_status="ready" if body.status == "draft" else "pending",
current_version=1,
)
db.add(shader)
await db.flush()
# Create version 1 snapshot
v1 = ShaderVersion(
shader_id=shader.id,
version_number=1,
glsl_code=body.glsl_code,
title=body.title,
description=body.description,
tags=body.tags,
style_metadata=body.style_metadata,
change_note="Initial version",
)
db.add(v1)
# Enqueue render for published shaders
if body.status == "published":
from app.worker import celery_app
try:
celery_app.send_task("render_shader", args=[str(shader.id)])
except Exception:
shader.render_status = "ready"
# Link to desire if fulfilling
if body.fulfills_desire_id:
from app.models import Desire
desire = (await db.execute(
select(Desire).where(Desire.id == body.fulfills_desire_id, Desire.status == "open")
)).scalar_one_or_none()
if desire:
desire.status = "fulfilled"
desire.fulfilled_by_shader = shader.id
desire.fulfilled_at = datetime.now(timezone.utc)
return shader
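Reviewer note: the free-tier quota in `create_shader` truncates the current UTC time to the first of the month, so the published-shader counter resets at each calendar-month boundary rather than rolling over a 30-day window. The truncation in isolation:

```python
from datetime import datetime, timezone

def quota_window_start(now: datetime) -> datetime:
    # First instant of the current month (UTC): published-shader counts reset here
    return now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
```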
# ── Update shader (creates new version) ──────────────────
@router.put("/{shader_id}", response_model=ShaderPublic)
async def update_shader(
shader_id: UUID,
@@ -104,12 +237,64 @@ async def update_shader(
if shader.author_id != user.id and user.role != "admin":
raise HTTPException(status_code=403, detail="Not the shader owner")
for field, value in body.model_dump(exclude_unset=True).items():
updates = body.model_dump(exclude_unset=True)
change_note = updates.pop("change_note", None)
code_changed = "glsl_code" in updates
# Re-validate GLSL if code changed
if code_changed:
validation = validate_glsl(updates["glsl_code"], shader.shader_type)
if not validation.valid:
raise HTTPException(status_code=422, detail={
"message": "GLSL validation failed",
"errors": validation.errors,
"warnings": validation.warnings,
})
# Apply updates
for field, value in updates.items():
setattr(shader, field, value)
# Create a new version snapshot if code or metadata changed
if code_changed or "title" in updates or "description" in updates or "tags" in updates:
shader.current_version += 1
new_version = ShaderVersion(
shader_id=shader.id,
version_number=shader.current_version,
glsl_code=shader.glsl_code,
title=shader.title,
description=shader.description,
tags=shader.tags,
style_metadata=shader.style_metadata,
change_note=change_note,
)
db.add(new_version)
# Re-render if code changed and shader is published
if code_changed and shader.status == "published":
shader.render_status = "pending"
from app.worker import celery_app
try:
celery_app.send_task("render_shader", args=[str(shader.id)])
except Exception:
shader.render_status = "ready"
# If publishing a draft, ensure it's public and queue render
if "status" in updates and updates["status"] == "published" and shader.render_status != "ready":
shader.is_public = True
shader.render_status = "pending"
from app.worker import celery_app
try:
celery_app.send_task("render_shader", args=[str(shader.id)])
except Exception:
shader.render_status = "ready"
shader.updated_at = datetime.now(timezone.utc)
return shader
# ── Delete ────────────────────────────────────────────────
@router.delete("/{shader_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_shader(
shader_id: UUID,
@@ -122,10 +307,11 @@ async def delete_shader(
raise HTTPException(status_code=404, detail="Shader not found")
if shader.author_id != user.id and user.role != "admin":
raise HTTPException(status_code=403, detail="Not the shader owner")
await db.delete(shader)
# ── Fork ──────────────────────────────────────────────────
@router.post("/{shader_id}/fork", response_model=ShaderPublic, status_code=status.HTTP_201_CREATED)
async def fork_shader(
shader_id: UUID,
@@ -136,7 +322,7 @@ async def fork_shader(
original = result.scalar_one_or_none()
if not original:
raise HTTPException(status_code=404, detail="Shader not found")
if not original.is_public:
if not original.is_public and original.status != "published":
raise HTTPException(status_code=404, detail="Shader not found")
forked = Shader(
@@ -147,8 +333,84 @@
tags=original.tags,
shader_type=original.shader_type,
forked_from=original.id,
render_status="pending",
style_metadata=original.style_metadata,
status="draft", # Forks start as drafts
is_public=False,
render_status="ready",
current_version=1,
)
db.add(forked)
await db.flush()
v1 = ShaderVersion(
shader_id=forked.id,
version_number=1,
glsl_code=original.glsl_code,
title=forked.title,
description=forked.description,
tags=original.tags,
style_metadata=original.style_metadata,
change_note=f"Forked from {original.title}",
)
db.add(v1)
return forked
# ── Restore a version ────────────────────────────────────
@router.post("/{shader_id}/versions/{version_number}/restore", response_model=ShaderPublic)
async def restore_version(
shader_id: UUID,
version_number: int,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""Restore a shader to a previous version (creates a new version snapshot)."""
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
if shader.author_id != user.id and user.role != "admin":
raise HTTPException(status_code=403, detail="Not the shader owner")
version = (await db.execute(
select(ShaderVersion).where(
ShaderVersion.shader_id == shader_id,
ShaderVersion.version_number == version_number,
)
)).scalar_one_or_none()
if not version:
raise HTTPException(status_code=404, detail="Version not found")
# Apply version data to shader
shader.glsl_code = version.glsl_code
shader.title = version.title
shader.description = version.description
shader.tags = version.tags
shader.style_metadata = version.style_metadata
shader.current_version += 1
shader.updated_at = datetime.now(timezone.utc)
# Create a new version snapshot for the restore
restore_v = ShaderVersion(
shader_id=shader.id,
version_number=shader.current_version,
glsl_code=version.glsl_code,
title=version.title,
description=version.description,
tags=version.tags,
style_metadata=version.style_metadata,
change_note=f"Restored from version {version_number}",
)
db.add(restore_v)
# Re-render if published
if shader.status == "published":
shader.render_status = "pending"
from app.worker import celery_app
try:
celery_app.send_task("render_shader", args=[str(shader.id)])
except Exception:
shader.render_status = "ready"
return shader
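The restore endpoint above is append-only: it never rewrites history, it bumps `current_version` and appends a fresh snapshot carrying the old version's content. A minimal in-memory sketch of that invariant (the `restore` helper, the version dicts, and their field names are illustrative, not the app's models):

```python
# Append-only restore: restoring version k copies its content into a
# brand-new version rather than rolling the history back.
def restore(versions: list[dict], current: int, k: int) -> tuple[list[dict], int]:
    source = next(v for v in versions if v["n"] == k)
    current += 1
    versions.append({"n": current, "code": source["code"],
                     "note": f"Restored from version {k}"})
    return versions, current

history = [{"n": 1, "code": "v1 glsl"}, {"n": 2, "code": "v2 glsl"}]
history, cur = restore(history, current=2, k=1)
# History grew to 3 entries; the newest one carries version 1's code.
```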

View file

@ -6,8 +6,9 @@ from sqlalchemy import select
from app.database import get_db
from app.models import User
from app.schemas import UserPublic, UserMe, UserUpdate, ByokKeysUpdate
from app.middleware.auth import get_current_user, require_tier
from app.services.byok import encrypt_key, get_stored_providers
router = APIRouter()
@ -28,13 +29,58 @@ async def get_me(user: User = Depends(get_current_user)):
@router.put("/me", response_model=UserMe)
async def update_me(
body: UserUpdate,
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
"""Update user settings. (Expanded in Track B)"""
# TODO: Accept settings updates (username, email, etc.)
updates = body.model_dump(exclude_unset=True)
if not updates:
return user
# Check uniqueness for username/email changes
if "username" in updates and updates["username"] != user.username:
existing = await db.execute(
select(User).where(User.username == updates["username"])
)
if existing.scalar_one_or_none():
raise HTTPException(status_code=409, detail="Username already taken")
if "email" in updates and updates["email"] != user.email:
existing = await db.execute(
select(User).where(User.email == updates["email"])
)
if existing.scalar_one_or_none():
raise HTTPException(status_code=409, detail="Email already taken")
for field, value in updates.items():
setattr(user, field, value)
return user
@router.put("/me/ai-keys")
async def update_byok_keys(
body: ByokKeysUpdate,
db: AsyncSession = Depends(get_db),
user: User = Depends(require_tier("pro", "studio")),
):
"""Store encrypted BYOK API keys for AI providers."""
from app.services.byok import save_user_keys
await save_user_keys(db, user, body)
providers = await get_stored_providers(db, user)
return {"status": "ok", "configured_providers": providers}
@router.get("/me/ai-keys")
async def get_byok_keys(
db: AsyncSession = Depends(get_db),
user: User = Depends(require_tier("pro", "studio")),
):
"""List which providers have BYOK keys configured (never returns actual keys)."""
providers = await get_stored_providers(db, user)
return {"configured_providers": providers}
# ── Creator Economy Stubs (501) ─────────────────────────────

View file

@ -1,9 +1,11 @@
"""Votes & engagement router."""
"""Votes & engagement router with hot score calculation."""
import math
from uuid import UUID
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func
from app.database import get_db
from app.models import User, Shader, Vote, EngagementEvent
@ -13,6 +15,37 @@ from app.middleware.auth import get_current_user, get_optional_user
router = APIRouter()
def hot_score(upvotes: int, downvotes: int, age_hours: float) -> float:
"""Wilson score lower bound with time decay.
Balances confidence (more votes = more certain) with recency
(newer shaders get a boost that decays over ~48 hours).
"""
n = upvotes + downvotes
if n == 0:
return 0.0
z = 1.96 # 95% confidence
p = upvotes / n
wilson = (p + z * z / (2 * n) - z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))) / (1 + z * z / n)
decay = 1.0 / (1.0 + age_hours / 48.0)
return wilson * decay
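As a sanity check on the formula above, here is `hot_score` restated standalone, with two properties that should hold: a unanimously upvoted shader with more votes outranks one with fewer (the Wilson lower bound rises with sample size), and the decay factor is exactly 1/2 at 48 hours:

```python
import math

def hot_score(upvotes: int, downvotes: int, age_hours: float) -> float:
    # Wilson score lower bound (95% confidence) with time decay.
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    z = 1.96
    p = upvotes / n
    wilson = (p + z * z / (2 * n)
              - z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))) / (1 + z * z / n)
    return wilson * (1.0 / (1.0 + age_hours / 48.0))

# More votes at the same 100% upvote ratio -> higher confidence -> higher score.
assert hot_score(10, 0, 0) > hot_score(1, 0, 0)
# The decay factor is exactly 1/2 at age_hours = 48.
assert math.isclose(hot_score(10, 0, 48), hot_score(10, 0, 0) / 2)
```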
async def recalculate_score(db: AsyncSession, shader: Shader):
"""Recalculate and update a shader's hot score based on current votes."""
up_result = await db.execute(
select(func.count()).select_from(Vote).where(Vote.shader_id == shader.id, Vote.value == 1)
)
down_result = await db.execute(
select(func.count()).select_from(Vote).where(Vote.shader_id == shader.id, Vote.value == -1)
)
upvotes = up_result.scalar() or 0
downvotes = down_result.scalar() or 0
age_hours = (datetime.now(timezone.utc) - shader.created_at.replace(tzinfo=timezone.utc)).total_seconds() / 3600
shader.score = hot_score(upvotes, downvotes, age_hours)
@router.post("/shaders/{shader_id}/vote", status_code=status.HTTP_200_OK)
async def vote_shader(
shader_id: UUID,
@ -20,12 +53,10 @@ async def vote_shader(
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
# Verify shader exists
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
# Upsert vote
existing = (await db.execute(
select(Vote).where(Vote.user_id == user.id, Vote.shader_id == shader_id)
)).scalar_one_or_none()
@ -35,8 +66,10 @@ async def vote_shader(
else:
db.add(Vote(user_id=user.id, shader_id=shader_id, value=body.value))
await db.flush()
await recalculate_score(db, shader)
return {"status": "ok", "value": body.value, "new_score": round(shader.score, 4)}
@router.delete("/shaders/{shader_id}/vote", status_code=status.HTTP_204_NO_CONTENT)
@ -45,13 +78,18 @@ async def remove_vote(
db: AsyncSession = Depends(get_db),
user: User = Depends(get_current_user),
):
shader = (await db.execute(select(Shader).where(Shader.id == shader_id))).scalar_one_or_none()
if not shader:
raise HTTPException(status_code=404, detail="Shader not found")
existing = (await db.execute(
select(Vote).where(Vote.user_id == user.id, Vote.shader_id == shader_id)
)).scalar_one_or_none()
if existing:
await db.delete(existing)
await db.flush()
await recalculate_score(db, shader)
@router.post("/shaders/{shader_id}/replay", status_code=status.HTTP_204_NO_CONTENT)

View file

@ -35,6 +35,7 @@ class UserPublic(BaseModel):
id: UUID
username: str
role: str
is_system: bool
subscription_tier: str
is_verified_creator: bool
created_at: datetime
@ -47,6 +48,17 @@ class UserMe(UserPublic):
last_active_at: Optional[datetime] = None
class UserUpdate(BaseModel):
username: Optional[str] = Field(None, min_length=3, max_length=30, pattern=r"^[a-zA-Z0-9_-]+$")
email: Optional[EmailStr] = None
class ByokKeysUpdate(BaseModel):
anthropic_key: Optional[str] = Field(None, description="Anthropic API key")
openai_key: Optional[str] = Field(None, description="OpenAI API key")
ollama_endpoint: Optional[str] = Field(None, description="Ollama endpoint URL")
# ════════════════════════════════════════════════════════════
# SHADERS
# ════════════════════════════════════════════════════════════
@ -58,6 +70,7 @@ class ShaderCreate(BaseModel):
tags: list[str] = Field(default_factory=list, max_length=10)
shader_type: str = Field(default="2d", pattern=r"^(2d|3d|audio-reactive)$")
is_public: bool = True
status: str = Field(default="published", pattern=r"^(draft|published)$")
style_metadata: Optional[dict] = None
fulfills_desire_id: Optional[UUID] = None
@ -68,6 +81,8 @@ class ShaderUpdate(BaseModel):
glsl_code: Optional[str] = Field(None, min_length=10)
tags: Optional[list[str]] = None
is_public: Optional[bool] = None
status: Optional[str] = Field(None, pattern=r"^(draft|published|archived)$")
change_note: Optional[str] = Field(None, max_length=200)
class ShaderPublic(BaseModel):
@ -78,9 +93,12 @@ class ShaderPublic(BaseModel):
title: str
description: Optional[str]
glsl_code: str
status: str
is_public: bool
is_ai_generated: bool
is_system: bool
ai_provider: Optional[str]
system_label: Optional[str]
thumbnail_url: Optional[str]
preview_url: Optional[str]
render_status: str
@ -88,6 +106,7 @@ class ShaderPublic(BaseModel):
tags: list[str]
shader_type: str
forked_from: Optional[UUID]
current_version: int
view_count: int
score: float
created_at: datetime
@ -109,10 +128,28 @@ class ShaderFeedItem(BaseModel):
score: float
view_count: int
is_ai_generated: bool
is_system: bool
system_label: Optional[str]
style_metadata: Optional[dict]
created_at: datetime
class ShaderVersionPublic(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: UUID
shader_id: UUID
version_number: int
glsl_code: str
title: str
description: Optional[str]
tags: list[str]
style_metadata: Optional[dict]
change_note: Optional[str]
thumbnail_url: Optional[str]
created_at: datetime
# ════════════════════════════════════════════════════════════
# VOTES & ENGAGEMENT
# ════════════════════════════════════════════════════════════
@ -147,6 +184,7 @@ class DesirePublic(BaseModel):
tip_amount_cents: int
status: str
heat_score: float
cluster_count: int = 0
fulfilled_by_shader: Optional[UUID]
fulfilled_at: Optional[datetime]
created_at: datetime
@ -158,13 +196,13 @@ class DesirePublic(BaseModel):
class GenerateRequest(BaseModel):
prompt: str = Field(..., min_length=5, max_length=500)
provider: Optional[str] = None  # anthropic, openai, ollama; auto-selected if None
style_metadata: Optional[dict] = None
class GenerateStatusResponse(BaseModel):
job_id: str
status: str  # queued, generating, rendering, complete, failed
shader_id: Optional[UUID] = None
error: Optional[str] = None
@ -190,7 +228,6 @@ class ApiKeyPublic(BaseModel):
class ApiKeyCreated(ApiKeyPublic):
"""Returned only on creation — includes the full key (shown once)."""
full_key: str

View file

@ -0,0 +1,105 @@
"""BYOK (Bring Your Own Key) encryption service.
Encrypts user API keys at rest using AES-256-GCM with a key derived from
the user's ID + the server master key. Keys are only decrypted in the
worker context when a generation job runs.
"""
import os
import base64
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, delete
from app.config import get_settings
# Keys stored as JSON in user metadata — simple approach for now.
# Could be a separate table if key management gets complex.
PROVIDERS = ("anthropic", "openai", "ollama")
def _derive_key(user_id: str) -> bytes:
"""Derive a per-user AES-256 key from master key + user ID."""
settings = get_settings()
master = settings.byok_master_key.encode()
return hashlib.pbkdf2_hmac("sha256", master, user_id.encode(), 100_000)
def encrypt_key(user_id: str, plaintext: str) -> str:
"""Encrypt an API key. Returns base64-encoded nonce+ciphertext."""
key = _derive_key(user_id)
aesgcm = AESGCM(key)
nonce = os.urandom(12)
ct = aesgcm.encrypt(nonce, plaintext.encode(), None)
return base64.b64encode(nonce + ct).decode()
def decrypt_key(user_id: str, encrypted: str) -> str:
"""Decrypt an API key from base64-encoded nonce+ciphertext."""
key = _derive_key(user_id)
raw = base64.b64decode(encrypted)
nonce = raw[:12]
ct = raw[12:]
aesgcm = AESGCM(key)
return aesgcm.decrypt(nonce, ct, None).decode()
async def save_user_keys(db: AsyncSession, user, body) -> None:
"""Save encrypted BYOK keys for a user.
Each provider key is individually encrypted, so compromising one
doesn't expose the others. Keys currently live in a per-user Redis
hash; in production this should move to a dedicated encrypted
column or table.
"""
from app.redis import get_redis
redis = await get_redis()
user_id_str = str(user.id)
if body.anthropic_key is not None:
if body.anthropic_key == "":
await redis.hdel(f"byok:{user_id_str}", "anthropic")
else:
encrypted = encrypt_key(user_id_str, body.anthropic_key)
await redis.hset(f"byok:{user_id_str}", "anthropic", encrypted)
if body.openai_key is not None:
if body.openai_key == "":
await redis.hdel(f"byok:{user_id_str}", "openai")
else:
encrypted = encrypt_key(user_id_str, body.openai_key)
await redis.hset(f"byok:{user_id_str}", "openai", encrypted)
if body.ollama_endpoint is not None:
if body.ollama_endpoint == "":
await redis.hdel(f"byok:{user_id_str}", "ollama")
else:
encrypted = encrypt_key(user_id_str, body.ollama_endpoint)
await redis.hset(f"byok:{user_id_str}", "ollama", encrypted)
async def get_stored_providers(db: AsyncSession, user) -> list[str]:
"""Return list of provider names that have BYOK keys configured."""
from app.redis import get_redis
redis = await get_redis()
user_id_str = str(user.id)
keys = await redis.hkeys(f"byok:{user_id_str}")
return [k for k in keys if k in PROVIDERS]
async def get_decrypted_key(user_id: str, provider: str) -> str | None:
"""Decrypt and return a user's BYOK key for a provider. Worker-context only."""
from app.redis import get_redis
redis = await get_redis()
encrypted = await redis.hget(f"byok:{user_id}", provider)
if not encrypted:
return None
return decrypt_key(user_id, encrypted)
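The derivation step is the security anchor of this service: the same (master key, user id) pair must always produce the same 32-byte AES-256 key, and different users must get different keys. A stdlib-only sketch of just that step (the master key below is a placeholder, not a real secret):

```python
import hashlib

def derive_key(master: bytes, user_id: str) -> bytes:
    # PBKDF2-HMAC-SHA256 with the user id as salt and 100k iterations,
    # yielding a 32-byte key suitable for AES-256-GCM.
    return hashlib.pbkdf2_hmac("sha256", master, user_id.encode(), 100_000)

master = b"placeholder-master-key"  # illustrative only
k1 = derive_key(master, "user-a")
k2 = derive_key(master, "user-a")
k3 = derive_key(master, "user-b")
assert k1 == k2       # deterministic per user
assert k1 != k3       # distinct across users
assert len(k1) == 32  # AES-256 key size
```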

View file

@ -0,0 +1,406 @@
"""Clustering service — pgvector cosine similarity search and heat calculation.
Groups desires into clusters based on prompt embedding similarity.
Uses pgvector's <=> cosine distance operator for nearest-neighbor search.
Heat scores scale linearly with cluster size (more demand, more visibility).
Provides both async (for FastAPI) and sync (for Celery worker) variants.
"""
import logging
import uuid as uuid_mod
from uuid import UUID
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import Session
from app.models.models import DesireCluster
logger = logging.getLogger(__name__)
async def find_nearest_cluster(
embedding: list[float],
db: AsyncSession,
threshold: float = 0.82,
) -> tuple[UUID | None, float]:
"""Find the nearest existing desire cluster for an embedding vector.
Uses pgvector cosine distance (<=> operator). A threshold of 0.82 means
cosine_similarity >= 0.82, i.e., cosine_distance <= 0.18.
Returns:
(cluster_id, similarity) if a match is found within threshold,
(None, 0.0) if no match exists.
"""
max_distance = 1.0 - threshold
# Raw SQL for pgvector cosine distance — SQLAlchemy ORM doesn't natively
# support the <=> operator without extra configuration.
query = text("""
SELECT d.id AS desire_id,
(d.prompt_embedding <=> :embedding) AS distance
FROM desires d
WHERE d.prompt_embedding IS NOT NULL
AND (d.prompt_embedding <=> :embedding) <= :max_distance
ORDER BY distance ASC
LIMIT 1
""")
result = await db.execute(
query,
{"embedding": str(embedding), "max_distance": max_distance},
)
row = result.first()
if row is None:
logger.debug("No nearby cluster found (threshold=%.2f)", threshold)
return (None, 0.0)
# Found a nearby desire — look up its cluster membership
matched_desire_id = row.desire_id
similarity = 1.0 - row.distance
cluster_query = text("""
SELECT cluster_id FROM desire_clusters
WHERE desire_id = :desire_id
LIMIT 1
""")
cluster_result = await db.execute(
cluster_query, {"desire_id": matched_desire_id}
)
cluster_row = cluster_result.first()
if cluster_row is None:
# Nearby desire exists but isn't in a cluster — shouldn't normally
# happen but handle gracefully by treating as no match.
logger.warning(
"Desire %s is nearby (sim=%.3f) but has no cluster assignment",
matched_desire_id,
similarity,
)
return (None, 0.0)
logger.info(
"Found nearby cluster %s via desire %s (similarity=%.3f)",
cluster_row.cluster_id,
matched_desire_id,
similarity,
)
return (cluster_row.cluster_id, similarity)
async def create_cluster(desire_id: UUID, db: AsyncSession) -> UUID:
"""Create a new single-member cluster for a desire.
Returns the new cluster_id.
"""
cluster_id = uuid_mod.uuid4()
entry = DesireCluster(
cluster_id=cluster_id,
desire_id=desire_id,
similarity=1.0,
)
db.add(entry)
await db.flush()
logger.info("Created new cluster %s for desire %s", cluster_id, desire_id)
return cluster_id
async def add_to_cluster(
cluster_id: UUID,
desire_id: UUID,
similarity: float,
db: AsyncSession,
) -> None:
"""Add a desire to an existing cluster.
Uses INSERT ... ON CONFLICT DO NOTHING so re-processing is safe
(idempotent: a desire already in the cluster isn't duplicated).
"""
insert_query = text("""
INSERT INTO desire_clusters (cluster_id, desire_id, similarity)
VALUES (:cluster_id, :desire_id, :similarity)
ON CONFLICT (cluster_id, desire_id) DO NOTHING
""")
await db.execute(insert_query, {
"cluster_id": cluster_id,
"desire_id": desire_id,
"similarity": similarity,
})
await db.flush()
logger.info(
"Added desire %s to cluster %s (similarity=%.3f)",
desire_id,
cluster_id,
similarity,
)
async def recalculate_heat(cluster_id: UUID, db: AsyncSession) -> float:
"""Recalculate heat scores for all desires in a cluster.
Heat = cluster_size (linear scaling). A 3-member cluster means each
desire in it gets heat_score = 3.0.
Returns the new heat score.
"""
# Count members in this cluster
count_query = text("""
SELECT COUNT(*) AS cnt FROM desire_clusters
WHERE cluster_id = :cluster_id
""")
result = await db.execute(count_query, {"cluster_id": cluster_id})
cluster_size = result.scalar_one()
heat_score = float(cluster_size)
# Update all desires in the cluster
update_query = text("""
UPDATE desires SET heat_score = :heat
WHERE id IN (
SELECT desire_id FROM desire_clusters
WHERE cluster_id = :cluster_id
)
""")
await db.execute(update_query, {
"heat": heat_score,
"cluster_id": cluster_id,
})
await db.flush()
logger.info(
"Recalculated heat for cluster %s: size=%d, heat_score=%.1f",
cluster_id,
cluster_size,
heat_score,
)
return heat_score
async def cluster_desire(
desire_id: UUID,
embedding: list[float],
db: AsyncSession,
) -> dict:
"""Orchestrate clustering for a single desire.
Flow: find_nearest_cluster, then add_to_cluster + recalculate_heat
(if a match is found) or create_cluster (if no match).
Returns an observability dict:
{
"cluster_id": UUID,
"is_new": bool,
"heat_score": float,
}
"""
cluster_id, similarity = await find_nearest_cluster(embedding, db)
if cluster_id is not None:
# Join existing cluster
await add_to_cluster(cluster_id, desire_id, similarity, db)
heat_score = await recalculate_heat(cluster_id, db)
logger.info(
"Desire %s joined cluster %s (similarity=%.3f, heat=%.1f)",
desire_id,
cluster_id,
similarity,
heat_score,
)
return {
"cluster_id": cluster_id,
"is_new": False,
"heat_score": heat_score,
}
else:
# Create new single-member cluster
cluster_id = await create_cluster(desire_id, db)
logger.info(
"Desire %s started new cluster %s (heat=1.0)",
desire_id,
cluster_id,
)
return {
"cluster_id": cluster_id,
"is_new": True,
"heat_score": 1.0,
}
# ── Synchronous variants (for Celery worker context) ─────────────────────
def find_nearest_cluster_sync(
embedding: list[float],
session: Session,
threshold: float = 0.82,
) -> tuple[UUID | None, float]:
"""Sync variant of find_nearest_cluster for Celery worker context."""
max_distance = 1.0 - threshold
query = text("""
SELECT d.id AS desire_id,
(d.prompt_embedding <=> :embedding) AS distance
FROM desires d
WHERE d.prompt_embedding IS NOT NULL
AND (d.prompt_embedding <=> :embedding) <= :max_distance
ORDER BY distance ASC
LIMIT 1
""")
result = session.execute(
query,
{"embedding": str(embedding), "max_distance": max_distance},
)
row = result.first()
if row is None:
logger.debug("No nearby cluster found (threshold=%.2f)", threshold)
return (None, 0.0)
matched_desire_id = row.desire_id
similarity = 1.0 - row.distance
cluster_query = text("""
SELECT cluster_id FROM desire_clusters
WHERE desire_id = :desire_id
LIMIT 1
""")
cluster_result = session.execute(
cluster_query, {"desire_id": matched_desire_id}
)
cluster_row = cluster_result.first()
if cluster_row is None:
logger.warning(
"Desire %s is nearby (sim=%.3f) but has no cluster assignment",
matched_desire_id,
similarity,
)
return (None, 0.0)
logger.info(
"Found nearby cluster %s via desire %s (similarity=%.3f)",
cluster_row.cluster_id,
matched_desire_id,
similarity,
)
return (cluster_row.cluster_id, similarity)
def create_cluster_sync(desire_id: UUID, session: Session) -> UUID:
"""Sync variant of create_cluster for Celery worker context."""
cluster_id = uuid_mod.uuid4()
entry = DesireCluster(
cluster_id=cluster_id,
desire_id=desire_id,
similarity=1.0,
)
session.add(entry)
session.flush()
logger.info("Created new cluster %s for desire %s", cluster_id, desire_id)
return cluster_id
def add_to_cluster_sync(
cluster_id: UUID,
desire_id: UUID,
similarity: float,
session: Session,
) -> None:
"""Sync variant of add_to_cluster for Celery worker context."""
insert_query = text("""
INSERT INTO desire_clusters (cluster_id, desire_id, similarity)
VALUES (:cluster_id, :desire_id, :similarity)
ON CONFLICT (cluster_id, desire_id) DO NOTHING
""")
session.execute(insert_query, {
"cluster_id": cluster_id,
"desire_id": desire_id,
"similarity": similarity,
})
session.flush()
logger.info(
"Added desire %s to cluster %s (similarity=%.3f)",
desire_id,
cluster_id,
similarity,
)
def recalculate_heat_sync(cluster_id: UUID, session: Session) -> float:
"""Sync variant of recalculate_heat for Celery worker context."""
count_query = text("""
SELECT COUNT(*) AS cnt FROM desire_clusters
WHERE cluster_id = :cluster_id
""")
result = session.execute(count_query, {"cluster_id": cluster_id})
cluster_size = result.scalar_one()
heat_score = float(cluster_size)
update_query = text("""
UPDATE desires SET heat_score = :heat
WHERE id IN (
SELECT desire_id FROM desire_clusters
WHERE cluster_id = :cluster_id
)
""")
session.execute(update_query, {
"heat": heat_score,
"cluster_id": cluster_id,
})
session.flush()
logger.info(
"Recalculated heat for cluster %s: size=%d, heat_score=%.1f",
cluster_id,
cluster_size,
heat_score,
)
return heat_score
def cluster_desire_sync(
desire_id: UUID,
embedding: list[float],
session: Session,
) -> dict:
"""Sync orchestrator for clustering a single desire (Celery worker context).
Same flow as async cluster_desire but uses synchronous Session.
Returns:
{"cluster_id": UUID, "is_new": bool, "heat_score": float}
"""
cluster_id, similarity = find_nearest_cluster_sync(embedding, session)
if cluster_id is not None:
add_to_cluster_sync(cluster_id, desire_id, similarity, session)
heat_score = recalculate_heat_sync(cluster_id, session)
logger.info(
"Desire %s joined cluster %s (similarity=%.3f, heat=%.1f)",
desire_id,
cluster_id,
similarity,
heat_score,
)
return {
"cluster_id": cluster_id,
"is_new": False,
"heat_score": heat_score,
}
else:
cluster_id = create_cluster_sync(desire_id, session)
logger.info(
"Desire %s started new cluster %s (heat=1.0)",
desire_id,
cluster_id,
)
return {
"cluster_id": cluster_id,
"is_new": True,
"heat_score": 1.0,
}
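The join-or-create decision in both variants boils down to one comparison: cosine similarity against the 0.82 threshold (equivalently, cosine distance <= 0.18). A pure-Python sketch of that decision logic (`should_join` and the toy vectors are illustrative, standing in for the pgvector query):

```python
import math

THRESHOLD = 0.82  # similarity cutoff used by find_nearest_cluster

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def should_join(candidate: list[float], nearest: list[float]) -> bool:
    # Join the nearest desire's cluster when similarity clears the
    # threshold; otherwise start a new single-member cluster (heat = 1.0).
    return cosine_similarity(candidate, nearest) >= THRESHOLD

a = [1.0, 0.0]
assert should_join(a, [1.0, 0.0])      # identical: similarity 1.0
assert not should_join(a, [0.0, 1.0])  # orthogonal: similarity 0.0
```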

View file

@ -0,0 +1,291 @@
"""Text embedding service using TF-IDF + TruncatedSVD.
Converts desire prompt text into 512-dimensional dense vectors suitable
for pgvector cosine similarity search. Pre-seeded with shader/visual-art
domain vocabulary so the model produces meaningful vectors even from a
single short text input.
"""
import logging
import time
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
logger = logging.getLogger(__name__)
# Shader / visual-art domain seed corpus.
# Gives the TF-IDF model a vocabulary foundation so it can produce
# meaningful vectors from short creative text descriptions.
# Target: 500+ unique TF-IDF features (unigrams + bigrams) to support
# near-512 SVD components without heavy padding.
_SEED_CORPUS: list[str] = [
"particle system fluid simulation dynamics motion",
"raymarching signed distance field sdf shapes volumes",
"procedural noise fractal pattern generation recursive",
"color palette gradient blend interpolation smooth",
"audio reactive frequency spectrum visualization beat",
"ragdoll physics dark moody atmosphere heavy",
"kaleidoscope symmetry rotation mirror reflection",
"voronoi cellular texture organic growth biological",
"bloom glow post processing effect luminance",
"retro pixel art scanlines crt monitor vintage",
"geometry morphing vertex displacement deformation mesh",
"wave propagation ripple interference oscillation",
"fire smoke volumetric rendering density fog",
"crystal refraction caustics light transparency",
"terrain heightmap erosion landscape mountain valley",
"shader glitch distortion databend corruption artifact",
"feedback loop recursive transformation iteration",
"physics collision rigid body dynamics impulse",
"abstract minimal geometric composition shape",
"aurora borealis atmospheric optical phenomenon sky",
"underwater caustics god rays depth ocean",
"cyberpunk neon wireframe grid futuristic urban",
"organic growth branching lsystem tree root",
"mandelbrot julia set fractal zoom iteration",
"cloth simulation soft body drape fabric textile",
"dissolve transition threshold mask binary cutoff",
"chromatic aberration lens distortion optical warp",
"shadow mapping ambient occlusion darkness depth",
"motion blur temporal accumulation streak trail",
"boids flocking swarm emergence collective behavior",
"reaction diffusion turing pattern spots stripes",
"perlin simplex worley noise texture procedural",
"voxel rendering isometric cube block pixel",
"gravity orbital celestial mechanics planet orbit",
"psychedelic trippy color shift hue rotation",
"spiral fibonacci golden ratio mathematical curve",
"explosion debris shatter fragment destruction impact",
"rain snow weather precipitation droplet splash",
"electric lightning bolt plasma energy discharge",
"tunnel infinite corridor perspective vanishing point",
"metaball blob isosurface marching cubes implicit",
"starfield galaxy nebula cosmic space stellar",
"shadow puppet silhouette outline contour edge",
"mosaic tessellation tile pattern hexagonal grid",
"hologram iridescent spectrum rainbow interference",
"ink watercolor paint brush stroke artistic",
"sand dune desert wind erosion granular",
"ice frost frozen crystal snowflake cold",
"magma lava volcanic molten heat flow",
"laser beam scanning projection line vector",
"DNA helix molecular biology strand protein",
"circuit board electronic trace signal digital",
"camouflage pattern dithering halftone screening dots",
"waveform synthesizer oscillator modulation frequency",
"topographic contour map elevation isoline level",
"origami fold paper crease geometric angular",
"stained glass window colorful segmented panels",
"smoke ring vortex toroidal turbulence curl",
"pendulum harmonic oscillation swing periodic cycle",
"cloud formation cumulus atmospheric convection wispy",
"ripple pond surface tension concentric circular",
"decay rust corrosion entropy degradation aging",
"fiber optic strand luminous filament glow",
"prism dispersion spectral separation wavelength band",
"radar sonar ping pulse echo scanning",
"compass rose navigation cardinal directional symbol",
"clock mechanism gear cog rotation mechanical",
"barcode matrix encoding data stripe identification",
"fingerprint unique biometric whorl ridge pattern",
"maze labyrinth path algorithm recursive backtrack",
"chess checkerboard alternating square pattern grid",
"domino cascade chain sequential trigger falling",
"balloon inflation expansion pressure sphere elastic",
"ribbon flowing fabric curve spline bezier",
"confetti celebration scatter random distribution joyful",
"ember spark ignition tiny particle hot",
"bubble foam soap iridescent sphere surface tension",
"whirlpool maelstrom spinning vortex fluid drain",
"mirage shimmer heat haze atmospheric refraction",
"echo reverberation delay repetition fading diminish",
"pulse heartbeat rhythmic expanding ring concentric",
"weave interlocking thread textile warp weft",
"honeycomb hexagonal efficient packing natural structure",
"coral reef branching organic marine growth colony",
"mushroom spore cap stem fungal network mycelium",
"neuron synapse network brain signal impulse dendrite",
"constellation star connect dot line celestial chart",
"seismograph earthquake wave amplitude vibration tremor",
"aurora curtain charged particle magnetic solar wind",
"tidal wave surge ocean force gravitational pull",
"sandstorm particle erosion wind desert visibility",
"volcanic eruption ash plume pyroclastic flow magma",
]
class EmbeddingService:
"""Produces 512-dim L2-normalized vectors from text using TF-IDF + SVD."""
def __init__(self) -> None:
self._vectorizer = TfidfVectorizer(
ngram_range=(1, 2),
max_features=10_000,
stop_words="english",
)
self._svd = TruncatedSVD(n_components=512, random_state=42)
self._corpus: list[str] = list(_SEED_CORPUS)
self._fitted: bool = False
def _fit_if_needed(self) -> None:
"""Fit the vectorizer and SVD on the seed corpus if not yet fitted."""
if self._fitted:
return
tfidf_matrix = self._vectorizer.fit_transform(self._corpus)
# SVD n_components must be < number of features in the TF-IDF matrix.
# If the corpus is too small, reduce SVD components temporarily.
n_features = tfidf_matrix.shape[1]
if n_features <= self._svd.n_components:
logger.warning(
"TF-IDF produced only %d features, reducing SVD components "
"from %d to %d",
n_features,
self._svd.n_components,
n_features - 1,
)
self._svd = TruncatedSVD(
n_components=n_features - 1, random_state=42
)
self._svd.fit(tfidf_matrix)
self._fitted = True
def embed_text(self, text: str) -> list[float]:
"""Embed a single text into a 512-dim L2-normalized vector.
Args:
text: Input text to embed.
Returns:
List of 512 floats (L2-normalized).
Raises:
ValueError: If text is empty or whitespace-only.
"""
if not text or not text.strip():
raise ValueError("Cannot embed empty or whitespace-only text")
start = time.perf_counter()
self._fit_if_needed()
tfidf = self._vectorizer.transform([text])
svd_vec = self._svd.transform(tfidf)[0]
# L2 normalize — handle zero vectors from OOV text
norm = np.linalg.norm(svd_vec)
if norm > 1e-10:
svd_vec = svd_vec / norm
else:
# Text produced no recognized vocabulary: generate a deterministic
# low-magnitude vector from a stable digest of the text so the
# vector is non-zero but won't cluster with anything. (The builtin
# hash() is salted per process, so it is not stable across runs.)
logger.warning(
"Text produced zero TF-IDF vector (no recognized vocabulary): "
"'%s'",
text[:80],
)
import hashlib  # local import: this is a rare fallback path
seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
rng = np.random.RandomState(seed)
svd_vec = rng.randn(len(svd_vec))
svd_vec = svd_vec / np.linalg.norm(svd_vec)
# Pad to 512 if SVD produced fewer components
if len(svd_vec) < 512:
padded = np.zeros(512)
padded[: len(svd_vec)] = svd_vec
# Re-normalize after padding
pad_norm = np.linalg.norm(padded)
if pad_norm > 0:
padded = padded / pad_norm
svd_vec = padded
elapsed_ms = (time.perf_counter() - start) * 1000
logger.info(
"Embedded text (%d chars) → %d-dim vector in %.1fms",
len(text),
len(svd_vec),
elapsed_ms,
)
return svd_vec.tolist()
def embed_batch(self, texts: list[str]) -> list[list[float]]:
"""Embed multiple texts at once.
Args:
texts: List of input texts.
Returns:
List of 512-dim L2-normalized float lists.
Raises:
ValueError: If any text is empty or whitespace-only.
"""
for i, text in enumerate(texts):
if not text or not text.strip():
raise ValueError(
f"Cannot embed empty or whitespace-only text at index {i}"
)
start = time.perf_counter()
self._fit_if_needed()
tfidf = self._vectorizer.transform(texts)
svd_vecs = self._svd.transform(tfidf)
results: list[list[float]] = []
for idx, vec in enumerate(svd_vecs):
norm = np.linalg.norm(vec)
if norm > 1e-10:
vec = vec / norm
else:
logger.warning(
"Text at index %d produced zero TF-IDF vector: '%s'",
idx,
texts[idx][:80],
)
import hashlib  # stable digest; builtin hash() is salted per process
seed = int.from_bytes(hashlib.sha256(texts[idx].encode()).digest()[:4], "big")
rng = np.random.RandomState(seed)
vec = rng.randn(len(vec))
vec = vec / np.linalg.norm(vec)
if len(vec) < 512:
padded = np.zeros(512)
padded[: len(vec)] = vec
pad_norm = np.linalg.norm(padded)
if pad_norm > 0:
padded = padded / pad_norm
vec = padded
results.append(vec.tolist())
elapsed_ms = (time.perf_counter() - start) * 1000
logger.info(
"Batch-embedded %d texts → %d-dim vectors in %.1fms",
len(texts),
512,
elapsed_ms,
)
return results
# Module-level singleton
embedding_service = EmbeddingService()
def embed_text(text: str) -> list[float]:
"""Embed a single text into a 512-dim normalized vector.
Convenience wrapper around the module-level EmbeddingService singleton.
"""
return embedding_service.embed_text(text)
def embed_batch(texts: list[str]) -> list[list[float]]:
"""Embed multiple texts into 512-dim normalized vectors.
Convenience wrapper around the module-level EmbeddingService singleton.
"""
return embedding_service.embed_batch(texts)
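The pad-then-renormalize step relies on a simple invariant: zero-padding never changes a vector's norm, so dividing by the padded norm always yields a unit vector. A minimal standalone sketch of that step (the helper name `pad_to_dim` is hypothetical, not part of the service):

```python
import numpy as np

def pad_to_dim(vec: np.ndarray, dim: int = 512) -> np.ndarray:
    """Zero-pad to `dim` and re-normalize, mirroring the service's
    handling of SVD outputs with fewer than 512 components."""
    if len(vec) >= dim:
        return vec
    padded = np.zeros(dim)
    padded[: len(vec)] = vec
    norm = np.linalg.norm(padded)
    # Guard against the all-zero vector, as the service does.
    return padded / norm if norm > 0 else padded

out = pad_to_dim(np.array([3.0, 4.0]))
```

Padding a 2-dim vector up to 512 dims leaves a unit-length result regardless of the input's original scale.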

View file

@@ -0,0 +1,123 @@
"""GLSL Validator — validates shader code before rendering.

Uses basic static analysis. In production, this would shell out to
glslangValidator for full Khronos reference compilation. For now, it
performs structural checks that catch the most common issues.
"""
import re
from dataclasses import dataclass


@dataclass
class ValidationResult:
    valid: bool
    errors: list[str]
    warnings: list[str]


# Extensions that are banned (GPU-specific, compute shaders, etc.)
BANNED_EXTENSIONS = {
    "GL_ARB_compute_shader",
    "GL_NV_gpu_shader5",
    "GL_NV_shader_atomic_float",
    "GL_NV_shader_atomic_int64",
    "GL_EXT_shader_image_load_store",
}

# Patterns that suggest infinite loops or excessive iteration
DANGEROUS_PATTERNS = [
    (r"for\s*\(\s*;\s*;\s*\)", "Infinite for-loop detected"),
    (r"while\s*\(\s*true\s*\)", "Infinite while-loop detected"),
    (r"while\s*\(\s*1\s*\)", "Infinite while-loop detected"),
]

# Maximum allowed loop iterations (heuristic check)
MAX_LOOP_ITERATIONS = 1024


def validate_glsl(code: str, shader_type: str = "2d") -> ValidationResult:
    """Validate GLSL fragment shader code.

    Checks:
    1. Required entry point exists (mainImage or main)
    2. Output parameter is written in the shader body
    3. No banned extensions
    4. No obvious infinite loops
    5. Reasonable code length
    """
    errors = []
    warnings = []

    # Basic sanity
    if not code or len(code.strip()) < 20:
        errors.append("Shader code is too short to be valid")
        return ValidationResult(valid=False, errors=errors, warnings=warnings)
    if len(code) > 100_000:
        errors.append("Shader code exceeds 100KB limit")
        return ValidationResult(valid=False, errors=errors, warnings=warnings)

    # Must have mainImage entry point (Shadertoy format)
    has_main_image = bool(re.search(
        r"void\s+mainImage\s*\(\s*out\s+vec4\s+\w+\s*,\s*in\s+vec2\s+\w+\s*\)",
        code
    ))
    has_main = bool(re.search(r"void\s+main\s*\(\s*\)", code))
    if not has_main_image and not has_main:
        errors.append(
            "Missing entry point: expected 'void mainImage(out vec4 fragColor, in vec2 fragCoord)' "
            "or 'void main()'"
        )

    # Check that the output parameter is written in the shader body.
    # The out param can be named anything (fragColor is only conventional),
    # so search past the signature — otherwise the declaration itself would
    # always satisfy the containment check and the warning could never fire.
    if has_main_image:
        main_match = re.search(r"void\s+mainImage\s*\(\s*out\s+vec4\s+(\w+)", code)
        if main_match:
            out_name = main_match.group(1)
            body = code[main_match.end():]
            brace = body.find("{")
            body = body[brace + 1:] if brace != -1 else body
            if not re.search(rf"\b{re.escape(out_name)}\b", body):
                warnings.append(f"Output parameter '{out_name}' may not be written to")

    # Check for banned extensions
    for ext in BANNED_EXTENSIONS:
        if ext in code:
            errors.append(f"Banned extension: {ext}")

    # Check for dangerous patterns
    for pattern, message in DANGEROUS_PATTERNS:
        if re.search(pattern, code):
            errors.append(message)

    # Check for unreasonably large loop bounds
    for match in re.finditer(r"for\s*\([^;]*;\s*\w+\s*<\s*(\d+)", code):
        bound = int(match.group(1))
        if bound > MAX_LOOP_ITERATIONS:
            warnings.append(
                f"Loop with {bound} iterations may be too expensive for real-time rendering "
                f"(recommended max: {MAX_LOOP_ITERATIONS})"
            )

    # Check #extension directives
    for match in re.finditer(r"#extension\s+(\w+)", code):
        ext_name = match.group(1)
        if ext_name in BANNED_EXTENSIONS:
            errors.append(f"Banned extension directive: #extension {ext_name}")

    # Balanced braces check
    open_braces = code.count("{")
    close_braces = code.count("}")
    if open_braces != close_braces:
        errors.append(f"Unbalanced braces: {open_braces} opening vs {close_braces} closing")

    return ValidationResult(
        valid=len(errors) == 0,
        errors=errors,
        warnings=warnings,
    )
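A quick sanity run of the two loop heuristics, using the same regexes against an illustrative (hypothetical) shader string — the infinite-loop patterns flag `while (true)`, and the bound regex extracts the over-budget loop count:

```python
import re

DANGEROUS_PATTERNS = [
    (r"for\s*\(\s*;\s*;\s*\)", "Infinite for-loop detected"),
    (r"while\s*\(\s*true\s*\)", "Infinite while-loop detected"),
    (r"while\s*\(\s*1\s*\)", "Infinite while-loop detected"),
]
MAX_LOOP_ITERATIONS = 1024

code = """
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    while (true) { }                    // infinite
    for (int i = 0; i < 4096; i++) { }  // over budget
}
"""

# Which dangerous patterns fire on this shader
hits = [msg for pat, msg in DANGEROUS_PATTERNS if re.search(pat, code)]

# Loop bounds that exceed the heuristic budget
big_bounds = [
    int(m.group(1))
    for m in re.finditer(r"for\s*\([^;]*;\s*\w+\s*<\s*(\d+)", code)
    if int(m.group(1)) > MAX_LOOP_ITERATIONS
]
```

Note these are heuristics: a loop bound held in a `const int` variable would slip past the literal-only regex.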

View file

@@ -0,0 +1,73 @@
"""Renderer client — communicates with the headless Chromium renderer service."""
import httpx
from dataclasses import dataclass
from typing import Optional

from app.config import get_settings


@dataclass
class RenderResult:
    success: bool
    thumbnail_url: Optional[str] = None
    preview_url: Optional[str] = None
    duration_ms: Optional[int] = None
    error: Optional[str] = None


async def render_shader(
    glsl_code: str,
    shader_id: str,
    duration: int = 5,
    width: int = 640,
    height: int = 360,
    fps: int = 30,
) -> RenderResult:
    """
    Submit GLSL code to the renderer service for thumbnail + preview generation.

    Args:
        glsl_code: Complete GLSL fragment shader
        shader_id: UUID for organizing output files
        duration: Seconds to render
        width: Output width in pixels
        height: Output height in pixels
        fps: Frames per second for video preview
    """
    settings = get_settings()
    try:
        async with httpx.AsyncClient(timeout=30.0) as client:
            resp = await client.post(
                f"{settings.renderer_url}/render",
                json={
                    "glsl": glsl_code,
                    "shader_id": shader_id,
                    "duration": duration,
                    "width": width,
                    "height": height,
                    "fps": fps,
                },
            )
            if resp.status_code == 200:
                data = resp.json()
                return RenderResult(
                    success=True,
                    thumbnail_url=data.get("thumbnail_url"),
                    preview_url=data.get("preview_url"),
                    duration_ms=data.get("duration_ms"),
                )
            else:
                data = (
                    resp.json()
                    if resp.headers.get("content-type", "").startswith("application/json")
                    else {}
                )
                return RenderResult(
                    success=False,
                    error=data.get("error", f"Renderer returned status {resp.status_code}"),
                )
    except httpx.TimeoutException:
        return RenderResult(success=False, error="Renderer timed out after 30s")
    except httpx.ConnectError:
        return RenderResult(success=False, error="Could not connect to renderer service")
    except Exception as e:
        return RenderResult(success=False, error=f"Renderer error: {str(e)}")
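The non-200 branch boils down to a small mapping from status code and response body to a `RenderResult`. A standalone sketch of just that mapping (`classify_failure` is a hypothetical helper, not part of the client):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RenderResult:
    success: bool
    thumbnail_url: Optional[str] = None
    preview_url: Optional[str] = None
    error: Optional[str] = None

def classify_failure(status_code: int, body: dict) -> RenderResult:
    # Prefer the renderer's own error message; fall back to the status code.
    return RenderResult(
        success=False,
        error=body.get("error", f"Renderer returned status {status_code}"),
    )

r1 = classify_failure(503, {})  # no JSON body → generic status message
r2 = classify_failure(422, {"error": "compile failed: undeclared identifier"})
```

Keeping failures as data rather than exceptions lets the Celery caller decide whether a failure is retryable.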

View file

@@ -1,8 +1,15 @@
"""Fractafrag — Celery worker configuration."""
import logging
import time
import uuid as uuid_mod
from celery import Celery
import os
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
redis_url = os.environ.get("REDIS_URL", "redis://redis:6379/0")
celery_app = Celery(
@@ -28,6 +35,19 @@ celery_app.conf.update(
celery_app.autodiscover_tasks(["app.worker"])
# ── Sync DB session factory for worker tasks ──────────────
def _get_sync_session_factory():
"""Lazy-init sync session factory using settings.database_url_sync."""
from app.config import get_settings
settings = get_settings()
engine = create_engine(settings.database_url_sync, pool_pre_ping=True)
return sessionmaker(bind=engine)
logger = logging.getLogger(__name__)
# ── Task Definitions ──────────────────────────────────────
@celery_app.task(name="render_shader", bind=True, max_retries=2)
@@ -48,11 +68,77 @@ def embed_shader(self, shader_id: str):
pass
@celery_app.task(name="process_desire", bind=True)
@celery_app.task(name="process_desire", bind=True, max_retries=3)
def process_desire(self, desire_id: str):
"""Process a new desire: embed, cluster, optionally auto-fulfill. (Track G)"""
# TODO: Implement in Track G
pass
"""Process a new desire: embed text, store embedding, cluster, update heat.
Flow:
1. Load desire from DB by id
2. Embed prompt_text into a 512-dim vector
3. Store embedding on desire row
4. Run sync clustering (find nearest or create new cluster)
5. Commit all changes
On transient DB errors, retries up to 3 times with 30s backoff.
On success, logs desire_id, cluster_id, heat_score, and elapsed_ms.
On failure, desire keeps prompt_embedding=NULL and heat_score=1.0.
"""
start = time.perf_counter()
desire_uuid = uuid_mod.UUID(desire_id)
SessionFactory = _get_sync_session_factory()
session = SessionFactory()
try:
from app.models.models import Desire
from app.services.embedding import embed_text
from app.services.clustering import cluster_desire_sync
# 1. Load desire
desire = session.get(Desire, desire_uuid)
if desire is None:
logger.warning(
"process_desire: desire %s not found, skipping", desire_id
)
return
# 2. Embed prompt text
embedding = embed_text(desire.prompt_text)
# 3. Store embedding on desire
desire.prompt_embedding = embedding
session.flush()
# 4. Cluster
cluster_result = cluster_desire_sync(desire.id, embedding, session)
# 5. Commit
session.commit()
elapsed_ms = (time.perf_counter() - start) * 1000
logger.info(
"process_desire completed: desire_id=%s cluster_id=%s "
"is_new=%s heat_score=%.1f elapsed_ms=%.1f",
desire_id,
cluster_result["cluster_id"],
cluster_result["is_new"],
cluster_result["heat_score"],
elapsed_ms,
)
except Exception as exc:
session.rollback()
elapsed_ms = (time.perf_counter() - start) * 1000
logger.error(
"process_desire failed: desire_id=%s error=%s elapsed_ms=%.1f",
desire_id,
str(exc),
elapsed_ms,
)
raise self.retry(exc=exc, countdown=30)
finally:
session.close()
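Stripped of Celery and SQLAlchemy, the task's happy path is a linear load → embed → store → cluster → commit sequence. A dependency-injected sketch of that flow (all callables here are hypothetical stand-ins, not the real services):

```python
def process_desire_flow(desire_id, load, embed, cluster, commit):
    """Happy-path skeleton of process_desire: each side effect is injected."""
    desire = load(desire_id)                      # 1. load desire
    if desire is None:
        return None                               # missing row: skip, as the task does
    embedding = embed(desire["prompt_text"])      # 2. embed prompt text
    desire["prompt_embedding"] = embedding        # 3. store embedding on the row
    result = cluster(desire_id, embedding)        # 4. cluster (nearest or new)
    commit()                                      # 5. commit all changes
    return result

out = process_desire_flow(
    "d-1",
    load=lambda i: {"prompt_text": "ragdoll physics dark moody"},
    embed=lambda t: [0.0] * 512,
    cluster=lambda i, e: {"cluster_id": "c-1", "is_new": True, "heat_score": 1.0},
    commit=lambda: None,
)
```

Factoring the flow this way is also what makes the mocked-session unit tests further down practical: each numbered step can be replaced independently.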
@celery_app.task(name="ai_generate", bind=True, max_retries=3)

View file

@@ -13,15 +13,18 @@ dependencies = [
"alembic>=1.14.0",
"pydantic>=2.10.0",
"pydantic-settings>=2.7.0",
"email-validator>=2.2.0",
"pgvector>=0.3.6",
"redis>=5.2.0",
"celery[redis]>=5.4.0",
"passlib[bcrypt]>=1.7.4",
"bcrypt>=4.2.0",
"python-jose[cryptography]>=3.3.0",
"cryptography>=43.0.0",
"httpx>=0.28.0",
"python-multipart>=0.0.12",
"stripe>=11.0.0",
"numpy>=2.0.0",
"scikit-learn>=1.4",
]
[project.optional-dependencies]
@@ -30,4 +33,5 @@ dev = [
"pytest-asyncio>=0.24.0",
"httpx>=0.28.0",
"ruff>=0.8.0",
"aiosqlite>=0.20.0",
]

View file

@@ -0,0 +1 @@
"""Tests package for fractafrag-api."""

View file

@@ -0,0 +1,190 @@
"""Pytest configuration and shared fixtures for fractafrag-api tests.
Integration test infrastructure:
- Async SQLite in-memory database (via aiosqlite)
- FastAPI test client with dependency overrides
- Auth dependency overrides (mock pro-tier user)
- Celery worker mock (process_desire.delay no-op)
Environment variables are set BEFORE any app.* imports to ensure
get_settings() picks up test values (database.py calls get_settings()
at module scope with @lru_cache).
"""
import os
import sys
import uuid
from pathlib import Path
from unittest.mock import MagicMock, patch
# ── 1. sys.path setup ─────────────────────────────────────
_api_root = str(Path(__file__).resolve().parent.parent)
if _api_root not in sys.path:
sys.path.insert(0, _api_root)
# ── 2. Set env vars BEFORE any app.* imports ──────────────
# We do NOT override DATABASE_URL — the module-level engine in database.py
# uses pool_size/max_overflow which are PostgreSQL-specific. The default
# PostgreSQL URL creates an engine that never actually connects (no queries
# hit it). Our integration tests override get_db with a test SQLite session.
# We only set dummy values for env vars that cause validation failures.
os.environ.setdefault("JWT_SECRET", "test-secret")
os.environ.setdefault("REDIS_URL", "redis://localhost:6379/0")
os.environ.setdefault("BYOK_MASTER_KEY", "test-master-key-0123456789abcdef")
# ── 3. Now safe to import app modules ─────────────────────
import pytest # noqa: E402
import pytest_asyncio # noqa: E402
from httpx import ASGITransport, AsyncClient # noqa: E402
from sqlalchemy import event, text # noqa: E402
from sqlalchemy.ext.asyncio import ( # noqa: E402
AsyncSession,
async_sessionmaker,
create_async_engine,
)
from sqlalchemy.ext.compiler import compiles # noqa: E402
from pgvector.sqlalchemy import Vector # noqa: E402
from sqlalchemy.dialects.postgresql import UUID as PG_UUID, JSONB, ARRAY # noqa: E402
from app.database import Base, get_db # noqa: E402
from app.main import app # noqa: E402
from app.middleware.auth import get_current_user, require_tier # noqa: E402
from app.models.models import User # noqa: E402
# ── 4. SQLite type compilation overrides ──────────────────
# pgvector Vector, PostgreSQL UUID, JSONB, and ARRAY don't exist in SQLite.
# Register custom compilation rules so create_all() works.
@compiles(Vector, "sqlite")
def _compile_vector_sqlite(type_, compiler, **kw):
"""Render pgvector Vector as TEXT in SQLite."""
return "TEXT"
# Override PostgreSQL UUID to TEXT for SQLite
@compiles(PG_UUID, "sqlite")
def _compile_pg_uuid_sqlite(type_, compiler, **kw):
"""Render PostgreSQL UUID as TEXT in SQLite (standard UUID is fine, dialect-specific isn't)."""
return "TEXT"
# Override JSONB to TEXT for SQLite
@compiles(JSONB, "sqlite")
def _compile_jsonb_sqlite(type_, compiler, **kw):
"""Render JSONB as TEXT in SQLite."""
return "TEXT"
# Override ARRAY to TEXT for SQLite
@compiles(ARRAY, "sqlite")
def _compile_array_sqlite(type_, compiler, **kw):
"""Render PostgreSQL ARRAY as TEXT in SQLite."""
return "TEXT"
# Register Python uuid.UUID as a SQLite adapter so raw text() queries
# can bind UUID parameters without "type 'UUID' is not supported" errors.
# IMPORTANT: Use .hex (no hyphens) to match SQLAlchemy's UUID storage format in SQLite.
# Also register list adapter so ARRAY columns (compiled as TEXT in SQLite)
# can bind Python lists without "type 'list' is not supported" errors.
import json as _json # noqa: E402
import sqlite3 # noqa: E402
sqlite3.register_adapter(uuid.UUID, lambda u: u.hex)
sqlite3.register_adapter(list, lambda lst: _json.dumps(lst))
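The two adapters can be verified with a throwaway in-memory connection — a `uuid.UUID` parameter round-trips as 32-char hex (no hyphens, matching SQLAlchemy's SQLite storage format), and a list round-trips as JSON text:

```python
import json
import sqlite3
import uuid

# Same adapters as registered above
sqlite3.register_adapter(uuid.UUID, lambda u: u.hex)
sqlite3.register_adapter(list, lambda lst: json.dumps(lst))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id TEXT, tags TEXT)")
u = uuid.uuid4()
# Bind a UUID and a list directly — without the adapters both raise
# "type ... is not supported" ProgrammingError.
conn.execute("INSERT INTO t VALUES (?, ?)", (u, ["glow", "fractal"]))
row = conn.execute("SELECT id, tags FROM t").fetchone()
```

`row[0]` holds the hyphen-free hex form, so raw `text()` comparisons against ORM-written UUID columns match.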
# ── 5. Test database engine and session fixtures ──────────
# Shared test user ID — consistent across all integration tests
TEST_USER_ID = uuid.UUID("aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa")
@pytest_asyncio.fixture(scope="session")
async def db_engine():
"""Create an async SQLite engine and all tables. Session-scoped."""
engine = create_async_engine(
"sqlite+aiosqlite://",
echo=False,
# SQLite doesn't support pool_size/max_overflow
)
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)
yield engine
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.drop_all)
await engine.dispose()
@pytest_asyncio.fixture
async def db_session(db_engine):
    """Yield a fresh AsyncSession per test, rolling back afterwards for isolation."""
    session_factory = async_sessionmaker(
        db_engine, class_=AsyncSession, expire_on_commit=False
    )
    async with session_factory() as session:
        yield session
        # Roll back whatever the test left open. Note that wrapping the yield
        # in `async with session.begin()` would COMMIT on clean exit, so the
        # explicit rollback here is what keeps uncommitted state from leaking
        # between tests on the session-scoped engine.
        await session.rollback()
# ── 6. Mock user fixture ─────────────────────────────────
@pytest.fixture
def test_user():
"""Return a mock User object for auth dependency overrides."""
user = MagicMock(spec=User)
user.id = TEST_USER_ID
user.username = "testuser"
user.email = "testuser@test.com"
user.role = "user"
user.subscription_tier = "pro"
user.is_system = False
user.trust_tier = "standard"
return user
# ── 7. FastAPI test client fixture ────────────────────────
@pytest_asyncio.fixture
async def client(db_session, test_user):
"""Async HTTP client wired to the FastAPI app with dependency overrides.
Overrides:
- get_db yields the test db_session
- get_current_user returns test_user (pro tier)
- require_tier returns test_user unconditionally (bypasses tier check)
- process_desire.delay no-op (prevents Celery/Redis connection)
"""
# Override get_db to yield test session
async def _override_get_db():
yield db_session
# Override get_current_user to return mock user
async def _override_get_current_user():
return test_user
app.dependency_overrides[get_db] = _override_get_db
app.dependency_overrides[get_current_user] = _override_get_current_user
# require_tier is a factory that returns inner functions depending on
# get_current_user. Since we override get_current_user to return a pro-tier
# user, the tier check inside require_tier will pass naturally.
# We still need to mock process_desire to prevent Celery/Redis connection.
with patch("app.worker.process_desire") as mock_task:
# process_desire.delay() should be a no-op
mock_task.delay = MagicMock(return_value=None)
transport = ASGITransport(app=app)
async with AsyncClient(transport=transport, base_url="http://test") as ac:
yield ac
# Clean up dependency overrides after test
app.dependency_overrides.clear()
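FastAPI's `dependency_overrides` is just a dict keyed by the original dependency callable; when resolving a dependency, the app checks the dict first and calls the override if one is registered. The lookup pattern can be sketched without FastAPI (the `resolve` helper is a hypothetical stand-in for FastAPI's internal resolution, and the return strings are illustrative):

```python
overrides: dict = {}

def resolve(dependency):
    # Mimics FastAPI's behavior: use the registered override if present,
    # otherwise call the real dependency.
    return overrides.get(dependency, dependency)()

def get_db():
    return "real-postgres-session"

def fake_db():
    return "test-sqlite-session"

# Register an override, exactly as app.dependency_overrides[get_db] = ... does
overrides[get_db] = fake_db
swapped = resolve(get_db)

# Clearing the dict restores the real dependency — why the fixture ends
# with app.dependency_overrides.clear()
overrides.clear()
restored = resolve(get_db)
```

This is also why overrides must be keyed by the *same* callable object the route declared — a re-imported copy of the function would miss the dict lookup.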

View file

@@ -0,0 +1,366 @@
"""Unit tests for the clustering service.
Tests use mocked async DB sessions to isolate clustering logic from
pgvector and database concerns. Synthetic 512-dim vectors verify the
service's orchestration, heat calculation, and threshold behavior.
"""
import uuid
from unittest.mock import AsyncMock, MagicMock, patch
import pytest
from app.models.models import DesireCluster
from app.services.clustering import (
add_to_cluster,
cluster_desire,
create_cluster,
find_nearest_cluster,
recalculate_heat,
)
# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------
def _make_embedding(dim: int = 512) -> list[float]:
"""Create a synthetic embedding vector for testing."""
import numpy as np
rng = np.random.default_rng(42)
vec = rng.standard_normal(dim)
vec = vec / np.linalg.norm(vec)
return vec.tolist()
def _mock_result_row(**kwargs):
"""Create a mock DB result row with named attributes."""
row = MagicMock()
for key, value in kwargs.items():
setattr(row, key, value)
return row
# ---------------------------------------------------------------------------
# Tests: cluster_desire orchestration
# ---------------------------------------------------------------------------
class TestClusterDesireOrchestration:
"""Test the main cluster_desire orchestrator with mocked sub-functions."""
@pytest.mark.asyncio
@patch("app.services.clustering.find_nearest_cluster", new_callable=AsyncMock)
@patch("app.services.clustering.create_cluster", new_callable=AsyncMock)
async def test_new_desire_creates_own_cluster(
self, mock_create, mock_find
) -> None:
"""When no nearby cluster exists, create a new one."""
new_cluster_id = uuid.uuid4()
desire_id = uuid.uuid4()
embedding = _make_embedding()
mock_find.return_value = (None, 0.0)
mock_create.return_value = new_cluster_id
db = AsyncMock()
result = await cluster_desire(desire_id, embedding, db)
mock_find.assert_awaited_once_with(embedding, db)
mock_create.assert_awaited_once_with(desire_id, db)
assert result["is_new"] is True
assert result["cluster_id"] == new_cluster_id
assert result["heat_score"] == 1.0
@pytest.mark.asyncio
@patch("app.services.clustering.find_nearest_cluster", new_callable=AsyncMock)
@patch("app.services.clustering.add_to_cluster", new_callable=AsyncMock)
@patch("app.services.clustering.recalculate_heat", new_callable=AsyncMock)
async def test_similar_desire_joins_existing_cluster(
self, mock_recalc, mock_add, mock_find
) -> None:
"""When a nearby cluster is found, join it and recalculate heat."""
existing_cluster_id = uuid.uuid4()
desire_id = uuid.uuid4()
embedding = _make_embedding()
similarity = 0.92
mock_find.return_value = (existing_cluster_id, similarity)
mock_recalc.return_value = 3.0
db = AsyncMock()
result = await cluster_desire(desire_id, embedding, db)
mock_find.assert_awaited_once_with(embedding, db)
mock_add.assert_awaited_once_with(
existing_cluster_id, desire_id, similarity, db
)
mock_recalc.assert_awaited_once_with(existing_cluster_id, db)
assert result["is_new"] is False
assert result["cluster_id"] == existing_cluster_id
assert result["heat_score"] == 3.0
@pytest.mark.asyncio
@patch("app.services.clustering.find_nearest_cluster", new_callable=AsyncMock)
@patch("app.services.clustering.create_cluster", new_callable=AsyncMock)
async def test_cluster_desire_returns_observability_dict(
self, mock_create, mock_find
) -> None:
"""Returned dict always has cluster_id, is_new, heat_score."""
cluster_id = uuid.uuid4()
mock_find.return_value = (None, 0.0)
mock_create.return_value = cluster_id
db = AsyncMock()
result = await cluster_desire(uuid.uuid4(), _make_embedding(), db)
assert "cluster_id" in result
assert "is_new" in result
assert "heat_score" in result
assert isinstance(result["cluster_id"], uuid.UUID)
assert isinstance(result["is_new"], bool)
assert isinstance(result["heat_score"], float)
# ---------------------------------------------------------------------------
# Tests: recalculate_heat
# ---------------------------------------------------------------------------
class TestRecalculateHeat:
"""Test heat score recalculation with mocked DB results."""
@pytest.mark.asyncio
async def test_heat_scales_with_cluster_size(self) -> None:
"""Heat score should equal cluster size (linear scaling)."""
cluster_id = uuid.uuid4()
db = AsyncMock()
# First call: COUNT(*) returns 3
count_result = MagicMock()
count_result.scalar_one.return_value = 3
# Second call: UPDATE (no return value needed)
update_result = MagicMock()
db.execute = AsyncMock(side_effect=[count_result, update_result])
heat = await recalculate_heat(cluster_id, db)
assert heat == 3.0
assert db.execute.await_count == 2
assert db.flush.await_count >= 1
@pytest.mark.asyncio
async def test_heat_for_single_member_cluster(self) -> None:
"""A single-member cluster should have heat_score = 1.0."""
cluster_id = uuid.uuid4()
db = AsyncMock()
count_result = MagicMock()
count_result.scalar_one.return_value = 1
update_result = MagicMock()
db.execute = AsyncMock(side_effect=[count_result, update_result])
heat = await recalculate_heat(cluster_id, db)
assert heat == 1.0
@pytest.mark.asyncio
async def test_heat_for_large_cluster(self) -> None:
"""Heat scales to large cluster sizes."""
cluster_id = uuid.uuid4()
db = AsyncMock()
count_result = MagicMock()
count_result.scalar_one.return_value = 15
update_result = MagicMock()
db.execute = AsyncMock(side_effect=[count_result, update_result])
heat = await recalculate_heat(cluster_id, db)
assert heat == 15.0
# ---------------------------------------------------------------------------
# Tests: find_nearest_cluster
# ---------------------------------------------------------------------------
class TestFindNearestCluster:
"""Test pgvector distance query with mocked DB results."""
@pytest.mark.asyncio
async def test_empty_db_returns_none(self) -> None:
"""No desires with embeddings → no cluster match."""
db = AsyncMock()
# Query returns no rows
empty_result = MagicMock()
empty_result.first.return_value = None
db.execute = AsyncMock(return_value=empty_result)
cluster_id, similarity = await find_nearest_cluster(
_make_embedding(), db
)
assert cluster_id is None
assert similarity == 0.0
@pytest.mark.asyncio
async def test_match_found_with_cluster(self) -> None:
"""A desire within threshold that has a cluster → returns cluster."""
desire_id = uuid.uuid4()
cluster_id = uuid.uuid4()
db = AsyncMock()
# First query: find nearest desire (distance = 0.08 → similarity = 0.92)
desire_row = _mock_result_row(desire_id=desire_id, distance=0.08)
desire_result = MagicMock()
desire_result.first.return_value = desire_row
# Second query: cluster lookup
cluster_row = _mock_result_row(cluster_id=cluster_id)
cluster_result = MagicMock()
cluster_result.first.return_value = cluster_row
db.execute = AsyncMock(side_effect=[desire_result, cluster_result])
found_id, sim = await find_nearest_cluster(_make_embedding(), db)
assert found_id == cluster_id
assert abs(sim - 0.92) < 1e-6
@pytest.mark.asyncio
async def test_match_found_without_cluster(self) -> None:
"""A nearby desire that has no cluster entry → returns None."""
desire_id = uuid.uuid4()
db = AsyncMock()
# First query: find nearest desire
desire_row = _mock_result_row(desire_id=desire_id, distance=0.10)
desire_result = MagicMock()
desire_result.first.return_value = desire_row
# Second query: cluster lookup returns nothing
cluster_result = MagicMock()
cluster_result.first.return_value = None
db.execute = AsyncMock(side_effect=[desire_result, cluster_result])
found_id, sim = await find_nearest_cluster(_make_embedding(), db)
assert found_id is None
assert sim == 0.0
@pytest.mark.asyncio
async def test_threshold_boundary_at_0_82(self) -> None:
"""Threshold of 0.82 means max distance of 0.18.
A desire at exactly distance=0.18 (similarity=0.82) should be
returned by the SQL query (distance <= 0.18).
"""
desire_id = uuid.uuid4()
cluster_id = uuid.uuid4()
db = AsyncMock()
# Exactly at boundary: distance = 0.18 → similarity = 0.82
desire_row = _mock_result_row(desire_id=desire_id, distance=0.18)
desire_result = MagicMock()
desire_result.first.return_value = desire_row
cluster_row = _mock_result_row(cluster_id=cluster_id)
cluster_result = MagicMock()
cluster_result.first.return_value = cluster_row
db.execute = AsyncMock(side_effect=[desire_result, cluster_result])
found_id, sim = await find_nearest_cluster(
_make_embedding(), db, threshold=0.82
)
assert found_id == cluster_id
assert abs(sim - 0.82) < 1e-6
@pytest.mark.asyncio
async def test_below_threshold_returns_none(self) -> None:
"""A desire beyond the distance threshold is not returned by SQL.
With threshold=0.82 (max_distance=0.18), a desire at distance=0.19
(similarity=0.81) would be filtered out by the WHERE clause.
The mock simulates this by returning no rows.
"""
db = AsyncMock()
# SQL filters it out → no rows
empty_result = MagicMock()
empty_result.first.return_value = None
db.execute = AsyncMock(return_value=empty_result)
found_id, sim = await find_nearest_cluster(
_make_embedding(), db, threshold=0.82
)
assert found_id is None
assert sim == 0.0
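The threshold arithmetic these tests rely on is just the complement relation between cosine distance (what pgvector's `<=>` operator returns) and cosine similarity. A minimal sketch of the conversion (helper names are illustrative, not part of the service):

```python
def similarity_from_distance(cosine_distance: float) -> float:
    # pgvector's <=> yields cosine distance; similarity is its complement.
    return 1.0 - cosine_distance

def max_distance_for_threshold(threshold: float) -> float:
    # A similarity threshold of 0.82 admits distances up to 0.18,
    # which is the bound the SQL WHERE clause applies.
    return 1.0 - threshold
```

So a row at distance 0.08 reports similarity 0.92, and the `threshold=0.82` boundary case corresponds exactly to distance 0.18.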
# ---------------------------------------------------------------------------
# Tests: create_cluster
# ---------------------------------------------------------------------------
class TestCreateCluster:
"""Test cluster creation."""
@pytest.mark.asyncio
async def test_create_cluster_returns_uuid(self) -> None:
"""New cluster gets a valid UUID."""
db = AsyncMock()
db.add = MagicMock() # Session.add() is synchronous
desire_id = uuid.uuid4()
cluster_id = await create_cluster(desire_id, db)
assert isinstance(cluster_id, uuid.UUID)
db.add.assert_called_once()
db.flush.assert_awaited_once()
@pytest.mark.asyncio
async def test_create_cluster_adds_desire_cluster_row(self) -> None:
"""The DesireCluster row has similarity=1.0 (self-reference)."""
db = AsyncMock()
db.add = MagicMock() # Session.add() is synchronous
desire_id = uuid.uuid4()
cluster_id = await create_cluster(desire_id, db)
added_obj = db.add.call_args[0][0]
assert isinstance(added_obj, DesireCluster)
assert added_obj.cluster_id == cluster_id
assert added_obj.desire_id == desire_id
assert added_obj.similarity == 1.0
# ---------------------------------------------------------------------------
# Tests: add_to_cluster
# ---------------------------------------------------------------------------
class TestAddToCluster:
"""Test adding a desire to an existing cluster."""
@pytest.mark.asyncio
async def test_add_to_cluster_executes_insert(self) -> None:
"""Insert is executed and flushed."""
db = AsyncMock()
cluster_id = uuid.uuid4()
desire_id = uuid.uuid4()
await add_to_cluster(cluster_id, desire_id, 0.91, db)
db.execute.assert_awaited_once()
db.flush.assert_awaited()
# Verify the parameters passed to execute
call_kwargs = db.execute.call_args[0][1]
assert call_kwargs["cluster_id"] == cluster_id
assert call_kwargs["desire_id"] == desire_id
assert call_kwargs["similarity"] == 0.91

View file

@@ -0,0 +1,250 @@
"""Pipeline integration tests — embed → cluster → heat.
Proves the full desire processing pipeline works end-to-end by:
1. Verifying similar texts produce embeddings with cosine similarity above
the clustering threshold (0.82)
2. Verifying dissimilar texts stay below the clustering threshold
3. Validating heat calculation logic for clustered desires
4. Checking that the router and worker are wired correctly (static assertions)
"""
import uuid
from pathlib import Path
from unittest.mock import MagicMock
import numpy as np
import pytest
from app.services.embedding import embed_text
def cosine_sim(a: list[float], b: list[float]) -> float:
"""Cosine similarity between two L2-normalized vectors (= dot product)."""
return float(np.dot(a, b))
# ---------------------------------------------------------------------------
# Embedding pipeline: similar texts cluster, dissimilar texts don't
# ---------------------------------------------------------------------------
class TestSimilarDesiresClustering:
"""Verify that similar desire texts produce clusterable embeddings."""
SIMILAR_TEXTS = [
"ragdoll physics dark moody slow motion",
"dark physics ragdoll slow motion moody",
"slow motion ragdoll dark physics moody",
]
def test_similar_desires_produce_clusterable_embeddings(self) -> None:
"""All pairwise cosine similarities among similar texts exceed 0.82."""
embeddings = [embed_text(t) for t in self.SIMILAR_TEXTS]
for i in range(len(embeddings)):
for j in range(i + 1, len(embeddings)):
sim = cosine_sim(embeddings[i], embeddings[j])
assert sim > 0.82, (
f"Texts [{i}] and [{j}] should cluster (sim > 0.82), "
f"got {sim:.4f}:\n"
f" [{i}] '{self.SIMILAR_TEXTS[i]}'\n"
f" [{j}] '{self.SIMILAR_TEXTS[j]}'"
)
def test_dissimilar_desire_does_not_cluster(self) -> None:
"""A dissimilar text has cosine similarity < 0.82 with all similar texts."""
dissimilar = embed_text("bright colorful kaleidoscope flowers rainbow")
similar_embeddings = [embed_text(t) for t in self.SIMILAR_TEXTS]
for i, emb in enumerate(similar_embeddings):
sim = cosine_sim(dissimilar, emb)
assert sim < 0.82, (
f"Dissimilar text should NOT cluster with text [{i}] "
f"(sim < 0.82), got {sim:.4f}:\n"
f" dissimilar: 'bright colorful kaleidoscope flowers rainbow'\n"
f" similar[{i}]: '{self.SIMILAR_TEXTS[i]}'"
)
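These fixtures work because TF-IDF is a bag-of-words model: token order is irrelevant, so the three SIMILAR_TEXTS (permutations of the same six tokens) map to nearly identical vectors. The effect can be shown with a stdlib-only cosine over raw term counts (a simplification of the service's TF-IDF pipeline, for illustration only):

```python
import math
from collections import Counter

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity over raw term counts (order-insensitive, like TF-IDF)."""
    ca, cb = Counter(a.split()), Counter(b.split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb)

# Permuted token sets → maximal similarity
same_words = bow_cosine(
    "ragdoll physics dark moody slow motion",
    "dark physics ragdoll slow motion moody",
)
# Disjoint token sets → zero similarity
different = bow_cosine(
    "ragdoll physics dark moody slow motion",
    "bright colorful kaleidoscope flowers rainbow",
)
```

The real service's IDF weighting and SVD projection shift these values, but the ordering (permutations well above 0.82, disjoint vocab well below) is what the tests depend on.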
# ---------------------------------------------------------------------------
# Heat calculation logic
# ---------------------------------------------------------------------------
class TestPipelineHeatCalculation:
"""Verify heat score calculation matches cluster size."""
def test_pipeline_heat_calculation_logic(self) -> None:
"""A cluster of 3 desires should produce heat_score = 3.0 for each member.
This tests the recalculate_heat_sync logic by simulating its
DB interaction pattern with mocks.
"""
from app.services.clustering import recalculate_heat_sync
cluster_id = uuid.uuid4()
session = MagicMock()
# Mock COUNT(*) returning 3 members
count_result = MagicMock()
count_result.scalar_one.return_value = 3
# Mock UPDATE (no meaningful return)
update_result = MagicMock()
session.execute = MagicMock(side_effect=[count_result, update_result])
heat = recalculate_heat_sync(cluster_id, session)
assert heat == 3.0
assert session.execute.call_count == 2
assert session.flush.call_count >= 1
def test_single_member_cluster_has_heat_1(self) -> None:
"""A new single-member cluster should have heat_score = 1.0."""
from app.services.clustering import recalculate_heat_sync
cluster_id = uuid.uuid4()
session = MagicMock()
count_result = MagicMock()
count_result.scalar_one.return_value = 1
update_result = MagicMock()
session.execute = MagicMock(side_effect=[count_result, update_result])
heat = recalculate_heat_sync(cluster_id, session)
assert heat == 1.0
# ---------------------------------------------------------------------------
# Sync clustering orchestrator
# ---------------------------------------------------------------------------
class TestSyncClusteringOrchestrator:
"""Test cluster_desire_sync orchestration with mocked sub-functions."""
def test_new_desire_creates_cluster(self) -> None:
"""When no nearby cluster exists, creates a new one."""
from unittest.mock import patch
from app.services.clustering import cluster_desire_sync
desire_id = uuid.uuid4()
embedding = embed_text("ragdoll physics dark moody slow")
new_cluster_id = uuid.uuid4()
session = MagicMock()
with patch("app.services.clustering.find_nearest_cluster_sync") as mock_find, \
patch("app.services.clustering.create_cluster_sync") as mock_create:
mock_find.return_value = (None, 0.0)
mock_create.return_value = new_cluster_id
result = cluster_desire_sync(desire_id, embedding, session)
assert result["is_new"] is True
assert result["cluster_id"] == new_cluster_id
assert result["heat_score"] == 1.0
mock_find.assert_called_once_with(embedding, session)
mock_create.assert_called_once_with(desire_id, session)
def test_similar_desire_joins_existing_cluster(self) -> None:
"""When a nearby cluster is found, joins it and recalculates heat."""
from unittest.mock import patch
from app.services.clustering import cluster_desire_sync
desire_id = uuid.uuid4()
embedding = embed_text("ragdoll physics dark moody slow")
existing_cluster_id = uuid.uuid4()
session = MagicMock()
with patch("app.services.clustering.find_nearest_cluster_sync") as mock_find, \
patch("app.services.clustering.add_to_cluster_sync") as mock_add, \
patch("app.services.clustering.recalculate_heat_sync") as mock_recalc:
mock_find.return_value = (existing_cluster_id, 0.91)
mock_recalc.return_value = 3.0
result = cluster_desire_sync(desire_id, embedding, session)
assert result["is_new"] is False
assert result["cluster_id"] == existing_cluster_id
assert result["heat_score"] == 3.0
mock_add.assert_called_once_with(
existing_cluster_id, desire_id, 0.91, session
)
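The two orchestrator tests above fully pin down the control flow: create a cluster when none is near, otherwise join and recalculate heat. A runnable sketch of that flow, with stand-in collaborators (the real ones live in `app.services.clustering` and hit the database):

```python
import uuid

# Stand-in collaborators so the sketch runs without a database.
def find_nearest_cluster_sync(embedding, session):
    return (None, 0.0)  # pretend no nearby cluster exists

def create_cluster_sync(desire_id, session):
    return uuid.uuid4()

def add_to_cluster_sync(cluster_id, desire_id, similarity, session):
    pass

def recalculate_heat_sync(cluster_id, session):
    return 3.0

def cluster_desire_sync(desire_id, embedding, session):
    """Join the nearest cluster if one exists, else create a new one."""
    cluster_id, similarity = find_nearest_cluster_sync(embedding, session)
    if cluster_id is None:
        # No nearby cluster: new single-member cluster starts at heat 1.0.
        new_id = create_cluster_sync(desire_id, session)
        return {"is_new": True, "cluster_id": new_id, "heat_score": 1.0}
    # Nearby cluster found: record membership, then recompute heat.
    add_to_cluster_sync(cluster_id, desire_id, similarity, session)
    heat = recalculate_heat_sync(cluster_id, session)
    return {"is_new": False, "cluster_id": cluster_id, "heat_score": heat}
```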
# ---------------------------------------------------------------------------
# Wiring checks: router + worker are connected
# ---------------------------------------------------------------------------
class TestWiring:
"""Static assertions that the router and worker are properly wired."""
def test_router_has_worker_enqueue(self) -> None:
"""desires.py contains process_desire.delay — fire-and-forget enqueue."""
desires_path = (
Path(__file__).resolve().parent.parent
/ "app"
/ "routers"
/ "desires.py"
)
source = desires_path.read_text()
assert "process_desire.delay" in source, (
"Router should call process_desire.delay() to enqueue worker task"
)
def test_worker_task_is_implemented(self) -> None:
"""process_desire task body is not just 'pass' — has real implementation.
Reads the worker source file directly to avoid importing celery
(which may not be installed in the test environment).
"""
worker_path = (
Path(__file__).resolve().parent.parent
/ "app"
/ "worker"
/ "__init__.py"
)
source = worker_path.read_text()
# Should contain key implementation markers
assert "embed_text" in source, (
"Worker should call embed_text to embed desire prompt"
)
assert "cluster_desire_sync" in source, (
"Worker should call cluster_desire_sync to cluster the desire"
)
assert "session.commit" in source, (
"Worker should commit the DB transaction"
)
def test_worker_has_structured_logging(self) -> None:
"""process_desire task includes structured logging of key fields."""
worker_path = (
Path(__file__).resolve().parent.parent
/ "app"
/ "worker"
/ "__init__.py"
)
source = worker_path.read_text()
assert "desire_id" in source, "Should log desire_id"
assert "cluster_id" in source, "Should log cluster_id"
assert "heat_score" in source, "Should log heat_score"
assert "elapsed_ms" in source, "Should log elapsed_ms"
def test_worker_has_error_handling_with_retry(self) -> None:
"""process_desire catches exceptions and retries."""
worker_path = (
Path(__file__).resolve().parent.parent
/ "app"
/ "worker"
/ "__init__.py"
)
source = worker_path.read_text()
assert "self.retry" in source, (
"Worker should use self.retry for transient error handling"
)
assert "session.rollback" in source, (
"Worker should rollback on error before retrying"
)
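Taken together, the wiring assertions describe the shape of the worker task: embed, cluster, commit, log the four structured fields, and roll back before retrying on error. A sketch of that body with collaborators injected so it runs without Celery; `process_desire_body` and the injected parameter names are illustrative, and `retry` stands in for Celery's `self.retry(exc=exc)`:

```python
import logging
import time

logger = logging.getLogger("worker")

def process_desire_body(desire_id, prompt_text, session,
                        embed_text, cluster_desire_sync, retry):
    """Task body implied by the assertions above."""
    start = time.monotonic()
    try:
        embedding = embed_text(prompt_text)
        result = cluster_desire_sync(desire_id, embedding, session)
        session.commit()
        # Structured logging of the four fields the tests check for.
        logger.info(
            "desire processed",
            extra={
                "desire_id": str(desire_id),
                "cluster_id": str(result["cluster_id"]),
                "heat_score": result["heat_score"],
                "elapsed_ms": int((time.monotonic() - start) * 1000),
            },
        )
        return result
    except Exception as exc:
        session.rollback()  # roll back before retrying
        raise retry(exc)    # stands in for Celery's self.retry(exc=exc)
```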


@ -0,0 +1,137 @@
"""Unit tests for the text embedding service.
Validates that TF-IDF + TruncatedSVD produces 512-dim L2-normalized vectors
with meaningful cosine similarity for shader/visual-art domain text.
"""
import numpy as np
import pytest
from app.services.embedding import EmbeddingService, embed_text
def cosine_sim(a: list[float], b: list[float]) -> float:
"""Compute cosine similarity between two vectors.
Since our vectors are already L2-normalized, this is just the dot product.
"""
return float(np.dot(a, b))
class TestEmbedDimension:
"""Verify output vector dimensions."""
def test_embed_produces_512_dim_vector(self) -> None:
result = embed_text("particle system fluid simulation")
assert len(result) == 512, f"Expected 512 dims, got {len(result)}"
def test_embed_returns_list_of_floats(self) -> None:
result = embed_text("fractal noise pattern")
assert isinstance(result, list)
assert all(isinstance(x, float) for x in result)
class TestNormalization:
"""Verify L2 normalization of output vectors."""
def test_embed_vectors_are_normalized(self) -> None:
result = embed_text("raymarching distance field shapes")
norm = np.linalg.norm(result)
assert abs(norm - 1.0) < 1e-6, f"Expected norm ≈ 1.0, got {norm}"
def test_various_inputs_all_normalized(self) -> None:
texts = [
"short",
"a much longer description of a complex visual effect with many words",
"ragdoll physics dark moody atmosphere simulation",
]
for text in texts:
result = embed_text(text)
norm = np.linalg.norm(result)
assert abs(norm - 1.0) < 1e-6, (
f"Norm for '{text}' = {norm}, expected ≈ 1.0"
)
class TestSimilarity:
"""Verify semantic similarity properties of the embeddings."""
def test_similar_texts_have_high_cosine_similarity(self) -> None:
a = embed_text("ragdoll physics dark and slow")
b = embed_text("dark physics simulation ragdoll")
sim = cosine_sim(a, b)
assert sim > 0.8, (
f"Similar texts should have >0.8 cosine sim, got {sim:.4f}"
)
def test_dissimilar_texts_have_low_cosine_similarity(self) -> None:
a = embed_text("ragdoll physics dark")
b = embed_text("bright colorful kaleidoscope flowers")
sim = cosine_sim(a, b)
assert sim < 0.5, (
f"Dissimilar texts should have <0.5 cosine sim, got {sim:.4f}"
)
def test_identical_texts_have_perfect_similarity(self) -> None:
text = "procedural noise fractal generation"
a = embed_text(text)
b = embed_text(text)
sim = cosine_sim(a, b)
assert sim > 0.999, (
f"Identical texts should have ~1.0 cosine sim, got {sim:.4f}"
)
class TestBatch:
"""Verify batch embedding matches individual embeddings."""
def test_embed_batch_matches_individual(self) -> None:
texts = [
"particle system fluid",
"ragdoll physics dark moody",
"kaleidoscope symmetry rotation",
]
# Fresh service to ensure deterministic results
service = EmbeddingService()
individual = [service.embed_text(t) for t in texts]
# Reset and do batch
service2 = EmbeddingService()
batched = service2.embed_batch(texts)
assert len(batched) == len(individual)
for i, (ind, bat) in enumerate(zip(individual, batched)):
sim = cosine_sim(ind, bat)
assert sim > 0.999, (
f"Batch result {i} doesn't match individual: sim={sim:.6f}"
)
def test_batch_dimensions(self) -> None:
texts = ["fire smoke volumetric", "crystal refraction light"]
results = EmbeddingService().embed_batch(texts)
assert len(results) == 2
for vec in results:
assert len(vec) == 512
class TestErrorHandling:
"""Verify clear error messages on invalid input."""
def test_empty_string_raises_valueerror(self) -> None:
with pytest.raises(ValueError, match="empty or whitespace"):
embed_text("")
def test_whitespace_only_raises_valueerror(self) -> None:
with pytest.raises(ValueError, match="empty or whitespace"):
embed_text(" \n\t ")
def test_batch_with_empty_string_raises_valueerror(self) -> None:
service = EmbeddingService()
with pytest.raises(ValueError, match="empty or whitespace"):
service.embed_batch(["valid text", ""])
def test_batch_with_whitespace_raises_valueerror(self) -> None:
service = EmbeddingService()
with pytest.raises(ValueError, match="empty or whitespace"):
service.embed_batch([" ", "valid text"])
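The contract these tests enforce (512 dims, unit L2 norm, rejection of blank input) can be illustrated without the real pipeline. A deliberately simplified stand-in that hashes tokens into a fixed-width bag-of-words vector; the production service uses TF-IDF + TruncatedSVD, which needs a fitted corpus, so only the validation and normalization behavior carries over:

```python
import math

DIM = 512

def embed_text(text: str) -> list[float]:
    # Same input contract as the real service: reject empty/whitespace text.
    if not text or not text.strip():
        raise ValueError("text is empty or whitespace")
    # Hash each token into one of 512 buckets (toy stand-in for TF-IDF + SVD).
    vec = [0.0] * DIM
    for token in text.lower().split():
        vec[hash(token) % DIM] += 1.0
    # L2-normalize so cosine similarity reduces to a plain dot product.
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]
```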


@ -0,0 +1,294 @@
"""Tests for desire fulfillment endpoint and cluster_count annotation.
Covers:
- fulfill_desire endpoint: happy path, not-found, not-open, shader validation
(tested via source assertions since FastAPI isn't in the test environment)
- cluster_count annotation: batch query pattern, single desire query
- Schema field: cluster_count exists in DesirePublic
Approach: Per K005, router functions can't be imported without FastAPI installed.
We verify correctness through:
1. Source-level structure assertions (endpoint wiring, imports, validation logic)
2. Isolated logic unit tests (annotation loop, status transitions)
3. Schema field verification via Pydantic model introspection
"""
import uuid
from datetime import datetime, timezone
from pathlib import Path
from unittest.mock import MagicMock
import pytest
# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------
def _router_source() -> str:
"""Read the desires router source code."""
return (
Path(__file__).resolve().parent.parent
/ "app"
/ "routers"
/ "desires.py"
).read_text(encoding="utf-8")
def _schema_source() -> str:
"""Read the schemas source code."""
return (
Path(__file__).resolve().parent.parent
/ "app"
/ "schemas"
/ "schemas.py"
).read_text(encoding="utf-8")
def _make_mock_desire(
*,
desire_id=None,
status="open",
heat_score=1.0,
):
"""Create a mock object simulating a Desire ORM instance."""
d = MagicMock()
d.id = desire_id or uuid.uuid4()
d.status = status
d.heat_score = heat_score
d.cluster_count = 0 # default before annotation
return d
# ---------------------------------------------------------------------------
# fulfill_desire — happy path structure
# ---------------------------------------------------------------------------
class TestFulfillHappyPath:
"""Verify the fulfill endpoint's happy-path logic via source analysis."""
def test_fulfill_sets_status_to_fulfilled(self):
"""The endpoint sets desire.status = 'fulfilled' on success."""
source = _router_source()
assert 'desire.status = "fulfilled"' in source
def test_fulfill_sets_fulfilled_by_shader(self):
"""The endpoint sets desire.fulfilled_by_shader = shader_id."""
source = _router_source()
assert "desire.fulfilled_by_shader = shader_id" in source
def test_fulfill_sets_fulfilled_at_timestamp(self):
"""The endpoint sets desire.fulfilled_at to current UTC time."""
source = _router_source()
assert "desire.fulfilled_at" in source
assert "datetime.now(timezone.utc)" in source
def test_fulfill_returns_status_response(self):
"""The endpoint returns a dict with status, desire_id, shader_id."""
source = _router_source()
assert '"status": "fulfilled"' in source
assert '"desire_id"' in source
assert '"shader_id"' in source
# ---------------------------------------------------------------------------
# fulfill_desire — error paths
# ---------------------------------------------------------------------------
class TestFulfillDesireNotFound:
"""404 when desire doesn't exist."""
def test_desire_not_found_raises_404(self):
source = _router_source()
# After desire lookup, checks scalar_one_or_none result
assert "Desire not found" in source
class TestFulfillDesireNotOpen:
"""400 when desire is not in 'open' status."""
def test_desire_not_open_check_exists(self):
source = _router_source()
assert 'desire.status != "open"' in source
def test_desire_not_open_error_message(self):
source = _router_source()
assert "Desire is not open" in source
class TestFulfillShaderNotFound:
"""404 when shader_id doesn't match any shader."""
def test_shader_lookup_exists(self):
source = _router_source()
assert "select(Shader).where(Shader.id == shader_id)" in source
def test_shader_not_found_raises_404(self):
source = _router_source()
assert "Shader not found" in source
class TestFulfillShaderNotPublished:
"""400 when shader status is not 'published'."""
def test_shader_status_validation(self):
source = _router_source()
assert 'shader.status != "published"' in source
def test_shader_not_published_error_message(self):
source = _router_source()
assert "Shader must be published to fulfill a desire" in source
# ---------------------------------------------------------------------------
# cluster_count annotation — logic unit tests
# ---------------------------------------------------------------------------
class TestClusterCountAnnotation:
"""Verify cluster_count annotation logic patterns."""
def test_list_desires_has_batch_cluster_query(self):
"""list_desires uses a batch query with ANY(:desire_ids)."""
source = _router_source()
assert "ANY(:desire_ids)" in source
assert "desire_clusters dc1" in source
assert "desire_clusters dc2" in source
def test_list_desires_avoids_n_plus_1(self):
"""Cluster counts are fetched in a single batch, not per-desire."""
source = _router_source()
# The pattern: build dict from batch query, then loop to annotate
assert "cluster_counts = {" in source
assert "cluster_counts.get(d.id, 0)" in source
def test_list_desires_skips_cluster_query_when_empty(self):
"""When no desires are returned, cluster query is skipped."""
source = _router_source()
assert "if desire_ids:" in source
def test_get_desire_annotates_single_cluster_count(self):
"""get_desire runs a cluster count query for the single desire."""
source = _router_source()
# Should have a cluster query scoped to a single desire_id
assert "WHERE dc1.desire_id = :desire_id" in source
def test_annotation_loop_sets_default_zero(self):
"""Desires without cluster entries default to cluster_count = 0."""
source = _router_source()
assert "cluster_counts.get(d.id, 0)" in source
def test_annotation_loop_logic(self):
"""Unit test: the annotation loop correctly maps cluster counts to desires."""
# Simulate the annotation loop from list_desires
d1 = _make_mock_desire()
d2 = _make_mock_desire()
d3 = _make_mock_desire()
desires = [d1, d2, d3]
# Simulate cluster query result: d1 has 3 in cluster, d3 has 2
cluster_counts = {d1.id: 3, d3.id: 2}
# This is the exact logic from the router
for d in desires:
d.cluster_count = cluster_counts.get(d.id, 0)
assert d1.cluster_count == 3
assert d2.cluster_count == 0 # not in any cluster
assert d3.cluster_count == 2
def test_get_desire_cluster_count_fallback(self):
"""get_desire sets cluster_count=0 when no cluster row exists."""
source = _router_source()
# The router checks `row[0] if row else 0`
assert "row[0] if row else 0" in source
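The N+1 avoidance pattern these tests assert (one batch query, then an in-memory annotate loop with a zero default) can be isolated as a small function. `annotate_cluster_counts` and `batch_count_query` are illustrative names; the latter stands in for the single `ANY(:desire_ids)` SQL query and returns a `{desire_id: count}` dict:

```python
def annotate_cluster_counts(desires, batch_count_query):
    desire_ids = [d.id for d in desires]
    # Skip the query entirely when there is nothing to annotate.
    cluster_counts = batch_count_query(desire_ids) if desire_ids else {}
    for d in desires:
        # Desires with no cluster row default to 0.
        d.cluster_count = cluster_counts.get(d.id, 0)
    return desires
```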
# ---------------------------------------------------------------------------
# Schema field verification
# ---------------------------------------------------------------------------
class TestDesirePublicSchema:
"""Verify DesirePublic schema has the cluster_count field."""
def test_cluster_count_field_in_schema_source(self):
"""DesirePublic schema source contains cluster_count field."""
source = _schema_source()
assert "cluster_count" in source
def test_cluster_count_default_zero(self):
"""cluster_count defaults to 0 in the schema."""
source = _schema_source()
assert "cluster_count: int = 0" in source
def test_schema_from_attributes_enabled(self):
"""DesirePublic uses from_attributes=True for ORM compatibility."""
source = _schema_source()
# Find the DesirePublic class section
desire_public_idx = source.index("class DesirePublic")
desire_public_section = source[desire_public_idx:desire_public_idx + 200]
assert "from_attributes=True" in desire_public_section
def test_cluster_count_pydantic_model(self):
"""DesirePublic schema has cluster_count as an int field with default 0."""
source = _schema_source()
# Find the DesirePublic class and verify cluster_count is between
# heat_score and fulfilled_by_shader (correct field ordering)
desire_idx = source.index("class DesirePublic")
desire_section = source[desire_idx:desire_idx + 500]
heat_pos = desire_section.index("heat_score")
cluster_pos = desire_section.index("cluster_count")
fulfilled_pos = desire_section.index("fulfilled_by_shader")
assert heat_pos < cluster_pos < fulfilled_pos, (
"cluster_count should be between heat_score and fulfilled_by_shader"
)
# ---------------------------------------------------------------------------
# Wiring assertions
# ---------------------------------------------------------------------------
class TestFulfillmentWiring:
"""Structural assertions that the router is properly wired."""
def test_router_imports_shader_model(self):
"""desires.py imports Shader for shader validation."""
source = _router_source()
assert "Shader" in source.split("\n")[8] # near top imports
def test_router_imports_text_from_sqlalchemy(self):
"""desires.py imports text from sqlalchemy for raw SQL."""
source = _router_source()
assert "from sqlalchemy import" in source
assert "text" in source
def test_fulfill_endpoint_requires_auth(self):
"""fulfill_desire uses get_current_user dependency."""
source = _router_source()
# Find the fulfill_desire function
fulfill_idx = source.index("async def fulfill_desire")
fulfill_section = source[fulfill_idx:fulfill_idx + 500]
assert "get_current_user" in fulfill_section
def test_fulfill_endpoint_takes_shader_id_param(self):
"""fulfill_desire accepts shader_id as a query parameter."""
source = _router_source()
fulfill_idx = source.index("async def fulfill_desire")
fulfill_section = source[fulfill_idx:fulfill_idx + 300]
assert "shader_id" in fulfill_section
def test_list_desires_returns_desire_public(self):
"""list_desires endpoint uses DesirePublic response model."""
source = _router_source()
assert "response_model=list[DesirePublic]" in source
def test_get_desire_returns_desire_public(self):
"""get_desire endpoint uses DesirePublic response model."""
source = _router_source()
# Find the get_desire endpoint specifically
lines = source.split("\n")
for i, line in enumerate(lines):
if "async def get_desire" in line:
# Check the decorator line above
decorator_line = lines[i - 1]
assert "response_model=DesirePublic" in decorator_line
break
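The source assertions above pin down the endpoint's validation chain and state transition exactly. A framework-free sketch of that logic; `ApiError` is a stand-in for FastAPI's `HTTPException`, and `fulfill` is an illustrative name for the handler body:

```python
from datetime import datetime, timezone

class ApiError(Exception):
    """Stand-in for FastAPI's HTTPException."""
    def __init__(self, status_code, detail):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

def fulfill(desire, shader, shader_id):
    # Validation chain, in the order the tests assert.
    if desire is None:
        raise ApiError(404, "Desire not found")
    if desire.status != "open":
        raise ApiError(400, "Desire is not open")
    if shader is None:
        raise ApiError(404, "Shader not found")
    if shader.status != "published":
        raise ApiError(400, "Shader must be published to fulfill a desire")
    # State transition on success.
    desire.status = "fulfilled"
    desire.fulfilled_by_shader = shader_id
    desire.fulfilled_at = datetime.now(timezone.utc)
    return {"status": "fulfilled", "desire_id": desire.id, "shader_id": shader_id}
```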


@ -0,0 +1,412 @@
"""Integration tests — end-to-end acceptance scenarios through FastAPI.
Uses async SQLite test database, real FastAPI endpoint handlers,
and dependency overrides for auth and Celery worker.
Test classes:
TestInfrastructureSmoke proves test infra works (T01)
TestClusteringScenario clustering + heat elevation via API (T02)
TestFulfillmentScenario desire fulfillment lifecycle (T02)
TestMCPFieldPassthrough MCP tool field passthrough (T02, source-level)
"""
import inspect
import json
import uuid
from pathlib import Path
import pytest
from httpx import AsyncClient
from sqlalchemy import select, update
# ── Smoke Test: proves integration infrastructure works ───
class TestInfrastructureSmoke:
"""Verify that the integration test infrastructure (DB, client, auth, Celery mock) works."""
@pytest.mark.asyncio
async def test_create_and_read_desire(self, client: AsyncClient):
"""POST a desire, then GET it back — proves DB, serialization, auth override, and Celery mock."""
# Create a desire
response = await client.post(
"/api/v1/desires",
json={"prompt_text": "glowing neon wireframe city"},
)
assert response.status_code == 201, f"Expected 201, got {response.status_code}: {response.text}"
data = response.json()
desire_id = data["id"]
assert data["prompt_text"] == "glowing neon wireframe city"
assert data["status"] == "open"
# Read it back
response = await client.get(f"/api/v1/desires/{desire_id}")
assert response.status_code == 200, f"Expected 200, got {response.status_code}: {response.text}"
data = response.json()
assert data["id"] == desire_id
assert data["prompt_text"] == "glowing neon wireframe city"
assert data["heat_score"] == 1.0
assert data["cluster_count"] == 0
# ── Clustering Scenario ──────────────────────────────────────
class TestClusteringScenario:
"""Prove that clustered desires have elevated heat and cluster_count via the API.
Strategy: POST desires through the API, then directly insert DesireCluster
rows and update heat_score in the test DB (simulating what the Celery worker
pipeline does). Verify via GET /api/v1/desires/{id} that the API returns
correct heat_score and cluster_count.
Note: list_desires uses PostgreSQL ANY(:desire_ids) which doesn't work in
SQLite, so we verify via individual GET requests instead.
"""
@pytest.mark.asyncio
async def test_similar_desires_cluster_and_elevate_heat(
self, client: AsyncClient, db_session
):
"""Create 3 desires, cluster them, elevate heat, verify API returns correct data."""
from app.models.models import Desire, DesireCluster
# Create 3 desires via API
desire_ids = []
prompts = [
"neon fractal explosion in deep space",
"colorful fractal burst cosmic background",
"glowing fractal nova against dark stars",
]
for prompt in prompts:
resp = await client.post(
"/api/v1/desires", json={"prompt_text": prompt}
)
assert resp.status_code == 201, f"Create failed: {resp.text}"
desire_ids.append(resp.json()["id"])
# Simulate clustering: insert DesireCluster rows linking all 3 to one cluster
cluster_id = uuid.uuid4()
for did in desire_ids:
dc = DesireCluster(
cluster_id=cluster_id,
desire_id=uuid.UUID(did),
similarity=0.88,
)
db_session.add(dc)
# Simulate heat recalculation: update heat_score on all 3 desires
for did in desire_ids:
await db_session.execute(
update(Desire)
.where(Desire.id == uuid.UUID(did))
.values(heat_score=3.0)
)
await db_session.flush()
# Verify each desire via GET shows correct heat_score and cluster_count
for did in desire_ids:
resp = await client.get(f"/api/v1/desires/{did}")
assert resp.status_code == 200, f"GET {did} failed: {resp.text}"
data = resp.json()
assert data["heat_score"] >= 3.0, (
f"Desire {did} heat_score={data['heat_score']}, expected >= 3.0"
)
assert data["cluster_count"] >= 3, (
f"Desire {did} cluster_count={data['cluster_count']}, expected >= 3"
)
@pytest.mark.asyncio
async def test_lone_desire_has_default_heat(self, client: AsyncClient):
"""A single desire without clustering has heat_score=1.0 and cluster_count=0."""
resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "unique standalone art concept"},
)
assert resp.status_code == 201
desire_id = resp.json()["id"]
resp = await client.get(f"/api/v1/desires/{desire_id}")
assert resp.status_code == 200
data = resp.json()
assert data["heat_score"] == 1.0, f"Expected heat_score=1.0, got {data['heat_score']}"
assert data["cluster_count"] == 0, f"Expected cluster_count=0, got {data['cluster_count']}"
@pytest.mark.asyncio
async def test_desires_sorted_by_heat_descending(
self, client: AsyncClient, db_session
):
"""When fetching desires, high-heat desires appear before low-heat ones.
Uses individual GET since list_desires relies on PostgreSQL ANY().
Verifies the ordering guarantee via direct heat_score comparison.
"""
from app.models.models import Desire, DesireCluster
# Create a "hot" desire and cluster it
hot_resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "blazing hot fractal vortex"},
)
assert hot_resp.status_code == 201
hot_id = hot_resp.json()["id"]
# Simulate clustering for hot desire
cluster_id = uuid.uuid4()
dc = DesireCluster(
cluster_id=cluster_id,
desire_id=uuid.UUID(hot_id),
similarity=0.90,
)
db_session.add(dc)
await db_session.execute(
update(Desire)
.where(Desire.id == uuid.UUID(hot_id))
.values(heat_score=5.0)
)
await db_session.flush()
# Create a "cold" desire (no clustering)
cold_resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "calm minimal zen garden"},
)
assert cold_resp.status_code == 201
cold_id = cold_resp.json()["id"]
# Verify hot desire has higher heat than cold
hot_data = (await client.get(f"/api/v1/desires/{hot_id}")).json()
cold_data = (await client.get(f"/api/v1/desires/{cold_id}")).json()
assert hot_data["heat_score"] > cold_data["heat_score"], (
f"Hot ({hot_data['heat_score']}) should be > Cold ({cold_data['heat_score']})"
)
# ── Fulfillment Scenario ─────────────────────────────────────
class TestFulfillmentScenario:
"""Prove desire fulfillment transitions status and links to a shader."""
@pytest.mark.asyncio
async def test_fulfill_desire_transitions_status(
self, client: AsyncClient, db_session
):
"""Create desire, insert published shader, fulfill, verify status transition."""
from app.models.models import Shader
# Create desire
resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "ethereal particle waterfall"},
)
assert resp.status_code == 201
desire_id = resp.json()["id"]
# Insert a published shader directly in test DB
shader_id = uuid.uuid4()
shader = Shader(
id=shader_id,
title="Particle Waterfall",
glsl_code="void mainImage(out vec4 c, in vec2 f) { c = vec4(0); }",
status="published",
author_id=None,
)
db_session.add(shader)
await db_session.flush()
# Fulfill the desire
resp = await client.post(
f"/api/v1/desires/{desire_id}/fulfill",
params={"shader_id": str(shader_id)},
)
assert resp.status_code == 200, f"Fulfill failed: {resp.text}"
data = resp.json()
assert data["status"] == "fulfilled"
assert data["desire_id"] == desire_id
assert data["shader_id"] == str(shader_id)
# Verify read-back shows fulfilled status and linked shader
resp = await client.get(f"/api/v1/desires/{desire_id}")
assert resp.status_code == 200
data = resp.json()
assert data["status"] == "fulfilled"
assert data["fulfilled_by_shader"] == str(shader_id)
@pytest.mark.asyncio
async def test_fulfill_requires_published_shader(
self, client: AsyncClient, db_session
):
"""Fulfilling with a draft shader returns 400."""
from app.models.models import Shader
# Create desire
resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "glitch art mosaic pattern"},
)
assert resp.status_code == 201
desire_id = resp.json()["id"]
# Insert a draft shader
shader_id = uuid.uuid4()
shader = Shader(
id=shader_id,
title="Draft Mosaic",
glsl_code="void mainImage(out vec4 c, in vec2 f) { c = vec4(1); }",
status="draft",
author_id=None,
)
db_session.add(shader)
await db_session.flush()
# Attempt fulfill — should fail
resp = await client.post(
f"/api/v1/desires/{desire_id}/fulfill",
params={"shader_id": str(shader_id)},
)
assert resp.status_code == 400, f"Expected 400, got {resp.status_code}: {resp.text}"
assert "published" in resp.json()["detail"].lower()
@pytest.mark.asyncio
async def test_fulfill_already_fulfilled_returns_400(
self, client: AsyncClient, db_session
):
"""Fulfilling an already-fulfilled desire returns 400."""
from app.models.models import Shader
# Create desire
resp = await client.post(
"/api/v1/desires",
json={"prompt_text": "recursive mirror tunnel"},
)
assert resp.status_code == 201
desire_id = resp.json()["id"]
# Insert published shader
shader_id = uuid.uuid4()
shader = Shader(
id=shader_id,
title="Mirror Tunnel",
glsl_code="void mainImage(out vec4 c, in vec2 f) { c = vec4(0.5); }",
status="published",
author_id=None,
)
db_session.add(shader)
await db_session.flush()
# First fulfill — should succeed
resp = await client.post(
f"/api/v1/desires/{desire_id}/fulfill",
params={"shader_id": str(shader_id)},
)
assert resp.status_code == 200
# Second fulfill — should fail
resp = await client.post(
f"/api/v1/desires/{desire_id}/fulfill",
params={"shader_id": str(shader_id)},
)
assert resp.status_code == 400, f"Expected 400, got {resp.status_code}: {resp.text}"
assert "not open" in resp.json()["detail"].lower()
# ── MCP Field Passthrough (source-level) ─────────────────────
class TestMCPFieldPassthrough:
"""Verify MCP server tools pass through all required fields via source inspection.
The MCP server runs as a separate process and can't be tested through
FastAPI TestClient. These tests verify the source code structure to ensure
field passthrough is correct.
"""
@classmethod
def _read_mcp_server_source(cls) -> str:
"""Read the MCP server source file."""
# From services/api/tests/ → up 3 to services/ → mcp/server.py
mcp_path = Path(__file__).resolve().parent.parent.parent / "mcp" / "server.py"
assert mcp_path.exists(), f"MCP server.py not found at {mcp_path}"
return mcp_path.read_text()
def test_get_desire_queue_includes_cluster_fields(self):
"""get_desire_queue maps cluster_count, heat_score, style_hints, fulfilled_by_shader."""
source = self._read_mcp_server_source()
# Verify get_desire_queue function exists
assert "async def get_desire_queue" in source, "get_desire_queue function not found"
# Extract the function body (from def to next @mcp or end)
fn_start = source.index("async def get_desire_queue")
# Find next top-level decorator or end of file
next_decorator = source.find("\n@mcp.", fn_start + 1)
if next_decorator == -1:
fn_body = source[fn_start:]
else:
fn_body = source[fn_start:next_decorator]
required_fields = ["cluster_count", "heat_score", "style_hints", "fulfilled_by_shader"]
for field in required_fields:
assert field in fn_body, (
f"get_desire_queue missing field '{field}' in response mapping"
)
def test_fulfill_desire_tool_exists(self):
"""fulfill_desire function exists and uses api_post_with_params."""
source = self._read_mcp_server_source()
assert "async def fulfill_desire" in source, "fulfill_desire function not found"
# Extract function body
fn_start = source.index("async def fulfill_desire")
next_decorator = source.find("\n@mcp.", fn_start + 1)
if next_decorator == -1:
fn_body = source[fn_start:]
else:
fn_body = source[fn_start:next_decorator]
assert "api_post_with_params" in fn_body, (
"fulfill_desire should call api_post_with_params"
)
def test_fulfill_desire_returns_structured_response(self):
"""fulfill_desire returns JSON with status, desire_id, shader_id."""
source = self._read_mcp_server_source()
fn_start = source.index("async def fulfill_desire")
next_decorator = source.find("\n@mcp.", fn_start + 1)
if next_decorator == -1:
fn_body = source[fn_start:]
else:
fn_body = source[fn_start:next_decorator]
# Check the success-path return contains the required fields
required_keys = ['"status"', '"desire_id"', '"shader_id"']
for key in required_keys:
assert key in fn_body, (
f"fulfill_desire response missing key {key}"
)
def test_submit_shader_accepts_fulfills_desire_id(self):
"""submit_shader accepts fulfills_desire_id parameter and passes it to the API."""
source = self._read_mcp_server_source()
assert "async def submit_shader" in source, "submit_shader function not found"
fn_start = source.index("async def submit_shader")
next_decorator = source.find("\n@mcp.", fn_start + 1)
if next_decorator == -1:
fn_body = source[fn_start:]
else:
fn_body = source[fn_start:next_decorator]
# Verify parameter exists in function signature
assert "fulfills_desire_id" in fn_body, (
"submit_shader should accept fulfills_desire_id parameter"
)
# Verify it's passed to the payload
assert 'payload["fulfills_desire_id"]' in fn_body or \
'"fulfills_desire_id"' in fn_body, (
"submit_shader should include fulfills_desire_id in the API payload"
)


@ -2,16 +2,13 @@ FROM node:20-alpine
 WORKDIR /app
-COPY package*.json ./
-RUN npm ci
+COPY package.json ./
+RUN npm install
 COPY . .
-# Build for production (overridden in dev)
-RUN npm run build
-# Serve with a simple static server
-RUN npm install -g serve
-CMD ["serve", "-s", "dist", "-l", "5173"]
 EXPOSE 5173
+# In production: build and serve static files
+# In dev: overridden to `npx vite --host 0.0.0.0`
+CMD ["sh", "-c", "npm run build && npx serve -s dist -l 5173"]


@ -0,0 +1,17 @@
<!DOCTYPE html>
<html lang="en" class="dark">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/fracta.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="Fractafrag — Create, browse, and generate GLSL shaders" />
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=JetBrains+Mono:wght@400;500&display=swap" rel="stylesheet" />
<title>Fractafrag</title>
</head>
<body class="bg-surface-0 text-white antialiased">
<div id="root"></div>
<script type="module" src="/src/main.tsx"></script>
</body>
</html>


@ -5,7 +5,7 @@
   "type": "module",
   "scripts": {
     "dev": "vite",
-    "build": "tsc && vite build",
+    "build": "vite build",
     "preview": "vite preview"
   },
   "dependencies": {


@ -0,0 +1,6 @@
export default {
plugins: {
tailwindcss: {},
autoprefixer: {},
},
};


@ -0,0 +1,10 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
<defs>
<linearGradient id="g" x1="0%" y1="0%" x2="100%" y2="100%">
<stop offset="0%" stop-color="#7a60ff"/>
<stop offset="100%" stop-color="#4d10f0"/>
</linearGradient>
</defs>
<polygon points="50,5 95,27.5 95,72.5 50,95 5,72.5 5,27.5" fill="url(#g)" stroke="#9f94ff" stroke-width="2"/>
<text x="50" y="60" text-anchor="middle" fill="white" font-family="monospace" font-size="28" font-weight="bold">ff</text>
</svg>


@ -0,0 +1,36 @@
import { Routes, Route } from 'react-router-dom';
import Layout from './components/Layout';
import Feed from './pages/Feed';
import Explore from './pages/Explore';
import ShaderDetail from './pages/ShaderDetail';
import Editor from './pages/Editor';
import MyShaders from './pages/MyShaders';
import Generate from './pages/Generate';
import Bounties from './pages/Bounties';
import BountyDetail from './pages/BountyDetail';
import Profile from './pages/Profile';
import Settings from './pages/Settings';
import Login from './pages/Login';
import Register from './pages/Register';
export default function App() {
return (
<Routes>
<Route element={<Layout />}>
<Route path="/" element={<Feed />} />
<Route path="/explore" element={<Explore />} />
<Route path="/shader/:id" element={<ShaderDetail />} />
<Route path="/editor" element={<Editor />} />
<Route path="/editor/:id" element={<Editor />} />
<Route path="/my-shaders" element={<MyShaders />} />
<Route path="/generate" element={<Generate />} />
<Route path="/bounties" element={<Bounties />} />
<Route path="/bounties/:id" element={<BountyDetail />} />
<Route path="/profile/:username" element={<Profile />} />
<Route path="/settings" element={<Settings />} />
</Route>
<Route path="/login" element={<Login />} />
<Route path="/register" element={<Register />} />
</Routes>
);
}

@@ -0,0 +1,13 @@
import { Outlet } from 'react-router-dom';
import Navbar from './Navbar';
export default function Layout() {
return (
<div className="min-h-screen flex flex-col">
<Navbar />
<main className="flex-1">
<Outlet />
</main>
</div>
);
}

@@ -0,0 +1,66 @@
import { Link, useNavigate } from 'react-router-dom';
import { useAuthStore } from '@/stores/auth';
import api from '@/lib/api';
export default function Navbar() {
const { user, isAuthenticated, logout } = useAuthStore();
const navigate = useNavigate();
const handleLogout = async () => {
try {
await api.post('/auth/logout');
} catch {
// Best-effort
}
logout();
navigate('/');
};
return (
<nav className="sticky top-0 z-50 bg-surface-1/80 backdrop-blur-xl border-b border-surface-3">
<div className="max-w-7xl mx-auto px-4 h-14 flex items-center justify-between">
{/* Logo + Nav */}
<div className="flex items-center gap-6">
<Link to="/" className="flex items-center gap-2 text-lg font-bold">
<span className="text-fracta-400"></span>
<span className="bg-gradient-to-r from-fracta-400 to-fracta-600 bg-clip-text text-transparent">
fractafrag
</span>
</Link>
<div className="hidden md:flex items-center gap-1">
<Link to="/" className="btn-ghost text-sm py-1 px-3">Feed</Link>
<Link to="/explore" className="btn-ghost text-sm py-1 px-3">Explore</Link>
<Link to="/editor" className="btn-ghost text-sm py-1 px-3">Editor</Link>
<Link to="/bounties" className="btn-ghost text-sm py-1 px-3">Bounties</Link>
<Link to="/generate" className="btn-ghost text-sm py-1 px-3">Generate</Link>
</div>
</div>
{/* Auth */}
<div className="flex items-center gap-3">
{isAuthenticated() && user ? (
<>
<Link to="/my-shaders" className="btn-ghost text-sm py-1 px-3">My Shaders</Link>
<Link
to={`/profile/${user.username}`}
className="text-sm text-gray-300 hover:text-white transition-colors"
>
{user.username}
</Link>
<Link to="/settings" className="btn-ghost text-sm py-1 px-3">Settings</Link>
<button onClick={handleLogout} className="btn-ghost text-sm py-1 px-3">
Logout
</button>
</>
) : (
<>
<Link to="/login" className="btn-ghost text-sm py-1 px-3">Login</Link>
<Link to="/register" className="btn-primary text-sm py-1 px-3">Sign Up</Link>
</>
)}
</div>
</div>
</nav>
);
}

@@ -0,0 +1,260 @@
/**
* ShaderCanvas: WebGL GLSL renderer with a viewport-aware lifecycle.
*
* Chromium limits WebGL to roughly 16 simultaneous contexts. When a context
* is lost (the browser evicts it silently), it cannot be re-created on the
* same canvas element.
* Solution: when re-entering viewport, if the context is dead, replace the
* canvas DOM element with a fresh one. A new canvas gets a new context.
*/
import { useRef, useEffect, useCallback, useState } from 'react';
interface ShaderCanvasProps {
code: string;
width?: number;
height?: number;
className?: string;
animate?: boolean;
onError?: (error: string) => void;
onCompileSuccess?: () => void;
}
const VERT = `#version 300 es
in vec4 a_position;
void main() { gl_Position = a_position; }`;
function buildFrag(userCode: string): string {
const pfx = `#version 300 es
precision highp float;
uniform float iTime;
uniform vec3 iResolution;
uniform vec4 iMouse;
out vec4 outColor;
`;
if (userCode.includes('mainImage')) {
return pfx + userCode + `
void main() { vec4 c; mainImage(c, gl_FragCoord.xy); outColor = c; }`;
}
return pfx + userCode.replace(/gl_FragColor/g, 'outColor');
}
export default function ShaderCanvas({
code,
width,
height,
className = '',
animate = true,
onError,
onCompileSuccess,
}: ShaderCanvasProps) {
const containerRef = useRef<HTMLDivElement>(null);
const stateRef = useRef({
canvas: null as HTMLCanvasElement | null,
gl: null as WebGL2RenderingContext | null,
prog: null as WebGLProgram | null,
anim: 0,
t0: 0,
visible: false,
running: false,
mouse: [0, 0, 0, 0] as [number, number, number, number],
});
const codeRef = useRef(code);
codeRef.current = code;
// ── Create a fresh canvas element ──────────────────────
const createCanvas = useCallback(() => {
const container = containerRef.current;
if (!container) return null;
const s = stateRef.current;
// Remove old canvas
if (s.canvas && s.canvas.parentNode) {
if (s.anim) { cancelAnimationFrame(s.anim); s.anim = 0; }
s.running = false;
s.prog = null;
s.gl = null;
s.canvas.remove();
}
const canvas = document.createElement('canvas');
canvas.className = 'block w-full h-full';
canvas.width = container.clientWidth || width || 640;
canvas.height = container.clientHeight || height || 360;
canvas.addEventListener('mousemove', (e) => {
const r = canvas.getBoundingClientRect();
s.mouse = [e.clientX - r.left, r.height - (e.clientY - r.top), 0, 0];
});
container.appendChild(canvas);
s.canvas = canvas;
return canvas;
}, [width, height]);
// ── Compile shader ─────────────────────────────────────
const compile = useCallback((canvas: HTMLCanvasElement) => {
const s = stateRef.current;
let gl = s.gl;
if (!gl || gl.isContextLost()) {
gl = canvas.getContext('webgl2', { antialias: false, powerPreference: 'low-power' });
if (!gl) return false;
s.gl = gl;
}
if (s.prog) { gl.deleteProgram(s.prog); s.prog = null; }
try {
const vs = gl.createShader(gl.VERTEX_SHADER)!;
gl.shaderSource(vs, VERT);
gl.compileShader(vs);
if (!gl.getShaderParameter(vs, gl.COMPILE_STATUS))
throw new Error(gl.getShaderInfoLog(vs) || 'VS error');
const fs = gl.createShader(gl.FRAGMENT_SHADER)!;
gl.shaderSource(fs, buildFrag(codeRef.current));
gl.compileShader(fs);
if (!gl.getShaderParameter(fs, gl.COMPILE_STATUS)) {
const e = gl.getShaderInfoLog(fs) || 'FS error';
gl.deleteShader(vs); gl.deleteShader(fs); throw new Error(e);
}
const p = gl.createProgram()!;
gl.attachShader(p, vs); gl.attachShader(p, fs);
gl.linkProgram(p);
gl.deleteShader(vs); gl.deleteShader(fs);
if (!gl.getProgramParameter(p, gl.LINK_STATUS)) {
const e = gl.getProgramInfoLog(p) || 'Link error';
gl.deleteProgram(p); throw new Error(e);
}
s.prog = p;
gl.useProgram(p);
const buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1,-1,1,-1,-1,1,1,1]), gl.STATIC_DRAW);
const loc = gl.getAttribLocation(p, 'a_position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
s.t0 = performance.now();
onCompileSuccess?.();
onError?.('');
return true;
} catch (e: any) {
onError?.(e.message);
return false;
}
}, [onError, onCompileSuccess]);
// ── Animation loop ─────────────────────────────────────
const startLoop = useCallback(() => {
const s = stateRef.current;
if (s.running || !s.gl || !s.prog || !s.canvas) return;
s.running = true;
const gl = s.gl;
const prog = s.prog;
const canvas = s.canvas;
const tick = () => {
if (!s.visible && animate) { s.running = false; s.anim = 0; return; }
if (!s.gl || !s.prog || gl.isContextLost()) { s.running = false; s.anim = 0; return; }
const w = canvas.width, h = canvas.height;
gl.viewport(0, 0, w, h);
const t = (performance.now() - s.t0) / 1000;
const uT = gl.getUniformLocation(prog, 'iTime');
const uR = gl.getUniformLocation(prog, 'iResolution');
const uM = gl.getUniformLocation(prog, 'iMouse');
if (uT) gl.uniform1f(uT, t);
if (uR) gl.uniform3f(uR, w, h, 1);
if (uM) gl.uniform4f(uM, ...s.mouse);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
s.anim = animate ? requestAnimationFrame(tick) : 0;
};
tick();
}, [animate]);
// ── Full setup: new canvas → compile → loop ────────────
const fullSetup = useCallback(() => {
const canvas = createCanvas();
if (!canvas) return;
if (compile(canvas)) {
startLoop();
}
}, [createCanvas, compile, startLoop]);
// ── Visibility observer ────────────────────────────────
useEffect(() => {
const container = containerRef.current;
if (!container) return;
const observer = new IntersectionObserver(
([entry]) => {
const s = stateRef.current;
const was = s.visible;
s.visible = entry.isIntersecting;
if (s.visible && !was) {
// Entering viewport
if (s.gl && !s.gl.isContextLost() && s.prog) {
// Context still alive — just restart loop
startLoop();
} else {
// Context lost or never created — fresh canvas
fullSetup();
}
} else if (!s.visible && was) {
// Leaving viewport — stop loop (context stays on canvas)
if (s.anim) { cancelAnimationFrame(s.anim); s.anim = 0; }
s.running = false;
}
},
{ threshold: 0.01, rootMargin: '200px' },
);
observer.observe(container);
return () => {
observer.disconnect();
const s = stateRef.current;
if (s.anim) cancelAnimationFrame(s.anim);
s.running = false;
};
}, [fullSetup, startLoop]);
// ── Code change (editor) ───────────────────────────────
useEffect(() => {
const s = stateRef.current;
if (s.visible && s.canvas) {
if (s.anim) { cancelAnimationFrame(s.anim); s.anim = 0; }
s.running = false;
if (compile(s.canvas)) startLoop();
}
}, [code]); // eslint-disable-line react-hooks/exhaustive-deps
// ── Container resize → canvas resize ───────────────────
useEffect(() => {
const container = containerRef.current;
if (!container) return;
const ro = new ResizeObserver(([e]) => {
const s = stateRef.current;
if (!s.canvas) return;
const w = Math.floor(e.contentRect.width);
const h = Math.floor(e.contentRect.height);
if (w > 0 && h > 0) { s.canvas.width = w; s.canvas.height = h; }
});
ro.observe(container);
return () => ro.disconnect();
}, []);
return (
<div
ref={containerRef}
className={className}
style={{ width: width ? `${width}px` : '100%', height: height ? `${height}px` : '100%' }}
/>
);
}
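The `buildFrag` helper in ShaderCanvas supports both entry-point conventions: Shadertoy-style `mainImage` code gets a synthesized `main()`, and legacy WebGL1-style code has `gl_FragColor` renamed for GLSL ES 3.00. A standalone sketch of that logic (the `wrapFragment` name here is illustrative, not an export of the component):

```typescript
// Preamble mirroring the uniforms ShaderCanvas injects.
const PREFIX = `#version 300 es
precision highp float;
uniform float iTime;
uniform vec3 iResolution;
uniform vec4 iMouse;
out vec4 outColor;
`;

function wrapFragment(userCode: string): string {
  if (userCode.includes('mainImage')) {
    // Shadertoy-style entry point: synthesize a main() that calls it.
    return PREFIX + userCode + `
void main() { vec4 c; mainImage(c, gl_FragCoord.xy); outColor = c; }`;
  }
  // Code with its own main(): rename gl_FragColor for GLSL ES 3.00.
  return PREFIX + userCode.replace(/gl_FragColor/g, 'outColor');
}
```

Note the `includes('mainImage')` test is a heuristic: a shader that merely mentions `mainImage` in a comment would also take the first branch.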

@@ -0,0 +1,62 @@
@tailwind base;
@tailwind components;
@tailwind utilities;
@layer base {
:root {
color-scheme: dark;
}
body {
@apply bg-surface-0 text-gray-100;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
/* Custom scrollbar */
::-webkit-scrollbar {
width: 8px;
height: 8px;
}
::-webkit-scrollbar-track {
@apply bg-surface-1;
}
::-webkit-scrollbar-thumb {
@apply bg-surface-4 rounded-full;
}
::-webkit-scrollbar-thumb:hover {
@apply bg-fracta-600;
}
}
@layer components {
.btn {
@apply inline-flex items-center justify-center gap-2 px-4 py-2
font-medium rounded-lg transition-all duration-150
focus:outline-none focus:ring-2 focus:ring-fracta-500/50
disabled:opacity-50 disabled:cursor-not-allowed;
}
.btn-primary {
@apply btn bg-fracta-600 hover:bg-fracta-500 text-white;
}
.btn-secondary {
@apply btn bg-surface-3 hover:bg-surface-4 text-gray-200;
}
.btn-ghost {
@apply btn bg-transparent hover:bg-surface-3 text-gray-300;
}
.btn-danger {
@apply btn bg-red-600/20 hover:bg-red-600/30 text-red-400 border border-red-600/30;
}
.input {
@apply w-full px-3 py-2 bg-surface-2 border border-surface-4
rounded-lg text-gray-100 placeholder-gray-500
focus:outline-none focus:border-fracta-500 focus:ring-1 focus:ring-fracta-500/30
transition-colors;
}
.card {
@apply bg-surface-1 border border-surface-3 rounded-xl overflow-hidden;
}
}

@@ -0,0 +1,54 @@
/**
* API client: Axios instance with JWT auth and automatic token refresh.
*/
import axios from 'axios';
import { useAuthStore } from '@/stores/auth';
const API_BASE = import.meta.env.VITE_API_URL || '/api';
const api = axios.create({
baseURL: `${API_BASE}/v1`,
headers: { 'Content-Type': 'application/json' },
withCredentials: true, // Send refresh token cookie
});
// Request interceptor: attach access token
api.interceptors.request.use((config) => {
const token = useAuthStore.getState().accessToken;
if (token) {
config.headers.Authorization = `Bearer ${token}`;
}
return config;
});
// Response interceptor: auto-refresh on 401
api.interceptors.response.use(
(response) => response,
async (error) => {
const original = error.config;
if (error.response?.status === 401 && !original._retry) {
original._retry = true;
try {
const { data } = await axios.post(
`${API_BASE}/v1/auth/refresh`,
{},
{ withCredentials: true },
);
useAuthStore.getState().setAccessToken(data.access_token);
original.headers.Authorization = `Bearer ${data.access_token}`;
return api(original);
} catch {
useAuthStore.getState().logout();
window.location.href = '/login';
return Promise.reject(error);
}
}
return Promise.reject(error);
},
);
export default api;
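The response interceptor above retries exactly once per request, guarded by the `_retry` flag it stamps on the request config; a second 401 on the replayed request falls through to logout instead of looping. That decision can be isolated as a pure function (a sketch, not part of the client's API):

```typescript
// Minimal model of the single-retry guard in the 401 interceptor.
interface RetryableConfig {
  _retry?: boolean;
}

function shouldAttemptRefresh(
  status: number | undefined,
  config: RetryableConfig,
): boolean {
  if (status !== 401 || config._retry) return false;
  config._retry = true; // mark before replaying, so we never retry twice
  return true;
}
```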

@@ -0,0 +1,26 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import { BrowserRouter } from 'react-router-dom';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
import App from './App';
import './index.css';
const queryClient = new QueryClient({
defaultOptions: {
queries: {
staleTime: 30_000,
retry: 1,
refetchOnWindowFocus: false,
},
},
});
ReactDOM.createRoot(document.getElementById('root')!).render(
<React.StrictMode>
<QueryClientProvider client={queryClient}>
<BrowserRouter>
<App />
</BrowserRouter>
</QueryClientProvider>
</React.StrictMode>,
);

@@ -0,0 +1,80 @@
/**
* Bounties page: browse the open desire queue.
*/
import { useQuery } from '@tanstack/react-query';
import { Link } from 'react-router-dom';
import api from '@/lib/api';
export default function Bounties() {
const { data: desires = [], isLoading } = useQuery({
queryKey: ['desires'],
queryFn: async () => {
const { data } = await api.get('/desires', { params: { limit: 30 } });
return data;
},
});
return (
<div className="max-w-4xl mx-auto px-4 py-6">
<div className="flex items-center justify-between mb-6">
<div>
<h1 className="text-xl font-semibold">Desire Queue</h1>
<p className="text-gray-500 text-sm mt-1">
What the community wants to see. Fulfill a desire to earn tips.
</p>
</div>
</div>
{isLoading ? (
<div className="space-y-3">
{Array.from({ length: 5 }).map((_, i) => (
<div key={i} className="card p-4 animate-pulse">
<div className="h-4 bg-surface-3 rounded w-3/4" />
<div className="h-3 bg-surface-3 rounded w-1/4 mt-2" />
</div>
))}
</div>
) : desires.length > 0 ? (
<div className="space-y-3">
{desires.map((desire: any) => (
<Link key={desire.id} to={`/bounties/${desire.id}`} className="card p-4 block hover:border-fracta-600/30 transition-colors">
<div className="flex items-start justify-between">
<div className="flex-1">
<p className="text-gray-100 font-medium">{desire.prompt_text}</p>
<div className="flex items-center gap-3 mt-2 text-xs text-gray-500">
<span className="flex items-center gap-1">
🔥 Heat: {desire.heat_score.toFixed(1)}
</span>
{desire.cluster_count > 1 && (
<span className="text-purple-400">
👥 {desire.cluster_count} similar
</span>
)}
{desire.tip_amount_cents > 0 && (
<span className="text-green-400">
💰 ${(desire.tip_amount_cents / 100).toFixed(2)} tip
</span>
)}
<span>{new Date(desire.created_at).toLocaleDateString()}</span>
</div>
</div>
<span className={`text-xs px-2 py-1 rounded-full ${
desire.status === 'open' ? 'bg-green-600/20 text-green-400' :
desire.status === 'fulfilled' ? 'bg-blue-600/20 text-blue-400' :
'bg-gray-600/20 text-gray-400'
}`}>
{desire.status}
</span>
</div>
</Link>
))}
</div>
) : (
<div className="text-center py-20 text-gray-500">
No open desires yet. The queue is empty.
</div>
)}
</div>
);
}

@@ -0,0 +1,108 @@
/**
* Bounty detail page: a single desire with a fulfillment option.
*/
import { useParams, Link } from 'react-router-dom';
import { useQuery } from '@tanstack/react-query';
import api from '@/lib/api';
export default function BountyDetail() {
const { id } = useParams<{ id: string }>();
const { data: desire, isLoading } = useQuery({
queryKey: ['desire', id],
queryFn: async () => {
const { data } = await api.get(`/desires/${id}`);
return data;
},
enabled: !!id,
});
if (isLoading) {
return (
<div className="max-w-2xl mx-auto px-4 py-10">
<div className="card p-6 animate-pulse space-y-4">
<div className="h-6 bg-surface-3 rounded w-3/4" />
<div className="h-4 bg-surface-3 rounded w-1/2" />
</div>
</div>
);
}
if (!desire) {
return (
<div className="max-w-2xl mx-auto px-4 py-10 text-center text-red-400">
Desire not found
</div>
);
}
return (
<div className="max-w-2xl mx-auto px-4 py-6">
<Link to="/bounties" className="text-sm text-gray-500 hover:text-gray-300 mb-4 inline-block">
Back to Bounties
</Link>
<div className="card p-6">
<div className="flex items-start justify-between">
<div>
<h1 className="text-xl font-bold">{desire.prompt_text}</h1>
<div className="flex items-center gap-3 mt-3 text-sm text-gray-500">
<span>🔥 Heat: {desire.heat_score.toFixed(1)}</span>
{desire.cluster_count > 1 && (
<span className="text-purple-400">
👥 {desire.cluster_count} similar
</span>
)}
{desire.tip_amount_cents > 0 && (
<span className="text-green-400">
💰 ${(desire.tip_amount_cents / 100).toFixed(2)} bounty
</span>
)}
<span>{new Date(desire.created_at).toLocaleDateString()}</span>
</div>
</div>
<span className={`text-sm px-3 py-1 rounded-full ${
desire.status === 'open' ? 'bg-green-600/20 text-green-400' :
desire.status === 'fulfilled' ? 'bg-blue-600/20 text-blue-400' :
'bg-gray-600/20 text-gray-400'
}`}>
{desire.status}
</span>
</div>
{desire.style_hints && (
<div className="mt-4 p-3 bg-surface-2 rounded-lg">
<h3 className="text-sm font-medium text-gray-400 mb-2">Style hints</h3>
<pre className="text-xs text-gray-500 font-mono">
{JSON.stringify(desire.style_hints, null, 2)}
</pre>
</div>
)}
{desire.status === 'open' && (
<div className="mt-6 pt-4 border-t border-surface-3">
<Link to={`/editor?fulfill=${desire.id}`} className="btn-primary">
Fulfill this Desire
</Link>
<p className="text-xs text-gray-500 mt-2">
Write a shader that matches this description, then submit it as fulfillment.
</p>
</div>
)}
{desire.fulfilled_by_shader && (
<div className="mt-6 pt-4 border-t border-surface-3">
<h3 className="text-sm font-medium text-gray-400 mb-2">Fulfilled by</h3>
<Link
to={`/shader/${desire.fulfilled_by_shader}`}
className="text-fracta-400 hover:text-fracta-300"
>
View shader
</Link>
</div>
)}
</div>
</div>
);
}

@@ -0,0 +1,359 @@
/**
* Editor page: GLSL editor with live WebGL preview.
*
* Features:
* - Resizable split pane with drag handle
* - Save as draft or publish
* - Version history (when editing existing shader)
* - Live preview with 400ms debounce
*/
import { useState, useEffect, useCallback, useRef } from 'react';
import { useParams, useNavigate, useSearchParams } from 'react-router-dom';
import { useQuery } from '@tanstack/react-query';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
import ShaderCanvas from '@/components/ShaderCanvas';
const DEFAULT_SHADER = `// Fractafrag — write your shader here
// Shadertoy-compatible: mainImage(out vec4 fragColor, in vec2 fragCoord)
// Available uniforms: iTime, iResolution, iMouse
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
vec2 uv = fragCoord / iResolution.xy;
float t = iTime;
// Gradient with time-based animation
vec3 col = 0.5 + 0.5 * cos(t + uv.xyx + vec3(0, 2, 4));
// Add some structure
float d = length(uv - 0.5);
col *= 1.0 - smoothstep(0.0, 0.5, d);
col += 0.05;
fragColor = vec4(col, 1.0);
}
`;
export default function Editor() {
const { id } = useParams<{ id: string }>();
const navigate = useNavigate();
const [searchParams] = useSearchParams();
const { isAuthenticated, user } = useAuthStore();
// Fulfillment context — read once from URL, persist in ref so it survives navigation
const fulfillId = searchParams.get('fulfill');
const fulfillDesireId = useRef(fulfillId);
const [code, setCode] = useState(DEFAULT_SHADER);
const [liveCode, setLiveCode] = useState(DEFAULT_SHADER);
const [title, setTitle] = useState('Untitled Shader');
const [description, setDescription] = useState('');
const [tags, setTags] = useState('');
const [shaderType, setShaderType] = useState('2d');
const [compileError, setCompileError] = useState('');
const [submitting, setSubmitting] = useState(false);
const [submitError, setSubmitError] = useState('');
const [showMeta, setShowMeta] = useState(false);
const [savedStatus, setSavedStatus] = useState<string | null>(null);
const [editingExisting, setEditingExisting] = useState(false);
// Resizable pane state
const [editorWidth, setEditorWidth] = useState(50); // percentage
const isDragging = useRef(false);
const containerRef = useRef<HTMLDivElement>(null);
const debounceRef = useRef<ReturnType<typeof setTimeout>>();
// Load existing shader for editing or forking
const { data: existingShader } = useQuery({
queryKey: ['shader', id],
queryFn: async () => {
const { data } = await api.get(`/shaders/${id}`);
return data;
},
enabled: !!id,
});
// Fetch desire context when fulfilling
const { data: fulfillDesire } = useQuery({
queryKey: ['desire', fulfillDesireId.current],
queryFn: async () => {
const { data } = await api.get(`/desires/${fulfillDesireId.current}`);
return data;
},
enabled: !!fulfillDesireId.current,
});
useEffect(() => {
if (existingShader) {
setCode(existingShader.glsl_code);
setLiveCode(existingShader.glsl_code);
setTitle(existingShader.title);
setDescription(existingShader.description || '');
setShaderType(existingShader.shader_type);
setTags(existingShader.tags?.join(', ') || '');
// If we own it, we're editing; otherwise forking
if (user && existingShader.author_id === user.id) {
setEditingExisting(true);
} else {
setTitle(`Fork of ${existingShader.title}`);
setEditingExisting(false);
}
}
}, [existingShader, user]);
// ── Drag handle for resizable pane ──────────────────────
const handleMouseDown = useCallback((e: React.MouseEvent) => {
e.preventDefault();
isDragging.current = true;
document.body.style.cursor = 'col-resize';
document.body.style.userSelect = 'none';
}, []);
useEffect(() => {
const handleMouseMove = (e: MouseEvent) => {
if (!isDragging.current || !containerRef.current) return;
const rect = containerRef.current.getBoundingClientRect();
const pct = ((e.clientX - rect.left) / rect.width) * 100;
setEditorWidth(Math.max(20, Math.min(80, pct)));
};
const handleMouseUp = () => {
isDragging.current = false;
document.body.style.cursor = '';
document.body.style.userSelect = '';
};
document.addEventListener('mousemove', handleMouseMove);
document.addEventListener('mouseup', handleMouseUp);
return () => {
document.removeEventListener('mousemove', handleMouseMove);
document.removeEventListener('mouseup', handleMouseUp);
};
}, []);
// ── Debounced live preview ──────────────────────────────
const handleCodeChange = useCallback((value: string) => {
setCode(value);
setSavedStatus(null);
if (debounceRef.current) clearTimeout(debounceRef.current);
debounceRef.current = setTimeout(() => {
setLiveCode(value);
}, 400);
}, []);
// ── Save / Publish ─────────────────────────────────────
const handleSave = async (publishStatus: 'draft' | 'published') => {
if (!isAuthenticated()) {
navigate('/login');
return;
}
setSubmitting(true);
setSubmitError('');
const payload = {
title,
description,
glsl_code: code,
tags: tags.split(',').map(t => t.trim()).filter(Boolean),
shader_type: shaderType,
status: publishStatus,
is_public: publishStatus === 'published',
};
try {
if (editingExisting && id) {
// Update existing shader
const { data } = await api.put(`/shaders/${id}`, {
...payload,
change_note: publishStatus === 'published' ? 'Updated' : undefined,
});
setSavedStatus(publishStatus === 'draft' ? 'Draft saved' : 'Published');
if (publishStatus === 'published') {
setTimeout(() => navigate(`/shader/${data.id}`), 800);
}
} else {
// Create new shader
const { data } = await api.post('/shaders', {
...payload,
fulfills_desire_id: fulfillDesireId.current || undefined,
});
if (publishStatus === 'published') {
navigate(`/shader/${data.id}`);
} else {
// Redirect to editor with the new ID so subsequent saves are updates
setSavedStatus('Draft saved');
navigate(`/editor/${data.id}`, { replace: true });
}
}
} catch (err: any) {
const detail = err.response?.data?.detail;
if (typeof detail === 'object' && detail.errors) {
setSubmitError(detail.errors.join('\n'));
} else {
setSubmitError(detail || 'Save failed');
}
} finally {
setSubmitting(false);
}
};
return (
<div className="h-[calc(100vh-3.5rem)] flex flex-col">
{/* Toolbar */}
<div className="flex items-center justify-between px-4 py-2 bg-surface-1 border-b border-surface-3">
<div className="flex items-center gap-3">
<input
type="text"
value={title}
onChange={(e) => setTitle(e.target.value)}
className="bg-transparent text-lg font-medium text-gray-100 focus:outline-none
border-b border-transparent focus:border-fracta-500 transition-colors w-64"
placeholder="Shader title..."
/>
<button
onClick={() => setShowMeta(!showMeta)}
className="btn-ghost text-xs py-1 px-2"
>
{showMeta ? 'Hide details' : 'Details'}
</button>
{editingExisting && existingShader && (
<span className="text-xs text-gray-500">
v{existingShader.current_version}
</span>
)}
</div>
<div className="flex items-center gap-2">
{compileError && (
<span className="text-xs text-red-400 max-w-xs truncate" title={compileError}>
{compileError.split('\n')[0]}
</span>
)}
{savedStatus && (
<span className="text-xs text-green-400 animate-fade-in">{savedStatus}</span>
)}
<button
onClick={() => handleSave('draft')}
disabled={submitting}
className="btn-secondary text-sm py-1.5"
>
{submitting ? '...' : 'Save Draft'}
</button>
<button
onClick={() => handleSave('published')}
disabled={submitting || !!compileError}
className="btn-primary text-sm py-1.5"
>
{submitting ? 'Publishing...' : 'Publish'}
</button>
</div>
</div>
{/* Metadata panel */}
{showMeta && (
<div className="px-4 py-3 bg-surface-1 border-b border-surface-3 flex gap-4 items-end animate-slide-up">
<div className="flex-1">
<label className="text-xs text-gray-500">Description</label>
<input
type="text"
value={description}
onChange={(e) => setDescription(e.target.value)}
className="input text-sm mt-1"
placeholder="What does this shader do?"
/>
</div>
<div className="w-48">
<label className="text-xs text-gray-500">Tags (comma-separated)</label>
<input
type="text"
value={tags}
onChange={(e) => setTags(e.target.value)}
className="input text-sm mt-1"
placeholder="fractal, noise, 3d"
/>
</div>
<div className="w-32">
<label className="text-xs text-gray-500">Type</label>
<select
value={shaderType}
onChange={(e) => setShaderType(e.target.value)}
className="input text-sm mt-1"
>
<option value="2d">2D</option>
<option value="3d">3D</option>
<option value="audio-reactive">Audio</option>
</select>
</div>
</div>
)}
{/* Submit error */}
{submitError && (
<div className="px-4 py-2 bg-red-600/10 text-red-400 text-sm border-b border-red-600/20">
{submitError}
</div>
)}
{/* Desire fulfillment context banner */}
{fulfillDesire && (
<div className="px-4 py-3 bg-amber-600/10 border-b border-amber-600/20 flex items-center gap-3">
<span className="text-amber-400 text-sm font-medium">🎯 Fulfilling desire:</span>
<span className="text-gray-300 text-sm flex-1">{fulfillDesire.prompt_text}</span>
{fulfillDesire.style_hints && (
<span className="text-xs text-gray-500">Style hints available</span>
)}
</div>
)}
{/* Split pane: editor + drag handle + preview */}
<div ref={containerRef} className="flex-1 flex min-h-0">
{/* Code editor */}
<div className="flex flex-col" style={{ width: `${editorWidth}%` }}>
<div className="px-3 py-1.5 bg-surface-2 text-xs text-gray-500 border-b border-surface-3 flex items-center gap-2">
<span className="w-2 h-2 rounded-full bg-green-500" />
fragment.glsl
</div>
<textarea
value={code}
onChange={(e) => handleCodeChange(e.target.value)}
className="flex-1 bg-surface-0 text-gray-200 font-mono text-sm p-4
resize-none focus:outline-none leading-relaxed
selection:bg-fracta-600/30"
spellCheck={false}
autoCapitalize="off"
autoCorrect="off"
/>
</div>
{/* Drag handle */}
<div
onMouseDown={handleMouseDown}
className="w-1.5 bg-surface-3 hover:bg-fracta-600 cursor-col-resize
transition-colors flex-shrink-0 relative group"
>
<div className="absolute inset-y-0 -left-1 -right-1" /> {/* Wider hit area */}
<div className="absolute top-1/2 left-1/2 -translate-x-1/2 -translate-y-1/2
w-1 h-8 bg-gray-600 group-hover:bg-fracta-400 rounded-full transition-colors" />
</div>
{/* Live preview */}
<div className="flex-1 bg-black relative min-w-0">
<ShaderCanvas
code={liveCode}
className="w-full h-full"
animate={true}
onError={(err) => setCompileError(err)}
onCompileSuccess={() => setCompileError('')}
/>
{!liveCode.trim() && (
<div className="absolute inset-0 flex items-center justify-center text-gray-600">
Write some GLSL to see it rendered live
</div>
)}
</div>
</div>
</div>
);
}
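The save payload above normalizes the free-text tags field with `tags.split(',').map(t => t.trim()).filter(Boolean)`. Extracted as a helper (hypothetical name), that one-liner handles extra whitespace, trailing commas, and empty entries:

```typescript
// Sketch of the tag normalization used when building the save payload:
// split on commas, trim each entry, drop empties from double/trailing commas.
function parseTags(raw: string): string[] {
  return raw
    .split(',')
    .map((t) => t.trim())
    .filter(Boolean);
}
```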

@@ -0,0 +1,127 @@
/**
* Explore page: browse shaders by tag, trending, new, and top.
*/
import { useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { Link, useSearchParams } from 'react-router-dom';
import api from '@/lib/api';
import ShaderCanvas from '@/components/ShaderCanvas';
type SortOption = 'trending' | 'new' | 'top';
export default function Explore() {
const [searchParams, setSearchParams] = useSearchParams();
const [sort, setSort] = useState<SortOption>((searchParams.get('sort') as SortOption) || 'trending');
const [query, setQuery] = useState(searchParams.get('q') || '');
const tagFilter = searchParams.get('tags')?.split(',').filter(Boolean) || [];
const { data: shaders = [], isLoading } = useQuery({
queryKey: ['explore', sort, query, tagFilter.join(',')],
queryFn: async () => {
const params: any = { sort, limit: 30 };
if (query) params.q = query;
if (tagFilter.length) params.tags = tagFilter;
const { data } = await api.get('/shaders', { params });
return data;
},
});
const handleSearch = (e: React.FormEvent) => {
e.preventDefault();
setSearchParams({ sort, q: query });
};
return (
<div className="max-w-7xl mx-auto px-4 py-6">
<div className="flex items-center justify-between mb-6">
<h1 className="text-xl font-semibold">Explore</h1>
{/* Sort tabs */}
<div className="flex gap-1 bg-surface-2 rounded-lg p-1">
{(['trending', 'new', 'top'] as SortOption[]).map((s) => (
<button
key={s}
onClick={() => setSort(s)}
className={`px-3 py-1 text-sm rounded-md transition-colors ${
sort === s ? 'bg-fracta-600 text-white' : 'text-gray-400 hover:text-gray-200'
}`}
>
{s.charAt(0).toUpperCase() + s.slice(1)}
</button>
))}
</div>
</div>
{/* Search */}
<form onSubmit={handleSearch} className="mb-6">
<input
type="text"
value={query}
onChange={(e) => setQuery(e.target.value)}
className="input max-w-md"
placeholder="Search shaders..."
/>
</form>
{/* Tag filter pills */}
{tagFilter.length > 0 && (
<div className="flex gap-2 mb-4">
{tagFilter.map((tag) => (
<span key={tag} className="text-xs px-2 py-1 bg-fracta-600/20 text-fracta-400 rounded-full flex items-center gap-1">
#{tag}
<button
onClick={() => {
const newTags = tagFilter.filter(t => t !== tag);
setSearchParams(newTags.length ? { tags: newTags.join(',') } : {});
}}
className="hover:text-white"
>
×
</button>
</span>
))}
</div>
)}
{/* Grid */}
{isLoading ? (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
{Array.from({ length: 8 }).map((_, i) => (
<div key={i} className="card animate-pulse">
<div className="aspect-video bg-surface-3" />
<div className="p-3 space-y-2">
<div className="h-4 bg-surface-3 rounded w-3/4" />
<div className="h-3 bg-surface-3 rounded w-1/2" />
</div>
</div>
))}
</div>
) : shaders.length > 0 ? (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
{shaders.map((shader: any) => (
<Link key={shader.id} to={`/shader/${shader.id}`} className="card group">
<div className="aspect-video bg-surface-2 overflow-hidden">
<ShaderCanvas code={shader.glsl_code} className="w-full h-full" animate={true} />
</div>
<div className="p-3">
<h3 className="font-medium text-gray-100 group-hover:text-fracta-400 transition-colors truncate">
{shader.title}
</h3>
<div className="flex items-center gap-2 mt-1 text-xs text-gray-500">
<span>{shader.shader_type}</span>
<span>·</span>
<span>{shader.view_count} views</span>
</div>
</div>
</Link>
))}
</div>
) : (
<div className="text-center py-20 text-gray-500">
No shaders found. Try a different search or sort.
</div>
)}
</div>
);
}
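The tag-pill removal handler above rebuilds the `tags` search param from the remaining tags and drops the param entirely when none remain. A standalone sketch of that round-trip (the function name is illustrative, not from the codebase):

```typescript
// Tag filters are carried in the URL as a comma-separated `tags` param.
// Removing one pill re-serializes the remaining tags, or clears the param.
function paramsAfterRemovingTag(tags: string[], tag: string): Record<string, string> {
  const remaining = tags.filter((t) => t !== tag);
  return remaining.length ? { tags: remaining.join(',') } : {};
}
```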

@@ -0,0 +1,203 @@
/**
* Feed page: infinite scroll of live-rendered shaders.
* Dwell time tracking via IntersectionObserver.
*/
import { useInfiniteQuery } from '@tanstack/react-query';
import { useRef, useEffect, useCallback } from 'react';
import { Link } from 'react-router-dom';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
import ShaderCanvas from '@/components/ShaderCanvas';
interface Shader {
id: string;
title: string;
author_id: string | null;
glsl_code: string;
thumbnail_url: string | null;
tags: string[];
shader_type: string;
score: number;
view_count: number;
is_ai_generated: boolean;
style_metadata: any;
created_at: string;
}
function FeedCard({ shader }: { shader: Shader }) {
const cardRef = useRef<HTMLDivElement>(null);
const startTimeRef = useRef<number | null>(null);
// Dwell time tracking
useEffect(() => {
const el = cardRef.current;
if (!el) return;
const observer = new IntersectionObserver(
(entries) => {
for (const entry of entries) {
if (entry.isIntersecting) {
startTimeRef.current = Date.now();
} else if (startTimeRef.current) {
const dwell = (Date.now() - startTimeRef.current) / 1000;
if (dwell > 1) {
// Fire-and-forget dwell report
api.post('/feed/dwell', {
shader_id: shader.id,
dwell_secs: dwell,
replayed: false,
}).catch(() => {}); // best effort
}
startTimeRef.current = null;
}
}
},
{ threshold: 0.5 },
);
observer.observe(el);
return () => observer.disconnect();
}, [shader.id]);
return (
<div ref={cardRef} className="card group animate-fade-in">
<Link to={`/shader/${shader.id}`} className="block">
<div className="aspect-video bg-surface-2 relative overflow-hidden">
<ShaderCanvas
code={shader.glsl_code}
className="w-full h-full"
animate={true}
/>
{shader.is_ai_generated && (
<span className="absolute top-2 right-2 px-2 py-0.5 bg-fracta-600/80 text-xs rounded-full">
AI
</span>
)}
</div>
</Link>
<div className="p-3">
<Link to={`/shader/${shader.id}`}>
<h3 className="font-medium text-gray-100 group-hover:text-fracta-400 transition-colors truncate">
{shader.title}
</h3>
</Link>
<div className="flex items-center justify-between mt-1">
<div className="flex items-center gap-2 text-xs text-gray-500">
<span>{shader.shader_type}</span>
<span>·</span>
<span>{shader.view_count} views</span>
</div>
<div className="flex gap-1">
{shader.tags.slice(0, 3).map((tag) => (
<span key={tag} className="text-xs px-1.5 py-0.5 bg-surface-3 rounded text-gray-400">
{tag}
</span>
))}
</div>
</div>
</div>
</div>
);
}
export default function Feed() {
const sentinelRef = useRef<HTMLDivElement>(null);
const {
data,
fetchNextPage,
hasNextPage,
isFetchingNextPage,
isLoading,
error,
} = useInfiniteQuery({
queryKey: ['feed'],
queryFn: async ({ pageParam = 0 }) => {
const { data } = await api.get('/feed', { params: { offset: pageParam, limit: 20 } });
return data;
},
getNextPageParam: (lastPage, allPages) => {
if (lastPage.length < 20) return undefined;
return allPages.flat().length;
},
initialPageParam: 0,
});
// Infinite scroll trigger
useEffect(() => {
const sentinel = sentinelRef.current;
if (!sentinel) return;
const observer = new IntersectionObserver(
(entries) => {
if (entries[0]?.isIntersecting && hasNextPage && !isFetchingNextPage) {
fetchNextPage();
}
},
{ rootMargin: '200px' },
);
observer.observe(sentinel);
return () => observer.disconnect();
}, [hasNextPage, isFetchingNextPage, fetchNextPage]);
const shaders = data?.pages.flat() ?? [];
return (
<div className="max-w-7xl mx-auto px-4 py-6">
<div className="flex items-center justify-between mb-6">
<h1 className="text-xl font-semibold">Your Feed</h1>
<Link to="/editor" className="btn-primary text-sm">
+ New Shader
</Link>
</div>
{isLoading && (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{Array.from({ length: 6 }).map((_, i) => (
<div key={i} className="card animate-pulse">
<div className="aspect-video bg-surface-3" />
<div className="p-3 space-y-2">
<div className="h-4 bg-surface-3 rounded w-3/4" />
<div className="h-3 bg-surface-3 rounded w-1/2" />
</div>
</div>
))}
</div>
)}
{error && (
<div className="p-4 bg-red-600/10 border border-red-600/20 rounded-lg text-red-400">
Failed to load feed. Please try again.
</div>
)}
{shaders.length > 0 && (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{shaders.map((shader: Shader) => (
<FeedCard key={shader.id} shader={shader} />
))}
</div>
)}
{shaders.length === 0 && !isLoading && (
<div className="text-center py-20">
<p className="text-gray-400 text-lg">No shaders yet</p>
<p className="text-gray-500 mt-2">Be the first to create one</p>
<Link to="/editor" className="btn-primary mt-4 inline-flex">
Open Editor
</Link>
</div>
)}
{/* Infinite scroll sentinel */}
<div ref={sentinelRef} className="h-10" />
{isFetchingNextPage && (
<div className="text-center py-4 text-gray-500">Loading more...</div>
)}
</div>
);
}
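The feed's `getNextPageParam` implements offset-based paging: the next offset is simply the total number of items fetched so far, and a page shorter than the limit signals the end of the feed. Extracted as a plain function (no react-query) for clarity:

```typescript
// Offset-based paging, mirroring the feed's useInfiniteQuery config:
// next offset = flattened item count so far; a short page ends the feed.
const PAGE_LIMIT = 20;

function nextOffset<T>(lastPage: T[], allPages: T[][]): number | undefined {
  if (lastPage.length < PAGE_LIMIT) return undefined; // short page: no more data
  return allPages.flat().length; // total fetched doubles as the next offset
}
```

Because the offset is derived from what has actually been fetched, the client stays correct even if the server occasionally returns fewer than `limit` items mid-stream only on the final page.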

@@ -0,0 +1,106 @@
/**
* AI Generation page: prompt-to-shader interface.
* Stub for M5: shows the UI in a "coming soon" state.
*/
import { useState } from 'react';
import { useAuthStore } from '@/stores/auth';
import { Link } from 'react-router-dom';
export default function Generate() {
const { isAuthenticated, user } = useAuthStore();
const [prompt, setPrompt] = useState('');
return (
<div className="max-w-3xl mx-auto px-4 py-6">
<div className="text-center mb-8">
<h1 className="text-2xl font-bold">AI Shader Generator</h1>
<p className="text-gray-400 mt-2">
Describe what you want to see and let AI write the shader for you.
</p>
</div>
<div className="card p-6">
{/* Prompt input */}
<div className="mb-6">
<label className="block text-sm text-gray-400 mb-2">What do you want to see?</label>
<textarea
value={prompt}
onChange={(e) => setPrompt(e.target.value)}
className="input min-h-[100px] resize-y font-normal"
placeholder="A flowing aurora borealis with deep purples and greens, slowly morphing..."
/>
</div>
{/* Style controls */}
<div className="grid grid-cols-3 gap-4 mb-6">
<div>
<label className="block text-xs text-gray-500 mb-1">Chaos Level</label>
<input type="range" min="0" max="100" defaultValue="50"
className="w-full accent-fracta-500" />
</div>
<div>
<label className="block text-xs text-gray-500 mb-1">Color Temperature</label>
<select className="input text-sm">
<option>Warm</option>
<option>Cool</option>
<option>Neutral</option>
<option>Monochrome</option>
</select>
</div>
<div>
<label className="block text-xs text-gray-500 mb-1">Motion Type</label>
<select className="input text-sm">
<option>Fluid</option>
<option>Geometric</option>
<option>Pulsing</option>
<option>Static</option>
</select>
</div>
</div>
{/* Generate button / status */}
<div className="text-center">
<button
disabled
className="btn-primary opacity-60 cursor-not-allowed px-8 py-3 text-lg"
>
Generate Shader
</button>
<p className="text-sm text-gray-500 mt-3">
AI generation is coming in M5. For now, use the{' '}
<Link to="/editor" className="text-fracta-400 hover:text-fracta-300">editor</Link>{' '}
to write shaders manually.
</p>
{isAuthenticated() && user && (
<p className="text-xs text-gray-600 mt-2">
Credits remaining: {user.ai_credits_remaining}
</p>
)}
</div>
</div>
{/* Teaser examples */}
<div className="mt-8">
<h2 className="text-sm font-medium text-gray-400 mb-3">Example prompts (coming soon)</h2>
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3">
{[
"Ragdoll physics but dark and slow",
"Underwater caustics with bioluminescent particles",
"Infinite fractal zoom through a crystal cathedral",
"VHS glitch art with neon pink scanlines",
].map((example) => (
<button
key={example}
onClick={() => setPrompt(example)}
className="text-left p-3 bg-surface-2 hover:bg-surface-3 rounded-lg text-sm text-gray-400 transition-colors"
>
"{example}"
</button>
))}
</div>
</div>
</div>
);
}

@@ -0,0 +1,97 @@
import { useState } from 'react';
import { Link, useNavigate } from 'react-router-dom';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
export default function Login() {
const [email, setEmail] = useState('');
const [password, setPassword] = useState('');
const [error, setError] = useState('');
const [loading, setLoading] = useState(false);
const navigate = useNavigate();
const { login } = useAuthStore();
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
setError('');
setLoading(true);
try {
const { data } = await api.post('/auth/login', {
email,
password,
turnstile_token: 'dev-bypass', // TODO: Turnstile widget
});
// Fetch user profile
const profileResp = await api.get('/me', {
headers: { Authorization: `Bearer ${data.access_token}` },
});
login(data.access_token, profileResp.data);
navigate('/');
} catch (err: any) {
setError(err.response?.data?.detail || 'Login failed');
} finally {
setLoading(false);
}
};
return (
<div className="min-h-screen bg-surface-0 flex items-center justify-center px-4">
<div className="w-full max-w-sm">
<div className="text-center mb-8">
<Link to="/" className="inline-block">
<h1 className="text-2xl font-bold bg-gradient-to-r from-fracta-400 to-fracta-600 bg-clip-text text-transparent">
fractafrag
</h1>
</Link>
<p className="text-gray-400 mt-2">Welcome back</p>
</div>
<form onSubmit={handleSubmit} className="card p-6 space-y-4">
{error && (
<div className="p-3 bg-red-600/10 border border-red-600/20 rounded-lg text-red-400 text-sm">
{error}
</div>
)}
<div>
<label htmlFor="email" className="block text-sm text-gray-400 mb-1">Email</label>
<input
id="email"
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
className="input"
placeholder="you@example.com"
required
/>
</div>
<div>
<label htmlFor="password" className="block text-sm text-gray-400 mb-1">Password</label>
<input
id="password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
className="input"
placeholder="••••••••"
required
/>
</div>
<button type="submit" disabled={loading} className="btn-primary w-full">
{loading ? 'Signing in...' : 'Sign In'}
</button>
<p className="text-center text-sm text-gray-500">
Don't have an account?{' '}
<Link to="/register" className="text-fracta-400 hover:text-fracta-300">Sign up</Link>
</p>
</form>
</div>
</div>
);
}

@@ -0,0 +1,197 @@
/**
* My Shaders: personal workspace with draft, published, and archived shaders.
* Version history access and iteration workflow.
*/
import { useState } from 'react';
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { Link, useNavigate } from 'react-router-dom';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
import ShaderCanvas from '@/components/ShaderCanvas';
type StatusTab = 'all' | 'draft' | 'published' | 'archived';
export default function MyShaders() {
const { isAuthenticated, user } = useAuthStore();
const navigate = useNavigate();
const queryClient = useQueryClient();
const [tab, setTab] = useState<StatusTab>('all');
const { data: shaders = [], isLoading } = useQuery({
queryKey: ['my-shaders', tab],
queryFn: async () => {
const params: any = { limit: 100 };
if (tab !== 'all') params.status = tab;
const { data } = await api.get('/shaders/mine', { params });
return data;
},
enabled: isAuthenticated(),
});
const archiveMutation = useMutation({
mutationFn: async (id: string) => {
await api.put(`/shaders/${id}`, { status: 'archived' });
},
onSuccess: () => queryClient.invalidateQueries({ queryKey: ['my-shaders'] }),
});
const publishMutation = useMutation({
mutationFn: async (id: string) => {
await api.put(`/shaders/${id}`, { status: 'published', is_public: true });
},
onSuccess: () => queryClient.invalidateQueries({ queryKey: ['my-shaders'] }),
});
const deleteMutation = useMutation({
mutationFn: async (id: string) => {
await api.delete(`/shaders/${id}`);
},
onSuccess: () => queryClient.invalidateQueries({ queryKey: ['my-shaders'] }),
});
// Auth guard placed after all hooks so the hook order stays stable (Rules of Hooks)
if (!isAuthenticated()) {
navigate('/login');
return null;
}
const counts = {
all: shaders.length,
draft: shaders.filter((s: any) => s.status === 'draft').length,
published: shaders.filter((s: any) => s.status === 'published').length,
archived: shaders.filter((s: any) => s.status === 'archived').length,
};
return (
<div className="max-w-6xl mx-auto px-4 py-6">
<div className="flex items-center justify-between mb-6">
<div>
<h1 className="text-xl font-semibold">My Shaders</h1>
<p className="text-gray-500 text-sm mt-1">
Your workspace: drafts, published shaders, and version history.
</p>
</div>
<Link to="/editor" className="btn-primary text-sm">+ New Shader</Link>
</div>
{/* Status tabs */}
<div className="flex gap-1 bg-surface-2 rounded-lg p-1 mb-6 w-fit">
{(['all', 'draft', 'published', 'archived'] as StatusTab[]).map((s) => (
<button
key={s}
onClick={() => setTab(s)}
className={`px-3 py-1.5 text-sm rounded-md transition-colors flex items-center gap-1.5 ${
tab === s ? 'bg-fracta-600 text-white' : 'text-gray-400 hover:text-gray-200'
}`}
>
{s === 'all' ? 'All' : s.charAt(0).toUpperCase() + s.slice(1)}
<span className={`text-xs px-1.5 py-0.5 rounded-full ${
tab === s ? 'bg-fracta-500/50' : 'bg-surface-4'
}`}>
{counts[s]}
</span>
</button>
))}
</div>
{/* Shader list */}
{isLoading ? (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{Array.from({ length: 6 }).map((_, i) => (
<div key={i} className="card animate-pulse">
<div className="aspect-video bg-surface-3" />
<div className="p-3 space-y-2">
<div className="h-4 bg-surface-3 rounded w-3/4" />
<div className="h-3 bg-surface-3 rounded w-1/2" />
</div>
</div>
))}
</div>
) : shaders.length > 0 ? (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{shaders.map((shader: any) => (
<div key={shader.id} className="card group">
<Link to={shader.status === 'draft' ? `/editor/${shader.id}` : `/shader/${shader.id}`}>
<div className="aspect-video bg-surface-2 relative overflow-hidden">
<ShaderCanvas code={shader.glsl_code} className="w-full h-full" animate={true} />
{/* Status badge */}
<span className={`absolute top-2 left-2 px-2 py-0.5 text-xs rounded-full ${
shader.status === 'draft' ? 'bg-yellow-600/80 text-yellow-100' :
shader.status === 'archived' ? 'bg-gray-600/80 text-gray-300' :
'bg-green-600/80 text-green-100'
}`}>
{shader.status}
</span>
{/* Version badge */}
<span className="absolute top-2 right-2 px-2 py-0.5 text-xs rounded-full bg-surface-0/80 text-gray-400">
v{shader.current_version}
</span>
</div>
</Link>
<div className="p-3">
<Link to={shader.status === 'draft' ? `/editor/${shader.id}` : `/shader/${shader.id}`}>
<h3 className="font-medium text-gray-100 group-hover:text-fracta-400 transition-colors truncate">
{shader.title}
</h3>
</Link>
<div className="flex items-center justify-between mt-2">
<span className="text-xs text-gray-500">
{new Date(shader.updated_at).toLocaleDateString()} · {shader.shader_type}
</span>
<div className="flex gap-1">
{shader.status === 'draft' && (
<>
<Link to={`/editor/${shader.id}`} className="btn-ghost text-xs py-0.5 px-2">
Edit
</Link>
<button
onClick={(e) => { e.stopPropagation(); publishMutation.mutate(shader.id); }}
className="btn-primary text-xs py-0.5 px-2"
>
Publish
</button>
</>
)}
{shader.status === 'published' && (
<>
<Link to={`/editor/${shader.id}`} className="btn-ghost text-xs py-0.5 px-2">
Edit
</Link>
<button
onClick={(e) => { e.stopPropagation(); archiveMutation.mutate(shader.id); }}
className="btn-ghost text-xs py-0.5 px-2 text-yellow-400"
>
Archive
</button>
</>
)}
{shader.status === 'archived' && (
<>
<button
onClick={(e) => { e.stopPropagation(); publishMutation.mutate(shader.id); }}
className="btn-ghost text-xs py-0.5 px-2 text-green-400"
>
Restore
</button>
<button
onClick={(e) => { e.stopPropagation(); deleteMutation.mutate(shader.id); }}
className="btn-ghost text-xs py-0.5 px-2 text-red-400"
>
Delete
</button>
</>
)}
</div>
</div>
</div>
</div>
))}
</div>
) : (
<div className="text-center py-20">
<p className="text-gray-400 text-lg">No {tab === 'all' ? '' : tab + ' '}shaders yet</p>
<Link to="/editor" className="btn-primary mt-4 inline-flex">Create Your First Shader</Link>
</div>
)}
</div>
);
}

@@ -0,0 +1,91 @@
/**
* Profile page: a user's shaders and stats.
*/
import { useParams, Link } from 'react-router-dom';
import { useQuery } from '@tanstack/react-query';
import api from '@/lib/api';
import ShaderCanvas from '@/components/ShaderCanvas';
export default function Profile() {
const { username } = useParams<{ username: string }>();
const { data: profile, isLoading: loadingProfile } = useQuery({
queryKey: ['profile', username],
queryFn: async () => {
const { data } = await api.get(`/users/${username}`);
return data;
},
enabled: !!username,
});
const { data: shaders = [] } = useQuery({
// Key on the profile id so cached results don't leak across profiles
queryKey: ['user-shaders', profile?.id],
queryFn: async () => {
// Use search to find shaders by this user
const { data } = await api.get('/shaders', { params: { limit: 50 } });
// Filter client-side for now — proper user-shader endpoint in future
return data.filter((s: any) => s.author_id === profile?.id);
},
enabled: !!profile?.id,
});
if (loadingProfile) {
return (
<div className="max-w-4xl mx-auto px-4 py-10 text-center text-gray-500">Loading...</div>
);
}
if (!profile) {
return (
<div className="max-w-4xl mx-auto px-4 py-10 text-center text-red-400">User not found</div>
);
}
return (
<div className="max-w-4xl mx-auto px-4 py-6">
{/* Profile header */}
<div className="flex items-center gap-4 mb-8">
<div className="w-16 h-16 bg-fracta-600/20 rounded-full flex items-center justify-center text-2xl">
{profile.username.charAt(0).toUpperCase()}
</div>
<div>
<h1 className="text-xl font-bold flex items-center gap-2">
{profile.username}
{profile.is_verified_creator && (
<span className="text-fracta-400 text-sm">✓ Verified</span>
)}
</h1>
<p className="text-sm text-gray-500">
Joined {new Date(profile.created_at).toLocaleDateString()}
<span className="mx-2">·</span>
{profile.subscription_tier} tier
</p>
</div>
</div>
{/* Shaders grid */}
<h2 className="text-lg font-semibold mb-4">Shaders ({shaders.length})</h2>
{shaders.length > 0 ? (
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 gap-4">
{shaders.map((shader: any) => (
<Link key={shader.id} to={`/shader/${shader.id}`} className="card group">
<div className="aspect-video bg-surface-2 overflow-hidden">
<ShaderCanvas code={shader.glsl_code} className="w-full h-full" animate={true} />
</div>
<div className="p-3">
<h3 className="font-medium text-gray-100 group-hover:text-fracta-400 transition-colors truncate">
{shader.title}
</h3>
</div>
</Link>
))}
</div>
) : (
<div className="text-center py-10 text-gray-500">No shaders yet</div>
)}
</div>
);
}

@@ -0,0 +1,116 @@
import { useState } from 'react';
import { Link, useNavigate } from 'react-router-dom';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
export default function Register() {
const [username, setUsername] = useState('');
const [email, setEmail] = useState('');
const [password, setPassword] = useState('');
const [error, setError] = useState('');
const [loading, setLoading] = useState(false);
const navigate = useNavigate();
const { login } = useAuthStore();
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
setError('');
setLoading(true);
try {
const { data } = await api.post('/auth/register', {
username,
email,
password,
turnstile_token: 'dev-bypass', // TODO: Turnstile widget
});
const profileResp = await api.get('/me', {
headers: { Authorization: `Bearer ${data.access_token}` },
});
login(data.access_token, profileResp.data);
navigate('/');
} catch (err: any) {
setError(err.response?.data?.detail || 'Registration failed');
} finally {
setLoading(false);
}
};
return (
<div className="min-h-screen bg-surface-0 flex items-center justify-center px-4">
<div className="w-full max-w-sm">
<div className="text-center mb-8">
<Link to="/" className="inline-block">
<h1 className="text-2xl font-bold bg-gradient-to-r from-fracta-400 to-fracta-600 bg-clip-text text-transparent">
fractafrag
</h1>
</Link>
<p className="text-gray-400 mt-2">Create your account</p>
</div>
<form onSubmit={handleSubmit} className="card p-6 space-y-4">
{error && (
<div className="p-3 bg-red-600/10 border border-red-600/20 rounded-lg text-red-400 text-sm">
{error}
</div>
)}
<div>
<label htmlFor="username" className="block text-sm text-gray-400 mb-1">Username</label>
<input
id="username"
type="text"
value={username}
onChange={(e) => setUsername(e.target.value)}
className="input"
placeholder="shader_wizard"
pattern="[a-zA-Z0-9_-]+"
minLength={3}
maxLength={30}
required
/>
</div>
<div>
<label htmlFor="email" className="block text-sm text-gray-400 mb-1">Email</label>
<input
id="email"
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
className="input"
placeholder="you@example.com"
required
/>
</div>
<div>
<label htmlFor="password" className="block text-sm text-gray-400 mb-1">Password</label>
<input
id="password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
className="input"
placeholder="••••••••"
minLength={8}
required
/>
<p className="text-xs text-gray-500 mt-1">Minimum 8 characters</p>
</div>
<button type="submit" disabled={loading} className="btn-primary w-full">
{loading ? 'Creating account...' : 'Create Account'}
</button>
<p className="text-center text-sm text-gray-500">
Already have an account?{' '}
<Link to="/login" className="text-fracta-400 hover:text-fracta-300">Sign in</Link>
</p>
</form>
</div>
</div>
);
}

@@ -0,0 +1,188 @@
/**
* Settings page: account, subscription, and API keys.
*/
import { useState } from 'react';
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { useNavigate } from 'react-router-dom';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
export default function Settings() {
const { user, isAuthenticated } = useAuthStore();
const navigate = useNavigate();
const queryClient = useQueryClient();
const [newKeyName, setNewKeyName] = useState('');
const [createdKey, setCreatedKey] = useState<string | null>(null);
const { data: apiKeys = [] } = useQuery({
queryKey: ['api-keys'],
queryFn: async () => {
const { data } = await api.get('/me/api-keys');
return data;
},
enabled: isAuthenticated(),
});
const createKey = useMutation({
mutationFn: async (name: string) => {
const { data } = await api.post('/me/api-keys', { name });
return data;
},
onSuccess: (data) => {
setCreatedKey(data.full_key);
setNewKeyName('');
queryClient.invalidateQueries({ queryKey: ['api-keys'] });
},
});
const revokeKey = useMutation({
mutationFn: async (keyId: string) => {
await api.delete(`/me/api-keys/${keyId}`);
},
onSuccess: () => queryClient.invalidateQueries({ queryKey: ['api-keys'] }),
});
// Auth guard placed after all hooks so the hook order stays stable (Rules of Hooks)
if (!isAuthenticated() || !user) {
navigate('/login');
return null;
}
return (
<div className="max-w-2xl mx-auto px-4 py-6">
<h1 className="text-xl font-semibold mb-6">Settings</h1>
{/* Account info */}
<section className="card p-6 mb-6">
<h2 className="text-lg font-medium mb-4">Account</h2>
<div className="space-y-3 text-sm">
<div className="flex justify-between">
<span className="text-gray-400">Username</span>
<span>{user.username}</span>
</div>
<div className="flex justify-between">
<span className="text-gray-400">Email</span>
<span>{user.email}</span>
</div>
<div className="flex justify-between">
<span className="text-gray-400">Subscription</span>
<span className="capitalize">{user.subscription_tier}</span>
</div>
<div className="flex justify-between">
<span className="text-gray-400">AI Credits</span>
<span>{user.ai_credits_remaining}</span>
</div>
</div>
</section>
{/* API Keys */}
<section className="card p-6 mb-6">
<h2 className="text-lg font-medium mb-4">API Keys (MCP)</h2>
<p className="text-sm text-gray-500 mb-4">
Connect AI tools like Claude Desktop to Fractafrag.
</p>
{/* New key created alert */}
{createdKey && (
<div className="mb-4 p-3 bg-green-600/10 border border-green-600/20 rounded-lg">
<p className="text-sm text-green-400 font-medium mb-1">
Key created! Copy it now; it won't be shown again.
</p>
<code className="block text-xs font-mono bg-surface-0 p-2 rounded mt-1 break-all select-all">
{createdKey}
</code>
<button
onClick={() => {
navigator.clipboard.writeText(createdKey);
setCreatedKey(null);
}}
className="btn-secondary text-xs mt-2"
>
Copy & Dismiss
</button>
</div>
)}
{/* Existing keys */}
{apiKeys.length > 0 && (
<div className="space-y-2 mb-4">
{apiKeys.map((key: any) => (
<div key={key.id} className="flex items-center justify-between p-3 bg-surface-2 rounded-lg">
<div>
<span className="text-sm font-medium">{key.name}</span>
<span className="text-xs text-gray-500 ml-2 font-mono">{key.key_prefix}...</span>
<span className="text-xs text-gray-500 ml-2">({key.trust_tier})</span>
</div>
<button
onClick={() => revokeKey.mutate(key.id)}
className="btn-danger text-xs py-1 px-2"
>
Revoke
</button>
</div>
))}
</div>
)}
{/* Create new key */}
<div className="flex gap-2">
<input
type="text"
value={newKeyName}
onChange={(e) => setNewKeyName(e.target.value)}
className="input text-sm flex-1"
placeholder="Key name (e.g., Claude Desktop)"
/>
<button
onClick={() => newKeyName && createKey.mutate(newKeyName)}
disabled={!newKeyName || createKey.isPending}
className="btn-primary text-sm"
>
Create Key
</button>
</div>
{user.subscription_tier === 'free' && (
<p className="text-xs text-gray-500 mt-2">
API key creation requires Pro or Studio subscription.
</p>
)}
</section>
{/* Subscription */}
<section className="card p-6">
<h2 className="text-lg font-medium mb-4">Subscription</h2>
<div className="grid grid-cols-3 gap-3">
{[
{ name: 'Free', price: '$0/mo', features: ['5 shaders/month', 'Browse & vote', 'Read-only API'] },
{ name: 'Pro', price: '$12/mo', features: ['Unlimited shaders', '50 AI generations', 'BYOK support', 'MCP API access'] },
{ name: 'Studio', price: '$39/mo', features: ['Everything in Pro', '200 AI generations', 'Trusted API tier', 'Priority support'] },
].map((tier) => (
<div
key={tier.name}
className={`p-4 rounded-lg border ${
user.subscription_tier === tier.name.toLowerCase()
? 'border-fracta-500 bg-fracta-600/10'
: 'border-surface-3 bg-surface-2'
}`}
>
<h3 className="font-medium">{tier.name}</h3>
<p className="text-lg font-bold mt-1">{tier.price}</p>
<ul className="mt-3 space-y-1">
{tier.features.map((f) => (
<li key={f} className="text-xs text-gray-400">• {f}</li>
))}
</ul>
{user.subscription_tier === tier.name.toLowerCase() ? (
<span className="text-xs text-fracta-400 mt-3 block">Current plan</span>
) : (
<button className="btn-secondary text-xs mt-3 w-full">
{tier.name === 'Free' ? 'Downgrade' : 'Upgrade'}
</button>
)}
</div>
))}
</div>
</section>
</div>
);
}

@@ -0,0 +1,149 @@
/**
* Shader detail page: full-screen view, code, and vote controls.
*/
import { useState } from 'react';
import { useParams, Link, useNavigate } from 'react-router-dom';
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import api from '@/lib/api';
import { useAuthStore } from '@/stores/auth';
import ShaderCanvas from '@/components/ShaderCanvas';
export default function ShaderDetail() {
const { id } = useParams<{ id: string }>();
const { isAuthenticated, user } = useAuthStore();
const navigate = useNavigate();
const queryClient = useQueryClient();
const [showCode, setShowCode] = useState(false);
const { data: shader, isLoading, error } = useQuery({
queryKey: ['shader', id],
queryFn: async () => {
const { data } = await api.get(`/shaders/${id}`);
return data;
},
enabled: !!id,
});
const voteMutation = useMutation({
mutationFn: async (value: number) => {
await api.post(`/shaders/${id}/vote`, { value });
},
onSuccess: () => queryClient.invalidateQueries({ queryKey: ['shader', id] }),
});
if (isLoading) {
return (
<div className="flex items-center justify-center h-[calc(100vh-3.5rem)]">
<div className="text-gray-500">Loading shader...</div>
</div>
);
}
if (error || !shader) {
return (
<div className="flex items-center justify-center h-[calc(100vh-3.5rem)]">
<div className="text-red-400">Shader not found</div>
</div>
);
}
return (
<div className="max-w-6xl mx-auto px-4 py-6">
{/* Shader preview */}
<div className="card overflow-hidden">
<div className="aspect-video bg-black relative">
<ShaderCanvas
code={shader.glsl_code}
className="w-full h-full"
animate={true}
/>
</div>
</div>
{/* Info bar */}
<div className="flex items-center justify-between mt-4">
<div>
<h1 className="text-2xl font-bold">{shader.title}</h1>
{shader.description && (
<p className="text-gray-400 mt-1">{shader.description}</p>
)}
<div className="flex items-center gap-3 mt-2 text-sm text-gray-500">
<span>{shader.shader_type.toUpperCase()}</span>
<span>·</span>
<span>{shader.view_count} views</span>
<span>·</span>
<span>{new Date(shader.created_at).toLocaleDateString()}</span>
{shader.is_ai_generated && (
<>
<span>·</span>
<span className="text-fracta-400">AI Generated</span>
</>
)}
</div>
</div>
{/* Actions */}
<div className="flex items-center gap-2">
<button
onClick={() => isAuthenticated() ? voteMutation.mutate(1) : navigate('/login')}
className="btn-secondary text-sm"
>
Upvote
</button>
<button
onClick={() => isAuthenticated() ? voteMutation.mutate(-1) : navigate('/login')}
className="btn-ghost text-sm"
>
Downvote
</button>
<Link to={`/editor/${shader.id}`} className="btn-secondary text-sm">
Fork
</Link>
</div>
</div>
{/* Tags */}
{shader.tags?.length > 0 && (
<div className="flex gap-2 mt-3">
{shader.tags.map((tag: string) => (
<Link
key={tag}
to={`/explore?tags=${tag}`}
className="text-xs px-2 py-1 bg-surface-2 hover:bg-surface-3 rounded-full text-gray-400 transition-colors"
>
#{tag}
</Link>
))}
</div>
)}
{/* Code toggle */}
<div className="mt-6">
<button
onClick={() => setShowCode(!showCode)}
className="btn-secondary text-sm"
>
{showCode ? 'Hide Code' : 'View Source'}
</button>
{showCode && (
<div className="mt-3 card">
<div className="px-3 py-2 bg-surface-2 border-b border-surface-3 text-xs text-gray-500 flex items-center justify-between">
<span>fragment.glsl</span>
<button
onClick={() => navigator.clipboard.writeText(shader.glsl_code)}
className="btn-ghost text-xs py-0.5 px-2"
>
Copy
</button>
</div>
<pre className="p-4 overflow-x-auto text-sm font-mono text-gray-300 leading-relaxed max-h-96 overflow-y-auto">
{shader.glsl_code}
</pre>
</div>
)}
</div>
</div>
);
}

@@ -0,0 +1,50 @@
/**
* Auth store: JWT token management via Zustand.
*/
import { create } from 'zustand';
import { persist } from 'zustand/middleware';
export interface User {
id: string;
username: string;
email: string;
role: string;
subscription_tier: string;
ai_credits_remaining: number;
trust_tier: string;
is_verified_creator: boolean;
created_at: string;
}
interface AuthState {
accessToken: string | null;
user: User | null;
setAccessToken: (token: string) => void;
setUser: (user: User) => void;
login: (token: string, user: User) => void;
logout: () => void;
isAuthenticated: () => boolean;
}
export const useAuthStore = create<AuthState>()(
persist(
(set, get) => ({
accessToken: null,
user: null,
setAccessToken: (token) => set({ accessToken: token }),
setUser: (user) => set({ user }),
login: (token, user) => set({ accessToken: token, user }),
logout: () => set({ accessToken: null, user: null }),
isAuthenticated: () => !!get().accessToken,
}),
{
name: 'fractafrag-auth',
partialize: (state) => ({
accessToken: state.accessToken,
user: state.user,
}),
},
),
);
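The `partialize` option above keeps only the serializable fields when persisting to storage, so methods like `login` and `logout` never reach localStorage. A minimal sketch of that behavior in plain TypeScript (the interface here is illustrative, trimmed from the store's `User` type):

```typescript
// What zustand/persist's `partialize` does: project the state down to the
// fields worth persisting; functions and derived state are dropped.
interface PersistedAuth {
  accessToken: string | null;
  user: { id: string; username: string } | null;
}

function partialize<S extends PersistedAuth>(state: S): PersistedAuth {
  return { accessToken: state.accessToken, user: state.user };
}
```

On rehydration the store merges this persisted slice back over the initial state, so the action functions are recreated fresh from the store definition.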

services/frontend/src/vite-env.d.ts vendored Normal file
@@ -0,0 +1,10 @@
/// <reference types="vite/client" />
interface ImportMetaEnv {
readonly VITE_API_URL: string;
readonly VITE_MCP_URL: string;
}
interface ImportMeta {
readonly env: ImportMetaEnv;
}


@@ -0,0 +1,50 @@
/** @type {import('tailwindcss').Config} */
export default {
content: ['./index.html', './src/**/*.{js,ts,jsx,tsx}'],
theme: {
extend: {
colors: {
fracta: {
50: '#f0f0ff',
100: '#e0e0ff',
200: '#c4c0ff',
300: '#9f94ff',
400: '#7a60ff',
500: '#5b30ff',
600: '#4d10f0',
700: '#4008cc',
800: '#350aa5',
900: '#2b0d80',
950: '#1a0550',
},
surface: {
0: '#0a0a0f',
1: '#12121a',
2: '#1a1a25',
3: '#222230',
4: '#2a2a3a',
},
},
fontFamily: {
sans: ['Inter', 'system-ui', 'sans-serif'],
mono: ['JetBrains Mono', 'Fira Code', 'monospace'],
},
animation: {
'pulse-slow': 'pulse 3s cubic-bezier(0.4, 0, 0.6, 1) infinite',
'fade-in': 'fadeIn 0.3s ease-out',
'slide-up': 'slideUp 0.3s ease-out',
},
keyframes: {
fadeIn: {
'0%': { opacity: '0' },
'100%': { opacity: '1' },
},
slideUp: {
'0%': { opacity: '0', transform: 'translateY(10px)' },
'100%': { opacity: '1', transform: 'translateY(0)' },
},
},
},
},
plugins: [],
};


@@ -0,0 +1,24 @@
{
"compilerOptions": {
"target": "ES2020",
"useDefineForClassFields": true,
"lib": ["ES2020", "DOM", "DOM.Iterable"],
"module": "ESNext",
"skipLibCheck": true,
"moduleResolution": "bundler",
"allowImportingTsExtensions": true,
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx",
"strict": true,
"noUnusedLocals": false,
"noUnusedParameters": false,
"noFallthroughCasesInSwitch": true,
"baseUrl": ".",
"paths": {
"@/*": ["src/*"]
}
},
"include": ["src"]
}


@@ -0,0 +1,22 @@
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';
import { resolve } from 'path';
export default defineConfig({
plugins: [react()],
resolve: {
alias: {
'@': resolve(__dirname, './src'),
},
},
server: {
host: '0.0.0.0',
port: 5173,
proxy: {
'/api': {
target: 'http://localhost:8000',
changeOrigin: true,
},
},
},
});


@@ -1,33 +1,303 @@
"""Fractafrag MCP Server — stub entrypoint.
"""
Fractafrag MCP Server AI agent interface to the shader platform.
Full implementation in Track E.
Enables Claude, GPT, and other MCP clients to:
- Browse and search shaders
- Get full shader details by ID
- Submit new shaders
- Update existing shaders (push revisions)
- View version history
- Browse the desire queue
"""
import os
import json
from http.server import HTTPServer, BaseHTTPRequestHandler
import httpx
from mcp.server.fastmcp import FastMCP
API_BASE = os.environ.get("API_BASE_URL", "http://api:8000")
INTERNAL_AUTH = {"Authorization": "Bearer internal:mcp-service"}
mcp = FastMCP(
"Fractafrag",
stateless_http=True,
json_response=True,
host="0.0.0.0",
port=3200,
)
class MCPHandler(BaseHTTPRequestHandler):
def do_GET(self):
if self.path == "/health":
self.send_response(200)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"status": "ok", "service": "mcp"}).encode())
else:
self.send_response(501)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"error": "MCP server coming in M2"}).encode())
def do_POST(self):
self.send_response(501)
self.send_header("Content-Type", "application/json")
self.end_headers()
self.wfile.write(json.dumps({"error": "MCP server coming in M2"}).encode())
async def api_get(path: str, params: dict | None = None):
async with httpx.AsyncClient(base_url=API_BASE, timeout=15.0) as client:
resp = await client.get(f"/api/v1{path}", params=params)
resp.raise_for_status()
return resp.json()
async def api_post(path: str, data: dict):
async with httpx.AsyncClient(base_url=API_BASE, timeout=15.0) as client:
resp = await client.post(f"/api/v1{path}", json=data, headers=INTERNAL_AUTH)
resp.raise_for_status()
return resp.json()
async def api_post_with_params(path: str, params: dict):
"""POST with query parameters (not JSON body). Used for endpoints like fulfill."""
async with httpx.AsyncClient(base_url=API_BASE, timeout=15.0) as client:
resp = await client.post(f"/api/v1{path}", params=params, headers=INTERNAL_AUTH)
resp.raise_for_status()
return resp.json()
async def api_put(path: str, data: dict):
async with httpx.AsyncClient(base_url=API_BASE, timeout=15.0) as client:
resp = await client.put(f"/api/v1{path}", json=data, headers=INTERNAL_AUTH)
resp.raise_for_status()
return resp.json()
@mcp.tool()
async def browse_shaders(query: str = "", tags: str = "", shader_type: str = "", sort: str = "trending", limit: int = 20) -> str:
"""Browse and search shaders on Fractafrag.
Args:
query: Search text (matches title)
tags: Comma-separated tag filter (e.g. "fractal,colorful")
shader_type: Filter by type: 2d, 3d, or audio-reactive
sort: Sort order: trending, new, or top
limit: Number of results (1-50)
"""
params: dict = {"sort": sort, "limit": min(limit, 50)}
if query: params["q"] = query
if tags: params["tags"] = [t.strip() for t in tags.split(",") if t.strip()]
if shader_type: params["shader_type"] = shader_type
shaders = await api_get("/shaders", params)
results = [{"id": s["id"], "title": s["title"], "description": s.get("description", ""),
"shader_type": s["shader_type"], "tags": s.get("tags", []),
"score": s.get("score", 0), "view_count": s.get("view_count", 0),
"is_system": s.get("is_system", False), "current_version": s.get("current_version", 1)}
for s in shaders]
return json.dumps({"count": len(results), "shaders": results})
@mcp.tool()
async def get_shader(shader_id: str) -> str:
"""Get full details of a shader by its ID, including GLSL source code.
Args:
shader_id: UUID of the shader
"""
s = await api_get(f"/shaders/{shader_id}")
return json.dumps({"id": s["id"], "title": s["title"], "description": s.get("description"),
"glsl_code": s["glsl_code"], "shader_type": s["shader_type"],
"tags": s.get("tags", []), "status": s.get("status"),
"is_system": s.get("is_system", False), "style_metadata": s.get("style_metadata"),
"current_version": s.get("current_version", 1),
"score": s.get("score", 0), "view_count": s.get("view_count", 0),
"forked_from": s.get("forked_from"), "created_at": s.get("created_at"),
"updated_at": s.get("updated_at")})
@mcp.tool()
async def get_shader_versions(shader_id: str) -> str:
"""Get the version history of a shader.
Args:
shader_id: UUID of the shader
"""
versions = await api_get(f"/shaders/{shader_id}/versions")
return json.dumps({"shader_id": shader_id, "version_count": len(versions),
"versions": [{"version_number": v["version_number"], "title": v["title"],
"change_note": v.get("change_note"), "created_at": v["created_at"]}
for v in versions]})
@mcp.tool()
async def get_shader_version_code(shader_id: str, version_number: int) -> str:
"""Get the GLSL code of a specific version of a shader.
Args:
shader_id: UUID of the shader
version_number: Version number to retrieve
"""
v = await api_get(f"/shaders/{shader_id}/versions/{version_number}")
return json.dumps({"shader_id": shader_id, "version_number": v["version_number"],
"title": v["title"], "glsl_code": v["glsl_code"],
"tags": v.get("tags", []), "change_note": v.get("change_note")})
@mcp.tool()
async def submit_shader(title: str, glsl_code: str, description: str = "", tags: str = "",
shader_type: str = "2d", status: str = "published",
fulfills_desire_id: str = "") -> str:
"""Submit a new GLSL shader to Fractafrag.
Shader format: void mainImage(out vec4 fragColor, in vec2 fragCoord)
Uniforms: iTime (float), iResolution (vec3), iMouse (vec4)
Args:
title: Shader title (max 120 chars)
glsl_code: Complete GLSL fragment shader code
description: Optional description
tags: Comma-separated tags (e.g. "fractal,noise,colorful")
shader_type: 2d, 3d, or audio-reactive
status: "published" to go live, "draft" to save privately
fulfills_desire_id: Optional UUID of a desire this shader fulfills
"""
tag_list = [t.strip() for t in tags.split(",") if t.strip()] if tags else []
payload = {"title": title, "glsl_code": glsl_code,
"description": description, "tags": tag_list,
"shader_type": shader_type, "status": status}
if fulfills_desire_id:
payload["fulfills_desire_id"] = fulfills_desire_id
result = await api_post("/shaders", payload)
return json.dumps({"id": result["id"], "title": result["title"],
"status": result.get("status"), "current_version": result.get("current_version", 1),
"message": f"Shader '{result['title']}' created.", "url": f"/shader/{result['id']}"})
@mcp.tool()
async def update_shader(shader_id: str, glsl_code: str = "", title: str = "",
description: str = "", tags: str = "", status: str = "",
change_note: str = "") -> str:
"""Update an existing shader. Creates a new version in the history.
Use this to iterate: change code, adjust metadata, or publish a draft.
Every code change creates an immutable version snapshot.
Args:
shader_id: UUID of the shader to update
glsl_code: New GLSL code (empty = keep current)
title: New title (empty = keep current)
description: New description (empty = keep current)
tags: New comma-separated tags (empty = keep current)
status: Change status: draft, published, or archived
change_note: Brief note about what changed (e.g. "made it pinker")
"""
payload = {}
if glsl_code: payload["glsl_code"] = glsl_code
if title: payload["title"] = title
if description: payload["description"] = description
if tags: payload["tags"] = [t.strip() for t in tags.split(",") if t.strip()]
if status: payload["status"] = status
if change_note: payload["change_note"] = change_note
if not payload:
return json.dumps({"error": "No changes provided."})
result = await api_put(f"/shaders/{shader_id}", payload)
return json.dumps({"id": result["id"], "title": result["title"], "status": result.get("status"),
"current_version": result.get("current_version"),
"message": f"Updated to v{result.get('current_version', '?')}.",
"url": f"/shader/{result['id']}"})
@mcp.tool()
async def get_trending(limit: int = 10) -> str:
"""Get currently trending shaders.
Args:
limit: Number of results (1-50)
"""
shaders = await api_get("/feed/trending", {"limit": min(limit, 50)})
return json.dumps({"count": len(shaders),
"shaders": [{"id": s["id"], "title": s["title"], "shader_type": s["shader_type"],
"tags": s.get("tags", []), "score": s.get("score", 0)}
for s in shaders]})
@mcp.tool()
async def get_similar_shaders(shader_id: str, limit: int = 10) -> str:
"""Find shaders visually similar to a given shader (by tag overlap).
Args:
shader_id: UUID of the reference shader
limit: Number of results (1-30)
"""
shaders = await api_get(f"/feed/similar/{shader_id}", {"limit": min(limit, 30)})
return json.dumps({"reference": shader_id, "count": len(shaders),
"similar": [{"id": s["id"], "title": s["title"], "shader_type": s["shader_type"],
"tags": s.get("tags", []), "score": s.get("score", 0)}
for s in shaders]})
@mcp.tool()
async def get_desire_queue(min_heat: float = 0, limit: int = 10) -> str:
"""Get open shader desires/bounties with cluster context and style hints.
Returns community requests ranked by heat. Use cluster_count to identify
high-demand desires (many similar requests). Use style_hints to understand
the visual direction requested.
Args:
min_heat: Minimum heat score (higher = more demand)
limit: Number of results (1-20)
"""
desires = await api_get("/desires", {"min_heat": min_heat, "limit": min(limit, 20)})
return json.dumps({"count": len(desires),
"desires": [{"id": d["id"], "prompt_text": d["prompt_text"],
"heat_score": d.get("heat_score", 0),
"cluster_count": d.get("cluster_count", 0),
"style_hints": d.get("style_hints"),
"tip_amount_cents": d.get("tip_amount_cents", 0),
"status": d.get("status"),
"fulfilled_by_shader": d.get("fulfilled_by_shader")}
for d in desires]})
@mcp.tool()
async def fulfill_desire(desire_id: str, shader_id: str) -> str:
"""Mark a desire as fulfilled by linking it to a published shader.
The shader must be published. The desire must be open.
Use get_desire_queue to find open desires, then submit_shader or
use an existing shader ID to fulfill one.
Args:
desire_id: UUID of the desire to fulfill
shader_id: UUID of the published shader that fulfills this desire
"""
try:
result = await api_post_with_params(
f"/desires/{desire_id}/fulfill",
{"shader_id": shader_id}
)
return json.dumps({"status": "fulfilled", "desire_id": desire_id,
"shader_id": shader_id,
"message": f"Desire {desire_id} fulfilled by shader {shader_id}."})
except httpx.HTTPStatusError as e:
try:
error_detail = e.response.json().get("detail", str(e))
except Exception:
error_detail = str(e)
return json.dumps({"error": error_detail, "status_code": e.response.status_code})
@mcp.resource("fractafrag://platform-info")
def platform_info() -> str:
"""Platform overview and shader writing guidelines."""
return """# Fractafrag — GLSL Shader Platform
## Shader Format (Shadertoy-compatible, WebGL2 / GLSL ES 3.00)
```glsl
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
vec2 uv = fragCoord / iResolution.xy;
fragColor = vec4(color, 1.0);
}
```
## Uniforms: iTime (float), iResolution (vec3), iMouse (vec4)
## Workflow
1. browse_shaders to see what exists
2. get_shader to read code by ID
3. submit_shader to create new, or update_shader to revise existing
4. Every code update creates a versioned snapshot
5. get_shader_versions / get_shader_version_code for history
"""
if __name__ == "__main__":
server = HTTPServer(("0.0.0.0", 3200), MCPHandler)
print("MCP server stub listening on :3200")
server.serve_forever()
print(f"Fractafrag MCP server starting on :3200")
print(f"API backend: {API_BASE}")
mcp.run(transport="streamable-http")
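Several of the tools above (`browse_shaders`, `submit_shader`, `update_shader`) share one comma-separated tag convention. A quick sketch of the equivalent parsing — a hypothetical helper, shown in TypeScript for illustration rather than the server's Python:

```typescript
// Equivalent of the Python comprehension
// [t.strip() for t in tags.split(",") if t.strip()] used by the MCP tools:
// split on commas, trim whitespace, drop empty entries.
function parseTags(tags: string): string[] {
  return tags
    .split(',')
    .map((t) => t.trim())
    .filter((t) => t.length > 0);
}

console.log(parseTags('fractal, noise,,colorful ')); // [ 'fractal', 'noise', 'colorful' ]
```

An empty string therefore yields an empty tag list, which is why the tools can safely default `tags` to `""`.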


@@ -15,7 +15,7 @@ ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
COPY package*.json ./
RUN npm ci
RUN npm install
COPY . .


@@ -2,14 +2,16 @@
* Fractafrag Renderer — Headless Chromium shader render service.
*
* Accepts GLSL code via POST /render, renders in an isolated browser context,
* returns thumbnail + preview video.
* captures a thumbnail (JPEG) and a short preview video (WebM frames → GIF/WebM).
*
* Full implementation in Track C.
* For M1: captures a still thumbnail at t=1s. Video preview is a future enhancement.
*/
import express from 'express';
import { writeFileSync, mkdirSync, existsSync } from 'fs';
import path from 'path';
import puppeteer from 'puppeteer-core';
import { writeFileSync, mkdirSync, existsSync, readFileSync } from 'fs';
import { join } from 'path';
import { randomUUID } from 'crypto';
const app = express();
app.use(express.json({ limit: '1mb' }));
@@ -17,42 +19,242 @@ app.use(express.json({ limit: '1mb' }));
const PORT = 3100;
const OUTPUT_DIR = process.env.OUTPUT_DIR || '/renders';
const MAX_DURATION = parseInt(process.env.MAX_RENDER_DURATION || '8', 10);
const CHROMIUM_PATH = process.env.PUPPETEER_EXECUTABLE_PATH || '/usr/bin/chromium';
// Ensure output directory exists
if (!existsSync(OUTPUT_DIR)) {
mkdirSync(OUTPUT_DIR, { recursive: true });
}
/**
* Generate the HTML page that hosts the shader for rendering.
* Shadertoy-compatible uniform injection.
*/
function buildShaderHTML(glsl, width, height) {
return `<!DOCTYPE html>
<html><head><style>*{margin:0;padding:0}canvas{display:block}</style></head>
<body>
<canvas id="c" width="${width}" height="${height}"></canvas>
<script>
const canvas = document.getElementById('c');
const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
if (!gl) { document.title = 'ERROR:NO_WEBGL'; throw new Error('No WebGL'); }
const vs = \`#version 300 es
in vec4 a_position;
void main() { gl_Position = a_position; }
\`;
const fsPrefix = \`#version 300 es
precision highp float;
uniform float iTime;
uniform vec3 iResolution;
uniform vec4 iMouse;
out vec4 outColor;
\`;
const fsUser = ${JSON.stringify(glsl)};
// Wrap mainImage if present
let fsBody;
if (fsUser.includes('mainImage')) {
fsBody = fsPrefix + fsUser + \`
void main() {
vec4 col;
mainImage(col, gl_FragCoord.xy);
outColor = col;
}
\`;
} else {
// Assume it already has a main() that writes to outColor or gl_FragColor
fsBody = fsPrefix + fsUser.replace('gl_FragColor', 'outColor');
}
function createShader(type, src) {
const s = gl.createShader(type);
gl.shaderSource(s, src);
gl.compileShader(s);
if (!gl.getShaderParameter(s, gl.COMPILE_STATUS)) {
const err = gl.getShaderInfoLog(s);
document.title = 'COMPILE_ERROR:' + err.substring(0, 200);
throw new Error(err);
}
return s;
}
let program;
try {
const vShader = createShader(gl.VERTEX_SHADER, vs);
const fShader = createShader(gl.FRAGMENT_SHADER, fsBody);
program = gl.createProgram();
gl.attachShader(program, vShader);
gl.attachShader(program, fShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
const err = gl.getProgramInfoLog(program);
document.title = 'LINK_ERROR:' + err.substring(0, 200);
throw new Error(err);
}
} catch(e) {
throw e;
}
gl.useProgram(program);
// Fullscreen quad
const buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1,-1,1,-1,-1,1,1,1]), gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
const uTime = gl.getUniformLocation(program, 'iTime');
const uRes = gl.getUniformLocation(program, 'iResolution');
const uMouse = gl.getUniformLocation(program, 'iMouse');
gl.uniform3f(uRes, ${width}.0, ${height}.0, 1.0);
gl.uniform4f(uMouse, 0, 0, 0, 0);
const startTime = performance.now();
let frameCount = 0;
function render() {
const t = (performance.now() - startTime) / 1000.0;
gl.uniform1f(uTime, t);
gl.viewport(0, 0, ${width}, ${height});
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
frameCount++;
// Signal frame count in title for Puppeteer to read
document.title = 'FRAME:' + frameCount + ':TIME:' + t.toFixed(3);
requestAnimationFrame(render);
}
render();
</script></body></html>`;
}
let browser = null;
async function getBrowser() {
if (!browser || !browser.isConnected()) {
browser = await puppeteer.launch({
executablePath: CHROMIUM_PATH,
headless: 'new',
args: [
'--no-sandbox',
'--disable-setuid-sandbox',
'--disable-dev-shm-usage',
'--disable-gpu-sandbox',
'--use-gl=swiftshader', // Software GL for headless
'--enable-webgl',
'--no-first-run',
'--disable-extensions',
'--max-gum-memory-mb=256',
],
});
}
return browser;
}
// Health check
app.get('/health', (req, res) => {
res.json({ status: 'ok', service: 'renderer' });
app.get('/health', async (req, res) => {
try {
const b = await getBrowser();
res.json({ status: 'ok', service: 'renderer', browserConnected: b.isConnected() });
} catch (e) {
res.status(500).json({ status: 'error', error: e.message });
}
});
// Render endpoint (stub — Track C)
// Render endpoint
app.post('/render', async (req, res) => {
const { glsl, duration = 5, width = 640, height = 360, fps = 30 } = req.body;
const { glsl, shader_id, duration = 3, width = 640, height = 360, fps = 30 } = req.body;
if (!glsl) {
return res.status(400).json({ error: 'Missing glsl field' });
}
// TODO: Track C implementation
// 1. Launch Puppeteer page
// 2. Inject GLSL into shader template HTML
// 3. Capture frames for `duration` seconds
// 4. Encode to WebM/MP4 + extract thumbnail
// 5. Write to OUTPUT_DIR
// 6. Return URLs
res.status(501).json({
error: 'Renderer implementation coming in Track C',
thumbnail_url: null,
preview_url: null,
const renderId = shader_id || randomUUID();
const renderDir = join(OUTPUT_DIR, renderId);
mkdirSync(renderDir, { recursive: true });
const startMs = Date.now();
let page = null;
try {
const b = await getBrowser();
page = await b.newPage();
await page.setViewport({ width, height, deviceScaleFactor: 1 });
const html = buildShaderHTML(glsl, width, height);
// Set content and wait for first paint
await page.setContent(html, { waitUntil: 'domcontentloaded' });
// Wait for shader to compile (check title for errors)
await page.waitForFunction(
() => document.title.startsWith('FRAME:') || document.title.startsWith('COMPILE_ERROR:') || document.title.startsWith('LINK_ERROR:') || document.title.startsWith('ERROR:'),
{ timeout: 10000 }
);
const title = await page.title();
if (title.startsWith('COMPILE_ERROR:') || title.startsWith('LINK_ERROR:') || title.startsWith('ERROR:')) {
const errorMsg = title.split(':').slice(1).join(':');
return res.status(422).json({ error: `Shader compilation failed: ${errorMsg}` });
}
// Let it render for the specified duration to reach a visually interesting state
const captureDelay = Math.min(duration, MAX_DURATION) * 1000;
// Wait at least 1 second, capture at t=1s for thumbnail
await new Promise(r => setTimeout(r, Math.min(captureDelay, 1500)));
// Capture thumbnail
const thumbPath = join(renderDir, 'thumb.jpg');
await page.screenshot({ path: thumbPath, type: 'jpeg', quality: 85 });
// Capture a second frame later for variety (preview frame)
if (captureDelay > 1500) {
await new Promise(r => setTimeout(r, captureDelay - 1500));
}
const previewPath = join(renderDir, 'preview.jpg');
await page.screenshot({ path: previewPath, type: 'jpeg', quality: 85 });
const durationMs = Date.now() - startMs;
res.json({
thumbnail_url: `/renders/${renderId}/thumb.jpg`,
preview_url: `/renders/${renderId}/preview.jpg`,
duration_ms: durationMs,
error: null,
});
} catch (e) {
const elapsed = Date.now() - startMs;
if (elapsed > MAX_DURATION * 1000) {
return res.status(408).json({ error: `Render timed out after ${MAX_DURATION}s` });
}
res.status(500).json({ error: `Render failed: ${e.message}` });
} finally {
if (page) {
try { await page.close(); } catch (_) {}
}
}
});
// Graceful shutdown
process.on('SIGTERM', async () => {
console.log('Shutting down renderer...');
if (browser) await browser.close();
process.exit(0);
});
app.listen(PORT, '0.0.0.0', () => {
console.log(`Renderer service listening on :${PORT}`);
console.log(`Output dir: ${OUTPUT_DIR}`);
console.log(`Max render duration: ${MAX_DURATION}s`);
console.log(`Chromium: ${CHROMIUM_PATH}`);
// Pre-launch browser
getBrowser().then(() => console.log('Chromium ready')).catch(e => console.error('Browser launch failed:', e.message));
});