# Sup API Documentation

> Sup is a real-time chat platform where users create interactive JavaScript "patches" that process text, images, audio, and video inputs.

This is a concise API reference optimized for code generation and common runtime patterns. For complete behavior and edge cases, see:

- Full reference: https://sup.net/docs/llms-full.txt
- Interactive docs: https://sup.net/docs/

## Quick Start

Every patch needs a `main()` function that returns content for display:

```js
function main() {
  return "Hello, world!";
}
```

Common patterns:

```js
// main() takes NO arguments — use sup.input for user input
function main() {
  const text = sup.input.text;
  const image = sup.input.image;

  // Return inline HTML with sup.html() — NOT a bare html`` tag
  return sup.html(`
    <div>Hello, ${text}!</div>
  `);
}

// Generate AI content
const response = sup.ai.prompt("Write a haiku about", sup.input.image);

// Store/retrieve data (persists per chat, shared by all users)
sup.set("count", sup.get("count") + 1);

// Store/retrieve per-user data (private to each user)
sup.user.set("score", 100);
const userScore = sup.user.get("score");

// Create media
const img = sup.image("https://example.com/image.jpg");
const audio = sup.audio("sound.mp3"); // from uploaded asset
const video = sup.video("clip.mp4");

// Get uploaded asset by filename
const asset = sup.asset("filename.png"); // ✅ Correct
// NOT sup.assets["filename.png"] — that doesn't work!

// Random numbers
const dice = sup.random.integer(1, 6);             // Random int from 1-6
const choice = sup.random.choice(["a", "b", "c"]); // Random array element
```

## Core Patterns

This section summarizes common implementation patterns and runtime contracts. It is not exhaustive.

### State and storage scopes

```js
// Chat scope (shared by all users in this chat)
sup.set("round", 1);            // same as sup.chat.set(...)
const round = sup.get("round"); // same as sup.chat.get(...)
// Message scope (tied to current message)
sup.message.set("votes", 3);
const votes = sup.message.get("votes");

// User scope (private per user)
sup.user.set("theme", "dark");
const theme = sup.user.get("theme");

// Global scope (shared across all chats for this patch)
sup.global.set("totalRuns", 42);
const totalRuns = sup.global.get("totalRuns");

// Discover keys in each scope
sup.keys();         // chat/current scope
sup.message.keys(); // message scope
sup.user.keys();    // user scope
sup.global.keys();  // global scope
```

### AI prompting and model/provider configuration

```js
// Prompt behavior tuning
const result = sup.ai.prompt("Classify this:", sup.input.text, {
  temperature: 0.2,                     // deterministic
  reasoning: { effort: "medium" },      // reasoning budget
  schema: { label: { type: "string" } } // structured output
});

// Function calling
const withTools = sup.ai.prompt("Need external data?", {
  tools: {
    get_score: {
      description: "Fetch score by user",
      parameters: { userId: { type: "string" } },
      callback: ({ userId }) => ({ userId, score: 10 })
    }
  }
});

// Specific text model/provider selection
const customModel = sup.ex.openrouter(
  "anthropic/claude-3-haiku",
  "Summarize this in one sentence."
);

// Schema format note: define fields directly (no top-level type/properties wrapper)
const structured = sup.ai.prompt("Extract fields", {
  schema: {
    title: { type: "string" },
    score: { type: "number" }
  }
});

// Image/video model selection
const img = sup.ai.image.create("futuristic city", { model: "fast", quality: "high" });
const vid = sup.ai.video.create("drone shot of a beach", { model: "best", resolution: "1080p" });

// Image model caveats:
// - referenceImages: works with "fast", "gemini-3-pro-image", and "gemini-3.1-flash-image-preview", not "best"
// - useWebSearch: works with "gemini-3-pro-image" and "gemini-3.1-flash-image-preview"
```

### Runtime contracts

```js
// Input/reply precedence:
// - sup.input.text uses invocation text first
// - when replying, sup.input.texts may include both invocation + reply text
// - for exact values use sup.message.body (invocation) and sup.reply?.body (reply target)

// launch() contract:
// - if launch() exists, it is used instead of main()
// - launch() must return sup.html(...)
function launch() {
  return sup.html("<div>App</div>");
}

// Button callback must be top-level (not nested)
function handleVote(event) {
  sup.message.set("votes", (sup.message.get("votes") || 0) + 1);
}

function main() {
  return sup.button("Vote", handleVote);
}

// Segments are client-side display only (NOT security)
// Do not place secrets/private data in sup.segment(...)
```

### Secrets and inter-patch communication

```js
// Secrets
const apiKey = sup.secret("OPENAI_API_KEY");
// Never log or return secret values

// Inter-patch calls: use .public and serializable args/results
const util = sup.patch("/alice/tools");
const summary = util.public.summarize("hello");

// Prefer serializable values (e.g., pass image.url, not complex objects)
const bgRemoved = util.public.removeBg(sup.input.image?.url);
```

## ⚠️ Common Mistakes

```js
// ❌ WRONG: main() does not receive arguments
function main(message) { return message.text; }

// ✅ CORRECT: use sup.input
function main() { return sup.input.text; }

// ❌ WRONG: nested callback for button
function main() {
  function onClick() { sup.set("x", 1); } // not top-level
  return sup.button("Click", onClick);
}

// ✅ CORRECT: callback at top level
function onClick() { sup.set("x", 1); }
function main() { return sup.button("Click", onClick); }

// ❌ WRONG: bare html tag doesn't exist
return html`
  <div>Hello</div>
`;

// ✅ CORRECT: use sup.html()
return sup.html(`
  <div>Hello</div>
`);

// ❌ WRONG: sup.assets is an array, not an object
const img = sup.assets["myimage.png"];

// ✅ CORRECT: use sup.asset() to look up by filename
const img = sup.asset("myimage.png");

// ❌ WRONG: manual user ID prefixing for per-user data
const notes = sup.get(`notes_${event.user.id}`);
sup.set(`notes_${event.user.id}`, notes);

// ✅ CORRECT: use sup.user.get()/set() for per-user state
const notes = sup.user.get("notes");
sup.user.set("notes", notes);

// ❌ WRONG: sup.input() is not a function; sup.text() and sup.md() don't exist
const text = sup.input();  // sup.input is an object, not a function
return sup.text("Hello!"); // there is no sup.text() method
return sup.md("# Title");  // there is no sup.md() method

// ✅ CORRECT: access input properties, return strings directly
const text = sup.input.text;
return "Hello!";
return "# Title";

// ❌ WRONG: using await with sup.fetch() and response methods
const response = await sup.fetch(url); // await not needed!
const data = await response.json();    // await not needed!
// ✅ CORRECT: Sup patch runtime is synchronous — no await needed
const response = sup.fetch(url);
const data = response.json();
// Note: sup.fetch(), response.json(), response.text() all work synchronously

// ❌ WRONG: segment as security boundary
return sup.segment(adminUser, "SECRET_API_KEY=...", "hidden");

// ✅ CORRECT: gate in logic and never include secrets in output
if (sup.user.id !== adminUser.id) return "Unauthorized";
return "Admin tools";
```

## API Reference

Detailed documentation for each API (append `.md` to any URL for plain text):

### Core Package

- [sup](https://sup.net/docs/reference/packages/sup.md) - Main API: input, state, media creation, fetch, and more

### AI APIs

- [sup.ai](https://sup.net/docs/reference/packages/sup.ai.md) - Text generation, structured output, embeddings, TTS
- [sup.ai.image](https://sup.net/docs/reference/packages/sup.ai.image.md) - AI image generation and manipulation
- [sup.ai.video](https://sup.net/docs/reference/packages/sup.ai.video.md) - AI video generation
- [sup.ai.audio](https://sup.net/docs/reference/packages/sup.ai.audio.md) - AI audio processing

### Types

- [SupInput](https://sup.net/docs/reference/types/Input.md) - User input (text, images, audio, video, files)
- [SupImage](https://sup.net/docs/reference/types/Image.md) - Image object with editing capabilities
- [SupAudio](https://sup.net/docs/reference/types/Audio.md) - Audio object with editing capabilities
- [SupVideo](https://sup.net/docs/reference/types/Video.md) - Video object with editing and sequencing
- [SupHTML](https://sup.net/docs/reference/types/HTML.md) - Custom HTML content for rich displays
- [SupUser](https://sup.net/docs/reference/types/User.md) - User information and per-user state
- [SupMessage](https://sup.net/docs/reference/types/Message.md) - Message data and reply handling
- [SupChat](https://sup.net/docs/reference/types/Chat.md) - Chat information and state
- [SupButton](https://sup.net/docs/reference/types/Button.md) - Interactive buttons
- [SupSegment](https://sup.net/docs/reference/types/Segment.md) - User-specific content visibility

### Lifecycle Functions

- [main](https://sup.net/docs/reference/functions/main.md) - Entry point, called on each patch run
- [init](https://sup.net/docs/reference/functions/init.md) - Called when the patch is saved/updated
- [onReact](https://sup.net/docs/reference/functions/onReact.md) - Called when a user reacts with an emoji
- [onThreadReply](https://sup.net/docs/reference/functions/onThreadReply.md) - Called on thread replies
- [launch](https://sup.net/docs/reference/functions/launch.md) - For patches returning HTML via the launch API

### Other Packages

- [sup.random](https://sup.net/docs/reference/packages/sup.random.md) - Random numbers and array shuffling
- [sup.global](https://sup.net/docs/reference/packages/sup.global.md) - Global state across all patch runs
- [sup.ex](https://sup.net/docs/reference/packages/sup.ex.md) - External API helpers
- [sup.vc](https://sup.net/docs/reference/packages/sup.vc.md) - Voice call functionality
- [sup.search](https://sup.net/docs/reference/packages/sup.search.md) - Search functionality

### Guides

- [Getting Started](https://sup.net/docs/guides/getting-started.md)
- [Working with Input](https://sup.net/docs/guides/input.md)
- [AI Features](https://sup.net/docs/guides/ai.md)
- [State Management](https://sup.net/docs/guides/state.md)
- [Media Editing](https://sup.net/docs/guides/media-editing.md)
- [Interactivity](https://sup.net/docs/guides/interactivity.md)
- [External APIs](https://sup.net/docs/guides/external.md)

---

## Essential Type Definitions

### `sup` Object Properties

```typescript
sup.id: string           // Unique patch ID
sup.user: SupUser        // User who ran the patch
sup.chat?: SupChat       // Current chat (if available)
sup.message?: SupMessage // Current message
sup.input: SupInput      // User input data
sup.assets: (SupImage | SupAudio | SupVideo | SupFile)[]
sup.assets.images: SupImage[] // Asset helpers
sup.assets.audios: SupAudio[]
sup.assets.videos: SupVideo[]
sup.assets.files: SupFile[]
sup.this: SupPatch     // Current patch metadata
sup.caller: SupCaller  // user/patch + isPreview info
sup.isReply: boolean   // True if replying to a message
sup.reply?: SupMessage // The replied-to message
```

### `sup` Methods

```typescript
// Media creation
sup.image(urlOrBlob: string | Blob | SupSVG | SupHTML): SupImage
sup.audio(urlOrBlob: string | Blob): SupAudio
sup.video(urlOrImage: string | SupImage): SupVideo
sup.file(url: string, mimeType: string): SupFile
sup.html(html: string | SupBundle, options?: {
  width?: number,              // 100-1600 pixels
  height?: number,             // 100-1600 pixels
  type?: "html" | "image" | "video", // Output format (default: "html")
  duration?: number,           // Required for type: "video"
  tailwind?: boolean | string, // Tailwind CSS enabled by default; set false to disable, or pass a version string
  transparent?: boolean,       // Transparent background for type: "image"
  webgl?: boolean,             // WebGL renderer for Three.js/WebGL content
  waitForSelector?: string,    // CSS selector to wait for before capture
  callbacks?: string[]         // Functions callable from the client via window.sup.exec()
}): SupHTML
sup.svg(content: string): SupSVG
sup.gif(imagesOrVideo: SupImage[] | SupVideo, options?: { frameRate?: number }): SupImage
sup.sequence(clips: (SupVideo | SupImage | string)[]): SupSequence

// State management
sup.set(key: string, value: any): void
sup.get(key: string): any | null

// Assets
sup.asset(filename: string): SupImage | SupAudio | SupVideo | SupFile

// UI
sup.button(label: string, callback: Function, value?: any): SupButton
sup.status(message: string): void
sup.segment(users, matching?, notMatching?): SupSegment
sup.metadata(data: any): SupMetadata

// Network (all synchronous - no await needed!)
sup.fetch(url: string, options?: FetchOptions): FetchResponse // sync
sup.scrape(url: string, prompt?: string): string              // sync
sup.screenshot(url: string, selector?: string): SupImage      // sync
// Response methods are also sync: response.json(), response.text(), response.blob()

// Utilities
sup.uuid(): string
sup.sleep(ms: number): void
sup.secret(key: string): string
sup.emoji(name: string): SupEmoji | undefined
sup.patch(idOrFullname: string): SupPatch
sup.makePublic(...fns: Function[]): void
```

### `SupInput` Type

```typescript
sup.input.text?: string      // First text input
sup.input.image?: SupImage   // First image attachment
sup.input.images: SupImage[] // All image attachments
sup.input.audio?: SupAudio   // First audio attachment
sup.input.audios: SupAudio[] // All audio attachments
sup.input.video?: SupVideo   // First video attachment
sup.input.videos: SupVideo[] // All video attachments
sup.input.files: SupFile[]   // All file attachments
sup.input.combined: (string | SupImage | SupAudio | SupVideo | SupFile | SupEmoji)[]
sup.input.texts: string[]    // Includes invocation/reply text when both exist
// For exact sources: sup.message?.body (invocation), sup.reply?.body (replied-to message)
```

### `SupUser` Type

```typescript
sup.user.id: string       // Unique user ID
sup.user.username: string // Display username
sup.user.pfp?: SupImage   // Profile picture image (NOT .avatar!)
sup.user.get(key): any         // Get per-user state
sup.user.set(key, value): void // Set per-user state
```

### `sup.ai` Methods

```typescript
// Text generation
sup.ai.prompt(...inputs: (string | SupImage)[]): string
sup.ai.prompt(...inputs, options: {
  temperature?: number, // 0-1, controls randomness
  schema?: object,      // Structured output schema
  reasoning?: boolean | { effort: "high" | "medium" | "low" },
  tools?: Record
}): string | object
sup.ai.promptWithContext(...inputs): string   // Auto-includes message context
sup.ai.promptFull(...inputs): SupAICompletion // Returns full completion object

// Other AI
sup.ai.tts(...text: string[], options?: { temperature?, exaggeration?, cfg? }): SupAudio
sup.ai.embedding.embed(input: string | SupImage | SupAudio | SupVideo, options?: SupAIEmbeddingOptions): SupEmbedding
sup.ai.embedding.distance(a: SupEmbedding, b: SupEmbedding | string | SupImage | SupAudio | SupVideo): number
sup.ai.embedding.nearest(target: SupEmbedding, candidates: (SupEmbedding | string | SupImage | SupAudio | SupVideo)[]): { item: SupEmbedding, distance: number, index: number }

// Image AI
sup.ai.image.create(prompt: string, options?: {
  model?: "best" | "fast" | "gemini-3-pro-image" | "gemini-3.1-flash-image-preview", // "best" = highest quality, "fast" = faster
  quality?: "low" | "medium" | "high",
  aspectRatio?: string,         // e.g., "16:9", "1:1"
  referenceImages?: SupImage[], // Style/content guidance (max 14, not supported by "best")
  useWebSearch?: boolean        // Real-time data (Gemini web-search image models only)
}): SupImage
sup.ai.image.edit(image: SupImage | SupImage[], prompt: string, options?: {
  model?: "best" | "fast" | "gemini-3-pro-image" | "gemini-3.1-flash-image-preview",
  quality?: "low" | "medium" | "high",
  mask?: SupImage,              // Specify areas to edit
  referenceImages?: SupImage[], // Style guidance (not supported by "best")
  useWebSearch?: boolean        // Real-time data (Gemini web-search image models only)
}): SupImage
sup.ai.image.interpret(image: SupImage, prompt: string): string // Analyze/describe image

// Video AI
sup.ai.video.create(prompt: string | SupImage, options?: {
  duration?: number,             // Length in seconds
  aspectRatio?: "16:9" | "9:16", // Landscape or portrait
  resolution?: "720p" | "1080p"
}): SupVideo
sup.ai.video.interpret(video: SupVideo, prompt?: string): string // Analyze/describe video

// Audio AI
sup.ai.audio.interpret(audio: SupAudio, prompt?: string): string // Transcribe/analyze audio
```

### `launch()` Contract

```typescript
// If launch() is defined, it is called instead of main()
// launch() must return SupHTML (from sup.html)
function launch(): SupHTML
```

### Media Editing

```typescript
// Image editing
image.edit()
  .resize(width, height, opts?) // opts: { mode?: "cover"|"contain"|"fill", pixelated?: boolean }
  .crop(top, left, bottom, right)
  .rotate(degrees)
  .flip(horizontal, vertical)
  .grayscale()
  .blur(sigma)
  .sharpen(sigma)
  .invert()
  .composite(...overlayImages)
  .render(): SupImage

// Video editing
video.edit()
  .trimStart(seconds)          // Trim from beginning
  .trimEnd(seconds)            // Trim from end
  .duration(seconds)           // Set duration
  .fadeIn(seconds)
  .fadeOut(seconds)
  .addAudio(audio, mode?)      // mode: "mix" | "replace"
  .addOverlay(image, position) // position: "topLeft" | "topRight" | etc.
  .addCaption(text, options?)
  .render(): SupVideo

// Video sequences
sup.sequence([video1, video2, image]).render(): SupVideo
```

---

## Common Examples

### Inline HTML output

```js
function main() {
  const name = sup.input.text || "world";
  return sup.html(`
    <div>
      <h1>Hello, ${name}!</h1>
      <p>This is an inline HTML patch.</p>
    </div>
  `);
}
```

### Analyze image with AI

```js
function main() {
  const image = sup.input.image;
  if (!image) return "Please attach an image";
  // Use sup.ai.image.interpret() for image analysis (NOT sup.ai.prompt)
  return sup.ai.image.interpret(image, "Describe this image in detail");
}
```

### Interactive counter with button

```js
function main() {
  const count = sup.get("count") || 0;
  return [
    `Count: ${count}`,
    sup.button("Increment", handleClick)
  ];
}

// Must be top-level to be used by sup.button
function handleClick() {
  sup.set("count", (sup.get("count") || 0) + 1);
}
```

### Generate AI image

```js
function main() {
  const prompt = sup.input.text || "a beautiful sunset";
  return sup.ai.image.create(prompt);
}
```

### Structured AI response

```js
function main() {
  const result = sup.ai.prompt("Analyze this text", sup.input.text, {
    schema: {
      sentiment: { type: "string", enum: ["positive", "negative", "neutral"] },
      keywords: { type: "array", items: { type: "string" } },
      summary: { type: "string", maxLength: 100 }
    }
  });
  return `Sentiment: ${result.sentiment}\nKeywords: ${result.keywords.join(", ")}`;
}
```

### Edit and composite images

```js
function main() {
  const bg = sup.input.image;
  const overlay = sup.asset("logo.png");
  if (!bg) return "Attach a background image";
  return bg.edit()
    .resize(800, 600)
    .composite(overlay)
    .render();
}
```

### AI with function calling

```js
function main() {
  return sup.ai.prompt("What is the weather in Tokyo?", {
    tools: {
      get_weather: {
        description: "Get weather for a city",
        parameters: { city: { type: "string", description: "City name" } },
        callback: (args) => {
          // In real usage, call a weather API here
          return `Weather in ${args.city}: 22°C, sunny`;
        }
      }
    }
  });
}
```

### User-specific content

```js
function main() {
  const secretUser = sup.user; // Personalization only; not access control
  return sup.segment(
    secretUser,
    "🎉 You can see this secret message!",
    "Nothing to see here..."
  );
}
```

### Lifecycle function signatures

```js
// onReact receives an event object, NOT separate parameters
function onReact(e) {
  const emoji = e.reactionEmoji; // The emoji that was reacted with
  const user = e.user;           // SupUser who reacted
  sup.set("lastReaction", emoji);
}

// onThreadReply receives an event object
function onThreadReply(e) {
  const input = e.input; // SupInput from the thread reply
  const user = e.user;   // SupUser who replied
  const text = e.input.text;
}
```

### Calling other patches

```js
// Access another patch and call its public functions
const otherPatch = sup.patch("patchId");
const result = otherPatch.public.functionName(); // Must use the .public accessor

// Make functions callable from other patches
function getScore() {
  return sup.get("score") || 0;
}
sup.makePublic(getScore); // Now callable as patch.public.getScore()

// Cross-patch calls should use serializable values
const result2 = otherPatch.public.processImageUrl(sup.input.image?.url);
```

### Hidden metadata for workflows

```js
// Return visible output + hidden machine-readable state
return [
  "Round complete",
  sup.metadata({ round: 3, score: 120 })
];

// Later (on reply), read metadata from the replied-to message
const state = sup.reply?.metadata?.[0];
```

### Interactive HTML app with callbacks

**Important:** When generating HTML content, make it responsive and ensure it supports both web and mobile controls. Use relative units, flexible layouts, and touch-friendly interactions so patches work well on all screen sizes.

Use `callbacks` to enable two-way communication between HTML content and server-side patch code. This is essential for building interactive HTML apps that need AI or server-side processing.

**`window.sup` is automatically injected into ALL HTML content rendered by `sup.html()`** — both inline HTML strings and uploaded asset bundles. You do not need to import, initialize, or set up any bridge.
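As a minimal sketch of that bridge in action (the `greet` callback name, the button markup, and the greeting string are illustrative, not part of the documented API):

```js
// Server side: a hypothetical callback exposed to the HTML content.
// Callbacks receive a SupUserEvent; the client argument arrives as event.value.
function greet(event) {
  return `Hi, ${event.value}!`;
}

function main() {
  // The client calls window.sup.exec("greet", name) — no setup or import needed.
  return sup.html(
    `<button onclick="window.sup.exec('greet', 'Ada').then(alert)">Greet</button>`,
    { callbacks: ["greet"] }
  );
}
```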
The following client-side APIs are always available:

- `window.sup.exec(functionName, ...args)` - Call server-side callback functions (returns a Promise)
- `window.sup.share(content)` - Share content to chat (text, URLs, or `{ url, type: 'image' }` for data URLs)
- `window.sup.user` - Read-only user info: `{ id, username, pfp }`
- `window.sup.chat` - Read-only chat info: `{ id, title, type }`

**⚠️ CRITICAL: Functions listed in `callbacks` receive a `SupUserEvent` object, NOT the raw arguments. Access the passed value via `event.value`.**

**Callbacks work with both inline HTML and asset bundles, from both `main()` and `launch()`:**

```js
// Inline HTML with callbacks — works from main()
function handleClick(event) {
  const city = event.value;
  return sup.ai.prompt(`Tell me about ${city}`);
}

function main() {
  return sup.html(`
    <button onclick="window.sup.exec('handleClick', 'Tokyo')">Tokyo</button>
  `, { callbacks: ["handleClick"] });
}
```

```js
// Asset bundle with callbacks — works from launch()
function generateResponse(event) {
  const userInput = event.value;
  return sup.ai.prompt("You are a helpful assistant. User input: " + userInput);
}

function launch() {
  return sup.html(sup.asset("myApp"), { callbacks: ["generateResponse"] });
}
```

```html
<!-- Client side (inside the asset bundle): call the server callback via window.sup.exec -->
<input id="q" placeholder="Ask something" />
<button onclick="send()">Send</button>
<script>
  async function send() {
    const reply = await window.sup.exec("generateResponse", document.getElementById("q").value);
    document.body.append(reply);
  }
</script>
```

**Getting user info in HTML apps:** `window.sup.user` is available directly in client-side HTML — no callback needed for basic user info:

```js
// Client side — use window.sup.user directly (works in both inline HTML and asset bundles)
document.getElementById("username").textContent = window.sup.user.username;
document.getElementById("avatar").src = window.sup.user.pfp;
```

**Common mistakes to avoid:**

```js
// ❌ WRONG: expecting the raw argument
function generateResponse(userInput) {
  return sup.ai.prompt(userInput); // userInput is actually a SupUserEvent!
}

// ✅ CORRECT: receive the event object, access event.value
function generateResponse(event) {
  const userInput = event.value;
  return sup.ai.prompt(userInput);
}

// ❌ WRONG: expecting a wrapped response in the client
const result = await window.sup.exec("myFunc", "hello");
const text = result.value; // Wrong! result IS the value

// ✅ CORRECT: the result is the return value directly
const text = await window.sup.exec("myFunc", "hello");

// ❌ WRONG: using sup.share() in a server-side exec function
function saveDrawing(event) {
  sup.share(event.value); // sup.share() doesn't exist server-side!
}

// ✅ CORRECT: use window.sup.share() in client-side HTML
// Client HTML:
await window.sup.share('Hello!');                        // text
await window.sup.share('https://example.com/image.png'); // image URL (auto-detected)
await window.sup.share({ url: dataUrl, type: 'image' }); // base64 data URL (MUST use explicit type!)
await window.sup.share(['text', 'https://img.url/a.png']); // Multiple items
// ⚠️ For base64 data URLs (canvas.toDataURL()), you MUST use { url, type: 'image' }
// Plain data URLs are treated as text, not images!

// ❌ WRONG: using event.user.get/set for per-user state
function saveNotes(event) {
  event.user.set('notes', event.value); // event.user.set doesn't exist!
  return event.user.get('notes');       // event.user.get doesn't exist!
}

// ✅ CORRECT: use sup.user.get/set for per-user persistent state
function saveNotes(event) {
  sup.user.set('notes', event.value); // sup.user.set() for state
  return sup.user.get('notes');       // sup.user.get() for state
}

// event.user vs sup.user — they are DIFFERENT:
// - event.user = user INFO (read-only): event.user.username, event.user.pfp, event.user.id
// - sup.user.get/set = per-user persistent STATE storage
```

---

For complete API documentation with all properties and methods, see https://sup.net/docs/llms-full.txt
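As a closing sketch tying several of the patterns above together (per-user state, `sup.random`, and a top-level button callback) — the game itself and the `scoreLabel` helper are illustrative, not part of the documented API:

```js
// Pure helper: classify a running total into a label (plain JS, no platform APIs).
function scoreLabel(total) {
  if (total >= 10) return "champion";
  if (total >= 5) return "contender";
  return "rookie";
}

// Top-level button callback (required — nested callbacks don't work).
function handleRoll() {
  const roll = sup.random.integer(1, 6); // random int 1-6
  sup.user.set("total", (sup.user.get("total") || 0) + roll); // per-user state
}

function main() {
  const total = sup.user.get("total") || 0;
  return [
    `Your total: ${total} (${scoreLabel(total)})`,
    sup.button("Roll", handleRoll)
  ];
}
```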