# Sup API Documentation
> Complete API reference for Sup, a real-time chat platform where users create interactive JavaScript "patches" that process text, images, audio, and video inputs.
Sup patches are JavaScript functions that can access user inputs, generate AI content, manipulate media, and create interactive experiences in chat.
## Quick Navigation
- [Getting Started](#getting-started) - Basic patch structure and common patterns
- [Core Types](#core-types) - Essential data types for media and user interaction
- [Essential APIs](#essential-apis) - Most commonly used functions
- [AI & Media](#ai-media) - AI generation and media processing
- [Advanced Features](#advanced-features) - Specialized functionality
- [Reference](#reference) - Complete API reference
## Getting Started {#getting-started}
Every patch needs a `main()` function that returns content for display:
```js
function main() {
  return "Hello, world!";
}
```
Access user input via `sup.input`, create media with `sup.image()`, `sup.audio()`, `sup.video()`, generate AI content with `sup.ai.prompt()`, and store data with `sup.set()`/`sup.get()`.
**Important - Working with Assets:** To get an asset by filename, use `sup.asset("filename.png")`. Do NOT use `sup.assets["filename.png"]`; that syntax does not work because `sup.assets` is an array, not an object.
## Core Types {#core-types}
Essential data types used in most patches:
- [SupVideo](#supvideo): object represents a video. Returning one from the `main()` function will display the video in the chat
- [SupUser](#supuser): object represents a user on Sup. It provides info about the user and methods for storing/retrieving user-specific data
- [SupMessage](#supmessage): object represents a message in a Sup conversation. It contains the message content, author information, and associated input/result data
- [SupInput](#supinput): object represents a collection of input data in various formats (text, images, audio, video, and other files) used as input to patch functions. It is available as `sup.input` in your patch functions
- [SupImage](#supimage): object represents an image in Sup. It can be created from either a URL or a Blob
- [SupHTML](#suphtml): object represents custom HTML content that can be displayed in Sup. It supports static content, images, videos, and interactive elements. It's intended to be returned from the `main()` function, to be displayed in the chat
- [SupAudio](#supaudio): object represents an audio file. It can be created from either a URL or a Blob
## Essential APIs {#essential-apis}
Most commonly used packages and functions:
- [sup.random](#suprandom): provides methods for generating random numbers and working with randomized arrays
- [sup](#sup): The `sup` package contains information about the patch being run and provides access to various methods for interacting with the Sup platform.
- [sup.global](#supglobal): provides methods for storing and retrieving global state that persists across all patch runs on Sup. It is accessed through `sup.global`
- [sup.ai](#supai): provides access to the runtime's basic AI capabilities, including text generation and analysis, and embeddings. For image operations, see the [`sup.ai.image`](/docs/reference/packages/sup.ai.image) package
- [onThreadReply](#onthreadreply): will be called when a user replies in the comments (or thread) of the patch message. `onThreadReply` should be defined at the top level of the patch code, outside of any other functions
- [onReact](#onreact): will be called when a user reacts with an emoji to the patch message. `onReact` should be defined at the top level of the patch code, outside of any other functions
- [main](#main): the entry point for the patch and is called every time the patch is run
- [launch](#launch): specifically designed for patches that return HTML content. When a patch with `launch()` is executed via the launch API, it must return a [`SupHTML`](/docs/reference/types/html) object. Other return types will result in an error
- [init](#init): will be called when a patch is saved or updated
## AI & Media {#ai-media}
AI generation and media processing capabilities:
- [sup.vc](#supvc): provides access to voice call functionality in Sup
- [sup.ex](#supex): provides convenience functions for interacting with external APIs
- [sup.ai.image](#supaiimage): provides AI-powered image generation and manipulation capabilities. It is accessed through `sup.ai.image`
## Advanced Features {#advanced-features}
Specialized types and functionality:
- [SupSVG](#supsvg): object represents SVG content in Sup. You can return it from the `main()` function to display it in the chat
- [SupFile](#supfile): object represents assets uploaded in the patch editor that aren't images, audio, or video
- [SupEmoji](#supemoji): object represents a custom emoji in Sup, which can have both an image and an audio component
- [SupChat](#supchat): object represents a chat or conversation in Sup. It provides information about the chat and methods for storing chat-specific data
- [SupButton](#supbutton): object represents an interactive button in Sup that can trigger callbacks when clicked. It is created using [`sup.button()`](/docs/reference/packages/sup#button), and should be returned from the `main()` function
## Reference {#reference}
Complete API documentation with examples and detailed usage:
---
## SupVideo {#supvideo}
The `SupVideo` object represents a video. Returning one from the `main()` function will display the video in the chat.
```js title="Example Video"
SupVideo: {
  "url": "https://user-uploads.cdn.overworld.xyz/v8hd62jka9lpqmxn4wfr3tcy.m3u8",
  "filename": "screen-recording.mp4"
}
```
---
## Properties
### `filename`
type: `string | undefined`
The filename of the video, if available. This is usually set when the video is uploaded as an asset in the patch editor.
### `url`
type: `string`
Gets the CDN URL for the video. Note: This URL may sometimes be an M3U8 playlist URL.
### `width`
type: `number | undefined`
The width of the video in pixels. This is automatically populated when video metadata is available.
### `height`
type: `number | undefined`
The height of the video in pixels. This is automatically populated when video metadata is available.
### `duration`
type: `number | undefined`
The duration of the video in seconds. This is automatically populated when video metadata is available.
### `thumbnailUrl`
type: `string | undefined`
A URL to a thumbnail image of the video. This is automatically generated and populated when video metadata is available.
### `firstFrame`
type: `SupImage | undefined`
The first frame of the video as an image object. Accessed lazily when needed.
### `lastFrame`
type: `SupImage | undefined`
The last frame of the video as an image object. Accessed lazily when needed.
### `audio`
type: `SupAudio | undefined`
The audio track extracted from the video. Accessed lazily when needed.
---
## Creating Videos from Images
You can create videos from images using `sup.video()` with a `SupImage` parameter. By default, image-to-video conversion creates a **3-second video clip**.
```js
// Create a 3-second video from an image (default duration)
const imageVideo = sup.video(sup.image("photo.jpg"));

// Customize the duration using the edit cursor
const longerImageVideo = sup.video(sup.image("slide.png"))
  .edit()
  .duration(5) // Make it 5 seconds instead of 3
  .render();

// Add effects to image-based videos
const enhancedImageVideo = sup.video(sup.image("background.jpg"))
  .edit()
  .duration(8)
  .fadeIn(1)
  .fadeOut(1)
  .addAudio(sup.audio("music.mp3"), 'mix')
  .render();
```
This is useful for creating slide presentations, static backgrounds with audio, or transitional elements in video sequences.
---
## Methods
### `edit()`
Returns a `SupVideoEditCursor` that allows you to chain video editing operations.
```js
const editedVideo = sup.video("video.mp4")
  .edit()
  .addAudio(sup.audio("music.mp3"), 'mix')
  .addOverlay(sup.image("logo.png"), 'topRight')
  .duration(10)
  .fadeIn(1)
  .fadeOut(1)
  .addCaption("Hello World!", { position: 'bottom' })
  .render();
```
#### Video Editing Operations
- `addAudio(audio: SupAudio | string, mode?: 'mix' | 'replace')` - Add audio track
- `addOverlay(media: SupImage | SupVideo | string, position: 'topLeft' | 'topRight' | 'bottomLeft' | 'bottomRight' | 'center')` - Add overlay
- `duration(seconds: number)` - Set video duration (useful for customizing image-to-video duration)
- `fadeIn(seconds: number)` - Add fade in effect
- `fadeOut(seconds: number)` - Add fade out effect
- `trimStart(seconds: number)` - Trim from beginning
- `trimEnd(seconds: number)` - Trim from end
- `addCaption(text: string, options?: { position?: 'top' | 'center' | 'bottom' })` - Add text caption
- `render()` - Apply all edits and return new SupVideo
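The chaining above works because each operation returns the cursor itself, and `render()` applies the queued edits. A minimal sketch of that pattern, using a mock cursor of our own (`MockEditCursor` is illustrative, not the real `SupVideoEditCursor`, which performs actual media processing):

```javascript
// Minimal sketch of a chainable edit cursor: each method queues an
// operation and returns `this`; render() replays the queue.
// MockEditCursor is our illustration, not part of the Sup API.
class MockEditCursor {
  constructor() {
    this.ops = [];
  }
  duration(seconds) {
    this.ops.push({ op: "duration", seconds });
    return this; // returning `this` is what makes chaining work
  }
  fadeIn(seconds) {
    this.ops.push({ op: "fadeIn", seconds });
    return this;
  }
  fadeOut(seconds) {
    this.ops.push({ op: "fadeOut", seconds });
    return this;
  }
  render() {
    // The real API produces a new SupVideo here; we just return
    // the accumulated edit list for inspection.
    return this.ops;
  }
}

const edits = new MockEditCursor().duration(10).fadeIn(1).fadeOut(1).render();
console.log(edits.map((e) => e.op).join(",")); // "duration,fadeIn,fadeOut"
```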
---
## Video Sequences
Create sequences of multiple video clips using `sup.sequence()`. The `SupSequence` object represents a sequence of video clips that can be combined into a single video, supporting mixing of videos, images, URLs, and asset filenames.
```js title="Basic Sequence"
const sequence = sup.sequence([
  "intro.mp4",                      // Asset filename
  "https://example.com/middle.mp4", // Full URL
  sup.video("main.mp4"),            // SupVideo object
  sup.image("end-slide.jpg")        // Images become 3-second video clips
]);

const finalVideo = sequence.render();
```
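Since images become 3-second clips by default, a sequence's total runtime is easy to estimate. A small helper of our own (not a Sup API), assuming each item is tagged as `{ kind: "video", duration }` or `{ kind: "image" }`:

```javascript
// Estimate total sequence runtime: videos contribute their own duration,
// images contribute the 3-second default clip length.
// This is an illustrative helper, not part of the Sup API.
const IMAGE_CLIP_SECONDS = 3;

function estimateSequenceDuration(items) {
  return items.reduce((total, item) => {
    if (item.kind === "video") return total + item.duration;
    if (item.kind === "image") return total + IMAGE_CLIP_SECONDS;
    throw new Error(`unknown item kind: ${item.kind}`);
  }, 0);
}

const total = estimateSequenceDuration([
  { kind: "image" },                 // title slide -> 3s
  { kind: "video", duration: 12.5 }, // main clip
  { kind: "image" },                 // outro -> 3s
]);
console.log(total); // 18.5
```

If you override an image clip's duration with `.edit().duration(n).render()`, substitute `n` for the 3-second default in the estimate.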
### Sequence Editing
Like individual videos, sequences support the same editing operations:
```js
const editedSequence = sup.sequence([
  "clip1.mp4",                    // Asset filename
  "clip2.mp4",                    // Asset filename
  "https://example.com/clip3.mp4" // Full URL
])
  .addAudio(sup.audio("background-music.mp3"), 'mix')
  .addOverlay(sup.image("watermark.png"), 'bottomRight')
  .fadeIn(2)
  .fadeOut(2)
  .render();

// Customize image clip durations in sequences
const customDurationSequence = sup.sequence([
  sup.video(sup.image("intro.jpg")).edit().duration(5).render(), // 5-second intro
  sup.video("main-content.mp4"),
  sup.video(sup.image("outro.jpg")).edit().duration(2).render()  // 2-second outro
]).render();
```
### Mixed Media Examples
```js
// Presentation-style video
function main() {
  const sequence = sup.sequence([
    sup.image("title-slide.jpg"), // 3-second image clip
    sup.video("presentation.mp4"),
    sup.image("thank-you.jpg")    // 3-second image clip
  ]);

  return sequence
    .addAudio(sup.audio("background.mp3"), 'mix')
    .addOverlay(sup.image("logo.png"), 'topRight')
    .render();
}
```
---
## Output
Returning a SupVideo from a patch's `main()` function will display the video in the chat.
```js
function main() {
  return sup.video("https://example.com/video.mp4");
}
```
## SupUser {#supuser}
The `SupUser` object represents a user on Sup. It provides info about the user and methods for storing/retrieving user-specific data.
```js title="Example User"
// Get info about the current user
sup.user.username // => "alice"
sup.user.displayName // => "Alice"
sup.user.pfp // => "SupImage: { ... }"
sup.user.banner // => "SupImage: { ... }"
sup.user.id // => "cabcdefghijklm0123456789"
// Get info about other users
sup.reply.author.username // => "bob"
sup.reply.author.displayName // => "Bob"
sup.reply.author.pfp // => "SupImage: { ... }"
sup.reply.author.banner // => "SupImage: { ... }"
sup.reply.author.id // => "cabcdefghijklm0123456789"
```
## Properties
### username
type: `string`
The user's username.
### displayName
type: `string`
The user's display name.
### pfp
type: `SupImage`
The user's profile picture.
### banner
type: `SupImage`
The user's profile banner.
### id
type: `string`
The user's unique ID.
## Methods
### set
type: `function(key: string, value: any): void`
Store a value in user-scoped state. For example:
```js
sup.user.set("favoriteColor", "green");
```
The key can be any string, and the value can be any JSON-serializable object (including Sup classes like `SupUser`, `SupImage`, `SupMessage`, etc.).
### get
type: `function(key: string): any`
Get a value from user-scoped state. For example:
```js
const favoriteColor = sup.user.get("favoriteColor"); // "green"
```
### keys
type: `function(): string[]`
Returns an array of all keys stored in this user's scoped state.
```js
const keys = sup.user.keys(); // ["favoriteColor", "gameScore", ...]
```
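Together, `set`, `get`, and `keys` behave like a per-user key-value store. A runnable sketch with a Map-backed mock (the mock is ours; the real store persists server-side, and the `null` return for missing keys here mirrors `sup.message.get` and is an assumption for `SupUser`):

```javascript
// Map-backed mock of the user-scoped state API (set/get/keys).
// Illustrative only: the real store persists on Sup's servers.
function makeUserStateMock() {
  const store = new Map();
  return {
    set(key, value) {
      // Values must be JSON-serializable, as in the real API.
      store.set(key, JSON.parse(JSON.stringify(value)));
    },
    get(key) {
      // Returning null for missing keys is our assumption here,
      // matching the documented behavior of sup.message.get.
      return store.has(key) ? store.get(key) : null;
    },
    keys() {
      return [...store.keys()];
    },
  };
}

const user = makeUserStateMock();
user.set("favoriteColor", "green");
user.set("gameScore", 42);
console.log(user.get("favoriteColor")); // "green"
console.log(user.keys()); // ["favoriteColor", "gameScore"]
```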
## SupMessage {#supmessage}
The `SupMessage` object represents a message in a Sup conversation. It contains the message content, author information, and associated input/result data.
```js title="Example Message"
SupMessage: {
  "id": "zhgxm4knad2uzi32ylx1pqah",
  "body": "sup, world!",
  "author": {
    "id": "123",
    "username": "riffbit",
    "bio": "sup"
  },
  "input": {
    "text": "sup, world!",
    "texts": ["sup, world!"],
    "image": undefined,
    "images": [],
    "audio": undefined,
    "audios": [],
    "video": undefined,
    "videos": [],
    "combined": ["sup, world!"]
  }
}
```
## Properties
### `id`
type: `string`
The unique identifier for the message.
### `body`
type: `string`
The text content of the message.
### `author`
type: `SupUser`
The user who sent this message.
### `input`
type: `SupInput`
Input data that was specifically associated with this message.
### `result`
type: `SupInput` `| undefined`
If this message is a reply to another message that ran a patch, contains the result data from that patch.
### `pinned`
type: `boolean`
Whether this message is pinned in the chat.
### `metadata`
type: `any[] | undefined`
```js
// Access metadata from a replied-to message
const gameState = sup.reply.metadata[0];
const fen = gameState.fen;
const pgn = gameState.pgn;
```
An array of metadata objects attached to this message via `sup.metadata()`. Returns `undefined` if no metadata was attached. Each element in the array corresponds to a `sup.metadata()` call in the patch's output.
### `reactions`
type: `{ list(options?: { limit?: number }): SupReaction[]; add(emojiOrId: SupEmoji | string): void; remove(emojiOrId: SupEmoji | string): void }`
```js
// List reactions on the current message
const reactions = sup.message.reactions.list({ limit: 20 });

// Add/remove by emoji id
sup.message.reactions.add('emoji-id');
sup.message.reactions.remove('emoji-id');

// Or with a SupEmoji object
const thumbsUp = sup.emoji(':thumbsup:');
if (thumbsUp) {
  sup.message.reactions.add(thumbsUp);
}
```
Message reaction helper methods:
- `list(options?)`: Returns all reactions on this message. Optional `limit` controls how many reactions are returned.
- `add(emojiOrId)`: Adds a reaction to this message.
- `remove(emojiOrId)`: Removes a reaction from this message.
Each reaction returned by `list()` has:
- `emoji`: `SupEmoji`
- `count`: `number`
- `users`: [`SupUser[]`](https://sup.net/docs/reference/types/user)
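Given that shape, a common pattern is finding the most-used emoji on a message. A runnable sketch over mock reaction data (the data below is illustrative, standing in for a `list()` result):

```javascript
// Find the most-used reaction from a list() result.
// `reactions` mirrors the documented shape: { emoji, count, users }.
function topReaction(reactions) {
  if (reactions.length === 0) return undefined;
  return reactions.reduce((best, r) => (r.count > best.count ? r : best));
}

// Mock data standing in for sup.message.reactions.list()
const mockReactions = [
  { emoji: { name: "thumbsup" }, count: 3, users: [] },
  { emoji: { name: "fire" }, count: 7, users: [] },
  { emoji: { name: "heart" }, count: 2, users: [] },
];
console.log(topReaction(mockReactions).emoji.name); // "fire"
```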
## Methods
### `set`
type: `function(key: string, value: any): void`
```js
sup.message.set("yes", ["riffbit"]);
```
Sets a key-value pair in the patch's state, scoped to the message containing the current patch run.
The key can be any string, and the value can be any JSON-serializable object (including Sup classes like `SupUser`, `SupImage`, `SupMessage`, etc.).
Message-scoped values are only accessible for the active message.
### `get`
type: `function(key: string): any | null`
```js
const yesVotes = sup.message.get("yes") || [];
```
Retrieves the value of a key from the message's state in this patch.
If the key does not exist, `get` returns `null`.
### `keys`
type: `function(): string[]`
```js
const keys = sup.message.keys(); // ["yes", "no", ...]
```
Returns an array of all keys stored in this message's scoped state.
## SupInput {#supinput}
The `SupInput` object represents a collection of input data in various formats (text, images, audio, video, and other files) used as input to patch functions. It is available as `sup.input` in your patch functions.
The instance on `sup.input` looks for input data from different pieces of the message that invoked the patch, and the message it replied to, prioritizing input from the invocation message over the reply.
The effect is flexibility in how a patch gets its input. For example, `sup.input.text` will contain the text used to invoke the patch (`/user/patch `), or, if none is provided but the patch was sent as a reply to another message, the text of the message being replied to. If both are available, `sup.input.text` is set to the invocation text, and `sup.input.texts` is an array containing both. This is sufficient for most patches; for more fine-grained control, the invocation's text can be retrieved via `sup.message.body`, and the reply's via `sup.reply.body`.
`sup.input.image(s)`, `sup.input.audio(s)`, `sup.input.video(s)`, and `sup.input.files` work similarly, with the priority:
- Attachments on the invocation message
- Media returned by the message being replied to
- Files attached to the reply message
```js title="Example Input"
SupInput: {
  "text": "sup, world!",
  "texts": ["sup, world!"],
  "image": {
    "url": "https://user-uploads.cdn.overworld.xyz/cumnehs74u2bfcas8gxsdzqx.webp",
    "width": 256,
    "height": 256
  },
  "images": [{
    "url": "https://user-uploads.cdn.overworld.xyz/cumnehs74u2bfcas8gxsdzqx.webp",
    "width": 256,
    "height": 256
  }],
  "audio": {
    "url": "https://user-uploads.cdn.overworld.xyz/k49lv3f90remnuqln1i8xt9p.mp3",
    "duration": 1.5
  },
  "audios": [{
    "url": "https://user-uploads.cdn.overworld.xyz/k49lv3f90remnuqln1i8xt9p.mp3",
    "duration": 1.5
  }],
  "video": undefined,
  "videos": [],
  "files": [{
    "url": "https://user-uploads.cdn.overworld.xyz/document.pdf",
    "name": "document.pdf",
    "size": 1024000
  }]
}
```
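The priority rules above amount to a simple merge of two optional sources, with the invocation message first. A sketch of that resolution for text (illustrative only; `sup.input` does this internally):

```javascript
// Merge text input from the invocation message and the replied-to message,
// prioritizing the invocation, as described above. Illustrative helper;
// not part of the Sup API.
function resolveTexts(invocationText, replyText) {
  const texts = [];
  if (invocationText) texts.push(invocationText);
  if (replyText) texts.push(replyText);
  return { text: texts[0], texts }; // text is undefined when texts is empty
}

// Invocation text wins; both end up in `texts`
resolveTexts("make it blue", "a red car");
// => { text: "make it blue", texts: ["make it blue", "a red car"] }

// Falls back to the replied-to message's text
resolveTexts(undefined, "a red car");
// => { text: "a red car", texts: ["a red car"] }
```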
---
## Properties
### `text`
type: `string | undefined`
Gets the first text from the `texts` array, or `undefined` if there are no texts.
Priority:
- Text from the message that invoked the patch
- Text from the message that this message is replying to
### `texts`
type: `string[]`
Array of text inputs. This will contain multiple entries if the patch message has text, and is replying to another message with its own text.
Priority:
- Text from the message that invoked the patch
- Text from the message that this message is replying to
### `image`
type: `SupImage` `| undefined`
Gets the first image from the `images` array, or `undefined` if there are no images.
### `images`
type: [`SupImage[]`](https://sup.net/docs/reference/types/image)
Array of images included in the input.
### `audio`
type: `SupAudio` `| undefined`
Gets the first audio file from the `audios` array, or `undefined` if there are no audio files.
### `audios`
type: [`SupAudio[]`](https://sup.net/docs/reference/types/audio)
Array of audio files (such as voice messages) included in the input.
### `video`
type: `SupVideo` `| undefined`
Gets the first video from the `videos` array, or `undefined` if there are no videos.
### `videos`
type: [`SupVideo[]`](https://sup.net/docs/reference/types/video)
Array of video files included in the input.
### `files`
type: [`SupFile[]`](https://sup.net/docs/reference/types/file)
Array of other files that aren't audio, video, or images.
### `combined`
type: `(string | SupAudio | SupImage | SupVideo | SupFile)[]`
Returns an array containing all text, audio, video, and images combined. This is particularly useful when working with `sup.ai.prompt()` which can process multiple types of input together.
The array preserves the order: texts first, followed by audios, images, and videos.
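That ordering can be sketched as a plain concatenation; a small illustrative helper (not a Sup API) assuming the four arrays described above:

```javascript
// Build a combined input array in the documented order:
// texts first, then audios, then images, then videos.
// Illustrative only; sup.input.combined does this internally.
function buildCombined({ texts = [], audios = [], images = [], videos = [] }) {
  return [...texts, ...audios, ...images, ...videos];
}

const combined = buildCombined({
  texts: ["describe this"],
  images: [{ url: "https://example.com/a.webp" }],
  audios: [{ url: "https://example.com/b.mp3" }],
});
console.log(combined.length); // 3
console.log(combined[0]);     // "describe this" (texts come first)
```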
## SupImage {#supimage}
The `SupImage` object represents an image in Sup. It can be created from either a URL or a Blob.
```js title="Example Image"
SupImage: {
  "url": "https://user-uploads.cdn.overworld.xyz/cumnehs74u2bfcas8gxsdzqx.webp",
  "filename": "riffbit.webp",
  "width": 256,
  "height": 256
}
```
---
## Properties
### `name`
type: `string | undefined`
The original filename of the image, if available.
### `width`
type: `number | undefined`
The width of the image in pixels.
### `height`
type: `number | undefined`
The height of the image in pixels.
### `url`
type: `string`
Gets the CDN URL for the image. If the image was created from a Blob, the Blob will be automatically uploaded and a URL will be generated.
### `blob`
type: `Blob`
Gets the image data as a Blob. If the image was created from a URL, the image will be downloaded and returned as a Blob.
## Output
Returning an Image from a patch's `main()` function will display the image in the chat.
```js
function main() {
  return sup.image(
    "https://user-uploads.cdn.overworld.xyz/cumnehs74u2bfcas8gxsdzqx.webp"
  );
}
```
---
## Editing
Use the `edit()` method to modify an Image. This returns an editing cursor that supports transformations and effects. After editing, call `render()` to get the final result.
```js
function main() {
  const image = sup.input.image;
  if (!image) return "Please attach an image";

  const cursor = image.edit();

  // Resize
  cursor.resize(800, 600, { mode: "contain" }); // Other modes: cover, fill, inside, outside
  cursor.resize(64, 64, { pixelated: true });   // Nearest-neighbor scaling for pixel art

  // Basic transformations
  cursor.crop(0, 0, 400, 300);
  cursor.rotate(90);
  cursor.flip(true, false); // Flip horizontally or vertically

  // Effects
  cursor.grayscale();
  cursor.blur(2.5);
  cursor.sharpen(1.5);
  cursor.invert();

  return cursor.render();
}
```
### Basic transformations
- `resize(width: number, height: number, opts?: { mode?: "cover" | "contain" | "fill" | "inside" | "outside", pixelated?: boolean })`
Resizes the image. Set `pixelated: true` for nearest-neighbor scaling (useful for pixel art).
- `crop(top: number, left: number, bottom: number, right: number)`
Crops the image.
- `rotate(degrees: number)`
Rotates the image.
- `flip(horizontal: boolean, vertical: boolean)`
Flips the image (set horizontal or vertical to true).
- `flop()`
Flips the image horizontally.
### Effects
- `grayscale()`
Converts the image to grayscale.
- `invert()`
Inverts the colors of the image.
- `blur(sigma: number)`
Applies Gaussian blur.
- `sharpen(sigma: number)`
Sharpens the image.
- `mask(mask: Image)`
Applies a mask to create transparency.
### Compositing
- `composite(...layers: Image[], options?: { top?: number, left?: number }[])`
Overlays multiple images on top of the current one. You can specify the position of each layer using the `options` parameter.
For image compositing, layer multiple images together:
```js
function main() {
  const background = sup.input.image;
  const overlay = sup.image("overlay.png"); // overlay.png is uploaded as an asset

  const cursor = background.edit();
  cursor.composite(overlay);
  return cursor.render();
}
```
## SupHTML {#suphtml}
The `SupHTML` object represents custom HTML content that can be displayed in Sup. It supports static content, images, videos, and interactive elements. It's intended to be returned from the `main()` function, to be displayed in the chat.
```js title="Example HTML"
{
  "html": "<h1>Hello World!</h1>",
  "dimensions": {
    "width": 400,
    "height": 300
  },
  "type": "html"
}
```
## Constructor
### `sup.html(html, options?)`
```js
const html = sup.html("<h1>Hello World!</h1>", {
  width: 800,
  height: 600,
});
```
Parameters:
- `html`: The HTML content to display. Accepts:
- A `string` of raw HTML
- A `SupBundle` from `sup.asset()` (extracted zip with an entry point like `index.html`)
- A `SupFile` from `sup.asset()` (a standalone `.html` file uploaded as an asset; its content will be fetched automatically)
- `options`: Optional configuration object:
```ts
{
  width: number;  // Width in pixels (100-1600)
  height: number; // Height in pixels (100-1600)
  type?: "video" | "image" | "html"; // Content type to render as
  duration?: number;    // Required for video type
  bannerUrl?: string;   // URL for banner image
  tailwind?: boolean | string; // Tailwind CSS enabled by default. Set false to disable, or a string for a specific version
  waitForSelector?: string; // CSS selector to wait for before capturing (for image/video types)
  webgl?: boolean;       // Use WebGL-enabled renderer for image type (for Three.js, WebGL content)
  transparent?: boolean; // Render with transparent background for image type
  callbacks?: string[];  // Function names that client JS can call via window.sup.exec()
}
```
## Properties
### `html`
type: `string`
The HTML content to display.
### `dimensions`
type: `{ width: number; height: number } | undefined`
The dimensions for rendering the content. Both width and height must be between 100 and 1600 pixels.
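The 100-1600 bound applies to both axes; a small validation sketch of our own (illustrative, not how Sup enforces it internally):

```javascript
// Validate SupHTML dimensions against the documented 100-1600 pixel range.
// Illustrative helper, not part of the Sup API.
const MIN_DIM = 100;
const MAX_DIM = 1600;

function validateDimensions({ width, height }) {
  for (const [name, value] of [["width", width], ["height", height]]) {
    if (!Number.isFinite(value) || value < MIN_DIM || value > MAX_DIM) {
      throw new RangeError(
        `${name} must be between ${MIN_DIM} and ${MAX_DIM}, got ${value}`
      );
    }
  }
  return { width, height };
}

console.log(validateDimensions({ width: 400, height: 300 })); // ok
// validateDimensions({ width: 50, height: 300 }); // would throw RangeError
```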
### `type`
type: `"video" | "image" | "html" | undefined`
The type to render the content as:
- `"html"`: Output interactive HTML content (default)
- `"image"`: Output the HTML content as a static image
- `"video"`: Output the HTML content as a video
### `duration`
type: `number | undefined`
Required when type is "video". Specifies the video duration in seconds.
### `bannerUrl`
type: `string | undefined`
Optional URL for a banner image to display with the HTML content before launching.
### `tailwind`
type: `boolean | string`
Tailwind CSS is enabled by default for HTML patches:
- `true` (default): Includes the latest version of Tailwind CSS from the CDN
- `string`: Includes a specific version of Tailwind CSS (e.g., `"4.0.0"`)
- `false`: Disables Tailwind CSS
When enabled, this automatically includes the Tailwind CSS library and Inter font, allowing you to use Tailwind utility classes in your HTML.
### `waitForSelector`
type: `string | undefined`
For `type: "image"` or `type: "video"`, this option allows you to specify a CSS selector to wait for before capturing the screenshot or starting video recording. This is useful when your HTML content loads dynamically or you need to wait for specific elements to appear before capturing.
```js
// Wait for an element with class "chart" to appear before taking screenshot
const chart = sup.html(chartHtml, {
  width: 800,
  height: 600,
  type: "image",
  waitForSelector: ".chart"
});
```
### `webgl`
type: `boolean | undefined`
For `type: "image"`, enables a WebGL-capable renderer. Use this when your HTML content uses WebGL features like Three.js, WebGL canvases, or other GPU-accelerated graphics.
```js
// Render Three.js or WebGL content as an image
const render = sup.html(threeJsHtml, {
  width: 800,
  height: 600,
  type: "image",
  webgl: true
});
```
### `transparent`
type: `boolean | undefined`
For `type: "image"`, renders the HTML content with a transparent background instead of the default white background. The resulting image will be a PNG with alpha transparency.
```js
// Render HTML as an image with transparent background
const sticker = sup.html(`
  <div style="font-size: 150px; text-align: center;">🎉</div>
`, {
  width: 200,
  height: 200,
  type: "image",
  transparent: true
});
```
### `callbacks`
type: `string[] | undefined`
An array of function names that can be called from client-side JavaScript via `window.sup.exec()`. This enables two-way communication between the HTML content and the patch's server-side code.
**Callbacks work with both inline HTML strings and uploaded asset bundles, from both `main()` and `launch()`.**
**Important:** Functions listed in `callbacks` receive a `SupUserEvent` object as their parameter, NOT the raw arguments. Access the passed value via `event.value`.
```js
// Inline HTML with callbacks (from main)
function handleClick(event) {
  const city = event.value;
  return sup.ai.prompt(`Tell me about ${city}`);
}

function main() {
  return sup.html(`
    <button onclick="window.sup.exec('handleClick', 'Paris')">
      Tell me about Paris
    </button>
  `, { callbacks: ["handleClick"] });
}
```
```js
// Asset bundle with callbacks (from launch)
function generateResponse(event) {
  const userInput = event.value;
  return sup.ai.prompt(`Help with: ${userInput}`);
}

function launch() {
  return sup.html(sup.asset("myApp"), {
    callbacks: ["generateResponse"]
  });
}
```
```js
// Standalone HTML file asset (from main or launch)
function main() {
  return sup.html(sup.asset("index.html"), {
    width: 800,
    height: 600
  });
}
```
```js
// Client-side JavaScript (in the HTML)
// Call the server function and get the result
const response = await window.sup.exec("generateResponse", "Hello!");
// response is the return value directly (a string in this case)
console.log(response); // "Hello! How can I help you today?"
```
The `window.sup.exec()` function:
- First argument: function name (must be in `callbacks` array)
- Remaining arguments: passed to the function via `event.value`
- Returns: a Promise that resolves to the function's return value
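The argument-wrapping behavior (callbacks receive an event object, not raw arguments) can be sketched with a mock bridge. `makeExecBridge` and the `{ value }` event shape below are our illustrative assumptions, not the real runtime:

```javascript
// Mock of the exec bridge: only whitelisted callback names are callable,
// and the argument arrives wrapped in an event object ({ value }), matching
// the documented SupUserEvent behavior. Illustrative only.
function makeExecBridge(handlers, callbacks) {
  return async function exec(name, value) {
    if (!callbacks.includes(name)) {
      throw new Error(`"${name}" is not in the callbacks array`);
    }
    // Wrap the raw argument in an event object before calling the handler
    return handlers[name]({ value });
  };
}

// Server-side handler, as it would be written in a patch
function handleClick(event) {
  return `You picked ${event.value}`;
}

const exec = makeExecBridge({ handleClick }, ["handleClick"]);
exec("handleClick", "Paris").then((result) => {
  console.log(result); // "You picked Paris"
});
```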
## Client-Side APIs
`window.sup` is automatically injected into all HTML content rendered by `sup.html()`, both inline HTML strings and uploaded asset bundles. You do not need to import, initialize, or set up any bridge. The following APIs are always available:
### `window.sup.exec()`
Call server-side functions defined in `callbacks`. See above for details.
### `window.sup.share()`
Share content directly to the chat from your HTML app.
```js
// Share text
await window.sup.share('Hello world!');
// Share image URL (auto-detected)
await window.sup.share('https://example.com/image.png');
// Share base64 data URL (MUST use explicit type!)
const dataUrl = canvas.toDataURL('image/png');
await window.sup.share({ url: dataUrl, type: 'image' });
// Share HTML content
await window.sup.share({ html: '<b>Rich content</b>' });
// Share multiple items
await window.sup.share(['Text message', 'https://example.com/image.png']);
```
**Important:** For base64 data URLs (like from `canvas.toDataURL()`), you MUST use the object format with explicit `type: 'image'`. Plain data URL strings will be treated as text, not images.
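The auto-detection rule above (http(s) image URLs are detected, data URLs are not) can be sketched as a classifier. This is our illustration of the documented behavior, with a simplified extension list as an assumption:

```javascript
// Classify a share payload the way the docs describe: plain strings are
// auto-detected as images only when they are http(s) URLs that look like
// images; base64 data URLs fall through to text unless an explicit type
// is given. The extension list is a simplifying assumption.
const IMAGE_EXTENSIONS = [".png", ".jpg", ".jpeg", ".gif", ".webp"];

function classifyShare(item) {
  if (typeof item === "object" && item !== null) {
    return item.type || (item.html ? "html" : "unknown");
  }
  if (/^https?:\/\//.test(item) &&
      IMAGE_EXTENSIONS.some((ext) => item.split("?")[0].endsWith(ext))) {
    return "image";
  }
  return "text"; // includes data: URLs without an explicit type
}

console.log(classifyShare("https://example.com/image.png")); // "image"
console.log(classifyShare("data:image/png;base64,AAAA"));    // "text"
console.log(classifyShare({ url: "data:image/png;base64,AAAA", type: "image" })); // "image"
```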
### `window.sup.user`
Read-only object containing the current user's information.
```js
console.log(window.sup.user.id); // User ID
console.log(window.sup.user.username); // Username
console.log(window.sup.user.pfp); // Profile picture URL
```
### `window.sup.chat`
Read-only object containing the current chat's information.
```js
console.log(window.sup.chat.id); // Chat ID
console.log(window.sup.chat.title); // Chat title
console.log(window.sup.chat.type); // "DIRECT" | "GROUP" | "CHANNEL" | "SOLOCHAT" | "THREAD" | "PAGE"
```
## Browser APIs & Permissions
HTML content rendered via `sup.html()` runs inside a sandboxed iframe (web) or WebView (mobile). The following browser APIs are available:
| API | Status | Notes |
|-----|--------|-------|
| JavaScript | Allowed | Full JS execution |
| Microphone & Camera | Allowed | Via `getUserMedia()`; browser will prompt the user |
| Clipboard (write) | Allowed | Via `navigator.clipboard.writeText()`; browser may prompt |
| Geolocation | Allowed | Via `navigator.geolocation`; browser will prompt the user |
| Fullscreen API | Allowed | Via `element.requestFullscreen()` |
| Audio/Video playback | Allowed | Autoplay is enabled |
| WebGL / Canvas | Allowed | Full GPU-accelerated rendering |
| Web Audio API | Allowed | AudioContext, oscillators, etc. |
| Fetch / XHR | Allowed | But no cookies are sent (credentialless mode) |
| WebSockets | Allowed | Real-time connections work |
| `localStorage` / `sessionStorage` | Unavailable | Runs in credentialless/incognito mode |
| Cookies | Unavailable | Credentialless mode prevents cookie access |
| `alert()` / `confirm()` / `prompt()` | Blocked | Disabled on mobile, may work on web |
| Notifications | Blocked | Not permitted |
| Screen Capture | Blocked | `getDisplayMedia()` is not permitted |
| Bluetooth / USB / Serial | Blocked | Hardware APIs are not permitted |
| Top-level navigation | Blocked | Cannot navigate the parent page |
### Microphone & Camera
Your HTML content can request microphone and camera access using the standard `getUserMedia` API. The browser will show its native permission prompt to the user.
```js
// Request camera and microphone
const stream = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: true
});

// Use the stream (e.g., attach to a video element)
document.querySelector('video').srcObject = stream;
```
### Clipboard
You can write to the user's clipboard:
```js
await navigator.clipboard.writeText('Copied from a Sup patch!');
```
### Geolocation
You can request the user's location. On mobile, the Sup app must have location permissions granted by the user.
```js
navigator.geolocation.getCurrentPosition((position) => {
console.log(position.coords.latitude, position.coords.longitude);
});
```
## Usage Notes
```js title="Common Patterns"
// Static content
const html = sup.html("<h1>sup, world!</h1>", {
  width: 400,
  height: 300,
});

// As an image
const image = sup.html("<h1>sup, world!</h1>", {
  width: 800,
  height: 600,
  type: "image",
});

// As a video
const video = sup.html(animationHtml, {
  width: 1280,
  height: 720,
  type: "video",
  duration: 10,
});

// With banner
const game = sup.html(gameHtml, {
  width: 300,
  height: 300,
  bannerUrl: "https://example.com/banner.png"
});

// Tailwind CSS is enabled by default - just use utility classes
const styledContent = sup.html(`
  <div class="bg-blue-500 text-white p-4 rounded-lg">Hello Tailwind!</div>
`, {
  width: 400,
  height: 200
});

// Using specific Tailwind version
const styledContentSpecific = sup.html(`
  <div class="p-4">Using Tailwind v4.0.0</div>
`, {
  width: 400,
  height: 200,
  tailwind: "4.0.0"
});

// Screenshot with wait for selector
const dynamicChart = sup.html(`
  <div id="placeholder">Loading chart...</div>
  <script>
    setTimeout(() => {
      document.getElementById("placeholder").outerHTML =
        '<div class="chart">Chart content here</div>';
    }, 500);
  </script>
`, {
  width: 800,
  height: 600,
  type: "image",
  waitForSelector: ".chart"
});
```
## JSX Support
In addition to using the `sup.html()` function, you can create HTML content using JSX syntax. JSX provides a more intuitive and component-based approach to building HTML content.
### Basic JSX Syntax
```jsx
function main() {
  return <html>
    <h1>Hello, World!</h1>
    <p>This is JSX!</p>
  </html>;
}
```
This is equivalent to:
```js
function main() {
  return sup.html(`
    <h1>Hello, World!</h1>
    <p>This is JSX!</p>
  `, {
    width: 400,
    height: 300
  });
}
```
### JSX Props
You can pass all the same options as attributes to the `<html>` element:
```jsx
function main() {
  return <html width={400} height={300}>
    <h1 className="text-2xl font-bold text-purple-500">Styled with Tailwind</h1>
  </html>;
}
```
### JavaScript Expressions in JSX
Use curly braces `{}` to embed JavaScript expressions:
```jsx
function main() {
  const name = sup.user.username;
  const count = sup.get("count") || 0;
  return <html>
    <h1>Hello, {name}!</h1>
    <p>Count: {count}</p>
  </html>;
}
```
### JSX Components
Create reusable components as functions:
```jsx
function main() {
  return <UserCard user={sup.user} />;
}

function UserCard({ user }) {
  return (
    <div>
      <h2>{user.username}</h2>
      <p>User ID: {user.id}</p>
    </div>
  );
}

function MessageList({ messages }) {
  return (
    <ul>
      {messages.map((message, index) => (
        <li key={index}>
          {message.author}: {message.text}
        </li>
      ))}
    </ul>
  );
}
```
### Fragments
Use fragments `<>...</>` to group multiple elements without adding extra DOM nodes:
```jsx
function main() {
  return (
    <>
      <h1>Multiple Elements</h1>
      <p>First paragraph</p>
      <p>Second paragraph</p>
    </>
  );
}
```
### Styling in JSX
You can include CSS styles directly in your JSX:
```jsx
function main() {
  return <html>
    <style>{`
      h1 { background: linear-gradient(90deg, #ff6ec4, #7873f5); }
    `}</style>
    <h1>Styled Content</h1>
    <p>Beautiful gradients and styling!</p>
  </html>;
}
```
### Interactive Elements
Combine JSX with interactive features:
```jsx
function main() {
  const count = sup.get("count") || 0;
  return [
    <html>
      <h1>Counter: {count}</h1>
      <p>Click the button below to increment!</p>
    </html>,
    sup.button("Increment", handleIncrement)
  ];
}

function handleIncrement() {
  const count = sup.get("count") || 0;
  sup.set("count", count + 1);
}
```
### Mixed Content
You can combine JSX with other Sup elements:
```jsx
function main() {
  return [
    "Here's some HTML content:",
    <html>
      <h1>Mixed Content Example</h1>
      <p>This HTML is created with JSX</p>
    </html>,
    sup.image("https://example.com/image.jpg")
  ];
}
```
### Best Practices
1. **Use meaningful component names**: prefer descriptive names like `<UserCard>` over generic ones
2. **Keep components small and focused**: Each component should have a single responsibility
3. **Use props for reusability**: Pass data through props to make components flexible
4. **Organize your code**: Put helper components at the bottom of your patch
5. **Use className instead of class**: JSX uses `className` for CSS classes
6. **Remember to close tags**: JSX requires all tags to be properly closed (`<br />`, not `<br>`)
## SupAudio {#supaudio}
The `SupAudio` object represents an audio file. It can be created from either a URL or a Blob.
```js title="Example Audio"
SupAudio: {
filename: "a.mp3",
url: "https://user-uploads.cdn.overworld.xyz/k49lv3f90remnuqln1i8xt9p.mp3"
}
```
---
## Properties
### `name`
type: `string | undefined`
The original filename of the audio file, if available.
### `url`
type: `string`
Gets the CDN URL for the audio file. If the audio was created from a Blob, the Blob will be automatically uploaded and a URL will be generated.
### `blob`
type: `Blob`
Gets the audio data as a Blob. If the audio was created from a URL, the audio will be downloaded and returned as a Blob.
---
## Output
Returning a `SupAudio` from a patch's `main()` function will display an audio player in the message.
```js
function main() {
  return sup.audio(
    "https://user-uploads.cdn.overworld.xyz/k49lv3f90remnuqln1i8xt9p.mp3"
  );
}
```
---
## Editing
Use the `edit()` method to modify a SupAudio. This returns an editing cursor that supports transformations and effects. After queueing up edits on the cursor, call `cursor.render()` to get a new `SupAudio` with the changes applied.
```js
function main() {
  const audio = sup.input.audio;
  if (!audio) return "Please attach an audio file";

  const cursor = audio.edit();

  // Time modifications
  cursor.trimStart(1.5); // Remove first 1.5 seconds
  cursor.trimEnd(1.0); // Remove last second
  cursor.padStart(0.5); // Add 0.5 seconds silence at start
  cursor.padEnd(1.0); // Add 1 second silence at end
  cursor.fadeIn(1.0); // Fade in over 1 second
  cursor.fadeOut(1.0); // Fade out over 1 second
  cursor.reverse(); // Reverse audio

  // Audio adjustments
  cursor.changeSpeed(1.5);
  cursor.changeTempo(1.25);
  cursor.changePitch(100);
  cursor.changeVolume(2);
  cursor.normalize();

  // Effects
  cursor.reverb(50, {
    roomScale: 75,
    stereoDepth: 50,
    preDelay: 20,
  });
  cursor.echo(0.5, 0.3, {
    delay: 0.1,
    decay: 0.5,
  });
  cursor.chorus(0.6, 0.4, {
    speed: 1.5,
    depth: 2,
  });

  return cursor.render();
}
```
### Basic adjustments
- `trimStart(seconds: number)`
- Removes audio from the start
- `trimEnd(seconds: number)`
- Removes audio from the end
- `padStart(seconds: number)`
- Adds silence at the start
- `padEnd(seconds: number)`
- Adds silence at the end
- `fadeIn(seconds: number)`
- Fades in the audio
- `fadeOut(seconds: number)`
- Fades out the audio
- `reverse()`
- Reverses the audio
- `changeSpeed(factor: number)`
- Changes playback speed (0.01-5)
- `changeTempo(factor: number)`
- Changes tempo without altering pitch (0.01-5)
- `changePitch(change: number)`
- Changes pitch (-2000 to 2000 cents)
- `changeVolume(gain: number)`
- Changes volume (-20 to 20 dB)
- `normalize()`
- Normalizes audio levels
### Effects
- `reverb(reverberance: number, options: object)`
- Adds reverb effect
```js
options: {
  hfDamping?: number; // 0-100
  roomScale?: number; // 0-100
  stereoDepth?: number; // 0-100
  preDelay?: number; // 0-200ms
  wetGain?: number; // -10 to 10dB
}
```
- `echo(gainIn: number, gainOut: number, options: object)`
- Adds echo effect
```js
options: {
  delay?: number; // >0 seconds
  decay?: number; // 0-1
}
```
- `chorus(gainIn: number, gainOut: number, options: object)`
- Adds chorus effect
```js
options: {
  delay?: number; // >0 seconds
  decay?: number; // 0-1
  speed?: number; // 0.1-5 Hz
  depth?: number; // 0-10
}
```
### Combining Audio
Use `mixWith()` to overlay multiple audio tracks on top of each other, or `combineWith()` to concatenate them. Call `render()` to get the final result.
```js
function main() {
  const audio1 = sup.input.audio;
  const audio2 = sup.audio("track2.mp3"); // track2.mp3 uploaded as an asset

  const cursor = audio1.edit();

  // Mix tracks together
  cursor.mixWith(audio2);

  // Or concatenate tracks
  cursor.combineWith(audio2);

  return cursor.render(); // Returns a SupAudio
}
```
## sup.random {#suprandom}
The `sup.random` package provides methods for generating random numbers and working with randomized arrays.
---
## Methods
### sup.random.integer()
#### `(min: number, max: number)` `→ number`
```js
// Random number between 1 and 6 (inclusive)
const roll = sup.random.integer(1, 6);
```
Returns a random integer between min and max (inclusive).
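For intuition, an inclusive integer range is typically built from `Math.random()` like the sketch below. `randomInteger` here is an illustrative stand-in, not the Sup runtime's actual implementation:

```javascript
function randomInteger(min, max) {
  // Math.random() is in [0, 1), so the "+ 1" makes `max` reachable
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

const roll = randomInteger(1, 6); // always an integer from 1 to 6
```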
### sup.random.float()
#### `(min: number, max: number)` `→ number`
```js
// Random probability between 0 and 1
const probability = sup.random.float(0, 1);
```
Returns a random floating point number between min and max (inclusive).
### sup.random.boolean()
#### `()` `→ boolean`
```js
// Random true/false with 50% probability
const coinFlip = sup.random.boolean();
```
Returns true or false with equal probability (50/50).
### sup.random.choice()
#### `(arr: T[])` `→ T`
```js
// Pick a random color
const color = sup.random.choice(["red", "green", "blue"]);
// Pick a random patch asset
const randomAsset = sup.random.choice(sup.assets);
```
Returns a random element from the given array.
### sup.random.shuffle()
#### `(arr: T[])` `→ T[]`
```js
// Shuffle an array of numbers
const numbers = sup.random.shuffle([1, 2, 3, 4, 5]);
// Create a randomized order of users
const queue = sup.random.shuffle([...users]);
```
Shuffles the array using the Fisher-Yates algorithm and returns the shuffled array.
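The Fisher-Yates algorithm mentioned above can be sketched in plain JavaScript. This is an illustration of the technique, not the runtime's source:

```javascript
function fisherYatesShuffle(arr) {
  const result = [...arr]; // copy so the input array is left untouched
  for (let i = result.length - 1; i > 0; i--) {
    // Pick a random index j in [0, i] and swap positions i and j
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}

const shuffled = fisherYatesShuffle([1, 2, 3, 4, 5]);
```

Walking backwards and only choosing from the not-yet-fixed prefix is what makes every permutation equally likely.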
## sup {#sup}
The `sup` package contains information about the patch being run and provides access to various methods for interacting with the Sup platform.
Every patch has a global `sup` value that can be accessed from anywhere in the code.
---
## Properties
### sup.id
#### `string`
Contains a unique string ID for the patch.
`sup.id` can be useful for connecting patches to share functionality.
### sup.user
#### `User`
Contains a `User` value for the user that ran the patch for the current message.
Note: Patch interactions (like `sup.button` or `onReply`) never change the patch's `User` value. `User` will always be the initial user that ran the patch for the current message.
### sup.message
#### `SupMessage` `optional`
Contains a `SupMessage` value for the current message.
### sup.chat
#### `SupChat`
Contains a `SupChat` value for the chat the patch was run in.
`sup.chat.fs` provides file storage access scoped to the current chat. See `sup.fs`.
### sup.thread
#### `SupThread` `optional`
Thread-scoped helpers. This is only available when the patch is running in a thread context.
```js
if (sup.thread) {
  const messages = sup.thread.history.list({ limit: 50, offset: 0 });
  const total = sup.thread.history.count();
  sup.thread.send('Following up in the thread');
}
```
#### `sup.thread.history.list(options?)` `→ SupMessage[]`
Returns messages in the current thread.
- `limit`: Maximum messages to return
- `offset`: Number of messages to skip (pagination)
#### `sup.thread.history.count()` `→ number`
Returns the total number of messages in the current thread.
#### `sup.thread.send(firstArg?, ...args)` `→ void`
Sends a message into the current thread.
```js
if (sup.thread) {
  sup.thread.send('Build finished');
  sup.thread.send('Here is the preview:', sup.image('https://example.com/preview.png'));
}
```
### sup.isReply
#### `boolean`
`sup.isReply` returns true if the patch was run with a reply message attached.
When `true`, `sup.reply` returns a `SupMessage` value.
### sup.reply
#### `SupMessage` `optional`
Contains a `SupMessage` value for the message that the current message is replying to.
`sup.reply` is only available when `sup.isReply` is `true`, otherwise it is `undefined`.
### sup.input
#### `SupInput`
Contains a `SupInput` value for the current message. Patches usually receive user input through this object.
### sup.assets
#### `(SupImage | SupAudio | SupVideo | SupFile)[]`
```js
// Iterate over all assets
for (const asset of sup.assets) {
  console.log(asset.filename);
}

// Filter by type using built-in accessors
const images = sup.assets.images; // SupImage[]
const audios = sup.assets.audios; // SupAudio[]
const videos = sup.assets.videos; // SupVideo[]
const files = sup.assets.files; // SupFile[] (non-media files)
```
Returns an array of all assets uploaded in the patch editor. Each asset has a `filename` property.
The array also has convenient accessors for filtering by type:
- `.images` - All image assets (`SupImage[]`)
- `.audios` - All audio assets (`SupAudio[]`)
- `.videos` - All video assets (`SupVideo[]`)
- `.files` - All non-media file assets (`SupFile[]`)
**CRITICAL: To get a specific asset by name, use `sup.asset(name)` NOT `sup.assets[name]`**
```js
// ❌ WRONG - This does NOT work (sup.assets is an array, not an object)
const img = sup.assets["myimage.png"]; // WRONG!
const img = sup.assets[filename]; // WRONG!

// ✅ CORRECT - Use sup.asset() to look up by filename
const img = sup.asset("myimage.png"); // Correct!
const img = sup.asset(filename); // Correct!
```
### sup.fs
#### `SupFiles`
Provides file storage access for the current chat, with cross-chat reads available through absolute paths when supported.
```js
sup.fs.write('/images/output.png', sup.image('https://example.com/image.png'));
const saved = sup.fs.read('/images/output.png');
return saved;
```
See the `sup.fs` reference for `exists`, `list`, `mkdir`, `move`, `read`, `stat`, and `write`.
### sup.this
#### `SupPatch`
```js
const currentPatchId = sup.this.id;
const sourceCode = sup.this.code;
// Note: sup.this cannot call its own run() method or access its own public functions
// to prevent infinite recursion
```
Contains metadata about the currently executing patch. This is a self-referential `SupPatch` object that provides information about the patch that's currently running.
**Properties:**
- `id`: The unique ID of the current patch
- `code`: The source code of the current patch
- `author`: The `User` who created this patch
**Limitations:**
- Cannot call `sup.this.run()` - will throw an error to prevent recursion
- Cannot access `sup.this.public` properties - will throw an error to prevent recursion
This is useful for patches that need to know their own identity or source code, such as for logging, debugging, or meta-programming scenarios.
### sup.caller
#### `SupCaller`
```js
// Check if run by a user or another patch
if (sup.caller.type === 'user') {
  return `Run by ${sup.caller.user.username}`;
} else {
  return `Called by patch ${sup.caller.patch.name}`;
}

// Check if this is a preview (unsent draft)
sup.caller.isPreview // => true or false
```
Contains a `SupCaller` object describing what invoked this patch. The `type` property is `'user'` when run directly by a user, or `'patch'` when called by another patch via `patch.run()` or `patch.public`.
### sup.intent
#### `SupIntent`
Provides intent return values that open native Sup flows with prefilled values.
```js
const image = sup.ai.image.create('emoji of a shooting star');
return sup.intent.emoji({ name: 'star', image });
```
See the `sup.intent` reference.
### sup.lookup
#### `SupLookup`
Looks up chats, users, patches, and emojis by ID or slug.
```js
const patch = sup.lookup.patch('/shahruz/weather');
if (patch) {
  return patch.run();
}
```
See the `sup.lookup` reference.
---
## Methods
### sup.audio()
#### `(urlOrBlob: string | Blob)` `→ SupAudio`
```js
const audio = sup.audio("https://example.com/audio.mp3");
const audio = sup.audio("asset.mp3");
```
Creates a new `SupAudio` object from a URL, blob, or the filename of an asset uploaded in the editor. Returning this from `main()` will display the audio in the chat.
### sup.image()
#### `(src: string | Blob | SupSVG | SupHTML)` `→ SupImage`
```js
const image = sup.image("https://example.com/image.jpg");
```
Creates a new `SupImage` object from the specified URL, blob, SVG, asset uploaded in the editor, or HTML (takes a screenshot). Returning this from `main()` will display the image in the chat.
### sup.video()
#### `(url: string | SupImage)` `→ SupVideo`
```js
const video = sup.video("https://example.com/video.mp4");
const videoFromImage = sup.video(sup.image("image.jpg")); // Convert image to 3-second video
const customDurationVideo = sup.video(sup.image("slide.png"))
  .edit()
  .duration(5) // Override default 3-second duration
  .render();
```
Creates a new `SupVideo` object from the specified URL or SupImage. When using a SupImage, it creates a 3-second video by default, which can be customized using the `.edit().duration()` method. Returning this from `main()` will display the video in the chat.
### sup.file()
#### `(url: string, mimeType: string)` `→ SupFile`
```js
const file = sup.file("https://example.com/data.json", "application/json");
const file = sup.file("document.pdf", "application/pdf"); // From uploaded asset
```
Creates a new `SupFile` object from a URL or the filename of an asset uploaded in the editor. The `mimeType` parameter specifies the file's MIME type. Returning this from `main()` will display a downloadable file attachment in the chat.
### sup.sequence()
#### `(clips: (SupVideo | SupImage | string)[])` `→ SupSequence`
```js
const sequence = sup.sequence([
  "https://example.com/video1.mp4", // Full URL
  sup.video("video2.mp4"), // SupVideo (supports asset names)
  sup.image("image.jpg") // SupImage (supports asset names)
]);
const finalVideo = sequence.render();
```
Creates a video sequence from multiple clips (videos, images, or URLs). Returns a `SupSequence` object that can be edited and rendered into a final video.
### sup.html()
#### `(html: string | SupBundle | SupFile, options?: HTMLOptions)` `→ SupHTML`
```js
const html = sup.html("<h1>Hello, world!</h1>");
```
Creates a new `SupHTML` object from the specified HTML content. Returning this from `main()` will display the HTML inline in the chat.
Accepts a raw HTML string, a `SupBundle` from `sup.asset()` (extracted zip bundle), or a `SupFile` from `sup.asset()` (a standalone HTML file; its content is fetched automatically).
Refer to the `HTMLOptions` interface for available options.
### sup.button()
#### `(label: string, clickCallbackFn: Function, value?: any)` `→ Button`
The click callback function receives an argument of type:
```
{
  "user": User,
  "value": any
}
```
```js
function main() {
  const roll = sup.get("roll");
  const lastRoller = sup.get("lastRoller");
  return [
    roll ? `${lastRoller} rolled a ${roll}` : `Click to roll the dice`,
    sup.button("Roll", handleClick, sup.random.integer(1, 6)),
  ];
}

function handleClick(e) {
  sup.set("roll", e.value);
  sup.set("lastRoller", e.user.username);
}
```
Creates an interactive button that can be returned from `main()`. The callback function must be defined at the top level of your patch.
### sup.keys()
#### `→ string[]`
```js
const keys = sup.keys(); // ["gameActive", "currentRound", ...]
```
Returns an array of all keys stored in the current scope's datastore. If called inside a chat, this is equivalent to `sup.chat.keys()`. Otherwise, it is equivalent to `sup.global.keys()`.
Keys that have been `set()` during the current patch run are included in the results, even before they are persisted to the database.
### sup.set()
#### `(key: string, value: any)` `→ void`
```js {3}
let count = sup.get("count") || 0;
count++;
sup.set("count", count);
```
Sets a key-value pair in the patch's chat state. This is equivalent to `sup.chat.set()`.
The key can be any string, and the value can be any JSON-serializable object (including Sup classes like `SupUser`, `SupImage`, `SupMessage`, etc.).
To use different scopes of the datastore, refer to the set/get functions available in `SupUser`, `SupMessage` and `SupGlobal`.
### sup.get()
#### `(key: string)` `→ any | null`
```js {2, 5}
sup.set("message", "Hello, world!");
const message = sup.get("message"); // "Hello, world!"
// Returns null if the key does not exist
const notFound = sup.get("notFound"); // null
```
Retrieves the value of a key from the patch's chat-level state.
If the key does not exist, `sup.get` returns `null`.
### sup.status()
#### `(status: string)` `→ void`
```js
sup.status("Fetching data...");
```
Sets a status message for the patch while it is running.
Calling `sup.status` more than once in a patch run replaces the previous status message.
The status message is automatically cleared when the patch finishes running.
### sup.fetch()
#### `(url: string, options?: { method?: string; headers?: Headers; body?: string })` `→ FetchResponse`
```js title="Examples"
// GET request + JSON response
const response = sup.fetch("https://dummyjson.com/test");
const json = response.json(); // { "status": "ok", method: "GET" }
// GET request + Text response
const response = sup.fetch("https://dummyjson.com/test");
const text = response.text(); // `{"status":"ok","method":"GET"}` (string)
// GET request + Blob response
const response = sup.fetch("https://dummyjson.com/image/150");
const blob = response.blob(); // Blob for the image
// POST request + JSON response
const response = sup.fetch("https://dummyjson.com/test", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ key: "value" }),
});
const json = response.json(); // { "status": "ok", method: "POST" }
```
Performs an HTTP request to the specified URL.
The `options` parameter can be used to customize the request method, headers, and body.
The response should be handled using the `json`, `text`, or `blob` methods on the returned `FetchResponse` object.
### sup.uuid()
#### `→ string`
```js
const id = sup.uuid(); // "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX" (v4 UUID)
```
Generates a random UUID (v4) string.
### sup.asset()
#### `(filename: string)` `→ SupAudio | SupImage | SupVideo | SupFile | SupBundle`
**This is the correct way to get an asset by filename.**
```js
// ✅ CORRECT - Look up assets by filename using sup.asset()
const image = sup.asset("image.jpg");
const audio = sup.asset("sound.mp3");
const video = sup.asset("clip.mp4");
const bundle = sup.asset("myApp"); // Uploaded zip bundle

// ❌ WRONG - Do NOT use sup.assets[name] (that's an array, not an object)
const image = sup.assets["image.jpg"]; // WRONG! Will not work!
```
Returns the asset matching the given filename. The returned asset can be a SupImage, SupAudio, SupVideo, SupFile, or SupBundle depending on what was uploaded. Zip files uploaded as assets are automatically processed as bundles.
Both `SupBundle` and `SupFile` assets (e.g. a standalone `.html` file) can be passed directly to `sup.html()`:
```js
// Bundle (from uploaded zip)
sup.html(sup.asset("myApp"));
// Standalone HTML file
sup.html(sup.asset("index.html"));
```
### sup.secret()
#### `(key: string)` `→ string`
```js
const apiKey = sup.secret("API_KEY");
```
Loads a secret value from the patch's secrets by key.
Secrets can be used to store sensitive information like API keys.
If the secret does not exist, `sup.secret` will throw an error.
### sup.emoji()
#### `(name: string)` `→ SupEmoji | undefined`
```js
const emoji = sup.emoji("baby/a");
```
Loads an emoji from the Sup emoji library by name.
### sup.scrape()
#### `(url: string, prompt?: string)` `→ string`
```js
const scrape = sup.scrape(
"https://en.wikipedia.org/wiki/Wikipedia:On_this_day/Today",
"Summarize the events listed on this page."
);
```
Performs a web scrape on the specified URL with an optional prompt for AI processing.
If a prompt is provided, the response string will be the AI-generated response.
Otherwise, the response string will be a Markdown-formatted version of the web page's primary contents.
### sup.screenshot()
#### `(url: string, selector?: string)` `→ SupImage`
```js
// Screenshot entire page
const screenshot = sup.screenshot("https://example.com");
// Screenshot specific element
const element = sup.screenshot("https://example.com", "#main-content");
```
Takes a screenshot of the specified URL and returns it as a `SupImage`. Optionally provide a CSS selector to capture only a specific element on the page.
### sup.makePublic()
#### `(...Function[])` `→ void`
```js
function init() {
  sup.makePublic([add, reset, getCount]);
}

function main() {
  const count = sup.get("count") || 0;
  return [count, sup.button("Add", add)];
}

function add() {
  let count = sup.get("count") || 0;
  count++;
  sup.set("count", count);
}

function reset() {
  sup.set("count", 0);
}

function getCount() {
  return sup.get("count") || 0;
}
```
Makes the specified functions public, allowing them to be called from other patches. `makePublic` should be called in the `init` function. It can accept one or more functions as arguments.
### sup.patch()
#### `(patchIdOrFullname: string)` `→ SupPatch`
```js
function main() {
  // Using patch ID
  const patch1 = sup.patch("cm37ij7tq000008l5h9asc2tb");

  // Using full name format /username/patchname
  const patch2 = sup.patch("/hiro/counter");

  // Call public functions via .public
  patch1.public.add();
  const count = patch1.public.getCount();

  // Access patch properties
  const sourceCode = patch2.code;
  const patchId = patch2.id;

  // Directly run the patch's main() function
  const result = patch2.run("Hello", "World");

  return count;
}
```
Retrieves an instance of another patch by its ID or full name. You can specify either:
- A patch ID (e.g., `"cm37ij7tq000008l5h9asc2tb"`)
- A full name in the format `/username/patchname` (e.g., `"/hiro/counter"`)
The returned `SupPatch` object provides access to the patch's public functions, source code, and other properties.
#### SupPatch Properties
##### `patch.id`
**`string`** - The unique ID of the patch.
##### `patch.code`
**`string`** - The source code of the patch.
##### `patch.author`
**`User`** - The author (creator) of the patch. Returns a `SupUser` object with properties like `username`, `displayName`, `pfp`, `bio`, etc.
```js
const patch = sup.patch("/hiro/counter");
const authorName = patch.author.username;
const authorBio = patch.author.bio;
// Also available on sup.this
const myAuthor = sup.this.author;
```
##### `patch.public`
**`object`** - Contains all functions and data exposed via `sup.makePublic()`. Access public functions as `patch.public.functionName()` and public data as `patch.public.dataName`.
#### SupPatch Methods
##### `patch.run(...args)`
**`(...args: any[]) → any`** - Directly invokes the patch's `main()` function with the provided arguments, bypassing the normal message input system.
### sup.dump()
#### `(obj: any)` `→ string`
```js
const obj = { foo: "bar", nested: { value: 123 } };
const str = sup.dump(obj); // Returns a debugging string representation
```
Returns a debugging string representation of the given object.
### sup.serialize()
#### `(obj: SupSerializable)` `→ JSONObject`
```js
const data = { foo: "bar", image: sup.image("example.jpg") };
const serialized = sup.serialize(data);
```
Serializes a Sup-serializable object into a JSON-compatible format. Supports serializing Sup objects like SupImage, SupAudio, SupVideo, etc.
### sup.deserialize()
#### `(obj: JSONObject)` `→ SupSerializable`
```js
const serialized = { type: "image", url: "example.jpg" };
const image = sup.deserialize(serialized);
```
Deserializes a JSON object serialized via `sup.serialize`.
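The two functions form a round trip: anything passed through `sup.serialize` should come back structurally intact from `sup.deserialize`. The sketch below illustrates that contract with plain objects only; the tagged `{ type, url }` shape and the `toSerialized`/`fromSerialized` helpers are hypothetical stand-ins, since the real wire format is an internal detail of the Sup runtime:

```javascript
// Hypothetical encode step: rich values become plain, tagged JSON
function toSerialized(value) {
  if (value && value.kind === "image") return { type: "image", url: value.url };
  return value; // plain JSON values pass through unchanged
}

// Hypothetical decode step: tagged JSON becomes a rich value again
function fromSerialized(obj) {
  if (obj && obj.type === "image") return { kind: "image", url: obj.url };
  return obj;
}

const original = { kind: "image", url: "example.jpg" };
const restored = fromSerialized(toSerialized(original));
// restored is structurally equal to original
```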
### sup.metadata()
#### `(data: any)` `→ SupMetadata`
```js
// Chess game: return visible board image with invisible game state
const boardImage = renderBoard(fen);
return [
  boardImage,
  sup.metadata({
    fen: "rnbqkbnr/pppppppp/8/8/4P3/8/PPPP1PPP/RNBQKBNR b KQkq e3 0 1",
    pgn: "1. e4"
  })
];
```
Another patch can access this metadata when replying:
```js
// Access game state from the replied-to message
const gameState = sup.reply.metadata[0];
const currentFen = gameState.fen;
const moveHistory = gameState.pgn;
```
Creates a `SupMetadata` object containing data that won't be rendered in the chat but can be accessed by other patches and NPCs. This is useful for passing structured data between patches or providing context to AI-powered NPCs without cluttering the visible output.
Note: `metadata` returns an array since a message can contain multiple metadata objects.
### sup.sleep()
#### `(ms: number)` `→ void`
```js
sup.sleep(1000); // Pause for 1 second
```
Pauses execution for the specified number of milliseconds.
### sup.svg()
#### `(content: string)` `→ SupSVG`
```js
const svg = sup.svg(
  '<svg viewBox="0 0 100 100"><circle cx="50" cy="50" r="40" fill="tomato" /></svg>'
);
```
Creates a new `SupSVG` object from the specified SVG content.
### sup.gif()
#### `(imagesOrVideo: SupImage[] | SupVideo, options?: { frameRate?: number })` `→ SupImage`
```js
const gif = sup.gif([sup.image("image1.jpg"), sup.image("image2.jpg")]);
```
Creates a GIF (as a `SupImage` object) from the provided images or video.
### sup.segment()
#### `(users, matchingElements?, notMatchingElements?)` `→ SupSegment`
**Important limitations:**
- **Client-side only**: Segments control what is *displayed*, but all segment data is sent to every user
- **No access restriction**: This is for UI personalization, not security. Do not include sensitive data in segments; check `sup.user` in your patch logic instead
```js
// Show different content to a specific user
const segment = sup.segment(
  user,
  "Content only this user sees",
  "Content everyone else sees"
);

// Target multiple users
const segment = sup.segment(
  [user1, user2],
  ["Content for these users"],
  ["Content for others"]
);
```
Creates a `SupSegment` that shows different content to different users.
## sup.global {#supglobal}
The `sup.global` package provides methods for storing and retrieving global state that persists across all patch runs on Sup. It is accessed through `sup.global`.
```js title="Example sup.global Usage"
// Store global data
sup.global.set("totalUsers", 1000);
sup.global.set("lastUpdate", new Date());
// Retrieve global data
const totalUsers = sup.global.get("totalUsers");
const lastUpdate = sup.global.get("lastUpdate");
```
---
## Methods
### sup.global.set()
#### `(key: string, value: any)` `→ void`
```js
sup.global.set("totalGames", 100);
sup.global.set("highScores", [
  { name: "player1", score: 1000 },
  { name: "player2", score: 950 }
]);
```
Sets a key-value pair in the patch's global state.
The key can be any string, and the value can be any JSON-serializable object (including Sup classes like `User`, `Image`, `Message`, etc.).
Global state is shared across all patch runs on the Sup platform, regardless of chat or user.
### sup.global.get()
#### `(key: string)` `→ any | null`
```js
const totalGames = sup.global.get("totalGames"); // 100
const scores = sup.global.get("highScores"); // [{ name: "player1", score: 1000 }, ...]
```
Retrieves the value of a key from the patch's global state.
If the key does not exist, `get` returns `null`.
Global state is useful for storing platform-wide data like:
- Total usage statistics
- Cross-chat leaderboards
- Global settings or configurations
- Shared resources or assets
### sup.global.keys()
#### `→ string[]`
```js
const keys = sup.global.keys(); // ["totalGames", "highScores", ...]
```
Returns an array of all keys stored in the patch's global state.
Keys that have been `set()` during the current patch run are included in the results, even before they are persisted to the database.
## sup.ai {#supai}
The `sup.ai` package provides access to the runtime's basic AI capabilities, including text generation, text analysis, and embeddings. For image operations, see the `sup.ai.image` package.
---
## sup.ai.prompt()
### Simple prompting
##### `sup.ai.prompt(prompt: string)` `→ string`
Simple prompting takes text input and returns a text response.
```js
const response = sup.ai.prompt("Write a haiku about frogs");
```
### Multi-modal prompting
##### `sup.ai.prompt(...string or images...)` `→ string`
Multi-modal prompting can take multiple inputs, including images. This can be done in any order or combination.
```js
// Multiple inputs
const response = sup.ai.prompt(
  "Write a haiku about this image",
  sup.input.image
);

// With options
const response = sup.ai.prompt("List three colors", { temperature: 0.9 });
```
### Advanced prompting
##### `sup.ai.prompt(...strings, images, or options...)` `→ string | object`
The `sup.ai.prompt()` function also takes options that can adjust temperature or provide a response schema. As noted above, you can mix multiple inputs, media types, and options or schemas in any order and combination.
When providing an `options` object, the following properties are available:
```ts
{
  temperature?: number; // Randomness of output (0-1)
  schema?: StructuredResponseSchema; // Schema for structured output
  reasoning?: boolean | { // Enable reasoning tokens (works with compatible models)
    effort?: 'high' | 'medium' | 'low'; // Reasoning effort level
  };
  tools?: Record<string, {
    callback: (args: any) => any;
  }>;
}
```
When using the `schema` property, you can define the structure of the AI's response using a simplified JSON Schema syntax. Instead of wrapping your schema in `{ type: "object", properties: {...} }`, you define properties directly at the root level:
```ts
type StructuredResponseSchema = {
  [key: string]: {
    type: "string" | "number" | "boolean" | "array" | "object";
    description?: string;

    // String-specific
    enum?: string[];
    minLength?: number;
    maxLength?: number;
    pattern?: string;
    format?: string;

    // Number-specific
    minimum?: number;
    maximum?: number;
    exclusiveMinimum?: number;
    exclusiveMaximum?: number;
    multipleOf?: number;

    // Array-specific
    items?: StructuredResponseSchemaObject[];
    minItems?: number;
    maxItems?: number;
    uniqueItems?: boolean;

    // Object-specific
    properties?: StructuredResponseSchemaObject;
  };
};
```
:::note[Simplified Schema Format]
Unlike standard JSON Schema, you define properties directly without wrapping them:
```js
// ✅ Correct - properties defined directly
schema: {
  name: { type: "string" },
  age: { type: "number" }
}

// ❌ Incorrect - don't wrap in type/properties
schema: {
  type: "object",
  properties: {
    name: { type: "string" },
    age: { type: "number" }
  }
}
```
You can use `type` as a property name (it's distinguished from the JSON Schema keyword by having a nested `type` field):
```js
schema: {
  type: { type: "string", description: "The item type" }, // "type" as a property name
  name: { type: "string" }
}
```
:::
The AI will format its response to match the provided schema, making it easy to work with the response programmatically. Structured responses can also be useful for getting multiple pieces of information from a single prompt.
```js
const response = sup.ai.prompt("What's the weather like?", {
  schema: {
    temperature: {
      type: "number",
      minimum: -100,
      maximum: 150,
      description: "Temperature in Fahrenheit",
    },
    conditions: {
      type: "string",
      enum: ["sunny", "cloudy", "rainy", "snowy"],
    },
    forecast: {
      type: "string",
      maxLength: 100,
      description: "Brief weather forecast",
    },
  },
});
console.log(response);
// {
// temperature: 22,
// conditions: "sunny",
// forecast: "Clear skies with light breeze"
// }
```
### Function calling with tools
The `tools` property enables AI function calling, allowing the AI to execute custom functions during the conversation. This is useful for giving the AI access to external data, APIs, or custom logic.
```js
const result = sup.ai.prompt("What's the weather in Paris and calculate a 20% tip on $50?", {
  tools: {
    get_weather: {
      description: "Get current weather for a city",
      parameters: {
        city: {
          type: "string",
          description: "City name"
        },
        units: {
          type: "string",
          description: "Temperature units (celsius or fahrenheit)",
          required: false
        }
      },
      callback: (args) => {
        // This function gets called when the AI decides to use this tool
        const temp = args.city === "Paris" ? "22°C" : "20°C";
        return `Weather in ${args.city}: ${temp}, sunny`;
      }
    },
    calculate_tip: {
      description: "Calculate tip amount on a bill",
      parameters: {
        amount: {
          type: "number",
          description: "Bill amount in dollars"
        },
        percentage: {
          type: "number",
          description: "Tip percentage (e.g. 20 for 20%)"
        }
      },
      callback: (args) => {
        const tip = args.amount * (args.percentage / 100);
        return `${args.percentage}% tip on $${args.amount} = $${tip.toFixed(2)}`;
      }
    }
  }
});
console.log(result);
// "The weather in Paris is 22°C and sunny. A 20% tip on $50 would be $10.00."
```
**Tool Definition:**
- `description`: Clear explanation of what the tool does (required)
- `parameters`: Object defining expected parameters (optional)
- `callback`: Function to execute when the AI calls this tool (required)
**Parameter Types:**
- `string`: Text values
- `number`: Numeric values
- `boolean`: True/false values
- `object`: Complex objects
- `array`: Lists of values
Each parameter can specify:
- `description`: What the parameter represents
- `required`: Whether it's mandatory (defaults to `true`)
The AI automatically decides when to call your tools and handles the conversation flow.
## sup.ai.promptFull()
#### `(...strings, images, or options...)` `→ SupAICompletion`
```js
const completion = sup.ai.promptFull("Explain quantum physics", {
  temperature: 0.8,
  reasoning: true
});
console.log(completion.content); // The main response
console.log(completion.reasoning); // AI's step-by-step reasoning (if enabled)
console.log(completion.finish_reason); // "stop", "length", "tool_calls", etc.
console.log(completion.tool_calls); // Tool calls made during response (if any)
```
`sup.ai.promptFull()` works exactly like `sup.ai.prompt()` but returns a detailed completion object instead of just the content. This gives you access to additional metadata about the AI's response.
**Returns:** A `SupAICompletion` object with these properties:
- `content` (string | object): The main AI response (same as `sup.ai.prompt()`)
- `reasoning` (string | undefined): Step-by-step reasoning if reasoning mode is enabled
- `finish_reason` (string): Why the response ended ("stop", "length", "tool_calls", etc.)
- `tool_calls` (array | undefined): Details of any tool calls made during the response
**Reasoning Mode:**
```js
const completion = sup.ai.promptFull("Solve this math problem: 127 × 43", {
  reasoning: true // or { effort: "high" }
});
console.log(completion.reasoning);
// "Let me break this down step by step:
// 127 × 43 = 127 × (40 + 3) = (127 × 40) + (127 × 3)..."
console.log(completion.content);
// "127 × 43 = 5,461"
```
Use `reasoning: true` or `reasoning: { effort: "high" | "medium" | "low" }` to get the AI's thought process.
## sup.ai.promptWithContext()
#### `(...strings, images, or options...)` `→ string | object`
```js
// this code
function main() {
  return sup.ai.promptWithContext("Write a haiku about the input");
}

// is roughly equivalent to this code
function main() {
  return sup.ai.prompt(`Write a haiku about ${sup.input.text}`);
}
```
This is similar to `sup.ai.prompt()`, but automatically includes relevant context about the current message, chat, and user. This helps the AI understand the conversation context without needing to manually include things like `sup.input.text` in your prompt.
The context includes:
- Current date and time
- User information (name, username, bio)
- Chat information (name)
- Message information (input, attachments)
- Reply information (when replying to a message)
## sup.ai.tts()
#### `(...strings, audio, or options...)` `→ SupAudio`
```js
// Basic text-to-speech
const speech = sup.ai.tts("sup world");

// With voice cloning using sample audio
const clonedSpeech = sup.ai.tts("sup world", sampleAudio);

// With generation options
const tunedSpeech = sup.ai.tts("sup world", {
  temperature: 0.8, // Voice variation (0-1)
  exaggeration: 1.2, // Emphasis level
  cfg: 3.0 // Classifier-free guidance
});

// Multiple text parts
const multiPartSpeech = sup.ai.tts("sup", "world", "sup sup");
```
Generates speech audio from text using AI text-to-speech. Can optionally clone a voice by providing sample audio.
**Parameters:**
- Text strings to convert to speech
- Optional `SupAudio` for voice cloning
- Optional configuration object with:
- `temperature`: Controls voice variation (0-1)
- `exaggeration`: Controls emphasis and expression level
- `cfg`: Classifier-free guidance scale
## sup.ai.embedding.embed()
#### `(input: string | SupImage | SupAudio | SupVideo, options?: SupAIEmbeddingOptions)` `→ SupEmbedding`
```js
const embedding = sup.ai.embedding.embed("sup world");
// Returns SupEmbedding { value: [0.123, -0.456, ...], model: "gemini-embedding-2-preview" }
const embedding2 = sup.ai.embedding.embed("sup world", { model: "text-embedding-3-small" });
// Uses text-embedding-3-small
const imageEmbedding = sup.ai.embedding.embed(myImage);
// Embed an image (uses default gemini-embedding-2-preview)
embedding.value; // number[]: the raw vector
embedding.model; // the model used to create this embedding
```
Generates a vector embedding for the provided input. Returns a `SupEmbedding` object containing the vector and the model used.
**Options:**
- `model`: The embedding model to use. Defaults to `gemini-embedding-2-preview`. Other models like `text-embedding-3-small` and `text-embedding-3-large` are also supported.
- `dimensions`: Number of dimensions in the output vector. If not specified, uses the model's default.
## sup.ai.embedding.distance()
#### `(a: SupEmbedding, b: SupEmbedding | string | SupImage | SupAudio | SupVideo)` `→ number`
```js
const a = sup.ai.embedding.embed("hello");
const b = sup.ai.embedding.embed("hi there");
const dist = sup.ai.embedding.distance(a, b);
// Returns 0 (identical) to 1 (opposite)
// Auto-embeds strings using the same model as the first argument
const dist2 = sup.ai.embedding.distance(a, "goodbye");
```
Computes the cosine distance between two embeddings. If the second argument is not a `SupEmbedding`, it will be automatically embedded using the same model as the first argument.
Both embeddings must use the same model; an error is thrown if the models don't match.
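For intuition, cosine distance over raw vectors can be sketched in plain JavaScript. This is an illustrative stand-in for what `sup.ai.embedding.distance()` computes over `SupEmbedding.value` arrays; the `cosineDistance` helper is hypothetical and not part of the Sup API:

```js
// Illustrative cosine distance between two raw vectors.
// Not part of the Sup API: sup.ai.embedding.distance() does this
// (plus model checks and auto-embedding) for you.
function cosineDistance(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

cosineDistance([1, 0], [1, 0]); // 0: same direction
cosineDistance([1, 0], [0, 1]); // 1: orthogonal
```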
## sup.ai.embedding.nearest()
#### `(target: SupEmbedding, candidates: (SupEmbedding | string | SupImage | SupAudio | SupVideo)[])` `→ { item: SupEmbedding, distance: number, index: number }`
```js
const query = sup.ai.embedding.embed("programming languages");
const result = sup.ai.embedding.nearest(query, ["Python", "JavaScript", "cooking recipes", "TypeScript"]);
result.item; // SupEmbedding of the closest match
result.distance; // cosine distance to the closest match
result.index; // index of the closest match in the candidates array
```
Finds the nearest candidate to a target embedding. Candidates that are not already `SupEmbedding` objects will be auto-embedded using the target's model.
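Conceptually, `nearest()` is a linear scan for the minimum distance. A plain-JavaScript sketch that produces the same `{ item, distance, index }` result shape (a hypothetical helper, not the Sup implementation, which also handles auto-embedding):

```js
// Hypothetical sketch of the nearest-candidate scan.
// The real sup.ai.embedding.nearest() also auto-embeds raw candidates.
function nearestBy(target, candidates, distance) {
  let best = { item: undefined, distance: Infinity, index: -1 };
  candidates.forEach((item, index) => {
    const d = distance(target, item);
    if (d < best.distance) best = { item, distance: d, index };
  });
  return best;
}

// Toy usage with absolute difference as the "distance"
const result = nearestBy(5, [1, 4, 9], (a, b) => Math.abs(a - b));
// result -> { item: 4, distance: 1, index: 1 }
```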
## Intelligent Expressions
Intelligent expressions are a runtime addition that lets you use "micro prompts" for conditionals, data reasoning, and more during the execution of your program.
### Type coercion
Use intelligent expressions to coerce inferred values into specific primitive types.
#### Standard syntax
```jsx
sup.ai.int("The number of Rs in strawberry");
// -> 3
sup.ai.float("A number between 0 and 1");
// -> 0.7
sup.ai.bool("Wednesday is after Tuesday");
// -> true
sup.ai.array("yellow red green", "the colors");
// -> ['yellow', 'red', 'green']
```
### Contextual coercion
Use intelligent expressions on existing variables to infer values based on their context.
#### Standard syntax
Standard syntax for contextual coercion.
```jsx
const foo = "strawberry";
foo.ai.int("The number of Rs");
// -> 3
const bar = "Wednesday";
bar.ai.bool("Comes after Tuesday");
// -> true
const baz = "yellow red green";
baz.ai.array("the colors");
// -> ['yellow', 'red', 'green']
const word = "encyclopedia";
word.ai.text("hyphen between each syllable");
// -> `en-cyc-lo-pe-di-a`
```
### Array operations
Perform intelligent operations like filtering and mapping on arrays.
#### `anArray.ai.filter()`
Standard syntax for filtering arrays:
```jsx
const words = ["yellow", "red", "sunday", "monday", "green"];
words.ai.filter("just the colors");
// -> ['yellow', 'red', 'green']
```
#### `anArray.ai.map()`
Standard syntax for mapping array elements.
```jsx
const words = ["foo", "bar", "baz"];
words.ai.map("capitalize each word");
// -> ['Foo', 'Bar', 'Baz']
```
#### Reducing arrays
This can often be accomplished using contextual coercion, for example:
```jsx
words.ai.bool('all words are a color')
// -> true
```
## onThreadReply {#onthreadreply}
Patches may optionally define an `onThreadReply` function that will be called when a user replies in the comments (or thread) of the patch message. `onThreadReply` should be defined at the top level of the patch code, outside of any other functions.
The function is passed an event object with the following properties:
```ts
onThreadReply(e: {
  user: SupUser;
  input: SupInput;
}): SupOutput | undefined
```
### Basic Usage
```js
function onThreadReply(e) {
  console.log(`${e.user.username} replied: ${e.input.text}`);
}
```
### Responding to a user's reply
You can return a value from `onThreadReply` to send a response message. The return value can be text, images, audio, video, or any other supported patch output format.
```js
function onThreadReply(e) {
  return `sup ${e.user.username}!`;
}
```
You can also return rich content like images, audio, or video:
```js
function onThreadReply(e) {
  // Return an image
  return sup.image("https://example.com/welcome.jpg");
  // Or return audio
  // return sup.audio("https://example.com/greeting.mp3");
  // Or return video
  // return sup.video("https://example.com/welcome.mp4");
}
```
### Return Value Types
`onThreadReply` can return any valid patch output, including:
- **Text**: Simple strings or formatted text
- **Images**: Using `sup.image(url)` or `sup.image(blob)`
- **Audio**: Using `sup.audio(url)` or `sup.audio(blob)`
- **Video**: Using `sup.video(url)` or `sup.video(blob)`
- **Complex content**: Arrays or objects containing multiple elements
- **undefined**: When no response is needed
```js
function onThreadReply(e) {
  if (e.input.text.includes("photo")) {
    return sup.image("https://example.com/response.jpg");
  } else if (e.input.text.includes("music")) {
    return sup.audio("https://example.com/song.mp3");
  } else if (e.input.text.includes("video")) {
    return sup.video("https://example.com/clip.mp4");
  } else {
    return `Thanks for your message: "${e.input.text}"`;
  }
}
```
### Updating the initial message
The `main` function will be re-run every time `onThreadReply` completes, so you can use `onThreadReply` to update state or perform other side effects in response to user input. This will update the initial message with the new content.
```js
function main() {
  const replyCount = sup.get("replyCount") || 0;
  return `This message has been replied to ${replyCount} times`;
}

function onThreadReply(e) {
  let count = sup.get("replyCount") || 0;
  count++;
  sup.set("replyCount", count);
}
```
### Accessing reply content and attachments
The `input` property contains the full reply content and any attachments:
```js
function onThreadReply(e) {
  // Access the text content
  const replyText = e.input.text;

  // Access images in the reply
  if (e.input.images.length > 0) {
    return `Thanks for the image, ${e.user.username}!`;
  }

  // Access files in the reply
  if (e.input.files.length > 0) {
    return `Got your file: ${e.input.files[0].name}`;
  }

  // Access videos in the reply
  if (e.input.videos.length > 0) {
    return `Cool video, ${e.user.username}!`;
  }

  // Access audio in the reply
  if (e.input.audios.length > 0) {
    return `Nice audio message!`;
  }

  // Default response
  return `Thanks for your reply: "${replyText}"`;
}
```
### Context-aware responses
You can create more intelligent responses by analyzing the reply content:
```js
function onThreadReply(e) {
  const reply = e.input.text.toLowerCase();
  if (reply.includes("help")) {
    return "Here are the available commands: /status, /info, /reset";
  } else if (reply.includes("thanks") || reply.includes("thank you")) {
    return "You're welcome! 😊";
  } else if (reply.includes("?")) {
    return "That's a great question! Let me think about that...";
  }
  // Return undefined if no specific response is needed
  return undefined;
}
```
## onReact {#onreact}
Patches may optionally define an `onReact` function that will be called when a user reacts with an emoji to the patch message. `onReact` should be defined at the top level of the patch code, outside of any other functions.
The function is passed an event object with the following properties:
```ts
onReact(e: {
  user: SupUser;
  reactionEmoji: {
    id: string;
    shortname: string;
    imageUrl?: string;
    audioUrl?: string;
  };
}): void
```
### Basic Usage
```js
function onReact(e) {
  console.log(`${e.user.username} reacted: ${e.reactionEmoji.shortname}`);
}
```
### Updating the initial message
The `main` function will be re-run every time `onReact` completes, so you can use `onReact` to update state or perform other side effects in response to user reactions. This will update the initial message with the new content.
```js
function main() {
  const reactionCount = sup.get("reactionCount") || 0;
  return `This message has received ${reactionCount} reactions`;
}

function onReact(e) {
  let count = sup.get("reactionCount") || 0;
  count++;
  sup.set("reactionCount", count);
}
```
## main {#main}
Every patch must have a `main` function. This function is the entry point for the patch and is called every time the patch is run.
The function must return a single value or an array of values. The return value can be a string, number, boolean, or a Sup component, like `SupImage` or `SupAudio`.
### Usage
```js
function main() {
  return "sup, world!";
}
```
### Valid return types
- `string | number | boolean`
- `SupImage`
- `SupAudio`
- `SupVideo`
- `SupHTML`
- `SupSVG`
- `Array<...>`
### Interactions
Patches that are interactive (e.g. using buttons or onThreadReply) will automatically re-run the `main` function when the user interacts with the patch. If the `main` function returns a different value, the new value will be displayed in the chat for all users.
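As a sketch, a minimal interactive counter might look like the following (assuming the `sup.get`/`sup.set`/`sup.button` APIs described elsewhere in this reference; a button press runs `increment()`, then `main()` re-runs and the message updates for everyone):

```js
// Hypothetical interactive patch: pressing the button re-runs main(),
// so the displayed count updates for all users in the chat.
function main() {
  const count = sup.get("count") || 0;
  return [`Button pressed ${count} times`, sup.button("Press me", increment)];
}

function increment() {
  sup.set("count", (sup.get("count") || 0) + 1);
}
```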
## launch {#launch}
Patches may optionally define a `launch` function for HTML patches that will be displayed in the HTML view/split view. When both `launch` and `main` are defined, `launch` takes precedence and will be called instead of `main`.
The `launch` function is specifically designed for patches that return HTML content. When a patch with `launch()` is executed via the launch API, it must return a `SupHTML` object. Other return types will result in an error.
### Usage
```js
// launch() must return HTML content
function launch() {
  return sup.html("<h1>Hello World</h1>");
}

// If launch() is defined, main() will not be called
function main() {
  return "This won't be executed";
}
```
### Examples
```js
// Basic HTML
function launch() {
  return sup.html("<h1>Hello World</h1>");
}

// HTML with dimensions
function launch() {
  return sup.html("<h1>My App</h1>", {
    width: 800,
    height: 600
  });
}

// HTML with state for interactivity
function launch() {
  return sup.html("<p>Click count: 0</p>", {
    state: { count: 0 }
  });
}

// HTML with Tailwind CSS
function launch() {
  return sup.html(`
    <div class="text-xl font-bold">Styled with Tailwind</div>
  `, {
    tailwind: true
  });
}

// HTML as an image (screenshot)
function launch() {
  return sup.html("<p>This will be rendered as an image</p>", {
    type: 'image',
    width: 1200,
    height: 800
  });
}

// HTML as a video
function launch() {
  return sup.html("<p>Animated content</p>", {
    type: 'video',
    duration: 5000,
    width: 1920,
    height: 1080
  });
}

// HTML with callbacks for server-side interaction
function handleQuery(event) {
  return sup.ai.prompt(`Answer: ${event.value}`);
}

function launch() {
  return sup.html(sup.asset("myApp"), {
    callbacks: ["handleQuery"]
  });
}
// Client-side JS can call: await window.sup.exec("handleQuery", "question")
```
### When to use launch()
Use `launch()` when you want to:
- Create HTML patches that will be displayed instantly in the HTML view/split view
- Build interactive web applications that run in Sup
- Create patches that are specifically designed to be "launched" rather than displayed inline in chat
**Important:** `launch()` is specifically for HTML content. Patches with `launch()` must return a `SupHTML` object. Returning other types (strings, images, etc.) will result in an error when the patch is launched.
### Return type
`launch()` **must** return a `SupHTML` object. This is enforced by the client when patches are launched via the launch API.
See the SupHTML documentation for all available options and features.
## init {#init}
Patches may optionally define an `init` function that will be called when a patch is saved or updated.
This function can be used to initialize state, or expose functions that can be called from other functions in the patch.
### Example
```js
function init() {
  sup.set("count", 0);
}

function main() {
  return [sup.get("count"), sup.button("add", add)];
}

function add() {
  sup.set("count", sup.get("count") + 1);
}
```
In this example, the `init` function initializes a count to 0. The `main` function displays the current count and a button to increment the count. The `add` function increments the count when the button is clicked. If the patch creator updates the patch, the count will be reset to 0.
### Public Functions
`init` can be used to expose functions that can be called from other patches. For example:
```js {3}
function init() {
  sup.set("count", 0);
  sup.makePublic([add, getCount]);
}

function main() {
  return [sup.get("count"), sup.button("add", add)];
}

function add() {
  sup.set("count", sup.get("count") + 1);
}

function getCount() {
  return sup.get("count");
}
```
In this example, the `add` and `getCount` functions are made public by calling `sup.makePublic`. This allows other patches to call `add` and `getCount` directly.
```js title="Another patch"
function main() {
  const countApp = sup.patch("cm37fypg7000108l2fwt82751"); // patch ID
  countApp.add();
  return countApp.getCount();
}
```
## sup.vc {#supvc}
The `sup.vc` package provides access to voice call functionality in Sup.
---
## Properties
### sup.vc.participants
#### [`SupUser[]`](https://sup.net/docs/reference/types/user)
Returns an array of users currently participating in the voice call.
---
## Methods
### sup.vc.queue.add()
#### `(mediaOrURL: string | SupVideo | SupImage | SupAudio, title?: string)` `β void`
```js
sup.vc.queue.add("https://example.com/audio.mp3", "My Audio");
sup.vc.queue.add(sup.image("example.jpg"), "My Image");
```
Adds media to the voice call queue. The media can be a URL string or a Sup media object (video, image, or audio).
### sup.vc.beginRecording()
#### `()` `→ string`
```js
const recordingId = sup.vc.beginRecording();
```
Starts recording the voice call. Returns a recording ID that can be used with `endRecording`.
### sup.vc.endRecording()
#### `(recordingId: string)` `→ SupAudio`
```js
const recordingId = sup.vc.beginRecording();
// ... some time later ...
const audio = sup.vc.endRecording(recordingId);
```
Ends the recording of the voice call and returns the recorded audio as a `SupAudio` object.
## sup.ex {#supex}
The `sup.ex` package provides convenience functions for interacting with external APIs.
---
## Functions
### sup.ex.fal()
#### `(modelName: string, input: JSONObject, key: string)` `→ JSONObject`
```js
// Run inference with a Fal AI model
const result = sup.ex.fal(
  "fal-ai/fast-sdxl",
  {
    prompt: "pixel art of a frog in a rabbit costume",
    num_inference_steps: 20,
  },
  sup.secret("FAL_API_KEY")
);
```
Calls the Fal AI API to run machine learning models.
### sup.ex.elevenlabs()
#### `(voiceId: string, text: string, key: string)` `→ SupAudio`
```js
// Generate speech from text
const audio = sup.ex.elevenlabs(
  "CJvvY4cO767O6IVtXUG1",
  "sup sup",
  sup.secret("ELEVENLABS_API_KEY")
);

// Display audio in chat
return audio;
```
Calls the ElevenLabs API to convert text to speech. Returns a `SupAudio` instance that can be played in chat.
### sup.ex.openrouter()
#### `(model: string, prompt: string)` `→ string`
```js
// Generate text using OpenRouter models
const response = sup.ex.openrouter(
  "anthropic/claude-3-haiku",
  "Explain quantum computing in simple terms"
);
```
Calls the OpenRouter API for text generation using various AI models including GPT, Claude, and others.
### sup.ex.huggingface()
#### `(model: string, prompt: string, key: string)` `→ string`
```js
// Generate text using Hugging Face models
const response = sup.ex.huggingface(
  "microsoft/DialoGPT-medium",
  "What is machine learning?",
  sup.secret("HUGGINGFACE_API_KEY")
);
```
Calls the Hugging Face Inference API for text generation using thousands of available AI models.
### sup.ex.glif()
#### `(glifId: string, input: JSONObject, key: string)` `→ JSONObject`
```js
// Run a Glif
const result = sup.ex.glif(
  "clxa7m2f80004lkozza8ralld",
  { "image-input1": sup.input.image },
  sup.secret("GLIF_API_KEY")
);
```
Calls the Glif API to access various AI and machine learning services.
---
## Notes
- Each method requires an API key for the respective service.
- The request and response formats will depend on the APIs and models being used. Refer to each service's documentation for details.
## sup.ai.image {#supaiimage}
The `sup.ai.image` package provides AI-powered image generation and manipulation capabilities. It is accessed through `sup.ai.image`.
```js title="Example sup.ai.image Usage"
// Generate a simple image
const image = sup.ai.image.create("a beautiful sunset over mountains");

// Generate with options
const hdImage = sup.ai.image.create("a futuristic cityscape", {
  aspectRatio: "16:9",
  model: "fast",
  quality: "high"
});

// Generate using reference images for style/content
const styledImage = sup.ai.image.create("a cat in this artistic style", {
  referenceImages: [sup.asset("style_reference.png")]
});

// Generate with web search for real-time data
const currentImage = sup.ai.image.create("today's news as a painting", {
  model: "gemini-3.1-flash-image-preview",
  useWebSearch: true
});

// Analyze an image
const description = sup.ai.image.interpret(
  image,
  "What objects are in this image?"
);

// Edit an image
const editedImage = sup.ai.image.edit(image, "make it more vibrant");

// Edit with style references
const styledEdit = sup.ai.image.edit(image, "apply this style", {
  referenceImages: [sup.asset("style.png")]
});
```
---
## Methods
### sup.ai.image.create()
#### `(prompt: string, options?: SupAIImageCreateOptions)` `→ SupImage`
```js
// Basic usage - just a prompt
const image = sup.ai.image.create("a beautiful sunset over mountains");

// With options
const hdImage = sup.ai.image.create("a futuristic cityscape", {
  aspectRatio: "16:9",
  model: "fast",
  quality: "high"
});

// With reference images for style or content guidance
const styledCat = sup.ai.image.create("a cat in this artistic style", {
  referenceImages: [sup.asset("style_reference.png")]
});

// With multiple reference images
const combined = sup.ai.image.create("combine these characters into one scene", {
  referenceImages: [sup.asset("character1.png"), sup.asset("character2.png")]
});

// With web search for real-time information
const currentImage = sup.ai.image.create(
  "today's weather in New York as a painting",
  { model: "gemini-3.1-flash-image-preview", useWebSearch: true }
);
```
Creates an image using AI based on a text prompt, optionally using reference images for style or content guidance.
**Parameters:**
- `prompt` (string): The text description of the image to generate
- `options` (optional object):
- `model` (`'best'` | `'fast'` | `'gemini-3-pro-image'` | `'gemini-3.1-flash-image-preview'`): The AI model to use
- `quality` (`'low'` | `'medium'` | `'high'`): The quality level for generation
- `aspectRatio` (string): The desired aspect ratio (e.g., `'16:9'`, `'1:1'`)
- `referenceImages` (SupImage[]): Reference images to guide the style/content of the generated image (max 14). Only supported by `'fast'`, `'gemini-3-pro-image'`, and `'gemini-3.1-flash-image-preview'` models.
- `useWebSearch` (boolean): Enable web search grounding for real-time data. Only supported by `'gemini-3-pro-image'` and `'gemini-3.1-flash-image-preview'` models.
**Returns:** A `SupImage` object containing the generated image
### sup.ai.image.interpret()
#### `(image: SupImage, prompt: string)` `→ string`
```js
const description = sup.ai.image.interpret(
  image,
  "What objects are in this image?"
);
```
Converts an image to text using AI image recognition.
**Parameters:**
- `image` (SupImage): The image to analyze
- `prompt` (string): The specific question or instruction for image interpretation
**Returns:** A string containing the AI's interpretation of the image
### sup.ai.image.edit()
#### `(image: SupImage | SupImage[], prompt: string, options?: SupAIImageEditOptions)` `→ SupImage`
```js
// Edit a single image
const editedImage = sup.ai.image.edit(image, "make it more vibrant");

// Edit multiple images with specific quality
const editedImages = sup.ai.image.edit(
  [image1, image2],
  "combine into a single black and white image",
  { model: "best", quality: "high" }
);

// Edit with style reference images
const styledEdit = sup.ai.image.edit(
  image,
  "apply this artistic style",
  { referenceImages: [sup.asset("style.png")] }
);

// Edit with web search for current context
const updatedImage = sup.ai.image.edit(
  image,
  "update the outfit to current fashion trends",
  { model: "gemini-3.1-flash-image-preview", useWebSearch: true }
);
```
Edits one or more images using AI based on a text prompt.
**Parameters:**
- `image` (SupImage | SupImage[]): A single image or array of images to edit
- `prompt` (string): The text description of how to edit the image(s)
- `options` (optional object):
- `model` (`'best'` | `'fast'` | `'gemini-3-pro-image'` | `'gemini-3.1-flash-image-preview'`): The AI model to use
- `quality` (`'low'` | `'medium'` | `'high'`): The quality level for editing
- `mask` (SupImage | string): A mask image to specify which areas to edit
- `referenceImages` (SupImage[]): Reference images to guide the style of the edited image (max 14). Only supported by `'fast'`, `'gemini-3-pro-image'`, and `'gemini-3.1-flash-image-preview'` models.
- `useWebSearch` (boolean): Enable web search grounding for real-time data. Only supported by `'gemini-3-pro-image'` and `'gemini-3.1-flash-image-preview'` models.
**Returns:** A `SupImage` object containing the edited image
---
## Notes
- Generated and edited images are returned as `SupImage` objects which can be displayed by returning them from `main()`.
- The AI models used may have specific limitations or requirements for prompts.
### Model Availability
| Feature | `'best'` | `'fast'` | `'gemini-3-pro-image'` | `'gemini-3.1-flash-image-preview'` |
|---------|----------|----------|------------------------|------------------------------------|
| `referenceImages` | No | Yes | Yes | Yes |
| `useWebSearch` | No | No | Yes | Yes |
- **`referenceImages`**: Allows up to 14 reference images to guide the style or content of generated/edited images. Not available with the `'best'` model.
- **`useWebSearch`**: Enables web search grounding for real-time information (e.g., current events, trends). Only available with `'gemini-3-pro-image'` and `'gemini-3.1-flash-image-preview'`.
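When a patch needs one of these features, a small guard can pick a compatible model up front. The `pickImageModel` helper below is hypothetical (not part of the Sup API); it simply encodes the availability table above:

```js
// Hypothetical helper: choose an image model that supports the
// features a patch needs, per the availability table above.
function pickImageModel({ needsReferenceImages = false, needsWebSearch = false } = {}) {
  if (needsWebSearch) return "gemini-3.1-flash-image-preview"; // supports both features
  if (needsReferenceImages) return "fast";
  return "best";
}

pickImageModel(); // "best"
pickImageModel({ needsReferenceImages: true }); // "fast"
pickImageModel({ needsWebSearch: true }); // "gemini-3.1-flash-image-preview"
```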
## SupSVG {#supsvg}
The `SupSVG` object represents SVG content in Sup. You can return it from the `main()` function to display it in the chat.
```js title="Example SVG"
SupSVG: {
  "content": "<svg viewBox=\"0 0 100 100\"><circle cx=\"50\" cy=\"50\" r=\"40\" /></svg>"
}
```
## Properties
### `content`
type: `string`
The SVG markup content.
## Usage
```js
// You can also use `sup.svg` as shorthand
const svg = sup.svg(`
  <svg viewBox="0 0 100 100">
    <circle cx="50" cy="50" r="40" fill="green" />
  </svg>
`);

// SVGs can be edited
const cursor = svg.edit();
const editedSvg = cursor.render();
```
Note: SVGs can be processed using the `edit()` method, which converts them to images for editing.
## SupFile {#supfile}
The `SupFile` object represents assets uploaded in the patch editor that aren't images, audio, or video.
```js title="Example File"
{
  "name": "document.pdf",
  "url": "https://user-uploads.cdn.overworld.xyz/r2kxmp9wvqft8nu5hjc3bsg4.pdf",
  "mimeType": "application/pdf"
}
```
## Properties
### `name`
type: `string | undefined`
The original filename, if available.
### `mimeType`
type: `string`
The MIME type of the file (e.g., "application/pdf", "text/plain").
### `url`
type: `string`
Gets the CDN URL for the file.
### `blob`
type: `Blob`
Gets the file data as a Blob. The file will be downloaded when this property is first accessed.
### `text`
type: `string`
Gets the file content as text. Use this to read the contents of a text file.
```js
function main() {
  const file = sup.file('horoscopes.json'); // horoscopes.json is uploaded as an asset
  const horoscopes = JSON.parse(file.text);
  return sup.random.choice(horoscopes.virgo);
}
```
## Output
Returning a SupFile from a patch's `main()` function will display a file attachment in the chat.
```js
function main() {
  return sup.file(
    "https://example.com/document.pdf",
    "application/pdf",
  );
}
```
## SupEmoji {#supemoji}
The `SupEmoji` object represents a custom emoji in Sup, which can have both an image and an audio component.
```js title="Example Emoji"
{
  "id": "clim8yjtr0007k10fx3pt6gip",
  "name": "a",
  "image": {
    "url": "https://user-uploads.cdn.overworld.xyz/tobpzjemg76w36f94kd1jyh9.webp",
    "width": 128,
    "height": 128
  },
  "audio": {
    "url": "https://user-uploads.cdn.overworld.xyz/k49lv3f90remnuqln1i8xt9p.mp3",
    "name": "k49lv3f90remnuqln1i8xt9p.mp3"
  },
  "creator": {
    "id": "user123",
    "username": "exampleuser",
    "displayName": "Example User"
  }
}
```
## Properties
### `id`
type: `string`
The unique identifier for the emoji.
### `name`
type: `string`
The display name of the emoji. Note: This does not include the username of the creator.
### `image`
type: `SupImage | undefined`
The image for the emoji.
### `audio`
type: `SupAudio | undefined`
The sound that plays when the emoji is used, if available.
### `creator`
type: `SupUser | undefined`
The user who created this emoji. System emojis will have `undefined` for this field.
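Because system emojis have no `creator`, code that credits creators should handle the `undefined` case. A minimal hypothetical helper:

```js
// Hypothetical helper: credit an emoji's creator, falling back for system emojis
function describeEmoji(emoji) {
  const credit = emoji.creator ? ` by ${emoji.creator.username}` : " (system emoji)";
  return `:${emoji.name}:${credit}`;
}
```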
## Usage
```js
function main() {
const matches =
(sup.input.text || "").match(/(:[a-z0-9]+:|:\w+\/\w+:)/g) || [];
const emojis = matches.map((match) => sup.emoji(match)).filter(Boolean);
return emojis.map((emoji) => emoji.image);
}
```
This example extracts emoji names from user input and displays the corresponding images.
## SupChat {#supchat}
The `SupChat` object represents a chat or conversation in Sup. It provides information about the chat and methods for storing chat-specific data.
```js title="Example Chat"
SupChat: {
"id": "cm37h4ky3000108mego5udiz4",
"title": "Ribbit Chat",
"type": "GROUP"
}
```
---
## Properties
### `id`
type: `string`
The unique identifier for the chat.
### `title`
type: `string`
The current title or name of the chat.
### `type`
type: `"DIRECT" | "GROUP" | "CHANNEL" | "SOLOCHAT" | "THREAD" | "PAGE"`
The type of chat:
- `DIRECT`: One-on-one conversation between two users
- `GROUP`: Group DM with multiple participants
- `CHANNEL`: Public channel
- `SOLOCHAT`: User's notebook
- `THREAD`: A message's thread or post comments
- `PAGE`: A page post
### `fs`
type: `SupFiles`
```js
sup.chat.fs.write(
'/exports/summary.txt',
sup.file('https://example.com/summary.txt', 'text/plain')
);
```
File storage for this chat. `sup.chat.fs` has the same methods as `sup.fs`, but it does not allow cross-chat access.
---
## Methods
### `set`
type: `function(key: string, value: any): void`
```js
sup.chat.set("pinnedMessage", "cm37h674y000408mec7b2ae5a");
```
Sets a key-value pair in the patch's state, scoped to this chat.
The key can be any string, and the value can be any JSON-serializable object (including Sup classes like `SupUser`, `SupImage`, `SupMessage`, etc.).
Chat-scoped values persist across patch executions and are only accessible when the patch runs in the same chat.
### `get`
type: `function(key: string): any | null`
```js
const pinnedMessage = sup.chat.get("pinnedMessage"); // "cm37h674y000408mec7b2ae5a"
```
Retrieves the value of a key from the chat's state in this patch.
If the key does not exist, `get` returns `null`.
Chat-specific data can be useful for storing game states, chat settings, statistics, or other persistent information that should be available to all participants in the chat.
### `keys`
type: `function(): string[]`
```js
const keys = sup.chat.keys(); // ["pinnedMessage", "gameState", ...]
```
Returns an array of all keys stored in this chat's scoped state.
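The `set`/`get`/`keys` methods combine naturally for chat-wide tallies. A hypothetical sketch (the `clicks:` key prefix is an illustrative convention, not part of the API):

```js
// Hypothetical sketch: per-user click counts stored in chat-scoped state
function recordClick(event) {
  const key = `clicks:${event.user.id}`;
  sup.chat.set(key, (sup.chat.get(key) || 0) + 1);
}

function clickTotals() {
  return sup.chat
    .keys()
    .filter((key) => key.startsWith("clicks:"))
    .map((key) => `${key.slice("clicks:".length)}: ${sup.chat.get(key)}`);
}
```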
---
## File Storage
The `fs` property provides chat-scoped file storage helpers.
### `fs.write`
type: `function(path: string, file: string | SupImage | SupAudio | SupVideo | SupFile, options?: { mimeType?: string }): SupImage | SupAudio | SupVideo | SupFile`
```js
const image = sup.ai.image.create('retro game over screen');
sup.chat.fs.write('/images/game-over.png', image);
```
Writes a file into this chat's storage.
### `fs.read`
type: `function(path: string): SupImage | SupAudio | SupVideo | SupFile`
```js
const image = sup.chat.fs.read('/images/game-over.png');
return image;
```
Reads a previously stored file from this chat.
### `fs.exists`
type: `function(path: string): boolean`
```js
if (sup.chat.fs.exists('/images/game-over.png')) {
return 'already saved';
}
```
Checks whether a file or directory exists.
### `fs.list`
type: `function(path?: string): { filename?: string; isDirectory: boolean; mimeType?: string; path: string; size?: number; type: "file" | "directory"; url?: string }[]`
```js
const files = sup.chat.fs.list('/images');
```
Lists files and directories in chat storage.
### `fs.stat`
type: `function(path: string): { filename: string; isDirectory: false; mimeType: string; path: string; size: number; url: string } | null`
```js
const info = sup.chat.fs.stat('/images/game-over.png');
```
Returns metadata for a stored file, or `null` if it is missing.
### `fs.mkdir`
type: `function(path: string): void`
```js
sup.chat.fs.mkdir('/images');
```
Creates an empty directory in chat storage.
### `fs.move`
type: `function(oldPath: string, newPath: string): void`
```js
sup.chat.fs.move('/drafts/version-1.png', '/images/final.png');
```
Moves a file within chat storage.
---
## Users
The `users` property provides methods for querying participants in the chat.
### `users.list`
type: `function(options?: { limit?: number; offset?: number }): SupUser[]`
```js
const users = sup.chat.users.list(); // first 100 users
const page2 = sup.chat.users.list({ limit: 50, offset: 50 });
```
Returns an array of `SupUser` objects representing the active participants in this chat.
- `limit`: Maximum number of users to return (1-500, default: 100)
- `offset`: Number of users to skip for pagination (default: 0)
### `users.count`
type: `function(): number`
```js
const total = sup.chat.users.count();
```
Returns the total number of active participants in this chat.
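Since `users.list` caps each call at 500 results, `users.count` can drive a pagination loop. A hypothetical sketch using batches of 100:

```js
// Hypothetical sketch: collect every participant by paging in batches of 100
function allParticipants() {
  const total = sup.chat.users.count();
  const users = [];
  for (let offset = 0; offset < total; offset += 100) {
    users.push(...sup.chat.users.list({ limit: 100, offset }));
  }
  return users;
}
```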
## SupButton {#supbutton}
The `SupButton` object represents an interactive button in Sup that can trigger callbacks when clicked. It is created using `sup.button()`, and should be returned from the `main()` function.
```js title="Example Button"
SupButton: {
"label": "Click me!",
"userEventToken": "abc123"
}
```
## Constructor
### `sup.button(label, clickCallbackFn, value?)`
```js
const button = sup.button("Click me!", () => {
console.log("Button clicked!");
});
```
Parameters:
- `label`: The text to display on the button
- `clickCallbackFn`: A function to call when the button is clicked. Must be defined at the top level of your patch.
- `value`: Optional value to pass to the callback function
The click callback function receives an argument of type:
```ts
{
"user": SupUser, // The user who clicked the button
"value": any // The value passed when creating the button
}
```
## Properties
### `label`
type: `string`
The text displayed on the button.
### `userEventToken`
type: `string`
A unique token used internally to track button clicks.
## Notes
- The callback function must be defined at the top level of your patch (not inside other functions)
- The callback receives a single argument containing both the user who clicked and the value passed when creating the button
- For more examples of interactive elements, see the Adding Interactivity guide
## SupSegment {#supsegment}
The `SupSegment` object allows you to show different content to different users within the same patch output. It is created using `sup.segment()` and should be returned from the `main()` function.
## Limitations
**Client-side only**: Segments control which content is *displayed* to users on the client, but all segment data is sent to every user. This is a UI presentation feature, not a security mechanism.
**No access restriction**: The segment does not prevent users from accessing the "hidden" content. A determined user could inspect the underlying data to see content meant for other segments. If you need to protect sensitive information:
- Do not include secrets, private data, or security tokens in any segment
- Implement actual access control in your patch logic by checking `sup.user` before making changes or revealing privileged data
- Use segments only for personalization and UI customization, not for security
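To make the distinction concrete, here is a hypothetical sketch where the segment only hides a button visually, while the callback performs the real authorization check (the `ownerId` key is illustrative):

```js
// Hypothetical sketch: the segment hides the button, but onReset does the real check
function onReset(event) {
  // Verify the clicker in patch logic; hiding UI alone is not access control
  if (event.user.id !== sup.chat.get("ownerId")) return "not allowed";
  sup.chat.set("gameState", null);
  return "reset";
}

function main() {
  const ownerId = sup.chat.get("ownerId");
  return sup.segment(ownerId, sup.button("Reset game", onReset), "Game in progress");
}
```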
## Constructor
### `sup.segment(users, matchingElements?, notMatchingElements?)`
```js
// Show different content to a specific user
const segment = sup.segment(
user,
"Content only this user sees",
"Content everyone else sees"
);
// Target multiple users
const segment = sup.segment(
[user1, user2],
["Content for these users"],
["Content for others"]
);
```
Parameters:
- `users`: A single user (or user ID string), or an array of users/IDs to target
- `matchingElements`: Content shown to users in the list (optional)
- `notMatchingElements`: Content shown to users NOT in the list (optional)
Both element parameters accept single values or arrays of any valid return type (strings, numbers, SupImage, SupButton, etc.).
## Properties
### `userIds`
type: `string[]`
The list of user IDs that will see the matching content.
### `matching`
type: `unknown[]`
The content shown to users in the `userIds` list.
### `notMatching`
type: `unknown[]`
The content shown to users NOT in the `userIds` list.
## Example Usage
```js
function main() {
const currentUser = sup.user;
// Show a special message to the patch creator
const creatorSegment = sup.segment(
"creator-user-id",
"Welcome back, creator!",
"Hello, visitor!"
);
// Show admin controls only to specific users
const adminControls = sup.segment(
[admin1, admin2],
[sup.button("Delete", onDelete), sup.button("Edit", onEdit)],
[] // Others see nothing
);
return [creatorSegment, adminControls];
}
```
## SupCaller {#supcaller}
The `SupCaller` object provides information about what invoked the current patch: either a user running it directly or another patch calling it via `patch.run()` or `patch.public`.
Access it via `sup.caller`.
```js title="Example"
// Check who called this patch
if (sup.caller.type === 'user') {
return `Run by ${sup.caller.user.username}`;
} else {
return `Called by patch ${sup.caller.patch.name}`;
}
// Check if this is a preview invocation
if (sup.caller.isPreview) {
return "This is a preview - not yet shared with chat";
}
```
## Properties
### type
type: `'user' | 'patch'`
`'user'` when the patch was invoked directly by a user, `'patch'` when called by another patch (via `patch.run()` or `patch.public`).
### isPreview
type: `boolean`
Whether this invocation is a preview (unsent draft). When `true`, the message has not been shared with the chat yet; the user sees a "Share with chat" button.
### user
type: `SupUser` `optional`
The user who invoked this patch. Only set when `type === 'user'`, otherwise `undefined`.
```js
if (sup.caller.type === 'user') {
const username = sup.caller.user.username;
const pfp = sup.caller.user.pfp;
}
```
### patch
type: `SupPatch` `optional`
The patch that called this patch. Only set when `type === 'patch'`, otherwise `undefined`.
```js
if (sup.caller.type === 'patch') {
const callerName = sup.caller.patch.name;
const callerAuthor = sup.caller.patch.author.username;
}
```
## sup.vm {#supvm}
The `sup.vm()` function launches dedicated Linux VMs for executing shell commands, running build processes, serving web content, and processing media files. Each VM provides full Linux environment access with persistent storage and network capabilities.
---
## Creating a VM
### sup.vm()
#### `() → SupVM`
```js
// Create a new VM
const vm = sup.vm();
// Reuse VM across invocations by storing in global state
let cachedVm = sup.global.get('vm');
if (!cachedVm) {
  cachedVm = sup.vm();
  sup.global.set('vm', cachedVm);
}
```
Launches a fresh Linux VM. Each call provisions a new VM instance. To reuse the same VM across multiple patch invocations, persist the `SupVM` object in the datastore using `sup.global.set()`.
### Recommended lifecycle and reuse strategy
In most patches, you should **not** call `sup.vm()` on every run.
- Prefer one VM per patch (single shared VM)
- Or one VM per chat context (keyed by `sup.chat.id`)
- Recreate only when missing or unhealthy
Creating many short-lived VMs is usually slower than reusing an existing VM.
#### Pattern: one VM per patch
```js
function ensurePatchVm() {
let vm = sup.global.get('vm');
if (vm) {
const health = vm.execResult('echo ok');
if (health.exitCode === 0) return vm;
}
vm = sup.vm();
vm.exec('mkdir -p /workspace/app'); // Optional one-time init
sup.global.set('vm', vm);
return vm;
}
```
#### Pattern: one VM per chat (keyed by chat ID)
```js
function ensureChatVm() {
const key = `vm:${sup.chat.id}`;
let vm = sup.global.get(key);
if (vm) {
const health = vm.execResult('echo ok');
if (health.exitCode === 0) return vm;
}
vm = sup.vm();
vm.exec('mkdir -p /workspace/app'); // Optional one-time init
sup.global.set(key, vm);
return vm;
}
```
---
## VM Properties
### vm.id
#### `string`
```js
const vmId = vm.id;
// 'vm-1234abcd'
```
Unique identifier for the VM instance. This ID remains stable across invocations if the VM is persisted in the datastore.
### vm.url
#### `string`
```js
vm.execStreaming('python3 -m http.server 3000');
const webUrl = vm.url;
// 'https://abcd.supvm.com'
```
Public HTTPS URL that forwards traffic to port 3000 on the VM. Use this to expose web servers, APIs, or any HTTP-accessible service running inside the VM.
---
## Command Execution
### vm.exec()
#### `(command: string | string[], options?: { cwd?: string; env?: Record<string, string> }) → string`
```js
// Simple command
const output = vm.exec('uname -a');
// 'Linux supvm-1234 ...'
// With working directory
const files = vm.exec('ls', { cwd: 'app' });
// 'README.md\nindex.mjs\n'
// With environment variables
const result = vm.exec('echo $MY_VAR', {
env: { MY_VAR: 'hello' }
});
// 'hello\n'
// Array syntax for safer argument handling
const cloneOutput = vm.exec(['git', 'clone', repoUrl]);
```
Runs a command and returns merged stdout/stderr output. Throws a `SupVMExecError` if the command exits with a non-zero code.
**Command Format:**
- String: Executed via `sh -c` (supports shell features like pipes, redirects)
- Array: Direct execution (safer for dynamic arguments, no shell injection risk)
**Options:**
- `cwd`: Working directory for the command
- `env`: Environment variables to set (merged with default environment)
**Error Handling:**
Throws `SupVMExecError` with properties:
- `exitCode`: Command exit code
- `stdout`: Standard output
- `stderr`: Standard error
- `output`: Combined stdout/stderr
- `reason`: Optional error description
### vm.execResult()
#### `(command: string | string[], options?: { cwd?: string; env?: Record<string, string> }) → { output: string; stdout: string; stderr: string; exitCode: number; error?: string }`
```js
const version = vm.execResult(['node', '--version']);
// { exitCode: 0, stdout: 'v18.17.0\n', stderr: '', output: 'v18.17.0\n' }
const result = vm.execResult('exit 1');
// { exitCode: 1, stdout: '', stderr: '', output: '', error: 'Command failed' }
if (result.exitCode !== 0) {
return `Build failed: ${result.output}`;
}
```
Runs a command and captures stdout/stderr/exitCode without throwing on failure. Useful when you need to handle errors programmatically or when non-zero exit codes are expected.
**Returns:**
- `exitCode`: Command exit code (0 = success)
- `stdout`: Standard output only
- `stderr`: Standard error only
- `output`: Combined stdout/stderr
- `error`: Optional error message
### vm.execStreaming()
#### `(command: string | string[], options?: { cwd?: string; env?: Record<string, string> }) → SupVMStreamHandle`
```js
// Stream output for long-running commands
const cmd = vm.execStreaming('apt-get install -y neovim');
let collected = '';
for (const chunk of cmd.output()) {
collected += chunk;
sup.status(collected);
}
// Run in background and check later
const server = vm.execStreaming('python3 -m http.server 3000');
sup.sleep(1000);
console.log(server.status); // { state: 'running' }
// Stream stdout only
const build = vm.execStreaming('npm run build');
for (const chunk of build.stdout()) {
sup.status(`Building: ${chunk}`);
}
```
Executes a command and returns a handle for streaming output in real-time. Useful for long-running processes, build commands, or when you need to provide progress updates.
**Stream Handle Methods:** See SupVMStreamHandle section below.
---
## File Transfer
### vm.upload()
#### `(source: string | SupImage | SupAudio | SupVideo | SupFile, dest?: string) → string`
```js
// Upload with automatic filename
const path = vm.upload(sup.input.image);
// 'image.png'
// Upload with custom destination
const thumbPath = vm.upload(sup.input.image, 'thumbnail.png');
// 'thumbnail.png'
// Upload text content
const scriptPath = vm.upload('console.log("Hello")', 'script.js');
// 'script.js'
// Process uploaded file
const imagePath = vm.upload(sup.input.image, 'input.png');
vm.exec(`convert ${imagePath} -resize 512x512 output.png`);
```
Copies content into the VM filesystem and returns the destination path.
**Source Types:**
- `string`: URL or raw text content
- `SupImage`, `SupAudio`, `SupVideo`, `SupFile`: Sup media objects
**Destination:**
- If omitted, filename is inferred from content type using libmagic
- If provided, must include filename (e.g., `'images/pic.png'`)
### vm.uploadStreaming()
#### `(source: string | SupImage | SupAudio | SupVideo | SupFile, dest?: string) → SupFileUploadHandle`
```js
// Upload with progress updates
const handle = vm.uploadStreaming(sup.input.file, 'dataset.zip');
for (const percent of handle.progress()) {
sup.status(`Uploading dataset.zip (${percent}%)`);
}
const path = handle.wait();
// 'dataset.zip'
// Upload in background
const upload = vm.uploadStreaming(largeFile, 'data.bin');
// Do other work...
const finalPath = upload.wait();
```
Uploads content with streaming progress updates. Useful for large files where you want to show upload progress.
**Returns:** SupFileUploadHandle for monitoring upload progress.
### vm.download()
#### `(path: string) → SupImage | SupAudio | SupVideo | SupFile`
```js
// Download and return as Sup media
const image = vm.download('output.png');
return image;
// Download after processing
vm.exec('convert input.png -blur 0x8 blurred.png');
const result = vm.download('blurred.png');
return result;
```
Downloads a file from the VM and returns it as a Sup media object. The return type is automatically determined by the file's MIME type.
**Returns:**
- `SupImage` for image files (PNG, JPG, GIF, etc.)
- `SupAudio` for audio files (MP3, WAV, etc.)
- `SupVideo` for video files (MP4, WebM, etc.)
- `SupFile` for other file types
### vm.downloadStreaming()
#### `(path: string) → SupFileDownloadHandle`
```js
// Download with progress
const handle = vm.downloadStreaming('large-file.iso');
for (const percent of handle.progress()) {
sup.status(`Downloading (${percent}%)`);
}
const file = handle.wait();
return file;
// Download in background
const download = vm.downloadStreaming('output.mp4');
// Do other work...
const video = download.result;
```
Downloads a file with streaming progress updates. Useful for large files where you want to show download progress.
**Returns:** SupFileDownloadHandle for monitoring download progress.
### vm.cat()
#### `(path: string) → string`
```js
// Read text file contents
const config = vm.cat('/workspace/.env');
console.log(config);
// Read command output
vm.exec('echo "Hello, World!" > output.txt');
const content = vm.cat('output.txt');
// 'Hello, World!\n'
```
Reads a text file from the VM and returns its contents as a string. Convenience method for reading text files without needing to download and parse them.
---
## Session Management
### vm.session()
#### `() → SupVMSession`
```js
// Create session with persistent context
const session = vm.session();
session.cd('/app');
session.setEnv('NODE_ENV', 'production');
session.exec('npm install');
session.exec('npm run build');
// Persist session across invocations
sup.global.set('vm-session', session);
const restored = sup.global.get('vm-session');
```
Creates a session object that maintains working directory and environment variables across multiple commands. This is cleaner than passing `cwd` and `env` options to every `exec()` call.
**Session Methods:** See SupVMSession section below.
### vm.delete()
#### `() → void`
```js
// Clean up VM when done
vm.delete();
sup.global.set('vm', undefined);
// Conditional cleanup
if (sup.input.text === 'cleanup') {
const vm = sup.global.get('vm');
if (vm) {
vm.delete();
sup.global.set('vm', undefined);
return 'VM deleted';
}
}
```
Immediately tears down the VM and releases all resources. Use this when you're done with a VM to avoid leaving it running.
**Important:** After calling `delete()`, the VM object becomes invalid. Remove it from the datastore to avoid attempting to use a deleted VM.
---
## SupVMSession
A helper class that maintains working directory and environment state across multiple VM commands.
### session.cd()
#### `(dir: string) → void`
```js
session.cd('/app');
session.exec('ls'); // Lists /app contents
```
Sets the working directory for all subsequent commands in this session.
### session.setEnv()
#### `(key: string, value: string) → void`
```js
session.setEnv('NODE_ENV', 'production');
session.setEnv('PORT', '3000');
session.exec('npm start');
```
Sets an environment variable for all subsequent commands in this session.
### session.cwd
#### `string | undefined`
```js
console.log(session.cwd);
// '/workspace/app/api'
```
Current working directory override for this session, if any.
### session.env
#### `Record<string, string>`
```js
console.log(session.env);
// { NODE_ENV: 'production', PORT: '3000' }
```
Snapshot of environment variable overrides applied to this session.
### session.exec()
#### `(command: string | string[]) → string`
```js
const output = session.exec('npm test');
```
Runs a command with the session's `cwd` and `env` settings. Throws on non-zero exit code.
### session.execResult()
#### `(command: string | string[]) → { output: string; stdout: string; stderr: string; exitCode: number; error?: string }`
```js
const result = session.execResult('npm test');
if (result.exitCode !== 0) {
return `Tests failed: ${result.output}`;
}
```
Runs a command with the session's context and returns result without throwing.
### session.execStreaming()
#### `(command: string | string[]) → SupVMStreamHandle`
```js
const stream = session.execStreaming('npm run dev');
for (const chunk of stream.output()) {
sup.status(chunk);
}
```
Streams command output with the session's context applied.
---
## SupVMStreamHandle
Handle returned by `vm.execStreaming()` for controlling and reading from streaming command execution.
### handle.output()
#### `() → Generator<string>`
```js
let collected = '';
for (const chunk of handle.output()) {
collected += chunk;
sup.status(collected);
}
return collected;
```
Iterates over merged stdout+stderr chunks as they arrive.
### handle.stdout()
#### `() → Generator<string>`
```js
let output = '';
for (const chunk of handle.stdout()) {
output += chunk;
sup.status(`Building: ${output}`);
}
return output;
```
Iterates over stdout chunks only, ignoring stderr.
### handle.stderr()
#### `() → Generator<string>`
```js
let errors = '';
for (const chunk of handle.stderr()) {
errors += chunk;
sup.status(`Errors: ${errors}`);
}
return errors;
```
Iterates over stderr chunks only, ignoring stdout.
### handle.sendStdin()
#### `(text: string) → void`
```js
const handle = vm.execStreaming('python3');
handle.sendStdin('print(1 + 1)\n');
handle.sendEOF();
const output = handle.block();
// '2\n'
```
Writes text to the running process's stdin. Useful for interactive commands that read from stdin.
### handle.sendEOF()
#### `() → void`
```js
const handle = vm.execStreaming('python3');
handle.sendStdin('1 + 1\n');
handle.sendEOF(); // Signal end of input
const result = handle.block();
```
Closes stdin (sends EOF signal). Required to signal the end of input for commands that read from stdin.
### handle.kill()
#### `() → void`
```js
const server = vm.execStreaming('python3 -m http.server');
sup.sleep(5000);
server.kill(); // Terminate after 5 seconds
```
Terminates the running process immediately. First attempts SIGTERM, then SIGKILL after a timeout if the process is still alive.
### handle.block()
#### `() → string`
```js
const handle = vm.execStreaming('echo done');
const output = handle.block();
// 'done\n'
```
Waits for the command to exit and returns merged output. Throws `SupVMExecError` if the exit code is non-zero.
### handle.blockResult()
#### `() → { output: string; stdout: string; stderr: string; exitCode: number; error?: string }`
```js
const { exitCode, output } = handle.blockResult();
if (exitCode !== 0) {
return `Build failed: ${output}`;
}
return `Success: ${output}`;
```
Waits for exit and captures stdout/stderr/exitCode without throwing. Useful when non-zero exit codes are expected.
### handle.status
#### `{ state: 'running' | 'exited' | 'killed' | 'error'; exitCode?: number }`
```js
if (handle.status.state === 'exited') {
return `Job completed with code ${handle.status.exitCode}`;
}
// Check periodically
while (handle.status.state === 'running') {
sup.status('Still running...');
sup.sleep(1000);
}
```
Lightweight status check without blocking. Check this property to determine if a command is still running.
### handle.tail()
#### `(n: number) → string[]`
```js
const recentOutput = handle.tail(20).join('\n');
sup.status(recentOutput);
```
Fetches the last N lines of merged output (best effort). Useful for checking recent output without reading the entire stream.
---
## SupFileUploadHandle
Handle returned by `vm.uploadStreaming()` for monitoring file upload progress.
### handle.progress()
#### `() → Generator<number>`
```js
for (const percent of handle.progress()) {
sup.status(`Uploading: ${percent}%`);
}
```
Iterates over completion percentages from 0 to 100 as the upload progresses.
### handle.wait()
#### `() → string`
```js
const path = handle.wait();
// 'uploaded-file.zip'
```
Blocks until upload completes and returns the final file path in the VM.
### handle.path
#### `string`
```js
handle.wait();
const filePath = handle.path;
// 'uploaded-file.zip'
```
Final path once the upload finishes. Only valid after calling `wait()` or draining the progress iterator.
---
## SupFileDownloadHandle
Handle returned by `vm.downloadStreaming()` for monitoring file download progress.
### handle.progress()
#### `() → Generator<number>`
```js
for (const percent of handle.progress()) {
sup.status(`Downloading: ${percent}%`);
}
```
Iterates over completion percentages from 0 to 100 as the download progresses.
### handle.wait()
#### `() → SupImage | SupAudio | SupVideo | SupFile`
```js
const file = handle.wait();
return file;
```
Blocks until download completes and returns the downloaded media object.
### handle.result
#### `SupImage | SupAudio | SupVideo | SupFile`
```js
handle.wait();
const media = handle.result;
return media;
```
Final downloaded media object. Only valid after calling `wait()` or draining the progress iterator.
---
## Error Handling
### SupVMExecError
Exception thrown when a VM command exits with a non-zero code.
```js
try {
vm.exec('exit 1');
} catch (error) {
console.log(error.exitCode); // 1
console.log(error.stdout); // ''
console.log(error.stderr); // ''
console.log(error.output); // Combined output
console.log(error.reason); // Optional error description
}
```
**Properties:**
- `exitCode`: Command exit code (non-zero)
- `stdout`: Standard output
- `stderr`: Standard error
- `output`: Combined stdout/stderr
- `reason`: Optional error description
- `message`: Error message string
---
## Common Patterns
### Ensure VM (recommended default)
```js
function ensureVm() {
const key = `vm:${sup.chat.id}`; // Use 'vm' for one VM per patch
let vm = sup.global.get(key);
if (vm) {
const health = vm.execResult('echo ok');
if (health.exitCode === 0) return vm;
}
vm = sup.vm();
vm.exec('mkdir -p /workspace/app');
vm.execResult('node --version'); // Optional warm-up check
sup.global.set(key, vm);
return vm;
}
function main() {
const vm = ensureVm();
return `Ready: ${vm.id}`;
}
```
### Web Server
```js
function main() {
const key = `vm:${sup.chat.id}`;
let vm = sup.global.get(key);
if (!vm || vm.execResult('echo ok').exitCode !== 0) {
vm = sup.vm();
sup.global.set(key, vm);
}
// Start server only if not already running
const check = vm.execResult('pgrep -f "python3 -m http.server 3000"');
if (check.exitCode !== 0) {
vm.execStreaming('python3 -m http.server 3000');
}
return [
'Server is running at:',
vm.url
];
}
```
### Image Processing
```js
function main() {
// Reuse per-chat VM by default
const key = `vm:${sup.chat.id}`;
let vm = sup.global.get(key);
if (!vm || vm.execResult('echo ok').exitCode !== 0) {
vm = sup.vm();
sup.global.set(key, vm);
}
// Upload image
const inputPath = vm.upload(sup.input.image, 'input.png');
// Process with ImageMagick
vm.exec(`convert ${inputPath} -resize 512x512 -blur 0x8 output.png`);
// Download result
const result = vm.download('output.png');
return result;
}
```
### Ephemeral VM job (exception)
Use this only for truly one-off tasks where cleanup is required immediately after execution.
```js
function main() {
const vm = sup.vm();
try {
vm.exec('echo "one-off task" > /workspace/job.txt');
return vm.cat('/workspace/job.txt');
} finally {
vm.delete();
}
}
```
## sup.search {#supsearch}
The `sup.search` package provides search capabilities for finding patches and emojis on the platform. It is accessed through `sup.search`.
```js title="Example sup.search Usage"
// Search for patches
const aiPatches = sup.search.patches("ai image generation", 5);
console.log(aiPatches); // Array of up to 5 matching patches
// Search for emojis
const dogEmojis = sup.search.emojis("dog", 3);
console.log(dogEmojis); // Array of up to 3 matching emojis
// Use found patches
const imagePatch = sup.search.patches("image editor", 1)[0];
if (imagePatch) {
const result = imagePatch.run(sup.input.image);
return result;
}
// Use found emojis
const emoji = sup.search.emojis("fire", 1)[0];
if (emoji) {
return [emoji, "Found the emoji!"];
}
```
---
## Methods
### sup.search.patches()
#### `(query: string, numResults?: number)` `→ SupPatch[]`
```js
// Search for patches related to "music"
const patches = sup.search.patches("music player", 5);
// Get first result
const topPatch = sup.search.patches("weather", 1)[0];
// Use default number of results (if not specified)
const results = sup.search.patches("calculator");
// Execute a found patch
const imagePatches = sup.search.patches("blur image");
if (imagePatches.length > 0) {
const result = imagePatches[0].run(sup.input.image);
return result;
}
```
Searches for patches on the platform using semantic search. Returns patches that match the query based on their names, descriptions, and functionality.
**Parameters:**
- `query` (string): Search query describing what patches to find
- `numResults` (optional number): Maximum number of results to return (default varies)
**Returns:** Array of `SupPatch` objects matching the query
**Example use cases:**
```js
// Find utility patches
const converters = sup.search.patches("unit converter");
const timers = sup.search.patches("countdown timer");
// Find creative tools
const imageTools = sup.search.patches("photo editor effects");
const musicMakers = sup.search.patches("beat maker synthesizer");
// Call a found patch's public functions
const mathPatch = sup.search.patches("advanced calculator", 1)[0];
if (mathPatch.public.factorial) {
const result = mathPatch.public.factorial(5);
console.log(result); // 120
}
```
### sup.search.emojis()
#### `(query: string, numResults?: number)` `→ SupEmoji[]`
```js
// Search for animal emojis
const animals = sup.search.emojis("cat dog", 10);
// Get a specific emoji
const fire = sup.search.emojis("fire", 1)[0];
// Use default number of results
const hearts = sup.search.emojis("heart");
// Display found emojis
const results = sup.search.emojis("happy face", 5);
return ["Found emojis:", ...results];
```
Searches for emojis on the platform. Returns custom and standard emojis that match the query.
**Parameters:**
- `query` (string): Search query describing what emojis to find
- `numResults` (optional number): Maximum number of results to return (default varies)
**Returns:** Array of `SupEmoji` objects matching the query
**Example use cases:**
```js
// Find reaction emojis
const reactions = sup.search.emojis("thumbs up like yes");
const celebrations = sup.search.emojis("party celebrate confetti");
// Check emoji properties
const emoji = sup.search.emojis("robot", 1)[0];
if (emoji) {
console.log(emoji.name); // 'robot'
console.log(emoji.id); // Unique ID
console.log(emoji.image); // SupImage if available
console.log(emoji.audio); // SupAudio if available
}
// Build an emoji response
const randomEmoji = sup.search.emojis(sup.input.text, 1)[0];
if (randomEmoji) {
return [randomEmoji, "Here's an emoji for you!"];
}
```
---
## Return Types
### SupPatch
Returned patches are `SupPatch` objects with these capabilities:
```js
const patch = sup.search.patches("calculator", 1)[0];
// Properties
patch.id; // Unique patch ID
patch.code; // Source code
patch.public; // Public functions and data
// Methods
patch.run(...args); // Execute main() function
patch.public.functionName(); // Call public functions
```
See the Patch type reference for more details.
### SupEmoji
Returned emojis are `SupEmoji` objects with these properties:
```js
const emoji = sup.search.emojis("star", 1)[0];
emoji.id; // string - Unique emoji ID
emoji.name; // string - Emoji name/shortname
emoji.image; // SupImage | undefined - Visual representation
emoji.audio; // SupAudio | undefined - Sound effect
```
See the Emoji type reference for more details.
---
## Examples
### Finding and Using Related Patches
```js
// Search for image processing patches
const imagePatches = sup.search.patches("image filter effects", 3);
if (imagePatches.length === 0) {
return "No image patches found!";
}
// Let user know what we found
const patchNames = imagePatches.map(p => p.id).join(", ");
sup.status(`Found patches: ${patchNames}`);
// Use the first one
const result = imagePatches[0].run(sup.input.image);
return result;
```
### Building an Emoji Picker
```js
const query = sup.input.text || "random emoji";
const emojis = sup.search.emojis(query, 10);
if (emojis.length === 0) {
return "No emojis found for: " + query;
}
// Return grid of emojis
return [
`Found ${emojis.length} emojis for "${query}":`,
...emojis
];
```
### Composing Multiple Patches
```js
// Find related patches
const blurPatch = sup.search.patches("blur image", 1)[0];
const sharpenPatch = sup.search.patches("sharpen enhance", 1)[0];
if (!blurPatch || !sharpenPatch) {
return "Required patches not found";
}
// Apply multiple effects
let image = sup.input.image;
image = blurPatch.run(image);
image = sharpenPatch.run(image);
return image;
```
---
## Notes
- Search uses semantic matching, not just keyword matching
- Results are ranked by relevance to the query
- Search includes only public patches
## sup.lookup {#suplookup}
The `sup.lookup` package resolves platform entities by ID or slug. It is useful when your patch already knows a concrete identifier and wants the corresponding Sup object.
Unlike `sup.search`, lookup does not perform semantic search. It attempts to fetch an exact entity and returns `undefined` when nothing matches.
```js title="Example sup.lookup Usage"
const patch = sup.lookup.patch('cm37fypg7000108l2fwt82751');
if (!patch) return 'Patch not found';
return patch.run('hello');
```
---
## Methods
### sup.lookup.chat()
#### `(id: string)` `→ SupChat | undefined`
```js
const chat = sup.lookup.chat('cm37h4ky3000108mego5udiz4');
if (chat) {
return chat.title;
}
```
Looks up a chat by ID.
### sup.lookup.emoji()
#### `(idOrSlug: string)` `→ SupEmoji | undefined`
```js
const emoji = sup.lookup.emoji('star');
if (emoji) {
return [emoji, `Found ${emoji.name}`];
}
```
Looks up an emoji by ID or slug.
### sup.lookup.patch()
#### `(idOrSlug: string)` `→ SupPatch | undefined`
```js
const patch = sup.lookup.patch('/shahruz/weather');
if (!patch) return 'Patch not found';
return patch.run();
```
Looks up a patch by ID or slug.
### sup.lookup.user()
#### `(idOrUsername: string)` `→ SupUser | undefined`
```js
const user = sup.lookup.user('shahruz');
if (user) {
return `Found @${user.username}`;
}
```
Looks up a user by ID or username.
---
## When To Use
- Use `sup.lookup` when you already know the exact ID or slug.
- Use `sup.search` when you want fuzzy or semantic discovery.
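When a patch may or may not exist at a known slug, a useful pattern is to try `sup.lookup.patch()` first and fall back to `sup.search.patches()`. A sketch; the `/shahruz/weather` slug and the search query are illustrative, not guaranteed to exist:

```js
// Resolve a patch by exact slug first, then fall back to semantic search.
function resolveWeatherPatch() {
  const exact = sup.lookup.patch('/shahruz/weather');
  if (exact) return exact;
  const candidates = sup.search.patches('weather forecast', 1);
  return candidates.length > 0 ? candidates[0] : undefined;
}

function main() {
  const patch = resolveWeatherPatch();
  if (!patch) return 'No weather patch found';
  return patch.run(sup.input.text);
}
```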
## sup.intent {#supintent}
The `sup.intent` package provides intent return types that open native Sup flows with prefilled values.
---
## Methods
### sup.intent.emoji()
#### `(opts: { image?: SupImage; name?: string; audio?: SupAudio })` `→ SupEmojiIntent`
```js
function main() {
const image = sup.ai.image.create('emoji of a star');
return sup.intent.emoji({ name: 'star', image });
}
```
Returns an intent to open the emoji creation flow with prefilled values.
At least one of `image` or `audio` is required. The user can edit the prefilled values before saving.
```js
function main() {
const image = sup.ai.image.create(`emoji of ${sup.input.text || 'sparkles'}`);
const sound = sup.ai.tts('sparkle');
return sup.intent.emoji({
name: 'sparkles',
image,
audio: sound
});
}
```
Intent values should usually be returned directly from `main()` or another public function so Sup can open the corresponding native flow.
## sup.fs {#supfs}
The `sup.fs` package provides file storage access for the current chat, with support for cross-chat reads using absolute paths.
`sup.chat.fs` exposes the same methods but is limited to the current chat only.
Files written through this API always belong to the current chat. Reads respect access control.
```js title="Example sup.fs Usage"
// Save a generated image for reuse later in the chat
const avatar = sup.ai.image.create('pixel art frog avatar');
sup.fs.write('/avatars/frog.png', avatar);
// List saved files
const files = sup.fs.list('/avatars');
// Read one back
const image = sup.fs.read('/avatars/frog.png');
return image;
```
---
## Methods
### sup.fs.exists()
#### `(path: string)` `→ boolean`
```js
if (sup.fs.exists('/notes/today.txt')) {
return 'notes file exists';
}
```
Checks whether a file or directory exists at the given path.
### sup.fs.list()
#### `(path?: string)` `→ { filename?: string; isDirectory: boolean; mimeType?: string; path: string; size?: number; type: "file" | "directory"; url?: string }[]`
```js
const root = sup.fs.list();
const avatars = sup.fs.list('/avatars');
```
Lists files and directories at the given path. When `path` is omitted, it lists the current chat's root storage directory.
Each returned item includes:
- `path`: Full path within the chat file store
- `type`: `"file"` or `"directory"`
- `isDirectory`: Convenience boolean
- `filename`: File name when present
- `mimeType`: MIME type for files when known
- `size`: File size in bytes when known
- `url`: Resolved URL for downloadable files when available
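Filtering the returned entries by `type` and `mimeType` is a common way to work with saved files. A sketch, assuming an `/avatars` directory like the one in the earlier write example:

```js
// List everything under /avatars and keep only image files.
function main() {
  const entries = sup.fs.list('/avatars');
  const images = entries.filter(
    e => e.type === 'file' && (e.mimeType || '').startsWith('image/')
  );
  if (images.length === 0) return 'No images saved yet';
  return images.map(e => e.path).join('\n');
}
```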
### sup.fs.mkdir()
#### `(path: string)` `→ void`
```js
sup.fs.mkdir('/exports');
sup.fs.mkdir('/avatars');
```
Creates an empty directory in the current chat's file store.
### sup.fs.move()
#### `(oldPath: string, newPath: string)` `→ void`
```js
sup.fs.move('/drafts/cover.png', '/published/cover.png');
```
Moves a file to a new path within the current chat.
### sup.fs.read()
#### `(path: string)` `→ SupImage | SupAudio | SupVideo | SupFile`
```js
const screenshot = sup.fs.read('/images/receipt.png');
return screenshot;
```
Reads a file and returns the appropriate Sup media object.
This throws if the file does not exist or the current patch does not have access to it.
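Because `read()` throws on missing files, it pairs naturally with `exists()`. A minimal sketch, reusing the `/avatars/frog.png` path from the example above:

```js
// Guard reads with exists() so a missing file produces a friendly
// message instead of an exception.
function main() {
  const path = '/avatars/frog.png';
  if (!sup.fs.exists(path)) {
    return 'Nothing saved at ' + path + ' yet';
  }
  return sup.fs.read(path);
}
```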
### sup.fs.stat()
#### `(path: string)` `→ null | { filename: string; isDirectory: false; mimeType: string; path: string; size: number; url: string }`
```js
const info = sup.fs.stat('/exports/report.pdf');
if (!info) return 'Missing report';
return `${info.filename} is ${info.size} bytes`;
```
Returns metadata for a file, or `null` if it does not exist or is not accessible.
### sup.fs.write()
#### `(path: string, file: string | SupImage | SupAudio | SupVideo | SupFile, options?: { mimeType?: string })` `→ SupImage | SupAudio | SupVideo | SupFile`
```js
const report = sup.file('https://example.com/report.json', 'application/json');
sup.fs.write('/reports/latest.json', report);
sup.fs.write('/reports/backup.json', 'https://example.com/report.json', {
mimeType: 'application/json'
});
```
Writes a file into the current chat's file store and returns the saved Sup file object.
You can write:
- URL strings
- `SupImage`
- `SupAudio`
- `SupVideo`
- `SupFile`
When writing a URL string, provide `options.mimeType` if you need a specific content type.
When cross-chat access is enabled for the current patch context, you can read from another chat with an absolute path such as `/sup/#03fer8dv70jah614p86zpc3ix/images/avatar.png`.
---
## Notes
- `sup.fs` supports cross-chat reads with absolute `/sup/#[chatId]/path` paths when the current chat type allows it.
- `sup.chat.fs` is always limited to the current chat.
- Writes are only allowed for the current chat.
## sup.chat.npc {#supchatnpc}
The `sup.chat.npc` property provides access to the NPC functionality in the current chat. NPCs are AI-powered chat participants that can be nudged, whispered to, or prompted for responses.
```js title="Example NPC Usage"
// Check if NPC is enabled
if (sup.chat.npc.enabled) {
// Get NPC info
console.log(`NPC: ${sup.chat.npc.displayName} (@${sup.chat.npc.username})`);
console.log(`Persona: ${sup.chat.npc.persona}`);
// Send a nudge to get the NPC's attention
sup.chat.npc.nudge();
// Send a private whisper message
sup.chat.npc.whisper("The user seems confused about the topic");
// Get a direct response from the NPC
const response = sup.chat.npc.prompt("What do you think about this topic?");
// Post a message directly as the NPC (with optional custom username)
sup.chat.npc.say("Hello from the NPC!", { username: "mr_npc" });
return response;
}
```
---
## Properties
### sup.chat.npc.enabled
#### `boolean`
```js
if (sup.chat.npc.enabled) {
// NPC functionality is available
}
```
Returns `true` if an NPC is enabled and active in this chat, `false` otherwise.
### sup.chat.npc.username
#### `string`
```js
const npcUsername = sup.chat.npc.username; // "assistant"
```
The NPC's username. Defaults to `"npc"` if not configured.
### sup.chat.npc.displayName
#### `string`
```js
const npcName = sup.chat.npc.displayName; // "Chat Assistant"
```
The NPC's display name. Falls back to the username if no display name is set.
### sup.chat.npc.pfp
#### `SupImage` `| undefined`
```js
const npcAvatar = sup.chat.npc.pfp;
if (npcAvatar) {
return npcAvatar; // Display the NPC's profile picture
}
```
The NPC's profile picture as a `SupImage`, or `undefined` if no avatar is set.
### sup.chat.npc.persona
#### `string`
```js
const npcPersonality = sup.chat.npc.persona;
// "I am a helpful assistant focused on coding and technical discussions"
```
The NPC's personality/persona prompt that defines how it behaves and responds.
---
## Methods
### sup.chat.npc.nudge()
#### `() → void`
```js
function main() {
if (sup.user.username === "alice") {
// Nudge the NPC to join the conversation
sup.chat.npc.nudge();
return "Called the assistant to help!";
}
}
```
Sends a nudge to the NPC, prompting it to participate in the conversation. The NPC will typically respond with a contextually appropriate message.
**Important:**
- Executes after the patch has fully completed, allowing the NPC to comment on or respond to the patch output
- Either `nudge()` or `whisper()` can be called once per patch execution, but not both
### sup.chat.npc.whisper()
#### `(content: string) → void`
```js
function main() {
if (sup.input.text?.includes("help")) {
// Send private context to the NPC
sup.chat.npc.whisper("The user is asking for help with: " + sup.input.text);
return "I've alerted the assistant about your question.";
}
}
```
Sends a private whisper message to the NPC. This message is not visible to other chat participants but provides context for the NPC's next response.
**Parameters:**
- `content` (string): The whisper message to send to the NPC
**Important:**
- Executes after the patch has fully completed, allowing the NPC to comment on or respond to the patch output
- Either `nudge()` or `whisper()` can be called once per patch execution, but not both
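Since only one of `nudge()` or `whisper()` may run per execution, a patch can pick whichever fits the situation: whisper when there is useful context to pass along, nudge otherwise. A sketch:

```js
// Whisper context when the user typed something; otherwise just nudge.
function main() {
  if (!sup.chat.npc.enabled) return 'No NPC in this chat';
  const text = sup.input.text;
  if (text) {
    sup.chat.npc.whisper('The user just said: ' + text);
  } else {
    sup.chat.npc.nudge();
  }
  return 'The assistant will reply shortly.';
}
```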
### sup.chat.npc.prompt()
#### `(userPrompt: string, options?: SupAIPromptOptions) → string | object`
```js
function main() {
if (!sup.chat.npc.enabled) {
return "No NPC available in this chat";
}
// Get a direct response from the NPC
const analysis = sup.chat.npc.prompt(
"Analyze this user's coding question and provide suggestions",
{ temperature: 0.7 }
);
return analysis;
}
```
Directly prompts the NPC for a response using its configured persona. This is synchronous and returns the NPC's response immediately.
**Parameters:**
- `userPrompt` (string): The prompt/question to send to the NPC
- `options` (optional): AI prompt options like temperature, schema, etc.
**Returns:** The NPC's response as a string, or structured object if using schema
**Note:** The NPC will respond as its configured persona, and the system prompt includes both the NPC's display name and personality description.
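The `options` parameter can also request structured output. A sketch; the exact schema format is defined by `SupAIPromptOptions`, so the shape used here is an assumption for illustration:

```js
// Ask the NPC for a structured rating instead of free text.
function main() {
  if (!sup.chat.npc.enabled) return 'No NPC available';
  const review = sup.chat.npc.prompt(
    'Rate this message from 1-5 and explain why: ' + (sup.input.text || ''),
    { schema: { rating: 'number', reason: 'string' } } // hypothetical schema shape
  );
  return `Rating: ${review.rating}/5. ${review.reason}`;
}
```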
### sup.chat.npc.say()
#### `(firstArg?: string | SupImage | SupAudio | SupVideo, ...args: Array<SupImage | SupAudio | SupVideo | { pfp?: string | SupImage; username?: string }>) → void`
```js
function main() {
if (!sup.chat.npc.enabled) {
return "No NPC available in this chat";
}
// Post a simple text message as the NPC
sup.chat.npc.say("Hello! I'm here to help.");
// Post a message with a custom profile picture (URL string)
sup.chat.npc.say("Check out this update!", {
pfp: "https://example.com/custom-avatar.png"
});
// Post a message with a custom profile picture (SupImage)
const customAvatar = sup.image("https://example.com/custom-avatar.png");
sup.chat.npc.say("I'm using a custom avatar!", { pfp: customAvatar });
// Post a message with a custom username
sup.chat.npc.say("This message appears from a different username!", {
username: "custom_bot"
});
// Post a message with both custom pfp and username
sup.chat.npc.say("I'm a completely different character!", {
pfp: "https://example.com/other-avatar.png",
username: "other_character"
});
// Post a message with attachments
const image = sup.input.image;
if (image) {
sup.chat.npc.say("Here's the image you shared:", image);
}
// Post a message with attachments and custom identity
sup.chat.npc.say("Here's an image from me:", image, {
pfp: "https://example.com/custom-avatar.png",
username: "image_bot"
});
}
```
Posts a message directly as the NPC without AI processing. This allows you to send messages on behalf of the NPC with full control over the content.
**Parameters:**
- `firstArg` (optional): The message content as a string, or an attachment (`SupImage`, `SupAudio`, or `SupVideo`)
- `...args` (optional): Additional attachments or an options object with:
- `pfp` (optional): A custom profile picture for this message. Can be a URL string or a `SupImage` object. If not provided, uses the NPC's default avatar.
- `username` (optional): A custom username for this message. Allows the NPC to post as a different identity. If not provided, uses the NPC's default username.
**Important:**
- The message is posted immediately without AI processing
- The custom `pfp` and `username` only apply to this specific message; subsequent messages will use the NPC's defaults unless overridden again
- At least one argument (text or attachment) must be provided
- This method executes immediately, unlike `nudge()` and `whisper()` which execute after patch completion
---
## Usage Notes
- NPC functionality must be enabled in the chat for these methods to work
- `nudge()` and `whisper()` execute after patch completion, enabling them to respond to the patch output
- Either `nudge()` or `whisper()` can be called once per patch execution (not both)
- `prompt()` can be called multiple times but requires the NPC to be enabled
- `say()` executes immediately and allows posting messages directly as the NPC without AI processing
- The custom `pfp` and `username` parameters in `say()` only apply to that specific message
- The NPC uses its configured persona to maintain consistent personality across interactions
- NPCs are great for providing contextual assistance, moderation, or interactive chat experiences
## Error Handling
```js
function main() {
try {
if (sup.chat.npc.enabled) {
return sup.chat.npc.prompt("Hello!");
} else {
return "NPC is not enabled in this chat";
}
} catch (error) {
return "Error communicating with NPC: " + error.message;
}
}
```
Always check `sup.chat.npc.enabled` before calling NPC methods to avoid errors.
## sup.blockchain {#supblockchain}
The `sup.blockchain` package provides read-only access to EVM-compatible blockchain data. It supports Ethereum mainnet and major Layer 2 networks including Base, Optimism, Arbitrum, Polygon, and Zora.
```js title="Example sup.blockchain Usage"
// Get current block number
const blockNum = sup.blockchain.eth.blockNumber();
// Get ETH balance (vitalik.eth)
const balance = sup.blockchain.eth.balance('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045');
const formatted = sup.blockchain.eth.format(balance);
// Get token price
const ethPrice = sup.blockchain.eth.price('ETH');
console.log('ETH price:', ethPrice.toString());
// Read from a smart contract
const usdc = sup.blockchain.eth.erc20('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48');
const symbol = usdc.symbol(); // "USDC"
const decimals = usdc.decimals(); // 6
// Use other chains
const baseBalance = sup.blockchain.eth.chain('base').balance('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045');
```
---
## Security Notice
All blockchain operations are **read-only**. There is no transaction signing, private key handling, or write operations. This API is designed for safely querying blockchain data without any risk of funds movement.
---
## Properties
### sup.blockchain.eth
#### `SupBlockchainChain`
The primary interface for Ethereum mainnet operations. All methods are available directly on this object.
---
## Chain Information
### sup.blockchain.eth.blockNumber()
#### `→ string`
```js
const blockNum = sup.blockchain.eth.blockNumber();
console.log('Current block:', blockNum);
```
Returns the current block number for the chain as a string.
---
## Balances
### sup.blockchain.eth.balance()
#### `(address: string, token?: string)` `→ string`
```js
// Get native ETH balance (vitalik.eth)
const ethBalance = sup.blockchain.eth.balance('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045');
console.log('ETH:', sup.blockchain.eth.format(ethBalance));
// Get ERC20 token balance
const usdcBalance = sup.blockchain.eth.balance(
'0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045',
'0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'
);
console.log('USDC:', sup.blockchain.eth.format(usdcBalance, 6));
```
Get wallet balance for native token (ETH) or any ERC20 token. Returns the balance as a string in the smallest unit (wei for ETH, or token's base unit).
**Parameters:**
- `address` (string): Wallet address to query
- `token` (string, optional): ERC20 token contract address. Omit for native token.
**Returns:** Balance as string in wei/smallest unit
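`balance()`, `format()`, and `price()` compose into a quick USD estimate for a wallet. A sketch; because it goes through `toNumber()`, treat the result as approximate rather than exact:

```js
// Report a wallet's approximate ETH holdings in USD.
function main() {
  const address = sup.input.text;
  if (!sup.blockchain.eth.isAddress(address)) return 'Invalid address';
  const wei = sup.blockchain.eth.balance(address);
  const eth = sup.blockchain.eth.format(wei); // defaults to 18 decimals
  const usd = sup.blockchain.eth.price('ETH').toNumber() * eth;
  return `${eth.toFixed(4)} ETH ≈ $${usd.toFixed(2)}`;
}
```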
---
## Token Prices
### sup.blockchain.eth.price()
#### `(contractOrTicker: string)` `→ SupBigNumber`
```js
// Get price by ticker symbol
const ethPrice = sup.blockchain.eth.price('ETH');
console.log('ETH price:', ethPrice.toString());
// Get price by contract address
const usdcPrice = sup.blockchain.eth.price('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48');
console.log('USDC price:', usdcPrice.toFixed(6));
// Perform calculations with precision
const doubled = ethPrice.times(2);
const ethAmount = new SupBigNumber('1.5');
const totalValue = ethAmount.times(ethPrice);
console.log('Value of 1.5 ETH:', totalValue.toFixed(2), 'USD');
```
Get current USD price for a token from DeFiLlama. Returns a `SupBigNumber` with 18 decimal precision for mathematical operations.
**Parameters:**
- `contractOrTicker` (string): Either a contract address (starting with `0x`) or a ticker symbol (like `ETH`, `USDC`, `BTC`)
**Returns:** `SupBigNumber` with the USD price
See the SupBigNumber documentation for available operations.
---
## Smart Contract Reads
### sup.blockchain.eth.contract()
#### `(address: string, abi: any[], functionName: string, args?: any[])` `→ any`
```js
const balance = sup.blockchain.eth.contract(
'0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48',
[
{
"constant": true,
"inputs": [{"name": "_owner", "type": "address"}],
"name": "balanceOf",
"outputs": [{"name": "balance", "type": "uint256"}],
"type": "function"
}
],
'balanceOf',
['0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045']
);
```
Read data from any smart contract by providing the ABI and function details.
**Parameters:**
- `address` (string): Contract address
- `abi` (any[]): Array containing the function ABI
- `functionName` (string): Name of the function to call
- `args` (any[], optional): Array of arguments to pass to the function
**Returns:** The function's return value (type depends on the contract function)
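The same pattern works for view functions that take no arguments; here, a sketch reading the ERC20 `symbol()` from the USDC contract with a raw `contract()` call:

```js
// Read a zero-argument view function via the raw contract() API.
function main() {
  const erc20SymbolAbi = [{
    constant: true,
    inputs: [],
    name: 'symbol',
    outputs: [{ name: '', type: 'string' }],
    type: 'function'
  }];
  const symbol = sup.blockchain.eth.contract(
    '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48', // USDC
    erc20SymbolAbi,
    'symbol'
  );
  return `Token symbol: ${symbol}`;
}
```

For standard tokens, the `erc20()` wrapper below does this for you without hand-written ABIs.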
---
## ERC20 Tokens
### sup.blockchain.eth.erc20()
#### `(address: string)` `→ SupERC20`
```js
const usdc = sup.blockchain.eth.erc20('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48');
// Get token info
const symbol = usdc.symbol(); // "USDC"
const name = usdc.name(); // "USD Coin"
const decimals = usdc.decimals(); // 6
const supply = usdc.totalSupply(); // Total supply in base units
// Check balances
const balance = usdc.balanceOf('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045');
// Check allowances
const allowance = usdc.allowance(
'0xOwnerAddress...',
'0xSpenderAddress...'
);
```
Convenience wrapper for ERC20 token operations. Returns an object with methods that automatically use the ERC20 ABI.
**Returns:** Object with the following methods:
#### SupERC20 Methods
- `balanceOf(account: string)` → `string` - Get token balance for an address
- `totalSupply()` → `string` - Get total token supply
- `decimals()` → `number` - Get number of decimals
- `symbol()` → `string` - Get token symbol
- `name()` → `string` - Get token name
- `allowance(owner: string, spender: string)` → `string` - Get approved spending amount
---
### sup.blockchain.eth.token()
#### `(address: string)` `→ TokenMetadata`
```js
const metadata = sup.blockchain.eth.token('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48');
console.log(`${metadata.symbol}: ${metadata.name}`);
console.log(`Decimals: ${metadata.decimals}`);
console.log(`Total Supply: ${metadata.totalSupply}`);
```
Get comprehensive metadata for an ERC20 token in a single call.
**Returns:** Object with:
- `name` (string): Token name
- `symbol` (string): Token symbol
- `decimals` (number): Number of decimals
- `totalSupply` (string): Total supply in base units
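Pairing `token()` with `format()` yields a human-readable total supply in one short step. A sketch using the USDC address from above:

```js
// Fetch token metadata and format the raw base-unit supply.
function main() {
  const usdcAddress = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48';
  const meta = sup.blockchain.eth.token(usdcAddress);
  const supply = sup.blockchain.eth.format(meta.totalSupply, meta.decimals);
  return `${meta.name} (${meta.symbol}) supply: ${supply.toFixed(0)}`;
}
```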
---
## ERC721 NFTs
### sup.blockchain.eth.erc721()
#### `(address: string)` `→ SupERC721`
```js
const blitmap = sup.blockchain.eth.erc721('0x8d04a8c79cEB0889Bdd12acdF3Fa9D207eD3Ff63');
// Get NFT info
const name = blitmap.name(); // "Blitmap"
const symbol = blitmap.symbol(); // "BLIT"
const supply = blitmap.totalSupply(); // "1700"
// Check ownership
const owner = blitmap.ownerOf(1); // Owner address of token #1
const balance = blitmap.balanceOf('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045');
// Get metadata
const uri = blitmap.tokenURI(1); // Metadata URI for token #1
// Check approvals
const approved = blitmap.getApproved(1); // Address approved for token #1
const isApproved = blitmap.isApprovedForAll('0xOwner...', '0xOperator...');
```
Convenience wrapper for ERC721 NFT operations.
**Returns:** Object with the following methods:
#### SupERC721 Methods
- `balanceOf(owner: string)` → `string` - Get NFT count for an address
- `ownerOf(tokenId: number | string)` → `string` - Get owner of a specific token
- `tokenURI(tokenId: number | string)` → `string` - Get metadata URI
- `name()` → `string` - Get collection name
- `symbol()` → `string` - Get collection symbol
- `totalSupply()` → `string` - Get total number of tokens
- `getApproved(tokenId: number | string)` → `string` - Get approved address for token
- `isApprovedForAll(owner: string, operator: string)` → `boolean` - Check operator approval
---
## ERC1155 Multi-Tokens
### sup.blockchain.eth.erc1155()
#### `(address: string)` `→ SupERC1155`
```js
const nft = sup.blockchain.eth.erc1155('0x495f947276749ce646f68ac8c248420045cb7b5e');
// Check balance for specific token ID
const balance = nft.balanceOf('0xd8dA6BF26964aF9D7eEd9e03E53415D37aA96045', 1);
// Check multiple balances at once
const balances = nft.balanceOfBatch(
['0xAddress1...', '0xAddress2...'],
[1, 2]
);
// Get metadata URI
const uri = nft.uri(1);
// Check approvals
const isApproved = nft.isApprovedForAll('0xOwner...', '0xOperator...');
```
Convenience wrapper for ERC1155 multi-token operations.
**Returns:** Object with the following methods:
#### SupERC1155 Methods
- `balanceOf(account: string, id: number | string)` → `string` - Get balance for specific token ID
- `balanceOfBatch(accounts: string[], ids: (number | string)[])` → `string[]` - Get multiple balances
- `uri(id: number | string)` → `string` - Get metadata URI for token ID
- `isApprovedForAll(owner: string, operator: string)` → `boolean` - Check operator approval
---
## Transactions
### sup.blockchain.eth.tx()
#### `(hash: string)` `→ Transaction`
```js
const tx = sup.blockchain.eth.tx('0x123...');
console.log('From:', tx.from);
console.log('To:', tx.to);
console.log('Value:', tx.value);
console.log('Block:', tx.blockNumber);
```
Get details about a transaction by its hash.
**Returns:** Transaction object with properties like `from`, `to`, `value`, `gas`, `blockNumber`, etc.
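Since `value` is reported in wei, run it through `format()` before display. A sketch, assuming the user pastes a transaction hash as input:

```js
// Summarize a transaction with a human-readable ETH amount.
function main() {
  const hash = sup.input.text;
  const tx = sup.blockchain.eth.tx(hash);
  const eth = sup.blockchain.eth.format(tx.value);
  return `${tx.from} sent ${eth.toFixed(4)} ETH to ${tx.to}`;
}
```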
---
## Utility Functions
### sup.blockchain.eth.format()
#### `(value: string | number, decimals?: number)` `→ number`
```js
const balance = sup.blockchain.eth.balance(address);
const formatted = sup.blockchain.eth.format(balance, 18);
console.log('Balance:', formatted, 'ETH');
// USDC has 6 decimals
const usdcBalance = sup.blockchain.eth.balance(address, usdcAddress);
const usdcFormatted = sup.blockchain.eth.format(usdcBalance, 6);
console.log('Balance:', usdcFormatted, 'USDC');
```
Convert from wei/smallest unit to human-readable value.
**Parameters:**
- `value` (string | number): Value in wei or smallest unit
- `decimals` (number, optional): Token decimals (default: 18 for ETH)
**Returns:** Formatted value as JavaScript `number`
**Note:** Since this returns a JavaScript `Number`, precision may be lost for very large values. For precise calculations, use `SupBigNumber` operations instead.
---
### sup.blockchain.eth.parse()
#### `(value: number, decimals?: number)` `→ string`
```js
const amount = sup.blockchain.eth.parse(1.5, 18);
console.log('1.5 ETH in wei:', amount);
const usdcAmount = sup.blockchain.eth.parse(100, 6);
console.log('100 USDC in base units:', usdcAmount);
```
Convert from human-readable value to wei/smallest unit.
**Parameters:**
- `value` (number): Human-readable value
- `decimals` (number, optional): Token decimals (default: 18 for ETH)
**Returns:** Value in wei/smallest unit as string
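`parse()` is the inverse of `format()`, so a value should round-trip between the two (within floating-point precision). A sketch:

```js
// Convert 1.5 ETH to wei and back again.
function main() {
  const wei = sup.blockchain.eth.parse(1.5, 18);
  const back = sup.blockchain.eth.format(wei, 18);
  return `1.5 ETH is ${wei} wei (round-trips to ${back})`;
}
```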
---
### sup.blockchain.eth.isAddress()
#### `(address: string)` `→ boolean`
```js
if (sup.blockchain.eth.isAddress(userInput)) {
const balance = sup.blockchain.eth.balance(userInput);
// ...
} else {
return "Invalid Ethereum address";
}
```
Validate Ethereum address format (checks for 0x prefix and 40 hex characters).
**Parameters:**
- `address` (string): Address to validate
**Returns:** `true` if valid Ethereum address format, `false` otherwise
---
## Multi-Chain Support
### sup.blockchain.eth.chain()
#### `(chainName: string)` `→ SupBlockchainChain`
```js
// Access Base L2
const baseBalance = sup.blockchain.eth.chain('base').balance(address);
const baseBlock = sup.blockchain.eth.chain('base').blockNumber();
// Access Optimism
const opUsdc = sup.blockchain.eth.chain('optimism').erc20('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85');
const symbol = opUsdc.symbol();
// Access Arbitrum
const arbPrice = sup.blockchain.eth.chain('arbitrum').price('ETH');
// Access Polygon
const polyBalance = sup.blockchain.eth.chain('polygon').balance(address);
// Access Zora
const zoraBlock = sup.blockchain.eth.chain('zora').blockNumber();
```
Access Layer 2 and other EVM-compatible chains with the same API interface.
**Supported chains:**
- `base` - Base
- `optimism` - Optimism
- `arbitrum` - Arbitrum One
- `polygon` - Polygon PoS
- `zora` - Zora
**Returns:** Chain-specific blockchain operations with the same API as `sup.blockchain.eth`
All methods available on `sup.blockchain.eth` are also available on the returned chain object:
- `blockNumber()`
- `balance(address, token?)`
- `price(contractOrTicker)`
- `contract(address, abi, functionName, args?)`
- `erc20(address)`
- `erc721(address)`
- `erc1155(address)`
- `token(address)`
- `tx(hash)`
- `format(value, decimals?)`
- `parse(value, decimals?)`
- `isAddress(address)`
---
## SupBigNumber
The `SupBigNumber` class is returned by `price()` and provides precision-safe mathematical operations for financial calculations.
### Why SupBigNumber?
JavaScript's native `Number` type loses precision with large numbers or many decimal places. For financial data like token prices, this can lead to incorrect calculations. `SupBigNumber` uses JavaScript's native `bigint` for fixed-point arithmetic to maintain exact values within a specified decimal precision (default 18 decimals).
```js
// Problem with native numbers:
const rawPrice = 0.999663;
const rawAmount = 1000000;
console.log(rawPrice * rawAmount); // 999663.0000000001 (precision lost)
// Solution with SupBigNumber:
const price = sup.blockchain.eth.price('USDC');
const amount = new SupBigNumber('1000000');
console.log(price.times(amount).toString()); // Exact result
```
### Constructor
```js
// Create from various sources
const bn1 = new SupBigNumber('123.456');
const bn2 = new SupBigNumber(123.456);
const bn3 = new SupBigNumber('1000000000000000000', 18); // With decimals
// From price API (returns SupBigNumber)
const ethPrice = sup.blockchain.eth.price('ETH');
```
### Methods
#### Arithmetic Operations
```js
const a = new SupBigNumber('10.5');
const b = new SupBigNumber('2.25');
a.plus(b); // Addition: 12.75
a.minus(b); // Subtraction: 8.25
a.times(b); // Multiplication: 23.625
a.div(b); // Division: 4.666...
```
#### Conversion & Formatting
```js
const num = new SupBigNumber('123.456789');
num.toString(); // "123.456789"
num.toFixed(2); // "123.46"
num.toFixed(0); // "123"
num.toNumber(); // 123.456789 (JavaScript number)
```
#### Comparisons
```js
const a = new SupBigNumber('10');
const b = new SupBigNumber('20');
a.gt(b); // false (greater than)
a.gte(b); // false (greater than or equal)
a.lt(b); // true (less than)
a.lte(b); // true (less than or equal)
a.eq(b); // false (equal)
```
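Comparisons make threshold checks straightforward for price alerts. A sketch; the $3000 threshold is arbitrary:

```js
// Alert when ETH crosses a fixed USD threshold.
function main() {
  const ethPrice = sup.blockchain.eth.price('ETH');
  const threshold = new SupBigNumber('3000');
  return ethPrice.gt(threshold)
    ? `ETH is above $3000 ($${ethPrice.toFixed(2)})`
    : `ETH is at or below $3000 ($${ethPrice.toFixed(2)})`;
}
```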
### Example: Calculate Portfolio Value
```js
function main() {
// Get current prices
const ethPrice = sup.blockchain.eth.price('ETH');
const btcPrice = sup.blockchain.eth.price('BTC');
// Define holdings
const ethHolding = new SupBigNumber('2.5');
const btcHolding = new SupBigNumber('0.1');
// Calculate values
const ethValue = ethPrice.times(ethHolding);
const btcValue = btcPrice.times(btcHolding);
const totalValue = ethValue.plus(btcValue);
return [
`ETH: ${ethHolding.toString()} × $${ethPrice.toFixed(2)} = $${ethValue.toFixed(2)}`,
`BTC: ${btcHolding.toString()} × $${btcPrice.toFixed(2)} = $${btcValue.toFixed(2)}`,
`Total: $${totalValue.toFixed(2)}`
];
}
```
---
## Complete Examples
### Check Multiple Token Balances
```js
function main() {
const address = sup.input.text;
if (!sup.blockchain.eth.isAddress(address)) {
return "Please provide a valid Ethereum address";
}
// Native ETH
const ethBalance = sup.blockchain.eth.balance(address);
const ethFormatted = sup.blockchain.eth.format(ethBalance);
// USDC
const usdcAddress = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48';
const usdcBalance = sup.blockchain.eth.balance(address, usdcAddress);
const usdcFormatted = sup.blockchain.eth.format(usdcBalance, 6);
// USDT
const usdtAddress = '0xdAC17F958D2ee523a2206206994597C13D831ec7';
const usdtBalance = sup.blockchain.eth.balance(address, usdtAddress);
const usdtFormatted = sup.blockchain.eth.format(usdtBalance, 6);
return [
`Address: ${address}`,
`ETH: ${ethFormatted.toFixed(4)}`,
`USDC: ${usdcFormatted.toFixed(2)}`,
`USDT: ${usdtFormatted.toFixed(2)}`
];
}
```
### Track Token Price
```js
function main() {
  const ticker = sup.input.text || 'ETH';
  try {
    const price = sup.blockchain.eth.price(ticker);
    const priceNum = price.toNumber();
    // Store historical data
    const history = sup.get('priceHistory') || [];
    history.push({
      ticker,
      price: price.toString(),
      timestamp: Date.now()
    });
    // Keep last 10 prices
    if (history.length > 10) history.shift();
    sup.set('priceHistory', history);
    // Calculate change if we have previous data
    let change = '';
    if (history.length > 1) {
      const prev = new SupBigNumber(history[history.length - 2].price);
      const diff = price.minus(prev);
      const pct = diff.div(prev).times(100);
      change = ` (${pct.toFixed(2)}%)`;
    }
    return `${ticker}: $${price.toFixed(priceNum < 10 ? 4 : 2)}${change}`;
  } catch (error) {
    return `Could not find price for ${ticker}`;
  }
}
```
### Multi-Chain Balance Checker
```js
function main() {
  const address = sup.input.text;
  if (!sup.blockchain.eth.isAddress(address)) {
    return "Invalid address";
  }
  const chains = [
    { name: 'Ethereum', chain: null },
    { name: 'Base', chain: 'base' },
    { name: 'Optimism', chain: 'optimism' },
    { name: 'Arbitrum', chain: 'arbitrum' },
    { name: 'Polygon', chain: 'polygon' }
  ];
  const results = chains.map(({ name, chain }) => {
    try {
      const balance = chain
        ? sup.blockchain.eth.chain(chain).balance(address)
        : sup.blockchain.eth.balance(address);
      const formatted = sup.blockchain.eth.format(balance);
      return `${name}: ${formatted.toFixed(4)} ETH`;
    } catch (error) {
      return `${name}: Error`;
    }
  });
  return ['Balance across chains:', ...results];
}
```
### NFT Ownership Checker
```js
function main() {
  const [collection, tokenId] = (sup.input.text || '').split('/');
  if (!collection || !tokenId) {
    return "Usage: <collection address>/<token ID>";
  }
  if (!sup.blockchain.eth.isAddress(collection)) {
    return "Invalid collection address";
  }
  try {
    const nft = sup.blockchain.eth.erc721(collection);
    const name = nft.name();
    const symbol = nft.symbol();
    const owner = nft.ownerOf(tokenId);
    const uri = nft.tokenURI(tokenId);
    return [
      `Collection: ${name} (${symbol})`,
      `Token ID: ${tokenId}`,
      `Owner: ${owner}`,
      `Metadata: ${uri}`
    ];
  } catch (error) {
    return `Error: ${error.message}`;
  }
}
```
---
## Notes
- All operations are read-only - no transaction signing or private keys
- Prices are fetched from DeFiLlama's price API
- Price data is stored with 18 decimal precision
- Use `SupBigNumber` for precise financial calculations
- `format()` returns JavaScript `Number` which may lose precision for very large values
- Contract reads require valid ABI definitions
- Address validation only checks format, not if the address exists on-chain
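The precision caveat for `format()` comes straight from JavaScript itself: a full 18-decimal wei balance carries more significant digits than a `Number` (an IEEE 754 double, ~15-17 significant digits) can represent, which is why `SupBigNumber` exists. A self-contained illustration using only plain JavaScript, no Sup APIs:

```js
// JavaScript Numbers carry ~15-17 significant decimal digits, so a full
// 18-decimal wei balance cannot round-trip through Number exactly.
const wei = 123456789012345678901n;  // 21 significant digits as a BigInt
const asNumber = Number(wei);        // rounded to the nearest representable double
const roundTrip = BigInt(asNumber);
console.log(roundTrip === wei);      // false - the low-order digits were lost
```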
## sup.ai.video {#supaivideo}
The `sup.ai.video` package provides AI-powered video generation capabilities using Google's Veo models. It is accessed through `sup.ai.video`.
```js title="Example sup.ai.video Usage"
// Generate a video from text prompt
const video = sup.ai.video.create("a cat playing with a ball of yarn");
// Generate with custom options
const hdVideo = sup.ai.video.create(
  "sunset over ocean with waves",
  {
    duration: 8,
    aspectRatio: "16:9",
    resolution: "1080p",
    model: "best"
  }
);
// Generate from an image
const imageToVideo = sup.ai.video.create(
  sup.input.image,
  "animate this image with gentle movement"
);
```
---
## Methods
### sup.ai.video.create()
#### `(prompt: string | SupImage | SupVideo, description?: string, options?: VideoCreateOptions)` `→ SupVideo`
```js
// Simple text-to-video
const video = sup.ai.video.create("a bird flying through clouds");
// With custom duration and aspect ratio
const verticalVideo = sup.ai.video.create("dancing robot", {
  duration: 6,
  aspectRatio: "9:16",
  resolution: "720p"
});
// Image-to-video
const animatedVideo = sup.ai.video.create(
  sup.input.image,
  "make this image come alive with subtle animation"
);
```
Creates a video using AI based on a text prompt, image, or existing video.
**Parameters:**
- `prompt` (string | SupImage | SupVideo):
  - **Text prompt**: Description of the video to generate
  - **SupImage**: Starting image to animate
  - **SupVideo**: Existing video to build on
- `description` (optional string): When using image or video input, provide a text description
- `options` (optional object):
  - `duration` (number): Length of video in seconds (default varies by model)
  - `aspectRatio` (string): Video dimensions:
    - `"16:9"`: Landscape (default)
    - `"9:16"`: Portrait
  - `resolution` (string): Output quality:
    - `"720p"`: HD quality (default)
    - `"1080p"`: Full HD quality
  - `model` (string): Which Veo model to use:
    - `"fast"`: Fast generation with Veo 3.1
    - `"best"`: Highest quality with Veo 3.1 (default)
    - `"veo-3.1-fast-generate-001"`: Explicit fast model
    - `"veo-3.1-generate-001"`: Explicit best model
    - `"veo-3.1-fast-generate-preview"` / `"veo-3.0-fast-generate-preview"`: Legacy aliases mapped to the fast GA endpoint
    - `"veo-3.1-generate-preview"` / `"veo-3.0-generate-preview"` / `"veo-2.0-generate-preview"` / `"veo-2.0-generate-exp"`: Legacy aliases mapped to the best GA endpoint
**Returns:** A `SupVideo` object containing the generated video
**Examples:**
```js
// Portrait video for social media
const socialVideo = sup.ai.video.create(
  "person dancing to upbeat music",
  {
    aspectRatio: "9:16",
    duration: 8,
    resolution: "1080p"
  }
);
// Quick preview with fast model
const quickPreview = sup.ai.video.create(
  "car driving down highway",
  {
    model: "fast",
    duration: 4
  }
);
// Animate a still image
const animatedPainting = sup.ai.video.create(
  artworkImage,
  "bring this painting to life with gentle movements and lighting changes"
);
```
### sup.ai.video.interpret()
#### `(...args: (string | SupVideo)[])` `→ string`
```js
const video = sup.input.video;
const description = sup.ai.video.interpret(video);
// With a custom prompt
const analysis = sup.ai.video.interpret(
  video,
  "What actions are happening in this video?"
);
```
Analyzes a video using AI and returns a text description. Uses Gemini 3 Flash for multimodal video understanding.
**Parameters:**
- `video` (SupVideo): The video to analyze
- `prompt` (optional string): Custom instructions for the AI analysis. If not provided, uses a default prompt.
**Returns:** A string containing the AI's interpretation of the video
**Examples:**
```js
// Basic video description
const video = sup.input.video;
const description = sup.ai.video.interpret(video);
// Analyze specific aspects
const actions = sup.ai.video.interpret(
  video,
  "Describe the main actions and movements in this video."
);
// Transcribe speech in video
const transcript = sup.ai.video.interpret(
  video,
  "Provide a transcript of all spoken words in this video."
);
```
---
## Notes
- Generated videos are returned as `SupVideo` objects which can be displayed by returning them from `main()`.
- The Veo models excel at understanding natural language descriptions and generating realistic motion.
- Video generation can take longer than image generation depending on duration and quality settings.
- For best results, provide detailed descriptions including camera movement, lighting, and action.
- The `"best"` model produces higher quality but takes longer to generate than `"fast"`.
- Image-to-video can be used to animate still images with motion.
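The two methods also compose. As an illustrative sketch (the prompt wording and "remix" behavior here are assumptions, not a prescribed pattern), a patch could describe an incoming video with `interpret()` and then feed that description back into `create()` to generate a stylized variation:

```js
// Hypothetical patch: describe the input video, then generate a new clip
// in a different visual style based on that description.
function main() {
  const source = sup.input.video;
  if (!source) {
    return "Send a video to remix";
  }
  const description = sup.ai.video.interpret(
    source,
    "Describe the scene, subjects, and camera movement in one sentence."
  );
  return sup.ai.video.create(`${description} Rendered as a watercolor animation.`, {
    model: "fast",
    aspectRatio: "16:9"
  });
}
```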
## sup.ai.audio {#supaiaudio}
The `sup.ai.audio` package provides AI-powered audio understanding and interpretation capabilities. It is accessed through `sup.ai.audio`.
```js title="Example sup.ai.audio Usage"
// Load an audio file
const audio = sup.audio("audio.mp3");
// Basic audio interpretation
const description = sup.ai.audio.interpret(audio);
console.log(description); // "A person speaking clearly about technology with background music..."
// Custom prompt for specific analysis
const instruments = sup.ai.audio.interpret(
  audio,
  "What musical instruments can you hear in this audio?"
);
// Analyze speech content
const transcript = sup.ai.audio.interpret(
  audio,
  "Please provide a detailed transcript of the speech in this audio."
);
```
---
## Methods
### sup.ai.audio.interpret()
#### `(audio: SupAudio, prompt?: string)` `→ string`
```js
const audio = sup.audio("audio.wav");
const description = sup.ai.audio.interpret(audio);
```
Converts audio to text using AI audio understanding. This method can transcribe speech, describe sound effects, identify musical instruments, and provide detailed audio analysis.
**Parameters:**
- `audio` (SupAudio): The audio file to analyze
- `prompt` (optional string): Custom instructions for the AI analysis. If not provided, uses a default prompt that provides both transcription and audio description.
**Returns:** A string containing the AI's interpretation of the audio
**Examples:**
```js
// Basic interpretation - transcribes speech and describes audio
const audio = sup.audio("recording.mp3");
const description = sup.ai.audio.interpret(audio);
// Custom analysis for music
const musicAnalysis = sup.ai.audio.interpret(
  audio,
  "Identify the musical genre, instruments, and mood of this audio."
);
// Focus on speech transcription
const transcript = sup.ai.audio.interpret(
  audio,
  "Please provide an accurate transcript of all spoken words in this audio."
);
// Identify sound effects
const soundEffects = sup.ai.audio.interpret(
  audio,
  "What sound effects or non-speech audio can you hear? Describe them in detail."
);
```
---
## Notes
- The AI model used for audio interpretation is Gemini 3 Flash, which supports both speech transcription and general audio understanding.
- Audio files can be loaded using `sup.audio()` with URLs or file paths.
- The default behavior (when no custom prompt is provided) includes both speech transcription and detailed audio description.
- Supported audio formats include MP3, WAV, M4A, and other common audio formats.
- For best results with speech transcription, use clear audio with minimal background noise.
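Putting these notes together, a minimal transcription patch might guard for missing input and fall back to a usage hint. This is a sketch, not a prescribed pattern; the prompt text and fallback message are illustrative:

```js
// Hypothetical transcription patch: transcribes attached audio, or
// explains what to send when no audio is attached.
function main() {
  const audio = sup.input.audio;
  if (!audio) {
    return "Attach an audio clip and I'll transcribe it";
  }
  return sup.ai.audio.interpret(
    audio,
    "Provide an accurate transcript of all spoken words."
  );
}
```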