What it does
Upload every extracted frame to the OpenAI Files API so Custom GPTs, Projects, Assistants file-search, and the Responses API can reference the frames by `file_id` without re-uploading each turn. Frames go up with `purpose=vision` (the bucket image inputs use); the manifest ships with `purpose=assistants` so it's discoverable by file-search tools. No SDK — native fetch + FormData. Transcripts ride along inside the manifest when transcription is enabled.
When to reach for it
- Wire peepshow into a Custom GPT / Assistant file-search flow without a per-turn upload step
- Stage frames for a Responses API call where `input_image` content parts reference `file_id`
- Team workflows where an `OPENAI_FILES_ORG` header pins uploads to a shared organisation
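For the Responses API case above, a staged `file_id` slots straight into an `input_image` content part. A sketch assuming a plain `fetch` call; the model name and the `frameInput` helper are placeholders:

```javascript
// Build the Responses API `input` for a frame already uploaded with
// purpose=vision. `frameInput` is an illustrative helper, not part of peepshow.
function frameInput(fileId, prompt) {
  return [
    {
      role: "user",
      content: [
        { type: "input_text", text: prompt },
        { type: "input_image", file_id: fileId }, // no re-upload this turn
      ],
    },
  ];
}

async function askAboutFrame(fileId, prompt) {
  const res = await fetch("https://api.openai.com/v1/responses", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4.1-mini", // placeholder model name
      input: frameInput(fileId, prompt),
    }),
  });
  return res.json(); // raw Responses API payload
}
```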
Install
npm i -g peepshow
Use it
export OPENAI_API_KEY="sk-..."
peepshow sinks add openai-files
peepshow ./demo.mp4
Make it automatic
Register the sink once; every run fires it afterward. Scope with `--when` so it only runs for matching videos.
peepshow sinks add openai-files
peepshow sinks add openai-files --when extension=mp4,mov
peepshow sinks add openai-files --when path=/Volumes/Work
Configuration
- `OPENAI_API_KEY` (required): standard OpenAI key. Sent as `Authorization: Bearer <key>`.
- `OPENAI_FILES_PURPOSE`: purpose for frame uploads; one of `vision`, `assistants`, `user_data`, `batch`. Default `vision`. The manifest is always `assistants`.
- `OPENAI_FILES_ORG`: adds an `OpenAI-Organization` header when set.
- `OPENAI_FILES_API_URL`: overrides the `/v1/files` base URL. Useful for proxies and mocks.
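Taken together, those variables might resolve into request settings roughly like this (a sketch only; `filesRequestConfig` is not a real peepshow export, and the sink's internals may differ):

```javascript
// Hypothetical sketch of how the sink's env vars combine into request settings.
function filesRequestConfig(env = process.env) {
  if (!env.OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is required");
  const headers = { Authorization: `Bearer ${env.OPENAI_API_KEY}` };
  if (env.OPENAI_FILES_ORG) {
    // Pin uploads to a shared organisation when the header is configured.
    headers["OpenAI-Organization"] = env.OPENAI_FILES_ORG;
  }
  return {
    url: env.OPENAI_FILES_API_URL ?? "https://api.openai.com/v1/files",
    framePurpose: env.OPENAI_FILES_PURPOSE ?? "vision", // frames only
    manifestPurpose: "assistants", // fixed, so file-search can discover it
    headers,
  };
}
```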
Use with an LLM agent
Every peepshow sink reads its config from env vars and receives a single JSON payload on stdin. An LLM agent (Claude Code, Cursor, Windsurf, Gemini, Codex) can drive the OpenAI Files sink automatically when three things are true:
- the env vars below are exported in the agent's shell (or a project `.env` it can load),
- the `peepshow` CLI is on `PATH` (install with `npm i -g peepshow`),
- a peepshow auto-sink is registered for the run (optional but recommended; it makes invocation zero-argument).
1. Set the environment
# Add to ~/.zshrc, ~/.bashrc, or a project .env the agent can load
export OPENAI_API_KEY="..."
2. Register as an auto-sink
peepshow sinks add openai-files
peepshow sinks add openai-files --when extension=mp4,mov
3. Example LLM session
You → drop a `.mov` into Claude Code.
Claude → auto-invokes `/peepshow:slides ./clip.mov`. peepshow extracts frames + audio, the OpenAI Files sink forwards the run to the configured OpenAI Files target. Claude replies with a summary and a link to the created record.
The transcript rides along in the payload whenever the audio pass transcribes successfully.
Write your own
A sink is any executable that reads the `--emit json` payload on stdin. Shell, Node, Python, Go: the spec is in `docs/PLUGINS.md`. Register persistent ones with `peepshow sinks add-cmd 'your-command'`.