Now in preview

Sandboxes for AI agents.
200ms cold start.

Firecracker microVMs on bare metal. Create a sandbox, run code, destroy it. Claude Code, Python, Rust, and Node pre-installed. Per-second billing.

Agent sandboxes are stateless.
Your agents shouldn't be.

Every sandbox provider gives your agent an isolated environment to run code. When the session ends, everything disappears.

For a 4-minute task, that's fine.

For a 14-hour research pipeline, a week-long code migration, or a multi-agent system that builds on previous results -- you need execution that remembers.

upstream is a turn executor. Each turn produces structured output. Context accumulates. The next turn starts where the last one ended.

```
# Turn 1: Research
$ upstream run research.md
Step 1: Search papers .............. done (14 results)
Step 2: Summarize findings ......... done
Context: 2 artifacts saved

# Turn 2: Analysis (receives Turn 1 context)
$ upstream run analysis.md
Context loaded: 2 artifacts from previous turn
Step 1: Cross-reference findings ... done
Step 2: Generate report ............ done
```

Three API calls. No setup.

Create a sandbox. Run code. Get results. The SDK handles everything else.
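The three-call lifecycle can be sketched as a shell session. The function names below are illustrative stand-ins for the SDK, not the real upstream API:

```shell
# Illustrative sketch of the three-call lifecycle -- these functions
# are stand-ins for the SDK, not the real upstream API.

create_sandbox() {            # 1. boot a microVM, return its ID
  echo "sbx-42"
}

run_code() {                  # 2. run pipeline $2 inside sandbox $1
  echo "ran $2 in $1: exit 0"
}

get_results() {               # 3. fetch the execution report for $1
  echo "report for $1 ready"
}

sbx=$(create_sandbox)
run_code "$sbx" pipeline.md
get_results "$sbx"
```

Destroying the sandbox when the turn ends follows the same pattern: one call, and the VM is gone.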

01

Create a sandbox

One API call. A Firecracker VM boots from a pre-warmed snapshot in 200ms. Claude Code, git, and your toolchain are ready.

````markdown
---
image: python:3.12
---

# Data Analysis

```bash
pip install pandas
python analyze.py
```
````
02

Run it

upstream pulls the image, executes each bash block in an isolated VM, and captures the output.

```
$ upstream run pipeline.md
Pulling python:3.12 ... done
Step 1: pip install ..... done (3.1s)
Step 2: python analyze .. done (1.4s)
```
03

Read the results

Results are written back into the markdown. Exit codes, stdout, timing -- all inline. The document is now a complete execution report.

**Result:** exit 0 (1.4s)

```
Found 847 records
Mean: 42.3, Median: 38.1
Report saved to output.csv
```
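Because the report is plain markdown, downstream tooling can consume it with ordinary text tools. A minimal sketch, simulating a finished report (the file name and result line format here are assumptions for illustration):

```shell
# Simulate an executed pipeline's result line, then extract the
# exit status with standard text tools -- no SDK required.
cat > report.md <<'EOF'
**Result:** exit 0 (1.4s)
EOF

grep -o 'exit [0-9]*' report.md
```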

How upstream compares

Honest numbers. We win on cost, context, and self-hosting. Others win on SDK breadth and enterprise traction.

|  | E2B | Daytona | upstream |
|---|---|---|---|
| Self-hosted | Enterprise only | No | Free + open source |
| Context store | No | No | Built-in |
| Interface | Python/JS SDK | Python/JS SDK | Markdown |
| Cold start | ~150ms | ~90ms | ~5ms (warm pool) |
| Cost per session | ~$0.01 | ~$0.01 | ~$0.0002 (self-hosted) |
| Isolation | Firecracker microVM | Firecracker microVM | Firecracker microVM |
| Max session | 24 hours | Unlimited | Unlimited |

Self-hosted is free. Forever.

Run on your own metal for $0. Or let us host it -- still 2-5x cheaper than alternatives.

Free
$0
Self-hosted, unlimited
  • Single Rust binary
  • All core features
  • Context accumulation
  • Unlimited pipelines
  • Unlimited agents
  • Community support
Install from source
Enterprise
Custom
Dedicated metal, volume discounts
  • Dedicated bare metal
  • Everything in Free
  • Volume discounts
  • Custom SLA
  • SSO / SAML
  • Dedicated support
Contact us

Try it in 60 seconds.

$ cargo install upstream

Or download a prebuilt binary from GitHub Releases