diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
new file mode 100644
index 0000000..9e38480
--- /dev/null
+++ b/CONTRIBUTING.md
@@ -0,0 +1,135 @@
+# Contributing
+
+Thanks for contributing to `runtimeuse`.
+
+## Prerequisites
+
+- Git
+- Node.js 22 or newer
+- Python 3.11 or newer (the package supports 3.10+, but CI tests 3.11 through 3.13)
+- `npm`
+
+There is no root workspace setup in this repository today, so install dependencies separately in each package you want to work on.
+
+## Clone the Repository
+
+```bash
+git clone https://github.com/getlark/runtimeuse.git
+cd runtimeuse
+```
+
+## Repository Layout
+
+- `packages/runtimeuse` is the TypeScript runtime that runs inside the sandbox.
+- `packages/runtimeuse-client-python` is the Python client used outside the sandbox.
+- `docs` is the documentation app.
+
+## Environment Files
+
+Two local env files are useful for advanced development flows. They are not required for the basic local test path:
+
+- `packages/runtimeuse/.env` for the runtime package's `npm run dev-publish` flow. Start from `packages/runtimeuse/.env.example`.
+- `packages/runtimeuse-client-python/.env` for sandbox and LLM tests. Start from `packages/runtimeuse-client-python/.env.example`.
+
+## TypeScript Runtime Development
+
+Install dependencies:
+
+```bash
+cd packages/runtimeuse
+npm install
+```
+
+Useful commands:
+
+```bash
+npm run build
+npm run typecheck
+npm test
+npm run test:integration
+```
+
+Notes:
+
+- `npm test` runs the main unit test suite.
+- `npm run test:integration` builds first and then runs the integration tests.
+- If you want to use the Claude handler locally, install the CLI with `npm install -g @anthropic-ai/claude-code`.
+- `npm run dev-publish` runs `scripts/dev-publish.sh`: it builds the runtime, uploads a zip to S3, and prints a presigned download URL plus a ready-to-use `curl ... && node runtimeuse/dist/cli.js` command.
+- `npm run dev-publish` reads `packages/runtimeuse/.env` for `S3_BUCKET` and optionally `S3_PREFIX` and `PRESIGN_EXPIRY`.
+- `npm run dev-publish` assumes the AWS CLI is installed and already authenticated with permission to upload to the configured S3 bucket and generate presigned URLs.
+
+## Python Client Development
+
+Create and activate a virtual environment, then install the package in editable mode:
+
+```bash
+cd packages/runtimeuse-client-python
+python -m venv .venv
+source .venv/bin/activate
+# Install with the dev extra if present; fall back to a plain editable install.
+pip install -e ".[dev]" 2>/dev/null || pip install -e .
+pip install pytest pytest-asyncio
+```
+
+Run the default test suite:
+
+```bash
+pytest test/ -m "not sandbox and not llm"
+```
+
+Sandbox-only test flow:
+
+```bash
+pytest test/ -m sandbox
+```
+
+LLM-only test flow:
+
+```bash
+pytest test/ -m llm
+```
+
+Notes:
+
+- The Python package declares `python >=3.10`, but CI currently tests on Python 3.11 through 3.13.
+- If your change touches the runtime protocol or end-to-end behavior, build `packages/runtimeuse` before running Python tests:
+
+```bash
+cd packages/runtimeuse
+npm install
+npm run build
+```
+
+- Copy `packages/runtimeuse-client-python/.env.example` to `.env` before running sandbox or LLM tests locally.
+- Sandbox tests create an E2B sandbox and require `E2B_API_KEY`.
+- LLM tests also create sandboxes by default and require `E2B_API_KEY` plus the relevant provider credentials such as `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`.
+- If you already have a runtime server running, set `TEST_WS_URL` to reuse it instead of creating a fresh sandbox.
+- Some LLM tests also require `TEST_S3_BUCKET` for artifact upload verification.
+- If you want sandbox tests to run against a dev build instead of `npx -y runtimeuse`, set `RUNTIMEUSE_RUN_COMMAND`. A convenient way to get that command is to run `npm run dev-publish` in `packages/runtimeuse`.
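+
+For example, a dev-build sandbox test run might look like this (the presigned URL placeholder is whatever `npm run dev-publish` printed; treat the exact command as illustrative):
+
+```bash
+# In packages/runtimeuse-client-python, with .env populated:
+export RUNTIMEUSE_RUN_COMMAND='curl -L "<presigned-url>" -o runtimeuse.zip && unzip -o runtimeuse.zip -d runtimeuse && node runtimeuse/dist/cli.js'
+pytest test/ -m sandbox
+```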
+
+## Docs Development
+
+Install and run the docs app:
+
+```bash
+cd docs
+npm install
+npm run dev
+```
+
+Useful commands:
+
+```bash
+npm run build
+npm run types:check
+npm run lint
+```
+
+## Before Opening a PR
+
+Run the checks relevant to the package you changed:
+
+- `packages/runtimeuse`: `npm run typecheck` and `npm test`
+- `packages/runtimeuse-client-python`: `pytest test/ -m "not sandbox and not llm"`
+- `docs`: `npm run types:check` and `npm run lint`
+
+If you changed behavior shared between the runtime and Python client, run both the TypeScript and Python checks.
diff --git a/README.md b/README.md
index 17f9023..27c2e07 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,7 @@
[](https://twitter.com/getlark)
-Communicate with AI agents inside sandboxes over WebSocket.
+Run AI agents inside sandboxes and communicate with them over WebSocket.
| Package | Language | Role | Install |
| ---------------------------------------------------------- | ---------- | ------------------------------------------ | ------------------------------- |
@@ -14,10 +14,10 @@ Communicate with AI agents inside sandboxes over WebSocket.
### 1. Start the runtime (inside a sandbox)
```bash
-npx -y runtimeuse@latest
+npx -y runtimeuse
```
-This starts a WebSocket server on port 8080 using the OpenAI agent handler by default. Use `--agent claude` for Claude.
+This starts a WebSocket server on port 8080 using the OpenAI agent handler by default. Use `--agent claude` for Claude. The Claude handler also requires the `claude` CLI to be installed in the sandbox, for example with `npm install -g @anthropic-ai/claude-code`.
### 2. Connect from Python
@@ -41,33 +41,12 @@ async def main():
asyncio.run(main())
```
-### 3. Or use the runtime programmatically (TypeScript)
-
-```typescript
-import { RuntimeUseServer, openaiHandler } from "runtimeuse";
-
-const server = new RuntimeUseServer({ handler: openaiHandler, port: 8080 });
-await server.startListening();
-```
-
-## How It Works
-
-```
-Python Client ──> WebSocket ──> Runtime (in sandbox) ──> AgentHandler
- ├── openai (default)
- └── claude
-```
+See the [runtime README](./packages/runtimeuse/README.md) and [client README](./packages/runtimeuse-client-python/README.md) for full API docs.
-1. The client sends an `InvocationMessage` over WebSocket
-2. The runtime downloads files and runs pre-commands (if any)
-3. The `AgentHandler` executes the agent with the given prompts and model
-4. Intermediate `AssistantMessage`s stream back to the client
-5. Files in the artifacts directory are auto-detected and uploaded via presigned URL handshake
-6. A final `ResultMessage` with structured output is sent back
-7. The runtime runs post-commands (if any)
+## Contributing
-See the [runtime README](./packages/runtimeuse/README.md) and [client README](./packages/runtimeuse-client-python/README.md) for full API docs.
+See [`CONTRIBUTING.md`](./CONTRIBUTING.md) for local setup, package-specific development commands, and the recommended checks to run before opening a PR.
## License
-BSL-1.1
+[FSL-1.1-ALv2](./LICENSE.md)
diff --git a/docs/README.md b/docs/README.md
index 2b4c09a..b0ab080 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -1,14 +1,29 @@
# docs
-This is a React Router application generated with
-[Create Fumadocs](https://github.com/fuma-nama/fumadocs).
+Documentation app for `runtimeuse`, built with React Router and Fumadocs.
-Run development server:
+## Local Development
+
+Install dependencies and start the dev server from this directory:
```bash
+npm install
npm run dev
-# or
-pnpm dev
-# or
-yarn dev
```
+
+## Available Scripts
+
+```bash
+npm run dev # start the local docs app
+npm run build # create a production build
+npm run start # serve the built site
+npm run types:check # generate route/docs types and run TypeScript checks
+npm run lint # run Biome checks
+npm run format # format the docs app with Biome
+```
+
+## Project Layout
+
+- `content/docs` contains the MDX documentation content.
+- `app` contains the React Router application and search/UI code.
+- `source.config.ts` configures the Fumadocs content source.
diff --git a/docs/app/lib/layout.shared.tsx b/docs/app/lib/layout.shared.tsx
index 52e20ed..818435a 100644
--- a/docs/app/lib/layout.shared.tsx
+++ b/docs/app/lib/layout.shared.tsx
@@ -1,16 +1,15 @@
import type { BaseLayoutProps } from 'fumadocs-ui/layouts/shared';
-// fill this with your actual GitHub info, for example:
export const gitConfig = {
- user: 'fuma-nama',
- repo: 'fumadocs',
+ user: 'getlark',
+ repo: 'runtimeuse',
branch: 'main',
};
export function baseOptions(): BaseLayoutProps {
return {
nav: {
- title: 'React Router',
+ title: 'RuntimeUse',
},
githubUrl: `https://github.com/${gitConfig.user}/${gitConfig.repo}`,
};
diff --git a/docs/app/root.tsx b/docs/app/root.tsx
index d928614..4895c7e 100644
--- a/docs/app/root.tsx
+++ b/docs/app/root.tsx
@@ -13,6 +13,7 @@ import SearchDialog from '@/components/search';
import NotFound from './routes/not-found';
export const links: Route.LinksFunction = () => [
+ { rel: 'icon', href: '/favicon.svg', type: 'image/svg+xml' },
{ rel: 'preconnect', href: 'https://fonts.googleapis.com' },
{
rel: 'preconnect',
diff --git a/docs/content/docs/index.mdx b/docs/content/docs/index.mdx
index e17a342..a2b4a8f 100644
--- a/docs/content/docs/index.mdx
+++ b/docs/content/docs/index.mdx
@@ -1,16 +1,18 @@
---
title: Introduction
-description: Run AI agents in sandboxes and communicate over WebSocket.
+description: Run AI agents in sandboxes and communicate with them over WebSocket.
---
## What is RuntimeUse?
-RuntimeUse lets you run an AI agent inside any sandbox and communicate with it over WebSocket — handling artifact uploads, pre-commands, file downloads, cancellation, and structured results.
+RuntimeUse lets you run an AI agent inside any sandbox and communicate with it over WebSocket. It handles the runtime lifecycle for you: file downloads, pre-commands, artifact uploads, cancellation, and structured results.
It is made up of two parts:
-1. **NPM Runtime executable** — Run it in any sandbox with `npx runtimeuse`. It sets up your agent of choice (Claude, OpenAI, etc) and starts a WebSocket server to interact with it.
-2. **Client libraries** — Simple abstractions for interacting with the sandbox WebSocket server from your API server, worker nodes, etc.
+1. **`runtimeuse`**: the TypeScript runtime that runs inside the sandbox and exposes a WebSocket server.
+2. **`runtimeuse-client`**: the Python client that connects from outside the sandbox and sends invocations.
+
+Today, the recommended path is to run the runtime in the sandbox and use the Python client from your application code.
## Built-in Agent Handlers
@@ -21,6 +23,12 @@ The runtime ships with two built-in handlers:
Switch between them with `--agent openai` or `--agent claude`.
+The Claude handler also requires the `claude` CLI to be installed in the sandbox, for example:
+
+```bash
+npm install -g @anthropic-ai/claude-code
+```
+
## Key Features
- **Sandbox-agnostic** — works with any provider that can run `npx` and expose a port
@@ -30,4 +38,6 @@ Switch between them with `--agent openai` or `--agent claude`.
+
+
diff --git a/docs/content/docs/quickstart.mdx b/docs/content/docs/quickstart.mdx
index 9bde9a3..581dc86 100644
--- a/docs/content/docs/quickstart.mdx
+++ b/docs/content/docs/quickstart.mdx
@@ -11,49 +11,80 @@ Inside any sandbox that can run `npx` and expose a port:
npx -y runtimeuse
```
-This starts a WebSocket server on port 8080 using the OpenAI agent handler by default. To use Claude instead:
+This starts a WebSocket server on port 8080 using the OpenAI agent handler by default.
+
+To use Claude instead:
```bash
npx -y runtimeuse --agent claude
```
-## Step 2: Connect from Python
+The Claude handler requires the `claude` CLI to be installed in the sandbox, for example with:
-Install the Python client:
+```bash
+npm install -g @anthropic-ai/claude-code
+```
+
+## Step 2: Install the Python Client
```bash
-pip install runtimeuse
+pip install runtimeuse-client
+```
+
+## Step 3: Connect from Python
+
+For local development, connect directly to the runtime's WebSocket URL:
+
+```python
+import asyncio
+from runtimeuse_client import RuntimeUseClient, QueryOptions
+
+async def main():
+ client = RuntimeUseClient(ws_url="ws://localhost:8080")
+
+ result = await client.query(
+ prompt="What is 2 + 2?",
+ options=QueryOptions(
+ system_prompt="You are a helpful assistant.",
+ model="gpt-4.1",
+ ),
+ )
+
+ print(result.data.text)
+
+asyncio.run(main())
```
-Connect to the WebSocket server and invoke the agent:
+## Step 4: Use a Sandbox URL in Production-Like Flows
+
+In a real sandbox integration, your sandbox provider gives you the runtime URL after starting the process inside the sandbox. The Python client then connects from outside the sandbox:
```python
-from runtimeuse import RuntimeUseClient, RuntimeUseInvoker, InvocationMessage, ResultMessageInterface
-
-# Connect to the sandbox's WebSocket server
-sandbox = Sandbox.create()
-sandbox.run("npx -y runtimeuse")
-ws_url = sandbox.get_url(8080)
-
-client = RuntimeUseClient(ws_url=ws_url)
-invoker = RuntimeUseInvoker(client)
-
-# Build an invocation
-invocation = InvocationMessage(
- message_type="invocation_message",
- system_prompt="You are a helpful assistant.",
- user_prompt="Run the tests.",
- output_format_json_schema_str='{"type":"json_schema","schema":{"type":"object"}}'
-)
-
-# Invoke and handle the result
-async def on_result(result: ResultMessageInterface):
- print(f"Result: {result.structured_output}")
-
-await invoker.invoke(
- invocation=invocation,
- on_result_message=on_result,
- result_message_cls=ResultMessageInterface,
-)
+import asyncio
+from runtimeuse_client import RuntimeUseClient, QueryOptions
+
+async def main():
+ # Pseudocode: start the runtime inside your sandbox provider
+ sandbox = Sandbox.create()
+ sandbox.run("npx -y runtimeuse")
+ ws_url = sandbox.get_url(8080)
+
+ client = RuntimeUseClient(ws_url=ws_url)
+
+ result = await client.query(
+ prompt="Summarize the repository.",
+ options=QueryOptions(
+ system_prompt="You are a helpful assistant.",
+ model="gpt-4.1",
+ ),
+ )
+
+ print(result.data.text)
+
+asyncio.run(main())
```
+## Next Steps
+
+- See the [runtime package README](https://github.com/getlark/runtimeuse/tree/main/packages/runtimeuse) for runtime configuration and custom handlers.
+- See the [Python client README](https://github.com/getlark/runtimeuse/tree/main/packages/runtimeuse-client-python) for structured output, artifact uploads, and cancellation.
diff --git a/docs/public/favicon.ico b/docs/public/favicon.ico
deleted file mode 100644
index 5dbdfcd..0000000
Binary files a/docs/public/favicon.ico and /dev/null differ
diff --git a/docs/public/favicon.svg b/docs/public/favicon.svg
new file mode 100644
index 0000000..643776d
--- /dev/null
+++ b/docs/public/favicon.svg
@@ -0,0 +1,10 @@
+
diff --git a/examples/README.md b/examples/README.md
new file mode 100644
index 0000000..1dc4521
--- /dev/null
+++ b/examples/README.md
@@ -0,0 +1,13 @@
+# Examples
+
+Runnable examples showing how to use the [runtimeuse Python client](../packages/runtimeuse-client-python) with different sandbox providers.
+
+Each example is a single, self-contained `.py` file. Setup instructions (dependencies and environment variables) are listed in the comments at the top of each file.
+
+## Available examples
+
+| File | Provider | Description |
+|------|----------|-------------|
+| [e2b-quickstart.py](e2b-quickstart.py) | [E2B](https://e2b.dev) | Run Claude Code in an E2B cloud sandbox |
+
+More provider examples coming soon.
diff --git a/examples/e2b-quickstart.py b/examples/e2b-quickstart.py
new file mode 100644
index 0000000..937abc0
--- /dev/null
+++ b/examples/e2b-quickstart.py
@@ -0,0 +1,105 @@
+"""
+E2B Quickstart -- Run Claude Code in an E2B cloud sandbox using runtimeuse.
+
+Setup:
+ pip install runtimeuse-client e2b e2b-code-interpreter
+
+Environment variables:
+ E2B_API_KEY - your E2B API key (https://e2b.dev)
+ ANTHROPIC_API_KEY - your Anthropic API key
+
+Usage:
+ python e2b-quickstart.py
+"""
+
+from __future__ import annotations
+
+import asyncio
+import os
+
+from e2b import Template, wait_for_port, default_build_logger
+from e2b_code_interpreter import Sandbox
+
+from runtimeuse_client import (
+ RuntimeUseClient,
+ QueryOptions,
+ AssistantMessageInterface,
+ TextResult,
+)
+
+
+def _get_env_or_fail(name: str) -> str:
+ value = os.environ.get(name)
+ if not value:
+ raise RuntimeError(f"{name} environment variable is not set")
+ return value
+
+
+def create_sandbox() -> tuple[Sandbox, str]:
+ """Build an E2B template with runtimeuse + Claude Code and return (sandbox, ws_url)."""
+ e2b_api_key = _get_env_or_fail("E2B_API_KEY")
+ anthropic_api_key = _get_env_or_fail("ANTHROPIC_API_KEY")
+
+ alias = "runtimeuse-quickstart-claude"
+ start_cmd = "npx -y runtimeuse --agent claude"
+
+ print(
+ f"Building E2B template '{alias}' (this may take a few minutes the first time)..."
+ )
+
+ template = (
+ Template()
+ .from_node_image("lts")
+ .apt_install(["unzip"])
+ .npm_install(["@anthropic-ai/claude-code"], g=True)
+ .set_envs({"ANTHROPIC_API_KEY": anthropic_api_key})
+ .set_start_cmd(start_cmd, wait_for_port(8080))
+ )
+
+ Template.build(
+ template,
+ alias,
+ cpu_count=2,
+ memory_mb=2048,
+ on_build_logs=default_build_logger(),
+ )
+
+ sandbox = Sandbox.create(template=alias, api_key=e2b_api_key)
+ ws_url = f"wss://{sandbox.get_host(8080)}"
+ print(f"Sandbox ready at {ws_url}")
+
+ return sandbox, ws_url
+
+
+async def main() -> None:
+ sandbox, ws_url = create_sandbox()
+ try:
+ client = RuntimeUseClient(ws_url=ws_url)
+ print(f"Connected to {ws_url}")
+
+ async def on_message(msg: AssistantMessageInterface) -> None:
+ for block in msg.text_blocks:
+ print(f"[assistant] {block}")
+
+ prompt = "What files are in the current directory? List them."
+ print(f"Sending query: {prompt}")
+
+ result = await client.query(
+ prompt=prompt,
+ options=QueryOptions(
+ system_prompt="You are a helpful assistant.",
+ model="claude-sonnet-4-20250514",
+ on_assistant_message=on_message,
+ ),
+ )
+
+ print("\n--- Final Result ---")
+ assert isinstance(result.data, TextResult)
+ print(result.data.text)
+ finally:
+ sandbox.kill()
+ print("Sandbox terminated.")
+
+
+if __name__ == "__main__":
+ asyncio.run(main())
diff --git a/packages/runtimeuse-client-python/.env.example b/packages/runtimeuse-client-python/.env.example
new file mode 100644
index 0000000..5749c9c
--- /dev/null
+++ b/packages/runtimeuse-client-python/.env.example
@@ -0,0 +1,31 @@
+# Copy this file to .env for local sandbox and LLM testing.
+
+# Required for E2B-backed sandbox creation.
+E2B_API_KEY="your-e2b-api-key"
+
+# Required when starting an OpenAI-backed runtime in a sandbox.
+OPENAI_API_KEY="your-openai-api-key"
+
+# Required when starting a Claude-backed runtime in a sandbox.
+ANTHROPIC_API_KEY="your-anthropic-api-key"
+
+# Optional: override the runtime start command used by sandbox tests.
+# Defaults to: npx -y runtimeuse
+# Useful with `packages/runtimeuse/scripts/dev-publish.sh`, which prints a curl/unzip command.
+RUNTIMEUSE_RUN_COMMAND="npx -y runtimeuse"
+
+# Optional: reuse the E2B template between test runs.
+E2B_REUSE_TEMPLATE=false
+
+# Optional: point tests at an already-running runtime instead of creating a sandbox.
+# TEST_WS_URL="ws://localhost:8080"
+
+# Required for LLM artifact upload tests that write to S3.
+TEST_S3_BUCKET="your-test-bucket"
+
+# Optional: defaults to us-east-1 in the test fixtures.
+AWS_REGION="us-east-1"
+
+# Optional for scratch scripts using other sandbox providers.
+# SANDBOX_API_KEY="your-sandbox-provider-api-key"
+# DAYTONA_API_KEY="your-daytona-api-key"
diff --git a/packages/runtimeuse-client-python/README.md b/packages/runtimeuse-client-python/README.md
index 709e093..3eabb6f 100644
--- a/packages/runtimeuse-client-python/README.md
+++ b/packages/runtimeuse-client-python/README.md
@@ -103,6 +103,8 @@ async def on_artifact(request: ArtifactUploadRequestMessageInterface) -> Artifac
return ArtifactUploadResult(presigned_url=presigned_url, content_type=content_type)
```
+When using artifact uploads, set both `artifacts_dir` and `on_artifact_upload_request` in `QueryOptions`; the client validates that they are provided together.
+
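+A minimal sketch of that pairing, assuming the `on_artifact` handler shown above (the artifacts path is a hypothetical placeholder):
+
+```python
+options = QueryOptions(
+    artifacts_dir="/workspace/artifacts",    # hypothetical: directory the runtime watches
+    on_artifact_upload_request=on_artifact,  # paired: both must be set together
+)
+```
+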
### Cancellation
Call `client.abort()` from any coroutine to cancel a running query. The client sends a cancel message to the runtime and `query` raises `CancelledException`.
@@ -152,3 +154,8 @@ except CancelledException:
| -------------------- | ------------------------------------------- |
| `AgentRuntimeError` | Raised when the agent runtime returns an error (carries `.error` and `.metadata`) |
| `CancelledException` | Raised when `client.abort()` is called during a query |
+
+## Related Docs
+
+- [Repository overview](../../README.md)
+- [TypeScript runtime README](../runtimeuse/README.md)
diff --git a/packages/runtimeuse/.env.example b/packages/runtimeuse/.env.example
index 966566e..52a1120 100644
--- a/packages/runtimeuse/.env.example
+++ b/packages/runtimeuse/.env.example
@@ -1,4 +1,7 @@
-# If you need to run npm run dev-publish, then you need to set these variables
-# S3_BUCKET='runtimeuse-dev-builds'
-# S3_PREFIX='your-prefix' # optional, defaults to 'local-dev'
-# PRESIGN_EXPIRY=3600 # optional, defaults to 3600
+# Copy this file to .env if you want to run `npm run dev-publish`.
+# The script builds the runtime, uploads a zip to S3, and prints a presigned URL
+# that you can plug into sandbox test flows.
+
+S3_BUCKET="your-dev-build-bucket"
+S3_PREFIX="your-prefix"
+PRESIGN_EXPIRY=3600
diff --git a/packages/runtimeuse/README.md b/packages/runtimeuse/README.md
index 18b7626..cfc911a 100644
--- a/packages/runtimeuse/README.md
+++ b/packages/runtimeuse/README.md
@@ -2,6 +2,8 @@
TypeScript runtime package for [runtimeuse](https://github.com/getlark/runtimeuse). Runs inside the sandbox and handles the agent lifecycle: receives invocations over WebSocket, executes your agent handler, manages artifact uploads, runs pre-commands, downloads runtime files, and sends structured results back to the client.
+This package is used together with the Python client in [`runtimeuse-client`](../runtimeuse-client-python/README.md), which connects to the runtime from outside the sandbox.
+
## Installation
```bash
@@ -21,6 +23,8 @@ This starts a WebSocket server on port 8080 using the OpenAI agent handler (defa
- **`openai`** (default) -- uses `@openai/agents` SDK
- **`claude`** -- uses `@anthropic-ai/claude-agent-sdk` with Claude Code tools and `bypassPermissions` mode
+The Claude handler requires the `claude` CLI to be installed in the sandbox environment.
+
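+For example:
+
+```bash
+npm install -g @anthropic-ai/claude-code
+```
+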
```bash
npx -y runtimeuse # OpenAI (default)
npx -y runtimeuse --agent claude # Claude
@@ -218,3 +222,8 @@ Command output (stdout/stderr) from pre-commands is automatically redacted using
| `AssistantMessage` | Runtime -> Client | Intermediate text from the agent |
| `ArtifactUploadRequestMessage` | Runtime -> Client | Request a presigned URL for an artifact |
| `ErrorMessage` | Runtime -> Client | Error during execution |
+
+## Related Docs
+
+- [Repository overview](../../README.md)
+- [Python client README](../runtimeuse-client-python/README.md)
diff --git a/packages/runtimeuse/scripts/dev-publish.sh b/packages/runtimeuse/scripts/dev-publish.sh
index a10c652..29f9464 100755
--- a/packages/runtimeuse/scripts/dev-publish.sh
+++ b/packages/runtimeuse/scripts/dev-publish.sh
@@ -37,9 +37,15 @@ aws s3 cp "$PROJECT_DIR/$ZIP_NAME" "s3://${S3_BUCKET}/${S3_KEY}"
URL=$(aws s3 presign "s3://${S3_BUCKET}/${S3_KEY}" --expires-in "$PRESIGN_EXPIRY")
+QUICK_CMD="curl -L \"$URL\" -o runtimeuse.zip && unzip -o runtimeuse.zip -d runtimeuse && node runtimeuse/dist/cli.js"
+
echo ""
echo "Download URL (expires in ${PRESIGN_EXPIRY}s):"
echo "$URL"
echo ""
+
echo "Quick start:"
-echo " curl -L \"$URL\" -o runtimeuse.zip && unzip -o runtimeuse.zip -d runtimeuse && node runtimeuse/dist/cli.js & echo \"RuntimeUse WS server started (PID \$!)\""
+echo " $QUICK_CMD"
+echo ""
+# pbcopy is macOS-only; skip the clipboard step on other platforms.
+if command -v pbcopy >/dev/null 2>&1; then
+  printf "%s" "$QUICK_CMD" | pbcopy
+  echo "(copied to clipboard)"
+fi