create-snipara (NPX)
One-command setup and maintenance for Snipara MCP + snipara-companion, with optional RLM-Runtime for local execution.
```
npx create-snipara
```

What It Does
- Installs snipara-companion — Local companion for query, plan, upload, chunk, session bootstrap, event inspection, and task workflows
- Installs snipara-mcp — MCP server for context-optimized documentation queries
- Installs rlm-runtime — Safe code execution with Docker isolation
- Configures .mcp.json — Ready for Claude Code, Cursor, Claude Desktop
- Sets up hooks — Thin edge runtime bootstrap for session memory automation
- Updates environment files — Adds API key configuration
- Adds maintenance commands — `doctor`, `repair`, `upgrade`, and `print-config`
Install Profiles
| Profile | What it installs | Best for |
|---|---|---|
| `hosted-only` | Hosted MCP config only | SaaS-first setups without local workflows |
| `hosted-companion` | Hosted MCP + snipara-companion | Recommended default |
| `full-stack` | Hosted MCP + snipara-companion + rlm-runtime | Hosted context plus local execution |
| `runtime-only` | rlm-runtime only | Pure local execution without hosted API |
Hosted Core + Thin Edge Runtime
create-snipara installs a hosted-first setup. Snipara stays the source of truth for memory, review, policy, and orchestration. The local install only adds a thin edge layer for hook capture, context restore, and optional safe execution.
- Hosted core: 102-tool MCP surface, reviewed memory, orchestration, automation policies
- Thin edge runtime: local hooks, compatibility CLI flows, and optional rlm-runtime
- Companion workflows: `rlm-hook query`, `plan`, `multi-query`, `orchestrate`, `load-document`, `upload`, `chunk`, `events recent`, `session-bootstrap`, and `task-commit`
- Generated companion pack: `.snipara/companion` with client-aware command presets, local usage guidance, and a doctor report
- Design rule: local adapters capture and forward signals; they do not own durable memory policy
Interactive Setup
Run `npx create-snipara` in your project directory. You'll be prompted for:
| Prompt | Description |
|---|---|
| Project slug | Auto-detected from git remote or directory name |
| Project ID | Optional, for advanced use cases |
| API key type | Project key, Team key, Sign up, or Skip |
| API key | Your Snipara API key |
| AI client | Claude Code, Cursor, Claude Desktop, or other |
| Install profile | Hosted only, hosted + companion, full stack, or runtime only |
| Hooks | Whether local hooks should be generated when supported |
| LLM provider | OpenAI, Anthropic, or None (for rlm run/rlm agent CLI) |
| Run rlm init | Optional — configure execution environment (sandbox/docker/local) |
Maintenance Commands
doctor
Validates local wiring and writes `.snipara/companion/doctor.json`.
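As a rough illustration, a doctor report could look something like the sketch below. The field names and values here are illustrative assumptions, not the actual schema — inspect your own `.snipara/companion/doctor.json` for the real shape.

```json
{
  "profile": "hosted-companion",
  "mcpConfig": "ok",
  "companionPack": "ok",
  "hooks": "missing",
  "checkedAt": "2025-01-01T00:00:00Z"
}
```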
repair
Rebuilds local configuration, companion pack, and hooks.
upgrade
Upgrades installed local pieces and refreshes generated assets.
print-config
Shows the inferred local setup and install profile.
```
npx create-snipara doctor
npx create-snipara repair
```

Execution Environments
When you select RLM-Runtime during setup, you'll be asked if you want to run rlm init to configure the execution environment:
| Environment | Description | Use Case |
|---|---|---|
| `sandbox` | RestrictedPython, safe stdlib only | Default, most secure |
| `docker` | Full Python in isolated container | Recommended for full features |
| `local` | Full access, no isolation | Development only |
Security Recommendation
Use `docker` mode for production and untrusted code, including AI-generated code. `local` mode is only recommended for development with code you trust.
You can also configure the environment later by running rlm init manually.
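To make the distinction concrete, here is a toy Python sketch of what restricted execution means. This is not rlm-runtime's implementation (the real `sandbox` mode uses RestrictedPython, which does far more); it only illustrates the idea of running code against a whitelist of builtins.

```python
# Toy illustration only: code executes with a restricted set of builtins,
# so dangerous names like open() are simply not defined.
SAFE_BUILTINS = {"print": print, "len": len, "range": range, "sum": sum}

def run_restricted(source: str) -> None:
    """Execute source with only whitelisted builtins available."""
    exec(compile(source, "<sandbox>", "exec"), {"__builtins__": SAFE_BUILTINS})

run_restricted("print(sum(range(5)))")  # allowed: prints 10

try:
    run_restricted("open('/etc/passwd')")  # blocked: open is not whitelisted
except NameError as exc:
    print("blocked:", exc)
```

A real sandbox also restricts attribute access, imports, and resource usage, which is why `docker` isolation is still the recommendation for untrusted code.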
API Key Requirements
| Tool | Snipara API Key | LLM API Key (OpenAI/Anthropic) |
|---|---|---|
| `execute_python` MCP | Not needed | Not needed (your AI client is the LLM) |
| `rlm_context_query` MCP | Required | Not needed |
| `rlm_remember` / `rlm_recall` | Required | Not needed |
| `rlm run` / `rlm agent` CLI | Optional (for context) | Required |
Key Types
| Type | Description |
|---|---|
| Project API key | Access to a single project |
| Team API key | Access to all projects in your team |
Command Line Options
```
# Basic usage
npx create-snipara

# With project API key
npx create-snipara --api-key rlm_your_project_key

# With team API key (access all team projects)
npx create-snipara --team-key rlm_your_team_key

# Specify project slug
npx create-snipara --slug my-project

# Runtime only - no Snipara API key needed
npx create-snipara --runtime-only

# Skip local companion CLI
npx create-snipara --skip-companion

# Skip specific installations
npx create-snipara --skip-mcp      # Skip snipara-mcp
npx create-snipara --skip-runtime  # Skip rlm-runtime
npx create-snipara --skip-hooks    # Skip Claude Code hooks
npx create-snipara --skip-test     # Skip connection test

# Accept all defaults (non-interactive)
npx create-snipara -y --api-key rlm_xxx --slug my-project
```

What Gets Created
.mcp.json
```json
{
  "mcpServers": {
    "snipara": {
      "type": "http",
      "url": "https://api.snipara.com/mcp/your-project",
      "headers": { "X-API-Key": "rlm_your_key" }
    },
    "rlm-runtime": {
      "type": "http",
      "url": "http://localhost:8765/mcp",
      "headers": {}
    }
  }
}
```

Claude Code Hooks (if selected)
- `snipara-startup.sh` — Restores session context
- `snipara-session.sh` — Auto-remembers commits
- `snipara-compact.sh` — Saves context before compaction
Local Companion Pack
If you install the local companion, create-snipara also generates a small project-local starter pack under `.snipara/companion`.
- `README.md` — client-aware usage guidance and starter commands
- `commands.json` — machine-readable command presets for local workflows
Companion Workflows
The companion CLI is a thin local facade over hosted Snipara workflows. It is useful when you want repeatable local commands in addition to MCP access.
```
rlm-hook query --query "recent auth decisions"
rlm-hook plan --query "implement webhook retry hardening"
rlm-hook multi-query --queries "recent incidents" "open decisions"
rlm-hook orchestrate --query "map auth architecture"
rlm-hook load-document --path docs/architecture/auth.md
rlm-hook events recent --limit 20
rlm-hook task-commit --summary "Shipped retry hardening"
```

These commands print human-readable output by default. Add `--json` when you need the raw hosted response.
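If you script against the `--json` output, a minimal Python sketch for consuming a captured response follows. The response shape and field names used here (`results`, `text`, `score`) are assumptions for illustration, not the documented schema — check the actual `--json` output for the real structure.

```python
import json

# Stand-in for output captured from `rlm-hook query ... --json`
# (shape is hypothetical).
raw = '{"results": [{"text": "Use exponential backoff for webhook retries", "score": 0.92}]}'

data = json.loads(raw)
top = data["results"][0]
print(f"{top['score']:.2f}  {top['text']}")
```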
Environment Files
Updates .env.example and .env.local with:
```
# Snipara Configuration
SNIPARA_API_KEY=your_api_key
SNIPARA_PROJECT_SLUG=your-project

# RLM-Runtime LLM Provider (if configured)
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...
```

After Installation
For Claude Code / Cursor
- Restart your AI client
- MCP tools are automatically available
For Claude Desktop
- Restart Claude Desktop
- Config is at `~/Library/Application Support/Claude/claude_desktop_config.json`
RLM-Runtime Usage
MCP Tools (no LLM API key needed):
Your AI client (Claude, GPT, etc.) provides the LLM — no additional API key required.
| Tool | Description |
|---|---|
| `execute_python` | Run Python in sandbox |
| `get_repl_context` | Get session variables |
| `set_repl_context` | Set session variables |
| `clear_repl_context` | Clear session |
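Conceptually, these tools behave like a per-session variable store that persists across `execute_python` calls. Below is a toy Python model of that semantics — a conceptual sketch, not rlm-runtime's actual implementation.

```python
# Toy model: a session whose variables survive between execute_python calls.
class ReplSession:
    def __init__(self):
        self._vars: dict = {}

    def execute_python(self, source: str) -> None:
        # Code runs against the session's persistent namespace.
        exec(source, {}, self._vars)

    def get_repl_context(self) -> dict:
        return dict(self._vars)

    def set_repl_context(self, **values) -> None:
        self._vars.update(values)

    def clear_repl_context(self) -> None:
        self._vars.clear()

session = ReplSession()
session.execute_python("x = 2 + 3")
session.execute_python("y = x * 10")   # x persists from the previous call
print(session.get_repl_context())      # {'x': 5, 'y': 50}
session.clear_repl_context()
print(session.get_repl_context())      # {}
```

The key point is that state set in one call is visible in the next, which is what lets your AI client build up a working session incrementally.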
CLI Commands (requires LLM API key):
For rlm run and rlm agent, you need an LLM provider API key:
```
# Set your LLM provider
export OPENAI_API_KEY=sk-...
# or
export ANTHROPIC_API_KEY=sk-ant-...

# Run commands
rlm init              # Initialize configuration
rlm run --env docker  # Run with Docker isolation
rlm agent "task"      # Autonomous agent mode
rlm visualize         # Launch trajectory dashboard
```

Available MCP Tools
If you enable hook-compatible local tooling, the install can also forward canonical lifecycle events into Snipara's automation API. That lets local adapters feel closer to a local-memory workflow while keeping review and persistence centralized.
After setup, you have access to the current 102-tool MCP contract across context, memory, automation, analytics, and orchestration:
| Category | Tools |
|---|---|
| Context | `rlm_context_query`, `rlm_ask`, `rlm_search`, `rlm_sections` |
| Planning | `rlm_plan`, `rlm_decompose`, `rlm_multi_query` |
| Memory | `rlm_remember`, `rlm_recall`, `rlm_memories`, `rlm_forget`, `rlm_memory_attach_source`, `rlm_memory_verify`, `rlm_memory_invalidate`, `rlm_memory_supersede` |
| Execution | `execute_python`, `get_repl_context` (via RLM-Runtime) |
| Swarms | `rlm_swarm_create`, `rlm_claim`, `rlm_task_create` |
Requirements
- Node.js 18+
- Python 3.10+ (for snipara-mcp and rlm-runtime)
- Docker (optional, for RLM-Runtime isolation)