Getting Started
The fastest path through Mesh Storage — get data in, view it, then query it from SQL or your AI tools
Mesh Storage is the data lake at the heart of Alloy. Drop in MCAP recordings — from your laptop, a robot, or a fleet of devices — then browse, replay, and query everything from one place.
This page walks the canonical workflow in three steps. Pick the path through each step that matches your setup.
Step 1 — Get data in
Mesh Storage accepts MCAP files via three paths. Pick the one that matches how you record:
Web upload
Already have MCAP files on your laptop? Drag and drop them in. Best for ad-hoc files.
Docker
No recording stack yet? Pull the prebuilt edge image — recorder, compression, diagnostics, and graph recording end-to-end. Best for production fleets.
Track a folder (binary)
Already have your own ROS2 image or recorder? Install the lightweight binary and point it at a directory of MCAP files.
Web uploads land in the uploads/ folder. Device uploads (Docker or binary) land in devices/<device-id>/. Both are queryable from the same place once processed.
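Once processing finishes, both sources sit in the same lake and can be queried side by side. A minimal sketch in the SQL Workbench — the namespace layout follows the folders above, but the table names here are hypothetical and depend on your file names and device IDs:

```sql
-- Illustrative only: substitute your own table names
SELECT 'laptop upload' AS source, count(*) AS messages
FROM "uploads"."my_recording__diagnostics"
UNION ALL
SELECT 'device upload', count(*)
FROM "devices"."robot-01__diagnostics";
```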
Every file goes through the same lifecycle once it lands: Queued → Processing → Ready (or Failed). Replay and Inspect work immediately — you don't have to wait for processing to finish.
Step 2 — View the data
Once a file is in Mesh Storage you can browse, replay, and inspect it without leaving the browser.
Step 3 — Use the data
Once a file is Ready, you can query it. Three ways, depending on where you want to work:
From the browser — SQL Workbench
The in-app SQL editor runs DuckDB in your browser against the Iceberg tables. No setup, no credentials to manage. Best for ad-hoc analysis.
SELECT topic, count(*) AS messages
FROM "uploads"."my_recording__diagnostics"
GROUP BY topic
ORDER BY messages DESC
LIMIT 20;
From your AI tool — MCP
Alloy ships an MCP server so Claude, Cursor, Codex, Windsurf, or any MCP-aware tool can query missions, browse files, run SQL, and pull mission context into your workflow.
claude mcp add alloy --transport http https://aus.usealloy.ai/mcp
From notebooks / BI / external compute — Iceberg REST
Mesh Storage exposes the data lake as an Iceberg REST Catalog. Generate an API key from the Connect modal and point DuckDB, Spark, Trino, or PyIceberg at the endpoint — no copy step, no exports.
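As one sketch of what that looks like, here is the connection flow from DuckDB using its Iceberg extension. The endpoint URL, secret name, and table name below are placeholders, not Alloy's actual values — copy the real endpoint and API key from the Connect modal:

```sql
INSTALL iceberg;
LOAD iceberg;

-- Placeholder credentials: use the API key from the Connect modal
CREATE SECRET mesh (
    TYPE ICEBERG,
    TOKEN 'YOUR_API_KEY'
);

-- Placeholder endpoint: use the catalog URL from the Connect modal
ATTACH 'warehouse' AS mesh_storage (
    TYPE ICEBERG,
    SECRET mesh,
    ENDPOINT 'https://example.usealloy.ai/iceberg'
);

-- Query the lake directly: no copy step, no exports
SELECT topic, count(*) AS messages
FROM mesh_storage."uploads"."my_recording__diagnostics"
GROUP BY topic;
```

The same catalog endpoint and key work for Spark, Trino, or PyIceberg; each reads the Iceberg REST spec, so you configure a URI and a token rather than installing an Alloy-specific driver.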