
Getting Started

The fastest path through Mesh Storage — get data in, view it, then query it from SQL or your AI tools

Mesh Storage is the data lake at the heart of Alloy. Drop in MCAP recordings — from your laptop, a robot, or a fleet of devices — then browse, replay, and query everything from one place.

This page walks through the canonical workflow in three steps. Pick the path through each step that matches your setup.

Step 1 — Get data in

Mesh Storage accepts MCAP files via three paths: upload through the web UI, or upload from a device running either the Docker image or the standalone binary. Pick the one that matches how you record.

Web uploads land in the uploads/ folder. Device uploads (Docker or binary) land in devices/<device-id>/. Both are queryable from the same place once processed.

Once a file lands, it moves through the same lifecycle: Queued → Processing → Ready (or Failed). Replay and Inspect work immediately; you don't have to wait for processing to finish.

Step 2 — View the data

Once a file is in Mesh Storage you can browse, replay, and inspect it without leaving the browser.

Open the file browser — the Mesh Storage page lists every file in your data lake with a status badge, size, and last-modified time.
Inspect any MCAP to see topics, schemas, message counts, and time range — useful for "what's actually in this recording?"
Replay opens the 3D replay viewer with sensor data, robot poses, and other visualisable topics rendered directly in the browser.
For device folders, you also get ROS2 diagnostics and the ROS graph view to debug your robot's runtime configuration.

Step 3 — Use the data

Once a file is Ready, you can query it. Three ways, depending on where you want to work:

From the browser — SQL Workbench

The in-app SQL editor runs DuckDB in your browser against the Iceberg tables. No setup, no credentials to manage. Best for ad-hoc analysis.

SELECT topic, count(*) AS messages
FROM "uploads"."my_recording__diagnostics"
GROUP BY topic
ORDER BY messages DESC
LIMIT 20;
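If the ingested tables carry per-message metadata, you can go beyond counts. A sketch, assuming MCAP's log_time is exposed as a column — check Inspect (or DESCRIBE the table) for the actual schema before running it:

```sql
-- Time span covered by each topic.
-- Assumes a log_time column; verify column names first.
SELECT topic,
       min(log_time) AS first_message,
       max(log_time) AS last_message
FROM "uploads"."my_recording__diagnostics"
GROUP BY topic
ORDER BY first_message;
```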

From your AI tool — MCP

Alloy ships an MCP server so Claude, Cursor, Codex, Windsurf, or any MCP-aware tool can query missions, browse files, run SQL, and pull mission context into your workflow.

claude mcp add alloy --transport http https://aus.usealloy.ai/mcp
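Other MCP clients are usually configured with a JSON entry rather than a CLI command. A sketch for tools that read an mcpServers block (Cursor, for example); the exact file location and shape vary by client, so treat this as a template, not a definitive config:

```json
{
  "mcpServers": {
    "alloy": {
      "url": "https://aus.usealloy.ai/mcp"
    }
  }
}
```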

From notebooks / BI / external compute — Iceberg REST

Mesh Storage exposes the data lake as an Iceberg REST Catalog. Generate an API key from the Connect modal and point DuckDB, Spark, Trino, or PyIceberg at the endpoint — no copy step, no exports.
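From DuckDB, for example, you can attach the catalog and query it directly. A sketch assuming DuckDB's iceberg extension; the `<your-api-key>`, `<warehouse>`, and `<catalog-endpoint>` placeholders stand in for the values shown in the Connect modal:

```sql
INSTALL iceberg;
LOAD iceberg;

-- API key from the Connect modal (placeholder, not a real value)
CREATE SECRET alloy_secret (
    TYPE ICEBERG,
    TOKEN '<your-api-key>'
);

-- Attach the Iceberg REST Catalog, then query tables like any schema
ATTACH '<warehouse>' AS alloy (
    TYPE ICEBERG,
    SECRET alloy_secret,
    ENDPOINT '<catalog-endpoint>'
);

SELECT count(*) FROM alloy."uploads"."my_recording__diagnostics";
```

Spark, Trino, and PyIceberg connect the same way: point the client's REST catalog support at the endpoint with the API key as the credential.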

