[ Web UI for your robot · Closed Beta ]

Drive your robot
from a browser tab.

MakerMods is a UI that doesn't get in the way. Plug in your arm, calibrate in a few clicks, teleop in real time, record episodes, and train policies on a remote GPU. Inference runs in the cloud and streams back to the arm on your desk.

app.makermods.io
DEMO · 02:00
setup · calibration · teleop + record · train + inference
XLEROBOT · SO-101 · REBOT
supported arms
browser native
no install · USB direct
cloud GPU
training + remote inference
open
built for makers
[ Features ]

Everything you'd script. Without writing scripts.

Teleop, calibrate, record, and train, all in your browser. No install, no Python on your machine, no SSH. Just plug your arm in and go.

[ TELEOP ]

Live teleoperation

Move the leader arm and the follower mirrors it. WebSerial straight to USB. No daemons, no Python env, no SSH.
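Under the hood, Web Serial hands the page a raw byte pipe to the servo bus. A minimal sketch of the kind of frame a Feetech-style bus expects; the register address, position encoding, and baud rate here are assumptions for illustration, not MakerMods internals:

```typescript
// Sketch: frame a WRITE command for a Feetech/SCS-style servo bus.
// Frame layout (Dynamixel-1.0-compatible): 0xFF 0xFF id len instr params... checksum
// Register 42 (goal position on STS-series servos) is an assumption.
function buildWritePacket(servoId: number, register: number, data: number[]): Uint8Array {
  const params = [register, ...data];
  const length = params.length + 2;                 // instruction byte + checksum byte
  const body = [servoId, length, 0x03, ...params];  // 0x03 = WRITE instruction
  const checksum = ~body.reduce((a, b) => a + b, 0) & 0xff;
  return Uint8Array.from([0xff, 0xff, ...body, checksum]);
}

// Example: command servo 1 to position 2048 (little-endian, assumed for STS servos).
const packet = buildWritePacket(1, 42, [2048 & 0xff, 2048 >> 8]);

// In the browser this would go out over Web Serial, roughly:
//   const port = await navigator.serial.requestPort();
//   await port.open({ baudRate: 1_000_000 });  // 1 Mbps is typical for these servos
//   await port.writable.getWriter().write(packet);
```

The point of the sketch: there is no driver or daemon in the middle, just bytes from a tab to a USB serial adapter.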

[ CALIB ]

Guided calibration

Auto or manual. We drive each servo to its limits, write the calibration file, and pair leader/follower by base ID.
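A calibration file is essentially a per-servo map from raw encoder ticks to a normalized range. A sketch of what consuming those recorded limits might look like; the field names and the [-100, 100] range are assumptions, not the actual file format:

```typescript
// Sketch: map a raw servo reading to a normalized range using calibration limits.
// The recorded min/max come from driving each servo to its physical stops.
interface ServoCalibration {
  rawMin: number; // encoder ticks at one physical limit
  rawMax: number; // encoder ticks at the other limit
}

// Normalize a raw tick count to [-100, 100] (range choice is an assumption).
function normalize(raw: number, cal: ServoCalibration): number {
  const span = cal.rawMax - cal.rawMin;
  const clamped = Math.min(Math.max(raw, cal.rawMin), cal.rawMax);
  return ((clamped - cal.rawMin) / span) * 200 - 100;
}

const wrist: ServoCalibration = { rawMin: 1024, rawMax: 3072 };
normalize(2048, wrist); // midpoint of travel → 0
```

Once leader and follower share this mapping, mirroring is just: read the leader, normalize, denormalize for the follower, write.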

[ RECORD ]

Multi-camera recording

Detect every camera your machine sees. Toggle, name (front_cam, wrist_cam), and record episodes straight to a dataset.

[ TRAIN ]

Cloud GPU training

Pick a policy, point at your dataset, hit train. We spin up a GPU, stream loss curves back, drop the checkpoint in your library.

[ INFER ]

Remote inference

Run policies on a remote GPU; actions stream back to the arm with ~300 ms latency. Your laptop just holds the USB cable.

[ SDK · SOON ]

Python SDK (coming later)

A standalone Python package you'll install locally to drive your robot from code. Separate from the web app, for makers who'd rather live in a script.

[ How it works ]

From box to policy in an afternoon.

The whole flow is a single guided wizard. You'll spend more time on your dataset than on getting the rig to work.

[ FLOW · 01 → 04 ]
STEP_01 · USB
01

Plug in

Connect leader and follower arms. We auto-detect Feetech motor controllers and assign roles. Click Wiggle to confirm which is which.

STEP_02 · AUTO
02

Calibrate

Run auto-calibration: each servo drives to its physical limits and we save the file. Or do it manually if you prefer.

STEP_03 · REC
03

Teleop & record

Mirror the leader to the follower. Add cameras (wrist, front, top). Record episodes into a HuggingFace-format dataset.

STEP_04 · GPU
04

Train & deploy

Pick a policy. Train on cloud GPU. Deploy with one click. Inference runs remote, actions stream to your arm.

[ Closed beta ]

Get in early.

We're rolling out invites in waves to people actually building. Tell us what arm you have and what you're trying to do. It helps us prioritize who to onboard first.

  • Free during beta, including cloud GPU credits for training
  • Direct line to the team on Discord; fixes for your bug reports ship the same week
  • Early access to remote inference + the dataset hub
  • No spam. One email when your invite is ready.
[ JOIN_WAITLIST ]