MakerMods is a UI that doesn't get in the way. Plug in your arm, calibrate in a few clicks, teleop in real time, record episodes, and train policies on a remote GPU. Inference runs in the cloud and streams back to the arm on your desk.
Teleop, calibrate, record, and train, all in your browser. No install, no Python on your machine, no SSH. Just plug your arm in and go.
Move the leader arm and the follower mirrors it. WebSerial, straight from the browser to USB. No daemons, no Python env, no SSH.
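The core of mirroring is a per-joint remap from the leader's calibrated range into the follower's. Here's a hedged sketch in Python — the function name, the calibration shape, and tick units are illustrative, not the app's actual code:

```python
def mirror_step(leader_positions, calibration):
    """Map raw leader servo ticks to follower goal ticks.

    Each joint is normalized against the leader's calibrated range,
    then re-projected into the follower's range, so mismatched limits
    between the two arms don't produce out-of-range commands.
    """
    goals = []
    for joint, raw in leader_positions.items():
        lead = calibration["leader"][joint]
        foll = calibration["follower"][joint]
        # Normalize to 0..1 inside the leader's calibrated range.
        t = (raw - lead["min"]) / (lead["max"] - lead["min"])
        t = min(max(t, 0.0), 1.0)  # clamp readings past the stops
        goals.append(round(foll["min"] + t * (foll["max"] - foll["min"])))
    return goals
```

Run a loop like this at control rate and the follower tracks the leader even when the two arms' physical limits don't match.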
Auto or manual. We drive each servo to its limits, write the calibration file, and pair leader/follower by base ID.
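Conceptually, auto-calibration is just min/max tracking over a sweep: drive each servo to both stops, record the extremes, write them out. A minimal sketch, assuming raw tick readings per joint (the sample shape and file path are illustrative):

```python
import json

def build_calibration(sweep_readings):
    """Reduce a sweep of raw servo readings to per-joint limits.

    sweep_readings: a list of {joint_name: raw_tick} samples captured
    while each servo is driven to both physical stops.
    """
    limits = {}
    for sample in sweep_readings:
        for joint, raw in sample.items():
            lo, hi = limits.get(joint, (raw, raw))
            limits[joint] = (min(lo, raw), max(hi, raw))
    return {j: {"min": lo, "max": hi} for j, (lo, hi) in limits.items()}

# Persisting it (path is illustrative):
# with open("calibration.json", "w") as f:
#     json.dump(build_calibration(samples), f, indent=2)
```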
Detect every cam your machine sees. Toggle, name (front_cam, wrist_cam), and record episodes straight to a dataset.
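Each recorded timestep bundles the named camera frames with the robot state. A sketch of one frame record, loosely following LeRobot-style `observation.*` keys — the helper and exact fields are illustrative, not the app's schema:

```python
import time

def make_frame(joint_positions, images, episode_index, frame_index):
    """One timestep of an episode: camera frames keyed by the names
    you assigned (front_cam, wrist_cam) plus the joint state."""
    return {
        "episode_index": episode_index,
        "frame_index": frame_index,
        "timestamp": time.time(),
        "observation.state": joint_positions,
        **{f"observation.images.{name}": img for name, img in images.items()},
    }
```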
Pick a policy, point at your dataset, hit train. We spin up a GPU, stream loss curves back, drop the checkpoint in your library.
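"Pick a policy, point at your dataset" amounts to a small job spec. A hypothetical sketch of what that spec could look like — every field name and value here is an assumption, not the app's real API:

```python
# Hypothetical training-job spec; the real request shape is internal to the app.
job = {
    "policy": "act",                  # e.g. ACT or a diffusion policy
    "dataset": "you/pick-place-v1",   # illustrative dataset id
    "gpu": "A10G",                    # illustrative instance type
    "steps": 100_000,
    "checkpoint_every": 10_000,       # checkpoints land in your library
}
```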
Run policies on a remote GPU; actions stream back to the arm with round-trip latency around 300 ms. Your laptop just holds the USB cable.
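A common way to keep an arm moving smoothly across ~300 ms round trips is to stream action chunks and drain them locally at control rate. A sketch of that buffering idea — the class and its behavior are an assumption about one reasonable design, not the app's implementation:

```python
from collections import deque

class ActionBuffer:
    """Queue action chunks from a remote policy so the arm keeps
    stepping at control rate between network round trips."""

    def __init__(self):
        self._q = deque()
        self._last = None

    def push_chunk(self, actions):
        """Append a chunk of future actions from the server."""
        self._q.extend(actions)

    def next_action(self):
        """Pop the next action; hold the last one if the queue runs dry."""
        if self._q:
            self._last = self._q.popleft()
        return self._last
```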
A standalone Python package you'll install locally to drive your robot from code. Separate from the web app, for makers who'd rather live in a script.
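For makers in a script, driving a Feetech servo comes down to framing bytes on a serial port. A sketch of the Dynamixel-1.0-style framing these servos use (0xFF 0xFF header, one's-complement checksum); the register address and byte order vary by servo series, so treat those as assumptions, and this is not the package's actual API:

```python
def scs_write_packet(servo_id, address, value):
    """Frame a Feetech-style write instruction:
    0xFF 0xFF | id | length | instr | addr | data... | checksum.

    Byte order of the 16-bit value differs between SCS and STS
    series; little-endian here is an assumption.
    """
    params = [address, value & 0xFF, (value >> 8) & 0xFF]
    body = [servo_id, len(params) + 2, 0x03] + params  # 0x03 = WRITE
    checksum = (~sum(body)) & 0xFF                      # one's complement
    return bytes([0xFF, 0xFF] + body + [checksum])

# Sending it with pyserial (port name illustrative):
# import serial
# with serial.Serial("/dev/ttyUSB0", 1_000_000) as port:
#     port.write(scs_write_packet(1, 42, 2048))
```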
The whole flow is a single guided wizard. You'll spend more time on your dataset than on getting the rig to work.
Connect leader and follower arms. We auto-detect Feetech motor controllers and assign roles. Click Wiggle to confirm which is which.
Run auto-calibration: each servo drives to its physical limits and we save the file. Or do it manually if you prefer.
Mirror the leader to the follower. Add cameras (wrist, front, top). Record episodes into a HuggingFace-format dataset.
Pick a policy. Train on cloud GPU. Deploy with one click. Inference runs remote, actions stream to your arm.
We're rolling out invites in waves to people actually building. Tell us what arm you have and what you're trying to do. It helps us prioritize who to onboard first.