Welcome aboard, future roboticist. This is your manual for Maqueen Lab — the live cockpit for your DFRobot Maqueen Lite v4 robot. No prior experience needed. Just curiosity and a 🤖.
Four mini-demos embedded in the page. Click them. They're real, just not connected to a robot.
From "what's a robot?" to driving it across the kitchen floor — fastest path.
ON. The micro:bit screen wakes up. Green light = healthy.
Before we drive, let's tour the chassis. The Maqueen Lite v4 has more sensors than a smartphone has cameras.
micro:bit v2
5×5 LED screen, two buttons (A & B), accelerometer, magnetometer, microphone, speaker, and Bluetooth Low Energy radio.
TB6612FNG driver via I²C 0x10
Two DC motors, each independently controllable from -255 (full reverse) to +255 (full forward). PWM speed control.
SR04 ultrasonic on P1·P2
Sends a 40 kHz chirp and times the echo. Range 2 cm to ~4 m. Works in fog and darkness where cameras fail, and detects glass and clear plastic head-on (sound bounces where light passes through). Its blind spots: soft, sound-absorbing material like fabric and foam, and hard surfaces angled away from the beam.
Line sensors on P13 & P14
Infrared reflectance — black tape absorbs, white paper reflects. Each sensor returns 0 or 1. Two sensors = a sense of which side of a line you're on.
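One bit of math from the tour worth seeing: the ultrasonic sensor turns an echo round-trip time into distance with a single formula, distance = time × speed of sound ÷ 2. A sketch (the function name and constant are ours, not the firmware's):

```javascript
// Speed of sound at room temperature, in cm per microsecond.
const SOUND_CM_PER_US = 0.0343;

// The SR04 reports how long the 40 kHz chirp took to come back.
// The sound travels out AND back, so halve the round trip.
function echoToCm(echoMicroseconds) {
  return (echoMicroseconds * SOUND_CM_PER_US) / 2;
}
```

A 1166 µs echo works out to about 20 cm — handy for sanity-checking the DIST readout.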
When the app loads you land on Maqueen → Drive. Big keypad in the middle, gauges below, sensors on the right. Here's the layout.
Logo, language flags 🇬🇧 🇫🇷 🇩🇿, theme picker (six themes!), the 📱 QR-share button, the 🤝 robot-pair button, and the 🟢 connection pill that's always visible.
Maqueen = the cockpit (default). Playground = a generic micro:bit tinker space. Inside Maqueen, 8 sub-tabs split by component.
Floats on the right at all times: robot identity card, CONNECT controls, live sensor strip, and message log. Drag its left edge to resize. Click the chevron to collapse.
The bridge between your screen and the robot is Web Bluetooth. No app install. No driver. Just a permission prompt and away you go.
Right inside the CONNECT card you'll see this 6-stage diagram light up as bytes travel:
M:200,200 for "drive both motors at speed 200". The robot replies ECHO:42 M:200,200 ("got it, I'm executing"). Round trip is around 30 ms — a small fraction of an eye blink.
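Because the protocol is plain text, you can build and parse it yourself. A minimal sketch (helper names are ours; the verb format follows the examples above):

```javascript
// Build a motor verb: left and right speeds, each clamped to -255..255.
function driveCommand(left, right) {
  const clamp = (v) => Math.max(-255, Math.min(255, Math.round(v)));
  return `M:${clamp(left)},${clamp(right)}`;
}

// Parse the robot's acknowledgement, e.g. "ECHO:42 M:200,200".
// Returns the sequence number and the echoed command, or null.
function parseEcho(line) {
  const m = line.match(/^ECHO:(\d+)\s+(.*)$/);
  return m ? { seq: Number(m[1]), command: m[2] } : null;
}
```

Round-tripping a command through these two helpers is a quick way to check your message-log experiments before touching real hardware.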
Inside the CONNECT card, next to the Firmware button, there's a streams: OFF / ON toggle.
Turn it ON when you want pretty graphs. Turn it OFF for max responsiveness when driving.
The headline feature. Three ways to control the wheels, plus a bunch of automation.
W forward · S back · A spin left · D spin right · Space stop. Tap the on-screen buttons if you don't have a keyboard.
Drag the orange dot in the circle. Distance = speed. Direction = motion vector. Release = stop. Best on touchscreens.
On phones, click the 📱 tilt button in the macro bar and tilt the phone like a steering wheel. More details below.
The big horizontal slider sets the maximum speed (50 = crawl, 255 = sprint). It scales every drive command — so if you're set to 100 and tap forward, the robot uses speed 100, not 200.
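In other words, the slider is a ceiling, not an offset. A sketch of how a drive intent could be mixed under that cap (helper name and mixing scheme are ours, not the app's source):

```javascript
// Convert a drive intent (throttle -1..1, turn -1..1) plus the
// slider cap into left/right motor commands.
function mixDrive(throttle, turn, sliderMax) {
  const clamp = (v) => Math.max(-1, Math.min(1, v));
  const left = clamp(throttle + turn) * sliderMax;
  const right = clamp(throttle - turn) * sliderMax;
  return [Math.round(left), Math.round(right)];
}
```

With the slider at 100, full forward gives [100, 100] and a full spin right gives [100, -100] — nothing ever exceeds the cap.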
Hold-to-drive is safer for kids — the robot stops the instant you let go.
Four chip buttons above the speed row that bundle drive feel + sweep speed + obstacle margin:
| Character | Speed | Obstacle margin | Sweep | Vibe |
|---|---|---|---|---|
| 🏃 Speedy | 240 | 15 cm | full, fast | Brave and fast |
| 🐢 Cautious | 110 | 45 cm | narrow front cone, slow | Stops a long way before any wall |
| 🔍 Curious | 180 | 25 cm | full, medium | Balanced explorer |
| 😴 Lazy | 90 | 55 cm | narrow, very slow | Chill mode — for tight spaces |
One tap = full character swap. Choice persists between sessions.
Click the green button — the robot drives forward on its own, watches the sonar, and turns away from anything closer than the margin set on the obstacle ≤ slider. Touch any drive control to take back over.
if (something close) then turn. Yet it's astonishing how alive it feels. Welcome to roboticist's secret #1: simple rules → emergent behavior.
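That one rule, written out as a single decision tick (a sketch; the threshold comes from the obstacle margin and the names are ours):

```javascript
// One tick of the wander brain: given the latest sonar reading (cm)
// and the obstacle margin (cm), decide what the wheels do next.
function wanderStep(distanceCm, marginCm, cruiseSpeed) {
  if (distanceCm <= marginCm) {
    // Something too close: spin in place away from it.
    return { left: cruiseSpeed, right: -cruiseSpeed };
  }
  // Clear ahead: keep cruising.
  return { left: cruiseSpeed, right: cruiseSpeed };
}
```

Run it a few times a second against live sonar readings and the "alive" wandering behavior falls out of those four lines.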
Tap 🔴 Record, drive a sequence, tap again to stop. Then ▶ Replay plays it back at original timing — useful for choreography demos. Saved between reloads.
Both pulse when active so the mic / motion sensor state is always visible. Voice details · Tilt details.
Three quick launchers below the macro bar. All games explained →
A pop-quiz card: random speed × time, predict the cm. Inside the Games section.
Below the controls sits an automotive-style cluster. Four gauges + an LCD trip computer + warning lights.
Cyan needle. Reads cm/s, computed from the wheel velocities you commanded × a calibration constant. Red tick = your peak speed this session.
Orange needle, 0–100 %. Average of |left| and |right| motor commands. Spinning in place still uses power even though SPEED is 0.
North/South/East/West compass. Integrated from wheel velocities (no real magnetometer reading by default). Resets on path-reset.
Inverted scale — 0 cm pegs the needle right (alarm!), 100+ cm pegs left (chill). Color shifts with proximity.
| Readout | Meaning |
|---|---|
| ODO | Total distance ever driven this session, in meters |
| TRIP | Distance since last ↺ reset |
| PEAK | Top cm/s reached this session |
| AVG | Average cm/s while moving (excludes idle) |
| ⏱ | Total drive time, mm:ss |
The robot draws its own breadcrumb trail in the world — and projects sonar pings into a live obstacle map. Real Simultaneous Localization And Mapping, kid-sized.
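Under the hood the breadcrumb trail is differential-drive dead reckoning: integrate the two commanded wheel speeds into an (x, y, heading) pose. A simplified integration step (function name and units are ours, not the app's calibration):

```javascript
// One odometry update for a differential-drive robot.
// vLeft/vRight in cm/s, wheelBase in cm, dt in seconds.
// pose is { x, y, theta } with theta in radians.
function stepPose(pose, vLeft, vRight, wheelBase, dt) {
  const v = (vLeft + vRight) / 2;             // forward speed
  const omega = (vRight - vLeft) / wheelBase; // turn rate, rad/s
  return {
    x: pose.x + v * Math.cos(pose.theta) * dt,
    y: pose.y + v * Math.sin(pose.theta) * dt,
    theta: pose.theta + omega * dt,
  };
}
```

Equal wheel speeds move the pose straight ahead; opposite speeds rotate it in place — which is exactly why spinning shows on the POWER gauge but not the SPEED gauge.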
Pick a target shape from the dropdown — Square 40 cm, Circle r=30 cm, Figure-8. The app draws a dashed outline; your job is to drive along it. SCORE updates live as your trail covers the target. Best score per shape persists.
A 60-second timed run. Score = unique 10×10 cm cells your trail visits. The room you've "mapped" appears in the cells display. More on this game →
The Maqueen has two servo ports (S1 & S2). Plug in any standard hobby servo — robot arms, ultrasonic gimbals, claw grippers.
Toggle the SWEEP button — the servo bounces back and forth automatically. Two sliders control the show:
Five preset chips: Full (0–180), Front (60–120), Left (90–180), Right (0–90), Wide (30–150). Sweep is essential for the radar — it's how the sonar "looks around".
A target servo angle pops up. Drag the purple slider to match it. Hit Lock answer — score = 100 − |target − guess|. If the robot's connected, your guess physically moves S1 so you feel the answer. Best score persists.
Two yellow LEDs, one on each side of the robot's "face". Mostly used as turn indicators or status flashers.
Use case: pair with auto-wander to flash a "yes I see you, turning now" signal when the robot detects an obstacle. Or set them as eyes that pulse to the buzzer's beat.
The BLE verb is LED:i,s. i = 0 (left) or 1 (right), s = 0 (off) or 1 (on). Try sending LED:0,1 from the message log!
Four addressable RGB pearls along the front of the robot. Each can be any of 16.7 million colors. Independently.
Click any of the four pearls in the diagram, then pick a color from the wheel. The change pushes to the robot in < 50 ms.
4 colored pads (red/green/blue/yellow), each with a unique tone. App plays a growing sequence; you tap them back in order. One wrong = game over, score = sequence length. Inside Games →
Important: the standard NeoPixel driver conflicts with Bluetooth on the micro:bit because both want the same hardware timer. Our default firmware shows your intent in the screen pearls but doesn't fire the real on-bot LEDs unless you flash a BLE-safe NeoPixel fork. The screen mockup is faithful — what you'd see in real life if the timing worked.
A piezo buzzer that can play any single note in the audible range (about 20 Hz to 20 kHz). One note at a time.
Eight buttons mapped to the C-major scale: C D E F G A B C. Each button shows the frequency. Tap them as a piano.
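Those frequencies come from equal temperament: every semitone multiplies the pitch by the 12th root of 2, anchored at A4 = 440 Hz. A sketch for computing any of them from MIDI note numbers (function name is ours):

```javascript
// Equal-temperament pitch: A4 is MIDI note 69 at 440 Hz, and each
// semitone up multiplies the frequency by 2^(1/12).
function midiToHz(midi) {
  return 440 * Math.pow(2, (midi - 69) / 12);
}
```

C5 (MIDI 72) comes out to ≈ 523.25 Hz, which is the note at the low end of the piano strip below.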
An 8-key piano strip (C5 → C6). Tap to play notes on both screen and robot. Hit 🎼 Twinkle demo to hear Twinkle Twinkle Little Star and discover the concept "a tune is a sequence of (pitch, duration) pairs". Inside Games →
The robot transmits a hidden 3–6 letter word as Morse code. You type what you heard. Score = exact matches. Inside Games →
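If you want to practice decoding offline, the encoding itself is just a lookup table. A sketch with a partial alphabet (enough for short words; names are ours):

```javascript
// International Morse for a handful of letters.
const MORSE = {
  a: ".-", b: "-...", c: "-.-.", d: "-..", e: ".", m: "--",
  o: "---", r: ".-.", s: "...", t: "-",
};

// Encode a word, one space between letters; unknown letters become "?".
function toMorse(word) {
  return [...word.toLowerCase()].map((ch) => MORSE[ch] ?? "?").join(" ");
}
```

Encoding a word yourself and tapping it out by ear is the reverse of what the game asks — good training for both directions.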
The ultrasonic sensor is the robot's best friend. This panel has FIVE different ways to visualize what it sees.
The headline number on top. Updates a few times per second. Click poll DIST in the rail to start auto-polling.
| Style | Best for |
|---|---|
| 🦇 Bat | Default. Big single-distance number, like an echolocator's perception. |
| 🚢 Sonar | Submarine vibe — slow circular sweep with persistent contacts. |
| 📐 LiDAR | Engineering view — precise lines and grids. |
| 🔥 Heat | Heat-map gradient. Looks like an infrared camera. |
| 🌀 Sweep | Classic radar. Rotating beam, fading dots. Five marker shapes: |
Each radar style has its own audio cue when something is detected. 🔊 audio turns it on:
Two infrared eyes on the robot's underside detect dark vs. light. Black tape on white paper = perfect.
Two big dots — one for the left sensor, one for the right. Light up when they see white, dark when they see black tape.
Toggles a basic line follower. Logic: L sees line + R sees line → forward. L only (you've drifted right of the line) → spin left. R only → spin right. Both lost → stop. Speed is the Drive slider.
Fine-tune with the tick slider — how often the follower re-evaluates (100 ms = jittery responsive, 1000 ms = smooth slow).
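One re-evaluation tick of a two-sensor follower, written out as a decision table (a sketch using the common steer-toward-the-side-that-still-sees-the-line convention; names are ours):

```javascript
// One tick of the follower. seesL/seesR: true when that sensor
// is over the dark tape.
function followStep(seesL, seesR, speed) {
  if (seesL && seesR) return { left: speed, right: speed };  // centered
  if (seesL) return { left: -speed, right: speed };          // steer left
  if (seesR) return { left: speed, right: -speed };          // steer right
  return { left: 0, right: 0 };                              // line lost
}
```

Calling this every tick-slider interval is the whole algorithm — a short interval reacts fast but jitters, a long one glides but overshoots curves.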
Stopwatch. Print/draw a black-line track. Click ▶ Start when the robot crosses the start line, 🏁 Lap when it finishes. Multi-lap one-handed; best lap persists. Inside Games →
Plug in an IR receiver module on the appropriate pin (NEC protocol). Press buttons on a TV remote — the codes appear here.
Map IR codes to drive commands — turn any TV remote into a robot remote. Add an event listener in the message log: when IR:nnn arrives, fire the matching M: verb.
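Such a mapping is one small lookup table. A sketch (the NEC codes below are hypothetical placeholders — read your remote's real codes off this panel first, then fill them in):

```javascript
// Hypothetical IR code → drive verb table. Replace the keys with
// the codes your own remote actually produces.
const IR_MAP = {
  16718055: "M:150,150",   // example "up" button → forward
  16730805: "M:-150,-150", // example "down" button → reverse
};

// Turn an incoming message-log line like "IR:16718055" into a
// drive command, or null if it's unknown or not an IR line.
function irToCommand(line) {
  const m = line.match(/^IR:(\d+)$/);
  return m ? IR_MAP[Number(m[1])] ?? null : null;
}
```

Anything not in the table safely maps to null, so stray remote buttons can't send the robot anywhere.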
Hands-free driving. Click the 🎙 voice button in the macro bar and shout commands at your phone.
| Phrase (any language) | What happens |
|---|---|
| "forward" / "go" / "avance" / "تقدم" | Drive ↑ |
| "back" / "reverse" / "recule" / "إرجع" | Drive ↓ |
| "left" / "spin left" / "gauche" / "يسار" | Spin ↺ |
| "right" / "spin right" / "droite" / "يمين" | Spin ↻ |
| "stop" / "arrête" / "قف" | STOP |
| "faster" / "plus vite" / "أسرع" | +30 to speed slider |
| "slower" / "moins vite" / "أبطأ" | −30 to speed slider |
Browser support: Chrome and Edge only. Firefox and Safari don't support the Web Speech API. The button greys out where unsupported.
Privacy: audio is streamed to the browser's speech service (Google for Chrome). Don't use voice control where you'd care about that being uploaded.
Steering wheel mode. Hold your phone like a Wii steering wheel. Tilt forward → robot goes forward. Tilt right → robot turns right. Wii era, reborn.
Defaults are tuned for a kid holding a phone naturally. The dead-zone is 4° around the calibrated center, so micro-shake doesn't make the robot twitch.
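A sketch of that mapping with the dead-zone applied (angles in degrees from the calibrated center; the full-tilt gain of 30° is illustrative, not the app's tuned value):

```javascript
// Map phone pitch/roll (degrees from the calibrated center) to
// throttle/turn in -1..1, ignoring wobble inside the dead-zone.
function tiltToDrive(pitchDeg, rollDeg, deadZoneDeg = 4, fullTiltDeg = 30) {
  const shape = (deg) => {
    if (Math.abs(deg) < deadZoneDeg) return 0;        // ignore micro-shake
    return Math.max(-1, Math.min(1, deg / fullTiltDeg)); // saturate at full tilt
  };
  return { throttle: shape(pitchDeg), turn: shape(rollDeg) };
}
```

Inside ±4° the output is exactly zero, which is why a kid's natural hand wobble doesn't make the robot twitch.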
Finger-paint a route on the SLAM map. The robot drives it.
Drift accumulates! Long curvy paths might end up several centimeters off because we don't correct using sensor data. That's the same problem real robots have — and it's why GPS / lidar / vision matter.
Two buttons next to the Message Log "Export":
CSV opens directly in Google Sheets / Excel. Plot column B (x_cm) vs C (y_cm) and you've got your robot's path in 30 seconds. School-assignment gold.
Click 🍩 recap on the path card. A modal pops up with a Spotify-Wrapped-style narrative — total distance, top speed, obstacles spotted, final pose, suggested next tab.
Optional ✨ AI polish button rewrites the recap via the Claude API for extra flair. Needs a Claude API key in localStorage.maqueen.claudeKey (we don't ship one — get yours at console.anthropic.com).
Click 🎬 on the path card. A 6-second WebM video of your trail unrolling and SLAM hits popping in is generated and downloaded.
Drive a recognizable shape (your name? a heart?) and the time-lapse becomes a shareable moment. Etsy listing material that sells itself.
Best on a phone. Click 📷 AR on the path card. Phone asks for camera permission. Allow. Suddenly your phone screen shows the room with sensor data overlaid.
Geometry caveat: we don't know the camera's field-of-view in cm, so the ring is schematic — a feel, not a measurement. The HUD numbers are truthful — they come straight from BLE.
Two Maqueens, two browser tabs, one mirrors the other. Pure peer-to-peer over WebRTC — no server.
Once linked, every drive command on Tab A mirrors to Tab B's robot in real time. "Look mom, they're dancing together!"
Each game lives in the matching sub-tab. They're not gimmicks — every game teaches a real CS / engineering concept.
| Tab | Game |
|---|---|
| Path | Drive around for 60 s. Score = unique 10×10 cm cells visited. |
| Drive · Mini-Games | Hidden virtual treasure. Ping rate accelerates as you close in. |
| Drive · Mini-Games | Bundles slow-speed + tight-margin auto-wander. Build cardboard mazes! |
| Buzzer | 8-key piano strip. Tap → buzzer plays. Twinkle Twinkle demo included. |
| NeoPixels | 4 colored pads, growing color sequence. One miss = game over. |
| Drive | Predict d = v × t. Hit "Run it" to verify on real hardware. |
| Drive · Mini-Games | Two-robot scoreboard. Plus the Push kit makes a real bulldozer. |
| Line | Stopwatch with multi-lap. Print a track, time the laps. |
| Servos | "Hit 67° EXACTLY." Slider drag, score by accuracy. |
| Buzzer | Buzzer transmits a hidden word in Morse. You type what you heard. |
The whole app is touch-friendly and reflows automatically below 1100 px viewport.
Click the 📱 button in the header. A QR code appears with the live URL. Scan with your phone camera.
Warm maker-bench amber.
Calm classroom blue.
Soft sage learning.
Treehouse green (darker).
Galaxy purple. The first-run vibe.
Deep navy for low light.
Click any flag-icon in the header to switch. Choice persists.
🇬🇧 English · 🇫🇷 Français · 🇩🇿 العربية (with full RTL layout flip)
You're in Safari or Firefox. Switch to Chrome or Edge. Apple disabled Web Bluetooth on Safari/iOS — there's no workaround.
On Linux, Web Bluetooth also needs BlueZ ≥ 5.50.
Battery is low. The micro:bit's BLE radio is power-hungry. Swap in fresh batteries or plug in USB.
Check the streams toggle didn't go OFF unexpectedly. Reload the page. If still dead, re-flash firmware (the 🧪 Firmware button in CONNECT card opens MakeCode).
Browser doesn't support Web Speech. Use Chrome / Edge.
iOS gates motion sensors behind explicit permission. The button click triggers the prompt — make sure you tap "Allow" when it appears.
All the words. Skip what you know. Bookmark what you don't.
| Term | Meaning |
|---|---|
| BLE | Bluetooth Low Energy. The radio protocol your robot uses. Same family as your fitness tracker. |
| GATT | Generic Attribute Profile. The "language" BLE devices speak. Defines services and characteristics. |
| UART | Universal Asynchronous Receiver/Transmitter. The oldest serial protocol. Over BLE, we use the Nordic UART Service to send text lines. |
| I²C | Inter-Integrated Circuit. A 2-wire bus. The Maqueen's motor driver lives at I²C address 0x10. |
| PWM | Pulse-Width Modulation. Faking analog output by toggling a digital pin really fast. |
| SLAM | Simultaneous Localization And Mapping. Building a map of where you are while figuring out where you are. Hard problem. |
| Odometry | Tracking position by integrating wheel motion. Cheap, but drifts over time. |
| Dead reckoning | Same idea, classic name (sailors used it before GPS). |
| Differential drive | Two wheels independently controlled. The Maqueen kinematic model. |
| WebRTC | Web Real-Time Communication. Peer-to-peer browser-to-browser. We use it for 2-robot pairing. |
| SDP | Session Description Protocol. The blob you copy-paste during pairing. |
| NeoPixel | WS2812B addressable LED. Each one is a tiny computer that holds its own color. |
| SR04 | The classic ultrasonic distance sensor. 40 kHz chirp, time-of-flight measurement. |
Maqueen Lab covers a real CS / engineering curriculum disguised as play: