A smart coding car for micro:bit that works both as a standalone toy and a powerful classroom teaching aid. Featuring preset line-tracking and obstacle-avoidance modes, RGB LED headlights, servo expansion ports, and full LEGO brick compatibility — students move from play to programming in minutes.
Product Overview
The TPBot Car Kit bridges the gap between play and programming. Out of the box it drives, tracks lines, and avoids obstacles on its own — no micro:bit needed. Connect a micro:bit and 14 structured projects unlock the full potential of block-based and Python coding.
Why It Works
Every design decision — from the servo expansion ports to the LEGO-compatible shell — removes friction so students spend more time learning and less time troubleshooting.
Product Gallery
From the compact chassis to classroom deployments — the TPBot is built to inspire from the moment it's switched on.
Key Hardware
A fully integrated hardware platform — from autonomous sensors to LEGO-compatible chassis — gives students a complete smart car to explore and extend.
Complete Package
Everything needed to start line tracking, obstacle avoidance, and programming right away. micro:bit sold separately.
Technical Specifications
Detailed specifications for the TPBot Car Kit (SKU: EF08230).
| Specification | Detail |
| --- | --- |
| SKU | EF08230 |
| Controller | micro:bit (not included) |
| Drive System | Dual DC Motors (differential drive) |
| Line Sensor | Onboard line-tracking sensor |
| Distance Sensor | Onboard ultrasonic sensor |
| Lighting | RGB LED headlights (full-color) |
| Expansion | 2× servo connectors |
| Chassis | LEGO brick compatible |
| Battery | 4× AA (included) |
| Standalone Mode | Line-tracking + obstacle-avoidance |
| Standby Indicator | Rainbow breathing LEDs |
| Obstacle Alert | Red headlights activate |
| Programming | Microsoft MakeCode / MicroPython |
| Curriculum Cases | 14 structured projects |
| Target Grades | Grades 3–8 |
Curriculum Cases
Each case introduces a new concept, builds on previous skills, and ends with a working program students run on their own TPBot.
Students learn to control the movement of the TPBot via programming. Starting from setting wheel speeds for continuous forward motion, they progress to timed movements and button-triggered control — building the foundational skills all future cases depend on.
Students program the color of the LED headlights using micro:bit buttons. They build timed light sequences, explore conditional logic for hardware control, and learn how vehicles use lighting systems for signaling and safety.
Students program the TPBot to drive along a black line on the included map. They learn how line-tracking sensors detect contrast, implement conditional steering corrections, and explore how autonomous vehicles and warehouse robots use line-following navigation.
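The conditional steering correction at the heart of this case can be sketched as a small decision function. This is an illustrative Python sketch only, assuming two line sensors that straddle the line and read True over black; the function name, speed values, and the "stop on double black" rule are hypothetical, not the kit's actual API.

```python
def steer(left_on_black, right_on_black, base=40, turn=15):
    """Return (left_speed, right_speed) for a two-sensor line follower.

    Assumes the sensors straddle the line: both read white when the
    car is centered, and one reads black when the car drifts.
    """
    if left_on_black and right_on_black:
        return (0, 0)                 # both on black: crossing/stop bar
    if left_on_black:
        return (base - turn, base)    # line drifted left: slow left wheel
    if right_on_black:
        return (base, base - turn)    # line drifted right: slow right wheel
    return (base, base)               # centered: drive straight
```

Students call this in a loop with fresh sensor readings, so each small drift triggers an immediate correction.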
Students program the TPBot to detect and navigate around obstacles automatically. The car monitors ultrasonic distance readings, stops with visual feedback when an obstacle is within 15 cm, then reverses and turns — mirroring a core capability of real self-driving vehicles.
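The stop-reverse-turn sequence reduces to one threshold check per ultrasonic reading. Below is a minimal Python sketch of that decision logic; the function name and action labels are hypothetical, and the 15 cm threshold comes from the case description above.

```python
def avoid_step(distance_cm, threshold_cm=15):
    """Choose the next actions from one ultrasonic distance reading.

    Within the threshold: stop, show red lights, reverse, and turn.
    Otherwise: keep driving forward.
    """
    if distance_cm < threshold_cm:
        return ["stop", "lights_red", "reverse", "turn"]
    return ["forward"]
```

Run in a loop, each reading either continues the drive or triggers the full avoidance maneuver.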
Students program the TPBot to turn on its headlights automatically in darkness. They develop programs that respond to environmental light conditions using conditional logic — exploring how real vehicles use ambient light sensors for intelligent, automated lighting systems.
Students program the TPBot to drive at random by assigning randomized speed values to each wheel motor. The car exhibits unpredictable movement patterns — introducing the concept of randomization in programming and how it can create emergent, lifelike behavior in robots.
Students program the TPBot to simulate a police car — driving forward while alternating headlights between red and blue with timed delays. The project connects programming to real-world context and builds skills in coordinating motion, lighting, and timing logic simultaneously.
Students program the TPBot to detect a designated stopping location and halt automatically using its line-tracking sensors. The case connects to real-world automated parking systems and reinforces the use of sensor feedback to trigger precise, conditional robot actions.
Students program the TPBot to head toward a light source — detecting light intensity and adjusting motor behavior accordingly. The car drives forward toward the light, or moves in a search pattern when no light is found, introducing sensor-guided navigation concepts.
Students place black tape along table edges and program the TPBot to detect it, reverse for one second, then turn and continue forward — preventing falls. The case teaches a practical safety application of sensor-driven conditional logic in robotics design.
Students program the TPBot to follow another car while maintaining a set distance. The car continuously monitors distance and applies three-state decision logic — stop, move forward, or reverse — to maintain the target gap, mirroring real adaptive cruise control technology.
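The three-state decision logic can be sketched as a single function. This is an illustrative Python sketch: the target gap, dead band, and function name are hypothetical values chosen for the example, not the curriculum's exact numbers.

```python
def follow_action(distance_cm, target_cm=20, band_cm=3):
    """Three-state follower: forward, reverse, or stop.

    A small dead band around the target gap keeps the car from
    oscillating between forward and reverse on noisy readings.
    """
    if distance_cm > target_cm + band_cm:
        return "forward"    # fallen behind: close the gap
    if distance_cm < target_cm - band_cm:
        return "reverse"    # too close: back off
    return "stop"           # inside the band: hold position
```

The dead band is the key design choice — without it, sensor jitter near the target distance would make the car twitch constantly.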
Students create a TPBot that adjusts its driving speed according to ambient sound levels using micro:bit's built-in microphone. Louder environments produce faster movement; quieter environments slow the car — connecting real-time environmental sensing to dynamic motor control.
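The sound-to-speed link is a linear mapping from the microphone's reading to a motor speed. A minimal Python sketch of that scaling, assuming a 0–255 sound level (the micro:bit's sound-level range) and a hypothetical 20–100 speed range:

```python
def sound_to_speed(level, min_speed=20, max_speed=100):
    """Map a sound level (0-255) linearly onto a motor speed.

    Quiet rooms give min_speed, the loudest reading gives max_speed;
    out-of-range readings are clamped first.
    """
    level = max(0, min(255, level))
    return min_speed + (max_speed - min_speed) * level // 255
```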
Students combine two previously learned behaviors — line-following and obstacle detection — into a single integrated program. The TPBot drives along a black line and stops automatically when an obstacle is detected, demonstrating multi-sensor autonomous decision-making.
Students create a TPBot alarm system that triggers audio and visual alerts when the car is picked up. Using micro:bit's built-in accelerometer to detect being lifted, the system flashes lights and sounds a buzzer — then deactivates and displays an icon when set back down.
Attach the Smart AI Lens to transform the TPBot into a vision-powered autonomous robot. Students program card recognition, color detection, ball-tracking, and face-following — exploring the fundamentals of artificial intelligence and machine vision through hands-on coding.
AI Lens Cases
Students use the AI Lens to guide the TPBot according to road indicator cards. The car reads visual traffic signals and executes directional commands — forward, left, right, or stop — introducing real-world computer vision and autonomous navigation concepts.
Students use the Smart AI Lens to detect colors and display the matched color on the TPBot's RGB headlights. By recognizing blue, red, green, and yellow, the car becomes a visual feedback system that bridges color science with robotics programming.
Students build a ball-tracking car by programming the TPBot to identify a ball's position using the AI Lens and steer toward it. The car uses X/Y coordinate data from the lens to make real-time directional decisions — demonstrating object-tracking AI in action.
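The real-time directional decision can be sketched from the ball's X coordinate alone. This Python sketch is illustrative only — the frame width, dead zone, and speed values are hypothetical, not the AI Lens's actual resolution or API.

```python
def track_ball(x, frame_width=224, dead_zone=20, base=35, turn=15):
    """Steer toward a ball from its X position in the lens frame.

    Returns (left_speed, right_speed): turn toward whichever side the
    ball sits on, drive straight inside the central dead zone.
    """
    center = frame_width // 2
    if x < center - dead_zone:
        return (base - turn, base)   # ball left of center: turn left
    if x > center + dead_zone:
        return (base, base - turn)   # ball right of center: turn right
    return (base, base)              # roughly centered: drive straight
```

The dead zone stops the car from hunting left and right when the ball is already near the middle of the frame.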
Students build a face-tracking TPBot using the Smart AI Lens for facial recognition. The car continuously monitors detected face coordinates and adjusts motor speeds to keep the face centered — exploring real-time AI-driven autonomy used in cameras, drones, and robots.
Extend the TPBot with three progressive remote control methods — from button-based radio commands to motion-sensing accelerometer control and precision joystick steering. Students explore wireless communication, sensor input, and real-time motor control across three hands-on projects.
Remote Control Cases
Students program a two-micro:bit wireless control system — one as a remote transmitter, one as the TPBot receiver. Button presses send radio signals that trigger motor commands, building a complete wireless robotics controller from first principles.
Students program the TPBot to be steered by tilting a micro:bit — the built-in accelerometer measures tilt angles and transmits them wirelessly to the car. Tilting the controller forward, back, and sideways translates directly into differential wheel speed commands.
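The tilt-to-wheel translation mixes the two tilt axes into differential wheel speeds. A minimal Python sketch of one common mixing scheme, assuming the accelerometer readings have already been scaled to roughly -100..100 per axis (the raw micro:bit readings are in milli-g and would need scaling first); the function name is hypothetical.

```python
def tilt_to_wheels(pitch, roll, limit=100):
    """Mix scaled tilt values into (left, right) wheel speeds.

    Pitch sets the shared forward/backward speed; roll adds to one
    wheel and subtracts from the other to steer. Results are clamped
    to the motor's valid range.
    """
    def clamp(v):
        return max(-limit, min(limit, v))
    return (clamp(pitch + roll), clamp(pitch - roll))
```

Tilting straight forward gives equal speeds; any sideways tilt skews the pair, which is exactly what differential drive needs to turn.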
Students use a dedicated Joystick:bit controller to pilot the TPBot wirelessly. The joystick's X/Y values map to directional commands — forward, reverse, left, right, and stop — giving students a precise, intuitive control interface and introducing game-controller-style input processing.
Expand the TPBot with a sensor-rich accessories pack for four engaging interactive projects. Students explore potentiometer speed control, LED light programming, gesture-based navigation, and color-driven behavior — combining multiple sensors and outputs in progressively complex programs.
Interactive Coding Cases
Students connect a potentiometer to the TPBot and program it to dynamically adjust driving speed based on the knob position. Sensor readings (0–1023) are mapped to motor speed values (0–100), introducing analog input processing and real-time variable control in robotics.
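The 0–1023 to 0–100 mapping described above is a single integer scaling step. A minimal Python sketch (the function name is hypothetical; the ranges come from the case description):

```python
def knob_to_speed(reading):
    """Map a 10-bit analog reading (0-1023) to a motor speed (0-100)."""
    reading = max(0, min(1023, reading))  # clamp to the ADC range
    return reading * 100 // 1023
```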
Students connect a rainbow LED strip to port 1 and program the TPBot to simulate police car lights — alternating red and blue patterns (500 ms intervals) triggered by button A while the car drives forward, with button B switching lights off.
Students connect a gesture sensor and program the TPBot to respond to hand movements — driving forward, backward, and turning left or right based on detected gestures. The case introduces touchless human-machine interaction with physical robot control.
Students use a color sensor to drive TPBot behavior — the rainbow LED changes color to match detected cards, each triggering a defined function: move forward, randomize headlight colors, avoid obstacles, or follow a line. Multiple sensors are unified into one color-driven decision system.
Learning Outcomes
The TPBot Car Kit curriculum delivers measurable competencies across programming, engineering, physics, and mathematics — aligned to STEAM learning frameworks for Grades 3–8.
Bring the TPBot Car Kit into your classroom — built for K–8 STEAM programs, robotics clubs, and curriculum-aligned coding courses.