What’s provided
Fully assembled model vehicle
Teams selected for the competition will receive a 1:10 scale model vehicle at (or around) the kick-off.
The provided vehicle comes fully assembled and equipped with essential components necessary for the development of core functionalities, including:
Chassis for structural support
Brushless DC motor with Electronic Speed Control (ESC) for vehicle speed control
Servo for steering control
Inertial Measurement Unit (IMU) for orientation and movement feedback
Control unit board (uC - STM32) for low-level interaction with the car
Brain unit board (SBC - Raspberry Pi) for state control
Wide-angle (Raspberry) camera for scene understanding
Power distribution board to manage system power and provide usage feedback
Battery for power supply (for logistics reasons, it will only be shipped within the European Union)
Car-case for concealing components
Demo code for the embedded platform
The code for the control unit board is written in C++ and runs on a real-time operating system. It is open source, allowing teams to either use it as provided or modify it to suit their needs. The code offers the following features:
Drivers for motion control (speed and turning angle).
Monitoring of instantaneous current consumption and battery voltage, with safety mechanisms (such as a low-battery warning and automatic shutdown).
Power states manager (KL0 - system off, KL15 - system on for sensor readings, KL30 - system on for motor control).
IMU data acquisition.
Audible alerts via a buzzer.
Communication with the Brain unit board via the UART interface.
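On the Brain side, commands travel to the control unit board over that UART link. As a hedged sketch only: the `frame_command` helper and the `#action:value;;\r\n` frame layout below are assumptions for illustration, not the actual message format defined by the embedded demo code.

```python
# Hypothetical sketch of framing a command for the control unit board.
# The frame layout ("#action:value;;\r\n") is an assumption; consult the
# embedded demo code for the real protocol.

def frame_command(action: str, value: float) -> bytes:
    """Build a text frame that a UART-connected board could parse."""
    return f"#{action}:{value:.2f};;\r\n".encode("ascii")

def send_command(port, action: str, value: float) -> None:
    """Write a framed command to an open serial port (e.g. a pyserial handle)."""
    port.write(frame_command(action, value))

# Example: frame a forward-speed command and a steering command.
speed_frame = frame_command("speed", 0.25)
steer_frame = frame_command("steer", -10.0)
```

In practice the port handle would come from a serial library such as pyserial; only the framing logic is shown here so it can run without hardware.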
Demo code for the Brain platform
The Brain Unit Board Python code uses several libraries and is open source. Teams may use the code as-is, build upon it to develop autonomous driving functionalities, or rewrite it completely. The code provides the following capabilities:
Implements multi-threading within a multi-processing architecture, with a configurable running frequency for each thread and process.
Provides subscription-based communication between threads and processes.
Interprets and sends movement commands to the Control Unit Board via the UART interface.
Interfaces with the camera for live streaming, recording, continuous calibration, and more.
Provides an API for communication with the Traffic Communication Server (available both on-site and online).
Provides an API for receiving streamed data from semaphores.
Implements a state machine that starts and stops processes and threads.
- Includes the dashboard component, a web app consisting of a back end and a front end. Its features include:
- Multiple driving modes:
Manual control.
Automatic control for technical run.
Automatic functions for legacy run.
GPS positioning, speedometer, and steering wheel feedback.
Live camera feed.
Indicators (stop sign seen, pedestrian detected, etc.)
Battery level monitoring.
System status info (CPU usage, RAM, etc.).
Flag reading and live configuration (similar to a CANalyzer table).
- Comes alongside a series of services that:
Starts its own hotspot in case the Raspberry Pi does not automatically connect to a network; a companion script lets you easily add new networks.
Automatically builds the front-end application (dashboard), to speed up launching the app when needed (main.py).
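The subscription-based messaging between threads and processes mentioned above can be illustrated with a minimal broker. This is a sketch under assumptions: the names (`MessageBroker`, `subscribe`, `publish`) and the queue-per-subscriber design are illustrative, not the identifiers or internals of the actual demo code.

```python
# Minimal sketch of subscription-based messaging: a broker keeps a list of
# subscriber queues per topic and fans each published message out to them.
import queue
import threading

class MessageBroker:
    """Illustrative topic-based broker (not the demo code's real API)."""

    def __init__(self):
        self._subs = {}              # topic -> list of subscriber queues
        self._lock = threading.Lock()

    def subscribe(self, topic: str) -> "queue.Queue":
        """Register a new subscriber and return its private queue."""
        q = queue.Queue()
        with self._lock:
            self._subs.setdefault(topic, []).append(q)
        return q

    def publish(self, topic: str, message) -> None:
        """Deliver a message to every queue subscribed to the topic."""
        with self._lock:
            subscribers = list(self._subs.get(topic, []))
        for q in subscribers:
            q.put(message)

# Example: one worker thread consuming messages on the "camera" topic.
broker = MessageBroker()
camera_q = broker.subscribe("camera")
results = []

def worker(q, out):
    out.append(q.get(timeout=1))     # block until a message arrives

t = threading.Thread(target=worker, args=(camera_q, results))
t.start()
broker.publish("camera", {"frame_id": 1})
t.join()
```

The same pattern extends across processes by replacing `queue.Queue` with `multiprocessing.Queue`, which is broadly how a multi-processing architecture can exchange messages between its threads and processes.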
Computer apps and scripts
Semaphore streamer simulation
This application simulates the transmission of messages from semaphores. It helps teams validate data reception from this service and adapt their API accordingly, if needed.
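As a hedged sketch of the receiving side: the snippet below assumes the semaphore messages arrive as UDP packets carrying JSON with an `id` and a `state` field. Both the transport and the schema are assumptions for illustration; the real streamer's protocol is defined by its demo code.

```python
# Illustrative receiver for semaphore messages. The UDP transport and the
# {"id": ..., "state": ...} JSON schema are assumptions, not the real protocol.
import json
import socket

def parse_semaphore_packet(data: bytes) -> dict:
    """Decode one semaphore message; the expected keys are assumptions."""
    msg = json.loads(data.decode("utf-8"))
    return {"id": msg["id"], "state": msg["state"]}

def listen_once(port: int = 5007, timeout: float = 1.0) -> dict:
    """Receive and parse a single semaphore packet from localhost."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", port))
        sock.settimeout(timeout)
        data, _addr = sock.recvfrom(4096)
    return parse_semaphore_packet(data)
```

Separating parsing from transport, as above, makes it easy to adapt the API if the competition protocol differs from these assumptions.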
Traffic communication server simulator
This application simulates the communication server that will be present at the competition venue. Each team’s vehicle must send monitoring data to this server, including:
Vehicle speed
Vehicle position
Obstacle details (in the event of an encounter)
Additionally, the simulator manages the connection between the team’s vehicle and a simulated tracking device (which, in reality, will be a physical device mounted on the vehicle). This tracking device transmits the vehicle’s position on the track.
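A minimal sketch of assembling the monitoring data listed above, under assumptions: the field names and the JSON encoding are illustrative, and the real server API is defined by the simulator's demo code.

```python
# Illustrative payload builder for vehicle monitoring data (speed, position,
# and obstacle details when one is encountered). Field names and the JSON
# encoding are assumptions; check the simulator's demo code for the real API.
import json

def build_monitoring_payload(speed: float, position: tuple, obstacle: dict = None) -> str:
    """Serialize one monitoring update as a JSON string."""
    payload = {
        "speed": speed,                                  # vehicle speed
        "position": {"x": position[0], "y": position[1]},  # track position
    }
    if obstacle is not None:
        payload["obstacle"] = obstacle                   # only on encounters
    return json.dumps(payload)
```

The payload could then be posted to the simulated server over whatever transport its demo code expects; only the data assembly is sketched here.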
A more detailed description of this system can be found in the “The track” section.
Demo code for the simulator
The Gazebo simulator replicates the competition environment and is intended for integration and functional testing of team code. It is recommended primarily for integration testing rather than fine-tuning, due to its lower level of environmental noise compared to the real-world setup.
The simulator and its demo code are open source and open to further development. While official support for the project is no longer available, some improved variants developed by former teams exist. These variants are accessible but have not been tested or officially validated.
Some other code examples
Additional code examples for the Brain Unit Board, written in C++ or structured for ROS-based development, are provided for reference. These examples are no longer officially supported, but may serve as useful starting points or inspiration for team-specific implementations.
Courses and general documentation
The Bosch organisers will host a series of code review sessions, during which each aspect of the provided codebase will be explained. Participants are strongly encouraged to review the code beforehand and attend these sessions to gain a deeper understanding of the system.
Housing and meals
During the Challenge, all semi-finalist teams will be invited to the competition venue in Cluj-Napoca, Romania. During this period, accommodation and meals will be provided by the organisers for all team members, with the exception of the mentor.
Teams will be responsible for arranging and covering their own travel, as well as any additional accommodation before or after the event, if necessary. External sponsorship may be sought, subject to the specific competition restrictions outlined in the Additional Aspects section.