📚 Documentation and Media
Doxygen Documentation: doxygen/html/index.html
Full Paper: 📘 Full Paper in Sensors Journal
Overview Video: 📽️✨ Introducing LARS
Benchmarking Video: 📽️📊 Performance Tests
Adaptability Video: 📽️🧪 Platform Adaptability Test

LARS (Light Augmented Reality System) is a cross-platform, open-source framework for experimentation, education, and outreach in collective robotics.
It leverages Extended Reality (XR) to seamlessly merge the physical and virtual worlds, projecting dynamic visual objects, such as gradients, fields, trails, and even robot states, directly into the environment where real robots operate.
LARS enables indirect robot-robot communication (stigmergy) while preserving all real-world constraints. It turns “invisible” collective dynamics into tangible, interactive experiences for researchers, students, and the public.
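To make the idea concrete, below is a minimal, framework-agnostic C++ sketch of one such projected object: a trail field that robots deposit into and that decays over time, in the spirit of the decaying traces shown in the figures. The `RobotPose` struct, grid resolution, deposit amount, and decay rate are illustrative assumptions, not LARS types or parameters.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical robot pose in arena coordinates (illustrative, not a LARS type).
struct RobotPose {
    double x, y; // meters
};

// A coarse scalar field over the arena; each cell stores a trail intensity.
class TrailField {
public:
    TrailField(double arenaSize, int resolution)
        : arenaSize_(arenaSize), res_(resolution), cells_(resolution * resolution, 0.0) {}

    // Deposit trail under every robot, then let the whole field fade.
    void update(const std::vector<RobotPose>& robots, double deposit, double decay) {
        for (const auto& r : robots) {
            int i = static_cast<int>(r.x / arenaSize_ * res_);
            int j = static_cast<int>(r.y / arenaSize_ * res_);
            if (i >= 0 && i < res_ && j >= 0 && j < res_)
                cells_[j * res_ + i] += deposit;
        }
        for (double& c : cells_) c *= decay; // exponential decay of the projected trail
    }

    double at(int i, int j) const { return cells_[j * res_ + i]; }

private:
    double arenaSize_;
    int res_;
    std::vector<double> cells_;
};

int main() {
    TrailField field(2.0 /* m */, 64);
    std::vector<RobotPose> robots = {{0.5, 0.5}, {1.2, 1.4}};
    for (int t = 0; t < 100; ++t) {
        robots[0].x += 0.01; // pretend the robots move
        robots[1].y -= 0.01;
        field.update(robots, /*deposit=*/1.0, /*decay=*/0.95);
    }
    std::printf("trail under robot 0: %.3f\n",
                field.at(static_cast<int>(robots[0].x / 2.0 * 64),
                         static_cast<int>(robots[0].y / 2.0 * 64)));
}
```

In the real system, a field like this would be driven by the tracker's pose estimates and rendered by the projector onto the arena, so the robots can sense and react to it.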
LARS features a robust, real-time tracking module based on the ARK (Augmented Reality for Kilobots) algorithm, but it goes far beyond:

Beyond tracking:
LARS projects virtual visual objects (gradients, cues, signals) in real time, directly onto the arena and the robots themselves.
This enables, for example, indirect (stigmergic) communication, dynamic virtual environments, and live visualization of otherwise invisible robot states.
LARS is built on the classic Model-View-Controller (MVC) pattern.
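As a rough illustration of that separation (class and method names below are hypothetical, not the actual LARS API): the model holds the tracked swarm state, the view re-renders the projected overlay whenever the model changes, and the controller feeds tracker output into the model.

```cpp
#include <cstdio>
#include <functional>
#include <utility>
#include <vector>

// --- Model: the tracked state of the swarm (illustrative names) ---
struct Pose { double x, y, theta; };

class SwarmModel {
public:
    void setPoses(std::vector<Pose> poses) {
        poses_ = std::move(poses);
        if (onChanged_) onChanged_();          // notify the view
    }
    const std::vector<Pose>& poses() const { return poses_; }
    void setOnChanged(std::function<void()> cb) { onChanged_ = std::move(cb); }
private:
    std::vector<Pose> poses_;
    std::function<void()> onChanged_;
};

// --- View: renders the projected overlay from the model (here: just prints) ---
class ProjectionView {
public:
    explicit ProjectionView(const SwarmModel& model) : model_(model) {}
    void render() const {
        for (const Pose& p : model_.poses())
            std::printf("draw virtual object at (%.2f, %.2f)\n", p.x, p.y);
    }
private:
    const SwarmModel& model_;
};

// --- Controller: feeds tracker output into the model, triggering re-projection ---
class ExperimentController {
public:
    ExperimentController(SwarmModel& model, ProjectionView& view) : model_(model) {
        model_.setOnChanged([&view] { view.render(); });
    }
    void onTrackerFrame(std::vector<Pose> detections) {
        model_.setPoses(std::move(detections)); // view updates automatically
    }
private:
    SwarmModel& model_;
};

int main() {
    SwarmModel model;
    ProjectionView view(model);
    ExperimentController controller(model, view);
    controller.onTrackerFrame({{0.3, 0.4, 0.0}, {1.1, 0.9, 1.57}});
}
```

This sketch only mirrors the pattern described above; the actual LARS classes, which integrate Qt, the camera, and the projector, are documented in the Doxygen pages.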
Example views from LARS experiments (figure panels):
Top left: GUI snapshot of 42 Kilobots synchronizing on a grid, with their internal binary state detected from the color of their LED (blue or red).
Top mid-left: user view of 63 Kilobots making a collective decision in a tiled environment with projected dynamic noise.
Top mid-right: GUI snapshot of 109 Kilobots with the trace of their random movement decaying over time.
Top right: GUI snapshot of two active balls moving randomly in the bounded arena, tracked by LARS without the need for any markers.
Bottom left: GUI snapshot of two Thymios of different colors locating the center of the light distribution projected by LARS; the trace of each robot shows consistent color detection over time, even after a collision.
Bottom right: user view of Thymios moving randomly, with their centroid, their projected trajectories (light blue trails), their Voronoi tessellation (black lines), and the corresponding network (green lines).
LARS runs as a Qt application (Qt 5.6+ recommended); Ubuntu is the preferred platform.
See install_dep.md for full dependency details (Qt, CUDA/OpenCV3, etc.).
git clone https://github.com/mohsen-raoufi/LARS.git
cd LARS
To operate the Kilobot OHC (Overhead Controller), the user needs to be a member of the dialout group. Add the user to the group with
sudo usermod -a -G dialout <user-name>
(The change takes effect after logging out and back in.)
If you use or adapt LARS in your research or publications, please cite:
Raoufi, M., Romanczuk, P., & Hamann, H. (2025). LARS: A Light-Augmented Reality System for Collective Robotic Interaction. Sensors, 25(17), 5412. https://doi.org/10.3390/s25175412
Raoufi, M., Romanczuk, P., & Hamann, H. (2024). LARS: Light Augmented Reality System for Swarm. In Swarm Intelligence: 14th International Conference, ANTS 2024, Konstanz, Germany, October 9–11, 2024, Proceedings (Vol. 14987, p. 246). Springer Nature.
If you use the tracking module, please also cite ARK:
Reina, A., Cope, A. J., Nikolaidis, E., Marshall, J. A. R., & Sabo, C. (2017). ARK: Augmented Reality for Kilobots. IEEE Robotics and Automation Letters, 2(3), 1755–1761.
LARS is supported by the Science of Intelligence Cluster of Excellence, Berlin.
Developed and maintained by Mohsen Raoufi.
Open-source under the GNU GPL v3.0.
🙌 Contributions welcome! LARS is for scientists, educators, and all who are curious about collective intelligence in robotics.