The goal of the project was to promote the independence of people with mobility impairments through a semi-autonomous electric wheelchair system. An innovative tech stack combining sensors, actuators, and middleware was intended to demonstrate new possibilities—from safe navigation in everyday life to intelligent interaction with the surrounding environment. At the same time, the project served as an exciting testbed for future mobility services in the fields of care, health, and smart assistance.
The greatest technical challenge was achieving safe and precise real-time localization, particularly at low speeds and in unstructured, confined environments. The central task therefore lay not only in selecting precise sensors, but also in building a robust, economically viable hardware setup that functions reliably over the long term and is suitable for series production.
In collaboration with Alber, we developed a modular simulation environment based on ROS 2 to efficiently and safely evaluate different sensor and communication architectures. A combination of two LiDAR sensors was used to enable robust, real-time environmental perception. With the help of SLAM algorithms, the environment was dynamically mapped and made navigable.
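Combining the two LiDAR sensors into a single view of the surroundings essentially means transforming each scan from its own sensor frame into a common base frame on the wheelchair. The sketch below illustrates this step in plain Python; the mounting poses, scan parameters, and function names are hypothetical, and a real ROS 2 setup would handle these transforms via tf2 rather than by hand:

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, pose):
    """Convert a 2D LiDAR scan into points in the wheelchair's base frame.

    pose = (x, y, yaw): hypothetical mounting pose of the sensor on the chassis.
    """
    x0, y0, yaw = pose
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no return for this beam
        a = angle_min + i * angle_increment
        # point in the sensor's own frame
        sx, sy = r * math.cos(a), r * math.sin(a)
        # rotate and translate into the common base frame
        points.append((x0 + sx * math.cos(yaw) - sy * math.sin(yaw),
                       y0 + sx * math.sin(yaw) + sy * math.cos(yaw)))
    return points

# two sensors mounted front-left and rear-right (illustrative poses and scans)
front = scan_to_points([1.0, 1.0], -0.1, 0.2, (0.4, 0.2, 0.0))
rear = scan_to_points([2.0], 0.0, 0.1, (-0.4, -0.2, math.pi))
merged = front + rear  # combined all-around view for SLAM
```

Merging the scans this way is what gives the system coverage around the whole chassis, which matters in the confined indoor spaces the project targets.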
Simultaneous Localization and Mapping (SLAM) was implemented with the ROS 2 package SLAM Toolbox. Based on the data from the two LiDAR sensors, an environment map was created, from which a costmap was generated: a two-dimensional grid representation that distinguishes drivable from non-drivable areas. This costmap forms the basis for the system's automatic route planning: the algorithm computes the safest and most efficient route to a previously defined target point.
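The chain from map to costmap to planned route can be sketched roughly as follows. This is a simplified, self-contained illustration rather than the actual SLAM Toolbox or Nav2 code: the grid values, the inflation step, and the A* planner used here are illustrative assumptions about how such a pipeline is typically built.

```python
import heapq

FREE, OCCUPIED = 0, 100  # cell values as in a ROS 2 OccupancyGrid

def to_costmap(grid, inflation=1):
    """Derive a costmap from the mapped grid: copy occupied cells and
    inflate them by `inflation` cells so the planner keeps a safety margin."""
    h, w = len(grid), len(grid[0])
    cost = [[FREE] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == OCCUPIED:
                for dy in range(-inflation, inflation + 1):
                    for dx in range(-inflation, inflation + 1):
                        if 0 <= y + dy < h and 0 <= x + dx < w:
                            cost[y + dy][x + dx] = OCCUPIED
    return cost

def plan(cost, start, goal):
    """A* search over drivable costmap cells; returns (x, y) cells or None."""
    h, w = len(cost), len(cost[0])
    heur = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(heur(start), start)]
    came, g = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:  # reconstruct the route back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        x, y = cur
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nb
            if 0 <= nx < w and 0 <= ny < h and cost[ny][nx] == FREE:
                ng = g[cur] + 1
                if nb not in g or ng < g[nb]:
                    g[nb], came[nb] = ng, cur
                    heapq.heappush(frontier, (ng + heur(nb), nb))
    return None  # target point not reachable

# 5x5 map with one obstacle; plan from the lower-left to the upper-right
grid = [[FREE] * 5 for _ in range(5)]
grid[2][2] = OCCUPIED
path = plan(to_costmap(grid, inflation=1), (0, 0), (4, 4))
```

The inflation step is what keeps the planned route at a safe distance from walls and furniture, which is particularly important for a wheelchair maneuvering at low speed in confined spaces.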
To control the system and ensure practical usability, we developed a mobile app using Flutter.