Sensor Fusion Development for Automated Driving Functions
BASELABS Create Embedded is a software solution for the fast and efficient development of sensor data fusion systems for automated driving functions on embedded systems. Its embedded library contains fusion algorithms that combine data from radar, camera and LiDAR sensors. The resulting object fusion provides a uniform object list of the vehicle environment. BASELABS Create Embedded makes it possible for the first time to carry over the developed sensor data fusion directly to production ECUs.
Building blocks for the development of sensor fusion systems
Consistent solution for all development stages
Ready for series production
Developed according to Automotive SPICE
Complies with ISO 26262 (ASIL B), confirmed by exida
Fully documented development process: consistent, traceable and verified
Complete test coverage and code verification
Optimized workflow with the middleware Robot Operating System (ROS)
Fully compatible with AUTOSAR Classic and Adaptive
Supports all relevant automotive sensors like radar, camera and LiDAR
Scalable from radar-camera front fusion up to 360° object fusion using multiple radars, cameras and LiDAR sensors
MISRA C:2012 compliant source code available for all embedded hardware platforms
Easy adaptation of data fusion applications to different sensor setups or types
Video: How to combine different cameras.
BASELABS Create Embedded: Data fusion development for automated vehicles – from prototyping to series production.
Data fusion designer: With BASELABS Create Embedded, sensors are added and configured independently from other sensors to allow an incremental system design.
BASELABS Create Embedded: Development of sensor fusion along the entire development chain.
The sensor fusion combines detections and objects from all configured sensors to provide a unified object list of the vehicle’s surroundings. For each object, quantities like position, velocity and classification are determined.
The data fusion eliminates individual sensor weaknesses like limited lateral or longitudinal accuracy, limited detectability or false positives.
Data fusion systems developed with BASELABS software for object fusion and grid fusion support all relevant automotive sensors and object/track interfaces from various vendors.
BASELABS Create Embedded provides data fusion algorithms that combine data from radar, camera and LiDAR sensors. The resulting object fusion provides a unified object list of the vehicle's environment and serves as an input to path planning and decision-making algorithms.
Object fusion from BASELABS Create Embedded runs on all relevant embedded platforms. For example, a front fusion consisting of three radars and one camera (3R1C) runs safely on Aurix TC397 (300 MHz) based systems.
Webinar Recording: Development of Data Fusion Systems for ADAS Functions for Embedded Systems
This demo, recorded at the Vector Virtual Week, shows how to generate an individual, safety-compliant data fusion algorithm that runs directly on embedded systems and ECUs in the target vehicle, using Vector's AUTOSAR Classic software MICROSAR.
Sensor Fusion Know-How
How To Combine a Smart Camera With a Detection-Only Radar Sensor?
Watch the video to learn more about sensor fusion development.
Smart cameras are widely considered a standard solution when developing a perception system for ADAS. Watch the video to learn how to improve the overall perception performance by combining the camera with an additional radar sensor that provides independent detections.
BASELABS Create Embedded makes implementing sensor data fusion systems much faster and more efficient. The resulting C source code can be used along the entire development chain - from pre-development through embedded prototyping to the ECU for series production. The powerful software enables the safety-compliant development of sensor fusion systems, including documentation and testing of safety-related use cases. This drastically reduces the development effort.
BASELABS Create Embedded: Development of data fusion along the entire development chain.
Data Fusion Designer and Generator
With the data fusion designer, radar, camera and LiDAR sensors of a vehicle setup are configured, customized and parameterized. A specific object fusion system is generated from this configuration.
Sensor Fusion Reference Architecture
The integrated reference architecture for object fusion allows for building data fusion applications ranging from two sensor systems to large 360° setups with many sensors. The architecture can be customized and extended.
Full Middleware Compatibility
The resulting sensor fusion systems integrate seamlessly into many platforms and runtime environments such as AUTOSAR Classic/Adaptive, ROS or any C/C++ environment.
Sensor Fusion Library for Embedded Systems
The integrated sensor fusion library contains algorithms to build custom object fusion systems such as:
Numerically stable Kalman filters
Data association methods
Existence probability handling
Track management algorithms
The C source code of the library is fully accessible and ready for embedded platforms:
Compatible with common embedded platforms such as Aurix 2G and ARM Cortex-A72