Challenging Tests for Camera Sensors With DYNA4
With release 6, DYNA4, the simulation environment for virtual test driving, offers numerous improvements for the development and testing of modern driver assistance functions. One of the main priorities is real-time, close-to-reality modeling of the environment. A technological leap in graphics computing for 3D visualization and camera image generation enables the consistent use of physical lighting parameters in driving scenarios. Developers can thus test camera-based control systems in challenging lighting situations such as backlighting or strongly varying light intensities. To achieve this, camera images are generated with a high dynamic range (HDR) and can then be arranged as a Bayer matrix. In addition, DYNA4 provides ASAM OSI ground truth information, against which the object information detected through image processing can be directly validated. The same applies to other sensor technologies such as lidar, ultrasound, and radar.
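The Bayer arrangement mentioned above can be illustrated with a short sketch: an HDR RGB frame is subsampled so that each pixel keeps exactly one color component, following the common RGGB layout. This is a generic illustration of the principle, not DYNA4's actual rendering pipeline; the function name and the use of NumPy arrays are assumptions for the example.

```python
import numpy as np

def bayer_mosaic(hdr_image: np.ndarray) -> np.ndarray:
    """Subsample an HDR RGB image (H x W x 3, float) into a single-channel
    RGGB Bayer matrix, keeping one color component per pixel.
    Illustrative only -- not the DYNA4 implementation."""
    h, w, _ = hdr_image.shape
    mosaic = np.empty((h, w), dtype=hdr_image.dtype)
    mosaic[0::2, 0::2] = hdr_image[0::2, 0::2, 0]  # red at even rows/cols
    mosaic[0::2, 1::2] = hdr_image[0::2, 1::2, 1]  # green
    mosaic[1::2, 0::2] = hdr_image[1::2, 0::2, 1]  # green
    mosaic[1::2, 1::2] = hdr_image[1::2, 1::2, 2]  # blue at odd rows/cols
    return mosaic

# HDR frames use float values that may exceed 1.0, so very bright regions
# (e.g. backlighting) are preserved rather than clipped.
frame = np.random.rand(4, 6, 3).astype(np.float32) * 16.0
raw = bayer_mosaic(frame)
```

Because the mosaic keeps the full dynamic range of the source pixels, downstream image processing sees the same extreme brightness ratios it would face with a real HDR sensor.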
ASAM OSI for efficient exchange of sensor data
DYNA4 is based on simulation standards and includes a variety of interfaces to maximize the possibilities of using virtual test drives within an existing tool infrastructure. Based on the ASAM OSI standard, object-based sensors in DYNA4 can transmit their information as OSI messages. Other applications, such as CANoe, receive and process these data: there they are used for residual bus simulation in HIL mode and displayed in the Scene Window. ASAM OSI thus considerably reduces the effort required to set up and maintain the interfaces between the simulation and the sensors or ECU functions.
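To give an idea of how OSI ground truth can be used for validation, the sketch below compares detected object positions against a ground-truth object list. The dataclasses are illustrative stand-ins that only mirror the rough shape of ASAM OSI's GroundTruth/MovingObject messages; a real integration would use the generated osi3 protobuf bindings, and the matching tolerance and function names here are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Set

# Simplified stand-ins for OSI-style messages (not the official bindings).
@dataclass
class Vector3d:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class MovingObject:
    id: int
    position: Vector3d

@dataclass
class GroundTruth:
    moving_object: List[MovingObject] = field(default_factory=list)

def validate_detections(detections: List[Vector3d],
                        ground_truth: GroundTruth,
                        tol: float = 0.5) -> Set[int]:
    """Return the IDs of ground-truth objects that a detection matched
    within `tol` meters (2D distance). Hypothetical helper for the sketch."""
    matched = set()
    for det in detections:
        for obj in ground_truth.moving_object:
            dx = det.x - obj.position.x
            dy = det.y - obj.position.y
            if (dx * dx + dy * dy) ** 0.5 <= tol:
                matched.add(obj.id)
    return matched

gt = GroundTruth([MovingObject(1, Vector3d(10.0, 2.0)),
                  MovingObject(2, Vector3d(35.0, -1.5))])
# Positions reported by a (hypothetical) camera image-processing stage
detections = [Vector3d(10.2, 2.1)]
print(validate_detections(detections, gt))  # → {1}
```

The same pattern applies to any object-based sensor output: because both sides speak OSI, the comparison logic does not need sensor-specific adapters.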
Optimization of models and applications
Thanks to the optimized model structure, the new version 6 is more user-friendly for consistent use from MIL and SIL to HIL. It is now even easier to integrate controller components into the virtual test vehicle: different variants and development versions of a controller component can be exchanged with one another while the model interface remains constant, without any further model changes. Furthermore, flexible access to signal and control variables in the model has been simplified, allowing them to be manipulated dynamically during a test run.
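The idea of swapping controller variants behind a constant model interface can be sketched as follows. The interface, class names, and gains are hypothetical; the point is only that every variant implements the same signature, so the surrounding vehicle model never changes when a variant is exchanged.

```python
from abc import ABC, abstractmethod
from typing import List

# Hypothetical fixed controller interface: as long as every variant keeps
# this signature, development versions can be swapped into the virtual
# test vehicle without touching the rest of the model.
class LaneKeepController(ABC):
    @abstractmethod
    def step(self, lateral_offset: float) -> float:
        """Return a steering command for the given lateral offset in meters."""

class ControllerV1(LaneKeepController):
    def step(self, lateral_offset: float) -> float:
        return -0.5 * lateral_offset

class ControllerV2(LaneKeepController):
    def step(self, lateral_offset: float) -> float:
        return -0.8 * lateral_offset  # retuned gain, identical interface

def run_test(controller: LaneKeepController,
             offsets: List[float]) -> List[float]:
    # The test harness depends only on the interface, not on the variant.
    return [controller.step(o) for o in offsets]

print(run_test(ControllerV1(), [1.0]))  # → [-0.5]
print(run_test(ControllerV2(), [1.0]))  # → [-0.8]
```

The same harness exercises both development versions, which mirrors how a constant model interface lets variants be compared in otherwise identical virtual test runs.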