Sensor Simulation: Environment Perception for ADAS and AD
Environment perception forms the basis of assisted and automated driving (ADAS/AD). For the development and testing of driving functions, the vehicle and environment simulation DYNA4 offers models of lidar, camera, radar, and ultrasonic sensors. Depending on the boundaries of your system under test, DYNA4 provides virtual sensor input at different levels, from physics-based raw sensor data to fused object lists. This enables, for example, the following use cases:
- Testing object detection with virtually generated camera images or lidar point clouds
- Testing sensor fusion algorithms with object lists from multiple sensors such as radar and camera
- Testing ADAS/AD functions with a fused object list as input

Advantages
- Real-time capable simulation for development and testing of sensor-based ADAS/AD functions
- Applicable from early development phases (e.g., determining the sensor configuration) through to virtual validation
- Complete sensor setup with ultrasonic, lidar, camera, and radar sensors
- Output as raw data, target lists or object lists
- Variety of scenarios covering complex static and dynamic environments, from parking to autonomous driving in surrounding traffic
- Closed-loop system tests from environment perception through actuation to a realistic vehicle reaction
- Vehicle dynamics for realistic sensor movements
- Efficient test coverage through test automation with numerous variants
Object Lists

- Sensor-specific object lists or fused object list
- Idealized ground-truth information or consideration of occlusion
- Highly efficient computation based on bounding boxes, or semantic image segmentation that considers exact geometries
- Output: relative velocity, distance, and object class (see the sketch of a possible object record after this list)
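As an illustration of the kind of data such an interface carries, the following Python sketch defines a minimal object record for a fusion or planning test harness. The field names, units, and class set are assumptions for illustration only, not the DYNA4 interface definition.

```python
from dataclasses import dataclass
from enum import Enum


class ObjectClass(Enum):
    """Illustrative object classes; the actual class set is tool-specific."""
    CAR = 0
    TRUCK = 1
    PEDESTRIAN = 2
    BICYCLE = 3
    UNKNOWN = 255


@dataclass
class DetectedObject:
    """One entry of a (possibly fused) object list, ground truth or occlusion-aware."""
    object_id: int
    distance_m: float           # range from sensor origin to object reference point
    rel_velocity_mps: float     # relative velocity along the line of sight
    object_class: ObjectClass
    occluded: bool = False      # True if partially hidden by another object


# Example: a small fused object list as it might be fed to an ADAS function under test
object_list = [
    DetectedObject(1, 42.5, -3.2, ObjectClass.CAR),
    DetectedObject(2, 12.0, 0.0, ObjectClass.PEDESTRIAN, occluded=True),
]
```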
Lidar

- Reflection intensity based on the angle between the laser beam and the object surface as well as its material properties
- Availability of rotating and non-rotating lidar sensors
- Adjustable opening angle and signal resolution
- 3D point cloud output as a ROS topic via DDS or via UDP in Velodyne format (see the subscriber sketch after this list)
- Video: Vehicle simulation with Velodyne sensor and city traffic in DYNA4
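For the ROS output path, a point cloud published via DDS can be consumed like any other ROS 2 topic. The minimal sketch below assumes a ROS 2 environment with rclpy and sensor_msgs installed; the topic name /dyna4/lidar/points is a placeholder and must be replaced by the topic configured for the simulated sensor.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2


class LidarListener(Node):
    """Minimal subscriber that counts the points of each incoming cloud."""

    def __init__(self):
        super().__init__('dyna4_lidar_listener')
        # Topic name is an assumption; use the name configured in the simulation.
        self.create_subscription(PointCloud2, '/dyna4/lidar/points', self.on_cloud, 10)

    def on_cloud(self, msg: PointCloud2):
        points = point_cloud2.read_points(msg, field_names=('x', 'y', 'z'), skip_nans=True)
        self.get_logger().info(f'received cloud with {sum(1 for _ in points)} points')


def main():
    rclpy.init()
    rclpy.spin(LidarListener())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```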
Cameras

- Configurable cameras with opening angles up to 360°
- Distortion parameterization with OpenCV or Scaramuzza parameters (see the undistortion sketch after this list)
- Support for dirt on the lens
- Display of RGB image streams on separate screens for image injection
- Usage from MiL (algorithm development) to HiL (image injection on the ECU)
- Video: Configurable cameras for virtual ADAS testing with DYNA4
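When the camera is parameterized with the OpenCV pinhole model, the same intrinsics and distortion coefficients can be used to undistort the rendered image stream on the receiving side. The intrinsic and distortion values below are placeholders rather than a DYNA4 calibration; only the cv2.undistort call itself is standard OpenCV.

```python
import cv2
import numpy as np

# Placeholder intrinsics for a 1280x800 pinhole camera (fx, fy, cx, cy) -- not a DYNA4 calibration.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 400.0],
              [   0.0,    0.0,   1.0]])

# Placeholder radial/tangential distortion coefficients (k1, k2, p1, p2, k3).
dist = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])

frame = cv2.imread('camera_frame.png')          # one image of the simulated RGB stream (placeholder file)
undistorted = cv2.undistort(frame, K, dist)     # remove lens distortion with the OpenCV pinhole model
cv2.imwrite('camera_frame_undistorted.png', undistorted)
```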
Ultrasound

- Consideration of propagation and atmospheric attenuation
- Absorption and reflection based on object geometry and material properties
- Adjustable opening angle and signal resolution
- Output of an intensity-depth histogram (see the sketch after this list)
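To give a feel for this output format, the sketch below bins echo returns into an intensity-over-distance histogram with NumPy. The echo data are synthetic and the binning parameters are assumptions; they do not reflect the exact DYNA4 signal format.

```python
import numpy as np

# Synthetic echoes: (distance in m, received intensity) -- illustrative values only.
echoes = np.array([
    [0.80, 0.90], [0.82, 0.70], [1.50, 0.40],
    [2.30, 0.25], [2.35, 0.30], [4.00, 0.05],
])

# Accumulate echo intensity per distance bin (0..5 m in 0.25 m steps).
bins = np.arange(0.0, 5.25, 0.25)
hist, edges = np.histogram(echoes[:, 0], bins=bins, weights=echoes[:, 1])

for lo, hi, value in zip(edges[:-1], edges[1:], hist):
    if value > 0:
        print(f'{lo:4.2f}-{hi:4.2f} m : intensity {value:.2f}')
```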
Radar

- Scattering of radar waves based on object geometry and material properties
- Consideration of different antenna characteristics (Short-, Mid-, Long-Range)
- Adjustable opening angle and signal resolution
- Output of raw data such as relative velocity, distance to the object, and the intensity of the electric field, or a GPU-based Fourier transform to generate range-Doppler plots (see the sketch after this list)
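The range-Doppler processing mentioned above can be reproduced offline with a plain NumPy 2D FFT on a chirp data cube (fast-time samples × chirps per frame); a GPU implementation follows the same scheme. The array dimensions and the synthetic signal below are assumptions for illustration, not DYNA4 output.

```python
import numpy as np

# Synthetic FMCW baseband frame: 256 fast-time samples x 128 chirps (assumed dimensions).
n_samples, n_chirps = 256, 128
t = np.arange(n_samples)
c = np.arange(n_chirps)

# Single target: the beat frequency encodes range, the phase progression over chirps encodes Doppler.
frame = np.exp(1j * 2 * np.pi * (0.12 * t[:, None] + 0.03 * c[None, :]))
frame += 0.1 * (np.random.randn(n_samples, n_chirps) + 1j * np.random.randn(n_samples, n_chirps))

# Range FFT along fast time, Doppler FFT along slow time, shifted so zero Doppler is centered.
range_fft = np.fft.fft(frame, axis=0)
range_doppler = np.fft.fftshift(np.fft.fft(range_fft, axis=1), axes=1)
magnitude_db = 20 * np.log10(np.abs(range_doppler) + 1e-12)

peak = np.unravel_index(np.argmax(magnitude_db), magnitude_db.shape)
print(f'peak at range bin {peak[0]}, Doppler bin {peak[1]}')
```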
Product Information
- Fact Sheet: Short overview of DYNA4 product facts (PDF)