
Sensor Models
with Ray Tracing

Sensor Model Development Library

The solution to develop your own high-fidelity sensor models

Develop and validate your sensor model once, then use it in combination with different simulation tools. The Persival SMDL is a C++/CUDA library that can easily be integrated into a functional mock-up unit (FMU) and be used for co-simulation.

Co-Simulation with Standardized Interfaces

The SMDL is targeted to be used in a sensor model that is employed in a co-simulation framework. This means that the simulation of the perception sensors is decoupled from other simulation modules, such as scenario/traffic simulation or vehicle dynamics simulation. In this setup, the sensor model needs only two inputs: an ASAM OSI SensorView message and the 3D asset files it references, which conform to the ASAM OpenMATERIAL 3D standard. Learn about the standards on our Persival Knowledge page.

How the SMDL works

The SMDL has an API that provides functionalities to:

  • load a static 3D environment based on ASAM OSI input,

  • load 3D assets for all moving objects based on ASAM OSI input and update their poses, velocities, etc. every simulation timestep,

  • set a sensor position,

  • perform custom ray tracing that utilizes physical material properties defined in ASAM OpenMATERIAL 3D,

  • interface with proprietary signal processing that may remain a black box for IP protection,

  • and finally emit the sensor output, e.g. a point cloud, in the ASAM OSI SensorData format.

Embed SMDL into FMU

We provide templates to embed the SMDL into a functional mock-up unit (FMU). The templates are implementation examples showing how to use the SMDL API and how to connect signal processing to it. They also illustrate how the SMDL connects to our other software and services, for example Simspector for visualization, analysis, and validation, or our ASAM OpenMATERIAL 3D compliant 3D assets.

Advanced Effects

The ray tracing within the SMDL is capable of simulating advanced effects. One example is the computation of the velocity of every individual hit point during ray tracing. This enables radar Doppler simulation as well as the simulation of FMCW lidar. Another advanced effect is timing. This video shows the timing effects of a MEMS lidar, slowed down frame by frame to visualize the effect. Each ray is shot at a specific time, and all moving elements of the scenario, including the virtual sensor, keep moving during the simulated scan of the lidar.
