Sensor modelling in Autonomous Driving

VTD provides a complete set of sensor models to replicate the physical sensors used in an autonomous vehicle: cameras (including infrared), LiDAR, RADAR and ultrasonic sensors.

Each sensor can be represented at different levels of fidelity, from reproducing the intricacies of a laser beam reflecting off rough surfaces down to capturing only the basic sensor characteristics in order to achieve the maximum simulation speed.

With PBR (physically based rendering) LiDAR models, users can identify finer details in the LiDAR visualization, such as a glass window on a building or a puddle of water on a road surface.
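
As a rough illustration of why material properties matter here, the sketch below (hypothetical, not VTD code; all names and parameters are assumptions) uses a simple diffuse-plus-specular reflectance model for a monostatic LiDAR. Smooth, specular surfaces such as glass or standing water return almost no energy to the sensor at oblique angles, while rough asphalt still does, which is why PBR-based models can make such surfaces distinguishable in the point cloud.

```python
import math

def lidar_return_intensity(incidence_deg, diffuse_albedo, specular_weight, shininess):
    """Toy monostatic LiDAR return model (hypothetical sketch, not the VTD implementation).

    incidence_deg   : angle between the beam and the surface normal, in degrees
    diffuse_albedo  : Lambertian reflectance of the material (0..1)
    specular_weight : fraction of energy reflected specularly (0..1)
    shininess       : Phong-style exponent; high values mean mirror-like surfaces
    """
    cos_i = math.cos(math.radians(incidence_deg))
    if cos_i <= 0.0:
        return 0.0  # beam grazes or misses the surface
    # Diffuse term: rough surfaces scatter energy back toward the sensor.
    diffuse = diffuse_albedo * cos_i
    # Specular term: for a monostatic sensor the specular lobe only points
    # back at the emitter when the beam is nearly perpendicular to the surface.
    specular = specular_weight * (cos_i ** shininess)
    return diffuse + specular

# Rough asphalt still returns energy at an oblique angle...
print(lidar_return_intensity(60, diffuse_albedo=0.4, specular_weight=0.05, shininess=4))
# ...while glass or a puddle reflects almost everything away from the sensor,
# which is why such surfaces show up as dark regions in the point cloud.
print(lidar_return_intensity(60, diffuse_albedo=0.02, specular_weight=0.9, shininess=200))
```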

To further enhance the fidelity and variety of the sensor models, the VTD team is also working with world-leading sensor manufacturers such as Leica and NovAtel (both part of Hexagon).

Comparison of a basic LiDAR representation of a shop front (left) with a PBR LiDAR representation (right)

Target List Radar

The Target List Radar sensor, configurable as a module manager plugin, fills the gap left by the absence of a radar simulation based on OpenDRIVE and bounding-box information. It can run faster than real time without GPU hardware and provides a list of target points within the sensor cone, performing its calculations from the OpenDRIVE road description and the current traffic simulation, as sketched below.
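
Conceptually, a target-list radar reduces each traffic object to a point (or bounding box) and tests it against the sensor's range and opening angle. The following 2D sketch is a minimal, hypothetical illustration of that idea; it is not VTD's plugin API, and all names are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    obj_id: int
    range_m: float      # distance from the sensor origin
    azimuth_deg: float  # bearing relative to the sensor boresight

def target_list(sensor_pos, sensor_heading_deg, fov_deg, max_range_m, objects):
    """Return all objects inside the sensor cone (2D sketch, hypothetical API).

    sensor_pos         : (x, y) of the sensor in world coordinates
    sensor_heading_deg : boresight direction of the sensor
    fov_deg            : full opening angle of the sensor cone
    max_range_m        : maximum detection range
    objects            : iterable of (obj_id, x, y), e.g. from the traffic simulation
    """
    hits = []
    for obj_id, x, y in objects:
        dx, dy = x - sensor_pos[0], y - sensor_pos[1]
        rng = math.hypot(dx, dy)
        if rng > max_range_m:
            continue  # beyond the sensor's range
        bearing = math.degrees(math.atan2(dy, dx)) - sensor_heading_deg
        bearing = (bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        if abs(bearing) <= fov_deg / 2.0:
            hits.append(Target(obj_id, rng, bearing))
    return hits

# Three vehicles around the ego vehicle; only the one inside the 30-degree cone is reported.
print(target_list((0.0, 0.0), 0.0, fov_deg=30.0, max_range_m=150.0,
                  objects=[(1, 80.0, 5.0), (2, 40.0, 30.0), (3, -20.0, 0.0)]))
```

Because the check only involves positions and bounding boxes from the traffic simulation, no ray tracing or GPU work is needed, which is what allows this sensor type to run faster than real time.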

Millimeter-Wave Radar

The Millimeter-Wave Radar sensor, configurable as an OptiX plugin, fills the gap left by the absence of a radar simulation based on scene geometry and materials. It runs in real time and computes the simulated radar response on the GPU. The plugin can simulate artifacts such as mirrored targets and aliasing, and it provides ground-truth data for machine learning and algorithm correction.
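
To illustrate one of those artifacts, the sketch below (hypothetical, not the OptiX plugin) mirrors a real target across a flat reflector such as a guardrail: a signal that bounces off the reflector looks like an echo from the target's mirror image, so the radar reports an additional "ghost" return at a longer apparent range and at the wrong bearing.

```python
import math

def radar_returns(sensor, target, reflector_y):
    """Direct and single-bounce (mirrored) radar returns in 2D.

    sensor, target : (x, y) positions
    reflector_y    : y-coordinate of a flat, specular reflector (e.g. a guardrail)
                     modelled as the infinite line y = reflector_y

    Returns (range, azimuth_deg) pairs for the real target and its ghost.
    Hypothetical sketch only, not the VTD/OptiX implementation.
    """
    def range_bearing(src, dst):
        dx, dy = dst[0] - src[0], dst[1] - src[1]
        return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

    # Direct echo: out and back along the line of sight.
    direct = range_bearing(sensor, target)

    # Mirror the target across the reflector; a signal that bounces off the
    # reflector on both legs looks exactly like an echo from this image point.
    image = (target[0], 2.0 * reflector_y - target[1])
    ghost = range_bearing(sensor, image)
    return direct, ghost

# A car 50 m ahead and 2 m to the left, with a guardrail 6 m to the right of the sensor.
print(radar_returns(sensor=(0.0, 0.0), target=(50.0, 2.0), reflector_y=-6.0))
# -> real return at ~50 m / +2.3 deg, ghost at ~52 m / -15.6 deg
```

Since both the real and the ghost returns are generated from known geometry, the same simulation can label which detections are artifacts, which is the kind of ground-truth information useful for training and validating perception algorithms.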