Velodyne Lidar, Inc. announced that its surround-view lidar solutions for capturing extensive perceptual data are available for testing and validation on the Nvidia DRIVE stand-alone autonomous vehicle platform. The sensors deliver full 360-degree perception in real time, allowing for high-precision localisation and path planning.
The features of the Velodyne sensors are also available on Nvidia DRIVE Constellation, an open, scalable simulation platform that enables large-scale, bit-accurate hardware-in-the-loop testing of AVs. The platform’s DRIVE Sim software models lidar and other sensors, reproducing the inputs of a self-driving car in the virtual world with great accuracy.
“Velodyne and Nvidia are leaders in providing high-resolution sensing and high-performance autonomous driving computers,” said Mike Jellen, president and chief commercial officer of Velodyne Lidar. “As an Nvidia DRIVE ecosystem partner, our intelligent lidar sensors are fundamental to improving the autonomy, safety and driver assistance systems of leading global manufacturers.”
Velodyne offers the industry’s broadest portfolio of lidar solutions, encompassing the full range of products required for the advanced driver assistance and autonomy programmes of car manufacturers, truck OEMs and Tier 1 suppliers.
The Velodyne sensors, proven over millions of miles of road driving, help a self-driving vehicle find the safest way to navigate and operate. The addition of Velodyne sensors enhances level 2+ advanced driver assistance system (ADAS) features, including automatic emergency braking (AEB) and adaptive cruise control (ACC).
“Velodyne’s lidar sensors help deliver intelligence to enable automated driving systems and road safety by detecting more objects and presenting vehicles with deeper insights into their environment,” says Glenn Schuster, senior director of sensor ecosystem development at Nvidia.