AEye, a California-based developer of adaptive lidar solutions, is working with Nvidia to bring its sensing products to the Nvidia Drive autonomous vehicle platform.
The Nvidia platform is an open, end-to-end solution spanning Level 2+ automated driving through Level 5 fully autonomous driving. With AEye’s sensors supported on the platform, Nvidia says autonomous vehicle developers will have access to next-generation tools for increasing the saliency and quality of the data they collect as they build and deploy ADAS and AV applications. Specifically, AEye’s SDK and Visualizer will let developers configure the sensor and view point clouds directly on the platform.
“We are pleased to offer our customers the full functionality of our sensor on the Nvidia Drive platform,” said Blair LaCorte, CEO of AEye. “ADAS and AV developers wanting a best-in-class, full-stack autonomous solution now have the unique ability to use a single adaptive lidar platform from Level 2+ through Level 5 as they configure solutions for different levels of autonomy. We believe that intelligence will be the key to delivering new levels of safety and performance.”
“AI-driven sensing and perception are critical to solving the most challenging corner cases in automated and autonomous driving,” added Glenn Schuster, senior director of sensor ecosystems at Nvidia.
“As an Nvidia ecosystem partner, AEye’s adaptive, intelligent-sensing capabilities complement our Drive platform, which enables safe AV development and deployment.”