LiDAR is a critical sensor for enabling autonomous vehicles. Faststream Technologies is developing and commercializing the next generation of LiDAR systems for automotive applications using Velodyne’s scalable auto-grade LiDAR sensor, core 3D software technology, and proprietary LiDAR ASIC engine. Both companies will contribute key components, technologies, know-how, and other intellectual property needed to optimize the next generation of affordable, high-performance LiDARs for the automotive market.
Detection and imaging in autonomous cars
Manufacturers are outfitting modern cars with a wide array of advanced control and sensing functions. Collision warning and avoidance systems, blind-spot monitors, lane-keeping assistance, lane departure warning, and adaptive cruise control are examples of established features that assist drivers and automate certain driving tasks, making driving a safer and easier experience.
Faststream Technologies’ LIDAR and radar share a broad array of common and complementary capabilities: both can map their surroundings and measure object velocity.
Our LIDAR systems can detect objects at distances ranging from a few meters to more than 200 m, although the exact range depends on the type of system. Like most LIDARs, they have difficulty detecting objects at very close distances.
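As a rough illustration of how a pulsed time-of-flight LIDAR converts a measured round-trip delay into range (a generic sketch of the underlying physics, not Faststream's implementation; all figures are illustrative):

```python
# Sketch: time-of-flight range measurement for a pulsed LIDAR.
# The laser pulse travels to the target and back, so distance is
# half the round-trip path length. Constants are illustrative.

C = 299_792_458.0  # speed of light in a vacuum, m/s


def range_from_round_trip(delay_s: float) -> float:
    """Distance to a target, given the round-trip delay of a laser pulse."""
    return C * delay_s / 2.0


# A target at 200 m returns a pulse after roughly 1.33 microseconds.
delay = 2 * 200.0 / C
print(round(range_from_round_trip(delay), 1))  # 200.0
```

This also shows why very close targets are hard: at a few meters, the return arrives within tens of nanoseconds, which is challenging for the receiver electronics to time precisely.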
Because LIDAR collimates laser light and operates at short infrared (IR) wavelengths (905 nm to 1,550 nm), spatial resolution on the order of 0.1° is possible. This allows for extremely high-resolution 3D characterization of objects in a scene without significant backend processing. On the other hand, radar’s much longer wavelength (about 4 mm at 77 GHz) struggles to resolve small features, especially as distances increase.
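To see what an angular resolution figure means in practice, the smallest lateral feature a sensor can separate grows linearly with distance. The sketch below compares the 0.1° LIDAR figure from the text with a 1° radar figure (the radar value is an assumed, illustrative number, not a quoted spec):

```python
import math


def lateral_resolution(distance_m: float, angular_res_deg: float) -> float:
    """Smallest resolvable lateral feature size at a given distance,
    for a sensor with the given angular resolution."""
    return 2.0 * distance_m * math.tan(math.radians(angular_res_deg) / 2.0)


# At 100 m, a 0.1° LIDAR separates features about 0.17 m apart,
# while a hypothetical 1° radar only resolves features ~1.75 m apart.
print(round(lateral_resolution(100.0, 0.1), 2))  # 0.17
print(round(lateral_resolution(100.0, 1.0), 2))  # 1.75
```

The tenfold gap in angular resolution translates directly into a tenfold gap in the feature size each sensor can distinguish, which is why LIDAR is better suited to object classification at range.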
Field of view (FOV)
Solid-state LIDAR and radar both have excellent horizontal FOV (azimuth), while mechanical LIDAR systems, with their 360-degree rotation, possess the widest FOV of all advanced driver assistance systems (ADAS) technologies. Our LIDAR has better vertical FOV (elevation) than radar. LIDAR also has an edge over radar in angular resolution (for both azimuth and elevation), which is one key feature necessary for better object classification.
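To put FOV and angular resolution together, a back-of-envelope sketch of how many measurements a scanning LIDAR collects per scan line (the 360° and 0.1° figures come from the text; this is illustrative arithmetic, not a Faststream spec):

```python
def points_per_scan_line(fov_deg: float, angular_res_deg: float) -> int:
    """Number of measurement points across one scan line, given the
    field of view and the angular step between measurements."""
    # round() avoids floating-point artifacts like 360/0.1 == 3599.9999...
    return round(fov_deg / angular_res_deg)


# A mechanical 360° LIDAR stepping at 0.1° yields 3,600 points per revolution
# per laser channel; a solid-state unit covering 120° yields 1,200.
print(points_per_scan_line(360.0, 0.1))  # 3600
print(points_per_scan_line(120.0, 0.1))  # 1200
```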
LIDAR and cameras are both susceptible to ambient light conditions; at night, however, LIDAR systems can still deliver very high performance because they supply their own illumination. Radar and modulated LIDAR techniques are robust against interference from other sensors.