Introduction
On August 29, 2025, Main Street Autonomy published a comprehensive technical guide to automotive lidar, covering how lidar sensors work, the technologies used to implement them, and the practical challenges engineers face when deploying these systems in autonomous vehicles.
The guide represents one of the most detailed public resources on automotive lidar technology, covering everything from fundamental ranging principles to advanced calibration challenges. The article includes detailed explanations of technologies from various manufacturers including Velodyne, Ouster, Luminar, Livox, and others, providing real-world context for the technical concepts. As lidar becomes increasingly important for autonomous vehicles and robotics applications, understanding these technologies is crucial for engineers, researchers, and anyone interested in how self-driving cars perceive their environment.
Lidar technology has been a cornerstone of autonomous vehicle development, with many autonomous vehicle companies relying on lidar sensors to create detailed 3D maps of their surroundings. The guide includes examples of vehicles like the Waymo Jaguar I-Pace and Chrysler Pacifica Hybrid equipped with multiple lidar units, demonstrating real-world deployment. This comprehensive guide helps demystify the complex technology behind these sensors and explains why lidar remains essential for safe autonomous navigation.
What Lidar Does
Core Measurements
Lidar sensors operate by bouncing light off surrounding surfaces and measuring various properties of the returned light. The primary measurements include:
- Distance (ranging): Measured by calculating how long it takes for light to bounce back, using the constant speed of light
- Bearing: Determined by the direction the light is emitted or where the detector is pointed
- Reflectivity: Measured by how much light bounces back, useful for detecting road markings
- Speed: Calculated through Doppler shift measurements in the reflected light
- Ambient light: Measured to understand environmental lighting conditions
The combination of distance and bearing measurements allows lidar to create detailed 3D point clouds—collections of discrete 3D points in space that represent the environment. These point clouds enable autonomous vehicles to perceive obstacles, understand road geometry, and navigate safely through complex environments.
Point Cloud Generation
Each measurement from a lidar sensor corresponds to a 3D point in Cartesian coordinates. Given range r, polar angle θ (measured from the vertical z-axis), and azimuth φ (measured in the horizontal plane), the 3D point is calculated as:
x = r sin(θ) cos(φ)
y = r sin(θ) sin(φ)
z = r cos(θ)
This mathematical transformation allows lidar sensors to convert their raw measurements into the 3D point clouds that autonomous vehicle systems use for perception and planning.
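The conversion above can be sketched in a few lines of Python (a minimal illustration; the angle convention assumes θ is measured from the vertical z-axis and φ from the x-axis, matching the formulas above):

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert a lidar range/bearing measurement to a 3D Cartesian point.

    r     -- measured range (meters)
    theta -- polar angle from the vertical (z) axis, radians
    phi   -- azimuth angle in the horizontal plane, radians
    """
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)

# A beam fired horizontally (theta = 90 deg) straight ahead (phi = 0)
# at a surface 10 m away lands at approximately (10, 0, 0).
print(spherical_to_cartesian(10.0, math.pi / 2, 0.0))
```

Applying this conversion to every return in a scan yields the full point cloud.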
Measuring Distance: Ranging Methods
Direct Time of Flight Pulsed Lidar
The most common method for measuring distance is direct time of flight (ToF), where a laser pulse is fired and the time until the reflection returns is measured. The distance is calculated using:
d = ct/2
where c is the speed of light (3×10⁸ m/s) and the division by 2 accounts for the round-trip distance.
Modern lidar systems can measure time with nanosecond precision. Since the electronics typically run at around 1 GHz, the return signal is discretized on the order of 1 ns, which corresponds to a range resolution of 15 cm. To improve on this, interpolation filters, a standard signal-processing technique, typically enable centimeter-level ranging accuracy. The time series is then processed to find peaks corresponding to object surfaces, often by cross-correlating the known outgoing pulse shape with the return signal. This approach, known as matched filtering, provides strong resistance to noise, interference, and crosstalk.
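The matched-filtering step described above can be sketched with NumPy on synthetic data (a toy illustration; the pulse shape, sample rate, and noise level are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic outgoing pulse sampled at 1 GHz (1 sample = 1 ns = 15 cm of range).
pulse = np.exp(-0.5 * ((np.arange(16) - 8) / 2.0) ** 2)

# Simulated return: the pulse delayed by 200 samples, buried in noise.
true_delay = 200
signal = rng.normal(0.0, 0.2, size=1024)
signal[true_delay:true_delay + len(pulse)] += pulse

# Matched filter: cross-correlate the return with the known pulse shape
# and take the lag of the correlation peak as the round-trip delay.
corr = np.correlate(signal, pulse, mode="valid")
estimated_delay = int(np.argmax(corr))

c = 3e8  # speed of light, m/s
range_m = c * (estimated_delay * 1e-9) / 2  # d = c * t / 2
print(estimated_delay, range_m)
```

Even with noise several times stronger per sample than in this toy, the correlation peak remains easy to pick out, which is the practical appeal of matched filtering.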
Photodetector Technologies
Two main types of photodetectors are used in pulsed lidar systems:
- Linear-mode avalanche photodiodes (APD): Operate with reverse bias slightly below breakdown voltage, providing high sensitivity with linear response
- Geiger-mode avalanche photodiodes, better known as single-photon avalanche diodes (SPADs): Operate above breakdown voltage, where even a single photon can trigger a large current spike, enabling single-photon detection
SPADs are particularly useful for long-range detection and low-light conditions, while linear-mode APDs provide better dynamic range and intensity measurement capabilities.
Alternative Ranging Methods
Beyond direct time of flight, lidar systems also use:
- Amplitude modulated lidar: Modulates the amplitude of the outgoing light and infers distance from the phase shift between the outgoing and returned modulation
- Frequency modulated (FMCW) lidar: Sweeps the optical frequency and infers distance from the beat frequency between the outgoing and returned light, which also enables direct velocity measurement via Doppler shift
- Parallax lidar: Uses triangulation similar to stereo vision, measuring distance through geometric relationships
Each method has advantages for different applications, with direct ToF being most common for automotive applications due to its accuracy and range capabilities.
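As a worked example of the amplitude-modulated approach: distance follows from the measured phase shift Δφ between the outgoing and returned modulation as d = c·Δφ / (4π·f_mod), with an unambiguous range of c / (2·f_mod) before the phase wraps around. A minimal sketch (the 10 MHz modulation frequency is just an illustrative choice):

```python
import math

C = 3e8  # speed of light, m/s

def amcw_distance(phase_shift_rad, mod_freq_hz):
    """Distance implied by the phase shift of an amplitude-modulated signal."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz):
    """Beyond this distance the phase wraps and range becomes ambiguous."""
    return C / (2 * mod_freq_hz)

f_mod = 10e6  # 10 MHz modulation
print(unambiguous_range(f_mod))        # 15.0 m
print(amcw_distance(math.pi, f_mod))   # half the unambiguous range: 7.5 m
```

The wrap-around is why practical amplitude-modulated systems often combine several modulation frequencies to extend usable range.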
Determining Bearing: Beam Steering and Arrays
Array-Based Approaches
Lidar systems determine bearing (direction) through two main approaches:
- Discrete arrays: Multiple separate laser-detector pairs arranged in specific patterns
- Solid-state arrays: Integrated arrays where multiple beams are generated and detected simultaneously
The Velodyne HDL-64, for example, uses 64 discrete lasers arranged vertically, while newer solid-state lidars integrate hundreds or thousands of beams in a single chip.
Scanning and Beam Steering Methods
Various mechanical and optical methods are used to steer lidar beams:
- Spinning: Rotating the entire lidar unit (classic Velodyne approach)
- Spinning mirrors: Using rotating mirrors to redirect beams
- Oscillating mirrors/galvos: Fast-moving mirrors that scan back and forth
- MEMS mirrors: Micro-electromechanical mirrors that can be precisely controlled
- Optical phased arrays: Using interference patterns to steer beams without moving parts
- Risley prisms: Rotating prism pairs that create complex scanning patterns
- Baraja SpectrumScan: Using wavelength diversity for beam steering
Each method offers different trade-offs in terms of reliability, scanning speed, field of view, and cost. Solid-state approaches (like optical phased arrays) eliminate moving parts but may have limitations in field of view or scanning patterns.
Calibration Challenges
Beam Angle Offsets
One of the most common calibration issues involves incorrect beam angles. Each laser in an array may have slight angular offsets due to manufacturing tolerances, thermal expansion, or physical damage. These offsets can cause systematic errors in point cloud geometry, making straight corridors appear curved or causing ground plane estimation errors.
The well-known KITTI dataset, for example, required a manual beam-angle correction of about 0.22 degrees to achieve accurate results, highlighting the importance of proper calibration in lidar systems.
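To see why such small angular offsets matter, a back-of-the-envelope check: a beam-angle error of Δθ displaces a point laterally by roughly r·Δθ, so the error grows with range. For the 0.22-degree figure mentioned above:

```python
import math

def lateral_error_m(range_m, angle_error_deg):
    """Approximate lateral displacement of a point caused by a beam-angle
    error, using the small-angle approximation: error ~= r * delta_theta."""
    return range_m * math.radians(angle_error_deg)

# A 0.22-degree beam offset shifts a point at 50 m by roughly 19 cm,
# enough to bend a straight corridor or tilt a ground-plane estimate.
for r in (10, 50, 100):
    print(r, round(lateral_error_m(r, 0.22), 3))
```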
Range Offsets
Different lasers in a lidar array may have different range offsets, particularly in discrete array systems where each laser-detector pair is separately calibrated. This can cause points from certain beams to appear offset by several centimeters, creating artifacts in the point cloud.
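A per-beam range offset, once estimated (for instance against a flat calibration target), can simply be subtracted before projecting points into 3D. A minimal sketch of such a correction table (the beam IDs and offset values are invented for illustration):

```python
# Hypothetical per-beam range offsets in meters, estimated during calibration.
RANGE_OFFSETS = {0: 0.03, 1: -0.02, 2: 0.00, 3: 0.05}

def correct_range(beam_id, measured_range_m):
    """Subtract this beam's calibrated range offset from its raw measurement.

    Beams without a calibrated offset pass through unchanged.
    """
    return measured_range_m - RANGE_OFFSETS.get(beam_id, 0.0)

print(correct_range(3, 12.50))  # beam 3 reads 5 cm long: corrected to 12.45
```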
Pixel Crosstalk and Blooming
Similar to camera lens flare, strong lidar returns can cause "blooming" where light spills over to neighboring detectors. This crosstalk creates spurious returns around bright or reflective objects, potentially causing false obstacle detections. The guide notes that early prototypes of lidars like the Argo lidar and early Ouster OS1 prototypes were susceptible to blooming, though later firmware upgrades have mitigated these issues. This effect typically can't be easily calibrated away and is usually handled in lidar firmware.
Intensity-Dependent Range Bias
SPAD-based lidars (like early Ouster lidars and the now-defunct Argo lidar) can suffer from range bias when returns are very strong. When all SPADs saturate at the beginning of a pulse, the detected peak can be shifted, causing range measurements to be systematically biased. This is particularly problematic for highly reflective surfaces like painted road markings or retroreflectors. The guide notes that very advanced signal processing techniques are needed to compensate for this effect, as even slight saturation can cause significant range bias.
Encoder Issues
Mechanical scanning lidars that use encoders can experience:
- Hysteresis: Different readings depending on scan direction (clockwise vs. counter-clockwise). The guide notes that the Luminar Iris, which uses encoders for oscillating beam scanning, has a mode where part of the point cloud is an "up-scan" and the other part is a "down-scan," and the two often don't align well even when the vehicle is stationary, suggesting encoder hysteresis.
- Physical offset: Encoder rings that are misaligned (often just glued in place by humans), causing sinusoidal errors in angle measurements. This can cause a straight corridor to appear consistently curved to one side.
These issues can cause double-layer point clouds or systematic geometric distortions that are difficult to correct in post-processing.
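The physical-offset case described above is often modeled as a once-per-revolution sinusoidal error in the reported angle; if the amplitude and phase of that sinusoid can be estimated, the correction is straightforward. A sketch under that one-cycle-per-revolution assumption (the amplitude and phase values here are invented):

```python
import math

def corrected_angle(reported_rad, amplitude_rad, phase_rad):
    """Remove a once-per-revolution sinusoidal angle error, e.g. from an
    off-center encoder ring. Model: true = reported - A * sin(reported + p)."""
    return reported_rad - amplitude_rad * math.sin(reported_rad + phase_rad)

# A ring with ~0.1 degree of wobble biases angles most strongly on one
# side of the scan, the signature of a "curved corridor" artifact.
A = math.radians(0.1)
for deg in (0, 90, 180, 270):
    theta = math.radians(deg)
    print(deg, round(math.degrees(corrected_angle(theta, A, 0.0)), 4))
```

Hysteresis, by contrast, depends on scan direction rather than angle alone, which is why up-scans and down-scans need separate handling.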
Practical Implications for Autonomous Vehicles
Integration with Other Sensors
Lidar is typically used alongside cameras, radar, and other sensors in autonomous vehicle systems. The guide notes that in contrast to lidar, which measures both distance and bearing, a camera only measures bearing and ambient light intensity. Each pixel of a photo is a measurement of how much light there is in that particular direction, but cameras generally have much higher bearing resolution than lidar.
The combination of lidar's 3D point clouds with camera imagery enables more robust perception, allowing autonomous systems to understand both the geometry and appearance of the environment.
Real-World Deployment Challenges
The calibration challenges discussed in the guide highlight why lidar integration in autonomous vehicles requires extensive testing and validation. The guide notes that some lidar systems comprise multiple separate lidars in a single box (like the Livox Mid 100 with three Mid-40s, or the Luminar Hydra with two separate lidars), which can become misaligned and require separate calibration. Companies deploying autonomous vehicles must account for:
- Manufacturing variations between lidar units
- Environmental effects (temperature, vibration, physical damage)
- Multiple lidar units working together
- Long-term calibration drift
These factors explain why lidar systems require careful integration into vehicle platforms and ongoing calibration maintenance.
Terminology: LiDAR vs. lidar
The guide addresses an important terminology question: should it be "LiDAR" or "lidar"? The author argues for lowercase "lidar," similar to how "radar" and "laser" became lowercase as they transitioned from exotic military/academic technologies to common consumer products.
As lidar becomes standard in phones and vehicles, the lowercase form reflects its status as a mature, widely-used technology rather than an exotic acronym.
Solid State vs. Mechanical Scanning
The guide clarifies a common misconception: not all rectangular or non-spinning lidars are solid state. Solid state refers to the absence of macroscopic moving parts, not the physical shape.
For example:
- Livox lidars: Use Risley prisms or mirrors for mechanical scanning, despite their rectangular form factor
- Luminar lidars: Use galvos or polygonal mirrors, not solid-state beam steering
True solid-state lidars use technologies like optical phased arrays, which steer beams electronically without moving parts. This distinction is important because solid-state systems are often perceived as more reliable and durable, though mechanical scanning systems can offer advantages in field of view and scanning patterns.
Conclusion
Main Street Autonomy's comprehensive guide to automotive lidar provides valuable technical insights into one of the most important sensor technologies for autonomous vehicles. The detailed coverage of ranging methods, beam steering techniques, and calibration challenges offers engineers and researchers a thorough understanding of how lidar systems work and the practical considerations for deploying them.
Key Takeaways:
- Fundamental Technology: Lidar measures distance and bearing to create 3D point clouds essential for autonomous navigation
- Multiple Approaches: Various ranging methods (ToF, modulated, parallax) and beam steering techniques (mechanical, solid-state) offer different trade-offs
- Calibration Critical: Proper calibration is essential, with beam angles, range offsets, and crosstalk being common challenges
- Real-World Complexity: Deploying lidar in autonomous vehicles requires addressing manufacturing variations, environmental effects, and long-term stability
- Terminology Evolution: As lidar becomes commonplace, lowercase "lidar" is appropriate, similar to "radar" and "laser"
As autonomous vehicle technology continues to evolve, understanding lidar technology becomes increasingly important. The comprehensive nature of this guide makes it a valuable resource for anyone working with or interested in autonomous systems, sensor fusion, and 3D perception.
Sources
- All About Automotive Lidar - Main Street Autonomy, August 29, 2025
- Main Street Autonomy - Main Street Autonomy