Can Lidars Zap Camera Chips?

Lidar wavelengths deemed safe for human eyeballs might imperil digital camera sensors

Laser safety rules designed to protect human eyes have shaped the design of automotive lidars. Those rules hold the current generation of lidars, which emit invisible infrared light near 900 nanometers, to pulse powers so low that their measurement range tops out at 60 to 100 meters, too short for a car moving at highway speed to stop in time to avoid a collision.
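As a rough illustration of why 60 to 100 meters falls short, consider a back-of-the-envelope stopping-distance estimate; the speed, reaction time, and deceleration figures below are generic illustrative assumptions, not numbers from any lidar maker or safety standard.

```python
# Rough stopping-distance estimate: illustrative assumptions only.
# Total stopping distance = distance covered during reaction time
# plus braking distance v^2 / (2 * a).

def stopping_distance_m(speed_kmh, reaction_s=1.5, decel_ms2=6.0):
    v = speed_kmh / 3.6              # convert km/h to m/s
    reaction = v * reaction_s        # distance traveled before braking begins
    braking = v**2 / (2 * decel_ms2) # distance covered while decelerating
    return reaction + braking

for speed in (100, 120, 130):        # typical highway speeds, km/h
    print(f"{speed} km/h -> about {stopping_distance_m(speed):.0f} m to stop")
# 100 km/h already needs roughly 106 m, beyond the 60-100 m reach
# of 900-nm lidars; higher speeds need considerably more.
```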

Moving to a longer wavelength that does not penetrate to the retina allows new lidars to fire more powerful pulses and stretch their range beyond 200 meters, far enough for cars moving at highway speed to stop in time. Now a claim of lidar damage to the charge-coupled-device (CCD) sensor on a photographer's electronic camera has raised concern that new eye-safe long-wavelength lidars might endanger electronic eyes.

“I had never thought of that,” says Dennis Killinger, a veteran lidar developer and professor emeritus at the University of South Florida who consults for lidar companies. He says that he had not heard anyone in the industry talk about the possibility until an incident at the 2019 Consumer Electronics Show was reported. The biggest concern is not photographic cameras but rather the video cameras mounted on autonomous cars to gather crucial information the cars need to drive themselves.

Self-driving cars use artificial intelligence to generate a real-time three-dimensional map of the local environment from digitized maps of fixed objects and data collected by video cameras, lidars, microwave radars, and other sensors. Video cameras record images that can be used to locate and recognize objects, but they don't directly measure distance. That is done by lidars, which scan the local environment, firing about a million pulses per second to create a three-dimensional point cloud of distances. Radar measures velocity and distance, but at low resolution. Artificial intelligence combines these sensor inputs into a three-dimensional map it uses to steer the car.
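To make that fusion step concrete, here is a minimal sketch of how lidar distances can be attached to what a camera sees by projecting the point cloud into the image plane. The camera matrix, the lidar-to-camera transform, and the sample points are invented placeholders, not any carmaker's actual pipeline.

```python
import numpy as np

# Minimal sketch of one fusion step: project lidar points into the camera
# image so objects the camera recognizes can be tagged with distances.
# K (camera intrinsics) and T (lidar-to-camera transform) are placeholders.

K = np.array([[800.0,   0.0, 640.0],   # focal lengths and principal point (pixels)
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
T = np.eye(4)                           # lidar frame == camera frame in this toy example

def project_points(points_lidar):
    """Map Nx3 lidar points (meters) to pixel coordinates plus range."""
    pts = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    cam = (T @ pts.T).T[:, :3]          # transform into the camera frame
    cam = cam[cam[:, 2] > 0]            # keep only points in front of the lens
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]      # perspective divide to pixel coordinates
    ranges = np.linalg.norm(cam, axis=1)
    return pix, ranges

pix, ranges = project_points(np.array([[1.0, 0.5, 20.0], [-2.0, 0.0, 60.0]]))
print(pix, ranges)  # pixel locations and distances a planner could combine
```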

Most current car lidars are built around semiconductor diode lasers that emit a series of nanosecond pulses near 900 nanometers, which is not visible but does penetrate to the light-sensing retina at the back of the eye. What makes laser light particularly dangerous to the eye is that the light rays are parallel, so the eye focuses all the incoming laser pulses onto a tiny spot where intensity can be very high. Safety rules from the U.S. Food and Drug Administration limit the peak power of the laser pulse, which in turn limits lidar operating range. 
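A quick calculation shows why that focusing matters; the pupil and retinal-spot sizes below are round, textbook-style figures used only for illustration, not values drawn from the FDA rules.

```python
# Illustrative estimate of how much the eye concentrates a collimated beam.
# A dark-adapted pupil of ~7 mm focuses roughly parallel rays down to a
# retinal spot on the order of ~20 micrometers; the ratio of the two areas
# is the concentration factor. Round numbers for illustration only.

pupil_diameter_m = 7e-3
retinal_spot_m = 20e-6

gain = (pupil_diameter_m / retinal_spot_m) ** 2   # area ratio = (d1/d2)^2
print(f"Irradiance at the retina is boosted by roughly {gain:,.0f}x")
# -> roughly 122,500x, which is why even modest collimated beams
#    can exceed retinal damage thresholds.
```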

To extend lidar range beyond the 60- to 100-meter limit of 900-nanometer lidars, some companies are turning to longer wavelengths, which the eye absorbs before they can reach the retina. So far the favorite eye-safe wavelength is 1,550 nanometers, where lasers originally developed for fiber-optic communication systems are readily available, and looser safety rules allow pulse powers a hundred times as high. Luminar has used semiconductor lasers emitting at 1,550 nm to reach ranges of 300 meters, which give cars at highway speed adequate warning to stop safely. Its lidars can measure the posture of a pedestrian up to 250 meters away, giving the car advance warning of someone poised to walk across the street.
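A simplified link-budget sketch suggests why higher pulse power buys range: the return from a diffuse target falls off roughly as the square of the distance, so maximum range grows roughly as the square root of transmitted power. The baseline range and power ratios below are illustrative, and real systems fall short of this ideal because of detector sensitivity at 1,550 nm, optics, and atmospheric absorption.

```python
import math

# Simplified lidar scaling: returned signal from a diffuse target falls off
# roughly as 1/R^2, so maximum usable range grows roughly as sqrt(pulse power),
# all else equal. Baseline range and power ratios are illustrative only.

def scaled_range(base_range_m, power_ratio):
    return base_range_m * math.sqrt(power_ratio)

print(scaled_range(80, 100))  # 80 m baseline at 100x the power -> 800 m ideal ceiling
print(scaled_range(80, 10))   # a more modest 10x net gain -> ~253 m, closer to
                              # the 200-300 m figures quoted for 1,550-nm lidars
```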

Producers of laser light shows are well aware that laser beams can damage electronic eyes. “Camera sensors are, in general, more susceptible to damage than the human eye,” warns the International Laser Display Association on a Web page showing examples of laser damage and alerting photographers to avoid laser beams. The extent of damage can vary widely, depending on the distance from the source, beam direction, and power.

However, questions have emerged about the CES incident. The photographer blamed the damage on a 1,550-nanometer lidar with an unusually long range of 1,000 meters from AEye, a Pleasanton, Calif., startup. After more than a week, according to the photographer, the company asked to examine his $2,000 camera to assess what damage it had suffered and what might have caused it. But the photographer says he had already thrown the camera away during a recent move. Although published photos show flaws that may have been caused by damage to pixels, without the camera the nature, timing, and cause of the damage remain unknown.

Digital cameras intended for visual photography normally include filters that block infrared light from reaching the CMOS or CCD sensors, which would otherwise respond to those wavelengths, so a sensor's vulnerability to infrared damage depends on the design of those filters. Cameras used in autonomous cars rely on similar sensors and presumably carry filters built right onto the chips to block the widely used 900-nanometer lidar wavelength, but their vulnerability to longer-wavelength lidars is unknown.

Regardless of what actually happened in the CES incident, the claim points out the need to assess potential lidar damage to cameras, especially those used along with lidars in autonomous cars.

“Now is the time for people to look at this in depth,” says Killinger. Most lidar companies have not released detailed power specifications. Killinger says pulse duration also is important. Most current automotive lidars use nanosecond (10⁻⁹ second) pulses. However, lasers also can generate picosecond (10⁻¹² second) or femtosecond (10⁻¹⁵ second) pulses to measure distances more precisely, and concentrating pulse energy into such short time spans might raise their peak power to levels that could ablate material from the sensor surface. Designers have focused on eye safety, but Killinger says that “car cameras may need electronic eye protection.”
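The arithmetic behind that concern is simple: for a fixed pulse energy, peak power is energy divided by duration, so squeezing the same energy into a pulse a thousand times shorter raises peak power a thousandfold. The one-microjoule figure below is an arbitrary illustrative energy, not a published lidar specification.

```python
# Same pulse energy squeezed into shorter pulses: peak power = energy / duration.
# The 1-microjoule energy is illustrative, not a spec from any lidar maker.

energy_j = 1e-6  # 1 microjoule per pulse (illustrative)

for name, duration_s in [("nanosecond", 1e-9),
                         ("picosecond", 1e-12),
                         ("femtosecond", 1e-15)]:
    peak_w = energy_j / duration_s
    print(f"{name}: peak power ~ {peak_w:,.0f} W")
# nanosecond  -> ~1 kW peak
# picosecond  -> ~1 MW peak
# femtosecond -> ~1 GW peak; same energy, vastly higher peak intensity
```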

Editor's note: This story originally stated that the photographer said he could not provide his damaged camera to AEye because he could not find it. The story was updated on 28 January 2019 after the photographer contacted IEEE Spectrum to say he threw the camera away during a move soon after CES. 
