An Argument for LIDAR in Autonomous Vehicles

So it has happened. The first traffic death caused by a commercially available autonomous vehicle has occurred. Well, actually a semi-autonomous vehicle, but you get the picture. Joshua Brown died May 7 when the Tesla Model S in “Autopilot” mode failed to correctly identify a tractor trailer that had crossed into his path.

Jul 1st, 2016

Tesla has turned its new Autopilot feature off by default; to turn it on, a driver must acknowledge that the feature is still in beta and agree to keep their hands on the steering wheel and stay alert at all times. There is some evidence that Mr. Brown was not paying attention to the driving situation in front of him, but for this discussion that doesn't matter. What matters is that an autonomous vehicle broke two of Isaac Asimov's three laws of robotics: the robot failed to protect its human occupant, and the robot failed to protect its own existence.

So what went wrong? To understand that, you need to understand how Tesla's Autopilot mode works. The Tesla uses a front-facing camera and technology from an Israeli company called Mobileye. The Mobileye software and hardware, including the Mobileye EyeQ chip, are designed to recognize a scene much as a human would and react to it. Tesla isn't alone in using Mobileye's autonomous solutions: BMW, GM, Volvo, Hyundai, Renault Trucks, and others all use Mobileye solutions in the vehicles they are testing.

Supplementing its front-facing camera, a Tesla Model S also incorporates long-range, front-facing radar that can reach more than 500 feet; short-range ultrasonic sensors that can detect objects out to about 16 feet; and GPS. What the Tesla does not contain is LIDAR.
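The gap in that sensor suite can be made concrete with a small sketch. This is purely illustrative: the radar and ultrasonic ranges come from the figures quoted above, while the camera's effective recognition range is an assumption on my part, not a Tesla specification.

```python
# Hypothetical sketch: which sensors on a Tesla-like suite can even
# "see" an obstacle at a given distance? Radar (~500 ft) and
# ultrasonic (~16 ft) ranges are the figures quoted in the article;
# the camera range is an illustrative assumption.
SENSOR_RANGES_FT = {
    "front_camera": 250,   # assumed effective recognition range
    "front_radar": 500,    # long-range radar, per the article
    "ultrasonic": 16,      # short-range ultrasonic sensors, per the article
}

def sensors_covering(distance_ft):
    """Return the sensors whose nominal range reaches an obstacle."""
    return [name for name, rng in SENSOR_RANGES_FT.items()
            if distance_ft <= rng]

# A truck first visible 300 ft ahead is within radar range only:
print(sensors_covering(300))  # -> ['front_radar']
```

The point of the sketch is that at highway distances only one or two modalities are even in play, so a recognition failure in one of them leaves very little redundancy.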

Anyone who has seen a Google self-driving vehicle will have noticed the top-mounted LIDAR unit. Google has used LIDAR on top of its vehicles from the start, and the LIDAR Google uses is not cheap: the original units use 64 lasers and cost around $80,000 each. Today the prices have dropped somewhat (see this LFW article) with some volume, but the units are still quite expensive.

Tesla decided not to use LIDAR for two reasons: 1) cost, and 2) the fact that a LIDAR unit on top of a vehicle looks pretty ugly. Prices for much simpler LIDAR units have dropped to under $1,000, and units can now be mounted on multiple sides of a vehicle, each covering 180 degrees or less. Even so, LIDAR is still an expensive proposition for autonomous vehicle manufacturers.

The Tesla’s camera is believed to have misrecognized the truck crossing in front of it because, from the low-riding Tesla, the camera could see sky under the truck’s high trailer. Just today, Mobileye released a statement saying that its system was never designed for a scenario such as this, but that it may add one in the future. This seems like a logical explanation, because how can software ever recognize every scenario a driver may come across? Even as a human, I can say that every now and then I misrecognize a scene far down the road, at least until I get closer. If a human mind can’t recognize every situation all the time, it’s unlikely that a system viewing a scene through a camera can always recognize every scene either.

So what is the solution here? While the full cause of this accident won’t be known for some time, I believe automakers need to put LIDAR back in the equation, at least front-facing LIDAR. LIDAR can deliver far more precision than radar, and not including it in every autonomous vehicle is a big mistake. Certainly LIDAR doesn’t work under all conditions, and fog, heavy rain, or snow can cause problems for it, but when human lives are at stake, having more sensors to draw on is certainly beneficial. A camera lens can also get dirty, be covered with snow, or fail to see through fogged-up glass.

Hopefully this accident will be a wake-up call to vehicle manufacturers to consider including LIDAR as one of their primary sensors. In this case it appears it was human error that ultimately led to Mr. Brown’s death, but once automotive manufacturers start shipping their fully autonomous vehicles and another death occurs, the only humans in the equation to blame will be the vehicle manufacturers themselves.
