Tesla has begun rolling out its enhanced Autopilot driver-assistance software, dubbed Full Self-Driving (FSD). Available only as a limited beta for now, the update expands on the existing technology by enabling Tesla's cars to turn at intersections and navigate around other vehicles. Previously, Autopilot could only drive straight ahead and stop at traffic lights or stop signs.
The strategy Tesla has taken to develop self-driving technology using cameras, rather than advanced LIDAR sensors, has been roundly criticized by competitors. They say that skimping on sensors in a safety-critical environment like a car is dangerous and could set back the industry should any fatalities occur. LIDAR uses lasers to capture the depth of objects in view so the software can, say, distinguish between a real person and a 2D picture of one.
It's not autonomous — Critics believe that Tesla is playing fast and loose with vocabulary. A "full self-driving" vehicle should be able to transport a passenger from point A to point B without any human intervention or monitoring necessary. Even with FSD enabled, Tesla still warns customers that they need to keep their eyes on the road and be ready to take over the car at any time.
In the past, fatalities involving Autopilot have occurred when drivers were doing other things like playing games on their phones. The problem with an advanced driver assistance program is that the driver may become complacent, or abuse the system, and lose crucial seconds needed to take over control of the vehicle and avoid an incident. Tesla owners frequently document instances when Autopilot fails to recognize an obstacle ahead, such as a lane being closed by traffic cones.
In the release notes for FSD, Tesla writes: “Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.”
Technically speaking, it's legal for Tesla to roll out this software in the U.S., but the National Highway Traffic Safety Administration said it "will not hesitate to take action to protect the public against unreasonable risks to safety." Google's Waymo has been much slower to roll out self-driving tech, only last month introducing a limited public service in Phoenix. Tesla wants to bring FSD to thousands of cars by the end of the year and use drivers as guinea pigs for testing the software.
Philosophical choices — Tesla is the only company in the emerging self-driving space to shun LIDAR. The sensors are expensive, and if Tesla is able to create a system that's just as accurate with standard cameras, it could wipe out everyone else in the industry, including players like Waymo, which has invested billions in creating self-driving cars with custom LIDAR gear.
The new FSD is supposed to rely on neural networks, learning from other drivers as they identify and correct mistakes made by Autopilot. Tesla also tests cars in "shadow mode," meaning it takes footage from one of its vehicles driving under manual control and reviews what actions Autopilot would have taken along the way.
Since no self-driving technology has yet become a standard or proven itself fully comparable to a human driver, it will take more time to determine which approach comes out on top. Waymo may at least win faster acceptance from regulators thanks to its cautious deployment and robust technology.