Trying to figure out the level of autonomy of Tesla Autopilot can be a challenge for someone coming in cold. That's to say nothing of understanding the Society of Automotive Engineers' autonomy levels (dive in below). It's enough to make anyone a little nuts. So we've brought together the most common terms used to describe autonomous technology and offered definitions.
Levels of Autonomy
SAE level 0: The car is just a car, and the driver is responsible for everything. No cruise control even, so basically this is the beater you drove around right out of school and were lucky if it had power windows. Fancy versions might be able to warn you if there's something in your blind spot – but the key here is that the car will never do anything without you.
SAE level 1: Hot damn, it has cruise control. But that's basically it. This is the bare minimum of driver assistance, and it describes the base model of most inexpensive cars on the market today. It probably has sensors and warning systems, but the car can't react without you; it can only maintain settings you have given it (i.e., cruise control, automatic parking), which is why these are also marketed as driver assistance cars.
SAE level 2: Alright, if you have one of these you feel like you made it. These cars are the ones that can stay in lanes and hit the brakes for you. Also called partially autonomous, the car is able to react to warning systems, and can steer and change how fast it’s going, but the driver still has to be doing the driving and paying attention to the road. The current version of Tesla Autopilot is an SAE level 2, no matter how autonomous the system looks right now.
SAE level 3: Also called “conditionally autonomous,” these bad boys aren’t commercially available yet, but these are the cars that will actually be able to decide to pass the car that’s braking in front of them, instead of just braking in response. There will still be times when the human driver has to take the wheel – in bad weather or poorly mapped areas, the car can ask you to drive when it isn’t safe for the system to handle things itself. There are a few level 3 car prototypes in the works, including at Tesla.
SAE level 4: Highly autonomous cars can drive by themselves, but not at all speeds or in all conditions. The car isn’t going to ask the driver to take over, but maybe it can’t leave the city it was given the data for, or it doesn’t work in severe weather.
SAE level 5: “The full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver,” is the SAE International definition of a fully autonomous car. The car has got it covered.
Black box: Like in an airplane, autonomous cars are probably going to have devices that record safety information in case of accidents. In Germany, there’s a proposal for the black box to only record when the autonomous system is active and when the human driver is in control, basically as an insurance measure.
Vehicle-to-vehicle communication, V2V: One of the big things in autonomous cars is that they will know where all the other cars are because they’re transmitting information about where they are and how fast they’re going. We’re going to see V2V systems as early as 2023, and even in driver assistance cars without any real autonomous function, they’re likely to significantly improve safety.
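To make that concrete, here's a minimal sketch of the kind of calculation V2V enables. The message fields and function names here are made up for illustration – real V2V standards carry far richer data – but the idea is the same: two cars broadcast position and speed, and the car behind can work out how much time it has before it closes the gap.

```python
from dataclasses import dataclass


@dataclass
class V2VMessage:
    # Hypothetical message fields -- real V2V messages carry much more.
    position_m: float  # position along the road, in meters
    speed_mps: float   # speed, in meters per second


def time_to_collision(follower: V2VMessage, leader: V2VMessage):
    """Seconds until the follower closes the gap to the leader, or None if not closing."""
    gap_m = leader.position_m - follower.position_m
    closing_mps = follower.speed_mps - leader.speed_mps
    if closing_mps <= 0:
        return None  # the gap is holding steady or growing
    return gap_m / closing_mps


# A car 60 m behind, closing at 10 m/s, has 6 seconds -- plenty of time to warn the driver.
```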
IMU: Inertial measurement unit: This is the device that senses the car’s acceleration and rotation – it’s how the car knows it’s speeding up, turning, or hitting the brakes.
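As a rough illustration (hypothetical function names, not any real automotive API): integrating an accelerometer's readings over time tracks how the car's speed changes, and a strong negative reading along the direction of travel is the signature of braking.

```python
def velocity_after(v0_mps, accel_samples_mps2, dt_s):
    """Dead reckoning: integrate accelerometer samples to track speed over time."""
    v = v0_mps
    for a in accel_samples_mps2:
        v += a * dt_s  # each sample nudges the speed estimate by a * dt
    return v


def is_braking(longitudinal_accel_mps2, threshold_mps2=1.0):
    """A sustained negative longitudinal acceleration reads as braking."""
    return longitudinal_accel_mps2 < -threshold_mps2
```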
DATMO: A research paper from a team led by Anna Petrovskaya, an engineer at Stanford University, explains DATMO as:
Autonomous driving in populated areas requires great situational awareness. The autonomous vehicle must perceive not only the stationary environment but also dynamic objects such as vehicles and pedestrians. For each moving target, the autonomous vehicle needs to identify location and velocity, so that it can predict the target’s position a few seconds later for planning purposes. Awareness of moving objects includes both the detection of new targets and tracking of existing targets over time. For this reason, it is often referred to as detection and tracking of moving objects, or DATMO for short.
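The prediction step described above – using a target's location and velocity to guess where it will be a few seconds later – boils down, in its simplest form, to constant-velocity extrapolation. A toy sketch with illustrative names (real trackers also model acceleration and uncertainty):

```python
def predict_position(x_m, y_m, vx_mps, vy_mps, dt_s):
    """Constant-velocity extrapolation: where a tracked target will be dt_s seconds later."""
    return (x_m + vx_mps * dt_s, y_m + vy_mps * dt_s)


# A car 10 m/s ahead and drifting 2 m/s sideways will be 30 m further on and
# 6 m over after 3 seconds -- enough for the planner to decide whether to yield.
```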
SLAM: This is shorthand for an algorithm that allows a computer to make a map of the world around it and identify where it is on that map. SLAM is short for simultaneous localization and mapping, and there is a whole field of research devoted to figuring out how to do this. Most systems use a combination of GPS and either LiDAR or cameras and RADAR to figure out where the car is in real time.
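Real SLAM systems are far more involved, but a noise-free one-dimensional toy shows the core idea: the first time the robot sees a landmark, it adds it to the map using its current pose estimate; on later sightings, it uses the mapped landmark to correct its pose. All names here are illustrative, not from any SLAM library:

```python
def toy_slam_1d(odometry_m, range_readings_m):
    """Noise-free 1D SLAM toy: one robot, one landmark straight ahead.

    odometry_m[i] is how far the robot drove before reading i;
    range_readings_m[i] is the measured distance to the landmark.
    Returns (pose estimates, landmark position estimate).
    """
    pose_m = 0.0
    landmark_m = None
    poses = []
    for step, r in zip(odometry_m, range_readings_m):
        pose_m += step                 # motion update (dead reckoning)
        if landmark_m is None:
            landmark_m = pose_m + r    # first sighting: add landmark to the map
        else:
            pose_m = landmark_m - r    # later sightings: localize against the map
        poses.append(pose_m)
    return poses, landmark_m
```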
LiDAR: “Light Detection And Ranging” – basically, lasers mounted on the car are constantly sending out pulses and using the reflected light to figure out what’s around it by measuring how long each pulse takes to bounce back. Google’s self-driving cars use LiDAR, and Elon Musk thinks it’s a bad decision. Which is fair, because LiDAR performs poorly in foggy or dusty conditions. Tesla uses RADAR and cameras instead, and Musk has pushed to rely on the cameras alone.
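The ranging math itself is simple: a pulse travels out and back at the speed of light, so the distance is the round-trip time times the wave speed, divided by two. The same geometry works for radar and sonar with a different wave speed. A sketch with a hypothetical helper name:

```python
SPEED_OF_LIGHT_MPS = 299_792_458  # meters per second, in air (to a good approximation)


def range_from_time_of_flight(round_trip_s, wave_speed_mps=SPEED_OF_LIGHT_MPS):
    """Distance to a target from a pulse's round-trip time: out and back, so halve it."""
    return wave_speed_mps * round_trip_s / 2


# A LiDAR echo after 200 nanoseconds means the target is about 30 m away;
# pass the speed of sound (~343 m/s) instead to get sonar ranges.
```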
RADAR: Instead of using laser pulses, RADAR (“radio detection and ranging”) uses pulses of radio waves, and maps the world based on how the radio waves bounce back. It can be used in bad weather conditions, but its resolution is much coarser than LiDAR’s, so it can’t pick out the fine shape detail that lasers can.
Machine vision/Stereo vision: Cameras – often mounted in pairs so the car can judge depth the way two eyes do – plus algorithms to identify obstacles. Enough of them can give the car 360 degrees of vision.
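For a calibrated stereo pair, depth falls out of a classic formula: an obstacle's distance Z equals the focal length f times the distance between the cameras B, divided by the disparity d – how far the obstacle shifts between the two images. A minimal sketch with made-up parameter names:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo depth: Z = f * B / d for a calibrated camera pair.

    focal_px: focal length in pixels; baseline_m: distance between the cameras;
    disparity_px: horizontal shift of the object between left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("object must appear in both images with positive disparity")
    return focal_px * baseline_m / disparity_px


# Nearby objects shift a lot between the two images (big disparity, small Z);
# distant ones barely shift at all, which is why stereo depth degrades with range.
```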
SONAR: Like bats and submarines, sonar (“sound navigation and ranging”) uses sound waves to create a picture of the environment based on the bounce-back of the sound. Because they’re sound waves, they can be used in bad weather. Sonar sensors are cheap to make, but they have really short ranges.
Photos via SAE International, Getty Images / Justin Sullivan