The Best Self-Driving AI Technologies: Considering Safety and Cost
Introduction
Driving gets stressful, especially on a long journey when you have to drive for hours. Even traffic jams can make driving a miserable experience. So, sitting for hours with your eyes fixed on the road and your muscles aching, you think of one thing: "What if this car could drive itself?" Well, artificial intelligence has an answer for you.
With the advent of AI, we now have cars that can take complete control and drive themselves. With the assistance of AI and machine learning, cars can navigate traffic, avoid obstacles, and make decisions without human intervention. The state of AI in 2023 has made all of these advances possible.
The most pivotal feature of self-driving technology is the car's awareness of its surroundings. This awareness is what makes self-driving safer and less likely to cause accidents. A self-driving car constantly monitors its surroundings, seeing things that even an attentive human driver cannot, and unlike a human, it cannot be distracted. This awareness depends on several sensing technologies:
Self-driving AI Technologies
Lidar
Lidar sensors map the vehicle's environment and measure object range using laser light. They emit laser pulses around the car and time the reflections to work out how far away each object is. Lidar data can also be used to build a 3D representation of the environment on the car's display screen.
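To make the time-of-flight idea concrete, here is a minimal, illustrative Python sketch (not any manufacturer's actual code): it turns the round-trip time of a laser pulse into a range estimate, since the pulse travels to the object and back at the speed of light.

```python
# Minimal sketch of lidar-style time-of-flight ranging (illustrative only).
# The sensor fires a laser pulse and times the reflection; the object's distance
# is half the round-trip time multiplied by the speed of light.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to the reflecting object, in meters."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A reflection that arrives after about 0.5 microseconds came from an object
# roughly 75 meters away.
print(f"{lidar_range(0.5e-6):.1f} m")  # -> 74.9 m
```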
Sonar
Sonar sensors (also called ultrasonic sensors) use sound to detect the vehicle's immediate surroundings. The sensor emits an ultrasonic pulse and listens for the echo, computing the distance between the vehicle and nearby objects from the time the echo takes to return. This makes them especially useful at short range, for example when parking.
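The same round-trip idea applies here, only with sound instead of light. A minimal, illustrative sketch (again, not real sensor firmware):

```python
# Minimal sketch of ultrasonic (sonar) echo ranging (illustrative only).
# The sensor emits a sound pulse and measures how long the echo takes to return.

SPEED_OF_SOUND_M_S = 343.0  # meters per second in air at about 20 °C

def sonar_range(echo_time_s: float) -> float:
    """Distance to the nearest obstacle, in meters (half the round trip)."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2

# An echo arriving after 6 milliseconds means an obstacle about a meter away,
# which is the kind of short-range reading useful when parking.
print(f"{sonar_range(0.006):.2f} m")  # -> 1.03 m
```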
Radar
Radar sensors use radio waves to scan the area around a vehicle. An antenna transmits radio waves that bounce off the surfaces of nearby objects and return to the sensor. These reflections help the sensor identify objects and terrain around the vehicle.
Cameras
Self-driving cars use high-resolution cameras to see their surroundings in real time. Computer vision software reads road signs and lane markings from the camera feed to guide autonomous driving, letting the vehicle recognize its surroundings quickly and reliably.
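As a rough illustration of how lane markings can be pulled out of a camera frame, here is a short sketch using the open-source OpenCV library with a classical edge-and-line pipeline. The file name road.jpg is just a placeholder for a dashcam image, and real self-driving systems use far more sophisticated neural-network perception than this:

```python
# Toy lane-marking detection on a single camera frame (illustrative only).
import cv2
import numpy as np

frame = cv2.imread("road.jpg")                     # placeholder dashcam frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)        # reduce noise before edge detection
edges = cv2.Canny(blurred, 50, 150)                # highlight strong intensity changes

# Fit straight line segments to the edges; long segments are lane-marking candidates.
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)  # draw detected segments

cv2.imwrite("road_with_lanes.jpg", frame)
```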
Neural Networks
Like most modern AI systems, self-driving AI is built on neural networks. Neural networks let the AI react on the road the way a human would: because the system is trained on large amounts of human driving data, its driving looks natural to the other drivers around it.
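To show the shape of that idea, here is a toy PyTorch sketch of "learning to drive from human examples": a tiny convolutional network maps a camera frame to a single steering-angle output. The architecture and sizes here are invented for illustration; production driving networks are vastly larger and are trained on enormous amounts of recorded human driving:

```python
# Toy behavioral-cloning network: camera frame in, steering angle out (illustrative only).
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                 # pool to one value per channel
        )
        self.head = nn.Linear(32, 1)                 # single output: steering angle

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = SteeringNet()
frame = torch.rand(1, 3, 66, 200)      # one fake RGB camera frame
steering_angle = model(frame)          # in training, compared against the human's steering
print(steering_angle.shape)            # torch.Size([1, 1])
```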
Self-driving technology also relies on maps. If your car is going to drive itself, it has to know where it is going, so the AI uses maps to plan the fastest route to a destination and the best entry point once it arrives. Together, these maps and sensors are what allow self-driving technology to judge the distance and type of objects in its surroundings safely and reliably.
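As a small illustration of the route-planning side, here is a sketch that treats a map as a weighted graph (intersections as nodes, travel times as edge weights) and finds the quickest route with Dijkstra's algorithm. The toy map below is invented for the example; real routing engines work with far richer map data and live traffic:

```python
# Minimal fastest-route search over a toy road map (illustrative only).
import heapq

def fastest_route(graph, start, goal):
    """Return (total_minutes, path) for the quickest route, or (inf, []) if none exists."""
    queue = [(0, start, [start])]          # (cost so far, current node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, minutes in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical map: travel times in minutes between points.
road_map = {
    "home":     {"highway": 5, "downtown": 12},
    "highway":  {"mall": 9},
    "downtown": {"mall": 4},
    "mall":     {},
}
print(fastest_route(road_map, "home", "mall"))  # -> (14, ['home', 'highway', 'mall'])
```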
Self-driving technology has been in development since as far back as 1939. For most of that time, however, it remained experimental, with no guarantee of the safety needed to go mainstream. The first commercially available autonomous vehicle was the Navia shuttle, released in early 2014. Since then, many companies have explored self-driving technology, and there have been huge advances.
Self-driving Technology Companies
Tesla
You cannot discuss self-driving vehicles without mentioning Tesla. Tesla cars ship with an Autopilot feature built on Tesla's self-driving AI, and since 2014 the company has cemented its place as a leading self-driving technology company. Tesla's Autopilot AI works alongside the hardware built into the car to enable effective self-driving. The system relies on computer vision: eight cameras installed in strategic positions study traffic and the car's surroundings. Wide-angle, regular, and narrow-angle cameras see all around the car, as far as 250 meters, reading signs and watching out for other road users. Tesla's computer vision feeds a deep-learning neural network computer that processes all of this visual information in real time.
Tesla's Autopilot lets cars switch lanes, steer, and park. However, the technology does not yet support fully autonomous travel: in-cabin cameras watch to ensure the driver remains focused and ready to take the wheel at any time. Tesla's self-driving technology currently relies on just cameras and ultrasonic sensors. These can spot an approaching collision on any side of the car and can even sense when the headlights are needed. Earlier Teslas also used radar, but Tesla has since dropped it from its approach, and following radar's removal the company plans to take out the ultrasonic sensors too, so its self-driving technology will ultimately depend solely on Tesla Vision. Tesla has also promised that its cars will support fully driverless travel later in 2023, meaning a driver could, in theory, fall asleep while the Tesla drives itself.
Waymo
Waymo is a subsidiary of Alphabet Inc. (Google's parent company). Through DeepMind and its other AI ventures, Alphabet has made an indelible impression on AI technology, and Waymo is another example. While other self-driving solutions require human supervision, Waymo offers a genuinely driverless experience: with the Waymo Driver, you can lean back and let the car take you to your destination. Unlike Tesla, Waymo does not build cars, just the sensors and computing needed for self-driving. That is why it has partnered with companies like Jaguar, Volvo, Geely, and Daimler Trucks. As a result, the Waymo Driver can operate almost any road vehicle, from sedans and SUVs to light-duty and heavy-duty trucks.
For over a decade, Waymo has been steadily building its self-driving technology, and it now has what is quite possibly the most advanced autonomous driving system operating on public roads. The Waymo Driver builds a custom, detailed map of every area it serves and stores it offline; the map aids navigation, while the onboard sensors handle the moment-to-moment driving. Cameras read signs and spot the features that help the Driver understand an area, such as construction zones and zebra crossings. The number of camera sensors varies from vehicle to vehicle: a Chrysler Pacifica uses 19, while a Jaguar I-Pace uses 29. Lidar sensors give the Waymo Driver a bird's-eye view of the environment regardless of the time of day or lighting conditions, and the sensors also track the speed of the objects around the vehicle. The project launched in 2009 as Google's self-driving car project before becoming Waymo under Alphabet in 2016, and it is now publicized as the world's most capable driver. The Waymo Driver has traveled billions of miles across experimental, testing, and public driving, and it powers the leading autonomous ride-hailing service, currently available in a couple of US states. Waymo hopes to make it available across the US and in many other countries.
Cruise Automation
Cruise, formerly Cruise Automation, is a subsidiary of General Motors, and just like Waymo it offers fully driverless vehicles for ride-hailing. GM acquired Cruise in 2016 in a deal reported to be worth over $1 billion, and Cruise has been building self-driving systems for GM ever since. Cruise vehicles use camera, lidar, and radar sensors to understand the area around them; with these sensors, a vehicle tracks objects hundreds of meters away, and a fast onboard computer lets it make decisions in a split second. Thanks to machine learning and many miles of testing, the vehicles navigate traffic naturally, even when traffic conditions change unexpectedly. After years of development, Cruise now operates legally as an autonomous ride-hailing service in San Francisco, Austin, and Phoenix, with promised expansion into other states and countries. For now, though, Cruise does not allow riders to travel with pets or children.
Zoox
Zoox is an Amazon subsidiary developing fully self-driving technology. After cashierless stores and delivery drones, Amazon's move into self-driving is no surprise. Zoox's self-driving AI is built on sensors, software, and machine learning. Rather than retrofitting regular cars, Zoox is building its own vehicle designed around its sensors and autonomous driving: a bidirectional, fully electric vehicle intended to solve most of the problems in the robotaxi market. The hardware includes 14 cameras and eight lidar sensors, plus radar and ultrasonic sensors, giving the car a 360° view of its environment out to roughly 150 meters. The information is processed by a machine learning system built to keep operating even if a sensor or piece of software fails. Zoox's AI lets the cars adjust to sudden changes in weather, traffic, and road conditions, and the company also handles the mapping itself, mapping every city in which it operates. Zoox operates in US cities such as San Francisco and Seattle, and there is talk of expanding elsewhere.
Nvidia
Nvidia Drive is Nvidia's platform for automated road transportation. Many people know Nvidia as a maker of processing hardware and the software that runs on it, and the company has brought that expertise to the self-driving sector, building software-based self-driving AI on decades of experience in high-performance computing, imaging, and AI. An onboard supercomputer processes data from the camera, lidar, and radar sensors to understand the car's environment. Alongside the autonomous driving technology, the system also uses driver monitoring: it is not fully autonomous and requires the driver to watch the road and remain ready to take over at any time. The hardware includes:
- 12 exterior and 3 interior cameras.
- 12 ultrasonic sensors.
- 8 radars.
- 1 front-facing lidar.
- 1 ground-facing lidar.

Nvidia emphasizes the importance of software in in-car artificial intelligence and believes its Drive OS is the safest operating system for in-vehicle AI and neural networks. So far, Nvidia has produced working driver-assistance AI for cars and trucks, partnering with companies such as Jaguar Land Rover, Mercedes-Benz, Hyundai, and Volvo, among others. Its fully autonomous car technology, however, is still experimental and not publicly available.
Momenta
Momenta is a Chinese company developing self-driving technology for cars. Since its founding in 2016, it has partnered with automakers including General Motors. Momenta's self-driving solutions target both the commercial and private transport markets, and its AI is designed to perform in every driving situation, from highways to urban driving and parking, even in tight spaces.
The MPilot software combines information from lidar, radar, and 11 camera sensors into a real-time, high-definition picture of the environment. The AI uses perception and prediction to make quick, safe decisions; other drivers and pedestrians can be unpredictable, and Momenta's AI is designed to react in a split second. The system is built to work with a variety of vehicles, so the number and placement of sensors can be adjusted per model, with the overall aim of monitoring the car's surroundings up to 200 meters away. Most of Momenta's work is still experimental and not yet on the road, but General Motors hopes that MPilot will soon come equipped in its cars.
Baidu
Baidu is another Chinese technology company, one so big that some people call it the Chinese Google. Like Google, Baidu is researching self-driving AI, and it hopes to run the world's largest driverless ride-hailing service in 2023. In 2021, Baidu partnered with Geely; combining Baidu's AI expertise with Geely's automotive expertise, Baidu's driverless ride-hailing service, Apollo Go, was a success. Baidu is now working on its next-generation robotaxi, the Apollo RT6. It is still experimental, but Baidu promises it will be available to the public in the second half of 2023 and claims it navigates the road like a driver with 20 years of experience. That is made possible by the AI Baidu has spent decades building, fed by 38 sensors: 12 ultrasonic sensors, 12 cameras, eight lidar sensors, and six radar sensors. Since its launch, Apollo Go has completed over 1 million rides and has become a popular ride-hailing option in China. As of January 2023, however, Baidu's robotaxis are only available in 10 major Chinese cities, such as Beijing and Wuhan. Unlike Baidu's internet services, we hope its robotaxis will eventually reach other countries around the world.
Mobileye
For over two decades now, Mobileye has been making massive strides in self-driving technology. Mobileye is a leading driver-assistance AI company, and now it wants to fill the roads with fully autonomous vehicles. Its driver-assist technology uses a 360° camera view and EyeQ processing chips. This driver-assist mode is hands-free but still requires the driver's eyes on the road. Mobileye's AI is also equipped with REM (Road Experience Management) crowdsourced mapping, which helps it make safe decisions quickly, even in unexpected situations.
The hardware includes 11 cameras plus lidar sensors capable of seeing around the vehicle as far as 200 meters. Mobileye says it hopes to lead the evolution from driver-assisted driving to fully driverless travel, and it has developed more advanced EyeQ chips and its SuperVision technology as steps from hands-off toward eyes-off driving. The future of both private and commercial driving is autonomous, and Mobileye is working to be part of it.
Comma
Comma ai's Openpilot is a driver-assistance system. Like Tesla's Autopilot, it requires the driver to stay focused on the road while the system drives. Where Openpilot differs from everything above is that it does not come pre-installed in any vehicle: it is strictly a third-party product with its own installation process. For the Openpilot software to work, the car must already have certain sensors, and the sensors a car ships with determine whether it is compatible; what you purchase is a comma device with its cameras, which plugs into the car. Because so many cars now come with the basic sensors needed for simple driver assistance, the latest Comma 3 works with over 200 car models. Once plugged in, Openpilot uses the car's sensors to sense the environment while the AI takes the wheel, and a driver-monitoring system keeps an eye on drivers so they keep their eyes on the road. Comma ai's Openpilot is probably the best third-party hands-off, eyes-on self-driving technology available. Even though 200 sounds like a large number, Openpilot's compatibility is still limited; we hope to see a more broadly compatible update from Comma ai soon. Maybe the Comma 4?
Conclusion
As fancy and relaxing as self-driving is, its benefits go well beyond comfort. Over 80% of road accidents are caused by human error: drunk driving, distracted driving, speeding, and so on. Even since AI entered driving, most accidents involving self-driving cars have been caused by other, human drivers. Leading self-driving companies have made great progress on V2I (vehicle-to-infrastructure) communication, which lets vehicles exchange information with road infrastructure, and they are now working toward seamless V2V (vehicle-to-vehicle) communication, which would let vehicles talk directly to one another.
Over the next decade, more and more vehicles on the road, including public transport, are likely to become autonomous. That ultimately means safer roads, and it means people with disabilities will be able to travel independently. Also, imagine what you could do instead of staring at the road for hours.
Autonomous driving will not just improve transportation; AI stands to revolutionize it completely. Perhaps we will even see self-driving motorcycles soon. Subscribe to the best AI newsletter to stay updated on innovations in self-driving technology and everything else concerning AI.