Try to find a single ablation study of a sensor suite. Waymo is in a good position to run one, and the corporation would have benefited from showing that vision-only systems aren't viable (both by demonstrating its good will toward public safety and by making life harder for vision-only competitors), yet no such study has come from them.
I guess they understand that computer vision is a fast-moving target and their paper might become obsolete the next day.
Read Electrek articles with a mouthful of salt. Fred Lambert's "robotaxi is 10x worse than a human" estimate is based on his own statistical reasoning, which somehow arrived at 200,000 miles per accident for humans. Minor accidents that Tesla reports for robotaxis (such as low-speed collisions with stationary objects) do not appear in the publicly available statistics for human drivers, so his estimate might be significantly off.
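To see how sensitive that kind of estimate is to the assumed human baseline, here is a back-of-the-envelope sketch. All of the fleet numbers below are hypothetical, made up purely for illustration; only the 200,000 miles/accident figure comes from the comment above:

```python
def accidents_ratio(robotaxi_miles, robotaxi_accidents, human_miles_per_accident):
    """How many times worse (>1) or better (<1) the robotaxi accident
    rate is compared to an assumed human baseline."""
    robotaxi_miles_per_accident = robotaxi_miles / robotaxi_accidents
    return human_miles_per_accident / robotaxi_miles_per_accident

# Hypothetical fleet data: 100,000 miles driven, 5 reported incidents.
print(accidents_ratio(100_000, 5, 200_000))  # 10.0 with a 200k-mile baseline
print(accidents_ratio(100_000, 5, 500_000))  # 25.0 with a 500k-mile baseline
```

The "10x worse" headline scales linearly with whatever miles-per-accident figure you pick for humans, which is exactly why the choice of baseline (and whether it counts minor incidents) dominates the conclusion.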
Waymo routinely uses safety drivers, sorry, "autonomous specialists", when expanding to new cities[1][2]. Waymo cars occasionally contact remote support; if support is unavailable, the cars simply stay where they stopped[3].
Tesla has rolled out a small number of cars with no safety driver[4].
In short, you are either grossly misinformed or intentionally lying. Is it a political echo chamber you are stuck in?
[1] https://waymo.com/faq/ "Our vehicles are primarily driving autonomously, but you’ll sometimes notice that our cars have autonomous specialists riding in the driver’s seat. These specialists are there to monitor our autonomous driving technology and share important feedback to help us improve the Waymo experience."
[2] https://waymo.com/waymo-in-uk/ "Our autonomous specialists who are present in the vehicle during testing are highly trained professionals."
You are grossly misinformed. Waymo self driving never disengages the way Tesla FSD does. It is active at all times. In novel situations humans will provide guidance on what path to take, but this is relatively infrequent. Tesla Robotaxis are so bad they need a safety driver in every single car at all times, ready to take control when the car does something stupid. The small number of robotaxis without safety drivers is limited to a tiny area and not open to the public.
> Waymo self driving never disengages the way Tesla FSD does. It is active at all times
The consumer version of FSD can pull over and park the car if the driver doesn't take control[1]. Waymo seems to require a remote command to initiate parking instead of just standing there with hazard lights on[2].
> Tesla Robotaxis are so bad they need a safety driver in every single car at all times ready to take control
No robotaxi in Austin has a driver behind the wheel, so a driver can't be ready to take control. Stop lying. I no longer believe that you are misinformed.
Not for a very long time. Just think about how big an advantage lidar and radar are at night, or radar in snow and rain.
If Tesla had been smart, they would have used regular cameras plus event-based cameras, whose pixels send a signal whenever their brightness changes enough; these can have microsecond latency. And multispectral cameras. Combined, this data would provide very rich input for neural networks.
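The event-camera idea can be sketched in a few lines. This is a toy frame-based simulation of the sensor model (real event cameras fire asynchronously in hardware, per pixel, with no frame clock); the function name, threshold, and test frames are all made up for illustration:

```python
import numpy as np

def frames_to_events(frames, threshold=0.2):
    """Toy simulation of an event-based camera from a stack of frames.

    Each pixel emits a +1/-1 event whenever its log-brightness has
    changed by at least `threshold` since that pixel's last event.
    Returns a list of (frame_index, y, x, polarity) tuples.
    """
    ref = np.log1p(frames[0].astype(np.float64))  # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logf = np.log1p(frame.astype(np.float64))
        diff = logf - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
            ref[y, x] = logf[y, x]  # reset reference at pixels that fired
    return events

# A pixel that brightens once fires a single positive event and then
# stays silent, since only *changes* generate output:
frames = np.zeros((3, 2, 2))
frames[1, 0, 0] = 1.0  # pixel (0,0) jumps in brightness at frame 1
frames[2, 0, 0] = 1.0  # no further change at frame 2
print(frames_to_events(frames))  # [(1, 0, 0, 1)]
```

The sparsity is the point: static parts of the scene produce no data at all, which is why the raw event stream is both low-latency and compact compared to dense frames.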