
Super Bowl Ad Slams Tesla’s Full Self-Driving Mode Used Around School Buses

Lost amid Sunday’s overtime thriller, a controversial Super Bowl ad criticized Tesla’s Full Self-Driving (FSD) Beta mode for being unsafe and unreliable, including around school bus stops.

The ad was created by the Dawn Project, which has the goal of “making computers safe for humanity,” according to its website. The organization also ran a Super Bowl ad last year.

This year’s ad, which aired regionally to select audiences in California, Florida, Georgia, Texas, and Washington, D.C., cites the case of a 17-year-old North Carolina boy who was struck by a Tesla after exiting his school bus last March. The Tesla driver had activated FSD, which is supposed to detect and avoid obstacles and other vehicles.

Dan O’Dowd, founder of the Dawn Project and CEO of Green Hills Software, told School Transportation News this week that he and his teams have run tests of Teslas operating near school buses and that FSD is unable to distinguish the big yellow vehicle with flashing lights from a similar-sized truck with its hazards on. Additionally, O’Dowd said the tests show that a Tesla using FSD often failed to stop for mannequins placed in the roadway to simulate the presence of children, striking them instead.


O’Dowd is a software engineer who created secure operating systems for the Boeing 787, the B-1B intercontinental nuclear bomber, Lockheed Martin F-35 fighter jets and NASA’s Orion Crew Exploration Vehicle. O’Dowd claims he has developed the only unhackable software in the world, which earned him the Caltech Distinguished Alumni Award in June 2021. He said that Tesla’s FSD Beta mode is fundamentally flawed and unsafe.

“You can see it on the [Tesla] display,” explained O’Dowd. “What do you do when you see a truck with its hazard lights on? Well, you stop, you peek around to see if there’s anybody in the other lane, and you go around it, right? That’s what it does. It does slow down sometimes but it’s going to go past the school bus because it thinks it’s a disabled vehicle and not a school bus. It doesn’t recognize a school bus. They need to teach it what a school bus is … They just haven’t done it.”

Still image from a video showing a Dawn Project test of a Tesla in Full Self-Driving mode passing a stopped school bus.

He accused Tesla of being either incompetent or simply not caring.

“They can’t figure out how to recognize a school bus? Then, they shouldn’t have their software on the road at all,” O’Dowd continued. “Instead, they are putting their efforts into parking … You’re not fixing a critical vulnerability while you put resources into basically marketing gimmicks, that’s just ridiculous.”

O’Dowd, who unsuccessfully ran for a U.S. Senate seat to represent California on the sole platform of fighting Tesla FSD and Elon Musk, added that Dawn Project tests of similar technology in Ford and GM vehicles show they perform correctly.

An email sent to Tesla seeking comment had not been answered at this report’s deadline.

But Tesla sent O’Dowd a cease-and-desist letter last summer following another Dawn Project ad. Dinna Eskin, deputy general counsel for Tesla, argued that the Dawn Project manipulated the video and performed some of the tests with FSD disengaged. O’Dowd published his own retort online, and a spokesman for the Dawn Project said Tesla has not threatened any further legal action since then.

Additionally, a YouTuber posted a video last year disputing the validity of the Dawn Project’s tests. User Dr. Know-it-all notes in his own video-recorded tests that Tesla FSD will not function properly when the driver overrides the system with their foot on the accelerator. Behind-the-wheel footage he posted shows a Tesla slowing down for and going around actual people as well as cardboard cutouts representing children, though the vehicle never stops completely. He also does not test the car at a school bus stop.


Related: Texas Autonomous Vehicle Task Force Will Work with School Bus Companies
Related: NHTSA Orders Autonomous School Shuttle to Stop Operating
Related: STN EXPO: Autonomous School Bus Future Must Include Human Drivers
Related: Webinar Tackles School Buses in the Autonomous Future


Still, the National Highway Traffic Safety Administration (NHTSA) has at least 41 open investigations into Tesla for its Autopilot system, which is a precursor to the FSD mode and still requires human supervision. Tesla recalled over 2 million vehicles in the U.S. in December to install new safeguards in FSD mode, and Transport Canada said another 193,000 vehicles would be recalled there.

Then, on Monday, Consumer Reports published an article based on its own car safety experts’ findings that Tesla’s fix does not go far enough to address the underlying problem. It reported that Tesla’s Autopilot feature still does not have effective driver monitoring and does not allow for seamless collaboration between lane-centering assistance and the driver’s own steering inputs.

NHTSA told STN that it “generally does not comment on open investigations.”

But NHTSA’s Office of Defects Investigations records show a Preliminary Evaluation of Tesla’s SAE Level 2 driving automation system opened in August 2021 after identifying 14 crashes of Tesla Model 3, Model S, Model X and Model Y electric cars into stationary emergency vehicles. These crashes resulted in 15 injuries and one fatality.

“The investigation opening was motivated by an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes,” NHTSA ODI investigators wrote in a June 2022 announcement that it was expanding its investigation. “Upon opening the investigation, NHTSA indicated that the [preliminary evaluation] would also evaluate additional similar circumstance crashes of Tesla vehicles operating with Autopilot engaged, as well as assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

Regarding ongoing NHTSA reviews of illegal school bus passing laws and technologies to help mitigate violations, as required by the Infrastructure Investment and Jobs Act, the agency said updates may be available “in a few months.”

Meanwhile, the National Transportation Safety Board in 2020 issued nine recommendations and reiterated several others following a fatal Tesla Model X crash two years earlier. Investigators cited the system limitations of Tesla’s Autopilot, the driver’s overreliance on the system, and driver distraction as contributing factors.

“This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today,” then NTSB Chairman Robert Sumwalt said at the time. “There is not a vehicle currently available to U.S. consumers that is self-driving. Period. Every vehicle sold to U.S. consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated.

“If you are selling a car with an advanced driver assistance system, you’re not selling a self-driving car,” Sumwalt continued. “If you are driving a car with an advanced driver assistance system, you don’t own a self-driving car.”

Also this week, NTSB sent its own cease-and-desist letter to the Dawn Project because of unauthorized use of the agency’s logo in the Super Bowl ad.
