YouTube removed a video in which Tesla drivers conduct their own safety tests to determine whether the electric vehicle’s Full Self-Driving (FSD) capabilities will make it stop automatically for children crossing or standing in the road, as first reported by CNBC.
The video, titled “Did Tesla Full-Self Driving Beta Really Run Kids Over?”, was originally posted on the Whole Mars Catalog YouTube channel and shows Tesla owner and investor Tad Park testing Tesla’s FSD feature with his own children. During the video, Park drives a Tesla Model 3 toward one of his children standing in the road, then tries again with his other child crossing the street. The vehicle stops before reaching the children both times.
As outlined on its support page, YouTube has specific rules against content that “endangers the emotional and physical well-being of minors.” YouTube spokesperson Elena Hernandez told CNBC that the video violates its policies against harmful and dangerous content and that the platform “doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities.” YouTube didn’t immediately respond to The Verge’s request for comment.
“I’ve tried FSD beta before, and I’d trust my kids’ lives with it,” Park says during the now-deleted video. “So I’m very confident that it’s going to detect my kids, and I’m also in control of the wheel so I can brake at any time.” Park told CNBC that the car was never traveling more than eight miles an hour and that he “made sure the car recognized the kid.”
As of August 18, the video had over 60,000 views on YouTube. The video was also posted to Twitter and remains available to watch. The Verge contacted Twitter to see if it has plans to take it down but didn’t immediately hear back.
The risky idea of testing FSD with real, living and breathing kids emerged after a video and ad campaign posted to Twitter showed Tesla vehicles seemingly failing to detect, and colliding with, child-sized dummies placed in front of them. Tesla fans weren’t buying the results, sparking a debate about the feature’s limitations on Twitter. Whole Mars Catalog, an EV-focused Twitter account and YouTube channel run by Tesla investor Omar Qazi, later hinted at making a video involving real children in an attempt to prove the original results wrong.
In response to the video, the National Highway Traffic Safety Administration (NHTSA) issued a statement warning against using children to test automated driving technology. “No one should risk their life, or the life of anyone else, to test the performance of vehicle technology,” the agency told Bloomberg. “Consumers should never attempt to create their own test scenarios or use real people, and especially children, to test the performance of vehicle technology.”
Tesla’s FSD software does not make a vehicle fully autonomous. It is available to Tesla drivers for an additional $12,000 (or a $199/month subscription). Once Tesla determines that a driver meets a certain safety score, it unlocks access to the FSD beta, which allows drivers to enter a destination and have the vehicle drive there using Autopilot, the vehicle’s advanced driver assistance system (ADAS). Drivers must still keep their hands on the wheel and be ready to take control at any time.
Earlier this month, California’s DMV accused Tesla of making false claims about Autopilot and FSD. The agency claims that the names of both features, as well as Tesla’s description of them, falsely imply that vehicles can operate autonomously.
In June, the NHTSA released its first data on driver-assistance collisions and found that Tesla vehicles using Autopilot were involved in 273 crashes from July 20, 2021, to May 21, 2022. The NHTSA is currently investigating a number of incidents in which Tesla vehicles using driver assistance technology collided with parked emergency vehicles, in addition to over two dozen Tesla crashes, some of which were fatal.