YouTuber and former NASA engineer Mark Rober has kicked the hornet's nest with his latest video.
In the piece, titled "Can You Fool a Self Driving Car?", Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric car plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
A separate test vehicle, a Lexus SUV equipped with Luminar's LIDAR tech, aced the tests.
The stunt was meant to demonstrate the shortcomings of relying entirely on cameras, rather than the LIDAR and radar systems used by brands and autonomous vehicle makers other than Tesla.
"I can definitively say for the first time in the history of the world, Tesla's optical camera system would absolutely smash through a fake wall without even a slight tap on the brakes," Rober said in the video.
But Tesla's fanboys have since cried foul, arguing that the EV maker could even sue Rober for "false advertising/misleading an audience," according to YouTuber Kevin "Meet Kevin" Paffrath.
In a response video posted to Tesla CEO Elon Musk's X-formerly-Twitter, Paffrath argued that Rober had disengaged Autopilot right before crashing into the fake wall.
Paffrath went as far as to allege that Rober was being paid by Luminar, the LIDAR tech company that outfitted the SUV that went head-to-head with the Tesla.
Other users on X argued that Rober should've used Tesla's infamous Full Self-Driving (FSD) feature, which costs a whopping $8,000 on top of the price of the car.
In a separate post seemingly responding to the allegations, Rober shared the "raw footage of my Tesla going through the wall."
"Not sure why it disengages 17 frames before hitting the wall but my feet weren't touching the brake or gas," he added.
In other words, is this really a smoking gun, or did Autopilot disengage on its own, sending the Tesla plowing right through the wall while under "human" control?
As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before impact.
It's a highly questionable approach that has raised concerns over Tesla trying to evade blame by automatically turning off any potentially incriminating driver assistance features before a crash.
Put simply, instead of taking down Rober's purportedly anti-Musk hit piece, Paffrath inadvertently highlighted Tesla's shady practices.
Tesla has already been extremely reluctant to hand over crash data, especially when it comes to its infamous "Full Self-Driving" feature, duking it out with the California Department of Motor Vehicles in 2023.
Tesla has also gone after crash data collected by the National Highway Traffic Safety Administration (NHTSA). The EV maker reportedly asked the regulator to redact information about whether its Autopilot or FSD software was used during every single documented crash since 2021.
In short, Rober's latest video still points out a glaring shortcoming when it comes to Tesla's safety features, no matter how unhappy it makes Tesla fanboys.
More on the video: Man Tests If Tesla Autopilot Will Crash Into Wall Painted to Look Like Road