• A Tesla fan tested the child-detection abilities of the carmaker's Full Self-Driving software with two kids.
  • He did it after another test showed the self-driving tech crashing into a child-sized mannequin.  
  • The test shows FSD stopping for children at slow speeds, but it was not subject to US testing standards.

A Tesla fan tested the carmaker's Full Self-Driving (FSD) software with real children after a video of a Tesla crashing into a toddler-sized mannequin went viral.

The marketing campaign from Tesla critic Dan O'Dowd, which showed Tesla FSD crashing into a child-sized mannequin, quickly sparked outrage in the Tesla community and drew the ire of one Tesla fan in particular: @WholeMarsBlog.

Last week, Omar Qazi, the person running Tesla fan account @WholeMarsBlog, put a call out on Twitter for a San Francisco parent willing to use their child for a test of the software. Qazi said he was eager to disprove the results of O'Dowd's test and show that he was "lying about life-saving technology."

The Twitter post quickly drew responses online, and The Verge's transportation editor even published an open letter calling on the Tesla fan to cancel the test.

On Sunday, Qazi, who has been known to interact with Elon Musk on Twitter and has over 130,000 followers, posted a YouTube video in which he tested FSD using real-life children.

The Tesla fan did not respond to Insider's request for comment ahead of publication, but said on Twitter that the test would be safe because a human driver would take over if the Tesla failed to brake.

Tad Park, CEO of Volt Equity, brought two of his kids along for the test with Qazi on a residential street.

"I'm confident that I can trust FSD with my kids and I'm also in control of the wheel so I can brake at any time," Park said in the video.

In one test, Park's daughter stood in the middle of the road, and Tesla FSD appeared to recognize the child from a standstill about three car lengths away. In the video, the car moved forward at about five miles per hour and then stopped, refusing to proceed until the child was moved out of its path.

Park performed a similar test with his five-year-old son. The boy walked across the street as the Tesla approached him from the same distance at less than 10 miles per hour. The Tesla appeared to slow down until the child had finished crossing the street.

"It was a little nerve-wracking in the beginning, but I knew that it was going to detect and stop," Park said in the video. "I think it's very important for this to be out there. I think it will save a lot of kids' lives."

The video features multiple tests that show the software recognizing a dummy as well as an adult man standing in the middle of the road. However, none of the tests were performed at speeds greater than about 20 miles per hour. For comparison, the test Qazi criticizes was performed at 40 miles per hour from a distance of 120 yards.

Qazi is not the first Tesla fan to try to replicate the test. After O'Dowd's video was released, several FSD drivers took to the streets to see whether the system could recognize a child-sized dummy, with varying results.

It's important to note that none of these tests were conducted with the oversight of a US regulator; they were run independently, which means they were not subject to standardized testing procedures.

Despite its name, Full Self-Driving does not make Teslas fully autonomous; it is an optional add-on that enables the cars to automatically change lanes, enter and exit highways, recognize stop signs and traffic lights, and park. Tesla tells drivers that the system does not replace a licensed driver and instructs them to keep their hands on the wheel and be prepared to take over while it is running.
