DETROIT — Three times in the last four months, William Stein, a technology analyst with Truist Securities, has accepted Elon Musk’s invitation to test the latest versions of Tesla’s lauded “Full Self-Driving” system.
According to the company, a Tesla equipped with the technology can drive from one location to another with minimal human intervention. But when Stein drove one of the cars, he said, the vehicle made dangerous or illegal maneuvers. Stein said his 16-year-old son, who accompanied him on the test drive earlier this month, was “terrified.”
Federal regulators are investigating Stein’s experiences, as well as a Tesla crash near Seattle in April in which a car using Full Self-Driving struck and killed a motorcyclist. They have been scrutinizing Tesla’s automated driving systems for almost two years, following scores of collisions that raised safety concerns.
Questions About The Safety Of Tesla’s ‘Full Self-Driving’ System Are Growing
The issues have made many who monitor autonomous vehicles more skeptical that Tesla’s automated system will ever be able to operate safely on a large scale. Stein doubts that Tesla is close to deploying a fleet of autonomous robotaxis by next year, as Musk has predicted.
The latest mishaps come at a critical juncture for Tesla. Musk has told investors that Full Self-Driving may be able to operate more safely than human drivers by the end of this year, if not next.
In less than two months, the company plans to unveil a vehicle designed specifically to be a robotaxi. Musk has said that in order to put robotaxis on the road, Tesla will have to show regulators that the system can drive more safely than humans. Under federal regulations, the Teslas would have to meet national vehicle safety standards.
Musk has released data on miles driven per crash, but only for Tesla’s less sophisticated Autopilot system. Safety experts say the data is misleading because it counts only serious crashes in which airbags deployed, and it does not show how often human drivers had to take control to avoid a collision.
Approximately 500,000 Tesla owners use Full Self-Driving on public roads, slightly more than one in every five Teslas now in service. Most of them paid at least $8,000 for the optional system.
The company has stressed that cars equipped with the system cannot drive themselves and that drivers must be ready to intervene at all times. Tesla also says it monitors each driver’s behavior and will suspend their access to Full Self-Driving if they do not properly supervise the system. Recently, the company began calling the technology “Full Self-Driving (Supervised).”
Musk, who has admitted that his past predictions for the rollout of autonomous driving were too optimistic, promised a fleet of autonomous vehicles by the end of 2020. Five years later, many who follow the technology doubt it will function across the United States as promised.
Michael Brooks, executive director of the Center for Auto Safety, said, “It’s not even close, and it won’t be next year.”
Stein drove a Tesla Model 3, purchased at a Tesla dealership in Westchester County, north of New York City. The Model 3, Tesla’s lowest-priced vehicle, was outfitted with the latest Full Self-Driving software, which Musk says now uses artificial intelligence to help control steering and the pedals.
During his drive, Stein said, the Tesla felt smoother and more human-like than previous versions. But in a trip of less than 10 miles, he said, the car made a left turn from a through lane while running a red light.
“That was stunning,” Stein said.
He said he did not take control of the car because traffic was light and the maneuver did not seem dangerous at the time. Later, though, the car drove down the middle of a parkway, straddling two lanes carrying traffic in the same direction. This time, Stein said, he intervened.
Stein wrote to investors that the latest version of Full Self-Driving does not “solve autonomy,” as Musk has predicted, and does not “appear to approach robotaxi capabilities.” Stein said Tesla vehicles also surprised him with dangerous maneuvers during two earlier test drives, in April and July.
Tesla did not respond to requests for comment.
Stein said that while he believes Tesla will eventually profit from its driving technology, he does not foresee a robotaxi with no driver and a passenger in the back seat anytime soon. He predicted it will be significantly delayed or limited in where it can travel.
According to Stein, there is frequently a considerable difference between what Musk says and what is likely to happen.
To be sure, many Tesla devotees have posted videos on social media showing their cars driving themselves without human intervention. The videos, of course, do not show how the system performs over time. Others have posted videos showing dangerous behavior.
Alain Kornhauser, head of autonomous vehicle studies at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found that it consistently spotted pedestrians and detected other drivers.
While it often performed well, Kornhauser said, he had to take control when the Tesla made moves that scared him. He cautions that Full Self-Driving is not yet ready to be left to drive on its own everywhere.
“This thing,” he told me, “is not at a point where it can go anywhere.”
Kornhauser believes the system may operate independently in smaller portions of a city with precise maps to direct the vehicles. He asks why Musk doesn’t start by providing rides on a limited scale.
“People could really use the mobility that this could provide,” he said.
For years, experts have cautioned that Tesla’s camera and computer system cannot always detect and identify objects. Cameras cannot always see in bad weather or darkness. Most other self-driving car firms, including Alphabet Inc.’s Waymo and General Motors’ Cruise, use a combination of cameras, radar, and laser sensors.
“If you can’t see the world correctly, you can’t plan, move, or react to it correctly,” said Missy Cummings, a professor of engineering and computing at George Mason University. “Cars can’t do it with vision alone,” she explained.
Cummings says that even vehicles equipped with laser and radar sensors cannot always drive reliably, raising safety questions about Waymo and Cruise as well. (Waymo and Cruise declined to comment.)
Phil Koopman, a Carnegie Mellon University professor who researches autonomous vehicle safety, believes it will be many years before autonomous vehicles based only on artificial intelligence can handle all real-world scenarios.
“Machine learning has no common sense and learns narrowly from a huge number of examples,” according to Koopman. “If the computer driver gets into a situation it has not been taught about, it is prone to crashing.”
A Tesla using Full Self-Driving hit and killed a motorcyclist in Snohomish County, Washington, near Seattle, in April, authorities said. The Tesla driver, who has not yet been charged, told police that he was using Full Self-Driving and looking at his phone when the car rear-ended the motorcycle. The motorcyclist was pronounced dead at the scene, authorities said.
The National Highway Traffic Safety Administration said it is reviewing information on the fatal crash from Tesla and law enforcement officials. The agency also said it is aware of Stein’s experience with Full Self-Driving.
The NHTSA also said it is investigating whether a Tesla recall earlier this year, which was intended to bolster its automated driver monitoring system, was successful. The agency pressed Tesla to recall Full Self-Driving in 2023 because, in “certain rare circumstances,” the agency said, the system could disobey some traffic laws, raising the risk of a crash. (The agency declined to say whether it has finished evaluating whether the recall accomplished its mission.)
As Tesla’s electric vehicle sales have slumped in recent months despite price cuts, Musk has urged investors to view the company as a robotics and artificial intelligence business rather than a carmaker. Yet Tesla has been working on full self-driving vehicles since at least 2015.
“I recommend that anyone who doesn’t believe Tesla will solve vehicle autonomy should not hold Tesla stock,” he stated during an earnings conference call last month.
Stein, however, advised investors to consider whether Full Self-Driving, the Tesla artificial intelligence project “with the most history, that’s generating current revenue, and is already being used in the real world,” actually works.
SOURCE | AP