Tesla forces us to question whether cars should drive like humans or robots

Many drivers misunderstand the limits of technology already on the road today. As driver assistance systems become more common and more sophisticated, for example, the public is growing confused about what “self-driving” means. In a survey conducted last year by the analyst firm J.D. Power, only 37 percent of respondents chose the correct definition of a self-driving car.

Neither Tesla nor any other company sells a self-driving, or autonomous, car: a vehicle capable of driving itself in a wide range of locations and conditions without a human ready to take over.

Yet Tesla markets its driver assistance systems in the US with names that regulators and safety experts say are misleading, such as Autopilot for the standard package and Full Self-Driving for the premium package.

At the same time, Tesla warns drivers in its owner’s manuals that it is their responsibility to use the features safely and that they must be prepared to take over the driving task at any moment, with their eyes on the road and their hands on the wheel.

The difficulty of navigating unpredictable environments is one reason truly self-driving cars haven’t arrived yet.

“I wish we were there by now,” said William S. “But we haven’t gotten there, except on straight highways with typical entrances and exits that have already been mapped.”

‘Caught in the cookie jar’

Tesla’s rolling-stop feature had been around for months before it attracted much attention. Chris, who chronicles the good and the bad of Tesla’s latest features on YouTube under the name DirtyTesla, said his Tesla performed automatic rolling stops for more than a year before Tesla disabled the feature. He agreed to be interviewed on the condition that only his first name be used, citing privacy concerns.

Scrutiny picked up this year. Regulators at the National Highway Traffic Safety Administration asked Tesla about the feature, and in January the automaker issued an over-the-air software update to disable it. NHTSA classified the software update as an official safety recall.

Critics were taken aback not only by the decision to design the software this way but also by Tesla’s choice to test the features using its customers rather than professional test drivers.

Safety advocates said they knew of no US jurisdiction where rolling stops are legal, and they could not identify any safety justification for allowing them.

“I’d rather they be honest about it than get caught with their hand in the cookie jar,” said William Widen, a law professor at the University of Miami who has written about regulating autonomous vehicles.

Safety advocates also questioned two entertainment features, unrelated to self-driving, that they said skirted safety laws. One, called Passenger Play, allowed drivers to play video games while the car is moving. Another, called Boombox, lets drivers broadcast music or other audio from their cars while in motion, a potential danger to pedestrians, including blind people.

Tesla recently pushed software updates to restrict both of these features, and NHTSA opened an investigation into Passenger Play.

Tesla, the best-selling maker of electric cars, has not described the features as mistakes or acknowledged that they may have created safety risks. Instead, Musk denied that the rolling stops were unsafe and called federal auto safety officials the “fun police” for objecting to Boombox.

Separately, NHTSA is investigating Tesla over possible safety defects in Autopilot, its standard driver assistance system, after a series of crashes in which Tesla cars with the system engaged collided with stationary first-responder vehicles. Tesla has faced lawsuits and accusations that Autopilot is unsafe because it cannot always detect other vehicles or obstacles in the road. Tesla has generally denied the allegations made in lawsuits, including in a Florida case where it said in court papers that the driver was at fault for a pedestrian’s death.

NHTSA declined an interview request.

It is not clear what state or local regulators might do to adapt to the reality that Tesla is trying to create.

“All vehicles operating on California’s public roads are expected to comply with the California Motor Vehicle Code and local traffic laws,” the California Department of Motor Vehicles said in a statement.

The agency added that automated vehicle technology must be deployed in a way that “encourages innovation” and “addresses public safety” – two goals that may conflict if innovation means intentionally violating traffic laws. Officials there declined to be interviewed.

Musk, like most proponents of self-driving technology, has focused on the number of deaths caused by today’s human-operated vehicles. He has said his priority is to bring about a self-driving future as quickly as possible, in a theoretical bid to reduce the 1.35 million annual traffic deaths worldwide. However, there is currently no way to measure how safe a self-driving car truly is, and even comparing Teslas to other vehicles is difficult because of factors such as differences in vehicle age.

Industry Commitments

At least one other company has faced an accusation of intentionally violating traffic laws, but with a different outcome than Tesla’s.

Last year, San Francisco city officials expressed concern that Cruise, which is majority-owned by General Motors, had programmed its vehicles to stop in travel lanes in violation of the California vehicle code. Cruise’s driverless cars are used in an automated taxi service that picks up and drops off passengers with no one behind the wheel.

Cruise responded with something Tesla has yet to offer: a pledge to comply with the law.

“Our vehicles are programmed to follow all traffic laws and regulations,” Cruise spokesman Aaron McLear said in a statement.

Another company pursuing self-driving technology, Waymo, has programmed its cars to break traffic laws only when the laws conflict with one another, such as crossing a double yellow line to give a cyclist more space, said Julianne McGoldrick, a Waymo spokeswoman.

“We prioritize safety and compliance with traffic laws over how familiar a behavior might be to other drivers. For example, we do not program the vehicle to exceed the speed limit because that is familiar to other drivers,” she said.

A third company, Mercedes, said it was prepared to take responsibility for accidents that occur in situations where it has promised that its driver assistance system, Drive Pilot, is safe and complies with traffic laws.

Mercedes did not respond to a request for information about its approach to automated vehicles and whether they should ever circumvent traffic laws.

Safety experts aren’t ready to give Tesla or anyone else permission to break the law.

“At a time when pedestrian deaths are at their highest level in 40 years, we shouldn’t be relaxing the rules,” said Leah Shahum, director of the Vision Zero Network, an organization trying to eliminate traffic deaths in the US.

“We need to aim for a higher goal, not a system that is no worse than what we have today. It has to be dramatically better,” Shahum said.