
    Tesla Faces Upgraded U.S. Probe Into Autopilot in Emergency-Scene Crashes

    U.S. auto-safety regulators have escalated their investigation into emergency-scene crashes involving Tesla Inc.’s Autopilot, a critical step for determining whether to order a safety recall.

    The National Highway Traffic Safety Administration said in a notice published Thursday that it was expanding a probe begun last August into a series of crashes in which Tesla vehicles using Autopilot struck first-responder vehicles stopped for roadway emergencies.

    The agency said it was upgrading its earlier investigation to an engineering analysis after identifying new crashes involving Autopilot and emergency-response vehicles.

    NHTSA also said it has expanded its examination of Autopilot to include a wider range of crashes, not only those at emergency scenes. The agency said it would further assess how drivers interact with Autopilot and the degree to which it might reduce motorists’ attentiveness.

    Forensic data available for 11 of the crashes showed that drivers failed to take evasive action in the two to five seconds before the collision, the agency said.

    The investigation covers an estimated 830,000 Tesla vehicles made from 2014 to 2021, including the Model 3, Model S, Model X and Model Y.

    NHTSA said in its filing that it has identified 15 injuries and one fatality related to the crashes.

    Tesla didn’t immediately respond to a request for comment. The electric-car maker’s stock was up 2.5% in midday trading Thursday, following news of a strong rebound in production at its plant in China.

    Autopilot, Tesla’s name for the advanced driver-assistance technology used in its vehicles, is designed to help drivers with tasks such as steering and keeping a safe distance from other vehicles. Tesla instructs drivers using the system to pay attention to the road and keep their hands on the wheel.

    The electric-car maker has long maintained that driving with Autopilot engaged is safer than doing so without it. Tesla points to internal data showing that crashes were less common when drivers were using Autopilot. Some researchers have criticized Tesla’s methodology.

    In opening its initial probe last year, NHTSA said that it had identified 11 crashes since early 2018 in which a Tesla vehicle using Autopilot struck one or more vehicles involved in an emergency-response situation. In its latest filing, the agency said it discovered six additional crashes involving Teslas and first-responder vehicles where Autopilot was in use.

    [Video from 3/18/21: WSJ’s Robert Wall reports on how some motorists may mistakenly think Autopilot is a self-driving feature that doesn’t require their attention.]

    The expanded probe of Autopilot is the latest sign that U.S. auto-safety regulators are getting more aggressive in scrutinizing advanced vehicle technologies that automate some or all of the driving tasks.

    NHTSA is getting ready to release new crash data this month that will give the public its first detailed look at the frequency and severity of incidents involving what are known as automated driving or advanced driver-assistance features, The Wall Street Journal has reported.

    More than 100 companies are subject to an agency order requiring them to report crashes in which such systems were in use. Among those included are operators of autonomous-car fleets, such as Alphabet Inc.’s Waymo and General Motors Co.’s Cruise LLC.

    The technology under scrutiny includes lane-keeping assistance and cruise-control systems that keep a fixed distance behind a leading car, as well as higher-tech systems such as features that can guide a car along highways with minimal driver input.

    Autopilot has become a particular focus for U.S. regulators in recent years, prompted by incidents in which drivers have misused the technology by, for example, overriding safety functions to operate a vehicle without their hands on the wheel. Some critics have also said the term Autopilot risks giving drivers an inflated sense of the system’s capabilities.

    NHTSA said in its latest filing that driver use or misuse of Autopilot doesn’t necessarily preclude the agency from determining whether the technology is defective.

    “This is particularly the case if the driver behavior in question is foreseeable in light of the system’s design or operation,” NHTSA said. Auto makers are legally required to initiate a recall if a safety defect is discovered in their vehicles.

    Separately, NHTSA has opened a broader investigation into several dozen crashes where advanced driver-assistance features are suspected to have played a role. While the probe covers vehicles made by any car company, incidents involving Teslas represent most of the cases under examination, including several with fatalities.

    Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.
