Opinion: Tesla's response to auto recall raises concerns

Tesla's response to a massive Autopilot recall raises concerns about consumer safety, highlighting flaws in our auto safety system

William Wallace, the associate director of safety policy for Consumer Reports (CR), leads CR's public policy and advocacy work related to motor vehicle safety, vehicle automation, and household product safety. The views expressed here are his own. Read more opinion on CNN.

Now, picture this: you switch on the news and hear the heartbreaking story of a man who was killed while using a lawnmower. And shockingly, it's the same model that you own and use.

In this hypothetical situation, experts thoroughly investigate the incident and determine that while the man bears some responsibility, the design flaws in the lawnmower also played a role in his untimely death. As a result, they propose practical design modifications to avoid similar tragedies in the future.

Would you feel safe knowing that the manufacturer refuses to make necessary changes and continues to stand by the flawed design, leading to more incidents and fatalities?

Scenarios like this have unfolded repeatedly in recent years, demonstrating the sense of impunity that some companies seem to have when it comes to safety laws. The ongoing situation involving Tesla's Autopilot suite is just the most recent example. It highlights the need for federal safety regulators to reaffirm their authority and demonstrate their ability to hold companies accountable in a timely manner. Congress should provide these agencies with the necessary support, including funding, staffing, and streamlined authority, to ensure they can act swiftly and effectively.

Earlier in the month, Tesla recalled over 2 million US vehicles after facing scrutiny from safety officials, advocates, and lawmakers for years. The National Highway Traffic Safety Administration (NHTSA) determined that drivers could too easily misuse Autopilot in situations where they were not in control of the vehicle or where the system was not intended to be used. These findings align with the investigations conducted by the National Transportation Safety Board (NTSB) into Tesla crashes since 2016.

It is important to note that Autopilot does not make a car self-driving. It can maintain a set distance from vehicles ahead and provide steering support to keep the vehicle centered in the lane. However, Tesla advises drivers to keep their hands on the steering wheel, remain aware of their surroundings, and be ready to take immediate action.

According to The Washington Post, there have been at least 17 fatalities and five serious injuries linked to Autopilot, as well as 736 crashes in the US since 2019. Some of the fatalities involved individuals outside of the Tesla, such as motorcyclists, and at least 16 crashes involved Teslas colliding with stationary first responder or maintenance vehicles.

Although Tesla did not agree with the NHTSA's analysis, the company has agreed to voluntarily recall the vehicles in order to resolve the two-year investigation. The company's proposed solution is to provide a free over-the-air software update that it claims will enhance controls and alerts to keep drivers engaged.

Consumer Reports is currently assessing the effectiveness of Autopilot following the software update on Tesla vehicles in our possession. Regrettably, our experts' initial assessment indicates that the update is inadequate, as the software does not sufficiently prevent misuse or driver distraction. For instance, CR's testers were able to use Autopilot even with the in-car camera covered, and drivers can still use the feature while looking away from the road.

This recall is a crucial moment for Tesla drivers and all others who share the road with them. It is imperative for Tesla and the NHTSA to address the serious safety concerns by ensuring that Autopilot can only be used in specific situations for which it was designed, such as on limited-access highways, and only when the system has confirmed that the driver is focused on the road.

It's alarming that, based on CR's preliminary evaluation and the assessments of other safety experts, the recall might not work effectively in its current form.

This is especially concerning because Autopilot is not the only active driving assistance system on the market. According to the most recent data from CR, over half of 2023 model-year vehicles offer such systems, many of which lack necessary safeguards. Safety regulators may well begin to notice a pattern of similar incidents in non-Tesla vehicles.

Tesla's large-scale Autopilot recall underscores that our auto safety system is not meeting consumer expectations. How can people trust that their cars are designed to be safe and free of defects if a company under scrutiny takes years to carry out a recall recommended by safety experts, and then offers a solution that may not fully address the problem?

Consumers need to demand more from companies, particularly when it comes to safety. It is essential to acknowledge and support businesses that prioritize safety and to urge others to do the same. Congress must also take action by empowering the NHTSA with the necessary resources and legal authority to hold companies accountable in a timely and thorough manner. Safety recalls should be carried out swiftly and effectively, within months, not years. This is a reasonable expectation that consumers deserve.

Recent