Newly Released Video Of Thanksgiving Day Tesla Full Self-Driving Crash Demonstrates The Fundamental Problem Of Semi-Automated Driving Systems

I’m not sure how much you keep up with bridge-related holiday car crashes, but there was a huge one this past Thanksgiving on the San Francisco Bay Bridge. This was a genuine pileup, with eight vehicles involved and nine people injured. That’s already big news, but what makes this bigger big news is that the pileup seems to have been caused by a Tesla that was operating under the misleadingly-named Full Self-Driving beta software, according to the driver. As you likely already know, the inclusion of the nouns “Tesla” and the string of words “full self-driving” is internet click-catnip, but that’s not really what I want to focus on here. What this crash really demonstrates are the inherent conceptual – not technological – problems of all Level 2 semi-automated driving systems. Looking at what happened in this crash, it’s hard not to see it as an expensive, inconvenient demonstration of something called “the vigilance problem.” I’ll explain.

First, let’s go over just what happened. Thanks to a California Public Records Act request from the website The Intercept, video and photographs of the crash are available, as is the full police report of the incident. The crash happened on I-80 eastbound, on the lower level of the Bay Bridge. There are five lanes of traffic there, and cars were moving steadily at around 55 mph; there appeared to be no obstructions, and visibility was good. Nothing unusual at all.

A Tesla was driving in the second lane from the left, and had its left turn signal on. The Tesla began to slow, despite no traffic anywhere ahead of it, then pulled into the leftmost lane and came to a complete stop — on the lower level of a bridge, with traffic all around it going between 50 and 60 mph or so. The results were grimly predictable, with cars stopping suddenly behind the now-immobile Tesla, leading to the eight-car crash.

Here’s what it looked like from the surveillance cameras:

[Surveillance camera footage of the crash]

…and here’s the diagram from the police report:

[Crash diagram from the police report]

According to the police report, here’s what the driver of the Tesla (referred to in the report as V-1) said about what happened:

“He was driving V-1 on I-80 eastbound traveling at 50 miles per hour in the #1 lane. V-1 was in Full Auto mode when V-1 slowed to 20 miles per hour when he felt a rear impact… He was driving V-1 on I-80 eastbound in Full Self Driving Mode Beta Version traveling at approximately 55 miles per hour…When V-1 was in the tunnel, V-1 moved from the #2 lane into the #1 lane and started slowing down unaccountably.”

So, the driver’s testimony was that the car was in Full Self-Driving (FSD) mode, and it would be easy to simply blame all of this on the demonstrated technological deficiencies of FSD Beta. This could be an example of “phantom braking,” where the system becomes confused and attempts to stop the car even when there are no obstacles in its path. It could also be that the system disengaged for some reason and attempted to get the driver to take over, or the stop could have been caused by any number of other technological issues, but none of that is really the underlying problem.

This is the sort of wreck that, it appears, would be extremely unlikely to happen to a normal, unimpaired driver (unless, say, the car depleted its battery, though the police report states that the Tesla was driven away, so it wasn’t that) because there was really no reason for it to happen at all. It’s about the simplest driving situation possible: full visibility, moderate speed, straight line, light traffic. And, of course, none of this would have happened if the driver had been using this Level 2 system as intended – remember, even though the system is called Full Self-Driving, it is still only a semi-automated system that requires a driver’s full, nonstop attention and a readiness to take over at any moment, which is something the “driver” of this Tesla clearly did not provide.

Of course, Tesla knows this, we all technically know this, and the police even included a screengrab from Tesla’s site that states this in the report:

[Screengrab from Tesla’s website, as included in the police report]

We all know this basic fact about L2 systems, that they must be watched nonstop, but what we keep seeing is that people are just not good at doing this. This is a drum I’ve been banging for years and years, and sometimes I think to myself: “Enough already, people get it,” but then I’ll see a crash like this, where a car just does something patently idiotic and absurd and entirely, easily preventable if the dingus behind the wheel would just pay the slightest flapjacking bit of attention to the world outside, and I realize that, no, people still don’t get it.

So I’m going to say it again. While, yes, Tesla’s system was the particular one that appears to have failed here, and yes, the system is deceptively named in a way that encourages this idiotic behavior, this is not a problem unique to Tesla. It’s not a technical problem. You can’t program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

This isn’t news to people who pay attention. It’s been proven since 1948, when N.H. Mackworth published his study The Breakdown of Vigilance During Prolonged Visual Search, which defined what has come to be known as the “vigilance problem.” Essentially, the problem is that people are just not great at paying close attention to monitoring tasks, and if a semi-automated driving system is doing most of the steering, speed control, and other aspects of the driving task, the job of the human in the driver’s seat changes from one of active control to one of monitoring for when the system may make an error. The results of the human not performing this task well are evidenced by the crash we’re talking about.

I think it’s not unreasonable to think of Level 2 driving as potentially impaired driving, because the mental focus of a driver who engages with the driving task as a monitor is impaired compared to that of an active driver.

I know lots of people claim that systems like these make driving safer – and they certainly can, in a large number of contexts. But they also introduce significant new points of failure that simply do not need to be introduced. The same safety benefits could be had if the Level 2 paradigm were flipped, so the driver was always in control but the semi-automated driving system was doing the monitoring, ready to take over if it detected dangerous choices by the human driver. This would help in situations with a tired or distracted or impaired driver, but would be less sexy, in that the act of driving wouldn’t feel any different from normal human driving.
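To make that flipped paradigm a little more concrete, here’s a minimal sketch of what such a monitoring-only loop might look like – every input name and threshold below is invented for illustration and isn’t drawn from any real vehicle system:

```python
# Hypothetical sketch of an "inverted Level 2" loop: the human always drives,
# the software only watches and intervenes. All inputs and thresholds are invented.
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_on_road: bool      # driver-monitoring camera says eyes are forward
    steering_active: bool   # recent steering input detected
    closing_time_s: float   # estimated time-to-collision with the lead vehicle
    lane_offset_m: float    # distance from lane center

def monitor_step(state: DriverState) -> str:
    """Decide what the assist should do for one monitoring cycle."""
    if state.closing_time_s < 1.5:
        return "AUTO_BRAKE"            # imminent collision: act, don't ask
    if abs(state.lane_offset_m) > 0.8:
        return "NUDGE_AND_ALERT"       # drifting: correct gently and warn
    if not state.eyes_on_road or not state.steering_active:
        return "ESCALATING_ALERT"      # driver checked out: demand attention
    return "DO_NOTHING"                # human is driving fine: stay silent
```

The point of the design is that the human never stops driving; the software only speaks up or acts when the human is demonstrably getting it wrong.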

If we take anything away from this wreck, it shouldn’t be that Tesla’s FSD Beta is the real problem here. It’s technically impressive in many ways though certainly by no means perfect; it’s also not the root of what’s wrong, which is Level 2 itself. We need to stop pretending this is a good approach, and start being realistic about the problems it introduces. Cars aren’t toys, and as much fun as it is to show off your car pretending to drive itself to your buddies, the truth is it can’t, and when you’re behind the wheel, you’re in charge — no question, no playing around.

If you want to read about this even more, for some reason, I might know of a book you could get. Just saying.

86 Comments
Clear_prop
1 year ago

Doesn’t ‘FSD’ have settings on how aggressive to drive?

Seems like this guy had it turned up all the way to ‘brake checking a-hole’.

Todd Beauchamp
1 year ago

Until we have Vehicle to Vehicle communication none of these systems are going to be any good.
Even after that nothing is ever perfect.

Black Peter
1 year ago
Reply to  Todd Beauchamp

Who pays to retrofit older cars with V2V communication? And what if I don’t want it? This is probably the correct solution to knocking a lot of kinks out of car vs. car interactions, but without sounding like a luddite, why should I change a single thing about my life/car so someone else can self drive? Then there is the car vs. bike, car vs. pedestrian issues that have been seen, how do we fix that? I’m not trying to dunk on you, just fleshing out some further issues with your (as I say, correct) solution.

Ivan256
1 year ago
Reply to  Black Peter

If the solution to your problem requires everybody else to change, expect disappointment.

Jblues
1 year ago

You have to wonder if this incident was exactly what Level 2 is supposed to do when the driver is not keeping their hands on the wheel and ignores warnings, i.e., slow down and pull over.

Todd Beauchamp
1 year ago

Pulls On Fire Suit – ” When on the highway and the front of your vehicle hits the back of another – You are at fault” Nothing drives me more crazy than hearing people say that guy came out of nowhere – You didn’t see the guy but he didn’t materialize out of nothingness.

Wuffles Cookie
1 year ago
Reply to  Todd Beauchamp

Yup. The Tesla may have gone bonkers, but being prepared for traffic ahead of you coming to a sudden stop is Driver’s Ed 101. Liability is on the cars following, demonstrating nicely that no one involved in this crash was actually paying attention.

Paul B
1 year ago

Saw this on Reddit last night.

Many comments were along the lines of “yeah, phantom braking happens to me too.” I have the feeling people have been lucking out.

I didn’t have my flame suit handy to handle the fanboys on Reddit, but since the discourse here stays much more level headed…

There was one comment where someone said they had an FSD phantom braking event while doing 140 km/h.

I was going to comment there asking why the system was auto driving at that speed. There are very few roads where 140 is legal.

If autopilot can be used to break the rules of the road, there is a fundamental problem with the system.

In the aircraft world, where autopilot has been proven, there is a very good reason there are all sorts of alarms and verbal warnings when part or all of your automation is lost.

Chartreuse Bison
1 year ago

“This is a drum I’ve been banging for years and years, ”
Have you tried making your point via Tik-tok or some other type of moron media? I don’t think the people who need to hear this point like reading much.

Black Peter
1 year ago

You say that, but there are more than a few comments here that seem to be stanning for FSD…

PL71 Enthusiast
1 year ago

I hate having to be extra mindful of this crap on the highway. If a car just decides to emergency stop for no reason it should 100% be the manufacturer’s fault. As these age and the sensors deteriorate it’s just going to get worse! Anything that can cause sudden accelerations should have a bit of redundancy built in.

A company should not be allowed to cover their ass with a disclaimer that there may be unpredictable behavior.

Ivan256
1 year ago

These things need to be banned. Including adaptive cruise control.

I had tried these systems in the past, but never lived with one. Now I’ve had Blue Cruise in my Lightning for 2 months and used it for 600+ miles and I’m convinced. There is no way to make something below level 5 that should be allowed on our roads.

Even if it’s the best in class of its type of system. Using it conditions you NOT to react to situations where you normally should react. You MUST be engaged because you might have to take over. But there are many situations that the software can handle without your assistance. It messes with your learned automatic reactions to driving situations. I want to see data, but I am convinced that people who use these systems later go on to have more accidents per mile with the system off.

RustyBritmobile
1 year ago

Wait – isn’t one of the first rules of driving not to hit the car in front, no matter what it does? Maintaining an ‘assured clear distance’ is what it used to be called. So, while the Tesla ought not to have stopped (although we don’t know that for sure), the drivers behind bear the responsibility of not hitting it when it does – so, fault distributed.

Ivan256
1 year ago

Legal responsibility, sure.

But those rules are to make sure there is a nice neat resolution of liability.

If an accident occurred that otherwise wouldn’t have because of the Tesla driver, the Tesla driver was the cause of the accident.

In reality most accidents require mistakes by multiple parties. If _any_ of them do the right thing, the whole situation is averted.

Black Peter
1 year ago
Reply to  Ivan256

“the Tesla driver,” exactly! Which in this case was some sloppy code…

Mr.Asa
1 year ago

Had a thought that weirded me out a bit.
We know that Tesla, and a bunch of other manufacturerererers have OTA capability for updates.

What’s stopping Tesla from going OTA and changing the memory of this particular car in such a way that it shows that it wasn’t in Level 2 at the time?

Musk is shady enough to do that. Is the rest of the company?

SquareTaillight2002
1 year ago
Reply to  Mr.Asa

Don’t need OTA for that. Just erase the memory when a crash is detected. Of course that is illegal based on the 2013 Black Box legislation.

Clear_prop
1 year ago
Reply to  Mr.Asa

People have complained about ‘FSD’ turning off a second before an impact so Tesla can claim ‘FSD’ wasn’t enabled at the time of the crash.

FUCK YOU
1 year ago

I would love to see the results of a study that compared accidents-per-mile between cars both with and without Level 2 semi-autonomy. I imagine the dataset must be large enough to control for things like age of car, age of driver, geographic location, etc. Assuming a clean dataset, it would be a pretty straightforward statistical analysis.

While I agree that Level 2 driver assistance systems introduce new failure modes and new causes for accidents, I don’t think we know what their overall safety impact is. A study like I suggested would help answer that question.
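A rough sketch of the kind of rate comparison described above, assuming a hypothetical per-vehicle dataset with crash counts, miles driven, and a Level 2 flag – the file name and every column here are invented for illustration:

```python
# Hypothetical sketch: crashes-per-mile comparison with and without Level 2,
# controlling for a few confounders. Dataset and column names are invented.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("crash_exposure.csv")  # one row per vehicle (hypothetical file)

# Poisson rate model: crash counts with miles driven as the exposure,
# so the has_level2 coefficient estimates the difference in crash rate per mile.
model = smf.glm(
    "crashes ~ has_level2 + driver_age + vehicle_age + urban",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["miles"],
).fit()

print(model.summary())
```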

Rob Knapp
1 year ago
Reply to  FUCK YOU

I’m wondering if there is a way to further measure crash severity. I’m sure there’s a finger on the scale in the way we’re seeing the most extreme crashes… maybe injurious crashes per million miles?

Ivan256
1 year ago
Reply to  FUCK YOU

If you can find a dataset that hasn’t been selectively pruned, or otherwise sourced from somebody with a conflict of interest I’d love to see it too.

I suspect that the overall safety impact is negative. I think that the systems increase the number of incidents both when they’re enabled AND when they’re disabled. I have theories as to why that might be, but I also think that if the data were good, the companies involved would release it.

Andy the Swede
1 year ago

Working with human factors research within the area of driver assistance and self driving for the last 15 years, I can just say that OEMs have not listened to our concerns. If you look at research, the only systems that actually have a positive safety effect are the ones that warn the driver of imminent collision and/or act automatically to avoid an accident or at least mitigate the effects of it. Examples of these are ABS, auto braking etc.

Systems that are more of the comfort type have a negative effect on safety. As an example, driver reaction times to unanticipated events are significantly longer when driving using adaptive cruise control. It’s pretty simple: the less attentive the driver becomes, the more they need systems to save their asses when something happens.

Ben
1 year ago
Reply to  Andy the Swede

I suspect we all intuitively assumed this was true, but it’s good to hear that studies are backing it up. Well, it’s not good because they keep selling these terrible systems, but at least we all get to say “I told you so” when something like this happens. :-/

Brau Beaton
1 year ago

Actually Jason, I agree with what you say about “Self Driving” but I think you really missed the boat here. Most notably, the Tesla stopped safely and the cars behind crashed because they were following too closely. Cars must be able to stop when they are in trouble.

Recently the same thing happened on the highway near me. As an angry man got out to accost the idiot for stopping for no reason, he saw the driver was in the throes of a heart attack.

A lady stopped while exiting a tunnel causing a 15 car pile up. There was a kitten sitting in her lane and she could not run it over.

I was travelling down the highway when cars started piling into the left lane for the upcoming exit. Stopping distances were being ignored so I backed right off. For some reason the car in front of this caravan decided the thing to do was to *not* go left but stop, u-turn and go back. I watched as six cars ahead of me all rode up on each other’s back windows, but I wasn’t one of them for one simple reason: I follow the 2 second rule. Always.

Life is too short to die or kill another person over 2 seconds

Do You Have a Moment To Talk About Renaults?
1 year ago
Reply to  Brau Beaton

The Tesla didn’t exactly stop safely; it stopped suddenly and erroneously after cutting into the leftmost lane, where traffic is normally faster. Sure, not driving at a safe distance was another important factor, and all the cars that crashed share some of the blame. But if that had been a human driver who decided to veer into the fastest lane and slam on the brakes, coming to a full stop for no reason, could we really say the driver had “stopped safely”? The Tesla driver is still more at fault here than anyone else, because there seems to have been more than enough time to take over and get back on the throttle to override the phantom braking, and they seem to have just let it all happen. There was no reason to even let the car change lanes, since there was no significant traffic ahead.

In the end, most of these occurrences will have multiple factors, and responsibility can’t fall squarely on a single person/entity. Assholes who wreck their Teslas alone because of FSD are still being deceived into beta-testing an incomplete system with a dangerously misleading name, so Tesla is actively enabling assholes, and that’s part of the problem. It’s frankly mind-boggling that there are no regulations in place for the beta-testing part of the problem, and that no judicial institution takes broader action on the fraudulent advertising side (which has been a problem since the Autopilot days).

JerryLH3
1 year ago
Reply to  Brau Beaton

I would argue you cannot “stop safely” when the conditions are such as they were: no impediments in the lane of traffic and no medical emergency suffered by the driver.

Is safe following distance a problem in this crash and in every day driving? Absolutely. But the precipitating event for this crash was an erroneous stop by “FSD” and a driver who was not paying enough attention to realize it was an erroneous stop and let it happen.

Beasy Mist
1 year ago
Reply to  Brau Beaton

You can’t stop safely in the left lane of a tunnel. This wasn’t a heart attack or a kitten, it was an irresponsible person using a system that shouldn’t be beta-tested on public roads.

Ivan256
1 year ago
Reply to  Brau Beaton

“Most notably, the Tesla stopped safely and the cars behind crashed because they were following too closely. Cars must be able to stop when they are in trouble.”

A good driver stops in time AND considers whether they’re going to cause the inattentive or less capable driver behind them to crash into them.

That doesn’t mean it’s their fault when they get rear-ended. But who caused or who prevented an accident isn’t a binary state.

You may not be liable if you stop fast and you get rear ended. But you absolutely contributed to causing the outcome. And making sure the right person gets blamed is one of the least consequential things about accident situations in the long run.

Crank Shaft
1 year ago

Frankly, it pisses me off that I/we have to share the roads with these fundamentally flawed vehicles. Musk sees mega $$$ in hyping his overpriced software, but we should not have to also be at risk because some asshat with $15k to blow spaces out while getting a literal bang for his buck behind the wheel. This shit needs to stop.

Furthermore, it damages the brand to no end. I will never buy a Tesla because of this blatant BS. I hope someone successfully sues them hard over this issue.

Steve Alpers
1 year ago

How does the driver override or take control of automatic braking? Seems like a tough problem to solve for level 2 cars.

SCJeff
1 year ago
Reply to  Steve Alpers

If that happens and the driver hits the accelerator that will override the phantom braking.

Marteau
1 year ago
Reply to  SCJeff

If that happens and the driver hits the accelerator *it’s supposed to* override the phantom braking
FIFY

BigThingsComin
1 year ago
Reply to  SCJeff

Why didn’t Mr. Dummy here do that? Could it be because he didn’t know that fact?

HeyCharger
1 year ago

Much like the old joke that gun control ‘requires both hands’, likewise full-self-driving should mean YOU driving the car YOURSELF!

I use radar cruise frequently when I’m driving for work; it really helps on long drives and keeps a good distance when you’re stuck on a highway with no overtaking opportunities, which is frequently the case out here.

Anything else in driver assistance isn’t really assisting me; as Torch said, it takes vigilance to monitor, and that vigilance is going to wane on a multiple-hour drive if I’m not actively involved in the process.

Eric Busch
1 year ago

As much as I like technology, I don’t want it in my car. My phone is cool and all but I have it set with location services OFF. Unless I actually need a map, no thanks.

I get the cell location part of it, but google doesn’t need to know my whereabouts.

I’m inclined to spend as much money as it takes to keep my 12 year old car on the road for as long as possible, even if it exceeds the value of the car.

My car is a sanctuary.

Bill Garcia
1 year ago

I think this (in)ability to continuously monitor may vary by driver, which makes it difficult to regulate in the same way for everyone.

In my case, I get distracted beyond my comfort level with my XC60’s Pilot Assist, so I barely use it. I feel similarly about my lane assist. Yet the ACC hits just the spot for me – I can drive fully alert and comfortably for my usual ~250 highway miles before stopping to rest. That is, I actually prefer to drive my Wrangler 4xe or the Volvo on ACC only vs. with Pilot Assist.

I’m unsure whether that makes Volvo’s Pilot Assist a bad idea that should be abandoned together with all Level 2 approaches. I agree with Torch that they seem quite impractical to implement, should my own view of “variability by driver” be true for the general driving population!

Double Wide Harvey Park
1 year ago
Reply to  Bill Garcia

I’ve never liked and don’t use basic 1990s era cruise control because of the inattention problem. If my foot isn’t on the accelerator, will it be fast enough to switch to the brake pedal? Will I stay awake if I’m not controlling the throttle?
My newest car is from 2014 I think, right before all the modern driver assist tech became truly mainstream, and I fear having to use those features.
More cameras and emergency braking do sound good though.

Timothy Arnold
1 year ago

I wonder if requiring any system short of genuinely fully autonomous to regularly, and randomly, alert the driver to take control and drive for 5-10 minutes every 15-30 minutes would be a way to teach people to be more vigilant and break them out of their complacency? Because there’s no realistic way to change the way our brains work – the “vigilance problem” is built in.
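A toy sketch of that randomized hand-back idea – the 15-30 and 5-10 minute windows are the ranges suggested above, and the two callbacks are placeholders rather than any real vehicle API:

```python
# Toy sketch of randomized "you drive now" drills for a driver-assist system.
# The callbacks are placeholders; the intervals match the ranges suggested above.
import random
import time

def takeover_drill_loop(request_manual_control, restore_assist):
    while True:
        time.sleep(random.uniform(15, 30) * 60)  # 15-30 minutes of assisted driving
        request_manual_control()                  # alert the driver and hand back control
        time.sleep(random.uniform(5, 10) * 60)    # require 5-10 minutes of manual driving
        restore_assist()                          # then let the assist re-engage
```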

ProudLuddite
1 year ago

I agree with Jason 100%. People who like this stuff don’t like to drive, or don’t have confidence in their driving, or – probably the most common denominator – really want to be doing something else, often involving their phone. I also think the way these driver aids are named and marketed is nearly criminal.

FULL SELF DRIVING**

driver should pay attention and have hands on the controls, ready to take over at any time, because FULL SELF DRIVING can’t really be relied on to self drive.

Gilbert Wham
1 year ago
Reply to  ProudLuddite

Yup. Buses and taxis exist so people can go places without having to bother about how it happens.

Everyone else gets to goddamn well pay attention while they’re driving.

Not Sure
1 year ago

This particular pile up seems less like a Tesla problem and more like a safe following distance problem.
This could happen to any vehicle for a number of reasons (tire blow out, stalled engine, wasp in the car etc).
An eight car pile up equals eight idiots following too close for the speed and conditions. Simple math.

Not Sure
1 year ago
Reply to  Not Sure

*An eight car pile up equals seven idiots following too close for the speed and conditions.
I’m bad at even simple math.

ProudLuddite
1 year ago
Reply to  Not Sure

If the car hadn’t pulled over and slowed drastically, the accident wouldn’t have happened. If the other cars had been paying more attention and not following so close, etc. Both at fault.

Ivan256
1 year ago
Reply to  Not Sure

Tire blowouts, stalled engines, and other mechanical failures don’t stop a vehicle as quickly as hard braking.

If you’re travelling at highway speeds and brake hard with no obvious external influence the resulting pileup should be considered your fault.

I’m generally against capital punishment, but I’m open minded if we want to discuss applying it to brake-checkers.

Not Sure
1 year ago
Reply to  Ivan256

If a brake check affects you, you’re too close.

Ivan256
1 year ago
Reply to  Not Sure

I’ve never hit a brake checker, thanks.

But that doesn’t justify the attempted murder.

If you’re a brake checker you should be in jail.

Sarah Bell
1 year ago

I wonder if the brake lights even came on.
If there were any integrity whatsoever left in our government, there would be restrictions on using public roads to beta test computer-controlled automobiles. At a minimum, a streamlined process for victims to receive compensation from the manufacturers; ideally, a system that prevents the computer from taking over unless all vehicles in the vicinity have opted in to the beta test.

SCJeff
1 year ago
Reply to  Sarah Bell

The brake lights do come on.

Marteau
1 year ago
Reply to  Sarah Bell

I feel that’s a topic that wasn’t discussed enough with the Tesla that went crazy in China. Tesla declared the guy didn’t brake, and they’ve said the same in the Paris case, but in the Chinese footage you can clearly see all the brake lights on.
Something shady here, and I’d like to get Torch’s take on it.

Mrbrown89
1 year ago
Reply to  Sarah Bell

This is funny; I always encounter Teslas whose brake lights only turn on at the last minute, when they are almost completely stopped. It’s like their regen doesn’t activate the brake lights in time, or the regen is not strong enough to activate the brake lights (according to the car), but you can tell they are braking. Also, their brake lights are so small (Model 3, Model Y) that they look like some aftermarket LEDs from AutoZone.

PL71 Enthusiast
1 year ago
Reply to  Mrbrown89

Not sure what the threshold is on a Tesla, but I know in other EVs the rear lights don’t come on until 8-10 kW of regen. This makes sense because it keeps the lights from turning on all the time, and because a manual transmission car can do exactly the same thing with a bit of effort.
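A toy illustration of that threshold behavior – the 8-10 kW figures are the ones mentioned above, and the hysteresis gap is an added assumption so the lamp doesn’t flicker right at the boundary:

```python
# Toy model: brake lamp driven by regen power crossing a threshold, with hysteresis.
# The 8-10 kW figures come from the comment above; nothing here reflects a real EV.
def brake_light_on(regen_kw: float, currently_on: bool,
                   on_threshold_kw: float = 10.0, off_threshold_kw: float = 8.0) -> bool:
    if currently_on:
        return regen_kw > off_threshold_kw  # stay lit until regen drops below ~8 kW
    return regen_kw > on_threshold_kw       # turn on once regen exceeds ~10 kW
```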

Citrus
1 year ago
Reply to  Mrbrown89

The Tesla, in no way, stopped safely, and there was no reason for it to stop. Pulling into the left lane of a tunnel and stopping is the opposite of safe – the left lane typically has faster traffic and tunnels have less space to avoid it. Just pulling to the left instead of the right in order to stop is unsafe – the default on North American roadways is to have the right lane as the one dedicated to slower traffic, or traffic that may be stopped due to an emergency.

Should the follow distance have been larger? Sure, but stopping safely involves finding a place that doesn’t put you in the way of traffic, not just hitting the brakes. The Tesla stopped, but in the least safe way it possibly could.

StalePhish
1 year ago
Reply to  Sarah Bell

The video on The Intercept does show that the Tesla’s brake lights came on before they even initiated the lane change. Even though there was a full 3 seconds between when the Tesla’s brake lights came on and when the first vehicle hit, I can’t blame the second vehicle because it was an unexpected maneuver.

However, for the actual “pileup”, I put a lot of the blame on the black SUV which was majorly tailgating the white pickup truck as it entered the tunnel. The white pickup wasn’t paying much attention either so they slowed down pretty last minute and got tapped by the tailgating black SUV (causing the truck to swerve and miss the pileup). But because the black SUV reacted so late, the white car behind them didn’t have much time (they were also too close), then the black car behind them was too close. But then the very last car in the lineup, a black sedan, went full steam ahead, no brake lights at all, right into the pile, despite it being about 7 seconds after the initial incident, which is what caused the white car to lift up for the dramatic picture we’ve seen.

Stacks
1 year ago

I think that people, even when paying perfect attention, tend to trust the computer’s guidance too much. The driver could have been completely aware of what the car was doing, but if it wasn’t screaming “TAKE OVER NOW” at him he might have wondered if the car “saw” something he didn’t, some problem he hadn’t noticed yet. Maybe he hesitated just long enough to cause a wreck, which can be just fractions of a second at highway speeds. Self-driving vehicles aren’t just going to have to deal with human vigilance, they’re going to have to deal with whatever goes on in our fleshy human brains that has people occasionally following their phone navigation straight into a ditch.

Balloondoggle
1 year ago
Reply to  Stacks

This hesitation is exactly what happens every time my wife gasps and tenses up in the passenger seat. Sometimes it’s something she sees on the road, but most times it’s a leg cramp or muscle spasm due to her bad back. This leads me to largely ignore her reaction and someday I’ll end up hitting a jaywalking pedestrian because I misjudged my wife’s action when she sees something I missed.

Flyingstitch
1 year ago
Reply to  Balloondoggle

This reminds me a little of when I was teaching my kids to drive, and I would see something that required immediate action. That little lag as I formed the thought, spoke the words, and the kid processed the information–just enough for some exciting moments.

SquareTaillight2002
1 year ago
Reply to  Balloondoggle

My wife tenses up every time I approach a turn at faster than normal but perfectly reasonable speed. I have learned to ignore it but occasionally it is because there is a dog in the road. Maybe I should just let her drive.

SlowCarFast
1 year ago

So the Tesla was the only one not smashed? The world is not fair.

NotSpanky
1 year ago
Reply to  SlowCarFast

Police report shows a gap, but driver testimony indicates they felt an impact from the rear. So I’m guessing the car behind the Tesla stopped in time, but then the cascade of impacts behind eventually shunted them into the Tesla, albeit minor enough that the Tesla could still drive off.
That’s a guess though.

MaximillianMeen
1 year ago

While I agree that this could have been avoided if the Tesla driver pulled his head out of his ass and kept driving out of the tunnel, the irony here is that if all 8 cars had level 2 driving aids, the other cars would likely have kept proper following distance for speed and line-of-sight and, as a result, stopped safely without hitting the car in front. Humans are also bad at paying attention when they are actively driving.

Dave Horchak
1 year ago

Maybe, or if one was a stopped emergency vehicle they all might have rammed it and caught on fire. Because that’s what they do.

Phil Layshio
1 year ago
Reply to  Dave Horchak

Yeah, they all might have done that. They just might have.

Hillbilly Ocean
1 year ago

If I can’t count on the Level 2 to *not* stop in the wrong place, I’m going to be reluctant to believe it will stop in the right place.

Erik Hancock
1 year ago

Actually, I shudder to think of how much worse this could have been if just one more of these cars was operating on an L2 system, let alone all seven. Remember, Tesla’s own guidance is that the human driver must maintain constant vigilance at all times and be prepared to take over from FSD – because the active safety measures “cannot respond to every situation” (Tesla’s own language). In other words, every driver – including the Tesla owner – was legally responsible to be in full control of their vehicle at all times, regardless of any driver-assist mechanisms in operation. You’re saying that seven Teslas running L2 driver assists that require drivers to take over at any moment would have performed better than one Tesla and six drivers who went into the tunnel already under full control of their cars. In other words, your claim is that Tesla’s FSD software would have been vigilant of the developing emergency situation and avoided the unintended, irrational behavior of the first FSD-assisted Tesla ahead – without the intervention of the six drivers who you say weren’t paying attention in the first place? That is saying FSD could avoid a crash without the assistance of the dum-dums behind the wheel – which is exactly what Tesla and every definition of L2 says can’t be done. This is the whole logical fallacy of L2 – manufacturers and their defenders claim that you will be safer because of driver assists, but then say that you cannot rely on driver assists and must be constantly prepared to take over in order to keep yourself safe. Imagine if other safety equipment were like this:

“Full Seat-Belt (FSB) has many advanced safety functions, but you cannot rely on Full Seat-Belt to keep you restrained – it may unexpectedly disengage at any time, so you should always keep your hands ready to brace yourself or be thrown from the vehicle in the event of an accident. Full Seat-Belt is a driver safety-assistance system. It does not provide seat belt functionality. Also, the manufacturer bears no responsibility if Full Seat-Belt should suddenly disengage during regular driving or in an emergency event or if it performs unexpected maneuvers – the driver should be constantly monitoring how FSB is behaving and be ready to immediately disengage and take over Full Seat-Belt’s safety functions.”
