
We Ask An Actual AV Engineer Why Two Recent Tesla FSD Videos Show Such Dumb, Dangerous Mistakes


When David and I started The Autopian not that long ago, one of the things we decided was that we weren't going to fall into the trap of covering every single new Tesla FSD fail video, because there's a steady stream of them, of wildly varying quality and interest, and after a point these just become gratuitous traffic-grab posts without any real new insight. That doesn't mean I haven't been paying attention to what's been going on, which is why I noticed a pair of recent videos showing current versions of Tesla's FSD Beta system in action and failing in ways that I thought were worth exploring, not because of their complexity, but because of how basic the situations were. To help me understand what may be happening, I contacted an engineer at a major OEM who works on similar automated driving assist systems (and who needs to remain anonymous for work reasons). Let's take a look.

The first video, posted by Seattle-area Tesla-focused YouTuber Gali on his channel HyperChange, shows his Model 3 (running FSD Beta version 10.11.2) getting quite confused at an intersection and deciding to turn the wrong way down a one-way street:


By the way, the author of the video has received blowback from other Tesla enthusiasts:


Ugh, these people. Gali, you didn't do anything wrong. Well, other than letting your car turn the wrong way down a one-way street, I suppose. If you really want all of this to actually, you know, work, the problems have to be seen. And this is definitely a problem. Luckily, the full video is still up:

Okay, so, back to the video. What got me interested in this one is that while the driving environment here is a fairly complex urban one, there's really not much about it that's unusual. The particular mistake the FSD system made here, turning the wrong way down a one-way street, is an interesting error because it's one that potentially has extremely severe consequences – head-on collisions – and should be one of the most basic things that can be avoided.

My first thought was that even though Tesla famously does not use HD maps as a major component of their automated driving system, like, say, GM's SuperCruise does, the information that a street is one-way must be recorded and available somewhere, right? I asked our AV engineer expert about this:

“Yes, Tesla does not rely on HD maps, but it does have access to what we call SD maps, basically just the regular GPS maps, and those know what streets are one way or not. Even something like Google Maps on your phone has this information.”

What I didn't understand is why the Tesla didn't seem to be aware that the street was one-way, or, if it did know, why it would ignore that information. Our expert source didn't have an answer for that, but did point out something else interesting. He noted that even if, somehow, Tesla's FSD Beta doesn't use one-way street information from even the normal GPS-level SD maps, there were visual cues it should have been aware of:


[Image: the signage visible from the car at the intersection]

As our expert reminded me, Tesla’s system is capable of reading traffic lights and traffic signs, and uses this information to get speed limits for roads, see stop signs, and so on. Here, in the field of view of the car at this intersection, we can see a one-way sign, a no left turn sign, a straight-only sign, and a green up arrow in the traffic light, indicating that going straight ahead is the only permitted choice here.
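Reduced to its crudest form, combining those cues is just a set-intersection problem. Here's a minimal, purely illustrative sketch in Python (my own toy logic with made-up names, not anything Tesla actually runs) of how each detected sign or signal could whittle down the set of allowed maneuvers:

```python
# Toy illustration only: reduce detected signs/signals to the maneuvers they permit.
# The detection names and permission sets below are made up for this example.
ALL_MANEUVERS = {"left", "right", "straight"}

PERMITS = {
    "no_left_turn_sign":    {"right", "straight"},
    "one_way_left_sign":    {"left", "straight"},  # cross street flows left, so a right turn is wrong-way
    "straight_only_sign":   {"straight"},
    "green_straight_arrow": {"straight"},
}

def allowed_maneuvers(detections):
    """Intersect the permissions implied by every detected sign or signal."""
    allowed = set(ALL_MANEUVERS)
    for name in detections:
        allowed &= PERMITS.get(name, ALL_MANEUVERS)  # unknown detections constrain nothing
    return allowed

print(allowed_maneuvers([
    "no_left_turn_sign", "one_way_left_sign",
    "straight_only_sign", "green_straight_arrow",
]))  # -> {'straight'}
```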

That's four separate visual reminders that you can make no turns at this intersection, which should make things real damn easy for the AI piloting the car here, because there's just one choice: go straight. So why the hell did it turn the wrong way down a one-way street?

Our expert did his best to figure out what could be going on:

“This looks like a good old-fashioned software bug,” he told me. “The car got confused, and as for why the car didn’t simply kick control back to the driver like you’d think it should, it may have – and this is just my opinion and guessing here – it may have chosen to keep trying because Tesla favors statistics that show lower reports of their system disengaging and giving control to the driver.”


What really bothers me about this situation is that this seems like the sort of issue that should have been solved on day one, before anything was released to anyone, beta or otherwise. Not driving into oncoming traffic on a one-way street is not an edge case or an unusual circumstance; it's the kind of thing that should just be hard-coded and checked for at the last moment before an action is taken.

The software decides the path it wants to take, but before that gets executed, that path should be compared to whatever map data is available to see – just a quick check, why not – whether that path will take it the wrong way down a one-way street. If so, then don't flapjacking do it. Easy. I get that the whole business of automated driving is wildly complicated, but making sure you're not driving the wrong way down a one-way street should very much not be.
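To make the shape of that check concrete, here's a rough sketch in Python of the kind of last-moment gate I'm imagining. It assumes a map interface that can report whether a road segment is one-way and which direction it runs; none of these names come from Tesla's software, it's just an illustration:

```python
import math

def heading_deg(a, b):
    """Compass-style heading from point a to point b, in degrees (x = east, y = north)."""
    return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360

def violates_one_way(planned_path, sd_map, tolerance_deg=90):
    """Return True if any step of the planned path drives against a one-way segment."""
    for p, q in zip(planned_path, planned_path[1:]):
        seg = sd_map.segment_at(p)           # assumed SD-map lookup, not a real API
        if seg is None or not seg.one_way:
            continue
        diff = abs((heading_deg(p, q) - seg.legal_heading_deg + 180) % 360 - 180)
        if diff > tolerance_deg:             # we'd be moving against the legal direction
            return True
    return False

# Before executing: if violates_one_way(candidate_path, sd_map), abort and hand control back.
```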

A similar situation can be seen in this other video from a driver in Denver:

In this case, the full video is no longer available.


What happens here is also remarkably simple – again, the simplicity is precisely why I’m writing about this. There’s an approaching light-rail tram, and FSD Beta attempts to turn right in front of it. If you look at the Tesla’s visualizations, you can see that the car did see there was a large vehicle approaching:

[Image: the Tesla's on-screen visualization showing the approaching tram]

So, the Tesla saw the tram approaching, you can see it there in the video on the car’s dashboard screen, and still decided to turn in front of it. Why? Of all the complex and confusing situations an automated vehicle can encounter, this seems to be one of the most basic: big thing coming, clearly visible, so don’t put the car in its path of motion. Again, day one shit. I asked our expert if there’s anything I’m missing here that might explain this baffling and dangerous decision:

“The car sees the tram coming, clearly, perhaps it’s interpreting it as a bus, but that hardly matters. It sees it, it knows the velocity. I can’t tell why it decided to do this. Maybe it just bugged out, or there was a glitch. Things don’t always work out how you’d like with beta software. It’s situations like these that make me suggest that nobody should be using beta software in a moving vehicle. When I was at [major carmaker] and we were testing systems like these, I had to go through a full safety driving course. I’m always horrified Tesla is allowing normal people to test drive FSD software. I’m a software developer, I know how the sausage is made, so it’s terrifying to know people with no safety driving experience are having cars drive them around.”
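For what it's worth, the check the car apparently blew through here isn't exotic either. This is a minimal, purely illustrative sketch (my own toy example with made-up numbers and fields, not Tesla's planner) of gating an unprotected turn on how soon an approaching object will reach the conflict point:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_to_conflict_m: float  # along its own path, to where our turn crosses it
    speed_mps: float               # closing speed toward that point

def safe_to_turn(objects, time_to_clear_s=4.0, margin_s=3.0):
    """True only if every approaching object arrives well after we would be clear."""
    for obj in objects:
        if obj.speed_mps <= 0:
            continue  # not approaching the conflict point
        eta_s = obj.distance_to_conflict_m / obj.speed_mps
        if eta_s < time_to_clear_s + margin_s:
            return False  # it gets there too soon: wait
    return True

tram = TrackedObject(distance_to_conflict_m=40.0, speed_mps=10.0)  # roughly 4 seconds away
print(safe_to_turn([tram]))  # False: do not turn in front of the tram
```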

In the past when I've written about these sorts of FSD videos, I've usually made mention of how I know these are incredibly complex and advanced systems, and that what they do manage to do safely is dazzlingly impressive, because it is, even if it isn't perfect. But these two particular situations are not like that. They're not examples of the car trying its best in a very complex situation and making an understandable error in a challenging circumstance.

This is basic shit. This is about not driving in front of huge moving things that are clearly visible, and not violating basic traffic laws of which there is no excuse to be ignorant. This isn't pushing up against the limits of what the AI can accomplish: this is, genuinely, stupid shit that has no place happening in a system as far along as FSD is, even if it still is in beta.


I wish I had better, more concrete answers for you. Tesla doesn't talk to media about this stuff, and there's only so much an expert can tell from these videos. But there is something to be gained by making sure these videos are seen and commented upon, and that's a reminder that these are Level 2 driver assist systems. Despite the name, they are not full self-driving, by any stretch. You always need to be in control, ready to take over, because there really is no type of mistake, no matter how basic or dumb it may seem, that is out of the realm of possibility when using these systems.

73 Comments
Mr.Asa
2 years ago

Not yet sure what happened here, but potentially another bad result?
https://www.wfla.com/news/florida/tesla-slams-into-rear-of-parked-semi-truck-at-i-75-rest-stop/

StalePhish
2 years ago
Reply to  Mr.Asa

Autopilot/FSD automatically adjusts to the speed limit +/- a scaler up to +5 mph. Looks like they were going FAST when they hit that trailer, much above what is probably marked as a 30 mph zone, which makes me think they were probably not on Autopilot/FSD. Also going by the Google Maps views of rest areas on I-75 in Alachua County FL, it looks like they would’ve had to impact the very last truck parking space in the parking area. My guess is that they exited I-75, accidentally took the truck lane (left) instead of the car lane (right), and the driver was distracted looking for how to get over to the car lane without getting back into the highway, and just drove straight in cutting the slight left corner too sharply.

Dest
2 years ago

“Things don’t always work out how you’d like with beta software. ”
Uh huh. Cause production software is flawless.

Jeff Gillio
2 years ago

“I’m always horrified Tesla is allowing normal people to test drive FSD software. I’m a software developer, I know how the sausage is made, so it’s terrifying to know people with no safety driving experience are having cars drive them around.” This a thousand times over. Companies should not be permitted to turn us all into unknowing and unwilling test subjects in their experiments.

D.B. Platypus
2 years ago

I kind of think these systems are a bit like squirrels, in that they are frequently stupid and crazy, but can get away with it by having agility and fast reflexes. It makes me wonder how often in these videos the car would have suddenly swerved out of danger at the last moment if the driver hadn’t done anything. Not every time, I’m sure … just like squirrels don’t always make it.

I actually just want to see about 50 robot cars set loose in a closed course full of weird obstacles. We could place bets on which ones make it.

Thatguyinphilly
2 years ago

One thing that piques my curiosity about the long term future of autonomous cars is not how they will evolve to adapt to the roads, but how roads and even cities will evolve to adapt to autonomous cars. Considering they do well in flat suburban environments and struggle with urban centers, if autonomous cars become the norm we may see cities evolve into less dynamic spaces where one-way streets don’t exist, or all streets are one-way. When cars superseded trains, cities became far more two dimensional and sprawling suburbia came into favor to accommodate cars. Technological advancements in general have a knack for dulling the senses down to the status quo in the name of convenience. Looking at new cities, especially on the west coast and in Asia, we’re already moving towards extremely integrated and planned, repetitious spaces, the kind of sterile environs Elon Musk and other tech oligarchs seem to prefer. Autonomous cars are pretty fanciful, but there comes a tipping point in all convenience technology when the tech itself stops evolving to bend to society and society decides to compromise to accommodate it. Le Corbusier’s Radiant City might have simply been predicted a century early.

Ian Case
2 years ago

This may be heresy on an automobile website, but we need LESS cars. I live in the suburbs/exurbs of Philly. When we moved in to this house 15 years ago, it took me 24 minutes to get home from where my office was at the time. When I left that job 4 years ago, it took 43 minutes. In the early 00’s, I used to be able to be in center city philly in < 45 minutes from where I grew up, which is 20 miles further from where I am now. Today? Even if I go at like 10:30am on a Wednesday, it’s over an hour.

I hate to sound crotchety, but 'driving' is too accessible to people who shouldn't be driving. Cars have too much power, are too big, too heavy, and too insulated. People detach to the point that they don't even realize they're driving or that there's actual consequences to the things they do. The automated everything, smoothness of transmissions and power now, in my opinion, makes people forget that they're driving a 4000lb missile. Also the fact that everyone is in an ever increasing size of pickup or SUV vs a reasonably sized sedan or hatchback means that they also DON'T CARE that they can do damage to other people.

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago
Reply to  Ian Case

Thats another additive to my list of emotional disorders:

Driving around with my windows down, music loud and no device in a moving vehicle…

Thatguyinphilly
2 years ago
Reply to  Ian Case

I couldn’t agree more, and to add to your heresy, I haven’t owned a car in seven years because I live in Center City. I still love cars, but driving in the Philly area has gotten exponentially worse since I moved here in 2003, and my father grew up here and confirms that the Schuylkill Expressway has never not been an absolute nightmare. 

What I’ve noticed has added to the headache of traversing Center City streets in particular is Uber, Lyft, Amazon, and gig delivery drivers, essentially untrained chauffeurs and couriers, many down from New York for the day, idly waiting for fares or blocking lanes for deliveries. 
We need expanded public transportation in all major North American cities and more urban and suburban people on mass transit than in EVs. Unfortunately many politicos have been sold on EVs as a solution to climate change and car shares as a solution to congestion, namely because they require little civic involvement. 

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago

When I first started driving… I could count on the SureKill to be empty by about 8-10p. I used to do high speed runs up from N.E Phila, down 95 S, over to 676 and up 76 / SureKill to K.O.P.

I still know my way around despite being only out of state boundaries. I go back frequently and I have noticed a major increase in traffic, lights and general stupidity.

SHIT, I used to be able to get off at the Academy Exit.. go over to Grant. Id then shoot up past Nifty Fifties in my 4th gen Accord in mid 2nd-3rd gear. Id jump across the Blvd in mid 3rd. NOW, if you go over mid 2nd and you pass Nifty Fifties.. you are screwed. Then again, ya cant even get up to that speed anymore.

Lived in Phila for 30+yrs. I know every pothole, from N.E Phila to City Line Ave in my sleep. It’s part of a Map I got burned in my brain.

But.. the gig shit has turned everything up-side-the-fuck-down. — That and Kenny is a fucking asshole. Hes got the Spine of a Turd.

Smartascii
2 years ago
Reply to  Ian Case

While I agree with you in theory, people who say this are usually volunteering *other* people to give up their cars. I have yet to see someone declare the need for fewer cars on the road, followed by, “So that’s why we sold ours and now only use public transit.”

Otter
2 years ago
Reply to  Smartascii

I’m also in Philly and enjoy a drive out of town, but would rather walk, or use bus or train within the city. Bike lanes are constantly blocked by rideshare and delivery drivers, so biking for me is not worth it. Garages and driveways are rare, and street parking in most neighborhoods at night is difficult, so riding a bus to dinner or a movie or a friend’s house is the best way to go.
Improving and electrifying public transit in dense areas would decrease greenhouse emissions AND make driving a car easier for those who need to.

Hangover Grenade
2 years ago

I think congress should go ahead and just make a law limiting autonomous cars to highways. I’d say Tesla and others have half a chance of getting that right. I’d say there is about a 5% chance of level 4 or 5 being fully operational in my lifetime.

Ron888
2 years ago

The tram video reminds me of a moment from an earlier FSD video posted on the net (sorry, I can’t remember which).
At one point the driver is making a U-turn on a multi-lane road. *Several times* the car moved like it was going to pull into the oncoming traffic.
Those brief few seconds are pure nightmare fuel to anyone with a brain. If the car had decided to go, there’s ZERO chance the driver or the oncoming cars’ drivers could have reacted fast enough to avoid a big one.

csimme01
2 years ago

Cut to the chase here.
This s**t is hard to do.
Even if the average FSD beta drive is safer than a human drive there are situations where they can really muck it up.
Who the F is still letting them do this on public roads. Do we actually need to wait for a Tesla to kill someone before they put a stop to this?
The really sad part is that Tesla will gladly throw whoever is driving, and I use the term loosely here, fully under the bus when it happens.
A warning to all the Teslastans out there.
You are doing all of Tesla’s FSD development work for free while being 100% liable for anything that happens.

Scott Baysinger
2 years ago
Reply to  csimme01

We have to wait for somebody *important* to die. You know: a “celebrity” or politician.

Jason Snooks
2 years ago
Reply to  csimme01

There’s no way it’s safer than a human driver. It’s safer than a human driver with another human driver in the passenger seat waiting to take over all driving duties should the first driver make a mistake.

I bet if they include every driver takeover (whether called for by the system or manually executed by the driver) as a potential accident we’d find the system is horribly unsafe and unready for public use.

Ian Case
2 years ago
Reply to  Jason Snooks

I think, on the whole, it IS safer than a human driver. Have you seen any of the ‘driving fail’ videos on Youtube? People do some DUMB SHIT that an autonomous car would NEVER do. There’s 17,000 auto accidents in the US alone DAILY. Generally, that means SOMEONE did something stupid. Its EXTREMELY rare for there to be an accident that couldn’t have been avoided if everyone had been paying attention and driving properly. I’m sure there’s the occasional ‘both lights turned green and people hit each other’ or ‘the tie rod broke and the driver couldn’t steer the car anymore’, or similar, but people aren’t good drivers, on the whole.

csimme01
2 years ago
Reply to  Ian Case

I could be wrong on this but I believe the deaths involving Tesla FSD/Autopilot are limited to the driver of the vehicle. I do not include them in my count because they are willing participants.

JTilla
2 years ago
Reply to  csimme01

People have already died.

Fix It Again Tony
2 years ago

Maybe FSD thinks that it can beat the train in the second video.

R G
2 years ago

If so it’s not trying very hard: https://twitter.com/i/status/1542971644355936256

Iain Delaney
2 years ago

The reason Tesla is pushing out buggy software that refuses to disengage to hype their statistics is because they know full-well what’s going on and they’re betting the farm on FSD. Their cars are poorly built and expensive, service on them is even worse, and the major auto manufacturers are rapidly catching up with more attractive BEVs (Ford Mustang, Ford F150, Polestar, Cadillac, etc). Soon, Tesla will find itself awash in competition and no way to respond. The Cybertruck, if it ever appears, will not be the answer.
Elon and his advisors have convinced themselves the only way to save the company is FSD. So they’re using the ‘agile approach’ which works fine for most software, but is a really dumb approach for something as potentially life-threatening as faulty FSD.

Do You Have a Moment To Talk About Renaults?
2 years ago

I don’t get it. I seriously don’t understand how Tesla gets away with it. Their buggy glorified cruise control mixed with human stupidity is pretty dangerous, and it’s hard for me to understand how unregulated these things are that they just get away with it – making everyone in the vicinity of an FSD-enabled Tesla a beta tester, and calling it FSD/Autopilot. Not saying drivers don’t have liability too, but at this point it’s just clear that they don’t care much for people’s safety. It’s mind-boggling that a company like Tesla just gets a pass.

Do You Have a Moment To Talk About Renaults?
2 years ago

They even get away with actively making their cars less safe, like with the choice of ditching LIDAR altogether, a cost-cutting measure that they try to pass off as a purely technological option. I just don’t get it.

Daniel Jones
2 years ago

What will really confuse you is when you watch the full video (for the one way street incident) and witness how excited the guy is at how “good” his car is doing as it drives drunk around the neighborhood.

Do You Have a Moment To Talk About Renaults?
2 years ago
Reply to  Daniel Jones

I did watch the whole video when it was posted a while back, and it just struck me as human stupidity in action, so it wasn’t as confusing as it was infuriating. It is confusing that Tesla doesn’t have a ToS clause that allows them to de-activate FSD for drivers who film themselves letting FSD screw up without taking over. Seems like it would be a great deterrent and would save Tesla some embarrassment.

Scott Baysinger
2 years ago

You ‘n’ me both.

Cayde-6
2 years ago

I’d be willing to bet that what screwed up NSD (Not Self Driving) in the first video was that you essentially had three different signs with differing information.

My car (not a tesla) has the camera that is able to read speed signs. But what it ISN’T able to do is parse both numbers AND text. So you know those signs that say “25 MPH when children present” and similar outside of schools? Yeah, it just says “25 MPH” on my dash, no matter what time or day it is. So when you have one sign that says “No Left Turn,” what I’m wondering is whether it’s able to parse the rest of the signs: the one that says “Straight Only” and the one that says “One Way Only [left arrow]”. Or, a poorly-nested If-Then structure could have meant that it only read the first sign and not the others.

Another possibility is that the “One Way Only [left arrow]” sign conflicted with the “No Left Turn” sign and caused a failure in the logic, with it essentially discarding one of the two conflicting values.

Daniel Jones
2 years ago
Reply to  Cayde-6

Sometimes the cars just plain read the numbers wrong, too. On two separate occasions, I’ve had my Odyssey suggest that the speed limit at my kids’ school is 100 mph. (It is not.) There’s not really anything unusual about the sign, either, it’s just an error on the system’s part. Probably good that a human is controlling the vehicle in those cases…

Drew
2 years ago
Reply to  Cayde-6

“I’d be willing to bet that what screwed up NSD (Not Self Driving) in the first video was that you essentially had three different signs with differing information.”
There’s a very good chance that it couldn’t figure out what to make of the signs. But any decent program should be able to parse that:
1. Cannot turn left
2. Must go straight
3. Cross street traffic goes from right to left
=
A. Continue straight.

It’s a failing to have a program that cannot parse multiple signs in a world where multiple signs often exist. It is a further failing to have it parse them SO INCORRECTLY that it sends the car the wrong way down a one-way street. And, of course, providing that system to unprepared people on public roadways is yet another failing of the system.

Zak Billmeier
2 years ago

How is it legal to test this on public roads, with someone behind the wheel who is likely NOT much of a driver to begin with?

Actually driving my EV manually brings great joy to my face every day, because I consider myself a driver and have had a lifetime of cars with anemic engines.

Jason Hinton
2 years ago
Reply to  Zak Billmeier

It is legal because Teslas aren’t self-driving. Legally they are a Level 2 autonomous system where the driver is legally responsible for the car at all times. So if FSD glitches and crashes it is the driver’s fault – not Tesla’s.

How is Tesla’s marketing legal? I have no idea.

Ricardo Mercio
2 years ago

That last bit is honestly my biggest gripe with Tesla. They pretend their software is ready and claim so to the furthest extent that they’re legally allowed to.

They always play it off as if full self-driving is already achieved and they’re only legally required to call it level 2 because they haven’t finished dotting their i’s and crossing their t’s. They encourage their buyers and drivers to believe that their software is ready and trust in it.

They don’t say “Don’t sleep in your moving car because we believe the software isn’t safe on its own and you will die”; they say “don’t sleep in your car because OTHER PEOPLE don’t believe the software is safe on its own and you’ll lose your license”.

Robert Kirchner
2 years ago

FSD= Full Student Driving.

R G
2 years ago

or Full Shitty Driving

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago

Recently, I have boiled down my driving to… lunacy caged.

Sure, Ive gotten speeding tickets (but thats mostly attributed to not having a decent detector and not recognizing basic police 101 cues.)

But as of past 5-10yrs, Ive realized that its not the car thats doing the safety – ness… its my driving. Im the one who might miss a cue of a vehicle thats not paying attention. Im the one who is getting major cues about police up ahead and semis with reefer trailers and cars wayyyy tooo close behind them. Im the one doing the work.. the car is my tool, my abilities rendered in the Physical Form.

With that said…
I wouldnt want a driverless laptop bullshit car… why?
Think about this, everyone (except me because I have enough emotional disturbances to rule out being distracted by some touch enabled D E V I C E) has these devices and everyone would rather be on them.. doing mindless things.. than driving.

SO…
Instead of driving.. we would rather allow a device to do the work for us… as a pacifist comment enabling the vehicle to do work you choose not to… to stare into your device some more? Sounds like a copout to me.

Id rather be driving.

Ian Case
2 years ago

I think the question is, YOU are aware that the problem is other people. To me, humans are WAY more unpredictable than a computer is. A computer can mess up, sure, but that’s because it’s not programmed properly. Usually if a person does something irrational, we have no way to figure out why, click and type some things, and make sure that never happens again. And all the Teslas have the same software, meaning once it’s pushed OTA, that mistake won’t happen on ANY Tesla. You can train someone not to do one thing, but that doesn’t do shit for the rest of the people. There’s billions of people on the planet and every one has unique software running on their brain. I’d LOVE to be able to turn self driving on sometimes. Not all the time, I love driving, but recently driving has been significantly LESS fun because there’s too many people doing too many unpredictable things in automobiles way too big and powerful for them to control, and they’re so disconnected from the road and the outside world by those automobiles that it’s inherently LESS safe than it would have been had they been driving a shitbox from 1994. Because in that case, you HAVE to control the car and pay attention to ensure it stays on the road.

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago
Reply to  Ian Case

But with Humans… you have psychology and science to back it up along with a factor of X for being distracted.

Getting into a car.. with a computer as the driver.

Id rather walk.

SquareTaillight2002
2 years ago

I want self-driving but I have seen all these videos. I have had “smart” cruise control try to drive me off a highway. I can’t trust them. There is no way I would cede control of my car to any of them.

Most people have a strong drive for self-preservation. How are all these other people reaching such a different conclusion?

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago

Simple, dont want self driving = computer driving for you.

Ian Case
2 years ago

Do you ever get in a cab? or an Uber? Because you’re ceding control at that point, too. I’m not arguing that self driving is ‘ready’, but to write it off because it’s not ready is the same as the people saying the internet is a fad, computers are a fad, cars are a fad. It’s here, and it’s going to get better. Remember how bad voice assistants were when they debuted? They’re still not perfect, but it’s amazing how far they’ve come in 10 years.

Defenestrator
2 years ago
Reply to  Ian Case

Voice assistants are a great example. They’ve improved, yes, but the biggest difference is that they mostly try to distinguish among a large but discrete set of possible actions. That, they can do fairly well. Full speech to text, however, has been asymptotically approaching good for decades now. It gets most of it right, but every now and then it just absolutely murders a word or sentence. You absolutely have to double-check everything it does. Similarly, stuff like smart cruise control that carefully constrains what it’s willing to handle works well, but trying to handle every arbitrary situation mostly works but sometimes fails spectacularly.

Steven Moor
2 years ago
Reply to  Ian Case

Ceding control of a vehicle to someone that spends their life driving on the roads of a specific city is far safer than ceding control to an AI that is still learning. I don’t know what car manufacturers are going to have to do to build up enough confidence for people to be able to trust self driving cars with their lives, but I know that we are nowhere near that level right now, and Tesla certainly isn’t helping the public’s perception by allowing people to beta test FSD, because you get to see the system do dumb, life threatening shit like this.

Richard O
2 years ago

In the Denver case, the street is one way, but the light rail comes the opposite direction. It’s possible the system did not think the train was moving in “reverse.”

Drew
2 years ago
Reply to  Richard O

I suspect that is exactly what happened. Which is a major failing. Even without the rail there, there are plenty of reasons something might be going the wrong way, including someone else’s mistake, pedestrians, emergency vehicles, and bicycles. The car should be able to track movement and maneuver accordingly.

Joe The Drummer
2 years ago
Reply to  Richard O

So, this video depicts the system failing at both reading road signs and the map, AND detecting a frickin’ train crossing its path?

Call me when self-driving cars can pass a goddamn driver’s license exam in any of the 50 states of the union of your choosing.

ExAutoJourno
2 years ago

Beyond this lies another concern, at least for me: a new generation of drivers soon won’t have the instincts or experience to actually take control of an automobile on their own. They will expect FSD to do what the name implies, and just let them sit back and text, watch cat videos on their phone, or just chill and think about something totally unrelated to, you know, driving.

I still maintain that every “self-driving” vehicle should have a mandatory bright, rotating beacon on its top so those of us still driving the old-fashioned way can avoid them. If said beacons can be seen a couple of blocks away, there will be plenty of time for us mossbacks to make a turn at the next corner and continue safely….

To me, this AV stuff is a parlor trick, one that could potentially cause a lot of damage and injure people. Human drivers are far from perfect, but it’s a “better the devil you know” situation in my book.

Drew
2 years ago
Reply to  ExAutoJourno

I’d just prefer they hire trained safety drivers.

Zack Oliver
2 years ago
Reply to  ExAutoJourno

Maybe they could limit self-driving cars to people in their early 30s or people with at least 10 years of driving experience.

Just so that they have real experience and muscle memory for taking over the car when needed.

Drew
2 years ago
Reply to  Zack Oliver

Driving experience is different from supervising a car trying to do the driving for you. Train and pay someone to monitor, control, and report.

Drew
2 years ago

Let’s imagine, for a moment, a situation in which the map was wrong about a one-way street. Your “self-driving” car reads the map data, the traffic light, and the signs, and decides to trust the map data over all the other data. That’s a bad choice in the programming to favor the map over everything else. And I wonder if that is what happened.

As to the train, I suspect that the car recognized it was on a one-way street, recognized a large vehicle, but did not recognize that the large vehicle was on tracks that went the opposite direction from the street OR track the motion of the large vehicle (perhaps it assumed it was catching up faster than it was driving, somehow?). Again, poor programming if you do not track motion in the decision-making process. Even if we ignore tracks, there are many other situations in which a vehicle, pedestrian, or debris could be going the opposite direction from the markings. Including if a self-driving car decides to go the wrong way.

Michael Beranek
2 years ago
Reply to  Drew

1. Trust the signs
2. Lidar

Drew
2 years ago

Absolutely. Poor programming and refusal to use better object/speed detection tech combine poorly with putting “self-driving” vehicles on public roadways.
Signs should certainly have priority, as any temporary changes are likely to be shown via signs, as well as the reality that maps may not be updated in a timely manner. The train definitely demonstrates that lidar has plenty of use, no matter how willing some are to trust camera-based sensors alone.
And paying safety drivers to go out and test the system would be better than a beta test. Especially since the beta testers are often big fans of the company and the feature, so they want it to look good, even though they should really want it to work better.

Cayde-6
2 years ago
Reply to  Drew

Right, that’s my thing. I can understand prioritizing visual signage over map directions because there will invariably be situations where the map data doesn’t align with the real-world (who knows, for example, how long it takes for a change in a street to be updated in Google Maps et al…)

Michael Beranek
2 years ago

Relying on Beta testers to work the bugs out of automated driving software is like relying on a plumber to treat you for cancer. Or relying on an attorney to do your car’s alignment.

Oafer Foxache
2 years ago

Yeah but Tesla (probably) says:
1. “Woohoo! Free testing!”
2. “Even if the system actively hunted down and ran over small children and puppies, most of our fans would still find a way to praise us!”

10001010
2 years ago

If they put up a “no left turn” sign, why not throw a couple “no right turns” around that intersection as well?

Icouldntfindaclevername
2 years ago
Reply to  10001010

I was thinking the same thing. Plus, the amount of glare on the one way sign probably made it hard to read.

MaximillianMeen
2 years ago

I wouldn’t be surprised if the one way sign combined with the no left turn confused FSD into thinking that a right turn must be OK. Still shouldn’t have ignored the straight only sign, even if the arrow on the one way sign was obscured by glare.

Cayde-6
2 years ago
Reply to  10001010

The “No Left Turn” sign must be temporary (construction perhaps?) because the other sign says “One Way Only” with a left-pointing arrow.

Drunken Master Paul
2 years ago
Reply to  Cayde-6

No, that “No Left Turn” sign is permanent. This is 5th ave in Seattle and the old monorail support posts separate the two lanes going south (5th ave is one way southbound). It’s not legal to take a left turn from the right lane as it would cross the left lane traffic that could turn left, or continue forward.

With that said, why on earth are people even using that system? Make Tesla actually pay to beta test it themselves instead of getting owners to do it for free? Or at least pay Tesla owners to use the systems and put a lot of legal stuff in writing along with it.

Jason Snooks
2 years ago

What’s even worse is the owners are actually paying Tesla something like $12k (I don’t know for sure, they keep jacking up the prices) to beta test their software for them. Quite an early access scam Tesla is running.

Drew
2 years ago
Reply to  10001010

Sure, they could, but most people see the one-way and recognize they should not turn right. The left turn one is because some people don’t remember there is another lane going the same way on the other side of the barrier, so they think they are clear to turn onto the one-way.
In a perfect world, people would recognize a green straight arrow means they need to go straight. But, yeah, a sign warning people not to turn right probably wouldn’t hurt.

Vetatur Fumare
2 years ago
Reply to  Drew

I wonder if FSD would work better somewhere like Germany where there is an actual system for how to design roads and their signage. Although I guess that any country smart enough to be able to design roads is not stupid enough to allow shitty bot drivers on said roads.

Sid Bridge
2 years ago

“I don’t see anything wrong with the system”
-Elon Musk, as he walks right into a hot dog cart then turns left into a construction site, drops into an open manhole and knocks over a canister of toxic waste.

Interrobang‽
2 years ago
Reply to  Sid Bridge

We should be so lucky.

Icouldntfindaclevername
2 years ago
Reply to  Sid Bridge

You forgot he was posting to that bird thingy at the same time

SLIDTossedPissedinto BleuCHSaladwCroutons
2 years ago

I came here to comment…

I have done.. what I WISH others would do for “social media”:
I log on to twitter… with a stack of “people” that I follow (Peterbilt, Mack, various Class 8 and/or 9 OEMs, various retailers, various police Depts in various areas doing Truck Stops at weigh stations.)
And just downloading a ton of Truck, Semi, Trailer, Tow Truck / Wrecker pics in the process.
No commenting = I dont say a WORD. I upvote or “heart” a picture and thats it.

Twitter is absolutely fantastic for — exactly what I want = “streaming” outlet for just what I want (Truck / Semi / Wrecker pics of all kinds) and nothing absolutely N O T H I N G else.

Id never post… anything to that (or any other) like type site.

Data
2 years ago

Car turn wrong way down one way street (Operator fails to correct)
Operator frets and worries about how bad this is (Got to build up the excitement and drama)
TikTok pay day

My guidance counselor never told me I could be an influencer or play video games for a living, so I went to college. I may have chosen poorly.

Vanillasludge
2 years ago

The value of all these bugs is that there is NO WAY you could be distracted from what the car is doing. It’s like living with a loose python. Sure it’s cool, but fall asleep and you’re dead.

Jason Snooks
2 years ago
Reply to  Vanillasludge

Right. The system is so terrible it keeps you on edge the entire time. Definitely not a relaxing driver assist system.

Ben
2 years ago
Reply to  Vanillasludge

That’s my thing about these videos. Even when the system is “working”, it drives like the worst nervous new student driver, jerking the steering wheel around and constantly accelerating and decelerating randomly. If I were in the car with a human driver like that it would be the last time I let them drive anywhere, never mind paying them $12000 for the “privilege”.
