
> By recording driver behavior, Musk could argue that driver error led to the accidents rather than Tesla technology

We really need the US government to iron out the liability questions around self-driving. It's kind of a joke at this point.

Tesla is offering people the ability to NOT DRIVE. In the case of an accident, they want to prove that the driver was being negligent by NOT DRIVING.

Or you can take GM's approach and literally have cameras pointed at your eyes to make sure they are looking at the road and a steering wheel that can tell when you let go. And disable your ability to NOT DRIVE, whenever they detect you NOT DRIVING.

In comparison, Tesla is at least being an "honest liar". They are giving you what you want, which is borderline illegal, and will rat you out for using it the first chance they get.



> will rat you out for using it the first chance they get

In one fatal collision, Tesla held a press conference to throw blame at the driver and away from their autonomous systems.

They pulled telemetry and stood up at the conference and said that Tesla was not at fault because "the driver had been inattentive and the vehicle had warned him of this".

What later came out was that the vehicle had warned him to keep his hands on the wheel...

Once. Eighteen minutes prior to the collision.


But the way human bias works, it doesn't matter much whether the statement was correct; in tribal disagreements, the fanboys will believe it and the haters won't, at least in the short term.

Musk was smart to spend that much time cultivating a fanbase who will believe stuff like that.


Serious question for fellow Tesla drivers: Does anyone currently use FSD? What's the value prop? I just got a Model Y and tried it out (it came with a 3 month free trial) and found it awful. To the point of this comment -- it's called "Autopilot" but it yells at you (and punishes you!) if you look at your phone or stop paying attention or take your hands off the wheel.

I just can't imagine this being something I'd pay an extra $199/month (or $15,000) for after the trial's over.

I much prefer the Waymo / Cruise vision of a car I can sit in as a passenger, not this weird hybrid "still need to pay full attention but have none of the control of being an actual driver" version Tesla is offering.

But maybe I'm missing something...?

EDIT: I'm a terrible nerd and confused FSD and Autopilot. What I'm really asking about is FSD (Full Self Driving). Sorry about that!


> it's called "Autopilot" but it yells at you (and punishes you!) if you look at your phone or stop paying attention or take your hands off the wheel.

There are many things to break down here:

1. FSD mostly sucks (as commenters have alluded to). Autopilot is quite good, although it does “fuss at you” for things you mentioned.

2. Autopilot is a misnomer. What the feature as documented does versus what the name suggests are not the same. Imho, this is actually a marketing failure by Tesla — calling it “Advanced Cruise Control” or some made up name would have been just fine.

3. You should keep your eyes on the road when using advanced cruise control.

4. Related, you should not be looking at your phone while you are driving (with the exception of maybe a quick look at a message when you’re stopped).

5. You actually can take your hands off the wheel, often for quite a bit of time, while in autopilot. Also, just touching it is not enough — there needs to be some tension on the wheel. This is not hard to do, and is a reasonable ask imho.

6. To stop the car from fussing at you about a few things, I recommend getting a little slide cover for the interior camera; you can find one on Amazon for $5-10. Works like a charm.


> little slide cover

The car will scream at you if you cover the camera while using FSD. Very Black Mirror.


This might be specific to FSD.

No problems with AP.


100% agreed - yes and you helped me realize that I was confusing FSD and Autopilot. Everything you've said about Autopilot is correct and accurate and I agree. What I really meant to ask about is FSD -- thanks for the clarification!


"Autopilot" is kind of a funny term because pilots definitely still need to pay attention to other aspects of the plane while flying


> But maybe I'm missing something...?

I think so, because FSD != Autopilot, and if it's nagging you constantly about paying attention you're not in FSD.

I'm not arguing that FSD is any good or worth it, I'm just suggesting you didn't actually use Tesla "FSD".


Waymo / Cruise is limited to San Francisco, and isn't generally available right now. FSD is available today and can access freeways for long distance driving. SF to LA or NYC to Boston, say. FSD > Autopilot. Even if you have to pay attention, it still makes the drive easier - at the end of the drive you're less exhausted.


Yeah I hate the idea of semi-autonomous vehicles (a vehicle that can handle any kind of situation all by itself [terms and conditions may apply]). You can't make a 95% solution in this space.


I suspect the "out of the loop" problem [1] is going to make the last 10% very dangerous.

> The out of the loop performance problem arises when operators suffer from complacency and vigilance decrement

[1] https://www.frontiersin.org/articles/10.3389/fnins.2017.0054...



> https://www.tandfonline.com/doi/abs/10.1080/19439962.2023.21...

> Although [Tesla] Level 2 vehicles were claimed to have a 43% lower crash rate than Level 1 vehicles, their improvement was only 10% after controlling for different rates of freeway driving.

but even ignoring the statistical issues, Tesla Safety Reports are a measure of the safety of Autopilot AND a human driver.

Community sourced critical disengagement data (because Tesla refuses to release industry standard metrics for AV performance) shows that FSD Beta alone is far far worse than the average driver: https://sites.google.com/view/fsdbetacommunitytracker/home?p...


A self-reported, unaudited number published by the manufacturer claiming the product they are selling is safe is not data.

Tesla’s “vehicle safety report” consists of publishing three ratios of miles/accident every three months. That is it. They do not even publish the number of miles or the number of accidents used in their calculation. You would fail your elementary school science fair project if you made a report like that.

This is what a real safety report looks like: https://storage.googleapis.com/waymo-uploads/files/documents...

The differences are stark. The level of Tesla’s report is so far below the minimum standards of acceptable conduct that either everybody at Tesla safety is an utter moron or they have deliberately published a report that even a science-minded high schooler would be ashamed to submit in an attempt to deceive customers.

Oh, and there is a trillion dollar conflict of interest, the report is totally self-serving, they have no independent confirmation, they reject working with independent auditors, they hide information from third parties, and they demand the government redact information from public reports.

tl;dr That report is bogus self-serving lies.


I am open to studies as rigorous as you demand indicating Autopilot/FSD is more dangerous than human drivers. But until those are produced, total miles and total accidents seem to be pretty objective, and easily tracked, numbers. There's really no magic there. And on that count, Tesla makes very safe machines.


You are open to rigorous studies proving it is dangerous? This is a safety critical system; the default assumption is that it is unsafe and rigorous independent studies must be made to prove it is safe. You are guilty until proven innocent when making dangerous devices, that is how we threaten companies into not killing people for their own greed.

The utter contempt for rigor and acceptable standards on safety reporting Tesla has demonstrated is appalling. Your average elementary school science fair project is more transparent and robust. I am not joking or exaggerating on that assessment. They deliberately choose to not even publish the number of accidents or the number of miles used in their division operation. It is hard to overstate how ridiculous this is. Even Philip Morris had to try harder when claiming cigarettes were safe.

Tesla’s numbers are total and utter garbage.

But, since you want rigor. Tesla has reported ~350M miles on FSD. They have over 700 crashes with no known survivors attributable to FSD officially reported and acknowledged by Tesla to NHTSA. This results in a worst case estimate of 1 fatality per ~500,000 miles, 10x more fatalities than their purported accident numbers and 120x worse than the average driver who has 1 fatality per ~60,000,000 miles.
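For concreteness, here is a minimal sketch of that worst-case arithmetic, using only the figures quoted above (which are assumptions drawn from Tesla's and NHTSA's reporting, not independently verified):

    # Worst-case arithmetic sketch; all inputs are the figures quoted above.
    fsd_miles = 350_000_000                  # ~350M miles reported on FSD
    crashes_no_known_survivors = 700         # NHTSA-reported crashes with no known survivors

    worst_case_miles_per_fatal = fsd_miles / crashes_no_known_survivors
    print(worst_case_miles_per_fatal)        # ~500,000 miles per fatal crash (worst case)

    avg_driver_miles_per_fatal = 60_000_000  # ~1 fatality per 60M miles for average drivers
    print(avg_driver_miles_per_fatal / worst_case_miles_per_fatal)  # ~120x worse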

I have now provided a more rigorous proof of danger than the Tesla safety report. I have provided the numerator and denominator as drawn from official Tesla and government sources. My report is more transparent, robust, auditable, and conflict-of-interest free than Tesla’s safety report.

If you or Tesla would like to dispute the number of fatal crashes on FSD, then for each such crash provide a report indicating if FSD was not on or all occupants survived. That is the bare minimum level of proof required, which is still less than what Waymo already reports voluntarily without being forced to. Until then, Tesla’s auditable safety record of 120x more dangerous than the average driver speaks for itself.


LOL - 700 crashes with no known survivors attributable to FSD? You're claiming 700 people have died while operating FSD? Friend, that's not a worst-case assessment. That's just a lie.

But to be sure, there will be people who die. That is true of nearly every human activity. We have to accommodate a non-zero death rate. Were we not to, we'd immediately outlaw driving altogether, which would impoverish millions. Your angry "prove it's safe" argument is an ideological position not grounded in any useful reality. Their numbers evidence the technology is very safe. It is not perfectly safe because nothing is.


So not a single one of the 700 crashes with no known survivors that Tesla reported to NHTSA, ~95% of all crashes they have reported, had any fatalities? That sounds really easy to prove. Tesla can just get some lawyers to query the police reports. The police reports will record any injuries and fatalities and then Tesla can update their NHTSA incident reports to fill in the police report boxes that they chose not to fill in during their initial and follow-up reports.

Or are you arguing that Tesla, which knows exactly where every crash occurred and is a nearly trillion-dollar company, has no obligation to do any investigation? Instead it is the public's job to reverse engineer Tesla's crash information from the redacted reports and then donate time and money to do it for Tesla.

Furthermore, you have completely deflected on providing any support for the numbers in the Tesla safety report you keep quoting. Tesla's safety report claims 4.8 million miles per accident. You and Tesla have exactly zero evidence supporting that number except for Tesla's word that Tesla's cars are super safe which is why you should give Tesla money. Tesla will not even answer a simple question like how many accidents occurred in each quarter, let alone proper reporting like Waymo who indicates exactly which NHTSA reported crashes they corresponded to and the nature of the crash. You complain about publicly auditable numbers reported by Tesla to NHTSA while pushing completely unsupported numbers self-reported by the entity with the largest possible conflict of interest as if they are facts.

Also your last point railing about proving things safe is a strawman. Proving something safe does not mean "proving it perfectly safe", it means demonstrating the desired level of safety through rigorous investigations producing well-documented, auditable evidence confirmed through unbiased, third party access to the data and data generation methodology. Which is already the long-standing standard procedure in safety-critical systems in industries such as aviation, medical, automotive, transportation, and industrial control. It does not mean publishing an unaudited number while doing no investigations and preventing third party or regulatory access to the raw data like what Tesla does. That is called safety reporting malpractice.


I think it's quite likely that, if people had died, the press would have covered it (as they have so salaciously covered the accidents). I have no reason to believe Tesla is lying. But, before we barrel down that path, why don't you tell all of us what level of death would be acceptable to you. Would you be willing to endorse FSD if it were, on average, as safe, or more safe, than human drivers?


If any autonomous driving system could demonstrate safety at or above human drivers, supported by a robust dataset generated from a rigorous investigative process and made available to credible, unbiased third-party watchdogs or regulatory agencies that confirm the findings, then I would support the further testing and deployment of such a system.

Now your turn, how many people has Tesla ADAS software killed and what is the acceptable rate?

As to the data reporting I demand, which is longstanding standard in safety-critical system deployments, Tesla is literally the furthest from that. They deny all third party access, voluntarily publish no raw data, demand the maximum redaction in all mandatory government reporting, and hide data except when selectively releasing private data out of context in press conferences only when it portrays them positively and burying the rest. They choose to do no investigations so they will not be required to report negative outcomes, they sue reporters, threaten news organizations and employers with lawsuits to make them silence their employees, and direct abuse against regulators to force them to recuse themselves from investigations. Tesla uses every trick in the book to prevent a critical look at their numbers.

There is exactly zero reason to believe even a completely transparent company about the safety of their own products. A company as dishonest and cagey as Tesla providing numbers with zero supporting evidence, who has a history of cherry picking misleading good-sounding data, should have their numbers resoundingly ignored.


Nah, I think they've produced sufficient evidence to shift the burden. Where's your evidence of all these deaths? This is safety critical in the way that all cars are safety critical. This is not an airplane, and this is not the FAA. If we're going to drag Tesla through the mud, as you seem hell-bent on doing, I think it should be on you to evidence this epidemic of Tesla deaths. You cannot because there is no such epidemic.


You and Tesla have not provided any evidence. All you have done is point to self-reported numbers with absolutely no supporting data.

You cannot even tell me the number of accidents reported or the number of miles driven on Autopilot in Q4 22 used to calculate Tesla’s number. That is the numerator and the denominator, really basic stuff that even an elementary school science fair project would report.

Given your total refusal to supply any data or evidence at all to support your points, do you even know anything beyond the Tesla talking points? Please reply with the number of confirmed fatalities (or fatal crashes) in the US while Tesla ADAS systems were in use (hint it is non-zero).


If it works for 99,999 seconds for every second it fails, then I should expect it to fail before 30 hours of use. That's abysmal.


True. But at a hypothetical 40 mph average, one accident per 6.57 million miles is one accident every 164,250 hours. There are 8,760 hours in a year. These are exceptionally safe machines.
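A quick sketch of that conversion, taking the quoted 6.57 million miles per accident and the hypothetical 40 mph average as given (these are the comment's assumptions, not audited figures):

    # Convert miles-per-accident into hours-per-accident at an assumed average speed.
    miles_per_accident = 6_570_000   # quoted rate, taken at face value here
    avg_speed_mph = 40               # hypothetical average speed

    hours_per_accident = miles_per_accident / avg_speed_mph
    print(hours_per_accident)          # 164,250 hours between accidents
    print(hours_per_accident / 8_760)  # ~18.75 years of continuous driving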


compare:

“If it works for 99,999 microseconds for every microsecond it fails, then I should expect it to fail before 0.1 seconds of use.”

with:

“If it works for 99,999 hours for every hour it fails, then I should expect it to fail before 11 years of use.”

This comparison can be twisted to make any conclusion you want.


A car accident lasts for something on the order of 1 second, maybe a bit more or a bit less but no car accident lasts for a solid hour. I'm not concerned about one complete hour of failure, I'm concerned about one second of failure.


Who could possibly imagine something called Autopilot wouldn't require you to be in full control of the car at all times?


An interesting bit of history is that autopilot was one of the first common terms for cruise control, and was in use until at least the mid 60s.

That is far enough back to not have any relevance to Tesla's current customer demographic though.


Also airplanes use autopilot and require full control at all times. It's almost as though there are no situations where the term autopilot refers to a fully automated system. Who knows where the general public today got that impression. Perhaps science fiction stories.


Airplanes use a variety of modes of autopilot systems and some modes absolutely do not require the pilot to touch the controls. They do require the pilot to set the mode and its parameters in the first place, and it would usually be considered a serious safety violation if both pilots were to get up and leave the cockpit while autopilot was engaged, but the technology is definitely good enough to handle this situation.

This is not true for Tesla's FSD. Not by a longshot.


> some modes absolutely do not require the pilot to touch the controls.

Autopilot can disengage and return control to the pilot at any moment. Pilots have to be on alert at all times. This has resulted in some catastrophic outcomes, such as AF447.


The article is referring to Tesla's Full Self Driving feature, not Autopilot.


As a frequent user of autopilot, I can assure you it is nothing close to NOT DRIVING. It is advanced cruise control and you need to be attentive. Even full self driving does many many things imperfectly, requiring you to take over. I'm not even talking about dangerous situations, just regular driving. It often won't move over when you want, it's too rough taking off ramps, and so on.

If you want to go down the road of regulation, I would like to see ALL driving assists subjected to the same laws. So no cruise control for you without cameras watching your eyes and all the rest. Though personally, I think it's fine as it is, making it very clear that you are still driving. Tesla doesn't do well at the latter in their marketing.


> As a frequent user of autopilot, I can assure you it is nothing close to NOT DRIVING.

And yet they're allowed to market it as "auto pilot", which is clearly an "I'm not required to be attentive" message. I agree, though, it's advanced cruise but not true driverless driving. It's not even worth it, to me. If I have to not only monitor the road but also "someone else's" driving, that's more work than just driving.


> If I have to not only monitor the road but also "someone else's" driving, that's more work than just driving.

This is how I feel about FSD.

Autopilot, on the other hand, is really good for on the highway/interstate.

Speed regulation, maintaining spacing, and lane assist are all smoothly implemented. It is certainly less work than driving, typically by a lot.

I say this with about 20k miles of autopilot usage in my model y.


The semantic difference between "cruise control" and "auto pilot" seems minimal to me. Imagine you have never heard of the term "cruise control" and a car company used the term.


Nope, Tesla is not offering people the ability to "NOT DRIVE". They might do so eventually, but all that they actually offer today is an advanced driver assistance system.

There is no need for the US government to make any changes to liability laws. Almost all such cases are handled at the state level.

Mercedes-Benz has gone quite a bit further and is now offering a true SAE level 3 autonomous driving system. It is significantly more advanced and reliable than the Tesla system. And they stand behind it by assuming liability for collisions.

https://media.mbusa.com/releases/release-1d2a8750850333f086a...


> Nope, Tesla is not offering people the ability to "NOT DRIVE"

This is technically correct but they, and especially Musk’s public comments, are pretty fast and loose around that. He’s been claiming L5 will be available in 1-2 years since 2016:

https://arstechnica.com/cars/2021/05/tesla-autopilot-directo...

Remember when they claimed that the driver in their promotional video was only there for legal reasons because the car was driving itself? Someone who paid extra for FSD back then has probably already sold the vehicle without ever getting it:

https://www.bloomberg.com/news/articles/2023-01-19/elon-musk...


https://twitter.com/elonmusk/status/1677531425795391489

https://twitter.com/elonmusk/status/1683348253763330048

Elon Musk openly replying to FSD users stating the car drives itself just a few months ago.

Also the promotional video was faked and staged. Ashok Elluswamy, the current head of Autopilot software, stated in a deposition under oath that he was personally involved in faking and staging the demo. During at least one of the test runs it ran off the road into a fence. They still wrote: “The driver is only there for legal reasons.” despite that.


Yeah, I can only imagine their lawyers scrupulously maintain a record of what they told Elon before he did the opposite so they can’t personally be held accountable.


> Nope, Tesla is not offering people the ability to "NOT DRIVE". They might do so eventually, but all that they actually offer today is an advanced driver assistance system.

That's not really what they are selling though, and that's one of the problems.


>Mercedes-Benz has gone quite a bit further and is now offering a true SAE level 3 autonomous driving system. It is significantly more advanced and reliable than the Tesla system.

Is it? From your link...

>On suitable freeway sections and where there is high traffic density, DRIVE PILOT can offer to take over the dynamic driving task, up to speeds of 40 mph...DRIVE PILOT available in the U.S. for model year 2024 S‑Class and EQS Sedan models, with the first cars delivered to customers in late 2023

So very limited driving scenarios and extremely limited product availability.


It is restricted to 40 mph (about 64 km/h) and to just a few limited-access roads (a few sections of the autobahn).

This means that it is useful only in dense, slow-moving traffic. That is scarcely better in practice than my 2015 Model S with first-generation Autosteer.


It is also level 3, which means you can watch a movie on your phone or send emails. Something you can't do with a Tesla even if you purchased the $15k FSD package.


And when the car asks you to take over how long do you have to respond? Can someone do their knitting while the car drives and be able to put that down and take over before the car decides that it has to come to a safe stop?


Ten seconds per regulations.


If I'm immersed in doing something else, I wonder if that is really long enough.

I'm sure it's some kind of progress but it seems to me that being restricted to 40 mph rather reduces its utility. Traffic on motorways is usually travelling faster than that. This means that it will be used quite rarely which makes me think that this is essentially a PR exercise on the part of Mercedes. Also the press releases say nothing about how the car and driver handle the transition from Level 3 below 40 mph to level 2 above and the reverse.


FSD does use the internal camera to enforce eyes on the road and hands on the wheel. It’s actually worse for road-trips on the highway than plain autopilot for this reason, aside from the fact that FSD will automatically pass and change lanes.


Autopilot will nag you every 30 seconds if it doesn't detect hands on steering wheel.


Hilarious typo. I’m imagining the vehicle saying things like “You suck, you’re so lucky I’m the only car that will ever love you.”


Acktshually, negging would be a tad bit more subtle than that. It is a form of backhanded compliment.

I imagine a negging Tesla car would say something like "it's so cool to see a person who has so little to lose they don't care whether they live or die" when you take your eyes off road and/or hands off wheel.


I find the time varies, and that the amount of attentiveness required is much lower than that for FSD. FSD will ding you very quickly if you have a device in your hand, especially compared to plain autopilot.

My car came with an internal camera that didn’t have IR lights, which made FSD completely unusable at night time (they put in a new one under warranty).

I only subscribe to FSD every few months for long trips, but I have a 1,400-mile road trip planned this weekend and I will be using plain autopilot.


There are many well known workarounds for that though since the detection is just an accelerometer.


I recommend you read the biography and not some blog taking an entire chapter and turning it into a few sentences out of context; there was a lot more nuance in the book.


[flagged]


> The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that,

1) no one else calls it FSD or anything like that. Every other car manufacturer with driver assistance tech calls it what it is.

2) other manufacturers define safe constraints on their tech and then enforce those constraints

> , but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.

Super Cruise and BlueCruise are both hands free. Mercedes Drive-Pilot is licensed in California as Level 3 and won't require the driver to pay attention.


> FSD

The F here stands for 'full', but...

> I need to remain mindful of the system at all times

It sounds like it should really be called supervised self driving (SSD) to me.

I think the technology is interesting but I wholeheartedly object to the name and the promises that it implies.

> If you pin the failure of a driver to oversee FSD on Tesla

I think it's reasonable to pin the failure on the system if you call it FSD.


Tesla's system requires less supervision than other manufacturers. And that's where the rub is. They're saying -- hey, we'll make this tool available to you, and it really will function autonomously, but you have to know Tesla's not going to take the hit if there's an accident. In America's litigious society, I think that's the only way we'll ever get these tools. Otherwise, the plaintiffs' lawyers will destroy Tesla. I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and for the liability to transfer to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.


> I'm comfortable assuming the risk if there's a fuck-up.

Well, here's the rub. The risk is not just ours. It also involves others.

If said "fuck-up" is spilling some tomato sauce on the carpet, then sure, we can say "my bad," and take out our checkbook. It's fairly certain that the other person the risk exploded on, will accept our amends.

However, if it is running over a child, I don't think the checkbook thing will work.


> I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and for the liability to transfer to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.

It isn't just about you, it's about everyone else on the road as well. You don't automatically deserve to operate a less safe system on public roads just because you are willing to accept liability. Other manufacturers, at least with regard to autonomy, recognize that fact and design products that mitigate risks with a proper safety lifecycle and design domain.


But that's where you're wrong. All data indicate it's SAFER than human drivers. My decision is, on average, making people more safe, not less. https://www.tesmanian.com/blogs/tesmanian-blog/tesla-autopil...


There are serious statistical issues with Tesla's claimed rates, but even so autopilot != FSD and Tesla FSD is, imo, currently benefiting from the left-hand side of this chart:

http://safeautonomy.blogspot.com/2019/01/how-road-testing-se...

Their disengagement rate is so high that as it stands it keeps drivers vigilant, but as the system improves driver vigilance WILL fade, and without robust mitigations FSD will become less safe than a human for a considerable amount of its development.


> Tesla's system requires less supervision than other manufacturers.

> I need to remain mindful of the system at all times

> and it really will function autonomously

To me this is a contradiction. It's said to both function autonomously and to require constant supervision. And that's why I think it's marketed incorrectly despite it being an interesting piece of technology.


If Tesla really believes it "really will function autonomously" then they shouldn't have a problem assuming liability. Further, if it needs to be supervised, then it's not autonomous, is it?

Congratulations on being so accepting of being sold a bill of goods, though.


There are easier, faster, and cheaper ways to reduce traffic accidents than relying on hopium AI to solve all our driving problems, starting with a reduction in driving and more transit.


[flagged]


> I'll start listening to the mass transit hopium

It's odd that we are talking about 'hopium' when mass transit is already a thing in many countries.


And in rural America it decidedly is not. Tesla Autopilot/FSD, on the other hand, decidedly is.




