Musk Wanted to Spy on Drivers to Defend Tesla from Lawsuits (cleanenergyrevolution.co)
182 points by unsolved73 on Sept 19, 2023 | 203 comments


> By recording driver behavior, Musk could argue that driver error led to the accidents rather than Tesla technology

We really need the US government to iron out the liability questions around self-driving. It's kind of a joke at this point.

Tesla is offering people the ability to NOT DRIVE. In the case of an accident, they want to prove that the driver was being negligent by NOT DRIVING.

Or you can take GM's approach and literally have cameras pointed at your eyes to make sure they are looking at the road and a steering wheel that can tell when you let go. And disable your ability to NOT DRIVE, whenever they detect you NOT DRIVING.

In comparison, Tesla is at least being an "honest liar". They are giving you what you want, which is borderline illegal, and will rat you out for using it the first chance they get.


> will rat you out for using it the first chance they get

In one fatal collision, Tesla held a press conference to throw blame at the driver and away from their autonomous systems.

They pulled telemetry and stood up at the conference and said that Tesla was not at fault because "the driver had been inattentive and the vehicle had warned him of this".

What later came out was that the vehicle had warned him to keep his hands on the wheel...

Once. Eighteen minutes prior to the collision.


But the way human bias works in tribal disagreements, it doesn't matter much whether the statement was correct: the fanboys will believe it and the haters won't, at least in the short term.

Musk was smart to spend that much time cultivating a fanbase who will believe stuff like that.


Serious question for fellow Tesla drivers: Does anyone currently use FSD? What's the value prop? I just got a Model Y and tried it out (it came with a 3-month free trial) and found it awful. To the point of this comment -- it's called "Autopilot" but it yells at you (and punishes you!) if you look at your phone or stop paying attention or take your hands off the wheel.

I just can't imagine this being something I'd pay an extra $199/month (or $15,000) for after the trial's over.

I much prefer the Waymo / Cruise vision of a car I can sit in as a passenger, not this weird hybrid "still need to pay full attention but have none of the control of being an actual driver" version Tesla is offering.

But maybe I'm missing something...?

EDIT: I'm a terrible nerd and confused FSD and Autopilot. What I'm really asking about is FSD (Full Self Driving). Sorry about that!


> it's called "Autopilot" but it yells at you (and punishes you!) if you look at your phone or stop paying attention or take your hands off the wheel.

There are many things to break down here:

1. FSD mostly sucks (as commenters have alluded to). Autopilot is quite good, although it does “fuss at you” for the things you mentioned.

2. Autopilot is a misnomer. What the feature as documented does versus what the name suggests are not the same. Imho, this is actually a marketing failure by Tesla — calling it “Advanced Cruise Control” or some made up name would have been just fine.

3. You should keep your eyes on the road when using advanced cruise control.

4. Related, you should not be looking at your phone while you are driving (with the exception of maybe a quick look at a message when you’re stopped).

5. You actually can take your hands off the wheel, often for quite a bit of time, while in autopilot. Also, just touching it is not enough — there needs to be some tension on the wheel. This is not hard to do, and is a reasonable ask imho.

6. To stop the car from fussing at you about a few things, I recommend getting a little slide cover for the interior camera (available on Amazon for $5-10). Works like a charm.


> little slide cover

The car will scream at you if you cover the camera while using FSD. Very Black Mirror.


This might be specific to FSD.

No problems with AP.


100% agreed - yes and you helped me realize that I was confusing FSD and Autopilot. Everything you've said about Autopilot is correct and accurate and I agree. What I really meant to ask about is FSD -- thanks for the clarification!


"Autopilot" is kind of a funny term because pilots definitely still need to pay attention to other aspects of the plane while flying


> But maybe I'm missing something...?

I think so, because FSD != Autopilot, and if it's nagging you constantly about paying attention you're not in FSD.

I'm not arguing that FSD is any good or worth it, I'm just suggesting you didn't actually use Tesla "FSD".


Waymo / Cruise is limited to San Francisco, and isn't generally available right now. FSD is available today and can access freeways for long distance driving. SF to LA or NYC to Boston, say. FSD > Autopilot. Even if you have to pay attention, it still makes the drive easier - at the end of the drive you're less exhausted.


Yeah I hate the idea of semi-autonomous vehicles (a vehicle that can handle any kind of situation all by itself [terms and conditions may apply]). You can't make a 95% solution in this space.


I suspect the "out of the loop" problem [1] is going to make the last 10% very dangerous.

> The out of the loop performance problem arises when operators suffer from complacency and vigilance decrement

[1] https://www.frontiersin.org/articles/10.3389/fnins.2017.0054...



> https://www.tandfonline.com/doi/abs/10.1080/19439962.2023.21...

> Although [Tesla] Level 2 vehicles were claimed to have a 43% lower crash rate than Level 1 vehicles, their improvement was only 10% after controlling for different rates of freeway driving.

but even ignoring the statistical issues, Tesla Safety Reports are a measure of the safety of Autopilot AND a human driver.
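
A toy illustration of the road-mix confounder behind that quoted 43%-vs-10% result (all numbers here are hypothetical, chosen only to show the mechanism, and are not drawn from the paper):

    # Hypothetical per-mile crash rates: freeways are much safer than city streets.
    freeway_rate, city_rate = 1 / 5_000_000, 1 / 500_000
    # Hypothetical road mixes: the Level 2 fleet drives mostly freeway miles.
    l2_rate = 0.9 * freeway_rate + 0.1 * city_rate
    l1_rate = 0.4 * freeway_rate + 0.6 * city_rate
    print(1 - l2_rate / l1_rate)  # ~0.70: a "70% lower crash rate" from road mix alone

Even with identical safety on every road type, the raw comparison favors whichever fleet drives more freeway miles, which is why controlling for road mix collapses the headline number.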

Community sourced critical disengagement data (because Tesla refuses to release industry standard metrics for AV performance) shows that FSD Beta alone is far far worse than the average driver: https://sites.google.com/view/fsdbetacommunitytracker/home?p...


A self-reported, unaudited number published by the manufacturer claiming the product they are selling is safe is not data.

Tesla’s “vehicle safety report” consists of publishing three ratios of miles/accident every three months. That is it. They do not even publish the number of miles or the number of accidents used in their calculation. You would fail your elementary school science fair project if you made a report like that.

This is what a real safety report looks like: https://storage.googleapis.com/waymo-uploads/files/documents...

The differences are stark. The level of Tesla’s report is so far below the minimum standards of acceptable conduct that either everybody at Tesla safety is an utter moron or they have deliberately published a report that even a science-minded high schooler would be ashamed to submit in an attempt to deceive customers.

Oh, and there is a trillion dollar conflict of interest, the report is totally self-serving, they have no independent confirmation, they reject working with independent auditors, they hide information from third parties, and they demand the government redact information from public reports.

tl;dr That report is bogus self-serving lies.


I am open to studies as rigorous as you demand indicating Autopilot/FSD is more dangerous than human drivers. But until those are produced, total miles and total accidents seem to be pretty objective, and easily tracked, numbers. There's really no magic there. And on that count, Tesla makes very safe machines.


You are open to rigorous studies proving it is dangerous? This is a safety critical system; the default assumption is that it is unsafe and rigorous independent studies must be made to prove it is safe. You are guilty until proven innocent when making dangerous devices, that is how we threaten companies into not killing people for their own greed.

The utter contempt for rigor and acceptable standards on safety reporting Tesla has demonstrated is appalling. Your average elementary school science fair project is more transparent and robust. I am not joking or exaggerating on that assessment. They deliberately choose to not even publish the number of accidents or the number of miles used in their division operation. It is hard to overstate how ridiculous this is. Even Philip Morris had to try harder when claiming cigarettes were safe.

Tesla’s numbers are total and utter garbage.

But, since you want rigor. Tesla has reported ~350M miles on FSD. They have over 700 crashes with no known survivors attributable to FSD officially reported and acknowledged by Tesla to NHTSA. This results in a worst case estimate of 1 fatality per ~500,000 miles, 10x more fatalities than their purported accident numbers and 120x worse than the average driver who has 1 fatality per ~60,000,000 miles.
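
Checking that arithmetic (a minimal sketch using only the figures in this comment, which are the commenter's and not independently verified):

    # Worst-case estimate implied by the figures above.
    fsd_miles = 350_000_000       # ~350M miles reported on FSD
    fatal_crashes = 700           # crashes with no known survivors reported to NHTSA
    miles_per_fatality = fsd_miles / fatal_crashes  # 500,000 miles
    avg_driver_miles = 60_000_000                   # ~1 fatality per 60M miles
    print(avg_driver_miles / miles_per_fatality)    # 120.0, i.e. ~120x worse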

I have now provided a more rigorous proof of danger than the Tesla safety report. I have provided the numerator and denominator as drawn from official Tesla and government sources. My report is more transparent, robust, auditable, and conflict-of-interest free than Tesla’s safety report.

If you or Tesla would like to dispute the number of fatal crashes on FSD, then for each such crash provide a report indicating if FSD was not on or all occupants survived. That is the bare minimum level of proof required, which is still less than what Waymo already reports voluntarily without being forced to. Until then, Tesla’s auditable safety record of 120x more dangerous than the average driver speaks for itself.


LOL - 700 crashes with no known survivors attributable to FSD? You're claiming 700 people have died while operating FSD? Friend, that's not a worst-case assessment. That's just a lie.

But to be sure, there will be people who die. That is true of nearly every human activity. We have to accommodate a non-zero death rate. Were we not to, we'd immediately outlaw driving altogether, which would impoverish millions. Your angry "prove it's safe" argument is an ideological position not grounded in any useful reality. Their numbers evidence that the technology is very safe. It is not perfectly safe because nothing is.


So not a single one of the 700 crashes with no known survivors that Tesla reported to NHTSA, ~95% of all crashes they have reported, had any fatalities? That sounds really easy to prove. Tesla can just get some lawyers to query the police reports. The police reports will record any injuries and fatalities and then Tesla can update their NHTSA incident reports to fill in the police report boxes that they chose not to fill in during their initial and follow-up reports.

Or are you arguing that Tesla, who knows exactly where every crash occurred and who is the nearly trillion dollar company, has no obligation to do any investigation? Instead it is the public's job to reverse engineer Tesla's crash information from the redacted reports and then donate time and money to do it for Tesla.

Furthermore, you have completely deflected on providing any support for the numbers in the Tesla safety report you keep quoting. Tesla's safety report claims 4.8 million miles per accident. You and Tesla have exactly zero evidence supporting that number except for Tesla's word that Tesla's cars are super safe which is why you should give Tesla money. Tesla will not even answer a simple question like how many accidents occurred in each quarter, let alone proper reporting like Waymo who indicates exactly which NHTSA reported crashes they corresponded to and the nature of the crash. You complain about publicly auditable numbers reported by Tesla to NHTSA while pushing completely unsupported numbers self-reported by the entity with the largest possible conflict of interest as if they are facts.

Also your last point railing about proving things safe is a strawman. Proving something safe does not mean "proving it perfectly safe", it means demonstrating the desired level of safety through rigorous investigations producing well-documented, auditable evidence confirmed through unbiased, third party access to the data and data generation methodology. Which is already the long-standing standard procedure in safety-critical systems in industries such as aviation, medical, automotive, transportation, and industrial control. It does not mean publishing an unaudited number while doing no investigations and preventing third party or regulatory access to the raw data like what Tesla does. That is called safety reporting malpractice.


I think it's quite likely that, if people had died, the press would have covered it (as they have so salaciously covered the accidents). I have no reason to believe Tesla is lying. But, before we barrel down that path, why don't you tell all of us what level of death would be acceptable to you. Would you be willing to endorse FSD if it were, on average, as safe, or more safe, than human drivers?


If any autonomous driving system could demonstrate safety at or above human drivers, supported by a robust dataset generated from a rigorous investigative process and made available to credible, unbiased third-party watchdogs or regulatory agencies that confirm the findings, then I would support the further testing and deployment of such a system.

Now your turn, how many people has Tesla ADAS software killed and what is the acceptable rate?

As to the data reporting I demand, which is longstanding standard in safety-critical system deployments, Tesla is literally the furthest from that. They deny all third party access, voluntarily publish no raw data, demand the maximum redaction in all mandatory government reporting, and hide data except when selectively releasing private data out of context in press conferences only when it portrays them positively and burying the rest. They choose to do no investigations so they will not be required to report negative outcomes, they sue reporters, threaten news organizations and employers with lawsuits to make them silence their employees, and direct abuse against regulators to force them to recuse themselves from investigations. Tesla uses every trick in the book to prevent a critical look at their numbers.

There is exactly zero reason to believe even a completely transparent company about the safety of their own products. A company as dishonest and cagey as Tesla providing numbers with zero supporting evidence, who has a history of cherry picking misleading good-sounding data, should have their numbers resoundingly ignored.


Nah, I think they've produced sufficient evidence to shift the burden. Where's your evidence of all these deaths? This is safety critical in the way that all cars are safety critical. This is not an airplane, and this is not the FAA. If we're going to drag Tesla through the mud, as you seem hell-bent on doing, I think it should be on you to evidence this epidemic of Tesla deaths. You cannot because there is no such epidemic.


You and Tesla have not provided any evidence. All you have done is point to self-reported numbers with absolutely no supporting data.

You cannot even tell me the number of accidents reported or the number of miles driven on Autopilot in Q4 22 used to calculate Tesla’s number. That is the numerator and the denominator, really basic stuff that even an elementary school science fair project would report.

Given your total refusal to supply any data or evidence at all to support your points, do you even know anything beyond the Tesla talking points? Please reply with the number of confirmed fatalities (or fatal crashes) in the US while Tesla ADAS systems were in use (hint it is non-zero).


If it works for 99,999 seconds for every second it fails, then I should expect it to fail before 30 hours of use. That's abysmal.


True. But at a hypothetical 40 mph average, one accident per 6.57 million miles is one accident every 164,250 hours. There are 8,760 hours in a year. These are exceptionally safe machines.


compare:

“If it works for 99,999 microseconds for every microsecond it fails, then I should expect it to fail before 0.1 seconds of use.”

with:

“If it works for 99,999 hours for every hour it fails, then I should expect it to fail before 11 years of use.”

This comparison can be twisted to make any conclusion you want.
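
To make the unit-sensitivity concrete, here is the same 1-in-100,000 failure ratio read in three different units (pure arithmetic, no data behind it):

    ratio = 100_000  # units of successful operation per unit of failure
    print(ratio / 3600)       # ~27.8 hours to expected failure, if the unit is seconds
    print(ratio / 1_000_000)  # 0.1 seconds, if the unit is microseconds
    print(ratio / 8760)       # ~11.4 years, if the unit is hours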


A car accident lasts for something on the order of 1 second, maybe a bit more or a bit less but no car accident lasts for a solid hour. I'm not concerned about one complete hour of failure, I'm concerned about one second of failure.


Who could possibly imagine something called Autopilot wouldn't require you to be in full control of car at all times?


An interesting bit of history is that autopilot was one of the first common terms for cruise control, and was in use until at least the mid 60s.

That is far enough back to not have any relevance to Tesla's current customer demographic though.


Also airplanes use autopilot and require full control at all times. It's almost as though there are no situations where the term autopilot refers to a fully automated system. Who knows where the general public today got that impression. Perhaps science fiction stories.


Airplanes use a variety of modes of autopilot systems and some modes absolutely do not require the pilot to touch the controls. They do require the pilot to set the mode and its parameters in the first place, and it would usually be considered a serious safety violation if both pilots were to get up and leave the cockpit while autopilot was engaged, but the technology is definitely good enough to handle this situation.

This is not true for Tesla's FSD. Not by a longshot.


> some modes absolutely do not require the pilot to touch the controls.

Autopilot can disengage and return control to the pilot at any moment. Pilots have to be on alert at all times. This has resulted in some catastrophic outcomes, such as AF447.


Article is referring to Tesla's Full Self Driving feature not Autopilot.


As a frequent user of autopilot, I can assure you it is nothing close to NOT DRIVING. It is advanced cruise control and you need to be attentive. Even full self driving does many many things imperfectly, requiring you to take over. I'm not even talking about dangerous situations, just regular driving. It often won't move over when you want, it's too rough taking off ramps, and so on.

If you want to go down the road of regulation, I would like to see ALL driving assists subjected to the same laws. So no cruise control for you without cameras watching your eyes and all the rest. Though personally, I think it's fine as it is, making it very clear that you are still driving. Tesla doesn't do well at the latter in their marketing.


> As a frequent user of autopilot, I can assure you it is nothing close to NOT DRIVING.

And yet they're allowed to market it as "auto pilot" which is clearly an "I'm not required to be attentive" message. I agree, though, it's advanced cruise but not true driverless driving. It's not even worth it, to me. If I have to not only monitor the road but "someone else's" driving, that's more work than just driving.


> If I have to not only monitor the road but "someone else's" driving, that's more work than just driving.

This is how I feel about FSD.

Autopilot, on the other hand, is really good for on the highway/interstate.

Speed regulation, maintaining spacing, and lane assist are all smoothly implemented. It is certainly less work than driving, typically by a lot.

I say this with about 20k miles of autopilot usage in my model y.


The semantic difference between "cruise control" and "auto pilot" seems minimal to me. Imagine you have never heard of the term "cruise control" and a car company used the term.


Nope, Tesla is not offering people the ability to "NOT DRIVE". They might do so eventually, but all that they actually offer today is an advanced driver assistance system.

There is no need for the US government to make any changes to liability laws. Almost all such cases are handled at the state level.

Mercedes-Benz has gone quite a bit further and is now offering a true SAE level 3 autonomous driving system. It is significantly more advanced and reliable than the Tesla system. And they stand behind it by assuming liability for collisions.

https://media.mbusa.com/releases/release-1d2a8750850333f086a...


> Nope, Tesla is not offering people the ability to "NOT DRIVE"

This is technically correct but they, and especially Musk’s public comments, are pretty fast and loose around that. He’s been claiming L5 will be available in 1-2 years since 2016:

https://arstechnica.com/cars/2021/05/tesla-autopilot-directo...

Remember when they claimed that the driver in their promotional video was only there for legal reasons because the car was driving itself? Someone who paid extra for FSD back then has probably already sold the vehicle without ever getting it:

https://www.bloomberg.com/news/articles/2023-01-19/elon-musk...


https://twitter.com/elonmusk/status/1677531425795391489

https://twitter.com/elonmusk/status/1683348253763330048

Elon Musk openly replying to FSD users stating the car drives itself just a few months ago.

Also the promotional video was faked and staged. Ashok Elluswamy, the current head of Autopilot software, stated in a deposition under oath that he was personally involved in faking and staging the demo. During at least one of the test runs it ran off the road into a fence. They still wrote: “The driver is only there for legal reasons.” despite that.


Yeah, I can only imagine their lawyers scrupulously maintain a record of what they told Elon before he did the opposite so they can’t personally be held accountable.


> Nope, Tesla is not offering people the ability to "NOT DRIVE". They might do so eventually, but all that they actually offer today is an advanced driver assistance system.

That's not really what they are selling though, and that's one of the problems.


>Mercedes-Benz has gone quite a bit further and is now offering a true SAE level 3 autonomous driving system. It is significantly more advanced and reliable than the Tesla system.

Is it? From your link...

>On suitable freeway sections and where there is high traffic density, DRIVE PILOT can offer to take over the dynamic driving task, up to speeds of 40 mph...DRIVE PILOT available in the U.S. for model year 2024 S‑Class and EQS Sedan models, with the first cars delivered to customers in late 2023

So very limited driving scenarios and extremely limited product availability.


It is restricted to 40 mph (~64 km/h) and to just a few limited access roads (a few sections of the autobahn).

This means that it is useful only in dense, slow-moving traffic. That is scarcely better in practice than my 2015 Model S with first generation Autosteer.


It is also level 3, which means you can watch a movie on your phone or send emails. Something you can't do with a Tesla even if you purchased the $15k FSD package.


And when the car asks you to take over how long do you have to respond? Can someone do their knitting while the car drives and be able to put that down and take over before the car decides that it has to come to a safe stop?


Ten seconds per regulations.


If I'm immersed in doing something else, I wonder if that is really long enough.

I'm sure it's some kind of progress but it seems to me that being restricted to 40 mph rather reduces its utility. Traffic on motorways is usually travelling faster than that. This means that it will be used quite rarely which makes me think that this is essentially a PR exercise on the part of Mercedes. Also the press releases say nothing about how the car and driver handle the transition from Level 3 below 40 mph to level 2 above and the reverse.


FSD does use the internal camera to enforce eyes on the road hands on the wheel. It’s actually worse for road-trips on the highway than plain autopilot for this reason, aside from the fact that FSD will automatically pass and change lanes.


Autopilot will nag you every 30 seconds if it doesn't detect hands on steering wheel.


Hilarious typo. I’m imagining the vehicle saying things like “You suck, you’re so lucky I’m the only car that will ever love you.”


Acktshually, negging would be a tad bit more subtle than that. It is a form of backhanded compliment.

I imagine a negging Tesla car would say something like "it's so cool to see a person who has so little to lose they don't care whether they live or die" when you take your eyes off road and/or hands off wheel.


I find the time varies, and that the amount of attentiveness required is much lower than that for FSD. FSD will ding you very quickly if you have a device in your hand, especially compared to plain autopilot.

My car came with an internal camera that didn’t have IR lights, which made FSD completely unusable at night time (they put in a new one under warranty).

I only subscribe to FSD every few months for long trips, but I have a 1400-mile road trip planned this weekend and I will be using plain autopilot.


There are many well known workarounds for that though since the detection is just an accelerometer.


I recommend you read the biography and not some blog taking an entire chapter and turning it into a few sentences out of context; there was a lot more nuance in the book.


[flagged]


> The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that,

1) no one else calls it FSD or anything like that. Every other car manufacturer with driver assistance tech calls it what it is.

2) other manufacturers define safe constraints on their tech and then enforce those constraints

> , but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.

Super Cruise and BlueCruise are both hands free. Mercedes Drive-Pilot is licensed in California as Level 3 and won't require the driver to pay attention.


> FSD

The F here stands for 'full', but...

> I need to remain mindful of the system at all times

It sounds like it should really be called supervised self driving (SSD) to me.

I think the technology is interesting but I wholeheartedly object to the name and the promises that it implies.

> If you pin the failure of a driver to oversee FSD on Tesla

I think it's reasonable to pin the failure on the system if you call it FSD.


Tesla's system requires less supervision than other manufacturers. And that's where the rub is. They're saying -- hey, we'll make this tool available to you, and it really will function autonomously, but you have to know Tesla's not going to take the hit if there's an accident. In America's litigious society, I think that's the only way we'll ever get these tools. Otherwise, the plaintiffs' lawyers will destroy Tesla. I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and for the liability to transfer to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.


> I'm comfortable assuming the risk if there's a fuck-up.

Well, here's the rub. The risk is not just ours. It also involves others.

If said "fuck-up" is spilling some tomato sauce on the carpet, then sure, we can say "my bad," and take out our checkbook. It's fairly certain that the other person the risk exploded on, will accept our amends.

However, if it is running over a child, I don't think the checkbook thing will work.


> I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and that the liability to transfer to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.

It isn't just about you, it's about everyone else on the road as well. You don't automatically deserve to operate a less safe system on public roads just because you are willing to accept liability. Other manufacturers, at least with regard to autonomy, recognize that fact and design products that mitigate risks with a proper safety lifecycle and design domain.


But that's where you're wrong. All data indicate it's SAFER than human drivers. My decision is, on average, making people more safe, not less. https://www.tesmanian.com/blogs/tesmanian-blog/tesla-autopil...


There are serious statistical issues with Tesla's claimed rates, but even so autopilot != FSD, and Tesla FSD is, imo, currently benefiting from the left hand side of this chart:

http://safeautonomy.blogspot.com/2019/01/how-road-testing-se...

Their disengagement rate is so high that as it stands it keeps drivers vigilant, but as the system improves driver vigilance WILL fade, and without robust mitigations FSD will become less safe than a human for a considerable amount of its development.


> Tesla's system requires less supervision than other manufacturers.

> I need to remain mindful of the system at all times

> and it really will function autonomously

To me this is a contradiction. It's said to both function autonomously and to require constant supervision. And that's why I think it's marketed incorrectly despite it being an interesting piece of technology.


If Tesla really believes it "really will function autonomously" then they shouldn't have a problem assuming liability. Further, if it needs to be supervised, then it's not autonomous, is it?

Congratulations on being so accepting of being sold a bill of goods, though.


There are easier, faster, and cheaper ways to reduce traffic accidents than relying on hopium AI to solve all our driving problems, starting with reducing driving and expanding transit.


[flagged]


> I'll start listening to the mass transit hopium

It's odd that we are talking about 'hopium' when mass transit is already a thing in many countries.


And in rural America it decidedly is not. Tesla Autopilot/FSD, on the other hand, decidedly is.


Yet another in the growing list of reasons to never buy a Tesla. I would go further: that Musk has this instinct tells me strongly to avoid using anything he has his fingers in.


While avoiding a Tesla will avoid Musk, it won't avoid privacy nightmares - Mozilla did a report on every car brand, and every car brand is terrible on privacy "They’re all bad" - https://foundation.mozilla.org/en/privacynotincluded/article...


Might be all bad, but even from your own link some are worse than others:

"Tesla is only the second product we have ever reviewed to receive all of our privacy “dings.” (The first was an AI chatbot we reviewed earlier this year.) What set them apart was earning the “untrustworthy AI” ding. The brand’s AI-powered autopilot was reportedly involved in 17 deaths and 736 crashes and is currently the subject of multiple government investigations."


Yes, this is why I won't be buying any modern cars. But my stance with Musk is deeper: I won't be buying anything from a Musk-involved company.


Agreed. Sadly, if you're a US taxpayer, you're giving him quite a lot of business via SpaceX without a choice in the matter :(


They're all bad but I'm reasonably certain that any car company not run by Musk won't respond to public complaints by combing through my data and publishing whatever they find to Twitter and calling me names.


I used to think that Musk and Tesla would die a death by a thousand cuts to their brand, but it turns out that public perception is more of an anti-fragile system, so every scandal they survive only makes them more resilient to the next scandal.


I've had the same thought about some US politicians recently. I think there's an element of feeling shame that shapes public perception of a scandal. More and more we're seeing people who refuse to publicly express remorse, or even acknowledge that something they did was wrong or should be considered wrong.

And the way that public perception works, if you have enough "True believers" who shout loudly when someone tries to speak ill of you, it all seems to be just water off a duck's back.


I'm no fan of Musk, but stories like this where "X wanted to do Y but actually didn't" are not a "scandal" or even bad in my books. This stuff happens all the time behind closed doors, and it seems like in this case the checks and balances in the company prevented it from happening, which is what I care about.

If they actually did this, then I'd think twice. But I think what you are seeing is more that the true haters who bring these "didn't actually happen" things up all the time don't have a huge effect on people buying the product.

The closest thing that actually worried me was the recording of the external cameras. Yes they can record stuff in your garage which isn't great. But I don't actually care at all about them recording stuff on the street around me and sharing that. I WANT them to do that to improve the system.


Yeah, like Musk, it is a cult of personality. Results don't matter, actions don't matter. All that matters is preening for a crowd who just wants to hear the hits.


It’s a cult of identity. If you say you are a Musk/Trump/Whoever fan and make that a core of who you are, and then they do something bad, that might transfer to your identity. Better to constantly refute than take damage to your own identity. Better to lash out than look inside.


You generally won't feel shame about something that you don't believe is wrong, so that gives people mountains of leeway to rationalize whatever they want if they are willing to lie to themselves to get their end goals.


Yes, but in the context of a politician (or another public figure) you have historically been obligated to accept the moral standard of the public at large (because you need their vote.) Even if you didn't believe something was wrong, you'd have to at least feign the appropriate remorse so that people would back you again.


So when a politician comes out against drag shows because she thinks they're too sexual for kids, and then gives out a handjob in a theater among a bunch of kids, or calls her political opponents groomers while staying silent about her own party's attempts to codify child marriage or block legislation that would make it harder, what does that say about her voters?


> More and more we're seeing people who refuse to publicly express remorse, or even acknowledge that something they did was wrong or should be considered wrong.

This is because US politicians are loath to take action against their colleagues in legislative chambers, fearing retaliation. So they depend on shame, or "the honor system", hoping that the bad apples will simply step down.


Because about 90% of these 'cuts' are things most people simply don't care about.

One comment of Musk doing something that online 'leftists' don't like will have far, far, far more reach than some technical point about the driver monitoring system or privacy concerns.

And at the same time all the other companies have just as many 'cuts'. VW is still popular and they don't have cuts but rather literally holes in their brand.


I am surprised he just doesn't ignore the FAA and launch his rocket anyways. Not like anyone is going to do anything about it.


He has done that of course: https://www.theverge.com/2021/6/15/22352366/elon-musk-spacex.... And yes nobody did anything about it.


I mean he did before, I'm surprised he's not continuing to ignore them. Maybe the lawyers got through to the other investors in SpaceX.

https://www.theverge.com/2021/1/29/22256657/spacex-launch-vi...


He got away with the weed thing which IIUC would have revoked someone’s security clearance. Maybe I’m misremembering.


The better question would be: Does he own any guns?

I only ask because a relative of a prominent political figure is currently having the book thrown at them for owning firearms and doing drugs.

(It goes without mentioning that the host of the podcast in question definitely owns guns, and definitely consumes illegal, Schedule 1 drugs. That's a five-year-minimum-prison-term felony, yet somehow, he's still on the air...)


As we learned from a certain governor "hiking the Appalachian trail", some people get treated differently.


You’d lose your security clearance over legally putting a marijuana cigarette in your mouth and demonstratively not inhaling, in the context of an interview made for entertainment purposes?


Yes; that's what a Schedule I drug is. Possession is a federal offense, and the intent does not matter at all. Violation of federal laws can and will get your clearance revoked, even if it's legal in the state where the offense was committed.


Part of becoming an adult is learning that all laws, including hardline federal laws, are entirely subject to the discretion of the human enforcers, who are subject to their own game theory of survival.

Absolutely no law is enforced 100% of the time on everyone equally. There's always a political/social calculation being made and balanced. It's childlike naivety to believe that perfect enforcement ever could (or should) be the case. Akin to believing Santa knows if you've been bad or good so be good for goodness sake.


What you're saying is that it's childish to believe rich people should be subject to the same treatment as everyone else. If that's the case, then everyone that's not childish is kind of a shitty human being.


I know this. The rich people we as a society worship get away with it. And the darker your skin, the thicker your accent, or the more marginalized your group, the more likely you are to get in trouble.


I think you're confusing the point of security clearances. There's no blackmail risk in that drug use, and so it doesn't interfere with the clearance.


Not even the lowest level employee with a security clearance could be blackmailed for smoking weed, but you can bet your ass they would get their clearance revoked for it if they did it.


I think you oversimplify the point of the security clearances.


Obviously nobody should be punished for that, the point is that most people would be, and he got special treatment. The solution is to change the law, rather than just not applying it to billionaires.


I don’t think I agree with this, as someone in the world of security clearances. It would probably just end up being something you need to explain and mitigate when it comes to your next reevaluation.


Yes. Possession of any amount is federal misdemeanor with fine up to $1000 for a first offense. Flagrantly violating the law tends to get your security clearance revoked, and rightfully so. You demonstrated you can’t be trusted to follow law.

https://www.criminallawyersandiego.com/california-marijuana-...


If you own a firearm and smoke weed it is a felony.


"Legally" is doing an awful lot of work in that sentence. It was not federally legal then and would not be today. (I think it should be and wish it was on behalf of others, but it isn't.)

The security clearance in question is federal.


There's no legal weed in the US. Not when dealing with federal authorities.


It is not legal. I disagree with the law but it is not legal at the federal level to have marijuana.


That was an overstatement, as someone who currently holds and knows many people with clearances.

By the way, do you think a security clearance should be revoked for consuming marijuana?


Should a security clearance be revoked for consuming marijuana, ketamine, cocaine, meth? Where do you draw the line?


Marijuana. Same way it's not revoked for alcohol unless you are dependent or abuse it.


One would think it would refer to federal law instead of personal ideas or local customs (what if a city said all of them were legal—could you do it then?).


Well I can tell you that most people wouldn’t have their security clearance revoked for using marijuana, instead likely rebuked or made to agree to abstain in the future. It would be more serious if illegally purchasing or distributing, of course.

Would you be making this argument if it was anyone else?

Since there’s no decriminalization of those other substances in the USA, I don’t think this is particularly relevant. There’s a reason people are up in arms about MJ scheduling alongside these other substances. Additionally, I know this might sound very general, but the security clearance process is extremely subjective. There is no published set of circumstances for what gets a clearance denied or revoked and what simply gets addressed and mitigated. And I think that’s a good thing. In this case, I don’t think Elon’s partaking in any way indicates, jeopardizes or threatens his loyalty to the US, the likelihood that he could be blackmailed, or his ability to keep secure data secure.

Would you be making the same argument if it wasn’t Elon, but a close friend from school that hit a doobie on his buddy’s local team’s podcast?

Sorry for the ramble, but the clearance process is one I’m very familiar with, and I just see a lot of assumptions and misunderstandings in these threads.


I was tested for marijuana just to get a low-level job at a defense contractor.


That didn't stop people from using Uber did it...


I don't use Uber simply because I prefer to avoid doing business with companies that gleefully and intentionally break the law.

I'm not concerned with what others do. I only have control over my own decisions.


Yup.

It might be OK if the drivers could also spy on Tesla's development, or at least the internals of its cars, but Musk would certainly go into a rage against that...

On avoiding anything built by Musk, agree that is a hard NOPE. And I've defended Musk's accomplishments on here multiple times, but before he revealed his technical cluelessness and authoritarian sympathies. Sad, really.


When I learned about that interior camera in my Model Y, this is exactly what I thought might happen. But, I didn't want to cover it, because I wasn't sure if that would look suspicious in the case I was in an accident.


> I wasn't sure if that would look suspicious in the case I was in an accident.

No reasonable person would fault you for covering the camera. Assuming you still have the car, you have endless ammo for doing so, including the fact that Tesla employees were caught sharing video that was supposedly private.

https://www.reuters.com/technology/tesla-workers-shared-sens...


People complain that Tesla doesn't do enough to prevent humans from defeating driver assist systems and then also complain about the privacy issues. Can't have it both ways. Internal cameras are used to confirm you're attentive and your eyes are on the road.

Tesla employees being overprivileged with regards to consumer data access is a corporate policy, legal, and governance issue, not a tech issue. Like any other company, if consumer data is accessed in a way exceeding legitimate business purposes, someone should be fired.

EDIT:

https://www.tesla.com/ownersmanual/modely/en_us/GUID-EDAD116...

> The cabin camera can determine driver inattentiveness and provide you with audible alerts, to remind you to keep your eyes on the road when Autopilot is engaged.

> By default, images and video from the camera do not leave the vehicle itself and are not transmitted to anyone, including Tesla, unless you enable data sharing. If you enable data sharing and a safety critical event occurs (such as a collision), Model Y shares short cabin camera video clips with Tesla to help us develop future safety enhancements and continuously improve the intelligence of features that rely on the cabin camera. Data may also be shared if diagnostics are required on cabin camera functionality. Cabin camera does not perform facial recognition or any other method of identity verification. To protect your privacy, cabin camera data is not associated with your vehicle identification number.

> To adjust your data sharing preferences touch Controls > Software > Data Sharing > Allow Cabin Camera Analytics. You can change your data sharing settings at any time.

(own Teslas, you are prompted to agree to different data sharing configurations before it is enabled)


>People complain that Tesla doesn't do enough to prevent humans from defeating driver assist systems and then also complain about the privacy issues. Can't have it both ways. Internal cameras are used to confirm you're attentive and your eyes are on the road.

Tesla took the cheap way out. Ford Bluecruise uses IR sensors to detect whether you're paying attention or not, as does GM Supercruise. They don't produce potentially embarrassing photos and videos. They also don't send that video back to a mothership for employees to share your private/embarrassing moments.


I thought the IR sensors also use cameras, just with infrared emitters to see in the dark?


You can have it both ways: the internal cameras can be used to compute attentiveness entirely within the car's local computer, any imagery could be immediately discarded after processing (or better, processed in a way where the imagery and algorithms are hardware-isolated from any computing modules connected to a network or permanent storage). This is similar to how Apple computes a successful face ID or fingerprint scan
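
A rough sketch of that pattern (every name here is hypothetical; the point is only that the raw frame never outlives the local check, and only a derived boolean is kept):

    def attentive(frame: bytes, gaze_probability) -> bool:
        """Decide attentiveness on-device, then discard the image.

        `gaze_probability` stands in for an isolated on-device vision
        model; only the boolean result ever reaches storage or a network.
        """
        result = gaze_probability(frame) > 0.5
        del frame  # the image data never leaves this function
        return result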


"People" are not a unified bloc with a consistent set of non-conflicting opinions.


There is a "have it both ways" solution here. Do all the processing of the video feed in the car.


> But, I didn't want to cover it, because I wasn't sure if that would look suspicious in the case I was in an accident.

Now that's some chilling effect


It's suspicious if you habitually cover it right before the sensors show you're driving like a jackass.

I stuck a sliding webcam cover over it and have had no reason to uncover it.


Nothing like buying yourself a telescreen


Could someone invent a hologram cover for the camera that fools it?


> When I learned about that interior camera in my Model Y

Wait, Tesla cars video their users? Seriously?

I'm gobsmacked users tolerate and agree to this.


Yes, and Tesla employees freely share the video recordings when they come across something they find interesting.


Or did.. hopefully not anymore.


They probably do. Elon's other company apparently has a global map view of every Starlink terminal and its exact location that he watches. Why anyone would trust a product of his beats me...


I covered mine with a bit of post-it note. I'm sure masking tape would do the trick as well.


All new cars have cameras, even ones that can see your eyes behind your sun glasses.


All new cars made to Euro standards; there are still new cars made solely for the US market which don't have these features.


We have cockpit cameras in many vehicles - planes, some trains, other vehicles.

If we are going to have 4 cameras doing outward data collection, it makes sense we'd have them facing inward also.

The question to me isn't the presence of a camera or not for accident reconstruction, but whether it is being actively used when not involved in an accident in a big brother sense, whether user data is being unfairly monetized, whether a user is permitted to disable it in most cases (except accidents), and whether the driver is operating in a professional capacity and must have the camera enabled for personal safety reasons.

On the other hand I have a number of kids, and we have a plan for our au pair to drive the children. For this, it would be useful if we had in-vehicle monitoring and tracking of speed, etc. We don't want an infant being driven to an appointment as if the au pair were in the Indy 500.


Those cockpit cameras are in the context of employment and also the context where all incidents are investigated with at least a stated goal of improving safety.

Personal vehicle accidents are rarely investigated, and when they are, it's almost always with a goal of assigning responsibility.

Why would I pay for something that is going to record me and most of the time be useful most for assigning responsibility to me?

If there was a culture where most/all traffic collisions were investigated and remediations taken, then sure. But it's very rare to have a thorough investigation, and it's even rarer for remediations to be taken. Many locales have 'known dangerous' intersections where there's no consideration of design change. There's no mechanism to provide ongoing training to drivers (in the US).


From the book on Musk:

Musk was not happy. The concept of “privacy teams” did not warm his heart. “I am the decision-maker at this company, not the privacy team,” he said. “I don’t even know who they are. They are so private you never know who they are.” There were some nervous laughs. “Perhaps we can have a pop-up where we tell people that if they use FSD [Full Self-Driving], we will collect data in the event of a crash,” he suggested. “Would that be okay?”

The woman thought about it for a moment, then nodded. “As long as we are communicating it to customers, I think we’re okay with that.”

Glad this guy is the richest man in the world, we are in good hands. It's so comforting to see Musk's attitude when he found out about the privacy team and their temerity to bind him with silly "privacy rules".


Curious why so many people who are sensitive to privacy issues here ignore massive data collection and privacy flaws in Tesla?


Cognitive dissonance


Every car will soon have inward facing cameras.

In the 2021 infrastructure bill, by 2026 cars have to include technology that must “passively monitor the performance of a driver of a motor vehicle to accurately identify whether that driver may be impaired.”

The most straightforward way to implement this is with driver-facing cameras: "Sam Abuelsamid, principal mobility analyst for Guidehouse Insights, said the most likely system to prevent drunken driving is infrared cameras that monitor driver behavior."

https://apnews.com/article/coronavirus-pandemic-joe-biden-te...


The guy interviewed about his camera technology was desperately trying to pitch it.

But I think this is honestly unlikely - watching driver behavior is probably the worst indicator for drunk driving. You are way better off just using the lane telemetry data.


> But I think this is honestly unlikely - watching driver behavior is probably the worst indicator for drunk driving. You are way better off just using the lane telemetry data.

Yep. It's the lane drift that would give the cops probable cause to pull you over for DUI in the first place.

GPS could attest that you were recently parked at a bar.

The most the internal camera might see is an open container.


Pretty sure you’re wrong here. There are already gaze tracking systems deployed in Cadillacs and other high-end cars. It’s probably decently easy to tell if someone is passing out or just not paying attention with a neural net at this point.


> high end cars

Yes. For premium cars that already have this functionality. But I am pretty sure the Mitsubishi Mirage is not going to add state-of-the-art driver tracking technology to comply with the 2026 law.


> watching driver behavior is probably the worst indicator for drunk driving.

I think surveillance is (mostly) the point. I guess we'll see. I'd be happy to be wrong.


For who? The government doesn't specify what technology is required - and most car manufacturers would probably avoid expensive IR image detection just for compliance.


For cars with the various modern assistance features you could probably do driver impairment detection without any interior sensors. Here are some things that might help detect impairment, especially if the car can identify the driver and so compare their performance on this trip with prior trips.

• Data from the lane assist system about how much the driver is drifting side to side in their lane. An impaired driver will tend to drift more (see the sketch after this list).

• Data from the automatic braking system about how quickly the driver is reacting to obstacles. An impaired driver will tend to come closer to needing the automatic braking system.

• For cars that are aware of traffic lights [1], how quickly the driver starts moving again when a red changes to green. Impaired drivers will tend to take longer.

• How quickly the driver starts moving after a car in front of them that was stopped starts moving. That most likely would happen at traffic lights, so similar to the previous item, but does not require the car to know about traffic lights. It just needs to know that there is something in the way and when it leaves, which the collision avoidance system or automatic braking system can likely sense.

• How much variation there is in speed in areas where the speed limit is not changing. How well the driver adjusts their speed when the speed limit changes. How much the driver is speeding. Impaired people are probably more likely to have large variations in speed, be slow to react to changes in speed limits, and more likely to speed.

If we throw in interior behavior too climate control settings might be useful. Some kinds of impairment change subjective impressions of climate, so if a person is setting the heating or cooling to values that are inconsistent with the current weather and their past history in those conditions it could point to impairment.

[1] I remember Audi once announced a system that would watch traffic lights to figure out a good speed to try to hit all the greens, but I don't know if that caught on, or if any other car makers found reasons to include traffic light sensors.
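
A minimal sketch of the first signal, lane-drift variance (the telemetry source, per-driver baseline, and 2x threshold are all hypothetical, just to illustrate comparing a trip against the driver's own history):

    import statistics

    def drift_stdev(lane_offsets_m):
        """Standard deviation of lateral lane offset (meters) over a window."""
        return statistics.stdev(lane_offsets_m)

    def drift_flag(current_offsets, baseline_stdev, factor=2.0):
        """Flag possible impairment when this trip drifts far more than usual."""
        return drift_stdev(current_offsets) > factor * baseline_stdev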


A problem with those types of indicators is that they can often alternatively indicate an attentive driver handling situations the assistance features don't understand.

A driver may seem to be drifting in lanes because the car's sense of the lanes may be wrong, or may not fit what traffic is actually doing. Large variations in speed in areas where the speed limit isn't changing could indicate difficult traffic or road conditions. Slow reactions to green lights could indicate congestion, obstructions or heavy pedestrian traffic (eg, needing to wait for late crossers at each intersection in a busy city). Climate could be an indication of the condition of passengers, especially for people who specifically drive groups because they aren't impaired when the rest of the group is.


Imagine a world where you're considered a drunk driver because you decided you wanted the AC a little cooler and you were a little slower off the line at a stop light than some historical average. Sounds like a wonderful future.


Drifting lanes, sluggish driving, aggressive driving, ignoring or delaying traffic control devices. These are all signs of impaired or distracted driving and none of it requires a camera being pointed at you.


If true, this is horribly dystopic. Why aren't privacy activists up in arms about this?


"infrastructure" China builds infrastructure. The United States is simply Orwellian.


First they used the data to defend Tesla from NYT reporters:

https://techcrunch.com/2013/02/14/elon-musk-lays-out-his-evi...

This is our future unless the govt demands companies stop spying.


Didn't Tesla also use car telemetry data in their feud with Top Gear? I think this goes way back.


And to attempt to throw the driver in a fatal collision under the bus, holding a press conference to let the public know the vehicle was warning him about being inattentive, but not mentioning that it had warned him once, and that that was nearly eighteen minutes before the collision.


This is what the GDPR is for, the same regulation this site loves to hate.


[flagged]


The truth versus abuse trade off is fascinating.


There are several videos on PH of ppl doing the deed while autopilot drives the car for them. Not saying that excuses privacy concerns, but this issue is complicated by the rollout of autonomous features.

Tesla wants your car to do most of the work but doesn't want to be liable if you're not paying attention and/or misusing the feature.


So every company that ever sells me anything should be able to install cameras in my house, because I could potentially use the product incorrectly and then sue them for it, and they need to protect themselves.


I bet Meta would love to be an intermediary between your in-house footage and those wonderful and benevolent corporations


Well they did try to sell cameras for video calling into everyone’s home…


> by a rollout of autonomous features

I thought no car had SAE driving automation 5 (steering wheel optional driving) yet, even though Tesla specifically sells their SAE driving automation 2 (hands off driving) driver assistance as "Full Self-Driving".


> several videos on PH

Philippines? Seems unlikely that Tesla is that common there.

Product Hunt? Even if they added a video feature, this seems a weird thing to post there.

OH! PornHub!


I wish there were a debunking site or a list of common counterarguments for everyone who gives the same excuses to forgive this behavior: I have nothing to hide, everyone’s doing it, I’m not significant enough for them to spy on me, they have to make money somehow, etc. As it stands, I feel like even if they started live-streaming every camera on YouTube, people still wouldn't care.


It's a little poetic: the techbros got the money to buy those Teslas through worse secret privacy violations of everyone else.


Actual news would be him not wanting to do it.


It's not spying if consent is given.

The good thing is that every idea is considered. The bad thing is that every idea is considered.

The difference, in Musk organizations, is that ideas are aired out. You might think that's a bad thing, but would you rather have executives slipping backdoors into your products without anyone knowing?

If you notice, the manager in question wasn't afraid to push back on Musk. That's a big difference between Musk companies and your company: in your company a manager would probably fold and just do it.


What legally keeps them from just putting that you agree to be snooped on in the Terms of Service?


It’s already in the TOS that not a single Tesla owner read


You mean the thing that pops up on the dashboard and threatens to revoke usage of your cellphone-on-wheels if you don't "accept", even after a bill of sale was filed with the state?


I'd be surprised if it's not already there, given that there are already internal cameras.


In Europe, the GDPR.


What does it cost to import to the US a Tesla made for Europe?


Is autopilot really that important to everyone? I can't say I've ever been driving and thought, "I wish I could be doing absolutely nothing right now."


My car has lane centering and radar cruise control. I use it frequently on dry, well-marked highways. It allows me to drive with less mental energy because I don’t have to steer the car or maintain speed.

My hands are always on the wheel and I’m still watching everything as if I was driving. It does a good job under ideal conditions.

This may not seem like much but it adds up over a multi-hour trip.

In rain or snow, I tend to avoid any automation to keep continuous awareness of how much traction my tires have. Too many people have died when automation hides problems until failure and then the squishy thing behind the controls gets surprised.


I use it because the Tesla infotainment system is garbage and quite limited compared to Carplay/Android Auto. And I need to look away and at my phone to use proper apps like Waze, Apple Maps, Overcast, etc.

The onboard Spotify app, for example, doesn't do HD audio and doesn't load playlists completely. Tesla, rather than giving customers what they want, would prefer they deal with their limited in-house system.

$12k/$15k for FSD never made sense to me. Tesla is moving to subscriptions, where development comes at $200/mo and customers don't have to assume all the risk.
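
For what it's worth, the breakeven arithmetic between the one-time price and the subscription is easy to run with the rough figures quoted in this thread:

  # Breakeven between buying FSD outright and subscribing,
  # using the approximate figures from this thread.
  upfront = 15_000   # one-time FSD price, USD
  monthly = 200      # subscription price, USD/month
  months = upfront / monthly
  print(f"{months:.0f} months (~{months / 12:.1f} years) to break even")
  # -> 75 months (~6.2 years), ignoring resale value and price changes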


Funny thing is, if you paid for FSD in full, you'll never get a car that will actually be able to self-drive. They are upgrading the hardware again, which includes new cameras. I bet it goes two or three more iterations before we get level 3. I have FSD, but it was only $1,500 or so, not $15k. I've come to the realization that my car will never be able to drive by itself.


I’ve had several conversations on this very website wherein people declined to ballpark the number of deaths caused by self-driving cars that would be unacceptable in pursuit of the imminent self-driving utopia, which they declined to describe or guess when (or if) it might come to be.

There is a very vocal group of people that are adamant that the ability to fully dedicate themselves to perineum sunning along the I-5 corridor is a societal Good that eclipses any amount of loss of life, and certainly any other safety or privacy concerns.


Traffic is a fucking nightmare in Atlanta and Los Angeles, and Atlanta's mass transit sucks bigtime. Having FSD crawl along in low gear while you read the news or finish getting dressed would make for a way more tolerable commute.


Which, ironically, will make traffic worse by increasing the maximum amount of it a person can endure.


I love the idea of looking at traffic and thinking “the solution here is more induced demand”


Tesla has good ideas, but stupid execution.

Comma.ai has driver monitoring, GM cars have driver monitoring, and Hyundais/Toyotas and probably many others yell at you if your hands are not on the steering wheel.

Tesla's pure vision-based Autopilot makes a ton of bad decisions and sudden turns that no sane human would ever make.

Yes, on principle I agree that vision via cameras should be sufficient for perception. However, the algorithms and training are not there yet.

Kudos to Waymo and Cruise for getting licensed to operate in California without drivers 24/7 and doing hundreds of successful rides per day. We can shit on them for using $250,000 lidar rigs that only companies blessed with infinite VC money can afford, but they passed that objective milestone. Tesla has not, and probably won't for the next 5 years.

Waymo and Cruise have shown that they actually give a fuck about safety, while Tesla has its head in the sand, ignoring what its system is actually capable of and offloading risk onto its drivers.


Not to mention Tesla employees were sharing pictures and videos from customers' external cameras, including from Elon's own Tesla.

https://www.reuters.com/technology/tesla-workers-shared-sens...


If you are a driver suing, wouldn't you want this data to prove your case? That the company assumes it would help their case instead, and that accusers wouldn't want it, is itself interesting.

Also, these cameras shouldn't exist in non-robotaxi use cases. And since Teslas don't operate as robotaxis, these cameras shouldn't exist at all.


Whenever the video helps the company, it will be used in any lawsuit.

Whenever the video would help the driver, it was mysteriously lost due to an unforeseen technical glitch.


AFAIK, these are stored locally. I think some third-party shops are able to recover them? At least they can for the other cameras.


I would rather have control myself, with the data never leaving my possession unless absolutely required.


I completely agree.


The government is requiring they exist for driver monitoring.


I don't think the government requires storage or transmission. Competing systems don't all do it in as intrusive a way.


The road to privacy hell is paved with... financial motives.

(Paraphrasing https://en.wikipedia.org/wiki/The_road_to_hell_is_paved_with... )


> Musk proposed a pop-up message, informing customers that data would be collected if they used the Full Self-Driving Beta feature. This placated the manager.

Anybody building a uBlock Element Zapper for TeslaOS?


Is it just me, or is it totally reasonable to review the car's sensor data in the case of an accident?


By an independent, third party, yes.


That's exactly what's happening here. This is how a lawsuit or investigation should be handled.


[flagged]


You are talking about what ended up being implemented; the post talks about what Musk wanted to do. Different things.


> what ended up being implemented

He's the CEO; what's implemented is what he wants to do.

He did what he wanted, which was pitch, listen to feedback, and implement. Like a good CEO.


> listen to feedback, and implement.

The whole X debacle has proven once again that Elon will never give up on a bad idea that he has tied his ego to or that was shut down by people he dislikes.

These are exactly the kinds of ideas that he will jump at once the opportunity arises and the pesky naysayers aren't in the room; "why should we have to bother with any of this privacy or informed consent stuff" is where he's coming from.

That is his style of CEOing, for better or worse.


This event is literally about how he listened to his people.

> The whole X debacle has proven

I don't understand. What was the "debacle" and how did it "prove" your claim?


And then use it against the customers for malicious means. Like a good CEO.


Probably shouldn't call it self-driving when it's just a level 2 feature that requires the driver's full attention to use safely. See the confusion? When I think of self-driving, I think of a level 3 or higher standard where I can safely not pay attention to the road. I don't think any of the other manufacturers who have L3 cars on the market even call it self-driving.


A concerted effort to make propaganda against him seems a bit superfluous at this point.


[flagged]


Wait, so if he does something, and someone accurately reports on that, but the thing he did kind of sucked, then it's propaganda to report on it? Maybe he could, you know, modify his behavior if he wants to go back to the myth-making coverage from prior years.


nice try, Elon



