
Okay. You know Svitavy?

I thought so, because it's a small city in the Czech Republic. There are about 5 guys there who have businesses in various sectors. They all know each other. If you screw up badly in one job, you are not gonna get hired anywhere.

"But why won't you move to Prague?" you ask. Because in Svitavy you get about 13-15k CZK netto/month. Rent in Prague for a reasonable studio apartment is about 11-12k (for reference - 9k for rent + 2,5k for utilities - https://www.sreality.cz/detail/pronajem/byt/1+kk/praha-kobyl...). Realistically the apartment is gonna be higher than that, because there are already 50 people who are willing to pay more than the said 9k for the linked one. You need about 4k a month for food. 500 CZK for phone + internet if you are cheap etc.

So in my example scenario, let's say you are saving 1k a month while in Svitavy, which is about right. You can't just go to Prague for a job interview during your day job at a sweatshop. After a year you are gonna have saved about 3/4 of one month of living costs in Prague. If you are anything other than a programmer or IT guy, you are not gonna find a job in 2-3 months, possibly 6.
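To spell out the arithmetic (a rough sketch using the figures above; the ~16k/month Prague total is just my rounding of rent incl. utilities + food + phone/internet):

    # Rough check of the numbers above (all CZK, all approximate)
    prague_monthly = 11_500 + 4_000 + 500   # rent incl. utilities + food + phone/internet
    saved_per_year = 12 * 1_000             # saving 1k/month while in Svitavy
    print(saved_per_year / prague_monthly)  # ~0.75, i.e. about 3/4 of one month in Prague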

So unless you inherit something or take a loan to risk it all, you are stuck in Svitavy. Have a bit more empathy, nextlevelpíčo.


[flagged]


[flagged]


We've banned this account for repeatedly violating the site guidelines.

If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.


aCtUaLlY...

dude be a bit empathetic. Some people might have relatives they have to take care of, and are unwilling to downgrade their job by multiple levels, hence the commute. Others might have children with friends in the local school - sure, they can move, but it's the last resort.

I sincerely hate you libertariantsy types. Your attitude works when you are 25-30 with no kids and distant family. And you know what fascinates me? Having remote/in-office work be optional doesn't impact you in any way. I can happily live without a commute, you can go to the office for all I care.


> I sincerely hate you libertariantsy types

Huh. I didn't realize it was a libertarian view that one should be responsible for outcomes that are obviously the result of decisions they made (and still make).

> I can happily live without a commute, you can go to the office for all I care.

Right. A choice that each of us can individually make. I've made mine and you make yours. I am simply stating that one should take personal responsibility.

In general, your employer shouldn't/doesn't care how long you have to commute to work. Why? Because you are replaceable with someone who doesn't have this problem or won't whine about it.

Stop externalizing blame. Your quality of life will increase greatly once you stop expecting other people to solve problems you create and just solve them yourself.

> dude be a bit empathetic.

Empathy for what, exactly? OP can get a job closer to their home OR move closer to their job OR find a remote only job. It isn't their employer's responsibility to endlessly subsidize and cater to the choices of their employees.

Regarding your other assumptions: I am outside of the age group you mention and have children + distant family.


The US has bad commutes because of our terrible land use. So yes, people should let go of homeownership fantasies, but we also simply cannot fix land use with individual actions alone.


I got the same feelings. From the article:

> January 29, early: there's this team that nominally owns dashboards, and they got wind of us wanting a dashboard. They want to be the ones to do it, so we meet with them to convey the request.

> January 29, late: asked "dashboard team" manager if they had been able to get the network stuff talking to our server yet via chat. No reply.

Am I the only one who thinks it's completely unreasonable to message the other team later that same day? I mean, it's not as if I am sitting and dangling my legs, waiting for a Karen to come so I can drop everything I do and build her dashboard. They might have had stuff planned for weeks/months in advance...


Yup. I pretty much stopped caring at that point. My backlog is 6+ months long. If you need me to drop something and pick up a new project, it’ll need the blessing of product management and, depending on size/LOE, also approval from my VP or SVP.

PM will adjust when appropriate; they do care. But, I can’t just build every little thing an internal customer wants because I have revenue-producing/cost-saving updates to build and deploy. Make your case that this lost+button saves more money than something else on my backlog, and we’ll probably squeeze it in.

This is a mid-sized enterprise software company and I manage one of our SRE teams.


If your backlog is six-plus months long, do you simultaneously go out and “want to be the ones to do it”?

I have no problem with another team being too busy to take on work in my area, provided they don’t actively try to take on work in my area and then pocket veto it.


This is the real core of the issue. There's always a chance that some software project, small or large, will be massively slow for 1000 reasons. But the dysfunction with turf wars and actively blocking solutions is really serious, and common in large or growing companies.


Possibly fear that "asking team" builds the dashboard themselves, in a custom way, then asks the "dashboard team" to maintain it.

Although that might still be a time-saver for the backlogged "dashboard team".


I think they (the dashboard team) were worried that if other teams started building their own dashboards, that'd indicate that the dashboard team wasn't that useful. Which could mean they would get fewer promotions or raises, or might even get fired if bad days came. -- So they wanted ownership of all dashboards, while actually building them was less important. (Is my interpretation.)

The new dashboard team manager, who appeared in the end, seemed to be a bit more "get something done" minded, though.


No of course not, but this person was complaining they didn’t start work the same day the request was made. That’s completely unreasonable.


In that case, shouldn't you reply with "my backlog is currently X weeks/months long, we'll probably get started around Y, I'll keep you updated" instead of saying nothing? Since the team wanted to do it and didn't mention in the meeting that they had other things to finish first, you can assume that they will get started on it right away. If not, at least tell the other team so they can explore their options.


If you go to another team and tell them to drop a project because you will do it for them then it is fair for them to assume that you will start working on it very soon. If you don't have the time to do a project then don't tell people they must let you do it.


One Czech ex-minister of health owns 67 properties. So he probably has a lot of acquaintances and relatives among the political elite of my country. To me it seems he has more political power than me, a dude sitting in underwear and ranting on Hacker News.


No doubt high level (ex)politicians are very privileged and operate above most limitations imposed on regular people. But that's moving the goalposts of the discussion pretty far indeed.

The original premise was that the entire "55% [of homeowners] wield extremely disproportionate political power". Sure, a few of those homeowners are also politicians with disproportionate political power. But nearly all of them are just regular people with one vote and no other influence, whether homeowners or renters.


Someone: wield extremely disproportionate political power.

You: House owners don't have any more political power than renters.

Me: one dude has 67 properties and connections to the political elite, so he probably has more political power than renters. Implicitly: chances are the entire political class has amassed a huge amount of property, and is in bed with prominent businessmen, who have the capital to acquire even more properties than said politicians.

You: u R M0v1nG tHe gO4lPoStS. You are a clown xD


That's what you get when your social security and education policy is Hunger Games equivalent in the Middle East. Exporting democracy one bomb at a time.


Well, the US and Europe call themselves capitalist, yet there are huge barriers to entry in some fields (e.g. in the Czech Republic the number of mobile operators is defined by law, essentially a state-imposed oligopoly). Useless regulations (primarily EU), trade embargos, huge subsidies to companies which don't need them, absurd patents (my favourite one being Apple patenting the shape of a rectangle with rounded corners)...

So now what?


During that episode I came to the conclusion that Bill Burr is who Joe Rogan sees as himself. Eloquent, humorous, says-he's-stupid-but-he-is-actually-smart, able to step back and look at all the stupid stuff everyone does (including himself).

Except Rogan is actually this: https://www.reddit.com/r/JoeRogan/comments/8xofvi/joe_rogan_...


> During that episode I came to the conclusion that Bill Burr is who ~~Joe Rogan~~ people on the internet see~~s~~ themselves as~~himself~~. Eloquent, humorous, says-he's-stupid-but-he-is-actually-smart, able to step back and look at all the stupid stuff everyone does (including himself).

FTFY

Side note, on /r/Math the other day I saw a great joke.

People in real life: Oops, I'm bad at math. I need a calculator to calculate a tip.

People on the internet: Allow me to demonstrate to you why I'm bad at statistics but confident I'm right and all the scientists are wrong.


I've noticed that a lot of overly-confident people do this. They essentially have a list of preferred topics and always try to bring the conversation to those topics.


> What's your rap these days? Most of us have one. Is it a disquisition on the stupidity of television, the rapacity of multinational corporations, how the Yuppies had it coming to them, the thrills of motorcycling, the perils of tuna fish? Some people are always ready to mount the soapbox. (It's the twelfth time you've heard this guy's tirade and it was already boring the second time around.)

> The worst sort of rap is the pet peeve. Pet peeves manage to smuggle their way into every conversation, no matter what the topic. Marty is hung up on America's foolishness in not imposing tariffs against the Japanese. It's not clear why he takes this so personally, but he's definitely obsessed with the problem. The topic of conversation is Monday-night football? Marty contrives a quick segue to the state of television in America, orchestrates a smooth turn to the subject of the future Japanese control of the entertainment business, and—presto—tariffs. Marty's rap is boring for the same reason the preacher's is—it's predictable—but it's also an imposition. He uses friends as a sounding board for his venting.

- From the book Everyday Ethics by Joshua Halberstam


Everyone wants to climb to the top of the nearest hill and scream and shout that he has The Truth, but more often than not we'd be better off just shutting up: striving to understand rather than trying to preach, remaining curious rather than telling others what to think.

"In every man sleeps a prophet, and when he wakes there is a little more evil in the world."

- Emil Cioran


I don't mind this as much. At least they are talking about things that they are knowledgeable about. What I don't like is when people are overly confident about their YouTube degree. The armchair experts that need to prove how smart they are, even if you're an expert in the field they're talking about. It is excruciatingly painful.


The clip of Joe Rogan shouting at the primatologist about the "new" chimpanzee is genuinely hilarious:

https://youtu.be/__CvmS6uw7E


Wow, that is a classic example of an "internet researcher" who read some articles and then told the people trying to correct him to "do research."

I just looked up this story and the apes in question were regular eastern chimps with some regional variation. They've been studied for nearly two decades. The whole 6-foot lion-killer thing was a few sensationalized articles from 2003 or 2004, due to a member of the research group making excited claims. The member wasn't even a primatologist and was kicked out.

Joe Rogan's research was literally some popular articles from a decade and a half ago... And he dares tell the person on the phone they aren't "current" and to do research.


This is really some sorry Rogan/envy ad hominem.

Some really petty schoolyard grimes here.

(EDIT: I mean there are links to reddit caricature / joke lists etc. Since when do we do this here?)

I don't think Joe Rogan is very interesting ... but the 'haters' say more about themselves than anything, it's a bad look on them.

Rogan is what he is. He has a variety of guests, he entertains them and lets them speak, and he popularized a fairly new format where people can come on and actually make their case for as long as they like. Turns out it's a very refreshing and, frankly, 'important' thing.

All of these podcast/celebrity etc. people probably take themselves a tad too seriously, but I don't think Rogan lacks the self-awareness to recognise he's not an intellectual; that seems apparent.

For those screaming about his subtle wavering on vaccines: first, he hasn't really, and second, our beloved Lex Fridman has gone a bit into the weeds on that one as well, so you'll have to throw him into the cauldron too.

The possibility of a 'dude bro' who's broken the mould and is more influential, and in most ways legitimately so, than many others who are supposed to be more deserving ... seems to bother a lot of people. I don't care that much one way or the other about Joe, but I'm annoyed by those people.

Some of the criticisms of Rogan are legit, a lot of it seems to me like something else going on under the surface.


I have listened to quite a lot of his podcasts. There are times where he is straight up wrong and hiding behind the "I am just a guy asking questions" facade. You know what's even more tiring than "haters"? Fanbois who refuse to admit their idol might have a few flaws...


I opened this thread wondering who Joe Rogan is; glancing at that I'm not surprised I don't know, and have certainly lost any interest in knowing!


K


Funny how the incriminating album cover is a prophetic message in itself; the person portrayed on it chasing after money.

On one hand I get it, he didn't consent and whatnot. But the guy is 30. Why didn't he sue 10 years ago? I have a feeling it's because he's trying to ride the wave of political correctness and cancel culture...


That’s because it’s all a giant reach, and in reality it’s probably the most famous thing he’s ever done, and thus he can’t get past it.


> Bitcon/DeFi will render them obsolete in a few years' time anyway.

Cold fusion is coming in the next 30 years starting from datetime.now()


I wonder where "a country road with no lanes which barely fits 1.5 cars in winter in the Czech Republic" is on your scale... Something like this, just imagine the snowdrifts around it: https://www.google.com/maps/@49.080269,16.4569252,3a,75y,307...


Now add the completely blind switchback turns, where your "visibility" into whether another car is coming comes from a convex mirror nailed to a tree or post at the apex of the corner - if it hasn't fallen off or been knocked crooked...

basically all of Italy


Or an ambulance going in the opposite direction (because that’s the only available choice) on a boulevard in a busy capital city like Bucharest. I saw that a couple of hours ago; the ambulance met a taxi which was going the right way, but of course the taxi had to stop and find a way for the ambulance to pass (by partly going onto the sidewalk). I said to myself that unless we get to AGI there’s no way for an “autonomous” car to handle that situation correctly.


You don't even need to go that far. The other day I saw an ambulance going down Burrard Street in Vancouver, BC without lights or sirens; then I guess a call came in, and it put on both and turned around. It's a six-lane street where normal cars aren't allowed to just turn around. It was handled really well by everyone involved, mind you, it wasn't unsafe, but I doubt a computer could've handled it as well as the drivers did.


A very complex-looking behavior sometimes emerges from very simple, easy-to-implement principles, like, say, bird flocking behavior: https://en.wikipedia.org/wiki/Flocking_(behavior)#Rules

I don't believe people are using their full AGI when driving (and the full "AGI" may well turn out to be a set of basic pattern-matching capabilities which we haven't discovered yet). After decades of driving, the behavior is pretty automatic, and when presented with a complex situation, following a simple rule, like just braking, is frequently the best response, or close to it.
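For what it's worth, the linked flocking rules really are tiny. A minimal sketch in plain Python (weights, radius and names are illustrative, not from any particular implementation):

    # Minimal boids sketch: separation, alignment, cohesion, per the linked rules.
    import random

    class Boid:
        def __init__(self):
            self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
            self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    def step(boids, sep_w=0.05, align_w=0.05, coh_w=0.01, radius=10.0):
        for b in boids:
            near = [o for o in boids if o is not b and dist(b.pos, o.pos) < radius]
            if not near:
                continue
            for i in (0, 1):
                coh = sum(o.pos[i] for o in near) / len(near) - b.pos[i]  # toward average position
                ali = sum(o.vel[i] for o in near) / len(near) - b.vel[i]  # toward average heading
                sep = sum(b.pos[i] - o.pos[i] for o in near)              # away from crowding neighbors
                b.vel[i] += coh_w * coh + align_w * ali + sep_w * sep
        for b in boids:
            b.pos[0] += b.vel[0]
            b.pos[1] += b.vel[1]

    flock = [Boid() for _ in range(50)]
    for _ in range(100):
        step(flock)

Each boid only looks at nearby neighbors and nudges its velocity toward their average position, toward their average heading, and away from the ones that are too close; the flock-level behavior falls out of that.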


To me the solution to that is obvious and far better than the current status quo. The cars are all attached to a network and when an emergency service vehicle needs to get somewhere in a hurry there is a coordinated effort to move vehicles off the required route.

As things stand emergency vehicles have to cope with a reasonable minority of people who completely panic and actually impede their progress.


This has to work even if network reception is weak or absent. You can't be certain that 100% of cars will receive the signal and get themselves out of the way in time.


Right, so don't use the network: broadcast a signed message on a band reserved for emergency services.
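The signing half of that is cheap; the hard parts are key distribution, replay protection and the radio side. A sketch of just the sign/verify part (Python with the cryptography package; the message format and key handling here are invented for illustration):

    # A dispatcher signs a short "clear this route" message; cars verify it against a
    # pre-provisioned public key before reacting. Illustrative only.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    dispatcher_key = Ed25519PrivateKey.generate()      # in reality: provisioned ahead of time
    car_trusted_pubkey = dispatcher_key.public_key()   # cars would ship with the public key

    message = b"EMERGENCY v1; route=Main St northbound; action=pull_over; ts=1700000000"
    signature = dispatcher_key.sign(message)           # broadcast (message, signature) on the reserved band

    try:
        car_trusted_pubkey.verify(signature, message)  # raises if forged or tampered with
        print("verified: clear the route")
    except InvalidSignature:
        print("ignored: bad signature")

A real scheme would also need timestamps/nonces against replay and a key-revocation story, which is where it actually gets hard.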


> This has to work even if network reception is weak or absent.

Or hacked maliciously.


Oh, you can have that in Bucharest even with regular cars. Lanes are pretty fluid there, as is the preferred direction of travel; I've lived there for only two years and I've seen more vehicles go in the opposite direction ('ghost riders', we call them here) than anywhere else over the rest of my life. Romanian traffic is super dangerous, especially if you are a pedestrian, and you can just about forget cycling in traffic. It is also the only place where a car behind me honked to get me to move over while I was walking on the sidewalk.


That is 101 for autonomous driving. Solved years ago.


People at Tesla and other autonomous driving companies are, of course, aware of and worried about such situations. If you have a few hours and want to see many of the technologies and methods that Tesla is using to solve them, check out Tesla's recent "AI day" presentation. Tesla is quite open about discussing the problems they have solved, the problems they still have, and how they are trying to solve them.

An incomplete list includes:

1) Integrating all the camera views into one 3-D vector space before training the neural network(s).

2) A large in-house group (~1000 people) doing manual labeling of objects in that vector space, not on each camera.

3) Training neural networks for labeling objects.

4) Finding edge cases where the autocar failed (an example is when it loses track of the vehicle in front of it because the autocar's view is obscured by a flurry of snow knocked off the roof of that vehicle), and then querying the large fleet of cars on the road to get back thousands of similar situations to help training (see the toy sketch after this list).

5) Overlaying multiple views of the world from many cars to get a better vector space mapping of intersections, parking lots, etc

6) New custom-built hardware for high-speed training of neural nets.

7) Simulations to train on rarely encountered situations, like the one you describe, or very difficult-to-label situations (like a plaza with 100 people in it or a road in an Indian city).

8) Matching 3-D simulations to what the car's cameras would see, using many software techniques.
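Toy illustration of item 4 above, as promised: represent each logged scenario as an embedding vector and pull the fleet scenarios closest to a known failure case. Plain numpy; nothing here is Tesla's actual pipeline, just the general nearest-neighbor idea:

    # Given one embedding for a failure scenario and a matrix of fleet-logged scenario
    # embeddings, return the indices of the k most similar ones (cosine similarity).
    import numpy as np

    def similar_scenarios(failure_vec, fleet_matrix, k=5):
        a = failure_vec / np.linalg.norm(failure_vec)
        B = fleet_matrix / np.linalg.norm(fleet_matrix, axis=1, keepdims=True)
        return np.argsort(-(B @ a))[:k]

    # Random data standing in for real scenario embeddings, just to show the shape of it.
    rng = np.random.default_rng(0)
    print(similar_scenarios(rng.normal(size=128), rng.normal(size=(10_000, 128))))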


They're cool about openly discussing it because this is all industry standard stuff. It's a lot of work and impressive, but table stakes for being a serious player in the AV space, which is why the cost of entry is in the billions of dollars.


> People at Tesla and other autonomous driving companies, of course are aware and worry about such situations.

Yeah, a Tesla couldn't possibly drive into a stationary, clearly visible fire engine or concrete barrier, on a dry day, in direct sunlight.


As awful a failure as that is, and as fun as it is to mock Tesla for it, the claim was that they're aware of edge cases and working on fixing them, not that they're already fixed. So your criticism doesn't really make sense.


A system dealing with 'edge cases' by special-casing them is not going to work for driving; driving is a continuous string of edge cases, and if you approach the problem that way you fix one problem but create the next.


I don't think anybody said anything about special casing them.

I dislike saying anything in defense of tesla's self-driving research, but let's be accurate.


Neither could a human, I'm sure.

At least, I never would...


If you never fail, you aren't moving fast enough.

A million people are killed globally each year by motor vehicles. Staggering amounts of pain and injuries. Massive amounts of property damage. Tesla's cars are not supposed to be left to drive themselves. The chance to prevent so much carnage seems worth letting some people driving Teslas, who fail to pay attention to the road, suffer the consequences of their poor decisions.

Plus these problems are likely to be mostly fixed due to the fact that they happened.


> If you never fail, you aren't moving fast enough.

Start-up religion doesn't really work when there are lives on the line. That's fine for your social media platform du jour but please don't bring that attitude to anything that has 'mission critical' in the description. That includes medicine, finance, machine control, traffic automation, utilities and so on.


But what about the million people who die every year now? Are the few thousand people who will die because of AI mishaps worth more than the million who die due to human mishaps?

Not to say that we shouldn't be cautious here, but over-caution kills people too



You described a lot of effort, but no results.


From what I've seen of Tesla's solution at least - even busy city centers and complex parking lots are very difficult for present-day autonomous driving technologies. The understanding level necessary just isn't there.

These things are excellent - undeniably better than humans at the boring stuff, highway driving, even major roads. They can rightfully claim massive mileage with high safety levels in those circumstances... but throw them into nastier conditions where you have to understand what objects actually are and things quickly seem to fall apart.


That is like trying to judge modern supercomputing by your experiences with a 6-year-old Dell desktop.

Waymo drove 29,944.69 miles between "disengagements" last year. That is an average California driver needing to touch the wheel once every 2.3 years.

Tesla by comparison is classed as an SAE Level 2 driver-assist system and isn't even required to report metrics to the state. While they sell it to consumers as self-driving, they tell the state it is basically fancy cruise control.
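For reference, that 2.3-years figure is just the disengagement interval divided by a ballpark ~13,000 miles/year for an average California driver (the mileage figure is an assumption, not from the report):

    miles_per_disengagement = 29_944.69
    avg_ca_miles_per_year = 13_000          # assumption: rough average annual mileage for a CA driver
    print(miles_per_disengagement / avg_ca_miles_per_year)  # ~2.3 years between wheel touches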


"disengagements" is a disingenuous statistic - that'd be like a human driver just giving up and getting out of the car.

What you want is "interventions". Additionally, look at where those miles were driven. Most of them are some of the most simplistic road driving scenarios possible.


> That is an average California driver needing to touch the wheel once every 2.3 years

From my experience of California driving, that doesn't sound too bad. Compared to the entire Eastern seaboard, y'all have great roads and better drivers.


> Waymo drove 29,944.69 miles between "disengagements" last year.

You know better. If most of those miles were in sunny Mountain View suburbs, they don't count.


It's unclear to me why Tesla's solution is so discussed. They are definitely not on the same playing field as Waymo or even Cruise.


There's a lot of people on here who have invested in Tesla


also a lot of people on here who have actually experienced tesla's self-driving. certainly a lot more than have experienced any other self-driving product (at least above a "lane-keeping" system)


Are there a lot of people who have experienced tesla's self-driving?

As I understand it, if you pay for FSD, you don't actually get anything like self-driving; you just get lane changes on the highway in addition to the lane-keeping. Effectively, you get lane-keeping, which you get even if you don't pay.

All the videos of "FSD driving" are from a small number of beta-testers, and there's no way to opt into the beta.

Because of that, my assumption would be very few people on here have experienced tesla's self-driving. It's only open to a small number of beta testers, whether you have purchased it or not.

On the other hand, waymo is available for the general public to use, though only in specific geographic areas.


Would you describe Tesla's tendency to crash full speed into stopped emergency vehicles during highway driving as "excellent"?

https://www.cnn.com/2021/08/16/business/tesla-autopilot-fede...


While controversial, we tolerate a great many casualties caused by human drivers without trying to illegalise those.

While we can (and should) hold autonomous vehicle developers to a much, much higher standard than we hold human drivers, it is precisely because of excellence.


We actually do "illegalise" casualties by human drivers.


I'm sure the grandparent poster meant banning human driving entirely in order to prevent human-driving casualties.


The failure modes are going to be very strange, and the technology is not strictly comparable to a human driver. It is going to fail in ways that a human never would: not recognizing obstacles, misrecognizing things, sensors being obscured in a way humans would recognize and fix (you would never drive if you couldn't see out of your eyes!).

It is also possible that, if it develops enough, it will succeed in ways that a human cannot, such as extremely long, monotonous cross-country driving (think 8-hour highway stretches) punctuated by a sudden need to intervene within seconds or even milliseconds. Humans are not good at this, but technology is. Autonomous cars don't get tired or fatigued. Code doesn't get angry or make otherwise arbitrary and capricious decisions. Autonomous cars can react in milliseconds, whereas humans are much slower.

There will undoubtedly be more accidents if the technology is allowed to develop (and I take no position on this).


That's Autopilot, not the FSD beta though; at this point it's probably 10 generations old.


Ah yes, because "autopilot" is not autonomous.


Well yeah, it's like other autopilots:

An autopilot is a system used to control the path of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle, allowing the operator to focus on broader aspects of operations (for example, monitoring the trajectory, weather and on-board systems).


That's just devious marketing on Tesla's part. They can always excuse customer misunderstandings with the original meaning you explained, while normal people can safely be expected to interpret "autopilot" as full self-driving (and I'd be surprised if they hadn't actually tested this with focus groups beforehand). So not really lying (great for the lawsuits), but constructing misunderstanding on purpose (great for the brand image).


Except for the manual and all the warnings that pop up that say you need to pay attention.

3000 people die every day in automobile accidents, 10% of which are from people who are sleepy. Even standard autopilot is better than a tired driver


I would say it's better than humans' tendency to drive full speed into anything while impaired by a drug. Especially since the bug was fixed in Tesla's case, but the bug in the human case is probably un-fixable.


Drugs (or alcohol)? There are so many more failure modes that drugs are the least of my concerns, especially drugs of unspecified type. I'm not the least bit worried about drivers hopped up on Tylenol. Humans get distracted while driving, by texting or simply by boredom, and start daydreaming. Don't forget about driving while tired. Or emotionally disturbed (divorce or a death; road rage). Human vision systems are also pretty frail and have bad failure modes, e.g. when the sun is close to the horizon and the driver is headed towards it.


Computer vision systems also have bad failure modes. The camera sensors typically used today have better light sensitivity but less dynamic range than the human eye.


They fixed driving into stationary things? That's news to me. What's your source?

It's not an easy problem to fix at high speed without false positives, and they seem to really hate false positives.


I live in north-central Idaho, 10 minutes from 2 state universities, but in an otherwise relatively rural part of the county, with a 1/4-mile-long, somewhat steep driveway.

Every year, I'm amazed at how quickly our personal "veneer of civilization" collapses in the snow.

The prior owners of our home would just keep their kids home from school, and work from home an average of "about a week every winter."

We're a little more aggressive with snow removal, but there are still mornings every winter where I'm getting up at 5 to spend a couple hours plowing and blowing out drifts on our driveway (after typically doing the same thing the night before) just in order for my wife to make it down to our county road which might still have a foot or so of snow covering it.

Similarly, in windy snow-covered conditions, there are a couple spots between us and town where the snow regularly drifts back over the road in a matter of hours, causing a "well, I know the road goes about here, I think I can make it through these drifts if I floor it so here it goes" situation.

Even when the roads are well plowed and clear, there are plenty of situations where it's difficult for me, a human, to distinguish between the plowed-but-still-white-road and the white snow all around it in some lighting conditions.

And let's take snow out of it. Our fastest route into town involves gravel roads. And our paved route is chip-sealed every couple years, and typically doesn't get a divider-line drawn back on it for 6-months or so after.

Which is all to say, I think it's going to be quite a while before I have a car that can autonomously drive me into town in the summer, and global warming aside, I'll probably never have one that can get me there reliably in the winter.


Northern Canada here. We have all been down that road. I had a rental once that wouldn't let me back up, as the sensor was frozen over. I doubt AI will ever handle winter situations without help.


> I doubt AI will ever handle winter situations without help.

Sure it will, at least eventually. However, I suspect the humans at the time won’t like the answer: that it’s not always safe to drive in these conditions, and then the car refusing to drive autonomously, even if it is technically capable of navigating the route. It may deem the risk of getting stuck, etc. to be too high. Or you may need to accept a report to your insurance company that you’ve opted to override the safety warnings, etc.


Lol. Good luck selling that in the north, the mountains, farm country, or anywhere else more than 10 miles from a Starbucks. Sometimes lives depend on being able to move, and there isn't time to reprogram the robot to understand the risk dynamic. Malfunctioning sensors or broken high-beam circuits (Tesla) are no excuse for a car to remain stuck in a snowbank.


Why do you live in a place where you have to shovel snow from 5am on a weekday? I mean, I appreciate building character, but at some point you're just deciding to live life on hard mode.


First, they are "plowing and blowing", not shoveling (or not shoveling much) - if you have a significant amount of snow, shoveling is just impractical as well as back-breaking. Second, even if you don't get snow overnight, you get the drifting they mention, which is where winds blow snow onto the nice clean driveway you had cleared previously. Drifting can be quite significant with lots of snow and big open areas!

Lastly, not the OP, but winter is my favorite season for the most part, and I love being around lots of snow!


A large band of the United States reliably gets heavy overnight snows. In my case we're talking an hour west of a major metro--Boston. These days, the inevitable travel snafus notwithstanding, I just stay home. But when I had to go into an office barring a state of emergency digging out in early am was a regular occurrence.


Jesus Christ, HN. Not everyone is an IT guy with a comfortable salary. Some people have families or other roots they don't want to sever, or lack the money and useful skills to move...


Autonomous driving systems are set at various levels of autonomy.

Level 0 is no automation, level 1 is just a dumb cruise control, level 2 is radar adaptive cruise control plus lane keeping (which is where most production systems like Tesla Autopilot and GM Supercruise are currently at). Level 2 still requires full human supervision, if you engaged it on the road above it would either fail to engage or you'd crash and it would be your fault. Level 3 is the same plus an ability to handle some common driving tasks, like changing lanes to pass a slower vehicle.

Level 4 is where it gets really interesting, because it's supposed to handle everything involved in navigating from Point A to Point B. It's supposed to stop itself in the event of encountering something it can't handle, so you could theoretically take a nap while it drove.

However, an important limitation is that Level 4 autonomy is geofenced: it's only allowed in certain areas on certain roads. Also, it can disable itself in certain conditions, like construction or weather that inhibits visibility. Waymo vehicles like these are ostensibly Level 4; if you tell them to drive through a back road in the snow they'll simply refuse to do so. It's only useful in reasonably good conditions in a few big cities.

Level 5 is considered to be Point A to Point B, for any two navigable points, in any conditions that the vehicle can traverse. You could build a Level 5 vehicle without a driver's seat, much less an alert driver. I kind of think this will require something much closer to artificial general intelligence; level 4 is just really difficult conventional programming.
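If it helps, the split above boils down to roughly this (an informal paraphrase of what I described, not the official SAE J3016 wording):

    from enum import IntEnum

    class AutomationLevel(IntEnum):
        NO_AUTOMATION = 0  # human does everything
        DRIVER_ASSIST = 1  # a single aid, e.g. cruise control OR lane keeping
        PARTIAL = 2        # adaptive cruise + lane keeping; human must supervise at all times
        CONDITIONAL = 3    # handles some common tasks (e.g. overtaking); human on standby
        HIGH = 4           # full point A to B, but geofenced and weather-limited
        FULL = 5           # any navigable route, any traversable conditions; no driver needed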


It's not obvious that Level 4 falls within what one would call really difficult conventional programming. That level entails something like "in the event of any exceptional situation, find a safe stopping location and safely bring the car to a stop there," and even that alone seems incredibly hard.


Actually it doesn't matter if your cruise control is dumb or adaptive. If you have only cruise control, of either kind, then it's level 1.

And if you have lane-keeping but not cruise control, that's also level 1.

The difference between 1 and 2 is weird.


I'd still buy a self-driving car that refuses to drive on that road.


In the back seat of the Waymo there's a "Pull Over" emergency lever.


You can't always "pull over."


Lots of roads like that in Britain as well, and the speed limit is 60 mph / 100 kph. It's not uncommon for two cars on a single-track road to adjust speed to pass each other at a passing place without slowing down much, so at a closing speed of over 100 mph. Perfectly safe for human drivers who know the roads.


This sounds like the sort of "perfectly safe for human drivers who know the roads" that actually results in a fair number of road deaths.


If you look at the accident maps, there are almost none on single track roads and lots on twin track roads. My hypothesis is that driving on a single track road feels much more risky so people pay more attention and slow down more on blind corners. Also, it’s not possible to overtake and a lot of accidents are related to overtaking.


Believe it or not there are tons of two-way roads like that just 30 minutes from Silicon Valley that self-driving cars could practice on. Here's an example: https://goo.gl/maps/1CVb7Mpiwv1VL2sd7


There're also similar roads 30 minutes from Silicon Valley that have all that, plus residences, pedestrians, parked cars, sheer cliffs, unclear driveway splits, and porta-potties, eg. https://goo.gl/maps/57jzzK6fvtCqvu5w5

Strangely I've never seen Waymo vehicles practicing on that. They're all over Mountain View, but I have never once seen them in the mid-peninsula hills.


Just have them drive up to the Lick Observatory and back.


That’s just stunningly beautiful - Czech countryside is something else!

I’d gladly buy a self-driving car that required some additional input on such a road and had additional aids to spot oncoming traffic I can’t see behind the tractor that’s a few hundred meters forward of the spot linked to. It would still be safer.

To really make things work, we need cars to be able to negotiate right of way, etc. the way humans do. There is a lot of non-verbal (and, when that fails, very verbal) communication while driving. Currently, cars can’t communicate with each other or with pedestrians, which limits possibilities a lot.


You can replicate that without going overseas. Send that autonomous vehicle over the Golden Gate bridge, take any of the next few exits, and turn right. The street I live on is a paved horse path from the 1910s. No snowdrifts, but a lot of aggressive drivers angrily refusing to back up, which will be fun to see software deal with!


As someone who learned to drive in the city, those roads make me sweat bullets.

My grandpa who drives on those roads primarily, sweats bullets in the city.

Maybe you’ll have different driving models to load in different scenarios …


My mother thinks nothing of driving on deserted roads in significant unplowed snow. She gets nervous on a dry, Texas highway at rush hour.


Yeah, that seems perfectly rational. There is nothing to hit on a deserted highway. Driving in traffic, on the other hand, is more stressful and has worse downsides.


> significant unplowed snow

Spinning out on a deserted highway and hitting a snowbank and getting trapped in your car kills a large number of people every year. Even with smartphones, calls for help can't always be responded to in time, resulting in death. (Have an emergency kit in your car if you live above the snow line!)

Driving in city traffic can be quite harrowing, but hitting another car at 20-30 mph isn't usually fatal. (Wear your seatbelts!)

The point that the GP post was trying to make is that humans have different preferences, and what seems dangerous to one doesn't seem (and possibly isn't) dangerous to another. Humans are also notoriously bad at judging danger; e.g. some people feel threatened by the idea of wearing paper masks.


The computer doesn't have to be perfect; it just has to be better than a human.


Adding to this to really drive the point home: it doesn’t even need to be better than a human that’s good at driving. It only needs to be better than the average human driver. Anecdotally speaking, that’s not such a high bar to pass (relative to the topic at hand).


For general acceptance I think it has to be better than how good the average human thinks they are at driving.

Secondly, its dumbest errors have to be better than what the average human thinks their dumbest errors would be. If there is an accident and everyone thinks they would never have made that error, it will crush acceptance.

Looking at the general accident stats and telling people that, on average, there are fewer deaths on the road, but that they might die in a stupid accident they would never have been in had they been driving themselves, will be a very hard pill to swallow. Most people prefer to have the illusion of control even if statistically it means worse expectations.

