
The Atlantic had a good article on this and how it isn't the doom and gloom you lay out above:

https://www.theatlantic.com/economy/archive/2025/11/mamdani-...

As some of the replies note, it has been rather successful and popular in other cities like Berlin.


Rent control is always initially popular with the people who are already in apartments. But its longer-term effects on supply and quality are corrosive.

An alternative is Austin:

https://www.texastribune.org/2025/01/22/austin-texas-rents-f...

"Austin rents have fallen for nearly two years. Here’s why.

Austin rents have tumbled for 19 straight months, data from Zillow show. The typical asking rent in the capital city sat at $1,645 as of December, according to Zillow — above where rents stood prior to the pandemic but below where they peaked amid the region’s red-hot growth.

Surrounding suburbs like Round Rock, Pflugerville and Georgetown, which saw rents grow by double-digit percentages amid the region’s pandemic boom, also have seen declining rents. Rents aren’t falling as quickly as they rose during the pandemic run-up in costs, but there are few places in the Austin region where rents didn’t fall sometime in the last year.

The chief reason behind Austin’s falling rents, real estate experts and housing advocates said, is a massive apartment building boom unmatched by any other major city in Texas or in the rest of the country. Apartment builders in the Austin area kicked into overdrive during the pandemic, resulting in tens of thousands of new apartments hitting the market."


I'm all for building more housing, but in places that already have an affordability problem, removing rent control before building more housing would just displace people overnight.

I live in SF and wish we would build as much and as quickly as Austin has been building. But, if we could do that, we shouldn't consider eliminating rent control until after those units are on the market.


It's kind of incredible how the obvious and true solution to rents being too high is to BUILD MORE HOUSES and yet somehow people manage to convince themselves that in fact, the real solution to rents being too high is to artificially cap their prices. Incredible stupidity.


There are two vectors here, and people seem to not realize that, isolated from the short-term suffering going on right now.

Rent control is short-term relief. Obviously, using short-term solutions long term is bad. This shouldn't be an enigma.

Building housing is long term. We cannot build new houses in a year. At least, not that I know of. But new houses in 4 years do not help the citizens knocked onto the streets in the meantime.

You need to relieve those people while also securing the future. That's why rent control fails without a proper housing reform.

You can't be mad at a pipe bursting that you used duct tape to cover. But maybe that duct tape buys you time to find a plumber, who needs time to find the right size pipes. So duct tape is still really useful, just not the end all be all.


Mamdani is doing both though, in a controlled manner. He voted to oppose NIMBYism as well and has a plan for new construction.


Extra supply is helping, but I would argue back-to-office and layoffs are the primary culprit.

You're not competing with 4+ techbros for an apartment in downtown Austin anymore.

Anecdotally, the local tech meetups are WAY off in participation since about June. About 1/3 of the people who used to regularly attend have completely left the city.


What a ridiculous statement to make. No wonder the US is in the state it is in. Let's let the ignorant and uninformed decide on policy rather than the scientific community and experts. What could possibly go wrong?


Eh, I dunno. My son plays a bunch of Roblox and has spent a net $10 for a few custom avatar mods. While there is certainly a pay to win aspect for some games within, there is also a ton of "free" games to sift through, and since all of them are competing for players, they still have to make the experience compelling enough at the free tier. We've had conversations about the pay-to-win aspect, and even though he has several hundred dollars saved up, he has never once asked to spend money on pay-to-win aspects of Roblox. I'd argue that almost any modern videogame / mobile game is equally if not more "predatory" with the pay-to-win side of things. Just look at the menu screens in any modern first person shooter / battle royale type game. Those look far worse than anything I have seen in Roblox.


both should be regulated, this type of predatory gambling-like behavior shouldn't be allowed for kids under a certain age


So, no social media and no video games for kids? Man am I glad that I grew up before tomorrow when everything is going to be restricted.


social media generally bans kids under 13 in the US — there's a good amount of evidence regarding the harms it can have at this point

kids haven't been able to buy mature games from brick-and-mortar stores like Gamestop since I was a child decades ago

kids used to be able to smoke cigarettes too


>there's a good amount of evidence regarding the harms it can have at this point

Considering this evidence was produced during a time when public opinion was looking for any excuse to blame social media companies, and that the field of research producing those studies has the accuracy of a coin flip, I'm unconvinced. I'd need to see a lot more than out-of-context quotes from Facebook research or these questionable "we asked kids to rate xyz, they're totally more depressed and it's totally social media's fault" studies.

>kids haven't been able to buy mature games from brick-and-mortar stores like Gamestop since I was a child decades ago

They pirated them instead because kids don't have money.

That being said, I would rather kids be banned from the internet outright rather than the internet becoming yet another watered down place.


Some of this evidence has been produced by companies with an incentive to not produce it (internal Facebook research has shown negative mental health implications for teenage girls on instagram for example — this is known as part of some whistleblowing efforts)

> They pirated them instead because kids don't have money.

I mean sure, a kid can break a window and rob a gun store too... we're not talking about creating rules that are impossible to circumvent, the answer to imperfect regulation isn't no regulation.

> That being said, I would rather kids be banned from the internet outright rather than the internet becoming yet another watered down place.

Content filters have come a long way, this isn't what anyone is suggesting.


>internal Facebook research has shown negative mental health implications for teenage girls on instagram for example — this is known as part of some whistleblowing efforts

This is one of the reasons why I have difficulty taking this research seriously, because that is not what the internal research at Facebook said. That was a media headline that misrepresented the results.

They measured 12 different indicators: problematic use of Instagram, body image issues, sadness, etc. For teen girls, 32% of respondents said that IG made their body image issues worse; what the media didn't say, however, is that 45% thought Instagram had no impact and 22% said it made their body image issues better.

And that was basically the worst indicator out of all 12 of them. For example, the same research said that on the question of loneliness 12% of teen girls said that IG made it worse, 36% said it had no impact and 51% said that IG made it better.

On every issue, Instagram came out mainly either neutral or positive. And that's the internal research that places like the WSJ used to claim that Facebook causes negative mental health effects in teen girls.

>Content filters have come a long way, this isn't what anyone is suggesting.

No they haven't. It's still the same garbage it always was just dressed up in fancier words. You can look at AI and see how well censoring it works. It's crude and ultimately doesn't work, just makes for a worse experience.


> So, no social media

When I was a kid, everyone was absolutely riddled with self-doubt and insecurity. Jealousy and bullying were the norm. There wasn't a soul in my middle school who didn't deeply, deeply hate themselves.

This was before social media. Imagine that, but now kids ALSO get to form unrealistic expectations and envy at home on their devices.

> no video games for kids?

What are you talking about? You can still get your friends together and play mario party or super smash or kirby or whatever. That never went away, we still have co-op games where it's free to play for the other kids.

We just shouldn't have gambling for the kids. Probably.


>You can still get your friends together and play mario party or super smash or kirby or whatever. That never went away, we still have co-op games where it's free to play for the other kids.

Yeah, they don't add those free to play mechanics because they force you to buy an extra piece of hardware for $400 to play those games. It works great when you're rich, I guess, but then these f2p games shouldn't matter in the first place.


What? No, you don't need a console. One switch can play a 4 player or 8 player game just fine. How it's been for decades.


Yes, and you need the switch to play that Nintendo game in the first place. Thus forcing you to buy the piece of hardware.


... was there ever a point in time where you were able to play a console game without the console? Was the game magic?

You only need one (1) switch. I can play smash with 8 people, on my couch, and 7/8 DO NOT have a switch. You need at least one (1) switch because the game cartridge cannot magically be projected onto my TV.

This is how it's always been and, in nintendo land at least, has only gotten better. I mean, I certainly couldn't play 8 player anything on the NES.


Sure, I don't disagree with that at all. I'd love to see that happen. I was just pointing out that most of the industry is far worse than what I have seen with Roblox personally.


Totally agree. I enjoyed owning my S9+ and even though I was someone who used to root and ROM my older Android phones, I didn't really have any complaints about the Samsung "bloat" some people love to complain about. Even though I have a Pixel device now, I still use Samsung's browser as it is far superior to Chrome or even Firefox imo (I am a diehard FF user on desktop) and I even had to install a 3rd party app to replicate Samsung's "panels" application for a swipeable side app drawer which I loved and found extremely useful on my S9+.


There are pretty decent pet insurance policies responsible pet owners could purchase and pay a monthly premium on. I've used it for my past two dogs and it has been painless and easy to use when I needed to.

So it isn't a big $10K+ or bust argument. But the sad fact of the matter is very few pet owners assume ownership while planning for the potential bad days and the extra financial burden and responsibility that they should feel when they take on pet ownership.


Same here. Then again, I'll never leave Firefox as long as Tree Style Tabs[1] continues to be supported as an extension. It's been a life-changer since I discovered it well over a decade ago and makes every other browser pretty much unusable for me as a daily driver.

[1] - https://addons.mozilla.org/en-US/firefox/addon/tree-style-ta...


Have you given Sidebery a try? Also for firefox.


There are multiple chromium alternatives for this extension.


And they are all terrible.

One of them is called TabFern, get it? It's like TabTree but it doesn't have branches.


> And they are all terrible.

For other Chromium-based browser users, Tabs Outliner is a great, more powerful (but with a steeper learning curve) solution.


It’s not a one-for-one replacement, but Edge has vertical tabs and tab groups built in. I have to use Edge for work and it’s a decent compromise.


Highly recommend CS61A. Can't say enough good things about it.


The correct interpretation is that relying on tools like MidJourney can be a liability for the exact reason that this can go any which way and the legality issue is rather gray. So the point still stands that many companies would rather have a more clear-cut tool like Adobe Firefly without having to worry about potential liability.


I'm surprised at the number of people here complaining about venvs in Python. There are lots of warts when it comes to package management in Python, but the built-in venv support has been rock solid in Python 3 for a long time now.

Most of the complaints here ironically are from people using a bunch of tooling in lieu of, or as a replacement for vanilla python venvs and then hitting issues associated with those tools.

We've been using vanilla python venvs across our company for many years now, and in all our CI/CD pipelines and have had zero issues on the venv side of things. And this is while using libraries like numpy, scipy, torch/torchvision, etc.
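
For anyone unfamiliar, the whole vanilla workflow is just a handful of commands (a minimal sketch; `.venv` and `requirements.txt` are just conventional names):

    python3 -m venv .venv                 # create the environment
    . .venv/bin/activate                  # activate it (POSIX shells)
    pip install -r requirements.txt       # install pinned deps into the venv
    deactivate                            # drop back out when done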


I've been using Python since like 2006, so maybe I just have that generational knowledge and battlefront experience... but whenever I come into threads like this I really feel like an imposter or a fish out of water. Like, am I using the same Python that everyone else is using? I echo your stance - the less overhead and additional tooling the better. A simple requirements.txt file and pip is all I need.


Isn't pip + requirements.txt insufficient for repeatable deployments? You need to pin all dependencies not just your immediate project dependencies, unless you want some random downstream update to break your build. I guess you can do that by hand.. but don't you kind of need some kind of a lock file to stay safe/sane?


The simple solution to this with pip is a constraints file:

    pip install <my_entire_universe_of_requirements>
    pip freeze > constraints.txt
And now in any new environment:

    pip install <any_subset_of_requirements> -c constraints.txt
Now you can install prod requirements or dev requirements or whatever other combination of requirements you have, and you are guaranteed to get the exact same subset of packages, no matter what your transitive dependencies are doing.

You can use pip-compile from pip-tools if you want the file to include exact hashes.
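
A minimal pip-tools sketch, assuming a `requirements.in` that lists only your top-level dependencies (the conventional pip-tools input file):

    pip install pip-tools
    pip-compile --generate-hashes -o requirements.txt requirements.in
    pip install --require-hashes -r requirements.txt   # hash-checking mode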


This is true, but now you're explicitly depending on all of your transitive dependencies, which makes updating the project a lot harder. For example, if a dependency stops pulling in a transitive dependency past a certain version, you'll need to either recreate the constraints file by reinstalling everything, or manually remove the dependencies you don't need any more.

Also pip freeze does not emit a constraints file, it emits (mostly) a requirements file. This distinction is rarely important, but when it is, it can cause a lot of problems with this workflow. For example, a constraints file cannot include any information about which extras are installed, which pip freeze does by default. It also can't contain local or file dependencies, so if you have multiple projects that you're developing together it simply won't work. You also can't have installed the current project in editable mode if you want the simple "pip freeze" workflow to work correctly (although in practice that's not so difficult to work around).

Pip-tools does work a bit better, although the last time I used it, it considered the dependency chains for production and for development in isolation, which meant it would install different versions of some packages in production than in development (which was one of the big problems I was trying to solve).

From my experience trying basically every single option in the packaging ecosystem, there aren't really any solutions here. Even Poetry, which is pretty much best-in-class for actually managing dependencies, struggles with workspace-like installations and more complicated build scripts. Which is why I think pretty much every project seems to have its own, subtly unique build/dependency system.

Compare and contrast this with, say, NPM or Cargo, which in 95% of cases just do exactly what you need them to do, correctly, safely, and without having to think about it at all.


> This is true, but now you're explicitly depending on all of your transitive dependencies

They're constraints, not dependencies: they don't need to be installed, and you can just update your requirements as you need and regenerate them.

> Also pip freeze does not emit a constraints file, it emits (mostly) a requirements file. This distinction is rarely important, but when it is, it can cause a lot of problems with this workflow. For example, a constraints file cannot include any information about which extras are installed, which pip freeze does by default

Pip freeze does not use extras notation, you just get extra packages listed as individual dependencies. Yes there is an important distinction between constraints and requirements but Pip freeze uses an intersecting subset of the notation.

> You also can't have installed the current project in editable mode if you want the simple "pip freeze" workflow to work correctly

That's why the workflow I gave to generate the constraints didn't use the -e flag, you generate the constraints separately and then can install however you want, editable or not.

> From my experience trying basically every single option in the packaging ecosystem, there aren't really any solutions here. Even Poetry, which is pretty much best-in-class for actually managing dependencies, struggles with workspace-like installations and more complicated build scripts. Which is why I think pretty much every project seems to have its own, subtly unique build/dependency system.

People have subtly different use cases that make a big impact on what option is best for them. But I've never been able to fit Poetry into any of my use cases completely, whereas a small shell script to generate constraints automatically out of my requirements has worked exceedingly well for pretty much every use case I've encountered.


I have been using pip since 2014 and did not know about the constraints. This solves my issue with sub-dependency version pinning!


'pip freeze' will generate the requirements.txt for you, including all those transitive dependencies.

It's still not great though, since that only pins version numbers, and not hashes.

You probably don't want to manually generate requirements.txt. Instead, list your project's immediate dependencies in the setup.cfg/setup.py file, install that in a venv, and then 'pip freeze' to get a requirements.txt file. To recreate this in a new system, create a venv there, and then 'pip install -c requirements.txt YOUR_PACKAGE'.
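
Roughly, that workflow looks like this (a sketch; YOUR_PACKAGE stands in for the local project, as above):

    # on the machine where you resolve dependencies
    python -m venv .venv && . .venv/bin/activate
    pip install .                   # pulls in the deps declared in setup.cfg/setup.py
    pip freeze > requirements.txt   # snapshot the resolved set (you may need to strip the project's own entry)
    # on the new system
    python -m venv .venv && . .venv/bin/activate
    pip install -c requirements.txt YOUR_PACKAGE   # or `.` for a local checkout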

The whole thing is pretty finicky.


Not sure why repeatable deployments would be a problem. You can pin all dependencies by issuing a

    pip freeze > requirements.txt
if you want. The only catch is you should be using a similar architecture and Python version in both development and production.

This would also pin a few non-project dependencies such as `disttools` but that shouldn't be a problem.

Edit: TIL that pip constraints is a thing. See the comment posted by oblvious-earth for an even better approach.


Is it "generational knowledge and battlefront experience" or just "getting used to the (shitty) way things have always been" and Stockhold Syndrome?


It was pretty bad before but now it seems like there are a bunch of competing solutions each with their own quirks and problems. It feels like the JavaScript ecosystem.


Ironically, the Javascript ecosystem is far better than the Python ecosystem when it comes to packaging and dependencies. NPM just does the right thing by default: you define dependencies in one place, and they are automatically fixed unless you choose to update them. Combine that with stuff like workspaces and scripts, and you basically have everything you need for the vast majority of use cases.
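
For illustration, the default flow looks roughly like this (`express` is just a stand-in package):

    npm install express   # resolves, installs, and records the exact tree in package-lock.json
    npm ci                # reinstalls exactly what the lockfile says (e.g. in CI)
    npm update            # the explicit opt-in step that moves versions forward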

Yes, there's also other options like Yarn, which have typically had newer features and different approaches, but pretty much everything that works has been folded back into NPM itself. Unless you really want to live at the bleeding edge for some reason, NPM is perfectly sufficient for all your needs.

In contrast, the closest thing to that in the Python ecosystem is Poetry, which does a lot of things right, but is not supported by Python maintainers, and is still missing a handful of things here and there.

I'm not saying the JS ecosystem as a whole is perfect, but for packaging specifically, it's a lot better than Python.


> they are automatically fixed unless you choose to update them

That's a good way to never get vulnerabilities fixed.

It hardly seems like "the right thing" to me.


I mean, a project needs regular care and maintenance, however you organise it. If you're never scheduling time to maintain your dependencies, you're going to be in trouble either way. But at least if you lock your dependencies, you know what will actually get installed, and you can find the buggy or insecure versions.

We found a bug on a Python project I worked on recently that only seemed to happen on certain machines. We couldn't reproduce it in a dev environment, and one machine that was affected suddenly stopped being affected after a while. It turns out the issue was a buggy dependency: one particular build of the project happened to have picked up the buggy version, but later builds used the fixed version and so didn't have a problem. So we'd only see the bug depending on which build the machine had last used, and if someone put a different build on there, it would reset that completely. On our development machines, we used slightly different builds that just happened not to have been affected.

Pinning dependencies wouldn't necessarily have prevented the bug in the first place - sometimes you just have buggy dependencies - but the debugging process would have gone much more quickly and smoothly with a consistent build environment. We could also have been much more confident that the bug wouldn't accidentally come back.


You should really start using linux distributions. These problems are all solved and have been solved for a long time.


That's definitely a solution, but it comes with its own problems, in particular that you add a significant dependency on what is essentially a middleman organisation trying to manage all possible dependencies. This doesn't scale very well, particularly because there's a kind of M×N problem where M packages can each have N versions which can be depended on. In practice, most distros tend to only support one version of each package, which makes the job easier for the distro maintainer, but makes things harder for everyone else (library authors get bug reports for problems they've already fixed, end users have less ability to choose the versions they need, etc).

In particular, it also makes upgrading a much more complex task. For example, React releases new major versions on a semi regular basis, each one containing some breaking changes, but not many. Ideally there wouldn't be any, but breaking changes are inevitable with any tool as situations change and the problem space becomes better understood. But because the NPM ecosystem generally uses locked dependency lists, end users can upgrade at their leisure, either with small changes every so often, or only upgrading when there's a good reason to do so. Both sides can be fairly flexible in how they do things without worrying about breaking something accidentally.

Under a Linux distribution model however, those incremental breaking changes become essentially impossible. But that means that either projects accumulate cruft that can't ever be removed and makes maintainers' and users' lives more complex, or projects have to do occasional "break everything" releases à la Python 2/3 in order to regain order, which is also more work for everyone. There is a lot less flexibility on offer here.

I don't think these sorts of problems disqualify the Linux distribution model entirely - it does do a lot of things well, particularly when it comes to security and long-term care. But there's a set of tradeoffs at play here, and personally I'd rather accept more responsibility for the dependencies that I use, in exchange for having more flexibility in how I use them. And given the popularity of language-specific package repositories that work this way, I get the feeling that this is a pretty common sentiment.


What happens when your distribution only has old versions, or worse, no versions of the libraries you need? You hop distributions? You layer another distribution like Nix or Anaconda over your base distribution? You give up and bundle another entire distribution in a container image?


You make a package for the thing you need.


So the "solution to packages" is to make your own package with someone's else package?

If it's that simple, how come no one already did all that work for us?


It's 200% "the right thing".

Updating packages should be strictly left to the developer's discretion. That schedule is up to the developer using the packages, not upstream.

Not to mention that dependencies updating themselves whenever they like to "fix vulnerabilities" is a sure-fire way to break your program and introduce breakage and vulnerabilities in behavior...


The Javascript ecosystem for other things, like frameworks, sure.

But when it comes to packages and "virtual envs" the Javascript ecosystem is leaps and bounds better.


The "Javascript ecosystem" on my personal experience seems to prefeer installing everything in the global environment "for ease of use convenience" and then they wonder how did a random deprecated and vulnerable dependency get inside their sometimes flattened, sometimes nested, non-deterministic dependency chain (I wish the deterministic nested pnpm was the standard...) and (pretend) they did not notice.

That being said, the Javascript ecosystem has standardized tooling to handle that (npx) that Python doesn't (I wish pipx was part of standard pip); they just pick the convenient footgun approach.


I don't think so. Python is batteries-included, and most packages in the Python ecosystem are not as scattered as npm packages. The number of packages in a typical Python project is much smaller than in a Nodejs project. I think that's the reason why people are still happy with simple tools like pip and requirements.txt.


People are happy?

It's one of the major sources of dissatisfaction with Python!


or the third option: did the whole packaging nonsense actually get kinda alright lately?


There's a PEP to get a part of it right [1] - at least the installation-of-dependencies and need-for-virtualenv side - but atm the packaging nonsense is still as bad as it always has been.

https://peps.python.org/pep-0582/

Sample comment from its discussion:

>> Are pip maintainers on board with this?

> Personally, no. I like the idea in principle, but in practice, as you say, it seems like a pretty major change in behaviour and something I’d expect to be thrashed out in far more detail before assuming it’ll “just happen”.

As if the several half-arsed official solutions already existing around packaging (the several ways to build and create packages) had deep thinking and design behind them...


Twice bricking my laptop’s ability to do python development because of venv + symlink bs was the catalyst I needed to go all-in on remote dev environments.

I don’t drive python daily, but my other projects thank Python for that.


How do you brick a machine with venvs?


He runs pip as root and doesn't use venvs.


By debugging homebrew issues during Monterey updates.

I didn’t brick the machine, just the ability to setup a typical python venv.


System administration skills are necessary to be a productive developer. There is nothing re: python that can't be fixed with a few shell commands.


If a rogue package rm's your root directory as root, you need a bit more that a few shell commands to fix it.


Can't happen unless you install as root. You're NOT DOING THAT are you?

Also LiveCDs are a thing for about twenty years. Recovery has never been easier, even after hardware failure.


It doesn't even need root to cause damage most of the time, it just needs to overwrite all files under your user by mistake.

> Recovery has never been easier, even after hardware failure.

If you can use a LiveCD to repair it, it most likely wasn't a hardware failure to start.


I've managed to break venv, npm and composer (php).

I don't use that as a reason to choose what I'll use in my projects, that's decided by the PTSD incurred from 7 years of php.


It's really inconvenient for simple use cases. You don't even get a command to update all packages.


Lol. You put "simple" and "requirements.txt" unironically next to each other...

I mean, I think you genuinely believe that what you suggest is simple... so, I won't pretend to not understand how you might think that. I'll explain:

There's simplicity in performing and simplicity of understanding the process. It's simple to make more humans, it's very hard to understand how humans work. When you think about using pip with requirements.txt you are doing the simple to perform part, but you have no idea what stands behind that.

Unfortunately for you, what stands behind that is ugly and not at all simple. Well, you may say that sometimes it's necessary... but, in this case it's not. It's a product of multiple subsequent failures of people working on this system. A series of mistakes, misunderstandings, and bad designs which set in motion processes that in retrospect became impossible to revert.

There aren't good ways to use Python, but even with what we have today, pip + requirements.txt is not anywhere near the best you can do, if you want simplicity. Do you want to know what's actually simple? Here:

Store links to the wheels of your dependencies in a file. You can even call it requirements.txt if you so want. Use curl or equivalent to download those wheels and extract them into what Python calls "platlib" (finding it is left as an exercise for the reader), removing everything in the scripts and data directories. If you feel adventurous, you can put scripts into the same directory where the Python binary is installed, but I wouldn't do that if I were you.
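
A rough sketch of what that looks like in practice (the wheel URL and package name are placeholders; `sysconfig` is stdlib and is one way to locate platlib):

    PLATLIB=$(python -c "import sysconfig; print(sysconfig.get_paths()['platlib'])")
    curl -LO https://files.example/somepkg-1.0-py3-none-any.whl    # placeholder URL
    unzip -o somepkg-1.0-py3-none-any.whl -d "$PLATLIB"            # wheels are plain zip archives
    rm -rf "$PLATLIB"/somepkg-1.0.data                             # drop the scripts/data payload, per the above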

Years of being in infra roles taught me that this is the most reliable way to have nightly builds running quietly and avoiding various "infra failures" due to how poorly Python infra tools behave.


What are specific problems you have with pip + requirements.txt, and why do you believe storing links to wheels is more reliable? Your comment makes your conclusion clear, but I don't follow your argument.


Pip is a huge and convoluted program with tons of bugs. It does a lot more than just download Python packages and unpack them into their destination. Obviously, if you want something simple, then HTTP client, which constitutes only a tiny fraction of pip would be a simpler solution, wouldn't it?

In practice, pip may not honor your requirements.txt the way you think it would. Even if you require exact versions of packages (which is something you shouldn't do for programs / libraries). This is because pip will install something first, with its dependencies, and then move to the next item, and then this item may or may not match what was already installed.

The reason you don't run into situations like this one often enough to be upset is because a lot of Python projects don't survive for very long. They become broken beyond repair after a few years of no maintenance. Where by maintenance I mean constant chasing of the most recent set of dependencies. Once you try to install an older project using pip and requirements.txt, it's going to explode...


Except when you try to move it, or copy it to a different location. This _almost_ made sense back when it was its own script, but it hasn't made sense for years, and the obstinacy to just sit down and fix this has been bafflingly remarkable.

("why not make everyone install their own venv and run pip install?" because, and here's the part that's going to blow your mind: because they shouldn't have to. The vast majority of packages don't need compilation, you just put the files in the right libs dir, and done. Your import works. Checking this kind of thing into version control, or moving it across disks, etc. etc. should be fine and expected. Python yelling about dependencies that do need to be (re)compile for your os/python combination should be the exception, not the baseline)


> Except when you try to move it, or copy it to a different location.

Or just, y'know, rename the containing folder. Because last night I liked the name `foo` but this morning I realized I preferred `bar`, and I completely forgot that I had some python stuff inside and now it doesn't work and I have to recreate the whole venv!


Why does that break venv? I thought it'd be linking to things outside of itself but shouldn't be aware of where it is.

(Sorry, not a python expert)


When creating the venv it hardcodes some paths so the python interpreter knows where to find its modules and the likes.

That said, re-creating a venv shouldn't be hard and if it is you're doing something wrong in your development setup.
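
You can see the hard-coding directly (a sketch; `.venv` is an illustrative name):

    cat .venv/pyvenv.cfg                  # 'home' records where the base interpreter lives
    grep VIRTUAL_ENV .venv/bin/activate   # the activate script bakes in an absolute path
    head -1 .venv/bin/pip                 # entry-point shebangs also point at the old absolute path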


What am I doing wrong? As far as I can tell, I have to:

1. Copy my code out from the venv folder

2. Delete the venv folder

3. Make a new venv

4. Copy my code back into the new venv folder

5. Re-install dependencies

This doesn't take much longer than 60 seconds, but that's 55 seconds more than I want to spend. How is this a good process? It just makes me avoid using python (at least when I'd need anything outside the standard library).

Is there a simple command that will do this all for me?

Note that I don't typically have a git repository or similar set up because I use python for very simple semi-throw-away scripts. I just want to be able to rename the containing folder and have my script still work.


Your code should not be inside the venv folder. For reference my projects usually look something like this:

     project
     ├─ venv
     |  ╰─ ...
     ├─ pyproject.toml
     ╰─ project
        ├─ __init__.py
        ├─ __main__.py
        ╰─ app.py
Which means recreating the venv is as easy as removing the venv folder, creating a new venv, and running `pip install -e .` when using pyproject.toml or `pip install -r requirements.txt` when using a requirements file.
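
In shell terms, assuming the layout above, the whole recreation is:

    rm -rf venv
    python -m venv venv
    ./venv/bin/pip install -e .           # or: ./venv/bin/pip install -r requirements.txt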

This of course doesn't quite solve the moving-the-folder issue, for which unfortunately there isn't an amazing solution currently. One thing you can do is keep the venv somewhere else entirely; that way the venv stays in a fixed place so it doesn't break, but you can still move the code to wherever you want to put it. For tiny scripts like yours, you might be better served not using a venv at all and just using `pip install --user` for all your packages. Which is a bit messy but has served me for years and years before I landed on the pattern I use now.

Another "unfortunately" is that none of this stuff is documented very well. Writing a working pyproject.toml for example requires switching between the PEP introducing them, the pip documentation, and the setuptools documentation.


The frustration is more than real.


I have drunk the Python kool-aid for too long, but you are absolutely right that this should be corrected.


Every now and then I wake up from the kool-aid stupour because I've been using a different programming language and ecosystem for a while and coming back to Python is just an excruciating exercise in "why is this still so shit?" (who wants to talk about pip vs npm when you're a package maintainer? Anyone?)


> Except when you try to move it, or copy it to a different location.

The article says it is explicitly not designed for that: "One point I would like to make is how virtual environments are designed to be disposable and not relocatable."


Good job, you spotted the exact problem: that's what they were originally for, and that makes no sense in 2023 (or 2020, or the day venv functionality was added into main line Python) where literally everything a venv does is trivially achieved in a new location if it didn't hardcode everything relating to paths.

There is literally nothing about a venv that somehow magically makes it impossible to still work after relocation. Breaking the venv on relocation was a conscious choice that has been insisted on to this day for no good reason other than "a long history of not bothering to fix this nonsense is all the justification we need to continue not fixing this nonsense".


Jeez just clone a venv with venv cloning tools


You mean `python -m venv`? Because that's literally that with just as little effort, and then you copy requirements.txt, but the whole point is that in 2023, this should not be necessary and the continued insistence by both python and specifically venv maintainers that it somehow needs to be this way, is insane. And telling people that they should just use clone tools is equally insane when we could just...

...you know...

...fix virtual environments?


No there is a whole python module for cloning venvs.


Which serves to highlight the insanity of it all? How is having a separate module for that not even worse than, as OP suggested - fixing virtual environments?


> Most of the complaints here ironically are from people using a bunch of tooling in lieu of, or as a replacement for vanilla python venvs and then hitting issues associated with those tools.

That's because vanilla python venvs feel like a genius idea that was not thought out thoroughly; they feel as if there's something missing... So there are naturally lots of attempts at improvements, and people jump at those...

And when you think about it in bigger depth, venvs are themselves just another one of the solutions used to fix the horrible mess that is python's management of packages and sys.path...

The "Zen of Python" says "There should be one-- and preferably only one --obvious way to do it.", so I can't understand why it's nowhere near as easy when it comes to Python's package management...


Honestly, virtual environments are one of the reasons why I prefer to avoid Python whenever I can.


To me, the virtual environments are a symptom. The cause is people defending Python even when it’s not as good as the alternatives. Every language has flaws. Every language has things it can learn from other languages. Every language is a trade off of different features. But somehow Python packaging, despite being really unpleasant compared to other languages and quite stagnant, is defended very vigorously in these threads (which lol are constantly recurring, unlike other languages). Just last week, I tried to install a Python program from 2020. I failed. I think the problem is that it relied on Pandas and maybe Pandas from then doesn’t work? I have no idea what the real flaw was, but jeez, it’s annoying to have people act like there is no problem. Yes, there is a big problem! This is a dead parrot.


why?

Also, you don't have to use them.


Which leaves you with what, not installing packages, or sharing packages between everything on my system and all the apps I work on? The '80s just called, they want their development methodologies back.


Is this a response?

I asked what's wrong with a venv and I got a rant…


It strikes me that virtual environments are a fairly ugly hack to smooth over the fact that Python is not a stable language. It changes a lot, requiring the use of particular runtimes for particular Python code, requiring the installation of multiple runtimes.

That's a pretty serious downside to the language. Virtual environments are needed to help people deal with that downside.


Uh? To me they're just a convenient quick way to install stuff that I don't want to install system-wide, if I want to do some quick experimenting.

The normal, permanent, stuff gets installed system wide the normal way, with apt.


Well, if you develop and ship in a VM/container, you don't have to do it on your system /s


It's incredibly lacking in features. PyPI doesn't even properly index packages, making pip go into this dependency resolution hell trying to find a set of versions that will work for you. It works for simple cases with few dependencies/not a lot of pinning. But if your needs are a bit more complex it certainly shows its rough edges.

I actually find it amazing that the python community puts up with that. But I suppose fixing it is not that pressing now that the language is widely adopted. It's not going to be anyone's priority to mess with that. It's a high-risk, low-reward sort of project.


I've been writing Python for a looong time. I have pushed out thousands and thousands of deployments across probably 40+ distinct Python codebases and only once or twice have I ever encountered a showstopper dependency resolution issue. At the end of the day you should want to have fine grained control over your deps and frankly there are many times where a decision cannot be automatically made by a package manager. Pip gets beat on hard but it puts in work all day every day and rarely skips a beat. It's entirely free and developed with open source contributions.

Areas where I have felt a lot of pain is with legacy Ruby projects/bundler. Don't get me started on golang.

Can pip be made better? Sure. Should we have an attitude of disgust towards it? Heck no!


> once or twice have I ever encountered a showstopper dependency resolution issue.

Hahaha... (rolls on the floor) Do you want to know why that is? No seriously? I'm not laughing at you as much as I'm laughing at Python now, but hey, well, anyways, do you want to know why that happened to you? I know you don't. But I'll tell you anyways!

Until quite recently, pip didn't give a rat's ass if the dependencies it installed were consistent. It would blink a message in the long stream of vomit it spills on the screen saying something like "you have package X installed of version Y, but package Z wants X of version Q, which will not be installed". And happily stream more garbage to your screen.

It was an issue that was filed against pip for something like 12 years until it got resolved about a year or so ago. Even after it got resolved a lot of people tried to upgrade, saw that that would "break" their deployment, and rolled back to the latest broken version.

Things are sort of improving gradually since then, but we are light years away from the system working properly, and I know you don't want to know why, but I'll tell you anyways!

So, when for whatever reason pip doesn't find a dependency it thinks you need: a lot of packages, when they roll out their "releases", also upload what Python calls a "source release". Which should have never been treated as an installation option, but it is, and is treated like that by default. So, what will happen once pip finally gives up on finding a match? Right, you guessed it! -- It's going to try to build it! Installing build dependencies along the way. What you get in the end is anyone's guess, but most likely, it's something broken because the developers who made this release didn't make a release specifically for your version.

Don't despair. There's a flag you can use with pip install that should prevent it from trying to build stuff. But two bad things will happen to you if you use it: in any non-trivial project your dependencies will irreparably break. And, who knows if that flag is implemented correctly... nobody in the real world is using that. So, who knows, maybe it'll format your hard drive along the way.
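
(For what it's worth, the flag being alluded to here is presumably pip's `--only-binary`, which refuses the sdist fallback entirely; a sketch:)

    pip install --only-binary=:all: -r requirements.txt   # fail instead of building from source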


I understand that you're being hyperbolic for rhetorical purposes, but I think you're overselling the problem with source distributions: most language package ecosystems boil down to the same "baseline" package representation as sdists do, and have the same basic "build it if no binary matches" approach. Many don't even provide built distributions; Rust and Go come to mind.

Python's problem isn't with source distributions as such, but with really bad metadata control (and excessively permissive approaches to metadata generation, like arbitrary Python setup files). Better metadata makes source-based language package management work just fine in every other language's ecosystem; much of the effort in Python packaging over the last ~8 years has been slowly turning Python in that direction.


> Python's problem isn't with source distributions as such, but with really bad metadata control

One doesn't preclude the other. I'm not against having a mechanism for automating source installs (like this is done in, eg. RHEL-based distros), but it's insanity to allow this to happen by default. You may not remember Bumblebee deleting /usr while running some innocuous-looking code during install, but things happen... really bad things...

Things don't need to happen all the time in order for them to be scary. It's enough to have possible catastrophic consequences, even if the event itself is rare.

> Better metadata makes source-based language package management work just fine in every other language's ecosystem

I haven't seen a single one, and I used dozens at this point. This is never a good idea. It's OK to do source installs for development, it's never a good idea to do source installs for deployment. It "works" in other places because of how it's presented (i.e. nobody expects this to be the method of software delivery to the end user). Like, eg. in Cargo (Rust): you, as a developer, download sources and build programs from all the sources on your computer, but your user gets a binary blob they put on the system path and run. It would be insanity and a security nightmare if users were supposed to compile program code before they could run it. The select few who can audit what's being downloaded and how it's been compiled would probably manage, the rest would become victims of all sorts of scams or just random failures propagating beyond their builds into their systems.

> much of the effort in Python packaging over the last ~8 years has been slowly turning Python in that direction.

I'm sorry, but PyPA is managed by clueless people. Whatever they do there only breeds more insanity over time. They neither have a general direction where they want to take the packaging system, nor do they understand the fine details of it. They are also bombarded by insane requirements for useless and harmful features, which they are often quick to implement... It's a circus what's going on there. I lost hope years ago, and now I've become an accelerationist. I just like to see it burn and people run around screaming while their backs are on fire. I get paid to fix this mess. So, PyPA's incompetence is my job security.


It behaves like a kid you send to the store with a hundred dollars


I moved to Poetry (ergonomics) and publishing wheels with frozen requirements, at least for apps. Here's the plugin I used: https://github.com/cloud-custodian/poetry-plugin-freeze .. the readme has details; tl;dr, freeze the graph for repeatability regardless of tool, à la "pip install works years later".


> only once or twice have I ever encountered a showstopper dependency resolution issue

I've encountered them with other languages and they're the sort of thing where one time is more than enough to make me feel like it could get me fired; they're Never (with a capital N) okay imo


What does that have to do with venvs?

I agree the packaging and distribution setup in python is an absolute mess, but that's entirely unrelated to venvs. It's like bringing up how python uses whitespace instead of curly-braces.


venvs are the recommended workaround for the fact that python packaging and distribution is a mess of global state. Languages with working packaging and distribution don't generally bother with anything venv-like.


Sure, but that's like 99% pip. Venvs are patching it (quite effectively), not causing it.


I think the GP comment might have caused some confusion since it mentioned both package management and venvs very close together.


I hate PyPI probably even more than you do, but venv doesn't do that. All it does is write a handful of files and make a bunch of symlinks. It doesn't deal with installation of packages.


Ok. Fair. Venvs are great, unless you want to install packages on them.


I've never used anything but vanilla Python venvs, and no they don't work reliably. What does is a Docker container. I keep hearing excuses for it, but the prevalence of Dockerfiles in GitHub Python projects says it all. This is somehow way less of an issue in NodeJS, maybe because local environments were always the default way to install things.


> This is somehow way less of an issue in NodeJS, maybe because local environments were always the default way to install things.

There's also NodeJS's ability for dependencies to simultaneously use conflicting sub-dependencies.


Yeah, you can't have two deps use different versions of the same sub-dep, cause they flatten everything instead of having a tree. In practice I rarely have issues with this except in React-Native, where it's a common problem, but then again RN is doing some crazy stuff to begin with. Often just force install deps and things work anyway.

Side note, there are way too many React/React-Native "router" type packages, and at least one of them breaks its entire API every update (I think https://reactrouter.com/en/main/upgrading/v5, how are they on version 6 of this). It's so bad that you can't even Google things anymore cause of the naming conflicts.


The most important part about venv is that you shouldn't need it. The very fact that it exists is a problem. It is a wrong fix to a problem that was left unfixed because of it.

The problem is fundamental in Python in that its runtime doesn't have a concept of a program or a library or a module (not to be confused with Python's modules, which is a special built-in type) etc. The only thing that exists in Python is a "Python system", i.e. an installation of Python with some packages.

Python systems aren't built to be shared between programs (especially so because it's undefined what a program is in Python), but, by any plausible definition of a program, venv doesn't help to solve the problem. This is also amplified by a bunch of tools that simply ignore venvs existence.

Here are some obvious problems venv doesn't even pretend to solve:

* A Python native module linking with shared objects outside of Python's lib subtree. Most comically, you can accidentally link a python module in one installation of Python with Python from a "wrong" location (and also a wrong version). And then wonder how it works on your computer in your virtual environment, but not on the server.

* venvs provides no compile-time isolation. If you are building native Python modules, you are going to use system-wide installed headers, and pray that your system headers are compatible with the version of Python that's going to load your native modules.

* venv doesn't address PYTHONPATH or any "tricks" various popular libraries (such as pytest and setuptools) like to play with the path where Python searches for loadable code. So much so that people using these tools often use them contrary to how they should be used (probably in most cases that's what happens). Ironically, often even the authors of the tools don't understand the adverse effects of how the majority is using their tools in combination with venv.

* It's become a fashion to use venv when distributing Python programs (eg. there are tools that help you build DEB or RPM packages that rely on venv) and of course, a lot of bad things happen because of that. But, really, like I said before: it's not because of venv, it's because venv is the wrong fix for the actual problem. The problem nobody in Python community is bold enough to address.


> The most important part about venv is that you shouldn't need it. The very fact that it exists is a problem. It is a wrong fix to a problem that was left unfixed because of it.

What Python needs is a tool that understands your project structure and dependencies so the rest of your tools don't have to.

In other languages, that's called a build tool, which is why people have a hard time understanding that Python needs one.


Oh, yeah? It’s working great? Like figuring out which packages your application actually uses? Or having separate development and production dependencies? Upgrading outdated libraries?

Having taken a deep-dive into refactoring a large python app, I can confidently say that package management in python is a pain compared to other interpreted languages.


Virtual environments aren't package management. For example we use Poetry for package management - it supports separate dev and prod dependencies, upgrading etc. It generates a virtual environment.


The distinction feels entirely academic to me. Managing packages means having a sane way to define the dependencies of software projects, so they can be checked into version control and be installed reproducibly later and/or elsewhere.

I don’t know which problem python intended to solve by separating the two, but it doesn’t occur often in contemporary software engineering work.

Having said that, the point you make is valid and Poetry is a good option, but it feels so maddening having to learn about like seven different tools which all do more or less the same but not quite, and everyone and their mother having an opinion on which is the best. Doesn’t help that there’s an arbitrary yet blurry line where package managers end and environment managers begin.


I strongly agree with this, and I have been actively using Python since 2009.

Trying to keep a Pygame/Numpy/Scipy project working has been a real struggle. I started it with Python 2 and ported to Python 3 some years ago. The whole Python 3 transition has been a huge mess, with every Python 3 point release breaking some things. No other interpreted language's packaging system is so fucked up.

On a positive note: Lately I've liked using pdm instead of pip, and things seem to work quite a lot better. I evaluated Poetry, Flit and something else also.

I just commented about this on Twitter, when someone asked “Which programming language do you consider beginner's friendly?” https://twitter.com/peterhil/status/1633793218411126789


Likewise, I think people have a negative first experience because it doesn't work exactly like node, throw their toys out the pram and complain on HN for the rest of time.

Guess in taking this stance we're both part of the problem... \s


Because even with --copies it creates all kinds of symlinks, and, if you're using pyenv, hard-coded paths to the python binary which can break from CI to installation.

If you're using docker then it's a lot easier I guess.


It also quietly reuses the stdlib of whatever python you start from. Which mostly doesn’t matter in real world usage, but can be quite surprising if you ever get into your head the idea that that venv is portable.


But why bother? Just use PDM in PEP-582 mode [1], which handles packages the same way as project-scoped Node packages. Virtual environments are just a stop-gap that persisted for long enough for a whole ecosystem of tooling to support them. It doesn't make them less bad, just less frustrating to deal with.

[1] https://pdm.fming.dev/latest/usage/pep582/


My complaints stem from libraries/OSes requiring different tools. So conda is sometimes required, and pip is also sometimes required, and some provide documentation only for pipenv rather than venv. And then you've got Jupyter, which needs to be configured for each environment.

On top of that there are some large libraries that need to only be installed once per system because they're large, which you can do but does mess with dependency resolution, and god help you if you have multiple shadowing versions of the same library installed.

I wish it was simpler. I agree the underlying system is solid, but the fact that it doesn't solve some issues means we have multiple standards layered on top, which is itself a problem.

And great if you've been using vanilla venvs. Good for those that can. If I want hardware support for Apple's hardware I need to use fucking conda. Heaven help me if I want to combine that in a project with something that only uses pip.


I agree with this 100%. Simple venv works reliably.

The only gotcha I've had is to make sure you deactivate and reactivate the virtual environment after installing Jupyter (or iPython). Otherwise (depending on your shell) you might have the executable path to "jupyter" cached by your shell, so "jupyter notebook" will be using different dependencies to what you think.

Even comparatively experienced programmers don't seem to know this, and it causes a lot of subtle problems.

Here's some details on how bash caches paths: https://unix.stackexchange.com/questions/5609/how-do-i-clear...
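
A quick way to check and refresh the cache without reactivating (a bash sketch; zsh users can run `rehash` instead):

    type jupyter    # shows which jupyter the shell currently resolves
    hash -r         # clear bash's cached command locations
    type jupyter    # should now point into the active venv's bin/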


I agree with the statement that venvs are usable and fine. However, they do not come without their pitfalls in the greater picture of development and deployment of python software.

It is very often not as simple as going to your target system, cloning the repo, and running a single-line command that gives you the running software. This is what e.g. Rust's cargo would do.

The problem with python venvs is that when problems occur, they require a lot of deep knowledge very fast, and that deep knowledge will not be available to the typical beginner. Even I, as a decade-long python dev, will occasionally encounter stuff that takes longer to resolve than it should.


The annoying thing with vanilla venvs (which are principally what I use) is that when I activate a venv, I can no longer `nvim` in that directory because that venv is not going to have `python-neovim` installed. This kind of state leakage is unpleasant for me to work with.


Lol, I always find it funny when people pretend that Parag is some clueless "management type". I graduated the same year as him from the same school, and he is a ridiculously sharp engineer who has risen up at Twitter the old-fashioned way. It's painfully obvious that Musk just wants sycophantic yes-men around him and was pissed off that Parag wasn't willing to kiss his ass regarding all his "revolutionary" suggestions.


Can he invert a binary tree though?

