Now, when there are riots, it is well known that the police have human "super-recognizers" who can scan thousands of photos and identify suspect individuals at a glance.
Yet while it is a low-probability event that one of these people has this ability and ends up working for the police, the police also only tend to use said profiler when the event warrants the expense. When a service moves from human-run to technology-run, it generally goes from "I need to do this special thing" to "We should just leave it on and do it all the time, the IT budget takes care of that".
If they don't, that's the best startup idea I've heard on HN in a long time. I mean, it's evil and everything, but you could make a lot of money probably. Government contracts would probably pay out too.
The problem is convincing big store chains to put a new barcode scanner machine in their checkout lines at every location, to do something that they have no trouble doing right now with credit card details, rewards cards, and Bluetooth tracking.
There is actually a lot of research on how catastrophic failures in highly complex systems happen. Here is a brilliant article that summarizes the main findings:
I cannot read that one without thinking of the descriptions and analyses of disasters like the sinking of the RMS Titanic, the Chernobyl disaster, the loss of the Challenger space shuttle, or the Fukushima disaster. Many, many points in the article ring true for all of them.
And Guix has put that in a beautiful form. There are two things which make Guix special:
1. The package definitions are written in a normal, battle-proven, well-defined, general-purpose programming language with good support for a functional style (Scheme).
2. There is no conceptual difference between a package definition in the public Guix system and a self-written package definition that a developer makes to build and test their own package, or to build and run a specific piece of software. The difference is as small as that between using an Emacs package and configuring that package in one's .emacs configuration file.
Mixing types in an array/list is not remotely stupid. How exactly would you suggest expressing the lack of a value in a series of numbers without None/null? There's your mixed types.
Don't let weird dogma get in the way of practicality.
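For what it's worth, typed Python has a standard way to spell that missing-value case: Optional. A minimal sketch (the readings list is just an invented illustration):

    from typing import Optional

    # A series of numbers where some values are simply absent.
    readings: list[Optional[float]] = [21.5, None, 22.1, None, 23.0]

    # Callers handle the gaps explicitly instead of inventing sentinel numbers.
    present = [r for r in readings if r is not None]
    average = sum(present) / len(present)
    print(f"{len(present)} of {len(readings)} values present, average {average:.1f}")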
That is merely a detail of how the typing module has decided to set up convenient aliases. They are still treated by the language (and likely compilers) as different types.
I mean, it depends on what you mean by “type”. A list of some Protocol type (what other languages call “interfaces”) is still a homogeneous list even though the concrete types are heterogeneous. This is almost always what you want when you’re thinking of “a heterogeneous list”.
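A rough sketch of that distinction, assuming a made-up Drawable protocol and two made-up concrete classes:

    from typing import Protocol

    class Drawable(Protocol):
        def draw(self) -> str: ...

    class Circle:
        def draw(self) -> str:
            return "circle"

    class Square:
        def draw(self) -> str:
            return "square"

    # The concrete types differ, but to the type checker the list is homogeneous:
    # every element satisfies Drawable (structurally, no inheritance needed).
    shapes: list[Drawable] = [Circle(), Square()]
    print([s.draw() for s in shapes])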
Yeah, that's a better way of putting it :) 90% of the time I'm using something like Sequence[T], but I'm sure I've used Sequence[Optional[T]] a couple of times. I mean, I could do donuts in the Piggly Wiggly parking lot at 3am, I just don't, and the same goes for heterogeneous lists.
I’ll second that. I’ve been doing Python for a while and haven’t used a mixed-type list. I’ve actively avoided doing something like that. The situation doesn’t come up often.
I just use an extension wrapper around boost::property_tree for my JSON (and XML) needs. It’s way faster than the built-in json support and does automagic type conversion, so I don’t have to worry about it.
Now, I’m not running at Web Scale™, but pure Python was slow enough to be annoying.
Just out of curiosity: why is it possible to compile Common Lisp code (or Scheme, or Clojure) to high-performance native or JIT-compiled code, but not Python? It is said that "Python is too dynamic", but isn't everything in Lisp dynamic, too?
And none of these languages is less powerful than Python, or lacks Unicode support, or whatever, so that can't be the reason.
It is possible to JIT compile Python just fine. There are projects like PyPy that have been doing this for a long time [1]. The reason these alternative projects never take off is that many of Python's most used libraries are written against CPython's C API. This API is giant, and it exposes all of the nitty-gritty implementation details of the CPython interpreter. As a result, changing anything significant about the implementation of the interpreter means those libraries no longer work. In order not to break compatibility with the enormous number of existing packages, the internals of the CPython interpreter are mostly locked in at this point, with little wiggle room for large performance improvements.
The only real way out is to make Python 4 - but given the immense pain of the Python 2 -> 3 transition that seems unlikely.
To be fair, the 2 -> 3 upgrade path was terrible. And there wasn't a killer feature in 3, which was terrible. And the tooling around the upgrade was terrible. Basically the Python devs completely botched it -- which was terrible.
One nice thing about golang is that the Go 1.19 compiler will compile Go 1.1 code just fine, and people can move from 1.1 to 1.19 in their own time -- or not, if they choose not to. It would not be that hard for golang v2 to continue to allow similar compilation of old code.
This hypothetical 3 -> 4 upgrade would run into a lot of the same issues.
Presumably the killer feature here is that it would be faster. Or at least have the potential to be faster because of fewer constraints on the C API. But for a lot of Python applications, speed isn't all that important. After all, if it really needs to be fast, you probably aren't doing it in Python (unless Python is just a thin wrapper around a lot of C code, like NumPy).
And for changes to the C API, it would probably be much, much harder, maybe even impossible, to automate migrating libraries to the new API. The only way I could see this working well is if you had some kind of shim library between the old API and the new API, but that adds constraints on how much you can change the API, and might add additional overhead.
It’s because Python object attributes can change at any time, since they are accessed dynamically. Nothing can be inlined easily. The object structure is pointer-heavy.
As other commenters pointed out, some of these Python features, which go unused 99.99% of the time, could be sacrificed for additional speedup by breaking backwards compatibility.
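As a toy illustration of that dynamism (the Point class here is invented for the example), an attribute access that looks constant can change meaning at runtime, so the interpreter has to do the full dynamic lookup every time:

    class Point:
        def __init__(self, x, y):
            self.x = x
            self.y = y

    p = Point(1, 2)
    print(p.x)  # plain instance attribute: 1

    # The attribute can later be shadowed by a data descriptor on the class,
    # so even this lookup cannot simply be inlined to a fixed memory offset.
    Point.x = property(lambda self: 42)
    print(p.x)  # now resolved through the class property: 42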
The demand for compiled Python hasn't been as high as the demand for other languages, so the number of people who have worked on it is much smaller than the number who have built JITs for ECMAScript and others. Python has long been fast enough for many things, and where it isn't, it's easy to call C code from CPython.
Python does have lesser-used dynamic capabilities that probably don't exist in Common Lisp. Those capabilities make it difficult to optimize arbitrary valid Python code, but most people who need a Python compiler would be happy to make adjustments.
Having worked on this for a while, I think one helpful way to understand it is this: Python JITs (such as the one I work on, Pyston) do in fact make Python code much faster, but the fraction of the time spent in "Python code" is only about 20% to begin with, with the other 80% spent in the language runtime.
For example if you write `l.sort()` where l is a list, we can make it very fast to figure out that you are calling the list_sort() C function. Unfortunately that function is quite slow because every comparison uses a dynamic multiple-dispatch resolution mechanism in order to implement Python semantics.
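You can watch that per-comparison dispatch from Python itself; this toy class (invented for the example) counts how often list.sort() has to call back into __lt__ during a single sort:

    class Noisy:
        calls = 0

        def __init__(self, value):
            self.value = value

        def __lt__(self, other):
            # list.sort() dispatches back into this method for every comparison,
            # instead of comparing machine integers directly.
            Noisy.calls += 1
            return self.value < other.value

    items = [Noisy(v) for v in [5, 3, 8, 1, 9, 2]]
    items.sort()
    print([n.value for n in items], "comparisons:", Noisy.calls)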
> Post covid a lot of that got thrown out, commercial buildings are now generally over shooting OA exchange in the name of not turning these spaces into germ breeding grounds.
I think it has become clear that germ loads can be reduced significantly with filtering and ventilation.
> But the general term, as a verb, is "Lüften" or specifically "Stoßlüften" for the shorter form that is often mandatory for apartments with modern insulation but lacking a proper vent system to prevent mold.
Exactly that.
To add, problems with mold are often due to insufficient insulation in some patches of the wall (like near windows), combined with humid air and insufficient ventilation. A properly insulated house should not form mold. But it is necessary to get humidity out, that's correct.
Not exactly. What you heat is the inside of your rooms, and a significant part of that heat can be felt as infrared radiation. When you ventilate in a quick and intensive manner, you exchange the air, which becomes cold for a few minutes, but you do not cool down the room - because that takes longer. And the cold air has very little thermal mass, so it is easy to reheat, and it will be warmed by the walls etc.
I did that over the last few weeks. I live in Germany and we have a flat that is not very modern but has reasonable insulation. What I learned is that under these conditions, with two people in a 30-square-meter room, CO2 rises so quickly that one has to do intermittent, full ventilation (about 3 to 5 minutes; in Germany it's called "Stosslüften") about every hour to keep CO2 levels below 1500 ppm.
During the night in a smaller room, one can easily get levels of 3000 ppm. That's not dangerous, but it is not recommended for places where people have to work in a concentrated manner, like offices or schools. The limit recommended by the federal office for the environment is 1000 ppm.
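A back-of-the-envelope sketch of why it builds up that fast; the emission rate, ceiling height, and outdoor baseline are assumptions, not measurements from that room:

    # Assumed figures: a resting adult exhales roughly 0.3 L of CO2 per minute;
    # assume a 2.5 m ceiling, ~420 ppm outdoors, and no air exchange at all.
    room_volume_l = 30 * 2.5 * 1000       # 30 m^2 room -> about 75,000 litres
    people = 2
    co2_per_person_l_per_h = 0.3 * 60     # ~18 L/h each
    outdoor_ppm = 420

    added_ppm_per_hour = people * co2_per_person_l_per_h / room_volume_l * 1_000_000
    for hour in range(1, 5):
        print(f"after {hour} h: ~{outdoor_ppm + hour * added_ppm_per_hour:.0f} ppm")
    # Crosses ~1500 ppm within a couple of hours even at rest, and sooner in
    # practice once activity raises the exhaled volume.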
And that build-up of the stuff is a big difference from where we were living before (Edinburgh), where that kind of ventilation was basically unnecessary because no window closed that hermetically (well, we lived in one of those wonderful 140-year-old houses in New Town).
In the end, with good insulation, it becomes increasingly important to have some active ventilation - it is also more economical, because it saves on heating. And that is in fact becoming more widespread in new family houses.