What type of filter do you mean? Unless I'm misunderstanding/missing something, the approach described doesn't go into the details of how coverage is computed. If the input image is only simple lines whose coverage can be correctly computed (don't know how to do this for curves?) then what's missing?
I'd be interested in how feasible complete 2D UIs using dynamically GPU-rendered vector graphics are. I've played with vector rendering in the past, using a pixel shader that more or less implemented the method described in the OP. It could render the Ghostscript tiger at good speeds (single-digit milliseconds at 4K, IIRC), but there is always an overhead to generating vector paths, sampling them into line segments, dispatching them, etc. Building a 2D UI from optimized primitives instead, like axis-aligned rects and rounded rects, will almost always be faster, obviously.
Text rendering typically adds pixel snapping, possibly using a bytecode interpreter, and often adds sub-pixel rendering.
> What type of filter do you mean? […] the approach described doesn’t go into the details of how coverage is computed
This article does clip against a square pixel’s edges, and sums the area of what’s inside without weighting, which is equivalent to a box filter. (A box filter is also what you get if you super-sample the pixel with an infinite number of samples and then use the average value of all the samples.) The problem is that there are cases where this approach can result in visible aliasing, even though it’s an analytic method.
When you want high-quality anti-aliasing, you need to model pixels as soft leaky overlapping blobs, not little squares. Instead of clipping at the pixel edges, you need to clip further away, and weight the middle of the region more than the outer edges. There's no analytic method and no perfect filter, there are just tradeoffs that you have to balance. Often people use filters like Triangle, Lanczos, Mitchell, Gaussian, etc. These all provide better anti-aliasing properties than clipping against a square.
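To make the difference concrete, here is a minimal sketch (mine, not from the article) for a single vertical edge at x = e crossing a unit pixel [0, 1]: box-filter coverage just measures the covered area, while a tent (triangle) filter of radius 1 centred on the pixel reaches into neighbouring pixels and weights the middle more, which is exactly the "clip further away, weight the centre" idea.

```cpp
#include <algorithm>

// Box filter: fraction of the unit pixel [0,1] lying to the right
// of a vertical edge at x = e. This is plain analytic coverage.
double box_coverage(double e) {
    return 1.0 - std::clamp(e, 0.0, 1.0);
}

// Tent (triangle) filter of radius 1 centred on the pixel centre
// (x = 0.5): weight w(t) = 1 - |t| for t = x - 0.5 in [-1, 1].
// Returns the filter's integral over the covered side, normalized
// so full coverage gives 1. Note the support spans x in [-0.5, 1.5],
// i.e. it overlaps the neighbouring pixels.
double tent_coverage(double e) {
    double a = std::clamp(e - 0.5, -1.0, 1.0);
    if (a >= 0.0)
        return (1.0 - a) * (1.0 - a) / 2.0; // only the falling half remains
    return 0.5 - a - a * a / 2.0;           // rising half partially covered
}
```

An edge exactly at the pixel centre gives 0.5 under both filters, but as the edge moves, the tent ramps smoothly across three pixels instead of snapping within one, which is what suppresses the residual aliasing of the box.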
> If the input image is only simple lines whose coverage can be correctly computed (don't know how to do this for curves?) then what's missing?
Computing pixel coverage accurately isn't enough for the best results. Using it as the alpha channel for blending foreground over background colour is the same thing as sampling a box filter applied to the underlying continuous vector image.
But often a box filter isn't ideal.
Pixels on the physical screen have a shape and non-uniform intensity across their surface.
RGB sub-pixels (or other colour basis) are often at different positions, and the perceptual luminance differs between sub-pixels in addition to the non-uniform intensity.
If you don't want to tune rendering for a particular display, there are sometimes still improvements from using a non-box filter.
An alternative is to compute the 2D integral of a filter kernel over the coverage area for each pixel. If the kernel has separate R, G, B components, to account for sub-pixel geometry, then you may require another function to optimise perceptual luminance while minimising colour fringing on detailed geometries.
Gamma correction helps, and fortunately that's easily combined with coverage. For example, slow-rolling titles/credits will shimmer less at the edges if gamma is applied correctly.
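A minimal sketch of what "combined with coverage" means (my illustration, assuming sRGB-encoded colours): convert both colours to linear light, lerp by the coverage value, and re-encode, rather than lerping the gamma-encoded values directly.

```cpp
#include <cmath>

// Standard sRGB transfer functions (piecewise linear/power curves).
double srgb_to_linear(double c) {
    return (c <= 0.04045) ? c / 12.92
                          : std::pow((c + 0.055) / 1.055, 2.4);
}
double linear_to_srgb(double c) {
    return (c <= 0.0031308) ? c * 12.92
                            : 1.055 * std::pow(c, 1.0 / 2.4) - 0.055;
}

// Blend foreground over background by coverage, in linear light,
// per channel. Blending the encoded values directly instead would
// darken the edge and cause the shimmer described above.
double blend_coverage(double fg_srgb, double bg_srgb, double coverage) {
    double fg = srgb_to_linear(fg_srgb);
    double bg = srgb_to_linear(bg_srgb);
    return linear_to_srgb(fg * coverage + bg * (1.0 - coverage));
}
```

For a white-on-black edge at 50% coverage this produces roughly 0.735 in sRGB, not 0.5, which is why naive blending makes half-covered pixels look too dark.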
However, these days with Retina/HiDPI-style displays, these issues are reduced.
For example, macOS removed sub-pixel anti-aliasing from text rendering in recent years, because they expect you to use a Retina display, and they've decided regular whole-pixel coverage anti-aliasing is good enough on those.
In C++, in particular when restricting to a C-like subset, I prefer looking at an expression like
foo->bar.baz
instead of (in Rust and other modern languages that decided to get rid of the distinction)
foo.bar.baz
For example, the former lets me easily see that if I copy foo->bar, I now have a copy of bar (and therefore of baz). In a newer language, it's harder to see whether we are copying a value or a reference.
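A small sketch of what I mean (hypothetical types, just for illustration): the `->` makes the indirection visible, and the assignment is visibly a value copy.

```cpp
struct Bar { int baz = 0; };
struct Foo { Bar bar; };

// `foo->bar` reads a value through a pointer; assigning it to a
// local makes an independent copy, visible right in the syntax.
int copy_then_mutate(Foo* foo) {
    Bar copy = foo->bar;  // copies bar, and therefore baz
    copy.baz = 42;        // does not affect *foo
    return foo->bar.baz;  // original is unchanged
}
```

With uniform `.` syntax, the same two lines could be either a copy or a reference rebinding, depending on the declared types elsewhere.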
I see what you're saying but I'd argue that this is mostly an unnecessary thing to worry about because with the exception of types explicitly opted into being cheaply copyable, you're going to be moving it if you're not accessing it via a reference. The idea is that if you're worried about expensive copies, it shouldn't be possible to copy implicitly in the first place; you'd either explicitly `clone` or you wouldn't be copying it at all.
I'm not worried about expensive copies. I'm worried about being able to understand my systems code. The solution isn't adding more abstractions (like move semantics on top). I don't want to move anything. I want to be clear about taking a reference or making an actual copy, these are deeply, semantically, different. This difference is important for single threaded code but also for concurrency -- not only with mutable data types but also with immutable ones.
Performance is mostly a consequence of clear and direct code. You mostly don't achieve performance by saving individual copies, but by being in control of the code and architecture.
I don't think you run into these things often, because of Rust's ownership enforcement. But I might be misunderstanding you, because it's all pretty abstract and I might not have the whole context.
He wrote this whole game in it. Apart from that, a couple dozen or a few hundred beta testers. Not sure whether the language will ever be released; maybe he's too worried about having to maintain it and not being able to change it anymore.
I haven't played any of these games, but "explains otherwise" seems to be a misrepresentation given that the commenter you linked is saying himself that Blow's game combines ideas and rulesets from several other previous games.
Elsewhere in the arstechnica comments you linked
> But, uh... this isn't a "Linus Torvalds is a jerk" sort of situation. "Controversial" undersells just how outlandish and inappropriate Blow's views are. Blow is a full-bore fascist sympathizer who also doesn't seem to think that women have any role to play in his profession.
What's going on on these platforms? Is there any serious evidence to the strong claims?
It seems that the "covid trutherism" or "spreading covid misinformation" claim is unjustified. Here's Blow's original tweet:
> If a state entity does an oopsie in a lab, then forces its citizens to undergo an experimental treatment because of the oopsie, while suppressing news of side effects, and also denying that the oopsie is anyone's fault ... that's just abusive?
Unfortunately Blow was unwilling to come out and state his position here, relying instead on innuendo, so we have to kind of guess what he was trying to say. I interpret him as making four claims here:
1. The COVID-19 pandemic originated in a lab leak.
2. Some Chinese people were forced to accept experimental vaccinations.
3. The government of the PRC suppressed news of the side effects of the vaccines.
4. The government of the PRC worked to prevent investigations into the cause of the pandemic.
Claim #4 is plainly true; the WHO and several other countries have protested this at great length.
Claim #2 probably depends on your threshold for "experimental" and "forces". https://en.wikipedia.org/w/index.php?title=Sinopharm_BIBP_CO... explains that emergency vaccination was available in China in July 02020, and there are plausible claims that Chinese state employees and students traveling abroad were required to take it. This was before results were in from clinical trials, which I think qualifies for most people's definition of "experimental"; the WHO wouldn't add it to its list of authorized emergency vaccines until May of the next year.
Claim #3 seems almost guaranteed to be true, but I don't have direct evidence. The government of the PRC routinely suppresses news, and there are numerous well-documented instances of this happening in connection with COVID, and there are always some subjects in clinical trials of vaccines who have major health problems such as death which may or may not be caused by the vaccine. BBIBP-CorV seems to have been, in the end, pretty safe, but it seems inconceivable that there weren't at least some reports of people dying or having terrible health problems after receiving it which were deleted from Weibo or other media ("suppressed"), and that these deletions were carried out because of state policy of the PRC.
Claim #1 seems like the most debatable one, but even that isn't an open-and-shut case. At the time, the lab-leak case was fairly weak, and it certainly hasn't been proven, but it hasn't been disproven either; see https://www.astralcodexten.com/p/practically-a-book-review-r... for an extensive summary of the debate. Because of the truth of Claim #4 it seems unlikely that it will ever be disproven.
More generally, I find deplorable the polarization on partisan political grounds of fields like puzzle games, genetics, and quantum physics. Artistic development, understanding the world, and extending technology are necessarily collaborative endeavors, and rejecting Blow's games because he criticizes the Chinese government seems akin to refusing to use the Schrödinger equation because Schrödinger sexually victimized teenage girls.
I think you are taking a very charitable view here - the tweet immediately before the one you quote is clearly talking about the US vaccine mandate (not China).
> There's a weird disconnect in this vaccine mandate debate: many are still pretending that Covid-19 is of natural origin, which gives such mandates a different feel than they otherwise have.
Contrary to your assertion, this is not clearly talking about vaccine mandates in any particular place. And the tweet I quoted previously is claiming (or hinting) that the same "state entity" had caused the pandemic and mandated the "experimental treatment". I'm not familiar with any versions of the lab-leak hypothesis that claimed that covid escaped from a US lab, so I don't think it's a reasonable inference that he's talking about the US vaccine mandate.
On the other hand, he seems to have worked pretty hard to avoid clearly stating any of his positions here, so who knows what he really thinks? Or thought?
The problem with your scenario is that the Chinese government didn't have a covid vaccine mandate in October of '21 (when Blow's tweet was published).
Their covid vaccine program was voluntary up until they tried to establish a mandate in July of '22 (a lot of commentators seem to be confused on this point, as there are mandates for childhood vaccines in China - but these never extended to covid vaccines).
Maybe not for everybody, but I recall hearing that certain military units, students, and government officials were required to get covid vaccinations already in '20. Maybe he heard the same thing?
Ok, I can see the "fascist sympathizer" (though the fascist is Trump, not Mussolini or Hitler, so it's presumably not such a minority opinion in the US overall). But "doesn't seem to think that women have any role to play in his profession" doesn't seem substantiated from those links, unless I'm missing something here? Women being less interested in programming, according to him, is completely orthogonal to whether he thinks they should play a role.
Can you share why these statements are controversial?
They might be misguided or misinformed, but the underlying fact is that women are not as well represented in STEM. Just because the reason is more likely to be misogyny than any biological inclination doesn't make it an outrageous statement, in my opinion.
The difference in participation in STEM between men and women is not well explained by biological differences. Blow has repeatedly claimed that it is actually the primary factor and seems actively uninterested in other explanations.
This is "controversial" in that it's a position that is not well supported by evidence and he has repeatedly used his platform in the past to make unsupported claims to the contrary.
Is the opposite explained? I haven't read the literature on the topic, and as a layman I'm also somewhat of a sceptic of science on such topics. But it seems super obvious that girls/women on average don't want to spend their teenage years in the basement programming geek stuff, like many boys/men do. In my experience, here in Germany, and you can probably extrapolate to the West in general, it's not like girls aren't encouraged to pursue programming or science. Men are, on average, just more willing to put in the hours of social neglect in order to become good at such things as programming, or gaming, or whatever other fringe unsocial hobby. A big part of that is probably competitiveness, but I also believe there are more loners among men. Again, this is not scientific, just personal observations, plus ideas I've picked up that I can agree with. I'm not even saying that it must be mostly for biological reasons (though I assume it is), just that there is a deeper reason for there being fewer girls in tech than just "there is patriarchy and power structures and misogynist gatekeeping and shit".
Never forget that the social neglect is not exactly healthy, and programming isn't actually that prestigious and externally rewarding, except for maybe the compensation that you can currently earn in some places.
Adding that for example in math or other sciences, we are much closer to gender parity.
Given the success of women in fields such as ultra-marathoning, medicine, etc., I don't think it is that conclusive that women are not willing to put the hours into difficult and isolating activities.
There are a great number of studies of the social aspects of gender differences in work but I don't have a single authoritative source for you.
> Men are, on average, just more willing to put in the hours of social neglect in order to become good at such things as programming, or also gaming, or whatever other fringe unsocial hobby.
It is much easier to put in the hours of gaming when you're not repeatedly receiving rape threats, or having someone try to stalk you, or facing similar aggressive behaviors directed at people perceived as female in these spaces. I pretended to be a woman in gaming spaces for some time just to see if these women had a point, and the level of harassment I experienced was way more than even my most unmoderated CoD Xbox days. All it takes is a simple voice modulator in chat.
Point taken. I do think that it can be challenging to be a rare female amongst males (it would probably be similar the other way around). But the biggest contributing factor for such behaviours is certainly the anonymity of online gaming.
For all I know, being a male programmer myself, with a significant proportion of women in all my programmer circles so far, I can attest to the exact opposite. Every one of those circles has been welcoming and inclusive.
I don't think he's said exactly that in his own words but I think on balance it's fair to say he doesn't seem welcoming about it.
He clearly has right-leaning and libertarian views, and seems to be not very articulate or sensitive in how he discusses them, so I can see why people might read into that more than maybe they should.
Thekla currently has 10 core permanent employees. 5 of them are women, including their studio manager, creative and art director, a programmer, and 2 additional artists.
You can say whatever the hell you want. Or you could spend 3 minutes actually looking at public information to see if you're wrong.
Half his employees are women—including leadership, programming, and creative roles. If that doesn’t count as “thinking women have a role,” what would? 51%? 90%?
You’re relying on blatant social media mischaracterizations over real actions.
He actually employs women at parity. You feel like this is unwelcoming.
One of those statements is data. The other is fanfic.
You said, "I don't think he's said exactly that in his own words but..." That's implicitly saying, "well, he hasn't admitted it outright, but yeah, he basically believes it."
Now, faced with evidence contrary to your beliefs, you're claiming you didn't say that. When presented with proof, it's OK to just admit that you were wrong.
Am I supposed to be embarrassed for defending someone against a baseless smear?
Anyway, call it "defensive" all you want. It doesn't change the historical thread: You argued, at best, his views made the workplace unwelcoming; the data shows he hires women at parity. You're just backpedaling because the reality didn't match your narrative vibes.
Happily accusing without evidence? Not shocking behavior. What's shocking is to just say it out loud. LMAO. Funny how "believe women" stops applying when their choices contradict your priors.
When you say "not well supported by evidence," you're either wrong, anti-science, or lying. Numerous studies absolutely show very large average differences in interests based on sex. And those carry over into occupation preferences. Just one more recent study:
Plus: Jon never said it's the "primary" factor, as you claim. He said it's a large factor, that doesn't apply at the individual level, but on average. Which is entirely factual and supported by copious amounts of research.
Just because people like you want to be offended by science, doesn't make it wrong, or controversial.
This study confirms that there is a gender difference but it doesn't explain why. I didn't claim that there were not differences, but that they were not well explained by biology.
Sex is the strongest single predictor of vocational interest orientation we’ve found. Nothing else comes close. If that’s not ‘explained by biology,’ you need to tell me what would be. Otherwise you’re operating on faith.
It's hard to control for social conditioning. I don't need to be able to tell you what the alternative is to be able to tell you that there are many confounding factors.
Knowing what does not explain something, doesn't tell you what does explain it.
They did try to account for social conditioning: parents' education and jobs, local labor markets, school performance, the whole bit. The gap still didn't move much. If socialization were the main driver, you'd expect the most egalitarian countries to have the smallest gaps. They don't. In a lot of cases it's the opposite. Sweden, for example, shows bigger differences in occupational preferences than places like Pakistan.
So at that point you're not pointing to a specific confounder, you're basically saying "maybe there's something else." Sure, logically you can always say that. But if the evidence keeps stacking up in one direction and the only reply is "could be something," that's just refusing to update your view.
Congrats! You've made your position unfalsifiable.
When the data consistently shows gaps widening as social strictures loosen, and your response is to blame an invisible, unmeasurable "conditioning," you aren't doing science at all. But you are insulating your belief from any possible counter-evidence.
No I'm just clear that the current state of science makes it impossible to draw the conclusion that you are.
Note that this outcome goes both ways. We can neither confirm that biology is the main driver nor confirm that it isn't. Life is not as certain as you want it to be.
They're not contradictory in a vacuum. But in this sequence, they show you're backpedaling. You opened with a firm claim, and when confronted with actual data, you retreated to 'we can't know.' Pretending that perfect certainty is required here is just a dodge.
Well, no, you're the one that is "wrong, anti-science, or lying".
The very first sentence of the article you linked to says, "Occupational choices remain strongly segregated by gender, for reasons not yet fully understood."
So claiming that it's for biological reasons is bullshit. You have no idea whether it is or not. And neither does Blow.
AFAIK there are differences established on many psychological axes that are more basic than "occupational choice", such as competitiveness, neuroticism, interest in things vs human relations, and others. I don't understand these deeply but you can research for yourself, so there is certainly no shortage of possible explanations based on those.
Well, you "haven't read literature on the topic"[1] so maybe leave the speculation at the door or go out and read some literature to cite rather than presenting "ideas [you]'ve picked up that [you] can agree with" as "established"?
I've been very clear that I'm a layman, as are certainly most of the commenters here. I qualified using "AFAIK", and I've heard this on different occasions from people who have actual experience in the field. You can find similar claims on this page, partly backed by links. For example, I too have heard about studies evidencing that gender differences are more stark in developed countries with well-functioning social systems, where people are freer to choose their profession based on personal interest rather than, for example, economic aspects.
LOL. You're going to dismiss the study because of the justification for doing the study. Here, let me help you understand:
"not fully understood" -> "so we studied it" -> "here's what we found"
Besides that obvious point, the sentence you quoted says "not yet fully understood," not "we have no idea." Those aren't the same thing. We actually have substantial evidence pointing in a clear direction.
- The most egalitarian countries show the largest gaps, not the smallest.
- Women exposed to elevated androgens in utero become more things-oriented despite being raised normally as girls.
- Male and female monkeys show the same toy preferences we do. Nobody's socializing rhesus monkeys into gender roles.
- A 1.28 standard deviation gap in every culture that emerges in infancy and grows as societies get freer is not what socialization looks like.
You're treating "not fully understood" as "both hypotheses are equally supported."
They aren't.
The evidence overwhelmingly favors a substantial biological component. Just because you don't like the implications of that, doesn't make it false.
That study found that when you test 14 monkeys alone in cages where they can’t actually move the toys, you don’t see the same sex differences as when 135 monkeys are tested in social groups with freely movable toys.
The authors themselves say the social context may be necessary for expression. That’s not evidence against biological contribution, but evidence that behavior requires context to manifest.
You don’t disprove hunger by noting that people don’t eat when there’s no food available.
2) An explanation of this needs to account for a great and rapid shift in favor of women, as far as proportion-of-practitioners, that was happening at exactly the same time as the opposite shift in programming, in both law and medicine.
I don’t know what the actual reason is but “it got prestigious so women got pushed out” makes no sense to me, based on the timeline of events in full context. It was very much not prestigious in the ‘80s and early ‘90s, certainly far less so than law and medicine at that time (still isn’t as prestigious as those, outside tech circles—you can see it in people’s faces. It’s high-paid but lower-“class” than those, to this day)
The traditional way I heard it wasn't that it was about prestige, but rather that programming became engineering-coded rather than humanities-coded. And misogyny did play a role there; one of the Turing movies had a great storyline about it, although I can't remember the name offhand.
Related, I think math went through a similar transition.
It completely neglects the actual history of the field of computing, even just the 20th century, where the field was filled with women.
Something interesting that I think a lot of younger people don't appreciate: back in the day, unless your name was Hemingway, it was considered unmanly to touch a keyboard. Anything that involved a typewriter or anything else with a keyboard was distaff by definition, just so much secretarial work. Maybe a journalist's job, if you were feeling generous.
Sounds stupid as hell, and it was, but that's a big reason why women played an outsized part in the growth of computing. First as the 'calculators' in WWII, then as Baudot terminals started to take over, as keyboard operators.
Don't make the mistake of assuming they were all Grace Hoppers or Margaret Hamiltons or Adele Goldbergs, because that simply wasn't the case. Many of them might have been, though, in a less stereotype-driven world.
I would be very surprised if this connotation was intentional of him. His name was "Naysayer88" for a long time, and I had wondered as well where that 88 came from -- maybe it was a rhyme on "Naysayer", which (ignoring the number) is an apt description of his ways and approaches. At some point he changed the name. I assumed the reason was he had gotten aware of the connotation.
Blow is an odd duck and clearly following a political descent into fascism after his SV tech-bro heroes. But his political descent occurred after he started Twitch streaming, and as much as he bootlicks Musk (so I can see him defending that, if that's what you're referring to), I don't think it's credible that he would support Hitler.
I'm thinking he changed the name when too many people had gotten aware of the connotation.
You have to twist logic pretty fucking hard to find a reason for him to put 88 in his username. He's a guy who thinks he's way more clever than he is and gets upset when it gets pointed out to him.
What's the point comparing the sympathy to that of Mussolini or Hitler but qualifying it as not a minority position? Those two had even greater domestic support.
> though the fascist is Trump, not Mussolini or Hitler, so it's presumably not such a minority opinion in the US overall
Does that make a difference? You could levy the exact same argument about the other two in their respective countries in their respective times. Doesn’t make it OK.
It is OK in the sense that these are not fringe opinions; they are part of the mainstream political discourse that, as a serious person, you cannot effectively dismiss by throwing around certain bad words like fascist.
Neither was slavery. Was that OK too? And to clarify (though it’s worrying this point needs to be made), I mean morally.
> throwing around certain bad words like fascist
Fascism has a very clear definition. It describes a particular set of behaviours and actions, all of which you can compare to reality and determine if it’s happening or not. It’s an objective word. If anyone is trying to “dismiss” anything, it’s the people pretending it’s subjective because they support its outcome.
> Neither was slavery. Was that OK too? And to clarify (though it’s worrying this point needs to be made), I mean morally.
It may well have been morally OK to most people (see: moral relativism), and since you're implying it wouldn't have been OK to you, it's worth pointing out that you probably wouldn't have done anything about it in the relevant time periods.
If you're an American you don't even need to try that hard to make moral relativism visceral: was the displacement (and far worse) of Native American tribes "OK"? I'd say no, but it isn't morally urgent enough to me or the 99%+ of Americans who are unwilling to pack their bags and return the entirety of two continents to the native descendants.
The term "fascist" is definitely being thrown around like it was nothing, for the most unnewsworthy opinions or statements. There are definitely people who would call anyone a fascist who dared to claim that there might be differences between the sexes on average, for example. Doing so probably has a fascist element itself (not accepting different opinions). It's also unreasonable, and let me say _ridiculous_, to even doubt that there are certain differences. To be clear, it's of course not right to make any prescriptions about what any specific member of a sex should or could do -- but that's a completely different thing.
"There are differences between men and women" isn't a fascist-coded statement because of the statement itself - it's obviously a true statement no matter what you believe. It's fascist-coded because it's almost exclusively said by fascists, for reasons that have not much to do with the statement itself.
Why is that? IMO it's because fascist slogans always tend to drift away from their actual meaning, towards things that are socially acceptable to say.
Back in Hitler's time, Hitler didn't give speeches about "Let's kill all the Jews" - he'd rather give speeches about "Let's clean up Germany" even though he clearly wanted to kill all the Jews. When Hitler says "Let's clean up Germany" and the crowd goes wild, you know they're going wild because they're wild about the idea of killing the Jews, not because they're wild about the idea of mopping the floor. At least I assume you would know that now, with the benefit of hindsight. You'd have to be living under a rock not to.
And that's not a euphemism for "Let's kill all the Jews" specifically. It's a general euphemism for all the bad things he wanted to do with all the people. It's not like there's one euphemism for "Let's kill the Jews" and a different euphemism for "Let's gas the Jews" and a different euphemism for "Let's kill the gays". It's more like all the euphemisms point to all of the underlying true thoughts, all at once. One loose region of semantic space points to another loose region of semantic space.
You can see how Hitler could have started out saying what he actually meant, but to avoid scrutiny he'd drift towards more innocuous words, but anyone who's been following his whole campaign would know what was meant. It's a bit like Cockney rhyming slang - the pointer drifts until it has no surface-level relation to the pointee, but just because it's not surface-level obvious, doesn't mean it's unknowable (as people who pretend not to recognize the statements often claim).
And if I'm in Germany in 1932 and I'm following politics, and my friend says "I support cleaning up Germany" I'm going to do a double-take. I'm going to suspect he's not talking about mopping the floor and picking up litter. Though, if I'm in Germany in 1932 and I'm ignoring politics, I might reasonably assume that he is talking about those things and get quite confused why my other friend thinks he's a fascist.
In modern fascist dialogue, "men and women have differences" is a pointer to the semantic space containing statements like "women belong in the kitchen", which itself is a pointer to the semantic space containing statements like "women should do what men tell them". You can see how this came about: saying "women should do what men tell them" would be unpopular, so fascists justified it with logic like "well, women are biologically submissive and men are biologically dominant", and over time it got watered down to stuff like "men are biologically different from women".
I for one have said that sentence you're discussing a lot, and you'll just have to take my word that I'm far from being a fascist. I even draw conclusions from that sentence, but I'm trying hard to not draw any conclusions about specific members of any given sex.
I of course get where you're coming from, but don't you think it is intellectually dishonest to try and police certain "obviously true statements"? Isn't it similar to banning kitchen knives because they can be used to kill? Doesn't it put under suspicion a lot of people who are simply following their intellectual curiosity?
I would argue that the ideas you seem to be advertising can lead to similar societal catastrophes as the ones you're trying to prevent from reoccurring.
For sure, the misguided idea that men and women are absolutely, 100% the same, and that any outcome other than an equal distribution between males and females means there must be misogyny and patriarchy at work (which I'm not saying you're proclaiming directly), has led to a lot of real problems in the past decades. And that includes aggressive propaganda against males in general, and against some actually valuable male virtues as well as female virtues, in some circles.
Just ask yourself what you'd do if you were living in the early days of Hitler and someone said Germany needed to be cleaned up. This analogy seems to answer several of your questions.
Or if someone says "make America great again", today. I mean who doesn't want America to be great?
I deny this rhetoric; you can use it to justify all kinds of wrongs. I can't tell you what I'd have done if I were living in the early days of Hitler, because I don't have that context, while I do have the hindsight. Comparing Hitler with the current US administration seems a bit of a stretch to me, even though I have strong disagreements with some of the things that Trump/MAGA are doing (or _seem_ to be doing. At this point it's hard to trust anything anymore). On the other hand, there's a serious question to be asked: hadn't we already been on a descent into madness for more than a decade before the current administration?
> Neither was slavery. Was that OK too? And to clarify (though it’s worrying this point needs to be made), I mean morally.
From the perspective of a pre-abolitionist society, it evidently was, but that's not a political issue you're gonna have to deal with in 2025. Consider yourself lucky.
> Fascism has a very clear definition.
First of all, that isn't true. Secondly, even if it were true, it wouldn't matter. You are using the word as a thought-terminating cliché. That doesn't work in the long run; you'll just get ignored. As a result, you can pat yourself on the back for calling out fascism while all the behaviors and actions that you believe to be fascist are mainstreamed and affecting people's lives. If I were you, I'd be more worried about criticizing those behaviors and actions on their merits (or lack thereof), rather than trying to tie them to some textbook definition of fascism and dismissing them wholesale.
All I can say to you is that the nonchalance with which you throw around words like slavery or fascism is gonna do nothing but get your bozo bit flipped. It is not going to help any cause you may care about, valid and righteous as it may be.
Isn’t this just telling on yourself though? If you’ll flip the “bozo bit” over mere aesthetics of word choice you’re probably not a serious person to begin with.
I don't think it's merely an "aesthetic choice" when it comes to words like slavery or fascism, but even then: aesthetics matter. We all know the guy that always speaks in hyperbole. We learn to not take anything he says seriously.
The reason the advice is "do not flip the bozo bit" is because the default is to flip it. It's what people do naturally. If you're running around getting bozo bits flipped, you better know what you're doing.
> You are using the word as a thought-terminating cliché.
Of course I’m not, I barely use the word. Pay attention to the person you’re replying to. What you’re doing is putting me in a box of other people you’ve seen online and making a bunch of wrong assumptions. You’re not engaging with the arguments, you’re fighting against a straw man in your imagination.
> I sincerely doubt the slaves would agree with you. Just because one group was economically and societally OK with it, doesn’t make it morally OK.
That is wrong, slaves were happy to be alive instead of killed in most societies. It wasn't "slavery or freedom" it was "slavery or death" in most cases. America is an exception there, but in most areas with slavery it was done to criminals that otherwise would have gotten the death penalty.
Christianity forbade enslaving Christians, so we just killed our criminals for the past thousand years, but before Christianity we practiced slavery as punishment for crime everywhere, as people thought that was better than killing them.
That is complete nonsense. Where did you get that from? You really think most slaves were criminals? In what culture did that ever happen (apart from the modern USA)?
> I sincerely doubt the slaves would agree with you.
I sincerely doubt a vegan would agree that eating meat is OK, but as a society, we agree that eating meat is OK. It might not be OK tomorrow, it might not be OK by some moral standard, but that's beside my point.
> That’s a really strange comment. What does that mean?
It means fighting for abolition then was a much tougher fight than the fight you have today.
> Of course I’m not, I barely use the word.
I may have misinterpreted your position to the effect of "look in the textbook, Trump is a fascist by definition". Indeed, I have seen "other people online" argue to that effect, and they weren't made of straw. If that's not the case, I apologize, but the point stands even if you're not the kind of person it should be aimed at.
> From the perspective of a pre-abolitionist society, it evidently was, but that's not a political issue you're gonna have to deal with in 2025. Consider yourself lucky.
...do you not also consider yourself lucky about this? Weird phrasing.
Yes it does, when you live in Europe and have listened to your late grandparents talk about the war. In Europe, "fascist" actually still has some weight to it, and it doesn't get thrown around as casually as in the US, yet. Same story with the word "communist"...
I think we're all perfectly capable of following links and drawing our own conclusions. They are links to secondary sources mostly because Blow is notoriously unwilling to step outside of his Twitter bubble, and no one wants to link to that anymore.
What is the good faith way to link to "(It doesn't help that all males currently under the age of 40 were raised to be supercucks.)"? The link exists in the post but you object to that link as a bad faith way to link. So what is a good faith way to link to this tweet?
The subredditdrama post in question does in fact contain a link to the full tweet, which you objected to as bad faith. So I'm asking what is a good faith way to link to this tweet.
I don't think "Drama" implies which side of said drama is in the right. That the drama surrounds a bunch of Blow's public statements is maybe the one thing everyone can agree on.
That community has no oversight for what gets posted. It's a free-for-all for anyone to gather (read: cherrypick) low quality information and present it in an overtly sensationalist way and intentionally misrepresent what they quote.
They have no standards, no oversight, no formal methodology, so naturally it attracts gossip-oriented people who want to stir up drama for fun.
Why link you to the handful of individual links directly when you clearly can identify and sort through the source yourself? Poisoning the well clearly wouldn't work on you. Well, here the links are:
"This is true, the gaming press is super left-wing, but on the other hand they have almost no impact now. I would say that the social pressure keeping "indies" in line mostly comes from them being socially fearful in the normal way. (It doesn't help that all males currently under the age of 40 were raised to be supercucks.)"
https://x.com/Jonathan_Blow/status/1854708962462982465
"Interest is not the same as ability. I believe it is likely that the sexes have different interests on average, and that biological factors play a large part in this. This is *NOT REMOTELY* a controversial opinion except on Weird Far Left Twitter 2017."
https://pbs.twimg.com/media/DRT4vNEUIAEJgP3.jpg
"There's a weird disconnect in this vaccine mandate debate: many are still pretending that Covid-19 is of natural origin, which gives such mandates a different feel than they otherwise have."
https://x.com/Jonathan_Blow/status/1447601578123296769
Alright, I don't agree with half of what he said here, but really? Is that supposed to make him look like some irredeemably bad person?
Are we seriously going to pretend that men and women—on average—do not differ in their general interests, and furthermore get mad at people for pointing that out?
And I'm not fond of the current administration, but it's a bit extreme to write someone off as a person for liking the current president. You would be writing off literally half of the entire country, and no, that's not something to feel virtuous about, that's just nonsense.
Frankly I think I would rather have a conversation with someone like him instead of someone who would get disproportionately upset at those points.
I opened it for you. It's basically the same problem with Notch or JK Rowling and it's backed up by credible sources. He said women don't like programming because of biology; he said the USA made COVID-19 in a lab and he opposed the vaccinations for it; he said Donald Trump is the best president of his life; he supports the new Facebook rule where you're allowed to post misinformation.
There's clearly something about making a successful game (or book) that just makes you completely lose touch with reality after that.
I've been watching Blow work on his compiler and game for many years. He has gone off the deep end in his sympathies for Trump and Trump adjacents, but misogyny I've never witnessed from him.
I think he is the latest victim of the Notch-Rowling slide into rightism. It happens when relatively benign conservatives have opinions that get the internet mob riled up; the mob bullies them, cancels them, and thus makes them dig deeper into their rightist beliefs and move more and more into hating said mob, extending that hate to the people the mob pretends to represent, etc. It's a bit sad really. I hope he'll come out of it some day, but in my experience he doesn't have the humility of accepting when he's wrong.
I think your general idea is right, it sounds reasonable that the insane cancelation mania can bring some conservatives to dig into deeper holes. It is probably what enabled the recent right shift in politics. As to Blow specifically, I've watched his streams quite a bit. I've always had sympathy for him and have been able to relate to his opinions a lot (about software in particular). But I can see how some other people could take offense from the way he's presented his stances.
I say that as someone who once made him angry myself when I live-commented in one of his streams because I had a rare disagreement. I was maybe not in shock but at least startled by his reaction. I had presented my disagreement relatively casually.
Now, my impression is that he's toned it down considerably and developed a more well-meaning stance on things over the years. Recently I've found him more on the side of "here's how most people are doing this, I don't like this, maybe I don't think it's sustainable or how you get good results, but anyway here's what I like to do instead, make of it what you want".
I'm not talking about his words on technical stuff, I'm talking about him being so pleased with the state of US today. Somehow in Blow's mind what Trump and his handlers do to the country is the best thing ever.
I'm not a US citizen, but being enthusiastic about other people losing their freedom and freedoms is obscene.
There's also just a lot of "No, no, no, I kill the bus driver". A sort of "Greater Fool theory" but for genocide, everybody else is a useful idiot who, having supported your rise, is then next in line to be sacrificed, never for a moment remembering that even if you are the only person to have thought of this - which is unlikely - everybody who understands how this actually works will have been together against you from the outset.
Misogyny is a subset of supporting trump. If you've seen him go off the deep end on supporting trump then you are witnessing his misogyny, even if you ignore his other comments.
> he doesn't have the humility of accepting when he's wrong
Isn't he pretty far on the autistic spectrum? It can be very difficult for that kind of personality to re-evaluate something, once they think they have reached a "logical conclusion".
I'm not making excuses, just agreeing that the chances of him changing seem low.
I don't know, but I doubt it. He's too well adjusted at being social (his hobbies have him interact with people on the regular, and he's streaming on twitch, and doing public speaking at conferences) for me to think that.
You are missing their point. They are saying they start with relatively benign views, and the intense overreaction to those views drives them to support much more extreme views, like what you are describing, that they otherwise might not have.
I can't speak for Blow, but that definitely seems to accurately describe the arc Rowling has taken over the last 7-8 years.
> but that definitely seems to accurately describe the arc Rowling has taken over the last 7-8 years.
What a bizarre time we are living in when "men aren't women" and "women should have single-sex spaces and rape crisis centres" are considered extreme views.
Women who insist that they specifically get to decide who is or isn't a woman and what women believe aren't new. Phyllis Schlafly managed to ensure the Equal Rights Amendment didn't pass on this same basis. Phyllis would fly from city to city, addressing crowds of women to tell them that women should be at home looking after their kids, not um, flying from city to city making political addresses like she did...
Beware anyone who claims to represent "all" of some large diverse group, such as "Women" or "Floridians".
"Women should have single sex spaces" turns out to be used to justify, "It's OK to be hateful and even violent against women in these spaces so long as your excuse is that you believe they're not actually women" which is bullshit.
Years ago, when I wasn't too tired to spend all day and half the night dancing, I went to Bang Face Weekender - basically imagine a huge multi-room club night except for days and days. I keep the socials for it available because hey, it's a nice memory. This sort of "Single sex spaces" bullshit caused a problem for the last-but-one Bang Face because a new-to-this Security outfit somehow decided it's their job to go remove people who in their view weren't women from a toilet for women. These women weren't causing any problems for anybody else, but because they presumably had the wrong genitals or for some other reason were "suspect" to that Security team, Security dragged them out of a toilet cubicle and threw them out of the site. Other clubbers were of course horrified, and the event runners had to apologise to everybody - because regardless of how many X chromosomes you have, or whether you do or don't have a womb, dragging people out of the toilets because you've got weird ideas about what is or isn't a woman is batshit.
Phyllis Schlafly is an odd comparison to make. She argued that women should stay in traditional roles and out of public life (while as you mention, not following her own advice), whereas JKR and other feminists take the exact opposite view. Not sure I see the relevance of your analogy here.
As for Bang Face last year, what happened is that security staff kicked a group of males out from the women's toilets. I agree that this isn't an ideal outcome, much better would have been if these men had respected that women's spaces are not for them, and stayed out in the first place. The fact that their removal was treated as some sort of scandal shows how far we've lost sight of the rights of women and girls to have single-sex provisions.
So, you absolutely agree with Phyllis, that one woman somehow gets to decide who is or isn't a woman and what all women believe.
And yet this fact about your belief makes you so uncomfortable that you find yourself trying to pretend that somehow it's the opposite of what you believe.
I think this is letting people off the hook. We're talking about adults in their 40s and 50s here. When people like that 'suddenly' endorse extreme views it's because they had held them back and feel enabled to say them now, an adult isn't going to become an extremist because someone was mean to them online.
I'm 20 years younger than Blow and even at my age I can tell I'm settled enough psychologically that adopting radically different views would require a lot of internal effort. Views don't exist in a vacuum; to believe radical things you have to radically alter all the other things you believe. I really don't think we should treat people like this like children without agency.
Thank you for saying this. In particular people are often already on a journey of self radicalisation so blaming people reacting to their views for radicalising them further is seeking to soft soap that. On top of which the people reacting are often framed as “going too far” and thus becoming more radical is the only natural reaction. It removes all agency and generally I think is mostly deployed by people that agree already with the radical views but are too scared to say so.
I am not missing their point at all, you are missing mine.
>drives them to support much more extreme views, like what you are describing, that they otherwise might not have.
The view I mentioned was the one that got Notch (one of the public figures mentioned by GP) the reaction from the internet in the first place. A bit disingenuous to say this was a moderate conservative talking point before he got sent spiraling into a far right abyss by an angry progressive mob.
I am not an expert on Notch's slide into craziness, but I'd argue that the episode you mention might not be the start. His start was as an "anti-SJW" game developer, which got him hated and vilified by his former fans.
I'm not saying these people were rays of sunshine before, I'm saying they could be talked to without them foaming at the mouth and you face palming at how unhinged they were. I was using the meaning of benign attached to tumors.
What are we going to do about those hate mobs in our societies in Western high culture who are so intolerant, intransigent and violent that they radicalise the moderates? I fear for the future. Any good ideas?
I think you identify the cycle of radicalization correctly but only on a specific side.
There are people in this thread comparing Trump to Hitler. I don't think Trump is the US finest president but those of my family who weren't slaves for the Germans were slaughtered.
The fact that people throw around comparisons that are false on some massive scale, and that it's completely normalized, is an example of why losing touch with reality is not only a problem of the right.
I'm not sure what you're claiming in here. Is it that deporting immigrants, and taking rights from women is as bad as trying to get billionaires to pay more taxes and reducing systemic societal biases?
> What's going on on these platforms? Is there any serious evidence to the strong claims?
The second paragraph in the submitted article has a link to the women claim. I hadn’t seen it before. I have also never personally seen any overt fascist sympathising but then again I don’t follow Blow closely. From what I’ve seen from him, though, doesn’t seem hard to believe. He has very strong opinions on a lot of things he knows little about (and belittles those who disagree with his uninformed opinion), is enamoured with Elon Musk, and is always going on (dismissively, divisively, and dehumanisingly) about “The Left”.
He also makes very poor, obviously fallacious arguments filled with bad faith assumptions. He believes in God and (if I recall correctly) his justification was (paraphrasing) "a lot of smart people are not atheists" (weasel words, appeal to authority), then went on to rant about "Reddit atheism" (ad hominem) or whatever. That was on his own stream, by the way, so no chance it was taken out of context when I saw it.
This claim about women [1]? Calling that "doesn't seem to think that women have any role to play in his profession" seems like a wild misquote bordering on slander. His statement is essentially "women might have the same ability but are for biological reasons on average less interested in programming". Which is a statement I don't agree with at all, but also a statement that doesn't make any claims about the role women should play or could play, and he repeatedly states that he is talking about statistics and averages, not all women.
That's the same thing as happened to James Damore, who is, in my view, a harmless guy (even nice) and whoever cancelled him or is unable to acknowledge he had a point is much closer to fascism. I don't like throwing that term but just to return it.
It _boggles_ my mind that someone might find it controversial that there are on average differences between the sexes in terms of behaviour and interests. And to throw extremely strong accusations like "fascist" for a totally reasonable assumption or observation like that, I don't have words for that, I think those people have been smoking too much pot.
> there are on average differences between the sexes in terms of behaviour and interests. And to throw extremely strong accusations like "fascist" for a totally reasonable assumption or observation like that
That’s not why they’re calling him fascist, but because of things like being a Trump supporter. You’re conflating arguments.
It might be the _logically_ correct interpretation that these are separate things. Now let's talk about rhetoric. Why are two unrelated, heavy accusations combined in a single sentence? Then consider that the added accusation (misogynist) doesn't hold water even on a logical level (let alone the bad faith involved here); it is a crass misreading of the evidence that was brought up for it.
I'm not scrutinizing Blow's words here. One must be extremely careful when calling anyone fascist or similar labels. The burden of proof is on the accuser, not on the accused. It's obviously right to demand precision from the accuser, and to interpret whatever the accused said in good faith.
That is a claim I neither made nor defended, I merely pointed the asker to the information they requested in the article to let them decide for themselves.
I even explicitly said I never encountered that claim before. As such, I’m not going to do very stupid armchair expert thing I’m criticising and comment on it. The points I made are on the things I know and reflected on, not on superficial information received three minutes ago.
> He has very strong opinions on a lot of things he knows little about (and belittles those who disagree with his uninformed opinion) (...)
I'm impressed with how well you summarized my thoughts about him. I vaguely recall having this impression about him after I read his technical article (can't remember the topic) and decided that I don't think I need to read more from someone that comes through as an asshole. This was around the time The Witness came out, I'm quite happy that I didn't have to witness (hah!) what sounds like his further slide into the madness.
When you support political leaders that push fascist discourse where regular people that happen to have more empathy for their fellow man are presented as the enemy - in Hegseth's book the call to arms against them is literally in the first paragraph - I think it stops being about not far enough left, but about being way too far right.
I said nothing about "half the US", and nazi is just your projection I think. But I'd like to know: are you disagreeing with me that the "us vs. them", where "them" is minorities, women, liberals, is in fact one of the utmost fascist tenets?
With the risk of being a pedant, I think that even at the time that Trump got elected, the validity of saying he was supported by a majority of Americans would have been questionable. Today, I'm positive that it's wrong.
But please, answer my question: do you disagree that the discourse of Trump's administration, where immigrants and minorities are "the enemy" and every measure is allowed against them, is not fascism?
To quote one of their golden boys, Pete Hegseth, in his book's *first* chapter:
> The other side—the Left—is not our friend. We are not “esteemed colleagues,” nor mere political opponents. We are foes. Either we win, or they win—we agree on nothing else.
> The United States has the top economy and military in the world, but our cultural and educational institutions—America’s soul—have succumbed to leftist rot.
> our cultural and educational institutions—America’s soul—have succumbed to leftist rot
Sure, let's examine this. Do you disagree that most organisations are extremely dominated by the left? Something like 90% of people in academia, media, schools, (until recently) corporate leadership, various government institutions etc vote democrat. Do you disagree that in the past 20 years or so, the right has been heavily censored online and in the work place by the left? These are all facts, he is not wrong here. When one side has spent 20 years pushing out the other, taking over institutions, censoring them and calling them fascist/nazi, don't be surprised when they are viewed as the enemy.
I also know exactly what you're thinking, the reasoning you use to justify this:
1. It's not censorship, it's preventing disinformation and "hate". This argument doesn't hold when "disinformation" is political opinions of roughly 50% of the country
2. Academia and institutions lean left because Republicans are simply less intelligent than Democrats. "Truth has a liberal bias". You think this kind of arrogance from the left is conducive to a good dialogue and friendly relations?
Even if Trump were quite literally a Nazi, he is the elected President. Democracy is important. I don't know how one can simultaneously believe in democracy and believe that everyone who voted for the winning candidate is objectively incorrect. If most voters want to gas the Jews, that is just the will of the people, and that's terrible, but in that scenario you need to pick between a democracy and some other form of government that suppresses the will of the people.
I am of the opinion that Trump is nowhere near bad enough to choose the latter option; we should preserve democracy I think and allow that the majority of voters are not wrong or "too far" right. Yet a whole lot of people seem to be of the opposite opinion.
Tell me what happened to democracy when Hitler took power? And how democratic was the overall process? So was the decision to commit mass murder of millions of people really the democratic will of the people?
It’s like people haven’t even touched a history book sometimes.
You can also look at the parallels to Trump and his continued assault on the democratic norms in the US government. For example assuming powers that are those of Congress, trying to control what states can do via executive order, a thankfully rebuffed attempt at gerrymandering even the Republicans shied away from and so on.
If one believes democracy is important one must also believe that we need checks and balances within government such that democracy is maintained in the face of bad actors. Trump is not the only elected person in government after all and democracy requires free and fair elections to continue when his presidency ends.
Also nothing about a democratic result means that any side needs to be happy about it or that anyone is or should be protected from criticism.
> Tell me what happened to democracy when Hitler took power? And how democratic was the overall process? So was the decision to commit mass murder of millions of people really the democratic will of the people?
It wasn't, but as I said, if the majority of voters do wish to commit mass murder, that is actually not trivially ignorable.
> You can also look at the parallels to Trump and his continued assault on the democratic norms in the US government. For example assuming powers that are those of Congress, trying to control what states can do via executive order, a thankfully rebuffed attempt at gerrymandering even the Republicans shied away from and so on.
Congress is our representatives. They are philosophically us. The majority of them do not want to impeach Trump for these things. Also the majority of voters reelected Trump knowing how he is. The way things are going is how the people want it (if you believe in democracy and the philosophy of representatives).
> If one believes democracy is important one must also believe that we need checks and balances within government such that democracy is maintained in the face of bad actors. Trump is not the only elected person in government after all and democracy requires free and fair elections to continue when his presidency ends.
There has been absolutely nothing to suggest that democracy, as in the literal sense of voting to determine representation, is at risk from inside the political apparatus. I don't consider Jan6 anything of that sort btw.
> Also nothing about a democratic result means that any side needs to be happy about it or that anyone is or should be protected from criticism
Sure, but the crux of the issue is that the left is going beyond criticism. The vocal left continuously claims that the elected government, and crucially those people who voted for it, are in some outgroup (nazis, fascists, bigots et al.) that does not deserve to have democratic power in the country by their very nature. They wield the 'paradox of tolerance' as a bludgeon to disenfranchise half the country. It's unhealthy for democracy, both in itself and because when a group feels under (politically) existential attack they will do heinous things to survive.
You’re mixing the principle of democracy up with the process which is necessary to uphold the principle. It’s quite clear that the issue with the democratic process in the US is not with the language used by Democrat voters. What’s unhealthy for democracy is the continued flouting of the process by Trump and the enablement of that by Republicans. I can definitely understand it feels bad when people compare you to fascists though but y’know stop enabling fascist things. The idea that it’s actually the language causing it is very silly.
Yup, it's quite rare that ADTs (or Rust enums) are so clear cut and obvious.
The idea that the data model looks like
    enum XYZ {
        A(B, C, D, E),
        F(G, H, I, J, K),
        L,
        M(N, O),
    }
is just not true in practice.
I think messaging is one case where it can happen, but even there it's often good to combine fields and share them (and common code) over multiple types of messages. If Messages A and B both have a field "creationTime" or whatever (with identical semantics), it's probably a bad idea to model them as separate fields, because that leads to code duplication, which is unmaintainable.
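To illustrate (my own hypothetical `Message` type, not from any real codebase): one common compromise in Rust is to hoist the shared field into a struct and keep only the variant-specific payload in the enum, so common code touches `creation_time` in exactly one place:

```rust
// Hypothetical sketch: shared fields live in one struct,
// and only the per-variant payload stays in the enum.
struct Message {
    creation_time: u64, // shared by all message kinds
    payload: Payload,
}

enum Payload {
    Chat { text: String },
    Join { user: String },
    Ping,
}

fn describe(msg: &Message) -> String {
    // Common code reads creation_time once, without matching on it.
    match &msg.payload {
        Payload::Chat { text } => format!("[{}] chat: {}", msg.creation_time, text),
        Payload::Join { user } => format!("[{}] join: {}", msg.creation_time, user),
        Payload::Ping => format!("[{}] ping", msg.creation_time),
    }
}

fn main() {
    let m = Message { creation_time: 42, payload: Payload::Ping };
    println!("{}", describe(&m)); // prints "[42] ping"
}
```

This avoids duplicating `creation_time` across every variant, at the cost of the type system no longer ruling out a shared field that happens to be meaningless for some variant.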
Maybe I can be more precise by proclaiming that ADTs can be good for making clear what's "there", so they can be used to "send" information. But to write any useful usage code, typically a different representation that folds common things into common places is needed. And it might just happen that field F is valid in cases A and B but not C. That's life! Reality is not a tree but a graph.
That's why it's a bad idea to try and model the exact set of possible states and rule out everything else completely in the type system. Even the most complicated type systems can only deal with the simple cases, and will fail and make a horrible mess at the more complicated ones.
So I'm saying, there is value to preventing some misuse and preventing invalid states, but it comes at a cost that one has to understand. As so often, it's all about moderation.
One should avoid fancy type system things in general because those create dependency chains throughout the codebase. Data, and consequently data types (including function signatures) is visible at the intersection of modules, so that's why it's so easy to create unmaintainable messes by relying on type systems too much. When it's possible to make a usable API with simple types (almost always), that's what you should do.
It always depends on the definition of OOP. Typical enterprise OOP is grounded in the idea of creating black boxes that you can't misuse. That creates silos that are hard to understand externally (and often internally as well, because their implementation tends to be composed of smaller black box objects). That practice may prevent some misuse, but it creates a lot of problems globally because nobody understands what's happening anymore on the global scale. This leads to inefficiencies, both performance-wise and development-wise. Even with some understanding, there is typically so much boilerplate that changing things around becomes extremely tedious.
Actually, I have some similar concerns about powerful type systems in general -- not just OOP. Obsessing about expression and enforcement of invariants on the small scale can make it hard to make changes, and to make improvements on the large scale.
Instead of creating towers of abstraction, what can work better often is to try and structure things as a loosely coupled set of smaller components -- bungalows when possible. Interaction points should be limited. There is little point in building up abstraction to prevent every possible misuse, when dependencies are kept in check, so module 15 is only used by 11 and 24. The callers can easily be checked when making changes to 15.
But yeah -- I tend to agree with GP that immutability is a big one. Writing things once, and making copies to avoid ownership problems (deleting an object is mutation too), that prevents a lot of bugs. And there are so many more ways to realize things with immutable objects than people knew some time ago. The typical OOP codebase from the 90s and 00s is chock-full with unnecessary mutation.
> the idea of creating black boxes that you can't misuse
Could you please expand upon your idea, particularly the idea that creating (from what I understood) a hierarchical structure of "blackboxes" (abstractions) is bad, and perhaps provide some examples? As far as I understand, the idea that you compose lower level bricks (e.g. classes or functions that encapsulate some lower level logic and data, whether it's technical details or business stuff) into higher level bricks, was what I was taught to be a fundamental idea in software development that helps manage complexity.
> structure things as a loosely coupled set of smaller components
Mind elaborating upon this as well, pretty please?
> Could you please expand upon your idea that [..] a hierarchical structure of "blackboxes" [...] is bad?
You'll notice yourself when you try to actually apply this idea in practice. But a possible analogy is: How many tall buildings are around your place, what was their cost, how groundbreaking are they? Chances are, most buildings around you are quite low. Low buildings have a higher overhead in space cost, so especially in denser cities, there is a force to make buildings with more levels.
But after some levels, there are diminishing returns from going even higher, compared to just creating an additional building of the same size. And overhead is increasing. Higher up levels are more costly to construct, and they require a better foundation. We can see that most higher buildings are quite boring: how to construct them is well-understood, there isn't much novelty. There just aren't that many types of buildings that have all these properties: 1) tall/many levels 2) low overall cost of creation and maintenance 3) practical 4) novel.
With software components it's similar. There are a couple of ideas that work well enough such that you can stack them on top of each other (say, CPU code on top of CPUs on top of silicon, userspace I/O on top of filesystems on top of hard drives, TCP sockets on top of network adapters...) which allows you to make things that are well enough understood and robust enough, and it's really economical to scale out on top of them.
But also, there isn't much novelty in these abstractions. Don't underestimate the cost of creating a new CPU or a new OS, or new software components, and maintaining them!
When you create your own software abstractions, those just aren't going to be that useful, they are not going to be rock-solid and well tested. They aren't even going to be that stable -- soon a stakeholder might change requirements and you will have to change that component.
So, in software development, it's not like you come up with rock-solid abstractions and combine 5 of those to create something new that solves all your business needs and is understandable and maintainable. The opposite is the case. The general, pre-made things don't quite fit your specific problem. Their intention was not focused on a specific goal. The more of them you combine, the less the solution fits, the less understandable it is, and the more junk it contains. Also, combining is not free. You have to add a _lot_ of glue to even make it barely work. The glue itself is a liability.
But OOP, as I take it, is exactly that idea. That you're creating lots of perfect objects with a clear and defined purpose, and a perfect implementation. And you combine them to implement the functional requirements, even though each individual component knows only a small part of them, and is ideally reusable in your next project!
And this idea doesn't work out in practice. When trying to do it that way, we only pretend to abstract, we just pretend to reuse, and in the process we add a lot of unnecessary junk (each object/class has a tendency to be individually perfected and to be extended, often for imaginary requirements). And we add lots of glue and adapters, so the objects can even work together. All this junk makes everything harder and more costly to create.
> structure things as a loosely coupled set of smaller components
Don't build on top of shoddy abstractions. Understand what you _have_ to depend on, and understand the limitations of that. Build as "flat" as possible i.e. don't depend on things you don't understand.
Thanks a ton! While I don't have the experience to understand all of it, I appreciate your writing, like the sibling poster (and that you didn't delete your comment)!
It reminds me of huge enterprise-y tools, which in the long run often are more trouble than they're worth (and reimplementing just the subset you need perhaps would be better), and (the way you speak about OOP) bloated "enterprise" codebases with huge classes and tons of patterns, where I agree making things leaner and less generic would do a lot of good.
At first however I thought that you're against the idea of managing complexity by hierarchically splitting things into components (i.e. basically encapsulation), which is why I asked for clarification, because this idea seems fundamental to me, and seeing that someone is against it got me interested. I think now though that you're not against this idea, and you're against having overly generic abstractions (components? I'm not sure if I'm using the word "abstractions" correctly here) in your stack, because they're harder to understand, which I understand. I assume this is what blackbox means here.
I'm not at all against decomposition and encapsulation. But I do think that the idea of _hierarchical_ decomposition can easily be overdone. The hierarchy idea might be what leads to building "on top" of leaky abstractions.
> When you create your own software abstractions, those just aren't going to be that useful, they are not going to be rock-solid and well tested. They aren't even going to be that stable -- soon a stakeholder might change requirements and you will have to change that component.
I also think it's about how many people you can get to buy-in on an abstraction. There probably are better ways of doing things than the unix-y way of having an OS, but so much stuff is built with the assumption of a unix-y interface that we just stick with it.
Like why can't I just write a string of text at offset 0x4100000 on my SSD? You could, but a file abstraction is a more manageable way of doing it. But there are other manageable ways of doing it right? Why can't I just access my SSD contents like it's one big database? That would work too right? Yeah, but we already have the file abstraction.
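For what it's worth, "write a string at a raw offset" is exactly what POSIX `pwrite` does, and it works the same on a regular file as on a block device like /dev/sda (given privileges); the filesystem is the layer that turns raw offsets into named, multiplexed files. A minimal sketch (the `write_at` helper name is made up here):

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <cstring>
#include <cassert>

// Write `text` at a byte offset within `path`. With sufficient privileges,
// `path` could just as well be a block device like /dev/sda.
bool write_at(const char* path, off_t offset, const char* text) {
    int fd = open(path, O_WRONLY | O_CREAT, 0644);
    if (fd < 0) return false;
    ssize_t n = pwrite(fd, text, strlen(text), offset);
    close(fd);
    return n == (ssize_t)strlen(text);
}
```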
>But OOP, as I take it, is exactly that idea. That you're creating lots of perfect objects with a clear and defined purpose, and a perfect implementation. And you combine them to implement the functional requirements, even though each individual component knows only a small part of them, and is ideally reusable in your next project!
I think OOP makes sense when you constrain it to a single software component with well defined inputs and outputs. Like I'm sure many GoF-type patterns were used in implementing many STL components in C++. But you don't need to care about what patterns were used to implement anything in <algorithm> or <vector>. You just use these as components to build a larger component. When you don't have well defined components that just plug and play over the same software bus, no matter how good you are at design patterns it's gonna eventually turn into an un-understandable spaghetti mess.
I'm really liking your writing style by the way, do you have a blog or something?
I think I agree with your "buy-in idea", but adding that the Unix filesystem abstraction is almost as minimal as it gets, at least I'm not aware of a simpler approach in existence. Maybe subtract a couple small details that might have turned out as not optimal or useful. You can also in fact write a string to an offset on an SSD (open e.g. /dev/sda), you only need the necessary privileges (like for a file in a filesystem hierarchy too btw).
A database would not work as mostly unstructured storage for uncoordinated processes. Databases are quite opinionated and require global maintenance and control, while filesystems are less obtrusive, they implement the idea of resource multiplexing using a hierarchy of names/paths. The hierarchy lets unrelated processes mostly coexist peacefully, while also allowing cooperation very easily. It's not perfect, it has some semantically awkward corner cases, but if all you need is multiplexing a set of byte-ranges onto a physical disk, then filesystems are a quite minimal and successful abstraction.
Regarding STL containers, I think they're useful and usable after a little bit of practice. They allow you to get something up and running quickly. But they're not without drawbacks, and at some point it can definitely be worthwhile to implement custom versions that are more straightforward, more performant (avoiding allocation for example), have better debug performance, have less line noise in their error messages, and so on. For the most important containers in the STL, it's quite easy to implement custom versions with fewer bells and whistles. Maybe with the exception of map/red-black tree, which is not that easy to implement and sometimes the right thing to use.
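To illustrate the point that the core of such a container is small, here's a minimal dynamic-array sketch (no allocators, no exception guarantees, no iterators; the `SmallVec` name is made up):

```cpp
#include <cstddef>
#include <new>
#include <utility>
#include <cassert>

// A stripped-down std::vector analog: raw storage plus manual
// construction/destruction, doubling capacity on growth.
template <typename T>
class SmallVec {
    T* data_ = nullptr;
    size_t size_ = 0, cap_ = 0;
public:
    ~SmallVec() {
        for (size_t i = 0; i < size_; ++i) data_[i].~T();
        ::operator delete(data_);
    }
    void push_back(T v) {
        if (size_ == cap_) {
            size_t ncap = cap_ ? cap_ * 2 : 4;
            T* nd = static_cast<T*>(::operator new(ncap * sizeof(T)));
            // Move existing elements into the new storage.
            for (size_t i = 0; i < size_; ++i) {
                new (nd + i) T(std::move(data_[i]));
                data_[i].~T();
            }
            ::operator delete(data_);
            data_ = nd;
            cap_ = ncap;
        }
        new (data_ + size_++) T(std::move(v));
    }
    T& operator[](size_t i) { return data_[i]; }
    size_t size() const { return size_; }
};
```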
> I'm really liking your writing style by the way, do you have a blog or something?
Thank you! I don't get to hear that often. I have to say I was almost going to delete that above comment because it's too long, the structure and build up is less than clear, there are a lot of "just" words in it and I couldn't edit anymore. I do invest a lot of time trying to write comments that make sense, but have never seen myself as a clear thinker or a good writer. To answer your question, earlier attempts to start a blog didn't go anywhere really... Your comment is encouraging though, so thanks again!
Asking for those who, like me, haven't yet taken the time to find technical information on that webpage:
What exactly does that roundtrip latency number measure (especially your 1us)? Does zero copy imply mapping pages between processes? Is there an async kernel component involved (like I would infer from "io_uring") or just two user space processes mapping pages?
27us and 1us are both an eternity and definitely not SOTA for IPC. The fastest possible way to do IPC is with a shared memory resident SPSC queue.
The actual (one-way cross-core) latency on modern CPUs varies by quite a lot [0], but a good rule of thumb is 100ns + 0.1ns per byte.
This measures the time for core A to write one or more cache lines to a shared memory region, and core B to read them. The latency is determined by the time it takes for the cache coherence protocol to transfer the cache lines between cores, which shows up as a number of L3 cache misses.
Interestingly, at the hardware level, in-process vs inter-process is irrelevant. What matters is the physical location of the cores which are communicating. This repo has some great visualizations and latency numbers for many different CPUs, as well as a benchmark you can run yourself:
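A minimal sketch of what "shared memory resident SPSC queue" means here, assuming a single producer and single consumer (this is illustrative, not the poster's benchmark code): the only synchronization is acquire/release on two indices, and the cache coherence protocol moves the lines between cores.

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <optional>
#include <cassert>

// Single-producer single-consumer ring buffer. Indices live on separate
// cache lines (alignas(64)) to avoid false sharing between the two cores.
template <typename T, size_t N>
class SpscQueue {
    static_assert((N & (N - 1)) == 0, "N must be a power of two");
    std::array<T, N> buf_{};
    alignas(64) std::atomic<size_t> head_{0}; // advanced by consumer
    alignas(64) std::atomic<size_t> tail_{0}; // advanced by producer
public:
    bool push(const T& v) { // producer side
        size_t t = tail_.load(std::memory_order_relaxed);
        if (t - head_.load(std::memory_order_acquire) == N) return false; // full
        buf_[t & (N - 1)] = v;
        tail_.store(t + 1, std::memory_order_release); // publish the slot
        return true;
    }
    std::optional<T> pop() { // consumer side
        size_t h = head_.load(std::memory_order_relaxed);
        if (h == tail_.load(std::memory_order_acquire)) return std::nullopt; // empty
        T v = buf_[h & (N - 1)];
        head_.store(h + 1, std::memory_order_release); // free the slot
        return v;
    }
};
```

For cross-process use, the same structure would be placed in a shared mapping (e.g. `shm_open` + `mmap`); as noted, the hardware doesn't care whether the two cores belong to the same process.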
I was really asking what "IPC" means in this context. If you can just share a mapping, yes it's going to be quite fast. If you need to wait for approval to come back, it's going to take more time. If you can't share a memory segment, even more time.
No idea what this vibe code is doing, but two processes on the same machine can always share a mapping, though maybe your PL of choice is incapable. There aren’t many libraries that make it easy either. If it’s not two processes on the same machine I wouldn’t really call it IPC.
Of course a round trip will take more time, but it’s not meaningfully different from two one-way transfers. You can just multiply the numbers I said by two. Generally it’s better to organize a system as a pipeline if you can though, rather than ping ponging cache lines back and forth doing a bunch of RPC.
I'd say COM is also run-time type safe casting, and importantly the reference counting is uniform which might help writing wrappers for dynamic and garbage collected languages.
I'm still not sure that it brings a lot to the table for ordinary application development.
It's been a while since I've written it professionally, but I felt the fact that it has consistent idioms and conventions helped me be somewhat more productive writing C++. In the vast landscape of C++ features it winds up making some decisions for you. You can use whatever you want within your component, but the COM interfaces dictate how you talk to the outside.
I'll bite. printf might be unsafe in terms of typing, in theory, but it's explicit and readable (with some caveats such as "PRIi32"). The actual chance of errors happening is very low in practice, because format strings are static in all practical (sane) uses so testing a single codepath will usually detect any programmer errors -- which are already very rare with some practice. On top of that, most compilers validate format strings. printf compiles, links, and runs comparatively quickly and has small memory footprint. It is stateless so you're always getting the expected results.
Compare to <iostream>, which is stateful and slow.
There's also std::format, which might be safe and flexible and have some of the advantages of printf. But I can't use it at any of the places I'm working since it's C++20. It probably also uses a lot of template and constexpr madness, so I assume it's going to lead to longer compilation times and hard to debug problems.
In my experience you absolutely must have type checking for anything that prints, because eventually some never previously triggered log/assertion statement is hit, attempts to print, and has an incorrect format string.
I would not use iostreams, but neither would I use printf.
At the very least if you can't use std::format, wrap your printf in a macro that parses the format string using a constexpr function, and verifies it matches the arguments.
_Any_ code that was never previously exercised could be wrong. printf() calls are typically typechecked. If you write wrappers you can also have the compiler type check them, at least with GCC. printf() code is quite low risk. That's not to say I've never passed the wrong arguments. It has happened, but a very low number of times. There is much more risky code.
So such a strong "at the very least" is misapplied. All this template crap, I've done it before. All but the thinnest template abstraction layers typically end up in the garbage can after trying to use them for anything serious.
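The compiler-checked wrapper mentioned above can be had with GCC/Clang's `format` attribute; a sketch (the `log_to_buf` name and buffer-based signature are made up here, a real logger would write to a sink):

```cpp
#include <cstdarg>
#include <cstdio>
#include <cstring>
#include <cassert>

// The attribute tells GCC/Clang that argument 3 is a printf-style format
// string whose varargs start at argument 4, so -Wformat checks call sites.
__attribute__((format(printf, 3, 4)))
int log_to_buf(char* out, size_t n, const char* fmt, ...) {
    va_list args;
    va_start(args, fmt);
    int written = vsnprintf(out, n, fmt, args);
    va_end(args);
    return written;
}

// log_to_buf(buf, sizeof buf, "%s took %d ms", "step", 12);  // OK
// log_to_buf(buf, sizeof buf, "%s took %d ms", 12, "step");  // -Wformat warning
```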
The biggest issue with printf is that it is not extensible to user types.
I also find it unreadable; beyond the trivial I always need to refer to the manual for the correct format string. In practice I tend to always put a placeholder and let clangd correct me with a fix-it.
Except that often clangd gives up (when inside a template for example), and in a few cases I have even seen GCC fail to correctly check the format string and fail at runtime (don't remember the exact scenario).
Speed is not an issue, any form of formatting and I/O is going to be too slow for the fast path and will be relegated to a background thread anyway.
Debugging and complexity have not been an issue with std::format so far (our migration from printf based logging has been very smooth). I will concede that I do also worry about the compile time cost.
I largely avoided iostream in favor of printf-like logging apis, but std::format changed my mind. The only hazard I've found with it is what happens when you statically link the std library. It brings in a lot of currency and localization nonsense and bloats the binary. I'm hoping for a compiler switch to fix that in the future. libfmt, which std::format is based on, doesn't have this problem.
Given a data item of a non-thread-safe type (i.e. not Mutex<T> etc.), the borrow checker checks that there's only ever one mutable reference to it. This doesn't so much solve concurrency as sidestep it: it prevents multiple threads from even having the ability to access that data.
Mutex is for where you have that ability, and ensures at runtime that accesses get serialized.
The maybe unexpected point is that if you know you're the only one who has a reference to a Mutex (i.e. you have a &mut), you don't need to bother locking it; if no one else knows about the Mutex, there's no one else who could lock it. It comes up when you're setting things up and haven't shared the Mutex yet.
This means no atomic operations or syscalls or what have you.
Do you have an example? I don't program in Rust, but I imagine I'd rarely get into that situation. Either my variable is a local (in a function) in which case I can tell pretty easily whether I'm the only one accessing it. Or, the data is linked globally in a data structure and the only way to access it safely is by knowing exactly what you're doing and what the other threads are doing. How is Rust going to help here? I imagine it's only making the optimal thing harder to achieve.
I can see that there are some cases where you have heap-data that is only visible in the current thread, and the borrow checker might be able to see that. But I can imagine that there are at least as many cases where it would only get in the way and probably nudge me towards unnecessary ceremony, including run-time overhead.
When you construct an object containing a mutex, you have exclusive access to it, so you can initialize it without locking the mutex. When you're done, you publish/share the object, thereby losing exclusive access.
struct Entry {
msg: Mutex<String>,
}
...
// Construct a new object on the stack:
let mut object = Entry { msg: Mutex::new(String::new()) };
// Exclusive access, so no locking needed here:
let mutable_msg = object.msg.get_mut().unwrap(); // get_mut needs &mut, takes no lock
format_message(mutable_msg, ...);
...
// Publish the object by moving it somewhere else, possibly on the heap:
global_data.add_entry(object);
// From now on, accessing the msg field would require locking the mutex
Initialization is always special. A mutex can't protect that which doesn't exist yet. The right way to initialize your object would be to construct the message first, then construct the composite type that combines the message with a mutex. This doesn't require locking a mutex, even without any borrow checker or other cleverness.
Dude, it's a simplified example, of course you can poke holes into it. Here, let me help you fill in the gaps:
let mut object = prepare_generic_entry(general_settings);
let mutable_msg = object.msg.get_mut().unwrap();
do_specific_message_modification(mutable_msg, special_settings);
The point is, that there are situations where you have exclusive access to a mutex, and in those situations you can safely access the protected data without having to lock the mutex.
Sorry, I don't find that convincing but rather contrived. This still seems like "constructor" type code, so the final object is not ready and locking should not happen before all the protected fields are constructed.
There may be other situations where you have an object in a specific state that makes it effectively owned by a thread, which might make it possible to forgo locking it. These are all very ad-hoc situations, most of them would surely be very hard to model using the borrow checker, and avoiding a lock would most likely not be worth the hassle anyway.
Not sure how this can help me reduce complexity or improve performance of my software.
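The "construct first, then combine with the mutex" ordering argued for here works the same without a borrow checker; a C++ sketch (the `Entry`/`make_entry` names mirror the Rust example above and are otherwise made up):

```cpp
#include <memory>
#include <mutex>
#include <string>
#include <cassert>

struct Entry {
    std::mutex m;
    std::string msg;
    explicit Entry(std::string s) : msg(std::move(s)) {}
};

std::shared_ptr<Entry> make_entry() {
    // 1. Construct the protected data first; nothing to lock yet.
    std::string msg = "hello";
    // 2. Then construct the composite that pairs it with its mutex.
    auto e = std::make_shared<Entry>(std::move(msg));
    // 3. Only after `e` is published to other threads does access to
    //    e->msg require locking e->m.
    return e;
}
```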
>I don't program in Rust, but I imagine I'd rarely get into that situation.
Are you sure? Isn't having data be local to a thread the most common situation, with data sharing being the exception?
>Or, the data is linked globally in a data structure and the only way to access it safely is by knowing exactly what you're doing and what the other threads are doing.
That's exactly what the borrow checker does. It tracks how many mutable references you have to your data structure at compile time. This means you can be sure what is local and what is shared.
Meanwhile without the borrow checker you always have to assume there is a remote probability that your mental model is wrong and that everything goes wrong anyways. That's mentally exhausting. If something goes wrong, it is better to only have to check the places where you know things can go wrong, rather than the entire code base.
I use lots of locals, but only to make my code very "local", i.e. fine-grained, editable and clear, using lots of temporary variables. No complicated expressions. That's all immutable data (after initialization). I rarely take the address of such data but make lots of copies. If I take its address, then as an immutable pointer, maybe not in the type system but at least in spirit.
I keep very little state on the stack -- mostly implicit stuff like mutex lock / mutex unlock. By "state" I mean object type things that get mutated or that need cleanup.
I always have a "database schema" of my global state in mind. I define lots of explicit struct types instead of hiding state as locals in functions. I've found this approach of minimizing local state to be the right pattern because it enables composability. I'm now free to factor functionality into separate functions. I can much more freely change and improve control flow. With this approach it's quite rare that I produce bugs while refactoring.
So yes, I have lots of locals but I share basically none of them with other threads. Also, I avoid writing any code that blocks on other threads (other than maybe locking a mutex), so there's another reason why I would not intentionally share a local with another thread. Anything that will be shared with another thread should be allocated on the heap just for the reason that we want to avoid blocking on other threads.
In that sense, the borrow checker is a tool that would allow me to write code more easily that I never wanted written in the first place.
It's relevant when you have more complex objects, such as ones that contain independent mutexes that lock different sections of data.
You want the object to present its valid operations, but the object could also be constructed in single or multithreaded situations.
So you'd offer two APIs; one which requires a shared reference, and internally locks, and a second which requires a mutable reference, but does no locking.
Internally the shared reference API would just lock the required mutexes, then forward to the mutable reference API.