Hacker News | elcritch's comments

Chrome moved to a time-based release cadence.

Fascinating, and us humans aren't that different. Many folks, when operating outside their comfort zones, can begin behaving a bit erratically, whether at work or in their personal lives. One of the best advantages in life someone can have is their parents giving them a high-quality "Operational Guidance" manual. ;) Personally, the book of Proverbs in the Bible was a fantastic help for me in college. Lots of wisdom therein.

> Fascinating, and us humans aren't that different.

It’s statistically optimized to role-play how a human would write, so these types of similarities are expected.


I wonder if the prompt should include "You are a robot. Beep. Boop." to get it to act calmer.

Which is kind of a huge problem: the world is described in text. But it is described through the language and experience of those who write, and we absolutely do not write accurately: we add narrative. The act of writing anything down changes how we present it.

That's true to an extent - LLMs are trained on an abstraction of the world (as are we, in a way, through our senses, and we necessarily use a sort of narrative to make sense of the quadrillions of photons coming at us) - but it's not quite as severe a problem as the simplified view suggests.

LLMs distill their universe down to trillions of parameters, and approach structure through multi-dimensional relationships between these parameters.

Through doing so, they break through to deeper emergent structure (the "magic" of large models). To some extent, the narrative elements of their universe will be mapped out independently from the other parameters, and since the models are trained on so much narrative, they have a lot of data points on narrative itself. So to some extent they can net it out. Not totally, and what remains after stripping much of it out would be a fuzzy view of reality since a lot of the structured information that we are feeding in has narrative components.


Wow, that’s an interesting list. Sorta surprising with Raku beating out Perl. Nim is also in the top 10, nice!

Unfortunate because for the first time in his life he had RAD hair. ;)

> How "long term" are we talking about that rewriting battle-tested, mission-critical C utils (which, as other posters noted, in this case often have minimal attack surfaces) actually makes sense?

Makes me wonder if putting a similar amount of effort into building up proof/formal verification system for coreutils would have yielded better results security wise.


Of course! But the problem is much more severe. I can't comment on coreutils, but there are not enough resources for high-quality maintenance of the core toolchain. It is completely surprising that effort is wasted on creating new implementations when we do not even have enough resources to properly maintain the existing ones. It is based on the - completely wrong - idea that all the problems we have are from using the wrong language and will magically go away with Rust, instead of being a fundamental maintenance problem of free software. So we now make things substantially worse based on this incorrect analysis.

A rewrite in Rust may attract new contributors, thereby aiding maintenance in the coming years.

Especially if you look very long term, as in where the young developers are, you'll see a significant reduction in the number of people with the ability to write high-quality C. Rust has the benefit that low-quality Rust is fairly close to high-quality Rust, while low-quality C is a far cry from high-quality C.

Choosing Rust does not necessarily require Rust itself to be better for the task. It can also be the result of secondary factors.

I don't know if this applies to coreutils, but C being technically sufficient does not always mean it shouldn't be replaced.


I don't buy this story. It may attract some people during the hype phase. But in the end, Rust is more complex, so it will make the software harder to maintain. And then, this can also only work if the rewrites completely replace the original (rarely the case) and you do not lose more maintainers than you gain.

Rust isn’t more complex. It just codifies the things you need to know.

It is certainly a lot more complex than C. Whether it codifies the things one needs to know is a different question. I would even agree that it partially does, and there are aspects I really like about Rust, but I do not believe it matters nearly as much as some people might think. For example, no complicated super-smart type system will prevent you from adding a CLI option and then not implementing it, breaking automatic updates and backup scripts. But nerds like to believe that this super-smart type system is the solution for our security problems. I can understand this. This is also what I believed 20 years ago.

> A rewrite in Rust may attract new contributors, thereby aiding maintenance in the coming years.

Or they will get bored as soon as a New Awesome Language will be hyped on HN and elsewhere.


Folks run into compatibility issues with musl as well. The biggest I recall was an issue with DNS breaking because musl didn’t implement some piece.

TBF DNS handling of glibc is crazy.

The IETF has made a bunch of standards lately, like COSE, for doing certificates and encryption with CBOR. It’s largely for embedded stuff, but I could see it being a modern alternative. I haven’t used it myself yet.

CBOR is self-describing like JSON/XML, meaning you don’t need a schema to parse it. It has a better set of specific types for integers and binary data, unlike JSON. It has an IANA registry of tags and a canonical serialization form, unlike MsgPack.
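As a rough illustration of the compact typed encoding, here is a pure-Python sketch of RFC 8949's unsigned-integer rule (major type 0 only, not a full CBOR library):

```python
import json

def cbor_uint(n: int) -> bytes:
    """Encode an unsigned integer as CBOR major type 0 (RFC 8949, section 3)."""
    if n < 24:                      # value fits directly in the initial byte
        return bytes([n])
    if n < 0x100:                   # additional info 24: one-byte argument
        return bytes([24]) + n.to_bytes(1, "big")
    if n < 0x10000:                 # additional info 25: two-byte argument
        return bytes([25]) + n.to_bytes(2, "big")
    if n < 0x100000000:             # additional info 26: four-byte argument
        return bytes([26]) + n.to_bytes(4, "big")
    return bytes([27]) + n.to_bytes(8, "big")   # additional info 27: eight bytes

# The payoff shows up on larger values: 4294967295 is five CBOR bytes,
# while the JSON text "4294967295" is ten characters.
```

The initial byte is self-describing: a decoder can tell from it alone that an unsigned integer follows and how many argument bytes to read, with no schema needed.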


> The fact that you theoretically can validate an xml document against a schema is honestly completely useless. If I am parsing XML, it's because my code already knows what information it needs from the XML document, and really it should also know where that information is.

You seem to miss the entire point of XML schemas, or any schema really. Validating a document against a schema isn’t really for your code. It’s documentation of what can be in a given document and how it needs to be structured, so others don’t need to read your code to understand that.

It also allows editing tools to verify generated documents, and developers to understand how to structure their XML output properly.

Your code could also use it to verify an XML document before processing it. Then you can inform the user of an invalid document and why, instead of crashing at a random point in the code, without rolling your own validation. It can also verify an entire document, whereas code may only parse portions, leading to later corruption.
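A stdlib-only sketch of what that buys you: hand-rolled checks standing in for what a schema validator generates automatically (the `<order>` document shape here is invented for illustration):

```python
import xml.etree.ElementTree as ET

def validate_order(xml_text: str) -> list[str]:
    """Return human-readable problems; an empty list means the document is valid.

    A hand-rolled stand-in for the checks an XML Schema validator does for free.
    """
    errors = []
    root = ET.fromstring(xml_text)
    if root.tag != "order":
        errors.append(f"root element is <{root.tag}>, expected <order>")
    id_el = root.find("id")
    if id_el is None:
        errors.append("missing required <id> element")
    elif not (id_el.text or "").isdigit():
        errors.append("<id> must contain an integer")
    if not root.findall("item"):
        errors.append("at least one <item> element is required")
    return errors

# A caller can report every problem up front instead of crashing mid-parse.
```

With a real XSD, the validator walks the whole document and produces this kind of error list from the schema alone, so none of these checks have to be written by hand.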


> Realistically, if ASN.1 weren't as badly overengineered and had shipped only with some of the more modern of its encoding formats, we probably would all be using ASN.1 for many things, including maybe your web server responses, and this probably would cut non image/video network bandwidth by 1/3 or more. But then the network is overloaded by image/video transmissions and similar, not other stuff, so I guess who cares???!???

For payment systems people really do validate messages' encoding.

> You seem to miss the entire point of XML schemas, or any schema really. Validating a document against a schema isn’t really for your code. It’s for documentation of what can be in a given document and how it needs to be structured. So others don’t need to read your code to understand that.

Schemas also let you parse data into ergonomic data structures in the host programming language. That's really the biggest win in having a schema.

Schemas and schema-aware tooling also help you avoid producing invalid messages that others then have to hack their parsers to handle, when you present them with a fait accompli and have the market power to make it stick.

Schemas also let you specify things formally rather than having to use English prose (or worse, prose in not-English, or even worse, having to produce prose in multiple languages and make sure they all say the same thing!).

The downside to schemas is that you have to do the work of writing them, and if you're the first implementor and you're using JSON and the messages are simple, you just won't care to.
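The "ergonomic data structures" point can be sketched in a few lines of Python. The `Profile` shape and field names are invented here, and a real schema compiler would generate the mapping and checking that this writes by hand:

```python
from dataclasses import dataclass
import json

@dataclass
class Profile:
    user: str
    age: int

def parse_profile(text: str) -> Profile:
    """Parse JSON text into a typed Profile, rejecting non-conforming documents."""
    raw = json.loads(text)
    # Hand-written checks; schema-aware codegen emits these automatically.
    if not isinstance(raw.get("user"), str) or not isinstance(raw.get("age"), int):
        raise ValueError("document does not match the Profile schema")
    return Profile(user=raw["user"], age=raw["age"])
```

Downstream code then works with `profile.age` as an `int` rather than poking at a loosely typed dict, which is exactly what schema-driven bindings give you for free.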


US workers, while costing more, still seem competitive at expensive high-value tasks. There were reports a year back that the TSMC fab in Arizona, while its employee costs were higher, also had 4% higher yield [1].

I'd wager combining that with US defense contracts for US made chips would be lucrative for NVidia.

1: https://news.ycombinator.com/item?id=41952534


Is it high value? I know a guy working at a fab in Taiwan and he makes peanuts ($2-3K a month?). It's basically a factory job. From what he said it sounded braindead. TSMC is known for overpaying and having PhDs stare at assembly lines though - so I dunno.

It's high-value but low-margin work. It played a major role in why Intel began offshoring a large portion of its fab work back in the 1990s-2000s, and also why TSMC became a player in the space, undercutting Japanese, Korean, Singaporean, and Malaysian/American (Intel) vendors.

> TSMC is known for overpaying

Never heard that in my life. Underpaid, sure.


In Taiwan they are by far the highest-paying employer (maybe the Google/Microsoft branches here are comparable). The base pay is usually not impressive, but it comes with insane bonus packages. They basically hoover up all the top graduates who haven't gone abroad.

They're kind of notorious for having overqualified workers: PhDs left monitoring assembly lines or writing accounting software.


> In Taiwan

My bad - didn't see the qualifier for Taiwan there.

Yea, TSMC pay is decent by Taiwan standards, but IMO that's largely because Taiwan salaries are fairly low even compared to other East Asian nations when factoring in CoL.


Googling TSMC's margin, it looks to be 53%-59% gross margin. That is after they became the leading global fab, of course.

No doubt keeping costs low helped them get to that stage. Still, that's a crazy margin. I imagine much of it is reinvested.

IMHO, Intel could've kept up but dropped the ball pretty hard. The volume of ARM / cell phone CPUs caught them off guard, and huge amounts of revenue let TSMC drive forward.


The average salary in Taiwan is about $1,500 a month, though.

Sorry, I should have thought about it a bit more. It's likely closer to $2K than $3K. A professor at Taiwan University makes ~$3K. My only other point of reference is another friend's husband, who works as an electrical engineer at an LED manufacturer, making $2K a month. Minimum wage is $1K. Point being, they're miserable jobs and the salaries are low low low. I get that fabs sound fancy, but I'm not super sure anyone should be stoked these kinds of jobs are "coming back" to the US.

I am more of the belief that these people want a time of less income inequality and are conflating that with factory work.

There are going to be a bunch of people wanting any job [1], and more in the coming years. So they'd probably cheer a local Walmart as much as a local TSMC.

[1]: https://www.bls.gov/web/empsit/cpseea10.htm


I've had a number of friends who worked at Micron's semiconductor fabs in Boise. Yes, boring work, but the pay was decent, especially for guys just graduating. I think it was like a 4-10 schedule.

Makes sense to me. Lots of folks attracted to Rust seem to enjoy code-golfing more than solving problems. Designing the “perfect” game architecture vs. shipping games matches that, and my take on much of the Rust ecosystem.


