
Could you explain why this is great over alternatives?

"And off by one errors"


Can confirm. Just finished load testing a FastAPI service. The biggest takeaway is that a lot of real backends never experience the level of load where this actually matters


I work for a very large company that has a mostly SSR monolith written in PHP.

Modern PHP is a joy, and it's much faster these days, but performance is still a problem. It was chosen over 25 years ago, and I'm sure they thought the same thing about never getting the amount of load they eventually got.

Modern PHP is virtually indistinguishable from dotnet, with some php-isms sprinkled on top. They should've chosen dotnet all those years ago.


The more common term you're looking for is "ultra-processed food"


Which types of processing exactly are implied by that, and which are not?

Where's the line drawn? Is ground beef ultra-processed or not? How about a chicken schnitzel? Canned sardines? Dark chocolate?

Which part of the ultra-processing is making the food unhealthy? Is it the chemicals they add? The fact that they heat it up (but when you cook at home you also heat things up)? Something else they do with it?

If you bake fries yourself from potatoes with olive oil, is it ultra processed?


The term comes from the Nova classification.

https://en.wikipedia.org/wiki/Nova_classification


> Ultra-processed foods are operationally distinguishable from processed foods by the presence of food substances of no culinary use (varieties of sugars such as fructose, high-fructose corn syrup, 'fruit juice concentrates', invert sugar, maltodextrin, dextrose and lactose; modified starches; modified oils such as hydrogenated or interesterified oils; and protein sources such as hydrolysed proteins, soya protein isolate, gluten, casein, whey protein and 'mechanically separated meat') or of additives with cosmetic functions (flavours, flavour enhancers, colours, emulsifiers, emulsifying salts, sweeteners, thickeners and anti-foaming, bulking, carbonating, foaming, gelling and glazing agents) in their list of ingredients.

They have a different definition of "no culinary use" than I do!


Earlier in the definition it uses the more conservative phrase "no or rare culinary use," which I think is more accurate. The point is just to attempt to categorize foods by processing levels in a way the public can understand.

I am curious what items in the list differ for you. When's the last time you grabbed your isolated fructose and maltodextrin to season your steak?

The way I think of it is if I were to cook a chicken breast or bake a loaf of bread and then write down the ingredients, they'd be chicken, oil, salt, pepper; or flour, water, yeast, salt. Now go look at the ingredients of a chicken breast (raw, marinated, or cooked) and a loaf of bread in the grocery store and note the differences between the ingredient lists. If the ingredient list for an item from the store includes things a household wouldn't have at home, like fructose or maltodextrin, that item would be considered ultra-processed.

I'll note that I don't eat as healthy as I should, people should do what they want, and it's possible to still be unhealthy while avoiding ultra processed foods.


Thanks for linking that. Their rubric for ultra-processed is easy enough to grok that folks could use this at a grocery store. We're on a kick to remove "parameters" from tasks right now, so this definition is clearer than thoughts like "stick to the outside of the store."


Reducing the parameters on tasks, and eliminating tasks has been a huge win for us. Tranquility, and still results.


This is venturing off-topic, but can you expand on "eliminating tasks"? Is eliminating a task like setting up auto bill pay, or getting rid of items that I don't want to clean?


Yes to both. This is my heuristic:

- think about what would happen if something is simply left undone

- can I do the same task with fewer steps

- if I relaxed the definition of success a little, does it get a lot easier?

- can I farm it out to a person or a service? (Like bill autopay, or Instacart)


Same! Still, it's good that you can get 4% on a risk-free investment these days


Or put another way, your income is depreciating at at least 4%.


Yes


Spot on, device tracking is much better than wifi sensing


Until you TLS a TCP connection, of course


Oh can you comment on what this means? I'm not too familiar with it. Thanks!


BSL is a source-available license that by default forbids production use. After a certain period after the date of any particular release, not to exceed four years, that release automatically converts to an open source license, typically the Apache license.

Projects can add additional license grants to the base BSL. EMQX, for example, adds a grant for commercial production use of single-node installations, as well as production use for non-commercial applications.
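As a rough illustration of the conversion timing described above (a sketch with made-up dates and a hypothetical helper, not code from any real project's license tooling):

```python
from datetime import date

# Hypothetical sketch of the BSL "Change Date": each release
# converts to its change license (often Apache 2.0) no more
# than four years after that release ships.
def change_date(release_date: date, years: int = 4) -> date:
    if not 0 < years <= 4:
        raise ValueError("BSL 1.1 caps the change window at four years")
    return release_date.replace(year=release_date.year + years)

# Example: a release cut on 2024-01-15 converts by 2028-01-15.
d = change_date(date(2024, 1, 15))
```

Note the window applies per release, so older releases open up on a rolling basis while the latest one stays under the production-use restriction.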


Nice! What optimizations have you put in place to support 150 million rows? Just some indexing, or other fancy stuff?


You don't need to optimize anything beyond appropriate indices, Postgres can handle tables of that size out of the box without breaking a sweat.
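For a back-of-the-envelope intuition on why that's true (a sketch of the math, not Postgres internals): a B-tree index lookup touches O(log n) entries, so table size barely moves the needle.

```python
import bisect
import math

# Binary search over 150M sorted keys needs ~28 comparisons;
# a real B-tree with wide nodes needs only a few page reads.
n = 150_000_000
comparisons = math.ceil(math.log2(n))
print(comparisons)  # 28

# Miniature demo of the same idea with Python's bisect:
keys = list(range(0, 1_000_000, 2))      # a sorted "index" of even numbers
pos = bisect.bisect_left(keys, 500_000)  # O(log n) lookup
print(keys[pos])  # 500000
```

Going from 1M to 150M rows adds only a handful of comparisons per lookup, which is why indexed point queries stay fast "out of the box."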


> Postgres can handle tables of that size out of the box

This is definitely true, but I've seen migrations from other systems struggle to scale on Postgres because of design decisions that worked well in a scale-out system but don't do so well in PG.

A number of well-meaning indexes, a very wide row to avoid joins, and a large number of state-update queries on a single column can murder Postgres performance (UPDATE ... SET last_visited_time = ... sort of madness; mutable/immutable column-family classifications, etc.)

There were scenarios where I'd have liked something like zHeap or Citus to be part of the default system.

If something was originally conceived in postgres and the usage pattern matches how it does its internal IO, everything you said is absolutely true.

But a migration could hit snags in the system, which is what this post celebrates.

The "order by" query is a good example, where a bunch of other systems push a shared boundary variable from the TopK operator down to the scanner to skip rows faster. Snowflake had a recent paper describing how they do input pruning mid-query off a TopK.
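A minimal sketch of that boundary idea (hypothetical code, not Snowflake's or Postgres's actual implementation): the scanner consults the current k-th best value and skips anything that can't make the cut.

```python
import heapq

def top_k_with_pruning(rows, k, key):
    # Min-heap of the k best keys seen so far; heap[0][0] is the
    # shared boundary the "scanner" checks before doing any work.
    heap = []
    pruned = 0
    for row in rows:
        v = key(row)
        if len(heap) < k:
            heapq.heappush(heap, (v, row))
        elif v > heap[0][0]:
            heapq.heapreplace(heap, (v, row))  # raise the boundary
        else:
            pruned += 1  # skipped thanks to the boundary
    return sorted(heap, reverse=True), pruned
```

On descending input the boundary prunes nearly everything after the first k rows; in a real engine the same variable lets the scan skip whole pages or files.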


That’s not the fault of the DB, though, that’s bad schema design. Avoiding JOINs is rarely the correct approach.


You really don't need anything special. 150M is just not that much, postgres has no problem with that.

Obviously, it depends on your query patterns

