
Using the S3 API is like chopping onions: the more you do it, the faster you start crying.


Little to no crying when you use a sharp knife. Japanese chefs say: no wonder you're crying, you're squashing them.


Haha!

My only “yes, but…” is that this:

> 50k API calls per second (on S3 that is $20-$250 _per second_ on API calls!).

kind of smells like abuse of S3. Without knowing the use case, maybe a different AWS service is a better answer?
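For scale, here's a rough back-of-envelope using S3 Standard's published per-request list prices (assumed here to be about $0.0004 per 1k GETs and $0.005 per 1k PUTs; actual prices vary by region and request type, so treat the numbers as illustrative):

    # Back-of-envelope: sustained cost of 50k S3 requests/sec.
    # The prices below are assumptions based on S3 Standard list
    # prices; check your region and request mix before trusting them.
    RATE = 50_000          # requests per second
    GET_PER_1K = 0.0004    # USD per 1,000 GET requests (assumed)
    PUT_PER_1K = 0.005     # USD per 1,000 PUT requests (assumed)

    for name, price in (("GET", GET_PER_1K), ("PUT", PUT_PER_1K)):
        per_sec = RATE / 1_000 * price
        per_month = per_sec * 60 * 60 * 24 * 30
        print(f"{name}-only: ${per_sec:.2f}/s, ~${per_month:,.0f}/month")

Whatever the exact per-second figure, sustained at that rate the request bill compounds into five to six figures a month, which is the kind of number that usually means the wrong service is being used.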

Not advocating for AWS, just saying that maybe this is the wrong comparison.

Though I do want to learn about Hetzner.


You're (probably) not wrong about the abuse thing, but it sure is nice to just not have to care about that when you have fixed hardware. I find trying to guess which of the 200 AWS services is the cheapest kinda stressful.


They conveniently provide no detail about the use case, so it's hard to tell.

But, yeah, there's certainly a way to get better performance at lower cost using other settings/services on AWS.


We're hoping to write a case study down the road that will give more detail. But the short version is that not all parts of the client's organisation have aligned skills/incentives. So sometimes code is deployed that makes, shall we say, 'atypical use' of the resources available.

In those cases, it is great to a) not get a shocking bill, and b) be able to somewhat support this atypical use until it can be remedied.


Thank you for the reply.

I'm honestly quite interested to learn more about the use case that required those 50k API calls!

I've seen a few cases of using S3 for things it was never intended for, but nothing close to this scale.


Why would it be abuse? Serving e.g. map tiles on a busy site can get up to tens of thousands of qps; I'd have thought serving that from S3 would have made sense if it weren't so expensive.


I don’t know much about map tiles… but could that be done more effectively through a CDN or cache, with S3 behind it (rough sketch below)?

Then the CDN takes the beating. So this still sounds like S3 abuse to me.

But I leave room for being wrong here.

Edit: presumably if your site is big enough to serve 50k RPS, it’s big enough for a cache?
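Something like this, purely as a sketch (the bucket name and tile key layout are made up, and an in-process LRU stands in for a real CDN):

    # Hot tiles come from an in-process LRU cache; only cache misses
    # touch the S3 API, so the request bill scales with misses, not RPS.
    from functools import lru_cache

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-tile-bucket"  # hypothetical bucket name

    @lru_cache(maxsize=100_000)
    def get_tile(z: int, x: int, y: int) -> bytes:
        """Fetch one map tile, hitting S3 only on a cache miss."""
        resp = s3.get_object(Bucket=BUCKET, Key=f"tiles/{z}/{x}/{y}.png")
        return resp["Body"].read()

At 50k RPS with a decent hit rate, almost none of that traffic ever reaches S3.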


They specifically mention that dbt-core will remain open source and will be supported. However, this type of consolidation will very likely bring increased prices.

I hope Fivetran alternatives like dlt remain open source.

