Given the water needs of data centers and the ongoing and upcoming water scarcity, I imagine the problem of heat dissipation is easier to solve, long term, in space.
We can and do build data centres that don't use evaporative cooling; evaporation is just often the cheapest option in places with large natural water sources.
Assuming you're playing the "only pay for a business support plan when you actually need to file a ticket" game like me, with a very slight amount of effort this works in your favor instead of being a downside. Put your expensive-but-reliable stuff (e.g. large 24/7 EC2 instances, your S3 buckets) in one account and your cheap-but-fiddly stuff (e.g. your EKS cluster) in another account. When you need support on the fiddly stuff you're only paying a percent of that account.
At work we did not follow this advice, so we have a single account and we're vulnerable to an unnecessarily high support bill if we happen to need to file a ticket in an expensive month. We could have avoided this with account segmentation; our expensive stuff tends not to be the stuff we need support on.
Enterprise support agreements are organization-wide.
Although you can game Business support (which is priced as a percentage of the account's bill) by not enabling it on accounts like your CloudTrail account, which probably never needs support but can get expensive across a large enough organization.
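For concreteness, here's a rough sketch of how the percentage-of-bill math plays out when you split accounts versus lumping everything together. The tier boundaries and rates below are assumptions based on AWS's published Business support pricing, so check the current pricing page before relying on them.

```python
# Rough sketch of AWS Business support pricing (assumed tiers: 10% of the
# first $10k of monthly usage, 7% of the next $70k, 5% of the next $170k,
# 3% beyond that, with a $100 minimum). Verify against the current AWS
# Support pricing page before relying on these numbers.

TIERS = [(10_000, 0.10), (70_000, 0.07), (170_000, 0.05), (float("inf"), 0.03)]

def business_support_cost(monthly_spend: float) -> float:
    cost, remaining = 0.0, monthly_spend
    for width, rate in TIERS:
        chunk = min(remaining, width)
        cost += chunk * rate
        remaining -= chunk
        if remaining <= 0:
            break
    return max(cost, 100.0)

# One combined account: support is billed on the whole month's spend.
print(business_support_cost(50_000))  # 10_000*0.10 + 40_000*0.07 = 3,800

# Split accounts: only the cheap-but-fiddly account needs a ticket this month.
print(business_support_cost(5_000))   # 5_000*0.10 = 500
```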
I had a similar issue recently and was able to convince the AI agent to give me a phone number to talk to a support representative. They manually fixed my account and key and I was good to go in a few minutes.
Not a big fan of this kind of promo article that just links to several of their own learning resources instead of giving some actual examples inline.
I've worked in Mongo enough to know that whatever decision I make will end up being wrong.
What I will never understand is why Mongo doesn't have some simple means of document referencing that automatically updates the documents a doc is embedded in. If it's such an important pattern that every app needs to reinvent it for itself, just add it to the system.
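To illustrate what "every app reinvents for itself" looks like in practice, here's a minimal sketch of the manual fan-out you end up writing when a referenced document is also embedded elsewhere. The collection and field names (authors, posts, author) are hypothetical.

```python
# A minimal sketch of the sync code an app writes when it embeds copies
# of a referenced document. Collection/field names are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["blog"]

def rename_author(author_id, new_name):
    # Update the canonical document...
    db.authors.update_one({"_id": author_id}, {"$set": {"name": new_name}})
    # ...then manually propagate the change to every document that embeds a copy.
    db.posts.update_many(
        {"author._id": author_id},
        {"$set": {"author.name": new_name}},
    )
```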
People who don't know how to code know they don't know how. They can look over your shoulder and see that it looks like gibberish, and they also have no interest in understanding it even if they could.
On the other hand, designing the software or engineering a solution to the problem seems like something they could do, as far as they know, because it's not something concrete that they can look at and see is beyond their abilities.
Tech is evolving too quickly; every year the hardware will be much more powerful at the same price (as LLM optimizations reach hardware), so you’d end up replacing the device frequently.
GPUs and NPUs are gaining optimizations for the transformer architecture. It's not "the GPU is 3x faster this year", it's "the GPU has gates specifically designed to accelerate your LLM workload".
See for instance [0], which is just starting to appear in commercial parts.
This is continuing; pretty much every low cost SoC maker is racing to build and extend ML optimizations.
>without any one of import actually going to jail.
I'd argue these kids won't "be of import" in the grand scheme of things. Maybe in future software ethics/security classes at best. I sadly don't think Musk will ever be in a jail cell, but I'll settle for him never setting foot in a federal facility again.
>There's a reason Musk brought in younglings to do the illegal stuff.
You can pardon people of any age. And Trump, even in this future of "there won't be any more elections", probably isn't living much longer than this presidential term given his health.