That is a problem if you happen to have an Nvidia GPU, and, as the article says, with Nvidia forcing this, you will not be able to get that brand of consumer gamer GPU anymore.
If both programs do support Unicode, they should just work. This entire post exists because legacy programs do not. And you are using Win32 because of those legacy programs.
That is also why Win32 seems to be the most stable API for userland programs, whereas on *NIX constant recompiles of the entire userland are very much the norm, and required just so your desktop and apps keep working.
I know that, previously, Unicode- and locale-aware systems were supposed to use Unicode tag characters (U+E0000..U+E007F) to invisibly, and "for all plaintext purposes", mark text for such Han unification handling, but that use is now deprecated.
What am I supposed to use these days? HTML encoded in UTF-8, with lang attributes, so text infested with <span lang="ja-JP"> and <bdi lang="zh-Hans">?
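For contrast, a minimal Java sketch (helper names are mine, purely illustrative): the deprecated approach embedded an invisible plane-14 language tag inside the string itself, while the current recommendation keeps the language out of band, e.g. in a lang attribute in markup.

    // Sketch only: deprecated in-band tag characters vs. out-of-band lang markup.
    public class LangTagging {
        // Deprecated way: U+E0001 LANGUAGE TAG, tag letters from U+E0020..U+E007E,
        // terminated by U+E007F CANCEL TAG, all invisible in rendered plain text.
        static String withDeprecatedTag(String subtag, String text) {
            StringBuilder sb = new StringBuilder();
            sb.appendCodePoint(0xE0001);             // LANGUAGE TAG (deprecated)
            for (char c : subtag.toCharArray()) {
                sb.appendCodePoint(0xE0000 + c);     // shift ASCII into the tag block
            }
            sb.append(text);
            sb.appendCodePoint(0xE007F);             // CANCEL TAG
            return sb.toString();
        }

        // Current recommendation: plain text plus language metadata in markup.
        static String withHtmlLang(String bcp47, String text) {
            return "<span lang=\"" + bcp47 + "\">" + text + "</span>";
        }

        public static void main(String[] args) {
            String tagged = withDeprecatedTag("ja", "直");
            // The tag characters are invisible but still inflate the code point count.
            System.out.println(tagged + " -> " + tagged.codePoints().count() + " code points");
            System.out.println(withHtmlLang("ja-JP", "直"));
        }
    }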
Isn't the whole "thing" about JPA (and all other ORMs ever) that you're supposed to "use it" instead of writing well-optimized native queries directly against your database, so that you can jump ship if the database provider turns out to be shit?
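i.e. all data access goes through JPQL or the Criteria API so the provider can emit each vendor's SQL. A minimal sketch of what I mean, with made-up entity and field names:

    import jakarta.persistence.*;
    import java.util.List;

    // Hypothetical entity; names are made up for illustration.
    @Entity
    public class Customer {
        @Id @GeneratedValue
        private Long id;
        private String country;
        // getters/setters omitted for brevity
    }

    class CustomerQueries {
        // JPQL targets the entity model rather than a vendor's SQL dialect,
        // so the same query should run unchanged on Oracle, PostgreSQL, MySQL, etc.
        static List<Customer> byCountry(EntityManager em, String country) {
            return em.createQuery(
                    "SELECT c FROM Customer c WHERE c.country = :country",
                    Customer.class)
                .setParameter("country", country)
                .getResultList();
        }
    }

Of course, the moment everything is hand-written native SQL for one vendor, that portability argument evaporates.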
Nah, everything we did was hand-crafted for their specific DB.
It was particularly bad because it was a very small family business with equally small customers. And they all had to buy Oracle licenses first, which made us insanely expensive without any of that money going to us lol.
On Oracle Cloud Infrastructure, in my region, "Oracle Database - Base Database Service" (a single-node database) costs the same as a much more powerful managed "Database with PostgreSQL" cluster, or a managed "MySQL HeatWave" cluster.
Under most circumstances, you should still pick a non-Oracle DB on Oracle Cloud Infrastructure.
I might have done the math wrong, but is this really supposed to be 330 × 290 µm² × 128 GiB × 8 ≈ 96 m² big? And that is the RAM one expects per node of a cluster for current LLM AI, never mind future AGI.
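A sketch of how I got there, assuming (my reading, not stated explicitly) that the 330 µm × 290 µm footprint is the area of one 1024-bit array:

    public class AreaEstimate {
        public static void main(String[] args) {
            // Assumption (mine): the 330 um x 290 um footprint holds one 1 Kib (1024-bit) array.
            double arrayAreaUm2 = 330.0 * 290.0;           // um^2 per 1 Kib array
            double bitsDecimal  = 128e9 * 8;               // 128 GB (decimal) in bits
            double bitsBinary   = 128.0 * (1L << 30) * 8;  // 128 GiB (binary) in bits
            double m2PerUm2     = 1e-12;                   // 1 m^2 = 1e12 um^2

            double areaDecimal = bitsDecimal / 1024.0 * arrayAreaUm2 * m2PerUm2;
            double areaBinary  = bitsBinary  / 1024.0 * arrayAreaUm2 * m2PerUm2;

            // Decimal GB reproduces the quoted ~96 m^2; binary GiB gives ~103 m^2.
            System.out.printf("128 GB:  ~%.0f m^2%n", areaDecimal);
            System.out.printf("128 GiB: ~%.0f m^2%n", areaBinary);
        }
    }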