falconroar's comments | Hacker News

They will get away with it if we believe we are powerless to change it. Russia has been proven to be pushing defeatist propaganda similar to your sentiment, and I'm sure Israel has been as well.


Are Gazans receiving adequate nutrition? If not, why are we discussing semantics?


Starving infants is justifiable sometimes? When?


It is justifiable to stop a shipment of baby formula if that baby formula is known to be unsafe and carry bacteria that will kill infants.

I think in this particular case it's quite safe to say that those blocking the shipments aren't acting in good faith, however.


Does the food carry bacteria in reality? Why are we talking about bizarre hypotheticals?


Starving infants is morally justifiable if it's possible to make a rocket from their food?


Preventing the entry of something that can be made into a weapon is justified, yes. If you want to call that “starving children”, that’s up to you.


Withholding food from children results in children starving. It’s not semantics.


Only in the same way that pointing at a starving infant as a prop is a moral justification for taking the food meant for that infant and using it to manufacture weapons instead.


A Pew poll in 2022 found that about half (48%) of Israeli Jews agree that Arabs should be expelled or transferred from Israel. Horrifying statistic.


Now do Arab polls.

Seriously, you can't apply Western standards to a Middle Eastern country, much less single out one ME country to apply them to.


Being against genocide isn’t a western standard.


1) Not a genocide

2) If forces were reversed, it would be one for sure.

--

Anyhow, war and terror are on the rise in Europe right now, cue the immediate swing to the far right in the polls, so good luck!


I thought the opposite - they set a precedent indicating that reproduction of a copyrighted text by an LLM is infringement. If authors refuse to sell to them (via legal terms indicating LLMs aren't allowed), it's infringement. No?

I'd be curious to hear from a legal professional...


Essential context. So many variables here, and a very naive experimental procedure. Also, "Cognitive Decline" is never mentioned in the paper.

An equally valid conclusion is "People are Lazier at Writing Essays When Provided with LLMs".


Who did Tesla have to pay off to sell a product called Full Self-Driving that is not actually fully self-driving?


I don't understand why developers of PyTorch and similar don't use OpenCL. Open standard, runs everywhere, similar performance - what's the downside??


I don’t know for sure why the early PyTorch team picked CUDA, but my guess is simplicity and performance. Nvidia optimizes CUDA better than OpenCL and provides tons of useful performance tuning tools. It is hard to match CUDA performance with OpenCL even on the same Nvidia GPU hardware, and making performant code compatible across different GPUs with OpenCL is also hard. I know of examples of scientific codes that became simpler and faster (on Nvidia hardware) by going from OpenCL to CUDA, but I haven't yet heard of examples the other way around.


Is there any reason OpenCL is not the standard in implementations like PyTorch? Similar performance, open standard, runs everywhere - what's the downside?


IIRC, ease of implementation (for the GPU kernels), and cross-compatibility (the same bytecode can be loaded by multiple models of GPU).


How is CUDA-C that much easier than OpenCL? Having ported back and forth myself, the base C-like languages are virtually identical. Just sub "__syncthreads();" for "barrier(CLK_LOCAL_MEM_FENCE);" and so on. To me the main problem is that Nvidia hobbles OpenCL on their GPUs by not updating their CL compiler to OpenCL 2.0, so some special features are missing, such as many atomics.
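To make the "virtually identical" point concrete, here is a minimal sketch assuming a simple block-level sum reduction; the kernel name and sizes are made up, and each CUDA-specific line carries its OpenCL spelling in the trailing comment.

    // CUDA version of a block-level reduction (assumes blockDim.x == 256,
    // a power of two). OpenCL equivalents shown in trailing comments.
    __global__ void blockReduce(const float* in, float* out, int n)  // __kernel void blockReduce(__global const float* in, __global float* out, int n)
    {
        __shared__ float tile[256];                                  // __local float tile[256];
        int lid = threadIdx.x;                                       // int lid = get_local_id(0);
        int gid = blockIdx.x * blockDim.x + threadIdx.x;             // int gid = get_global_id(0);

        tile[lid] = (gid < n) ? in[gid] : 0.0f;
        __syncthreads();                                             // barrier(CLK_LOCAL_MEM_FENCE);

        for (int s = blockDim.x / 2; s > 0; s >>= 1) {               // for (int s = get_local_size(0) / 2; ...)
            if (lid < s) tile[lid] += tile[lid + s];
            __syncthreads();                                         // barrier(CLK_LOCAL_MEM_FENCE);
        }
        if (lid == 0) out[blockIdx.x] = tile[0];                     // if (lid == 0) out[get_group_id(0)] = tile[0];
    }

Everything that differs is a mechanical rename; the memory model and work decomposition are the same in both dialects.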


Never used it myself, these are just the main reasons I've heard from friends.


The ease of implementation using CUDA means that your code becomes effed for life, because it is no longer valid C/C++, unless you totally litter it with #ifdefs to special-case for CUDA. In my own proprietary AI inference pipeline I've ended up code-generating to a bunch of different backends (OpenCL SPIR-V, Metal, CUDA, HLSL, CPU w/ OpenMP), giving no special treatment to CUDA, and the resulting code is much cleaner and builds with standard open source toolchains.
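For illustration, a hedged sketch of the kind of #ifdef special-casing described here, using nvcc's __CUDACC__ macro as the switch; the DEVICE_FN/KERNEL_FN macros and the saxpy kernel are made-up names, not anything from the commenter's actual pipeline.

    // Same source compiled either as CUDA (via nvcc) or plain C++.
    #ifdef __CUDACC__
      #define DEVICE_FN __device__ __forceinline__
      #define KERNEL_FN __global__
    #else
      #define DEVICE_FN inline
      #define KERNEL_FN
    #endif

    DEVICE_FN float saxpy_elem(float a, float x, float y) {
        return a * x + y;   // shared scalar body, usable from every backend
    }

    KERNEL_FN void saxpy(float a, const float* x, float* y, int n) {
    #ifdef __CUDACC__
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // CUDA indexing
        if (i < n) y[i] = saxpy_elem(a, x[i], y[i]);
    #else
        for (int i = 0; i < n; ++i)                      // plain C++ / OpenMP fallback
            y[i] = saxpy_elem(a, x[i], y[i]);
    #endif
    }

Every backend you add multiplies these branches, which is exactly why generating each backend's source from one higher-level description ends up cleaner than hand-maintaining the #ifdef soup.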


> The ease of implementation using CUDA means that your code becomes effed for life

Yes, yes it absolutely does. That's how you establish market dominance: everyone wants to use CUDA, but almost nobody wants to write their kernel twice.


Downsides are that it can't express a bunch of stuff CUDA or OpenMP can, plus the Nvidia OpenCL implementation is worse than their CUDA one. So OpenCL is great if you want a lower-performance way of writing a subset of the programs you want to write.
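One concrete example of that expressiveness gap, as a hedged sketch: CUDA's warp shuffle intrinsics (available since CUDA 9) have no counterpart in the OpenCL 1.2 that Nvidia's driver exposes, so a register-level warp reduction like the one below would have to go through slower local memory in OpenCL. The warpSum helper is illustrative, not from any particular project.

    // Sum across the 32 lanes of a warp using register shuffles.
    // Lane 0 ends up holding the full sum; no shared memory needed.
    __inline__ __device__ float warpSum(float val)
    {
        for (int offset = 16; offset > 0; offset >>= 1)
            val += __shfl_down_sync(0xffffffffu, val, offset);
        return val;
    }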

