
Whether you think infinity exists or not is up to you, but transfinite mathematics is very useful; it's used to prove results like Goodstein's theorem in a surprisingly elegant way. Goodstein sequences don't seem to have anything to do with infinity at first glance.

> self-advertised as uncompromising privacy focused OS

> didn't even compromise even a bit (negotiation is already a compromise) against a country who is notorious for advocating for privacy-invasive policies in recent years

> get lectured by yc high-horse rider on the obligation blah blah, even when by and large this move doesn't materially affect the end-users in any substantial way

I used 4chan style because most of the time 4chan commenters have more sense than yc these days. Many people here do live in glass houses.



GP here. Why have three people now said I commented on obligations? I said nothing about it (and thought nothing about it).

Vulkan was one of the hardest things I've ever tried to learn. It's so unintuitive and tedious that it seemingly drains the joy out of programming. Tiny brain =(

You don't have a tiny brain. Vulkan is a low-level chip abstraction API, and is about as joyful to use as a low-level USB API. For a more fun experience with very little source code needed to get started, I'd recommend trying OpenGL (especially pre-2.0, before they introduced shaders and started down the GPU-programming path), but the industry is dead-set on killing OpenGL for some reason.

Vulkan is definitely a major pain and very difficult to learn... But once you've created an init function, a create-buffer function, a create-material function, etc. (which you do once), you can largely just ignore it and write at a higher level.

I don't like Vulkan. I keep thinking: did nobody look at this and think 'there must be a better way'? But it's what we've got, and mostly it's just a matter of learning it and writing the code once.


> I'd recommend trying OpenGL

Tbh, OpenGL sucks just as much as Vulkan, just in different ways. It's time to admit that Khronos is simply terrible at designing 3D APIs ;) (probably because there are too many cooks involved)


Does anyone know why the industry is killing OpenGL?

People wanted more direct control over the GPU and memory, instead of having the drivers do that hard work.

To fix this AMD developed Mantle in 2013. This inspired others: Apple released Metal in 2014, Microsoft released DX12 in 2015, and Khronos released Vulkan in 2016 based on Mantle. They're all kind of similar (some APIs better than others IMO).

OpenGL did get some extensions to improve it too, but in the end all the big engines just use the other three.


OpenGL cannot achieve the control over modern hardware necessary to get competitive performance. Even in terms of CPU overhead it’s very limiting.

Direct3D (and Mantle) had been offering lower level access for years, Vulkan was absolutely necessary.

It’s like assembly. Most of us don’t have to bother.


When I first tried to learn Vulkan, I felt the exact same way. As I was following the various Vulkan tutorials online, I felt that I was just copying the code, without understanding any of it and internalizing the concepts. So, I decided to learn WebGPU (via the Google Dawn implementation), which has a similar "modern" API to Vulkan, but much more simplified.

The commonalities to both are:

- Instances and devices

- Shaders and programs

- Pipelines

- Bind groups (in WebGPU) and descriptor sets (in Vulkan)

- GPU memory (textures, texture views, and buffers)

- Command buffers

Once I was comfortable with WebGPU, I eventually felt restrained by its limited feature set. The restrictions of WebGPU gave me the motivation to go back to Vulkan. Now, I'm learning Vulkan again, and this time, the high-level concepts are familiar to me from WebGPU.

Some limitations of WebGPU are its lack of push constants, and the "pipeline explosion" problem (which Vulkan tries to solve with the pipeline library, dynamic state, and shader object extensions). Meanwhile, Vulkan requires you to manage synchronization explicitly with fences and semaphores, which required an additional learning curve for me, coming from WebGPU. Vulkan also does not provide an allocator (most people use the VMA library).
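
To make that concrete, here's a rough sketch of my own (not from the comment above) showing two of the pieces Vulkan leaves to you: buffer allocation via the VMA library and explicit synchronization with a fence. It assumes a VkDevice, VkQueue, VmaAllocator, and a recorded VkCommandBuffer already exist from init; the names and sizes are made up for illustration.

    #include <vulkan/vulkan.h>
    #include "vk_mem_alloc.h"  // VulkanMemoryAllocator (VMA)

    // Hypothetical helper: allocate a storage buffer through VMA, submit a
    // pre-recorded command buffer, and block on a fence until the GPU is done.
    void submit_example(VkDevice device, VkQueue queue,
                        VmaAllocator allocator, VkCommandBuffer cmdBuf) {
        VkBufferCreateInfo bufInfo{};
        bufInfo.sType = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO;
        bufInfo.size  = 64 * 1024;                           // 64 KiB, arbitrary
        bufInfo.usage = VK_BUFFER_USAGE_STORAGE_BUFFER_BIT;

        VmaAllocationCreateInfo allocInfo{};
        allocInfo.usage = VMA_MEMORY_USAGE_AUTO;             // let VMA pick the memory type

        VkBuffer buffer;
        VmaAllocation allocation;
        vmaCreateBuffer(allocator, &bufInfo, &allocInfo, &buffer, &allocation, nullptr);

        // Explicit host/GPU synchronization: the fence is signaled when the submit finishes.
        VkFenceCreateInfo fenceInfo{VK_STRUCTURE_TYPE_FENCE_CREATE_INFO};
        VkFence fence;
        vkCreateFence(device, &fenceInfo, nullptr, &fence);

        VkSubmitInfo submit{VK_STRUCTURE_TYPE_SUBMIT_INFO};
        submit.commandBufferCount = 1;
        submit.pCommandBuffers    = &cmdBuf;
        vkQueueSubmit(queue, 1, &submit, fence);
        vkWaitForFences(device, 1, &fence, VK_TRUE, UINT64_MAX);

        vkDestroyFence(device, fence, nullptr);
        vmaDestroyBuffer(allocator, buffer, allocation);
    }

In WebGPU the device and queue handle most of this bookkeeping for you, which is roughly the gap described above.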

SDL_GPU is another API at a similar abstraction level to WebGPU, and could be an easier starting point than Vulkan. So if you're still interested in learning graphics programming, WebGPU or SDL_GPU could be good to check out.


You don't have a tiny brain--programming Vulkan/DX12 sucks.

The question you need to ask is: "Do I need my graphics to be multithreaded?"

If the answer is "No"--don't use Vulkan/DX12! You wind up with all the complexity and absolutely zero of the benefits.

If performance isn't a problem, use anything else--OpenGL, DirectX 11, game engines, etc.

Once performance becomes the problem, then you can think about Vulkan/DX12.


What about new features? There are many small features that can't be used via the older APIs, and bigger ones like accelerated ray tracing.

Sure, but then you've already thrown away the possibility of using "simpler" APIs that everybody is whining that DX12/Vulkan is more complicated than.

Programmers should absolutely not be using DX12/Vulkan unless they understand exactly why they should be using it.


Exactly the reason why I haven't switched from OpenGL to Vulkan. Vulkan is just ridiculously overengineered. CUDA shows that allocating GPU memory and copying from host to device can be one-liners, yet in Vulkan it's an incredible amount of boilerplate to go through. Modern Vulkan fixes a lot of issues, like getting rid of pipelines, render passes, bindings, etc., but there is still much more to fix before it's usable.
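
For contrast, here's what the CUDA side of that comparison looks like; this is my own minimal sketch of the runtime API, not code from the comment, and the sizes are arbitrary. The allocation and the host-to-device copy really are one call each, whereas stock Vulkan needs buffer creation, memory-type selection, allocation, binding, mapping, and usually a staging copy for the same result.

    #include <cuda_runtime.h>
    #include <vector>

    int main() {
        std::vector<float> host(1 << 20, 1.0f);  // 1M floats on the host

        // Device allocation: one call.
        float* dev = nullptr;
        cudaMalloc(&dev, host.size() * sizeof(float));

        // Host-to-device copy: one call.
        cudaMemcpy(dev, host.data(), host.size() * sizeof(float),
                   cudaMemcpyHostToDevice);

        cudaFree(dev);
        return 0;
    }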

I think anyone who has ever looked at typical Vulkan code examples would reach the same conclusion: it's not for application/game developers.

I really hope SDL3 or wgpu can be the abstraction layer that settles all this down. I personally bet on SDL3, just because they have support from Valve, a company that has reasons to care about cross-platform gaming. But I would look into wgpu too (...if I were better at Rust, sigh).


Yep. Most of the engine and "game from scratch" tutorials on Youtube, etc, use this style of having OpenGL code strewn around the app.

With Vulkan this is borderline impossible and it becomes messy quite quickly. It's very low level. Unlike OpenGL, one really needs an abstraction layer on top, so you either gotta use a library or write your own in the end.


For wgpu, someone else mentioned in another comment that there are bindings for other languages, maybe your favorite too!

This reminds me of a friend I had in college. We were assigned to the same group coding an advanced calculator in C. This guy didn't know anything about programming (he was mostly focused on his side biz of selling collector sneakers), so we assigned him to do all the testing; his job was to come up with weird equations and weird but valid ways to present them to the calculator. And this dude somehow managed to crash almost all of our iterations except the last few. Really puts the joke about a programmer, a tester, and a customer walking into a bar into perspective.

I love that he ended up making a very valuable contribution despite not knowing how to program -- other groups would have just been mad at him, had him do nothing, or had him do programming and gotten mad when it was crap or not finished.

Of course there will be fallout, but much less severe than a meltdown on land. Water is extremely effective at containing radiation, and the ocean is so huge it will be diluted to a negligible amount very quickly.

> the ocean is so huge it will be diluted to a negligible amount very quickly.

Are you basing this on actually knowing this? Because it sounds doubtful to me. The ocean would probably dilute it a bit, maybe even a bunch, but it'll also lead to contamination of everything around there: the bottom, the animals, and so on. The radiation doesn't suddenly "disappear" in "thin" water.


Unfortunately the most popular distro (Ubuntu - Canonical) is behaving more and more like Microsoft. I updated to 25.10 last week and it decided to ignore my settings, reset the snap priority, and reinstall the snap Firefox package, all without my consent. I was already fed up when Canonical decided to hijack apt to inject their own proprietary closed-source snap packages; now, after having dealt with it again and again after each major upgrade, I just switched to Fedora GNOME a few days ago and I'm not missing anything from Ubuntu.

Corporate ethics-wise, Canonical is vastly better than Microsoft.

But I prefer Debian Stable, for reasons both pragmatic and on-principle:

https://cdimage.debian.org/debian-cd/current/amd64/iso-dvd/d...

(Or people can go to a confusing download page: https://www.debian.org/distrib/ )


Debian isn't heading down the best path either with their policies (keeping the child sex predator on their payroll to do conferences that parents often bring kids to, forcing the removal of the fortune package because it was deemed offensive, openly stating that straight white males shouldn't apply for internships, etc.)

The Amazon Lens was pretty bad ethically.

> Canonical is vastly better than Microsoft.

However, Canonical apologised and removed it.

Microsoft doubled down, adding more adverts.


Just BSD and chill

I switched to Mint (Mate) around 2012 or so because of radical UI changes made by Canonical. At the time, the "mobile revolution" was the big industry trend. Windows 8, which was designed for touch screens, had come out (and people hated it) ... and Canonical released a new default desktop environment (I think it was Unity? Memory is fuzzy). It was shocking to me, and when I complained about it, a friend recommended Mint.

The nice thing about Linux is that you have max choice. That can pose problems for new users who might be a bit overwhelmed but we shouldn't pretend that Canonical "owns" Linux or that everyone is necessarily going to land there. I recommend Mint when people tell me they're thinking of giving Linux a try. Haven't given Ubuntu a second thought in years.


I went from Ubuntu to Mint around the same time on my laptop. I took my desktop from Ubuntu to Fedora. A later laptop followed it, because I was tired of the little differences.

Ubuntu is completely off my radar too. So many dumb things that often lasted a few releases. Like ads for their cloud services, Unity for a while, window controls on the left for a while...

My biggest problem with Mint was that upgrading the OS became a hassle if I put it off for too long (which I started doing after one release's not-so-smooth upgrade experience).


I'm happy to report that I upgraded my Mint from the previous to the most recent version without any snags. I then discovered I was still on an older kernel version, which required a little more research, but went without any difficulties. The second part is admittedly something that not-so-technical people will have considerable difficulty in even realizing. But all in all I can say that Mint is a solid, stable, and usable system for experts and novices alike.

Curious about your experience with Mint Debian Edition. I currently use Debian Stable as my workstation & local server - I considered Mint, but since it was Ubuntu-based, I held off...

What is unfortunate? You found one alternative was not to your liking, and another right there to take its place. You didn't have to pay for anything. You were not locked in to anything. Now you are not fighting your OS. Seems to be working as it should.

Some people like Ubuntu, but I don't, for so many reasons. I'd rather use Debian.

Same for Fedora, which I don't like either. I prefer RockyLinux or AlmaLinux if you really need a RHEL-compatible system.

There are other options, most of them based on Debian or Ubuntu.

My desktop choice is ArchLinux with Plasma or XFCE4. No snaps, no crap.

My servers choice is RockyLinux 8 or 10.


Who are the people still praising Ubuntu? Where does it come from, this Ubuntu-by-default thing? Why? I'm genuinely interested. It was one of my first distros, but that was back when they were doing the shipping-CDs thing. There are countless distros that are better out of the box, e.g. Fedora. Sincerely, I don't understand. Who uses Ubuntu these days, and why? Especially on servers, lol. Why not use Debian then?

Privately I'm using Fedora, because that works. But my last two companies have used Ubuntu overall, maybe because there are so many packages available. It's still far less stable than Fedora; I have to fight stupid Ubuntu bugs every single day. On my Fedora machines things test OK, so that is my gold standard.

Ubuntu was always like this. Use Debian.

Ubuntu has been like this since 2009; use Pardus (which is based on Debian and follows roughly the same release process as Ubuntu, but no snaps).

Perhaps SteamOS will take up the mantle.

Or distros taking cues from it like Bazzite.

Mint is love, Mint is life.

Debian, Mint, Fedora...

Canonical needs snap in order to distinguish them from all the other Linux distros, so they've gone overboard to make sure that you "need" it.

I think it's horrible that they've taken extreme measures to overtly circumvent their users' desire to run the Firefox distributed through Mozilla's repo.

The following link describes how to overcome the latest version of Canonical's extreme insistence on the snap version of Firefox. It's almost laughable when you see how far they've gone to try to lock you in.

https://gist.github.com/jfeilbach/78d0ef94190fb07dee9ebfc340...
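
For reference (this is the general shape of the workaround, not a quote from the gist): assuming you've added Mozilla's own APT repository at packages.mozilla.org, the trick is an apt preferences pin that outranks Ubuntu's snap-transition package. The file name and the second stanza below are illustrative:

    # /etc/apt/preferences.d/mozilla
    Package: *
    Pin: origin packages.mozilla.org
    Pin-Priority: 1000

    # Keep Ubuntu's transitional firefox package (which pulls in the snap) from winning.
    Package: firefox*
    Pin: release o=Ubuntu
    Pin-Priority: -1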


It's the same number of pixels though, just with reduced bitrate for unfocused regions, so you save time in encoding, transmitting, and decoding, essentially reducing latency.

For foveated rendering, the number of rendered pixels is actually reduced.


At least when we implemented this in the first version of Oculus Link, the way it worked is that the image was distorted (AADT [1]) onto a deformed texture before compression and then regenerated as rectilinear after compression, as a cheap and simple way to emulate fixed foveated rendering. So it's not that there's some kind of adaptive bitrate that applies fewer bits outside the fovea region; it achieves a similar result by giving that region fewer pixels in the image being compressed. Doing adaptive bitrate would work too (and maybe even better), but encoders (especially HW-accelerated ones) don't support that.

Foveated streaming is presumably the next iteration of this, where eye tracking gives you better information about where to apply this distortion, although I'm genuinely curious how they manage to make this work well - eye tracking is generally high latency, but the eye moves very, very quickly (maybe HW and SW have improved, but they allude to this problem, so I'm curious whether their argument about using this at a low frequency really improves things meaningfully vs more static techniques).

[1] https://developers.meta.com/horizon/blog/how-does-oculus-lin...


Although your eye moves very quickly, your brain has a delay in processing the completely new frame you switched to. It's very hard to look left and right with your eyes and read something quickly changing on both sides.

That depends on the specifics of the encode/decode pipeline for the streamed frames. Could be the blurry part actually is lower res and lower bitrate until it's decoded, then upscaled and put together with the high res part. I'm not saying they do that, but it's an option.


It’s the same number of pixels rendered, but it lets you reduce the amount of data sent, thereby allowing you to send more pixels than you would have been able to otherwise.


This explains why it suddenly failed for me yesterday and the downloads were limited to 360p only. `dnf install deno` and it was back to normal.


Encoding/decoding tokens doesn't automatically mean lossy. Images, at least in terms of raw pixels, can be a very inefficient form of storing information from an information-theoretic perspective.

Now, the difficulty is in achieving an encoding/decoding scheme that is both information-efficient AND semantically coherent in latent space. Seems like there is a tradeoff here.


Enduring based on what metrics? Fortnite is now an 8-year-old and still massively successful game. And the Battlefield series is actually 7 years older than the Souls series if you count from Demon's Souls. Comparing these 3 games is even more absurd because they are from entirely different genres, and they are not mutually exclusive; one can enjoy more than one genre.

I agree with other commenters in here, I feel sorry for your kids and thankful that my parents didn't treat me like what you are doing with your kids now.


When I think about enduring titles, I think about whether it's the kind of game that you'd pick up 20 years later and still consider to be a good game. I raised them on a curriculum of games starting with 80s titles, and as they got older I progressed them all the way up into the 2020s so they would have a perspective on where particular gameplay mechanics came from. I see your point about the longevity of the Battlefield series and Fortnite, but my impression is that people don't go back and play earlier Battlefield titles: I have always viewed them a little bit like the FIFA titles, where there is a constant treadmill of needing to buy the latest version of the game. This is not true for Dark Souls, for example -- it's the sort of game you could play decades from now and it would be as much of a masterpiece then as it is today. I didn't really mean to compare the titles, but rather to use them as examples of titles that I would approve or disapprove of.

My kids choose the games they play, but I exercise judgment in vetoing certain decisions. The From Software titles in my example were not games that I bought for them, or even played (in the case of Elden Ring), but rather titles that my boys were into because their friend group was playing them. They've been playing Nightreign lately and enjoying it. I think people read my dismissal of Battlefield and Fortnite as indicative of some much larger pattern that they've had a really bad experience with, but I'm not sure that conclusion is warranted.

