Hacker News | jsheard's comments

Nanoraptor kindly fixed their logos for them: https://bsky.app/profile/nanoraptor.danamania.com/post/3mbif...

That kerning is painful on a few of those. Cleverly done, though.

Not to mention that fractional scaling is practically required in order to use the majority of higher DPI monitors on the market today. Manufacturers have settled on 4K at 27" or 32" as the new standard, which lends itself to running at around 150% scale, so to avoid fractional scaling you either need to give up on high DPI or pay at least twice as much for a niche 5K monitor which only does 60hz.

Fractional scaling is a really bad solution. The correct way to fix this is to have DPI-aware applications and toolkits. This does in fact work: I have run Xfce under Xorg for years now on hi-DPI screens just by setting a custom DPI and using a hi-DPI-aware theme. When the goal is to have perfect output, why do people suddenly want to jump to stretching images?

The overwhelming majority of the low-DPI external displays at this point are 24-27" 1080p.

Most high-DPI displays are simply the same thing with exactly twice the density.

We settled on putting exactly twice as many pixels in the same panels because it facilitates integer scaling.


That doesn't gel with my experience: 1080p was the de facto resolution for 24" monitors, but 27" monitors were nearly always 1440p, and switching from 27" 1440p to 27" 4K requires a fractional 150% scale to maintain the same effective area.

To maintain a clean 200% scale you need a 27" 5K panel instead, which do exist but are vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
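The scale arithmetic behind these comments can be checked directly. A small illustrative sketch, using the panel sizes named above and treating 27" 1440p as the 100% baseline:

```python
import math

def ppi(diagonal_in, w_px, h_px):
    # Pixels per inch along the diagonal of a w_px x h_px panel.
    return math.hypot(w_px, h_px) / diagonal_in

baseline = ppi(27, 2560, 1440)  # 27" 1440p, ~109 PPI
for name, diag, w, h in [
    ('24" 1080p', 24, 1920, 1080),
    ('27" 1440p', 27, 2560, 1440),
    ('27" 4K',    27, 3840, 2160),
    ('27" 5K',    27, 5120, 2880),
]:
    p = ppi(diag, w, h)
    print(f"{name}: {p:.0f} PPI, {p / baseline:.2f}x the 1440p baseline")
```

A 27" 4K panel comes out around 163 PPI, exactly 1.5x the 1440p baseline (hence the 150% scale), while 27" 5K lands on exactly 2.0x.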


Why not give up on high DPI?

Save money on the monitor, save money on the gpu (because it's pushing fewer pixels, you don't need as much oomph), save frustration with software.


4K monitors aren't a significant expense at this point, and text rendering is a lot nicer at 150% scale. The GPU load can be a concern if you're gaming but most newer games have upscalers which decouple the render resolution from the display resolution anyway.

Typing "Visual Studio" into the new start menu may randomly trigger a Bing search for "Visual Studio" instead of running it, but on the other hand that makes Bings KPIs go up so it's impossible to say if it's bad or not.

I hate that so much. When blind people are trying to start JAWS (the screen reader) by typing "jaws" into the start menu and pressing Enter, it will sometimes pull up a Bing page on Jaws the movie instead. And the blind person is just sitting there waiting for the screen reader to start. I tell people to use the run dialog for that reason. Sucks but that's what you have to do in the age of inshittisoft.

They are apparently replacing the run dialog with a new "Modern Run" dialog, so we can look forward to that also not working properly:

https://www.windowscentral.com/microsoft/windows-11/after-30...


the only sane tool remaining in Windows is RUN :( I won't even touch this shitty OS without RUN

This is purely insane. Doesn’t Microsoft violate accessibility laws in some jurisdiction due to this?

"rules for thee not for me"

It's been a while since I used Windows regularly or seriously, but I remember start menu search actually being good - maybe around Win7 days? You would just press <Win>, type a few letters of the software you wanted and hit enter, and it would work every time with minimal latency.

You know, like KDE Plasma in 2026.


'Randomly' here likely corresponds to a typo in the search term.

If I type "Visual Std" instead of "Visual Stu" it goes to the Bing results.

Alternatively, it shows "No results" if you disable Bing in the Search settings found in the top-right meatballs menu.

I also would expect fuzzy search by default instead of typos sending users to Bing.


It also orrelates with missing the first letter off search terms, such as when you start typing immediately after opening the start menu

  > orrelates with missing the first letter off
Intended?

Still, that shows an issue with using fuzzy search for Bing but not for programs. Local items should take precedence. A typo is far more likely than a web search being intended, especially when the web search ends up surfacing the intended application anyway.

Did no one think of that feedback loop? If the web search is suggesting an installed app, shouldn't that installed app be prioritized?
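The local-first ranking being suggested here is easy to sketch. This is a toy illustration only (the app list and 0.6 cutoff are arbitrary assumptions, and it is not how Windows search actually works):

```python
import difflib

installed = ["Visual Studio", "Visual Studio Code", "Paint", "Notepad"]

def resolve(query, apps, cutoff=0.6):
    # Prefer a fuzzy match against locally installed apps; only fall back
    # to a web search when nothing local is close enough to the query.
    matches = difflib.get_close_matches(query, apps, n=1, cutoff=cutoff)
    return ("app", matches[0]) if matches else ("web", query)

print(resolve("Visual Std", installed))  # typo still resolves to a local app
print(resolve("weather", installed))     # no local match, web fallback
```

With this ordering, the "Visual Std" typo from upthread resolves to the installed app rather than a Bing results page.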


I can only reproduce this by hovering the Windows icon with the mouse and having the finger on a character, in order to press it immediately after clicking. In that case most of the time the Start menu does not open at all, and sometimes it opens but does not have the letter.

When I use the Windows key to open the Start menu I cannot reproduce this, since e.g. Win + E opens Explorer instead of the Start menu.

It does not appear on my machine as if this could possibly happen when opening the Start menu during regular use. Can you reproduce this on your machine?


This rarely (but not never) happened on my gaming desktop when I had Windows on it. On the other hand, on my Surface Go, if it only eats the first character, that's a good showing, so it's likely device-performance specific.

Objectively it wastes developer time, making the OS more expensive for companies in a nonlinear way. It's like a monthly subscription for ever more minutes.

If your opinion mattered, you would work at Microsoft setting the targets that the Start Menu team need to meet to hit their bonuses/not get fired.

But you don't. So it doesn't.

(I've pinned Visual Studio to the start menu.)


It takes literally a click to deactivate it though. One could argue about Bing Search being the default, but I didn't run the user surveys to see which is best for the average user.

Either I am stupid or you're being dishonest. There is no one-click way to disable it: only on Pro versions of Windows via a group policy, otherwise a couple of obscure registry keys no regular user knows about.
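For completeness, a sketch of the kind of registry tweak alluded to here. The key and value names are assumptions drawn from commonly circulated guidance, not anything verified in this thread, and this only runs on Windows:

```python
# Illustrative sketch only: a commonly cited registry-based approach for
# suppressing web results in Start menu search on editions without the
# group policy editor. Key/value names are assumptions; verify on your build.
import winreg

SUBKEY = r"Software\Policies\Microsoft\Windows\Explorer"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, SUBKEY) as key:
    # 1 = disable Bing/web suggestions in the search box (per common guidance)
    winreg.SetValueEx(key, "DisableSearchBoxSuggestions", 0, winreg.REG_DWORD, 1)
# Sign out or restart Explorer for the change to take effect.
```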

I stand corrected. I didn’t know the Home Version has no option to disable it. You are right for calling me out.

No group policy is needed on Windows 11 Pro, it's in the settings:

---

Type something in the Start menu

Top right meatballs menu button

'Search settings'

'Let search apps show results' -> Off (or disable only Bing)

---

I don't know about the Home edition.


That doesn't appear for me. Win 11 Pro 25H2.

After you typed a Search term, there is no menu button at the right on the same vertical row as the 'All / Apps / Documents' bar?

You can also launch that Settings page by running in powershell:

  Start-Process "ms-settings:cortana-windowssearch"
Or just 'Settings' and in the left navigation 'Privacy & Security' -> 'Search'

The best is none. I have no idea why anyone would want a web search there.

> After all, the most played competitive shooter is CS and Valve does not use kernel-level AC.

Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.


The best description I've been able to give of the dichotomy of CS is this: there is no way for a person to become good enough to get their signature into the game without using kernel-level ACs.

Another unresolved roadblock is Nvidia cards seriously underperforming in DX12 games under Proton compared to Windows. Implementing DX12 semantics on top of Vulkan runs into some nasty performance cliffs on their hardware, so Khronos is working on amending the Vulkan spec to smooth that over.

That's being addressed:

    - https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207/432
    - https://indico.freedesktop.org/event/10/contributions/402/attachments/243/327/2025-09-29%20-%20XDC%202025%20-%20Descriptors%20are%20Hard.pdf
    - https://www.youtube.com/watch?v=TpwjJdkg2RE
The problem is on multiple levels, so everything has to work in conjunction to be fixed properly.

What percentage of games require DX12? From what I recall, a surprisingly large percentage of games support DX11, including Arc Raiders, BF6 and Helldivers 2, just to name a few popular titles.

At the same time, Vulkan support is also getting pretty widespread, I think notably idTech games prefer Vulkan as the API.


DX12 is overwhelmingly the default for AAA games at this point. The three titles you listed all officially require DX12, what DX11 support they have is vestigial, undocumented and unsupported. Many other AAAs have already stripped their legacy DX11 support out entirely.

Id Software do prefer Vulkan but they are an outlier.


DX12 is less and less the default, most gamedev that I’ve seen is surrounding Vulkan now.

DX12 worked decently better than OpenGL before, and all the gamedevs had Windows, and it was required for Xbox… but now those things are less and less true.

The PlayStation was always “odd-man-out” when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper Linux, so we rewrote a huge amount of our renderer to be better behaved for Vulkan.

All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, one that is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.


Godot switched its default on Windows from Vulkan to DX12, blaming bad Vulkan drivers on Windows.

https://godotengine.org/article/dev-snapshot-godot-4-6-dev-5...


Same, but I will steal Tailwind's colour palette; that part is pretty good.

That was most likely a rear projection unit, they looked kind of like CRTs but it's different technology. Sony did make them although they weren't marketed as Trinitrons AFAIK.

Projection displays were CRTs, but they were small (10" or so) and monochrome. Three of them, one each for the red, green, and blue channels, were oriented and focused to project a clear image at the exact same spot on the screen, overlaying each other to form a single color image.

Projection TVs were even prone to CRT "raster burn", perhaps even more so than single-tube TVs due to the brightness of the image required, which is why Nintendo instruction booklets had stern warnings not to use their consoles with projection TVs.


Yea, but the screen you could see wasn’t a CRT tube. It was just a projection screen but unless you looked closely you’d be unlikely to notice.

Thomas Electronics in the US supposedly still makes and repairs CRTs for military and aerospace, but those will be much smaller than you'd want for a TV or monitor and often if not always monochrome. Even if they did make big colour tubes they wouldn't give mere mortals the time of day anyway, they're in it for the big money contracts.

https://www.thomaselectronics.com


Thanks.

I think I may have already known that at one point, but I'll try to include this the next time I am motivated to brain-dump some CRT lore.


The Sony FW900 was the peak of desktop CRT monitors, and it came out in 1999 so it or one of its rebadges might have been what you saw. That was much smaller than the PVM-4300 at 24" but with a much higher max resolution of 2304x1440@85hz, roughly what we'd now call 1440p, about eight years before the first 1080p LCDs arrived.

Those were still sought after well into the LCD era for their high resolution and incredible motion clarity, but I think LCDs getting "good enough" and the arrival of OLED monitors with near-zero response times has finally put them out to pasture as anything but a collector's item.


I was turned onto the FW900 on hardforum years before LCD was available/reasonable.

Now I have an FW900 that has sat in a closet for decades, because I can't lift it anymore.

Also, I'll never forget: I was taking a walk in the woods years ago and, in the middle of nowhere, no houses or apartments for miles, there was an FW900 just sitting there, as if someone had thrown it out of an airplane. Of course that's impossible, since it was intact. Inexplicable, WTF. (When I got home I made sure mine was still in the closet and hadn't somehow teleported itself.)


I remember in the mid ‘00s having a 19” that did 1600x1200 at (I think) 85 Hz. Damn thing was a tank, but I loved it. So crisp.

We set up one of those widescreen Intergraph CRTs for a client way back then, I think the cost of that thing plus the workstation was easily more than I made in a year

Those Intergraphs were bigger than the FW900 at 28", although lower resolution at 2048x1152@80hz max, so I suppose YMMV which was better.

IBM was producing the T221 monitor in Japan starting from 2001. It had a 3840x2160 LCD screen.

And even if you found the money to resurrect the production lines, modern regulations probably wouldn't look too kindly on making new consumer goods with several pounds of lead in each unit. Better set aside your morals and enough money to buy some politicians while you're at it.
