Not to mention that fractional scaling is practically required to use the majority of higher-DPI monitors on the market today. Manufacturers have settled on 4K at 27" or 32" as the new standard, which lends itself to running at around 150% scale, so to avoid fractional scaling you either need to give up on high DPI or pay at least twice as much for a niche 5K monitor that only does 60 Hz.
Fractional scaling is a really bad solution. The correct way to fix this is to have DPI-aware applications and toolkits. This does in fact work: I have run Xfce under Xorg for years now on HiDPI screens just by setting a custom DPI and using a HiDPI-aware theme. When the goal is perfect output, why do people suddenly want to jump to stretching images?
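For anyone wondering what "setting a custom DPI" actually amounts to, the value is just pixels divided by physical inches, which you then hand to Xft.dpi in ~/.Xresources (or to xrandr --dpi). A rough sketch of the arithmetic in Python, with the panel sizes picked purely as examples and the helper name made up:

    import math

    def panel_dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Physical pixel density: diagonal length in pixels divided by diagonal inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    # A 27" 4K panel is roughly 163 physical DPI, so "Xft.dpi: 163" in
    # ~/.Xresources (plus a HiDPI-aware theme) is essentially the whole
    # trick, though many people round to a multiple of 96 such as 144.
    print(round(panel_dpi(3840, 2160, 27)))  # ~163
    print(round(panel_dpi(2560, 1440, 27)))  # ~109, the classic 27" density

Note that Xft.dpi mostly governs font rendering; icon and widget sizes are what the HiDPI-aware theme has to take care of.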
That doesn't gel with my experience: 1080p was the de facto resolution for 24" monitors, but 27" monitors were nearly always 1440p, and switching from 27" 1440p to 27" 4K requires a fractional 150% scale to keep the same effective area (2160 / 1440 = 1.5).
To maintain a clean 200% scale you need a 27" 5K panel instead; those do exist, but they're vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
4K monitors aren't a significant expense at this point, and text rendering is a lot nicer at 150% scale. The GPU load can be a concern if you're gaming, but most newer games have upscalers that decouple the render resolution from the display resolution anyway.
Typing "Visual Studio" into the new start menu may randomly trigger a Bing search for "Visual Studio" instead of running it, but on the other hand that makes Bings KPIs go up so it's impossible to say if it's bad or not.
I hate that so much. When blind people try to start JAWS (the screen reader) by typing "jaws" into the start menu and pressing Enter, it will sometimes pull up a Bing page on Jaws the movie instead. And the blind person is just sitting there waiting for the screen reader to start. I tell people to use the Run dialog for that reason. Sucks, but that's what you have to do in the age of inshittisoft.
It's been a while since I used Windows regularly or seriously, but I remember start menu search actually being good - maybe around Win7 days? You would just press <Win>, type a few letters of the software you wanted and hit enter, and it would work every time with minimal latency.
Still, that shows an issue with using fuzzy search for Bing but not for programs. Local items should take precedence. A typo is far more likely than an intended web search, especially when the web search just points back at the application you meant to launch.
Did no one think of that feedback loop? That if the web search is suggesting an app that is already installed, the installed app should be prioritized?
I can only reproduce this by hovering the Windows icon with the mouse and keeping a finger on a key, ready to press it immediately after clicking. In that case the Start menu usually does not open at all, and sometimes it opens but misses the letter.
When I use the Windows key to open the Start menu I cannot reproduce this, since e.g. Win+E opens Explorer instead of the Start menu.
On my machine it does not look like this could happen when opening the Start menu during regular use. Can you reproduce it on yours?
This rarely (but not never) happened on my gaming desktop when I had Windows on it. On the other hand, on my Surface Go, if it only eats the first character that's a good showing, so it's likely specific to device performance.
Objectively it wastes developer time, making the OS more expensive for companies in a nonlinear way. It's like a monthly subscription that costs ever more minutes.
It takes literally one click to deactivate it though. One could argue about Bing Search being the default, but I didn't run the user surveys to see which is best for the average user.
Either I am stupid or you're being dishonest. There is no one-click way to disable it.
Only on Pro versions of Windows, with a group policy; otherwise it's a couple of obscure registry keys that no regular user knows about.
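For what it's worth, and purely as a sketch rather than an official recipe: on recent Windows 10/11 builds the registry value in question is, as far as I know, DisableSearchBoxSuggestions under HKCU\Software\Policies\Microsoft\Windows\Explorer, which suppresses the web/Bing suggestions in Start menu search (the Pro-only group policy route just writes the same value for you). In Python it would look something like this; run it as the affected user and then sign out and back in:

    # Sketch: turn off Bing/web suggestions in Start menu search via the
    # per-user policy value. Assumes Windows 10 21H1+ or Windows 11; older
    # Windows 10 builds used BingSearchEnabled / CortanaConsent under
    # HKCU\Software\Microsoft\Windows\CurrentVersion\Search instead.
    import winreg

    KEY_PATH = r"Software\Policies\Microsoft\Windows\Explorer"

    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        # 1 = suppress web (Bing) suggestions in the search box
        winreg.SetValueEx(key, "DisableSearchBoxSuggestions", 0, winreg.REG_DWORD, 1)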
> After all, the most played competitive shooter is CS and Valve does not use kernel-level AC.
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
The best description I've been able to give of the dichotomy of CS is this: there is no way for a person to become good enough to get their signature into the game without using kernel-level ACs.
Another unresolved roadblock is Nvidia cards seriously underperforming in DX12 games under Proton compared to Windows. Implementing DX12 semantics on top of Vulkan runs into some nasty performance cliffs on their hardware, so Khronos is working on amending the Vulkan spec to smooth that over.
What percentage of games require DX12? From what I recall, a surprisingly large percentage of games support DX11, including Arc Raiders, BF6 and Helldivers 2, just to name a few popular titles.
At the same time, Vulkan support is also getting pretty widespread; notably, I think id Tech games prefer Vulkan as the API.
DX12 is overwhelmingly the default for AAA games at this point. The three titles you listed all officially require DX12; what DX11 support they have is vestigial, undocumented and unsupported. Many other AAAs have already stripped their legacy DX11 support out entirely.
id Software does prefer Vulkan, but they are an outlier.
DX12 is less and less the default; most gamedev that I've seen revolves around Vulkan now.
DX12 worked decently better than OpenGL before, all the gamedevs had Windows, and it was required for Xbox… but now those things are less and less true.
The PlayStation was always the odd man out when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper Linux, so we rewrote a huge amount of our renderer to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, which is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
That was most likely a rear-projection unit; they looked kind of like CRTs, but it's a different technology. Sony did make them, although they weren't marketed as Trinitrons AFAIK.
Projection displays were CRTs, but small ones (10" or so) and monochrome. Three of them, one each for the red, green, and blue channels, were oriented and focused to project a clear image at the exact same spot on the screen, overlaying each other to form a single color image.
Projection TVs were also prone to CRT "raster burn", perhaps even more so than single-tube TVs due to the brightness of the image required, which is why Nintendo instruction booklets had stern warnings not to use their consoles with projection TVs.
Thomas Electronics in the US supposedly still makes and repairs CRTs for military and aerospace, but those will be much smaller than you'd want for a TV or monitor, and often if not always monochrome. Even if they did make big colour tubes, they wouldn't give mere mortals the time of day anyway; they're in it for the big-money contracts.
The Sony FW900 was the peak of desktop CRT monitors, and it came out in 1999, so it or one of its rebadges might have been what you saw. At 24" it was much smaller than the PVM-4300, but with a much higher max resolution of 2304x1440@85 Hz, roughly what we'd now call 1440p, about eight years before the first 1080p LCDs arrived.
Those were still sought after well into the LCD era for their high resolution and incredible motion clarity, but I think LCDs getting "good enough" and the arrival of OLED monitors with near-zero response times have finally put them out to pasture as anything but a collector's item.
Was turned on to the FW900 by HardForum years before LCD was available/reasonable.
Now I have an FW900 that's been sitting in a closet for decades because I can't lift it anymore.
Also, I'll never forget: I was taking a walk in the woods years ago, in the middle of nowhere, no houses or apartments for miles, and there was an FW900 just sitting there, like someone must have thrown it out of an airplane, which of course was impossible since it was intact. Inexplicable, WTF. (When I got home I made sure mine was still in the closet and hadn't somehow teleported itself.)
We set up one of those widescreen Intergraph CRTs for a client way back then; I think the cost of that thing plus the workstation was easily more than I made in a year.
And even if you found the money to resurrect the production lines, modern regulations probably wouldn't look too kindly on making new consumer goods with several pounds of lead in each unit. Better set aside your morals and enough money to buy some politicians while you're at it.