
Should gamers stick to Windows 10?

Windows 11 is fast approaching, but while Microsoft promises big improvements, is it really going to be a boon to gamers, or is it just going to be a side-grade with a new skin?

 

 

Emily @ LINUS MEDIA GROUP                                  

congratulations on breaking absolutely zero stereotypes - @cs_deathmatch


This is a pretty useful video. I had forgotten that strawberry milk is a thing.

 

Also, the insights into Windows 11 were good.


Even though AMD sponsored the video, it would be cool to see the same setup method applied to Intel as well, then compare both with Linux.

Setup 1 - AMD W10 vs W11 vs Linux
Setup 2 - Intel W10 vs W11 vs Linux


Scheduling threads on a multi-core CPU is an interesting and fairly complex topic, especially when one takes the nature of a given thread/process into account, how the cores are spread out in the processor, and whether there are differences in the type of core. Thread scheduling could be a ton easier, though, if there were relevant "tags" on the thread for the OS to know what a given thread actually needs and doesn't need, and how it interacts with other threads and resources via IPC (inter-process communication).
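For what it's worth, Windows already exposes a crude form of such tags. Here's a minimal Win32 sketch (assuming a Windows 10 SDK, since SetThreadDescription is Win10 1607+) of the per-thread hints that exist today: a priority, an ideal-core suggestion, and a name. Nowhere near the rich tagging described above, and the scheduler is free to ignore the hints:

```cpp
// Minimal sketch: the per-thread "hints" Win32 exposes today.
// The scheduler may ignore these; they are suggestions, not contracts.
#include <windows.h>

void tagRenderThread() {
    HANDLE t = GetCurrentThread();
    SetThreadPriority(t, THREAD_PRIORITY_HIGHEST);      // latency-sensitive work
    SetThreadIdealProcessor(t, 0);                      // preferred core 0; may still migrate
    SetThreadDescription(t, L"Game.RenderThread");      // visible in profilers/debuggers
}

void tagAssetStreamingThread() {
    HANDLE t = GetCurrentThread();
    SetThreadPriority(t, THREAD_PRIORITY_BELOW_NORMAL); // background I/O, don't steal time
    SetThreadDescription(t, L"Game.AssetStreaming");
}
```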

 

Then there is HDR.

Well, to be honest, HDR seems like a poorly implemented standard at the best of times. It is thankfully not as abhorrently implemented as color calibration is on Windows. (Though, BTW, is color calibration/correction done correctly on Windows 11? As in, does it apply the correction just before sending the image to the screen, without first mangling the actual image data in application space? On Windows 10 and earlier, color "correction" in most cases just applies the inverse of one's monitor's "incorrectness" onto the image one is working on, and a lot of programs then save that incorrectness as part of the image file itself, i.e. mangling the "perfect" values with one's color "correction" settings... Honestly a "feature" so abhorrently implemented that I intentionally turn off all color correction on all my computers, since it just makes life worse.)

 

DirectStorage is "interesting", but partly a gimmick.

A lot of CPUs have enough free resources to handle pulling data out of storage without impacting much else, mainly because the task is largely independent of other processes. I.e., a game engine can send off a call to the storage thread and have it ship the content over to the requisite GPU memory space without really impacting anything else.
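To illustrate that pattern with a toy sketch (plain standard C++, not actual DirectStorage code; the file name and the loadAssetAsync helper are made up for the example): the game thread fires off the read and only ever polls for completion, it never blocks on the disk:

```cpp
// Toy "fire off a storage request and keep rendering" sketch.
#include <chrono>
#include <fstream>
#include <future>
#include <iterator>
#include <string>
#include <vector>

// Hypothetical helper: read a whole asset file on a background thread.
std::future<std::vector<char>> loadAssetAsync(std::string path) {
    return std::async(std::launch::async, [path] {
        std::ifstream f(path, std::ios::binary);
        return std::vector<char>(std::istreambuf_iterator<char>(f), {});
    });
}

int main() {
    auto pending = loadAssetAsync("texture.bin"); // placeholder file name

    // Game loop keeps running; poll, never block.
    while (pending.wait_for(std::chrono::seconds(0)) != std::future_status::ready) {
        // ...simulate and render the current frame with the old data...
    }
    std::vector<char> asset = pending.get(); // ready: hand off to a GPU upload
}
```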

 

Though, giving the GPU the ability to pull content from storage directly will decrease access latency a tiny bit (as in a few µs), since the system thread handling storage won't have to be switched to. But the security implications of just giving the GPU access to storage are their own can of worms. And storage devices don't tend to have much in the way of security, especially on a file-system level (file systems are complex beasts, unless it is FAT32); most storage devices will happily just give you whatever you ask for.

 

But what can I say? I still run Windows 8... and Ubuntu, and a few Raspbian servers, and I'm setting up a TrueNAS Core server and fiddling with FreeBSD a bit...


Everybody knows these releases are usually the big mistake releases.

Waiting for the next one, Windows 12?


Watching this on my balls-to-the-wall desktop... running Windows 7 LOL

 

Is Microsoft seriously trying to convince old-school gamers to make the move to 11 when I wasn't even interested in 8/8.1, let alone 10? Even on my XPS laptop, the first thing I did was hose the Windows 10 install and replace it with Windows 7. For guys like me still rocking old hardware there is absolutely no need to change the OS. Drivers for the gear I have will never come out for 11. A GPU swap alone can keep even an older gaming rig relevant, provided the GPU drivers are there.

 

It's only a motherboard swap to something newer that would compel me to make the jump, but at that point I wouldn't bring any of my other old hardware along for the ride and just swap out everything else as well. That's a major investment. Similarly, it would have to be a newer or future game that comes out that I really want to play that doesn't work on older hardware forcing me to upgrade... except I'm not at that point yet. None of my games use HDR and the newest thing in my Steam library (World of Warships) runs perfectly fine.

 

And no, I have not had ANY security-related problems running Windows 7 beyond its "expiration date". This is not a networked machine on a work domain with remote access. I always know what I am doing online and I have a solid firewall. I know I'm not the only die-hard out there.


First of all, a really quick disclaimer: I'm not a fan of Win11, I think it looks truly awful. This thread is purely informational, to correct a technical misunderstanding in both the video and among gamers in general.

 

The video in question, at the 2:00 timestamp, suggests that CS:GO, which is an older game, takes, quote, "a significant reduction in performance". It drops from 624fps average on Win10 to 514fps average on Win11, which is a WHOPPING 110fps drop.

 

The problem with this is... brace yourself... frame rate is not a linear measurement of performance. I know it's really jarring to hear that, it was for me when I first learned this many years ago, but it's true (more proof of that later). It should be noted that when measuring performance, most developers are interested in the time it takes a specific piece of hardware (a GPU, or a CPU, or both) to execute code that produces some desired effect. This execution time is typically measured in milliseconds, which are 1/1000th of a second (there are 1000ms in 1 second). So, for example, you might introduce a new function into your graphics pipeline to calculate soft shadows, and that takes an extra 1ms to run on top of everything else.

 

The problem is that frame time measured in ms is a linear measurement, whereas frame rate (the number of frames per second) is not. If we use the real-world example in the video, the frame rate for CS:GO on Win10 is 624fps. If we calculate the average frame time by dividing 1000ms by 624fps (1000/624), we get 1.60ms. Compared to the Win11 result of 514fps, this seems at first like a very large and significant drop of 110fps. However, if we calculate the frame time of 514fps by doing (1000/514), we get a frame time of 1.94ms. If we subtract these two frame time values, 1.94ms - 1.60ms, we get 0.34ms. This 0.34ms is the overhead Win11 adds to calculate each frame in game, and this difference is fixed and static no matter what the performance of the game is.

 

If we take the same 0.34ms difference and look at how it would affect a game running at 60fps, we can get a feel for how impactful this same difference is at a different frame rate. Divide 1000ms/60fps and we get 16.66ms; if we assume Win11 would add 0.34ms to the render time of each frame, we get 16.66ms + 0.34ms = 17.00ms. Convert this back into an average frame rate and you get 1000ms/17ms = 58.8fps, a much less significant-looking loss. In fact, if you look at the charts in the video at timestamp 3:55, you can see that the games running at much lower frame rates (approx 100fps, give or take) show exactly this tiny 1-2fps drop; these drops in performance are the same.
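If anyone wants to sanity-check the arithmetic, here is the whole calculation as a tiny C++ program, using the fps figures quoted from the video:

```cpp
// fps -> frame time -> fixed overhead -> effect at a 60fps baseline
#include <cstdio>

int main() {
    const double win10_fps = 624.0, win11_fps = 514.0; // averages from the video
    double t10 = 1000.0 / win10_fps;            // ~1.60 ms per frame
    double t11 = 1000.0 / win11_fps;            // ~1.95 ms per frame
    double overhead = t11 - t10;                // ~0.34 ms extra per frame

    double base = 1000.0 / 60.0;                // 16.67 ms at 60 fps
    double slowed = 1000.0 / (base + overhead); // fps with the overhead added
    std::printf("overhead %.2f ms -> 60 fps becomes %.1f fps\n", overhead, slowed);
    // prints: overhead 0.34 ms -> 60 fps becomes 58.8 fps
}
```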

 

I suspect many will be incredulous about this, since it seems to be assumed in tech circles that frame rate is linear. But it's worth reading this very old article on frame rate vs frame time and why frame rate is not a linear measurement of performance: https://www.mvps.org/directx/articles/fps_versus_frame_time.htm. It was written by Robert Dunlop, an MVP for DirectX back in Ye Olde days, and covers the topic in an extremely concise way.


Yes, but it's still a difference, and for people who bought high-end CPUs to get the lowest possible input delay, this is something you shouldn't have to deal with.


13 minutes ago, gamagama69 said:

Yes, but it's still a difference, and for people who bought high-end CPUs to get the lowest possible input delay, this is something you shouldn't have to deal with.

Sure, I'm not denying there's a difference. There clearly is one: Win11 has an additional rendering overhead in some circumstances, which happens to be very small. All I am saying is that because frame rate is not a linear measurement of performance, if you look at simple games with very high frame rates, such as CS:GO in the ~500fps range, you'll see what looks like a massive impact, a frame rate drop of ~100fps, but in reality this is a very minor drop.

 

If you know how the non-linearity of frame rate works, you would expect a ~2fps difference at a 60fps baseline to manifest the same way as a ~100fps drop at a 600fps baseline, because they are basically the same thing.


While you are right that the practical difference is less than a millisecond, for the people buying 390Hz monitors and the like it does matter. Also, you're forgetting that less than a minute later in the video, the Civ VI turn times were over a second slower, so the conclusions drawn are still accurate, even if you want to discard the CS:GO numbers on those grounds.


I'll be updating my laptop to W11 when the bugs are ironed out; my desktop won't be updated until I upgrade it, since I'm still running on unsupported hardware. But I'm in no rush to upgrade anyway, there is nothing in W11 that makes me go "I MUST have this NOW!!!"


8 hours ago, Luscious said:

-snip-

If it connects to the internet it's vulnerable, and your older OS that's missing patches for the latest exploits is even more vulnerable.

Data Systems Administrator | Sergeant - US Marine Corps | CCNA / SEC+

Ryzen 9 5950x | 64 GB DDR4 3600Mhz | Gigabyte RTX 3080 Ti | Full Build Info | HomeLab Setup


I switched to 11 because of Auto HDR and I'm not going back to 10. Overall, I prefer 11. 

PC Setup: 

HYTE Y60 White/Black + Custom ColdZero ventilation sidepanel

Intel Core i7-10700K + Corsair Hydro Series H100x

G.SKILL TridentZ RGB 32GB (F4-3600C16Q-32GTZR)

ASUS ROG STRIX RTX 3080Ti OC LC

ASUS ROG STRIX Z490-G GAMING (Wi-Fi)

Samsung EVO Plus 1TB

Samsung EVO Plus 1TB

Crucial MX500 2TB

Crucial MX300 1TB

Corsair HX1200i

 

Peripherals: 

Samsung Odyssey Neo G9 G95NC 57"

Samsung Odyssey Neo G7 32"

ASUS ROG Harpe Ace Aim Lab Edition Wireless

ASUS ROG Claymore II Wireless

ASUS ROG Sheath BLK LTD'

Corsair SP2500

Beyerdynamic DT 770 PRO X (Limited Edition) & Beyerdynamic TYGR 300R + FiiO K7 DAC/AMP

RØDE VideoMic II + Elgato WAVE Mic Arm

 

Racing SIM Setup: 

Sim-Lab GT1 EVO Sim Racing Cockpit + Sim-Lab GT1 EVO Single Screen holder

Svive Racing D1 Seat

Samsung Odyssey G9 49"

Simagic Alpha Mini

Simagic GT4 (Dual Clutch)

CSL Elite Pedals V2

Logitech K400 Plus


1 hour ago, VectorTech said:

If it connects to the internet it's vulnerable, and your older OS that's missing patches for the latest exploits is even more vulnerable.

I have seen people running XP since release and nothing bad has happened.

For the educated home user, it's probably fine.

And when Microsoft finds out about really, really bad vulnerabilities, they will release a patch even for unsupported operating systems.

 

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

I had a bad experience with Windows 11 in games, specifically Factorio (a game that relies heavily on single-core performance and memory speed and latency; the graphics card is irrelevant here). I saw a 5% drop in all the maps I played and all the benchmarks I ran, and I was not happy to see that when I upgraded. But now I think I understand why that is the case: W11 spreads work more evenly across the CPU, which can affect the cores under the highest load from the game.
FactorioBox Results (1au.us) - Win 11
FactorioBox Results (1au.us) - Win 10
I have an Intel i5-11600K CPU with 32GB 3200MHz CL14 RAM


5 hours ago, BetteBalterZen said:

I switched to 11 because of Auto HDR and I'm not going back to 10. Overall, I prefer 11. 

Basically what I plan to do as soon as it's out of beta. I'm just not comfortable running a beta OS on my only PC. But I'm very excited about Auto HDR. Depending on how good it is, it should do wonders for my HDR1000 monitor and my OLED TV.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


3 minutes ago, Stahlmann said:

Basically what I plan to do as soon as it's out of beta. I'm just not comfortable running a beta OS on my only PC. But I'm very excited about Auto HDR. Depending on how good it is, it should do wonders for my HDR1000 monitor and my OLED TV.

Honestly, Auto HDR is actually surprisingly good. Of course, native HDR support in games, with tweakable nits settings and so on, will still be better, but Auto HDR really is good.

Need for Speed (the reboot) looks fantastic with Auto HDR, and Battlefield 4 also looks really good. I prefer Auto HDR over SDR in compatible games 😄


Just now, BetteBalterZen said:

Honestly, Auto HDR is actually surprisingly good. Of course, native HDR support in games, with tweakable nits settings and so on, will still be better, but Auto HDR really is good.

Need for Speed (the reboot) looks fantastic with Auto HDR, and Battlefield 4 also looks really good. I prefer Auto HDR over SDR in compatible games 😄

Do games need to get certified or something like that? Or does it "just work" with Vulkan/DX11 games without any work from the devs?


7 minutes ago, Stahlmann said:

Do games need to get certified or something like that? Or does it "just work" with Vulkan/DX11 games without any work from the devs?

Auto-HDR "should just work" with all DirectX 11 and 12 games. The developers don't have to do anything. But some reddit posts do mention that some games do not work. Why? Idk. So far, all the DirectX 11 and 12 games I have played, works with Auto-HDR. 

But yeah, you can find reddit posts that tracks working games and so on. The list is quite long now 🙂

Edit: 
List List of games that support Auto HDR - PCGamingWiki PCGW - bugs, fixes, crashes, mods, guides and improvements for every PC game


You draw a perfectly valid conclusion, and also show that this is something that becomes more apparent the higher your performance gets, and there lies the crux, I think. I haven't watched the video completely, so I can't yet comment on how well LTT conveyed this, but I'd say the general gist is that if you are a super-high-frame-rate CS:GO player or something, then it matters and you seem better off sticking with Win10. If you're a casual 60 FPS gamer, you are completely correct and this is not much to worry about.

 

0.34 ms extra on a 16.67 ms frame time (60 FPS) is no big deal, but 0.34 ms extra on a 1.67 ms frame time (600 FPS) is a significant increase of roughly 20%.

24 minutes ago, The Unknown Voice said:

When it is out, then more info will be available. Be patient...

This. It's nice to see, and good input for Microsoft (if they even plan to do something with it), but in the end it's all speculation until it's actually released.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


13 hours ago, PrincessFrosty said:

-snip-

Your thread has been merged into the official video thread.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


W7MR !!

 

No seriously, still rocking W7 here. My next system clean-up will involve an OS change, though it will be to an LTSC version of W10, or possibly W11. The 'standard' versions are waaaay too bloated and have far too much telemetry.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


17 hours ago, Luscious said:

-snip-

 

That pretty much describes my thinking in a nutshell. I'm only now considering a jump to Windows 10 because there is some software I use that absolutely refuses to run on Win 7, and since I need it for work, I have to jump. Still, I'm not going to be an early adopter of Windows 11. None of the features seem like anything more than marketing gimmicks or half-baked early tech.

I'm not looking forward to debloating and locking down Windows 10, and I doubt 11 will be any easier; it may even be more restrictive in what we can fix.

"The Codex Electronica does not support this overclock."

