
Should gamers stick to Windows 10?

3 hours ago, Hybris5112 said:

forward to debloating and locking down Windows 10

Windows 10 LTSC.

It's a good start. Still full of BS spyware, but it's easier to turn off and it fucking stays off.

Patches are security-related only.

No live tiles or crap like that.

 

 



5 hours ago, tikker said:

0.34 ms extra on a 16.67 ms frame time (60 FPS) is no big deal, but 0.34 ms extra on a 1.67 ms frame time (600 FPS) is a significant 21% increase.

This. It's nice to see and good input to Microsoft (if they even plan to do something with it), but in the end it's all speculation until it's actually released.

Again, the problem with this is that it's the same performance impact when measured in frame time; it's only because FPS is a non-linear measurement that you think it manifests as a 21% increase. If you had a new graphical feature, for example a new type of anti-aliasing, and it added 0.34 ms onto the frame time, it would manifest in the same way: a 2 FPS drop at a 60 FPS baseline and a 100 FPS drop at a 600 FPS baseline. You have to get out of the mindset of thinking that frame rate has a linear relationship with performance.
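Here's that arithmetic as a quick sketch (the 0.34 ms overhead is the figure quoted above; the 144 FPS row is just an extra illustrative point):

```python
# Constant frame-time overhead, expressed as an FPS drop at different baselines.
def fps_after_overhead(base_fps, overhead_ms):
    """Return the FPS after adding a fixed per-frame overhead."""
    return 1000.0 / (1000.0 / base_fps + overhead_ms)

for base in (60, 144, 600):
    new = fps_after_overhead(base, 0.34)
    print(f"{base:>4} FPS -> {new:6.1f} FPS (drop of {base - new:5.1f})")

# Output (approximate):
#   60 FPS ->   58.8 FPS (drop of   1.2)
#  144 FPS ->  137.3 FPS (drop of   6.7)
#  600 FPS ->  498.3 FPS (drop of 101.7)
```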

 

I think what bothers me most is that this 110 FPS dip was used in the video as the reason to go on and explore what was happening, as if that needed explaining somehow. That suggests to me that Linus and the gang, or whoever made the video, don't conceptually understand that the impact in CSGO is basically the same as the impact on everything else they saw, give or take.


The real question, though: is it suitable for day-to-day use NOW? As in the Beta Channel or the Dev Channel.


Lol, "more sophisticated scheduler". Nope. The scheduler is supposed to emphasize Ryzen's preferred cores when the load is not a full all-core load, so this is actually legacy behavior that somehow does not utilize the Ryzen-specific enhancements. And you guys definitely installed the relevant AMD chipset drivers?


2 hours ago, PrincessFrosty said:

Again, the problem with this is that it's the same performance impact when measured in frame time; it's only because FPS is a non-linear measurement that you think it manifests as a 21% increase

There is no "thinking" involved in the 21% increase. 0.34 ms extra is a 21% increase from a 1.67 ms frame time. Nothing non-linear about it, just simple math. If you require updated information every 1.67 ms, then 0.34 ms extra is much worse than if you require new information every 16.67 ms. Nowhere did I say that implies a 21% loss of FPS. The non-linearity is the underlying reason why different frame rate regimes are affected differently.
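A quick sketch of that simple math (frame times from the numbers above):

```python
# Relative increase in frame time from the same 0.34 ms of overhead.
for frame_time_ms in (16.67, 1.67):
    pct = 0.34 / frame_time_ms * 100
    print(f"{frame_time_ms} ms + 0.34 ms -> frame time {pct:.1f}% longer")

# 16.67 ms + 0.34 ms -> frame time 2.0% longer
# 1.67 ms + 0.34 ms -> frame time 20.4% longer
```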

2 hours ago, PrincessFrosty said:

If you had a new graphical feature, for example a new type of anti-aliasing, and it added 0.34 ms onto the frame time, it would manifest in the same way: a 2 FPS drop at a 60 FPS baseline and a 100 FPS drop at a 600 FPS baseline.

Exactly, and thus the impact is much larger at 600 FPS in situations where you benefit from that 600 FPS. It's not the same performance impact, precisely because it's a constant addition. At 60 FPS you only get a new frame every 16.67 ms. If you render 60 frames, that 0.34 ms amounts to 20.4 ms, or 20.4 / 16.67 ≈ 1.2 frames of information lost (or, if you want FPS, a 17.01 ms frame time, which is 58.8 FPS).

 

If you get a new frame every 1.67 ms, then 20.4 ms extra means that for every 60 frames, 20.4 / 1.67 ≈ 12 extra frames could have been rendered were that overhead not there. Again, in terms of FPS, that means going from 1.67 ms, about 600 FPS, to 2.01 ms, about 500 FPS. Therefore, if you can leverage the extra updated positions or whatever else you get from the higher refresh rate, the drop is worse at higher refresh rates.
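The same bookkeeping as a sketch (0.34 ms per frame accumulated over 60 frames):

```python
# Frames of information lost per 60 rendered frames, at each baseline.
overhead_ms = 0.34
for frame_time_ms in (16.67, 1.67):
    total_extra_ms = 60 * overhead_ms             # 20.4 ms of accumulated overhead
    frames_lost = total_extra_ms / frame_time_ms  # how many frames fit in that time
    print(f"{frame_time_ms} ms frame time: ~{frames_lost:.1f} frames lost per 60")

# 16.67 ms frame time: ~1.2 frames lost per 60
# 1.67 ms frame time: ~12.2 frames lost per 60
```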

 

A good analogy is a car and a bike traveling at 120 and 15 km/h respectively. If you were to change tires, resulting in a loss of 5 km/h for both, traveling 10 km by car would change from 0.083 hours (5 min) to 0.087 hours (5.2 min), i.e. merely 4% longer. By bike it would go from 0.67 hours (40 min) to 1 hour (60 min), 50% longer. The impact of that constant change is not the same across the board.
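The analogy in numbers (a minimal sketch of the figures above):

```python
# Travel time over 10 km, before and after losing 5 km/h, for car vs bike.
distance_km = 10
for vehicle, speed_kmh in (("car", 120), ("bike", 15)):
    before_min = distance_km / speed_kmh * 60
    after_min = distance_km / (speed_kmh - 5) * 60
    pct = (after_min / before_min - 1) * 100
    print(f"{vehicle}: {before_min:.1f} min -> {after_min:.1f} min (+{pct:.0f}%)")

# car: 5.0 min -> 5.2 min (+4%)
# bike: 40.0 min -> 60.0 min (+50%)
```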

2 hours ago, PrincessFrosty said:

You have to get out of the mindset of thinking that frame rate has a linear relationship with performance.

I didn't say this.

2 hours ago, PrincessFrosty said:

I think what bothers me most is that this 110 FPS dip was used in the video as the reason to go on and explore what was happening, as if that needed explaining somehow. That suggests to me that Linus and the gang, or whoever made the video, don't conceptually understand that the impact in CSGO is basically the same as the impact on everything else they saw, give or take.

Because that's what you do when you see a performance decrease: you investigate. A 2 FPS drop is well within the margin of error and can be anything from Windows 11 overhead to your card not boosting as high due to temperatures or whatever, so it won't be noticed. A drop of 110 FPS is very noticeable and is unlikely to happen by chance.



1 hour ago, tikker said:

Because that's what you do when you see a performance decrease: you investigate. A 2 FPS drop is well within the margin of error and can be anything from Windows 11 overhead to your card not boosting as high due to temperatures or whatever, so it won't be noticed. A drop of 110 FPS is very noticeable and is unlikely to happen by chance.

Most of the rest of the post was just going over what we already know; you don't need to give analogies. I understand how this works, which is why I brought the issue up to begin with.

 

But this quote highlights the core issue I think is important. A 1-2 FPS drop at ~60 FPS is the same as a 110 FPS drop at ~600 FPS. The 1-2 FPS drop was actually noticed: it was pretty consistent across a lot of different tests, which all had minor frame rate drops of that magnitude. That makes margin of error unlikely; a margin-of-error problem would not favour a positive or negative direction in that way. So it was noticed, but the real issue is the significance assigned to the FPS delta for CSGO, which indicates a lack of understanding of the non-linearity of frame rate as a performance metric. If big frame rate differences at ~600 FPS "stand out" as more significant to you vs small ones of ~1-2 FPS, then it's because you basically don't know how this works, and I think that's the problem with this video.

 

If you only had the rest of the data to go on, minus the CSGO numbers, and you were asked to use that data to predict what CSGO would look like given a base frame rate of ~600 FPS on Win10, you'd predict about a 100 FPS drop, and you'd be right.
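A sketch of that prediction (the 58.8 FPS input is an illustrative stand-in for the small drops seen in the other benchmarks):

```python
# Infer the constant per-frame overhead from a low-FPS result,
# then predict the drop at a 600 FPS baseline.
def overhead_ms(base_fps, new_fps):
    """Constant per-frame overhead implied by an FPS drop."""
    return 1000.0 / new_fps - 1000.0 / base_fps

est = overhead_ms(60, 58.8)                 # ~0.34 ms from the other tests
predicted = 1000.0 / (1000.0 / 600 + est)
print(f"implied overhead: {est:.2f} ms")
print(f"predicted CSGO: {predicted:.0f} FPS (a ~{600 - predicted:.0f} FPS drop)")

# implied overhead: 0.34 ms
# predicted CSGO: 498 FPS (a ~102 FPS drop)
```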


7 hours ago, PrincessFrosty said:

If big frame rate differences at ~600 FPS "stand out" as more significant to you vs small ones of ~1-2 FPS, then it's because you basically don't know how this works, and I think that's the problem with this video.

If you fire up your game now and start playing, you are unlikely to go "good lord, I have lost 1 FPS over the last month, must find the cause of that". If we go by the usual 5-sigma threshold of significance wielded in science, then you cannot claim a 1-2 FPS dip to be significant unless your measurement error on that FPS is 0.2-0.4 FPS. If you can only measure accurately to 1 FPS, then a measured 60 FPS is the same as 59 or 61 FPS. You are denying math at this point. From 60 to 58 is a 3% difference; from 600 to 500 is a 17% difference. A constant offset produces wildly different consequences in a non-linear relationship depending on where it's applied. That's what non-linearity means.
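As a sketch, assuming the simple rule of thumb stated above (significant means the drop is at least five measurement errors):

```python
# 5-sigma significance check for a measured FPS drop.
def is_significant(drop_fps, sigma_fps, n_sigma=5):
    """True if the drop is at least n_sigma measurement errors."""
    return drop_fps >= n_sigma * sigma_fps

print(is_significant(2, 1.0))    # False: within the noise at a 1 FPS error
print(is_significant(110, 1.0))  # True: far outside any reasonable error
print(is_significant(2, 0.4))    # True: only with a 0.4 FPS error or better
```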

7 hours ago, PrincessFrosty said:

If you only had the rest of the data to go on, minus the CSGO numbers, and you were asked to use that data to predict what CSGO would look like given a base frame rate of ~600 FPS on Win10, you'd predict about a 100 FPS drop, and you'd be right.

Yes, you'd be right, but predicting the right value does not mean the impact is the same. Another example, from my own field, related to signal processing: adding up two 2 MHz signals with a 1 ns timing mistake doesn't affect much, because the period is 500 ns, against which an extra 1 ns causes very little interference. Adding up two 2 GHz signals (period = 0.5 ns) with a 1 ns mistake destroys the information, because of the non-linear relationship between time and frequency. It's similar for frame time and FPS. You can completely predict the effect of a constant offset because you know the relation is inverse, but you are mistaken in saying that the impact is therefore the same everywhere.
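The same point in numbers (a minimal sketch; the 1 ns mistake and both frequencies are from the example above):

```python
# A fixed 1 ns timing error, expressed relative to each signal's period.
for freq_hz, label in ((2e6, "2 MHz"), (2e9, "2 GHz")):
    period_ns = 1e9 / freq_hz
    error_in_cycles = 1.0 / period_ns  # the 1 ns mistake in units of full cycles
    print(f"{label}: period {period_ns:5.1f} ns, 1 ns error = {error_in_cycles:.3f} cycles")

# 2 MHz: period 500.0 ns, 1 ns error = 0.002 cycles
# 2 GHz: period   0.5 ns, 1 ns error = 2.000 cycles
```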

 

Say you need 2000 FPS for some reason. That's a 0.5 ms frame time. Adding a delay of 0.35 ms there drops it to about 1200 FPS. Now go tell the project that needs 2000 FPS that this doesn't matter and is the same impact as going from 60 to 59.



19 hours ago, Hybris5112 said:

 

Pretty much described my thinking in a nutshell. I'm only now considering a jump to Windows 10 because there is some software I use that absolutely refuses to run on Win 7, and since I need it for work, I have to jump. Still, I'm not going to be an early adopter of Windows 11. None of the features seem like anything more than marketing gimmicks or half-baked early tech.

 

I'm not looking forward to debloating and locking down Windows 10, and I doubt 11 will be any easier; it may even be more restrictive in what we can fix.

My scenario was the exact opposite of yours - I had older audio software that would not run properly on anything NEWER than 7. Sadly, that's what happens when software outfits change hands or go out of business. Sure, I could get the newest version that works with newer audio drivers, except I'm not gonna have a gun held to my head over a subscription for EVERY COPY I happen to have installed (desktop/laptop). And that's on top of the fact that the newest version has been completely "reskinned", resulting in a serious performance hit. So when time = money, I really cannot be without features that are critical... and then be forced to re-learn how to use something I already own.


On 8/26/2021 at 9:44 PM, Nystemy said:

DirectStorage is "interesting", but partly a gimmick.

A lot of CPUs do have enough free resources to handle pulling data out from storage without impacting much else, mainly because the task is largely not dependent on other processes. I.e., a game engine can send off a call to the storage process and have it ship the content over to the requisite GPU memory space without really impacting anything else.

 

Though giving the GPU the ability to just pull content from storage is going to decrease access latency a tiny bit (as in a few µs), since the system thread handling storage won't have to be switched to, the security implications of giving the GPU direct access to storage are their own can of worms. Storage devices don't tend to have much in the way of security, especially at the file-system level (since file systems are complex beasts, unless it's FAT32); most, if not all, storage devices will happily just give you whatever you ask for.

Always give more time, in any way you can, so you can process more with AI with less delay :P
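For what it's worth, a rough sketch of the handoff described in the quote: the engine fires off a read request and keeps running while a storage worker services it. This is a generic illustration, not the actual DirectStorage API; the file name and helper names are made up.

```python
import pathlib
import queue
import threading

pathlib.Path("texture.bin").write_bytes(b"\x00" * 4096)  # stand-in asset

requests: "queue.Queue" = queue.Queue()

def storage_worker():
    """Services read requests independently of the main game loop."""
    while True:
        path = requests.get()
        if path is None:
            break                      # sentinel: shut down
        with open(path, "rb") as f:
            data = f.read()            # a real engine would move this into GPU memory
        print(f"loaded {len(data)} bytes from {path}")

worker = threading.Thread(target=storage_worker)
worker.start()

requests.put("texture.bin")            # fire-and-forget from the game loop
# ... the game loop keeps simulating/rendering here ...
requests.put(None)
worker.join()
```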


On 8/26/2021 at 7:39 PM, King of Memes said:

The auto HDR thingie is kind of cool. I might switch. 

If only people knew about ReShade and the FakeHDR shader. I hate how all games look like they have a grey-ish foggy filter applied over the textures; it just makes them look washed out and fake, no matter how realistic the graphics claim to be. The moment you apply FakeHDR, even on an SDR monitor, that grey-ish filter look is gone and the colors get depth. It sometimes blows things out a bit, but so rarely that it's still worth it; it just makes everything look so much better.
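For illustration, a toy version of that fix. This is not ReShade's actual FakeHDR shader, just a simple levels remap (the black/white points are made-up values) showing why lifting the grey haze gives colors their depth back:

```python
import numpy as np

def lift_haze(pixels, black_point=0.1, white_point=0.95):
    """Remap [black_point, white_point] to [0, 1], clipping outside."""
    out = (pixels - black_point) / (white_point - black_point)
    return np.clip(out, 0.0, 1.0)

hazy = np.linspace(0.1, 0.9, 5)  # a washed-out gradient: no true blacks or whites
print(hazy)                       # [0.1 0.3 0.5 0.7 0.9]
print(lift_haze(hazy))            # [0.    0.235 0.471 0.706 0.941]
```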

