
We THOUGHT this $40,000 PC would break records...

Akkeio
On 9/28/2018 at 8:23 AM, TechyBen said:

Windows Resource Monitor. HDD/SSD vs RAM vs CPU... then GPU.

I can tell you from your hardware alone, without even testing, if you like. People know the hardware that well (others have already tested it, so you don't need to).
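A minimal sketch of that kind of check, assuming Python with the third-party psutil package (these are the same counters Resource Monitor shows interactively):

```python
# Rough bottleneck check: CPU vs RAM vs disk, similar to what
# Windows Resource Monitor displays. Assumes `pip install psutil`.
import psutil

cpu = psutil.cpu_percent(interval=1)   # CPU utilisation sampled over 1 s
ram = psutil.virtual_memory().percent  # physical memory in use, %
disk = psutil.disk_io_counters()       # cumulative disk I/O since boot

print(f"CPU: {cpu}%  RAM: {ram}%")
print(f"Disk read: {disk.read_bytes / 1e9:.1f} GB  written: {disk.write_bytes / 1e9:.1f} GB")
```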

Your example has *nothing* to do with the GPU. If your CPU cannot run GTA, it is the CPU. If your GPU cannot run GTA, it is the GPU. What do you mean by "I need to control my NVIDIA graphics card"? Control it to do what?

 

To do REAL resource checking, you need extremely specialised software, the kind developers use:

https://docs.unrealengine.com/en-us/Engine/Performance/GPU

That's not for you or me; it's for OS developers and game engine developers/game designers.
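For a rough consumer-level view of the GPU side (nothing like the engine-level profilers linked above), the nvidia-smi tool that ships with NVIDIA's drivers can be queried, e.g. from Python:

```python
# Query GPU utilisation and memory via the nvidia-smi CLI that ships
# with NVIDIA's drivers. A coarse view only, not a real profiler.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=utilization.gpu,memory.used,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "37 %, 2048 MiB, 8192 MiB"
```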

 

It's not difficult. It is expensive and needs custom hardware (like the monitor he chose, or a combining card/adapter of some sort). These are for video display and/or CAD and custom code, sometimes for expensive broadcast or entertainment industries. See Disney and its *custom* NVIDIA setup for the video/VR rides! Cards that are not for "gaming".

 

What? GPUs are both. NVIDIA and AMD consumer cards are nearly identical to their commercial products, except for more RAM/shaders/colour bits. The difference is in driver support: NVIDIA/AMD lock gaming support to the consumer cards and industry support to the commercial ones. If you can trick the BIOS/ID register, you can do either, with no change in performance/features (minus the locked-down shader pipelines/cut-off registers/missing memory).
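To illustrate the ID-register point: the driver largely decides "consumer vs commercial" from the PCI vendor/device IDs, not from different silicon. A sketch of reading those IDs on Linux (the sysfs paths are standard; 0x10de is NVIDIA's PCI vendor ID):

```python
# List NVIDIA PCI devices and their device IDs from Linux sysfs.
# Illustration only: these IDs are what the driver keys its
# GeForce-vs-Quadro behaviour off.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    if (dev / "vendor").read_text().strip() == "0x10de":  # NVIDIA vendor ID
        device_id = (dev / "device").read_text().strip()
        print(f"{dev.name}: device ID {device_id}")
```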

 

Gaming GPUs are not, then, "low end"; they are nearly identical. "Integrated into a motherboard"... why? Huh? What? Again, all levels of wrong here. xD

So as a consumer I am forced to use this program: https://nvidia-inspector.en.lo4d.com/ to allow GOOGLE CHROME to use my graphics card. The NVIDIA Control Panel doesn't allow me to grant Chrome access to my GPU. Why? Because YouTube has 4K and 8K videos, and to be honest, if I have the resources available, why doesn't Windows get the video memory from my GPU if it can't run the program with the onboard graphics? Point is: I shouldn't need to do this; I should at least have the option to let programs use my GPU through the OEM software.

 

If the CPU and GPU are both running programs at all times: 1. there is less processing load on the CPU; 2. power distribution throughout the motherboard is better balanced; and 3. less heat is produced by the CPU alone, so less cooling is needed and overall processing speeds are better.

 

So in conclusion: Windows, Intel and NVIDIA don't know what they are doing. :D

TechyBen

44 minutes ago, Akkeio said:

So as a consumer I am forced to use this program: https://nvidia-inspector.en.lo4d.com/ to allow GOOGLE CHROME to use my graphics card. The NVIDIA Control Panel doesn't allow me to grant Chrome access to my GPU. Why? Because YouTube has 4K and 8K videos, and to be honest, if I have the resources available, why doesn't Windows get the video memory from my GPU if it can't run the program with the onboard graphics? Point is: I shouldn't need to do this; I should at least have the option to let programs use my GPU through the OEM software.

 

If the CPU and GPU are both running programs at all times: 1. there is less processing load on the CPU; 2. power distribution throughout the motherboard is better balanced; and 3. less heat is produced by the CPU alone, so less cooling is needed and overall processing speeds are better.

 

So in conclusion: Windows, Intel and NVIDIA don't know what they are doing. :D

Not Windows. It's NVIDIA not adding support to the drivers (they can, but only support automatic switching on laptops), and it's Chrome not having the right switch/settings (they could, but like YouTube, they tend to push their own preferences).

https://superuser.com/questions/1319250/how-to-force-chrome-to-use-integrated-gpu-for-decoding

So you can fix it without a third-party program.
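For the record, Windows 10 (1803+) also stores a per-app GPU preference in the registry, the same setting exposed under Settings > System > Display > Graphics settings. A sketch, assuming a default Chrome install path (adjust for yours); GpuPreference=2 means the high-performance GPU, 1 the power-saving one:

```python
# Set Chrome's per-app GPU preference via the registry key Windows 10
# uses for its Graphics settings page. Windows-only; run as the user.
import winreg

chrome = r"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"  # assumed path

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, chrome, 0, winreg.REG_SZ, "GpuPreference=2;")  # 2 = high performance
winreg.CloseKey(key)
```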

 

"why doesn't doesnt Windows get the video memory from my GPU", not "memory", but codec. Some codecs have hardware support on certain hardware. So if your CPU does or the GPU does, then yes, the OS, drivers or Chrome should detect and switch accordingly. Sometimes they might just go for software rendering. This is poor implementation, not a windows/OS fault as such.

 

And don't even get us started on DRM. xD

 

Quote

If the CPU and GPU are both running programs at all times: 1. there is less processing load on the CPU; 2. power distribution throughout the motherboard is better balanced; and 3. less heat is produced by the CPU alone, so less cooling is needed and overall processing speeds are better.

 

So in conclusion: Windows, Intel and NVIDIA don't know what they are doing.

Again, what do you mean? How do you know it is not distributing load? How do you know what will or will not work on a CPU or a GPU? Do you know what the CPU (even with an integrated GPU) does? Do you know what the motherboard does?
