
Just wondering: is there a reason why graphics card requirements can't be expressed in compute units?

E.g.

720p: 5 CUs

1080p: 8 CUs

4K: 32 CUs

So you don't have to mess about to see if a GT 480 is equivalent to a GT 730.

All examples are random.
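A minimal sketch of the scheme being proposed here: a lookup table of minimum compute units per target resolution, using the poster's own (random) example numbers. The function name is just for illustration.

```python
# Proposed scheme: minimum compute units (CUs) per target resolution.
# The CU numbers are the random examples from the post above.
REQUIRED_CUS = {"720p": 5, "1080p": 8, "4K": 32}

def meets_requirement(card_cus, resolution):
    """Check whether a card's CU count meets the target resolution."""
    return card_cus >= REQUIRED_CUS[resolution]

print(meets_requirement(8, "1080p"))  # True
print(meets_requirement(8, "4K"))     # False
```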

 

 

 


Some people don't know what compute units mean. It's easier to just give them an AMD and an Nvidia card and call it a day.

PC: CPU: AMD Ryzen 7 1700 @ 3.9 GHz  GPU: Gigabyte GTX 1080 Ti  Case: NZXT S340 Elite  RAM: 16 GB Corsair Vengeance LPX  Cooler: Corsair Hydro H110i  Motherboard: MSI X370 SLI Plus  Storage: 1x 128 GB SSD, 1x 1 TB HDD, 1x 960 Evo  Audio: Sennheiser HD600/AT2020

 

I also use a MacBook; I don't know why, but I kind of like it.

 

Music... Spotify: jagodverikheetrafv2.0

 

 


How would that be easier? I think it's much easier to just see what the card is, instead of having to look up what kind of compute units the card has (Nvidia counts CUDA cores, AMD counts GCN/Vega compute units). And if you haven't noticed, computers have gotten much faster over the years because compute units get faster and more efficient, so you'd also have to somehow work out how many older-gen compute units equal a newer-gen one, and do that for every generation. On top of that, compute units aren't everything: different cards have them clocked at different speeds, with different memory buses, different VRAM configurations, etc.
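The cross-generation problem above can be sketched in a few lines. All per-generation throughput factors and card specs below are hypothetical illustration values, not measured data; the point is only that raw CU counts mislead once clocks and architecture differ.

```python
# Why raw compute-unit counts mislead across generations.
# GEN_FACTOR values are hypothetical: assumed relative work per CU per clock.
GEN_FACTOR = {
    "old_gen": 1.0,
    "new_gen": 1.6,  # newer CUs assumed ~60% faster per clock (illustrative)
}

def relative_perf(cus, clock_mhz, gen):
    """Very rough score: CU count x clock x generation factor."""
    return cus * clock_mhz * GEN_FACTOR[gen]

old_card = relative_perf(cus=32, clock_mhz=900, gen="old_gen")   # 28800.0
new_card = relative_perf(cus=20, clock_mhz=1400, gen="new_gen")  # 44800.0

# Fewer CUs, but higher clock and a newer architecture win:
print(new_card > old_card)  # True
```

So a "32 CU" requirement would mean completely different things depending on which generation's CUs you count, and memory bandwidth and VRAM aren't even in the model.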

 

This is a mess. It's much easier to go by the GPU name instead of having to work through all of this.

