Posted July 9, 2018 Just wondering: is there a reason graphics card requirements can't be expressed in compute units? E.g. 720p: 5 CUs, 1080p: 8 CUs, 4K: 32 CUs. That way you wouldn't have to mess about to see if a GT 480 is equivalent to a GT 730. All examples are random.
Posted July 9, 2018 Some people don't know what compute units mean. It's easier to just give them an AMD card and an Nvidia card and call it a day.
Posted July 9, 2018 Not all cores are created equal, so saying that the quantity of cores is the only thing that matters would be wrong.
Posted July 9, 2018 How would that be easier? I think it's much easier to just see what the card is, instead of having to look up what type of compute units the card has (Nvidia has CUDA cores, AMD has Stream Processors). And if you haven't noticed, computers have gotten much faster over the years because compute units get faster and more efficient, so you'd also have to somehow work out how many older-gen compute units equal a new-gen compute unit, and that for every generation. Compute units aren't everything, either: different cards have them clocked at different speeds, with different memory buses, different VRAM configurations, etc. It's a mess. It's much easier to get a GPU name instead of having to go through all of this.
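To illustrate the point above, here is a minimal sketch (every spec below is a made-up number for illustration, not a real card) of estimating theoretical FP32 throughput as shaders per unit × 2 ops per cycle (FMA) × clock. Two cards with the same unit count can come out very far apart:

```python
# Sketch: why "number of compute units" alone doesn't rank GPUs.
# All specs below are invented for illustration, not real cards.

def theoretical_tflops(shaders_per_unit, units, clock_ghz):
    """FP32 peak: shaders * 2 ops/cycle (fused multiply-add) * clock, in TFLOPS."""
    return shaders_per_unit * units * 2 * clock_ghz / 1000

# Two hypothetical cards with the SAME unit count (8):
old_gen = theoretical_tflops(shaders_per_unit=64, units=8, clock_ghz=1.0)
new_gen = theoretical_tflops(shaders_per_unit=128, units=8, clock_ghz=1.8)

print(f"old gen: {old_gen:.2f} TFLOPS")  # 1.02 TFLOPS
print(f"new gen: {new_gen:.2f} TFLOPS")  # 3.69 TFLOPS
```

And even these peak numbers ignore memory bandwidth, VRAM capacity, and per-architecture efficiency, which is exactly why a bare unit count in a requirements listing wouldn't tell you much.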