
What do you think about introducing a unit of measurement for graphics (in videos and games)?

We already know there's a unit of measurement for compute power, calculated in FLOPS (gigaflops, teraflops), but what if there were a unit of measurement for graphics, say gfxflops (or whatever you feel fits best), to measure the graphics of a game or a movie? We could say that Battlefield 4 produces maximum graphics of 35 gfxflops at 3 teraflops at a resolution of 1440p. (For movies we can't really include the compute performance or the resolution, only describe the graphics in the movie, or we could leave movies out of the topic entirely.) This could be a great tool for describing the graphics to a person who has never seen the game in action. Sometimes you can't judge the graphics just by looking at a video, can you? This unit could then be used to describe the graphics of that game.

 

Please share your thoughts below, and feel free to include more such ideas in this thread.


I don't think there's any need for a new unit, to be quite honest. The units we use at the moment do the job just fine. :) 

CPU: 5930K @ 4.5GHz | GPU: Zotac GTX 980Ti AMP! Extreme edition @ 1503MHz/7400MHz | RAM: 16GB Corsair Dom Plat @ 2667MHz CAS 13 | Motherboard: Asus X99 Sabertooth | Boot Drive: 400GB Intel 750 Series NVMe SSD | PSU: Corsair HX1000i | Monitor: Dell U2713HM 1440p monitor


What about games like Borderlands or killer7?

They have a nice style, but not "good" graphics.

"Probably Because I'm A Dangerous Sociopath With A Long History Of Violence"
 


I think the quality of graphics is in the eye of the beholder. It's a subjective topic; unlike resolution or frame rate, which are quantitative values, 'gfxflops' would need a very complicated algorithm to calculate. Also, even if the value of gfxflops could somehow be calculated for a sample of media, people can still perceive it differently. Someone could look at Tetris or the original Mario, with a presumably low gfxflops value, and think it looks better than the value of gfxflops assigned to it would suggest.


What would be the difference between gfxflops and flops?

NEW PC build: Blank Heaven minimalist white and black PC | Old S340 build log "White Heaven" | The "LIGHTCANON" flashlight build log | Project AntiRoll (prototype) | Custom speaker project


Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


What about games like Borderlands or killer7?

They have a nice style, but not "good" graphics.

That's why I asked you to add your own ideas to the thread.


Ehhhh, I see too many practical issues with this actually being carried out. Different games run very differently from one another due to optimization and other factors, and the theoretical processing power of a GPU isn't much of an indication of real-world gaming performance. It would be pretty much impossible to create one simple calculable number that summarizes both performance and aesthetics.

i7 not perfectly stable at 4.4.. #firstworldproblems


What exactly is this unit supposed to measure? If you're trying to come up with a new unit for something, it has to be 100% objective, it has to be measurable, it basically has to be something scientific. FLOPS are floating-point operations per second, something you can measure and something that makes (more or less) sense when we talk about compute performance. A unit for graphics in a video game would either have to be derived from the performance needed to play it (which makes no sense as soon as you factor in bad optimization), or you would have to come up with a formula that factors in all the objective things you can measure: texture quality, lighting effects (which aren't even objective anymore), the maximum resolution the game supports, the kinds of AA and AO it supports and how well they are implemented, etc. It would be pretty complicated and in the end you would get... a number. And as someone else already pointed out, that number still doesn't tell you if the game looks any good, and you can't really compare games based on it, since a game with a unique style or even 8-bit or 16-bit graphics can still look very, very good, whereas a game with really amazing textures and good lighting effects can still look "meh" because the developer had no clue how to make a good-looking game.
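Just to illustrate the problem, here's a minimal sketch of what such a formula might look like; everything in it (the feature list, the weights, the 0-10 ratings) is invented for the sake of argument, not taken from any real benchmark:

```python
# Hypothetical "gfxflops" formula: a weighted sum of hand-assigned
# 0-10 ratings for measurable rendering features. All weights and
# ratings below are made up for illustration.

WEIGHTS = {
    "texture_quality":   0.30,
    "lighting":          0.25,
    "max_resolution":    0.20,
    "anti_aliasing":     0.15,
    "ambient_occlusion": 0.10,
}

def gfx_score(ratings):
    """Collapse per-feature ratings (0-10) into a single number."""
    return sum(WEIGHTS[feature] * ratings[feature] for feature in WEIGHTS)

# Two fictional games: one tech-heavy, one stylized.
tech_heavy = {"texture_quality": 9, "lighting": 9, "max_resolution": 8,
              "anti_aliasing": 8, "ambient_occlusion": 9}
stylized   = {"texture_quality": 4, "lighting": 5, "max_resolution": 8,
              "anti_aliasing": 6, "ambient_occlusion": 3}

print(gfx_score(tech_heavy))  # about 8.65, could still look "meh" in motion
print(gfx_score(stylized))    # about 5.25, could still look fantastic
```

Change the weights and the ranking changes with them, which is exactly the point: the result is only as objective as the arbitrary weights behind it.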

 

I'll stick to screenshots and videos when it comes to this kind of stuff; a new unit doesn't make too much sense IMHO.

