GPUXPert

Nvidia Admits "We're Falling Behind on the Frame Rate Race"


Posted · Original Poster (OP)

EDIT by Linus
-------------------------------------------------------------------------------------------------------------------


NVIDIA reached out to me about this post and wanted to clarify Tom's statement in the interview a little bit... Here's what they had to say:


It seems to us that the forum thread might have been based on a slight misquote. It stated:


  • By the way I want to you to know a thing, since we're working on quality rather than frame rate that's probably one of the major reasons why we're out talking to people at all. And people don't get to really understand it then effectively we're kind of falling behind on the space race for frame rate & that's another major reason why we're talking.

The actual quote, at 1:14:00 of the interview, is:
  • By the way I want you to know one thing, since we're working on quality rather than raw frame rate that's probably one of the major reasons that we're out talking to people at all. Because the thing that I want to make sure doesn't happen is we go off and make our quality experience fantastic, and people don't get to really understand it, then effectively we're kind of falling behind on the space race for raw frame rate & that's another major reason why we're talking.

NVIDIA further clarifies: Tom simply said that since NVIDIA is focusing on improving the gaming experience, it is a good idea to make sure that consumers and reviewers understand what it takes to deliver great gameplay, and how to measure it. In many cases FCAT does do a better job than FRAPS FPS at characterizing what gamers see.

EDIT by GPUXPert
-------------------------------------------------------------------------------------------------------------------
UPDATE!

I want to make a few points in response to Nvidia's statement, as the author of this article, a gamer, and a PC hardware enthusiast.

I want to start out by thanking Nvidia for their response and interaction, but I would have preferred that Nvidia make an independent statement rather than have Linus relay one on their behalf.
Linus is an independent journalist and a great friend of this community, and as such I think it was inappropriate to have him deliver Nvidia's message for them.
As Linus is not an Nvidia employee, it would have been more appropriate for one of Nvidia's own people to reply, or for the great Tom Petersen himself to comment on his own statements.

Moving forward, I want to comment on the content of Nvidia's response in a number of points, so please bear with me.

The first point I want to make is that there is absolutely no difference between raw frame rate and frame rate. Frame rate (frames per second, FPS) is a factual, scientific metric; such metrics only exist in a single form and are therefore raw by definition.
For Nvidia to insinuate or suggest otherwise (that frame rate and raw frame rate are somehow different) is misleading to our readers.

The second point I want to make is that, as a hardcore gamer and graphics enthusiast, I found it admirable that Nvidia indulged in self-criticism and acknowledged an area where they may be falling behind and can improve. Their statement today, however, is backpedaling from all of that.
Backpedaling from self-criticism means backpedaling from acknowledging a mistake and from the effort to correct it.
That is bad for you and me as consumers, because we are the ones who may suffer from it, and it is also bad for Nvidia, because it means they are deviating from the path of empowering, innovation-driving self-criticism and analysis.

-GPUXPert



Below is the unaltered original post of this thread:
-------------------------------------------------------------------------------------------------------------------



Recent events have turned the graphics industry's focus toward a new performance metric, "frame times" or "frame latency", spearheaded by Nvidia, alongside the traditional metric used to rate graphics cards, "frames per second" or FPS.
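For readers new to the terminology: frame time and frame rate are two views of the same capture data, and FPS is just the reciprocal of the average frame time. A minimal Python sketch of the relationship (the timestamps and function names here are invented for illustration, not taken from any benchmarking tool):

```python
# Frame rate and frame time are two views of the same capture.
# All timestamps below are made up for illustration.

def frame_times_ms(timestamps_ms):
    """Per-frame durations (ms) from a list of frame-start timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def average_fps(timestamps_ms):
    """Average frames per second over the whole capture."""
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / total_s

# A perfectly paced 60 FPS capture: one frame every ~16.7 ms.
smooth = [i * 1000.0 / 60.0 for i in range(61)]
print(round(average_fps(smooth)))           # 60
print(round(frame_times_ms(smooth)[0], 1))  # 16.7
```

The reason reviewers have started plotting frame times is that the per-frame list preserves pacing information that the single FPS average throws away.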

Nvidia's Tom Petersen was asked in a recent PCPer.com interview: why the shift, and why now? Why hasn't Nvidia talked about this before?
He started out by saying:

Quote
As Nvidia we shouldn't talk about something that we haven't fully characterized & understood.

Until we get our tools really easy to use & reproducible, we wouldn't want to expose it to a broader community, it would be just confusing.
On the other hand, this is a technology that we use to make our GPUs better, so there is always a tension between if we brawl it out publicly we're going to enable our competitors to make their GPUs better, so it's a double edged sword.

He then added:

Quote
By the way I want to you to know a thing, since we're working on quality rather than frame rate that's probably one of the major reasons why we're out talking to people at all.
And people don't get to really understand it then effectively we're kind of falling behind on the space race for frame rate & that's another major reason why we're talking.

So the major reason Nvidia is bringing frame latency to light is that they feel they're losing the race for FPS, or frames per second.
You can find the full interview video on PCPer. Warning: it's long.

UPDATE!!
AMD's Roy Taylor, VP of Global Channel Sales and Technology & Content Strategist, has just tweeted this article twice!

https://twitter.com/amd_roy/status/334817305257209856
https://twitter.com/amd_roy/status/334817393501147136

Thing is... it seems like catching up in the frame rate race wouldn't be as difficult as in the frame latency race.


New build, since late 2015: MSi X99A RAIDER | Intel Core i7 5820K @ 3.3 - 3.6Ghz | Fractal Design Kelvin S24 | nVidia GeForce EVGA GTX 950 SSC | 32GB Corsair Vengeance LPX 2666Mhz CL15 | Fractal Design Define S | Samsung 840 EVO 250GB + Seagate Green 2TB | Fractal Design Integra 650W | 3x Dell U2415 1920x1200

Old build, since early 2009: ASUS P6X8D-E | Intel Core i7 930 @ 3,8Ghz | Corsair H50 | Radeon HD5870 | Kingston ValueRAM 1333Mhz 6GB | Antec 1200 | Samsung 840 EVO 250GB | Corsair HX850 | Seagate 2TB | (Screen res: 1680x1050 + 1280x1024).


People have to realize that in reality, frame rates are far more important than frame latency, because frame latency issues only affect a fraction of the consumer base: they only exist on multi-GPU setups.
Frame rates, on the other hand, affect everyone.
So if, let's say, 2% of graphics consumers are SLI/Crossfire users (it's actually less than that, but for the sake of argument), then AMD is ahead in 98% of the market.


I think it's great for Nvidia to be able to branch into something to do with GPUs that isn't necessarily all about FPS.

 

I personally would much prefer the gameplay actually feel smooth :)


export PS1='\[\033[1;30m\]┌╼ \[\033[1;32m\]\u@\h\[\033[1;30m\] ╾╼ \[\033[0;34m\]\w\[\033[0;36m\]\n\[\033[1;30m\]└╼ \[\033[1;37m\]'


"All your threads are belong to /dev/null"


| 80's Terminal Keyboard Conversion | $5 Graphics Card Silence Mod Tutorial | 485KH/s R9 270X | The Smallest Ethernet Cable | Ass Pennies | My Screenfetch |


Wasn't it PCPer who brought the whole frame latency thing to light? I think the first quote you got there was just some BS to explain why they kept it to themselves. The way I understood it, he thought raw framerate power was getting kind of silly and all-consuming, and that Nvidia wants to shift focus to the quality of animations rather than just showing off muscle. The point he was making was that if consumers didn't get what they were doing, then they would have been wasting their time.


Wasn't it PCPer who brought the whole frame latency thing to light? I think the first quote you got there was just some BS to explain why they kept it to themselves. The way I understood it, he thought raw framerate power was getting kind of silly and all-consuming, and that Nvidia wants to shift focus to the quality of animations rather than just showing off muscle. The point he was making was that if consumers didn't get what they were doing, then they would have been wasting their time.

If frame rates are silly, then performance is silly.

 

They may be losing the frame rate race, but they're still in front in the frame latency race, which we all know matters just as much, maybe even more.

Frame latency issues exist only in SLI & Crossfire setups; they don't exist on single cards.

That's why I find this whole "frame latency is as important as FPS" thing very silly: SLI & Crossfire users make up a tiny percentage of players.


Nvidia obviously knew about it before PCPer, hence already having a hardware solution on their 600 series cards to correct the problem. In the last press conference I watched from Nvidia (I believe the user GoodBytes documented it all and posted it on here), it seemed as though they were pushing their attention more towards their GRID technology.


People have to realize that in reality, frame rates are far more important than frame latency, because frame latency issues only affect a fraction of the consumer base: they only exist on multi-GPU setups.

Frame rates, on the other hand, affect everyone.

So if, let's say, 2% of graphics consumers are SLI/Crossfire users (it's actually less than that, but for the sake of argument), then AMD is ahead in 98% of the market.

Pretty much how I see it. Bought themselves enough time to get the 700 series out and regain the performance crown for each of the price brackets. Though this won't last until Volcanic Islands.


If frame rates are silly, then performance is silly.

 

Frame latency issues exist only in SLI & Crossfire setups; they don't exist on single cards.

That's why I find this whole "frame latency is as important as FPS" thing very silly: SLI & Crossfire users make up a tiny percentage of players.

 

There's performance, and then there's performance. You won't go fast if your wheels are just spinning.

It can be silly when you get to the point where you won't notice the difference. Of course, you'll always have future-proofing. But look at the Titan: it gains massively on dual-GPU framerates when the workload gets bigger (multi-monitor, 4K). Normal testing would have you believe it underperforms for its price, but throw some future at it, and things may very well look different.

 

Edit: Look at it like this: Would you want a speaker that could play reasonably loud and with very good audio quality, or would you prefer the extremely loud speaker with much worse quality?


There's performance, and then there's performance. You won't go fast if your wheels are just spinning.

It can be silly when you get to the point where you won't notice the difference.

 

It is silly, because right now it's impossible to notice the difference between a single 7950 (an AMD card) and a single 670 (an Nvidia card), yet Nvidia wants to charge you more for the 670 for something that's not even perceivable!

Multi-GPU setups are a different story, but those are being improved as we speak, and they make up a very small part of the user base.

 

Edit: Look at it like this: Would you want a speaker that could play reasonably loud and with very good audio quality, or would you prefer the extremely loud speaker with much worse quality?

This is a terrible example because FPS is the major contributor to quality.

AMD Single-GPUs Do NOT Have Latency Issues

 

Frame latency is the big thing everyone is talking about these days with video cards. So much so that Nvidia conveniently released a tool to accurately measure this one statistic that, under certain circumstances, shows Nvidia's cards to be superior to AMD's cards. I'm guessing that's because Nvidia's shit doesn't stink. They didn't email me back on that.

Anywho, it's been said at least twice by Logan on both The Tek and Inbox that AMD cards may run at higher framerates, but the animation is not as smooth because the frame latency is higher. I was paraphrasing, but neither how I said it, nor how Logan has said it on Youtube, states or even implies that the frame latency issues are exclusive to multi-GPU setups.

Logan said it again yesterday on The Tek, so I thought I should look it up, since I had only ever heard of this issue existing for Crossfire setups. The most recent measure of frame latencies I could find was here: http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-High-End-GPUs-Benchmarked-4K-Resolutions. That's from just yesterday.

[Image: a_singleGPU (4).png]

This is Battlefield 3, traditionally an Nvidia-favoring game. The 7970 not only beats the 680 at framerate (lower frametime = higher framerate), but also at frame latency (thinner and smoother line = less latency). It even beats the Titan at latency.

[Image: a_singleGPU (9).png]

Same goes for Crysis 3, except all the single cards are a bit more jittery here. None of them would likely be noticeably jittery. The same story goes for Sleeping Dogs, Dirt 3, and Skyrim (though Skyrim is always jittery).
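The two rules of thumb used to read these graphs (lower frame time = higher frame rate; a thinner, smoother line = less latency/jitter) can be sketched numerically. The traces below are invented for illustration and are not PCPer's data:

```python
import statistics

def fps_from_frame_times(frame_times_ms):
    """Higher average frame time -> lower effective frame rate."""
    return 1000.0 / statistics.mean(frame_times_ms)

def jitter_ms(frame_times_ms):
    """Rough 'thickness' of the plotted line: spread of the frame times.
    A thin, smooth line has jitter near zero."""
    return statistics.pstdev(frame_times_ms)

# Hypothetical traces with the SAME average FPS but very different pacing.
steady = [16.7] * 8        # smooth single-card line
runty  = [4.0, 29.4] * 4   # alternating short/long frames (dual-GPU-style pattern)

print(round(fps_from_frame_times(steady), 1))  # ~59.9 FPS
print(round(fps_from_frame_times(runty), 1))   # same average, so same FPS
print(jitter_ms(steady))                       # 0.0 -> thin line
print(round(jitter_ms(runty), 1))              # 12.7 -> thick, jagged line
```

This is why an FPS counter alone can't distinguish a smooth single-card trace from a jagged dual-card one: the average hides the alternation.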

This:

[Image: b_dualcard (9).png]

... is the latency issue people are talking about. That looks like an offensive amount of latency, but people can't even agree on whether or not that mess is noticeable in actual gameplay. The little 2-3 frame jitters in the single card tests are absolutely unnoticeable, and even that tiny chunk of latency was at a ridiculous resolution.

So... if the implication is that the issue extends to single cards, what gives? Where's the test I haven't seen?

 

 

http://teksyndicate.com/forum/gpu/amd-single-gpus-do-not-have-latency-issues/139321


It is silly, because right now it's impossible to notice the difference between a single 7950 (an AMD card) and a single 670 (an Nvidia card), yet Nvidia wants to charge you more for the 670 for something that's not even perceivable!

Multi-GPU setups are a different story, but those are being improved as we speak, and they make up a very small part of the user base.

 

This is a terrible example because FPS is the major contributor to quality.

 

Yes, but only if done right. Keep in mind that most of that segment revolved around dual gpus.


 

Thanks for just copy-pasting the article from the link I shared; I'm sure Tek Syndicate would enjoy that.


Oh come on, this is just a blatant cop-out for not being on the ball...

In a year or two, when single GPUs are able to drive three monitors, frame latency will not matter.

And at the rate they're going, AMD will get there first, and Nvidia had better be worried.


Right now, at just about every price point, AMD is crushing Nvidia. Plus, the AMD free games are amazing. Granted, my 7950 is the first card I've ever heard coil whine from, but it's only when frame rates get above 100-ish, and at 1440p that's not too often.


At least Nvidia has more stable frame rates in SLI than AMD does... AMD just tosses horsepower at people without thinking... Nvidia just <3 their CUDA and PhysX... they're the more flashy and reliable side of graphics cards... AMD is the unreliable sports car.

Also, AMD can't do:

 


Lead 3D artist, Senior Game designer & Head of Hardware maintenance at Andalusian UK.

Part time Fashion and glamour Model, Casual PC & Console Gamer, Custom Desktop modder hobbyist


It's nice seeing companies admit they're making mistakes; it's uncommon.


i5 3570 | MSI GD-65 Gaming | OCZ Vertex 60gb ssd | WD Green 1TB HDD | NZXT Phantom | TP-Link Wifi card | H100 | 5850


“I snort instant coffee because it’s easier on my nose than cocaine"


 


At least Nvidia has more stable frame rates in SLI than AMD does... AMD just tosses horsepower at people without thinking... Nvidia just <3 their CUDA and PhysX... they're the more flashy and reliable side of graphics cards... AMD is the unreliable sports car.

Also, AMD can't do:

This doesn't sound biased or emotionally driven at all. Can't say that I'm surprised or shocked; logic isn't the strong suit of any fanboy.

Just so that you know, the founders of PhysX have worked for AMD since 2011.

http://www.bit-tech.net/hardware/graphics/2011/02/17/amd-manju-hegde-gaming-physics/


This doesn't sound biased or emotionally driven at all. Can't say that I'm surprised or shocked; logic isn't the strong suit of any fanboy.

I'm no fanboy when it comes to hardware; I choose what is right for a build. Most of the time I go with Nvidia for CUDA support, because it's useful for what I do. When I do a build that is strictly for gaming I put in AMD, but that is so rare I never see the need.


Lead 3D artist, Senior Game designer & Head of Hardware maintenance at Andalusian UK.

Part time Fashion and glamour Model, Casual PC & Console Gamer, Custom Desktop modder hobbyist


I'm no fanboy when it comes to hardware; I choose what is right for a build. Most of the time I go with Nvidia for CUDA support, because it's useful for what I do. When I do a build that is strictly for gaming I put in AMD, but that is so rare I never see the need.

Adobe dropped CUDA support.

