
How does the 290X stack against the 970 at 1080p as of now?

dfg666

 

What's so hard to understand about FPS being an average that your computer calculates, not what actually gets sent to your monitor?

lol

There's proof right there that 60 FPS does not equal 60 fully rendered frames reaching your eye.

 

To me it seems like you have a hard time comprehending that 60 FPS does not mean a steady 16.6 ms per frame.

If you actually tried to learn something from the videos, maybe you would stop making useless arguments.

 

You're being just as absurd as someone who thinks humans can't see over 30 FPS xD
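The point about an FPS average hiding uneven frame delivery can be sketched in a few lines of Python. The frame times below are invented purely for illustration: two one-second traces with the same average FPS but very different smoothness.

```python
# Two hypothetical one-second traces. All frame times are in milliseconds
# and are made up for illustration only.
steady = [16.7] * 60                # 60 frames, ~16.7 ms each
spiky = [10.0] * 50 + [50.0] * 10   # also 60 frames totalling ~1000 ms

def avg_fps(frame_times_ms):
    """Average FPS over the trace: frames rendered / seconds elapsed."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

print(round(avg_fps(steady)))  # 60
print(round(avg_fps(spiky)))   # also 60, yet every 50 ms frame is a visible hitch
```

Both traces report "60 FPS", but the second one stutters ten times a second; that is exactly the information the average throws away.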

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


@EdInk I see it's very difficult for you to admit you're wrong, but here is more info about why FPS is not the whole story:

 

https://forum.beyond3d.com/threads/on-techreports-frame-latency-measurement-and-why-gamers-should-care.53455/#post-1678305

 

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin

 

[Image: overlay2.jpg]

 

[Image: howgameswork.jpg; frame rating measures at the end of the pipeline, at the display output]

It also analyzes whether each frame was 100% rendered or only part of a frame.

 

 

Honestly, it's not that hard to read a couple of articles and learn something once in a while...


(quoting the post above about FPS being an average and 60 FPS not equalling 60 fully rendered frames)

Frankly, I don't care what you think. You said frame time is better because FPS software uses a different algorithm for what's displayed on screen. So I've gone by the frame time, which can be used to calculate the frame rate, and yet you think I'm one of those people who believe humans can't see above 30 FPS.

(quoting the post above with the Beyond3D and PC Perspective links on frame rating)

I'd be lying if I said I haven't learned anything, but does it prove that AMD cards deliver a poor gaming experience? I don't think so.

So tell me: at what stage in the "pipeline" is the data extracted and used to display frame time, which, as you put it, is the better measurement of performance?


Quote: "I'd be lying if I said I haven't learned anything, but does it prove that AMD cards deliver a poor gaming experience? I don't think so."

I never said anything about AMD cards delivering a poor gaming experience...

 

All I ever said was that frame time is more important than FPS...

 

AMD cards are great,

and they fixed the CrossFire microstuttering issues.

 

I'm not arguing about AMD vs Nvidia at all.

I think the guy you were arguing about that with is someone else, not me.

 

By the way, about where the data is extracted: they connect a custom capture card to the GPU output, so the capture card receives the images just like a monitor would.

Then they scan each image and check whether it was a full image, part of an image, a single line, or whatever.

Then they look at how long it was displayed since the previous image.

 

So basically they look at game performance frame by frame to see how good or bad it is:

stuff that someone like you or I would notice in a game but not know exactly what was happening or why, even while getting a "solid" 60 FPS.
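The capture-card analysis described above can be sketched in miniature. This is illustrative only, not PC Perspective's actual tooling: assume each captured frame is tagged with how many scanlines of it actually reached the display, and classify it from there. The threshold and resolution values are invented.

```python
# Rough sketch of frame-rating classification. The constants and the
# (frame_id, lines_displayed) input format are assumptions for illustration,
# not the real capture pipeline.
FULL_FRAME_LINES = 1080   # a fully displayed 1080p frame
RUNT_THRESHOLD = 20       # frames shown for fewer lines than this are "runts"

def classify(captured):
    """captured: list of (frame_id, lines_displayed) tuples from a capture card."""
    results = []
    for frame_id, lines in captured:
        if lines >= FULL_FRAME_LINES:
            results.append((frame_id, "full"))
        elif lines < RUNT_THRESHOLD:
            results.append((frame_id, "runt"))     # barely visible sliver
        else:
            results.append((frame_id, "partial"))  # torn / partial frame
    return results

print(classify([(1, 1080), (2, 8), (3, 540)]))
# [(1, 'full'), (2, 'runt'), (3, 'partial')]
```

A runt frame still counts toward the FPS number even though the viewer barely sees it, which is why the frame-by-frame analysis matters.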


Hahahaha! Oh my! My apologies. It was with Monarch, wasn't it?

Hey, I wasn't having a go at you. I'm on my iPhone, so it's hard to keep track, unlike on my PC monitor.

I think I'll leave it for now till I'm on a proper PC monitor.


(quoting the earlier reply asking at what stage in the pipeline frame-time data is extracted)

 

Frame time is also important for delivering a smooth gaming experience. To learn about it, you need to go back to the 7970 vs 680 era, when it was a real issue for AMD cards, especially in CrossFire.

 

When you look at this graph, the 7990 is the clear winner, with significantly higher FPS than the 690.

[Image: BF3_2560x1440_FRAPSFPS_2.png]

 

But the frame-time graph shows the card produced erratic FPS, swinging between 120+ FPS and as low as 50 FPS within one second, which resulted in noticeable microstutters when gaming. The 690 actually provided the smoother gaming experience.

[Image: BF3_2560x1440_PLOT_1.png]

 

You can read the "Inside the Second" article by Scott Wasson, formerly of The Tech Report, about how frame timing affects the gaming experience:

http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking
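The gap between the two graphs can be reproduced in miniature: a trace with a high average FPS can still contain individual frames slow enough to read as stutter on a frame-time plot. The numbers below are invented for illustration.

```python
# Reading a frame-time plot: each per-frame time converts to an
# "instantaneous" FPS. These frame times (ms) are made up for illustration.
frame_times_ms = [8.3, 8.5, 20.0, 8.4, 8.3, 19.5, 8.6, 8.2]

avg = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst = max(frame_times_ms)

print(f"average: {avg:.0f} FPS")                            # 89 FPS, looks great
print(f"worst frame: {worst} ms (~{1000 / worst:.0f} FPS)")  # a 50 FPS moment
```

An FPS-only benchmark reports the 89; the frame-time plot is what exposes the 20 ms spikes.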

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


(quoting the post above on the 7990 vs 690 frame-time graphs)

I watched the video where he referenced the screenshots you posted, but I think I'll retire for now till I can view all this info on a better screen than the phone I'm on right now; I don't want to end up having a go at the wrong person. There's a lot of new info to digest and sh*t out.

Cheerio for now.


@Enderman @xAcid9

Regarding the 295X2: it has lower microstuttering than the 7990, mainly because drivers improved, but also because of the PLX chip it uses.

The 7990 couldn't do CrossFire over the PCIe bus, so instead it had an internal CF bridge connection (read: traces on the PCB) between the two GPU halves.

The 295X2, on the other hand, has an internal x8 or x16 Gen2 PCIe link, made possible by a PLX chip, coupled with improved drivers (which have been improved twice with respect to microstuttering since I bought my 295X2 in March/April 2015).

With much lower latency and much, much higher bandwidth on the interconnect, AMD GPUs manage to reduce the stuttering. SLI still suffers from MORE stuttering, although both solutions have improved drastically.

SLI, however, is still limited by the high latency and low bandwidth of the SLI bridge connector.

When I still had my dual 7950 setup, I DID have microstutters. Noticeably so. But pairing stuttering cards with an FX-8320 didn't help either, so I was unsure which of the two was to blame.

Later tests show that MOST of the stutter was from the FX in combination with pre-September-2015 drivers. In case you didn't know, in September 2015 AMD drastically reduced their driver overhead. The end result was that they increased their maximum draw calls by 400,000, and then further increased it by around 50,000-100,000 with Crimson. As it currently stands, Nvidia's theoretical maximum draw-call count is about 300,000 higher than AMD's.

On the flip side, before the September AMD driver, Nvidia would have been able to push 800,000 more draw calls, which is why Maxwell, at the time, was destroying AMD's R9 290 and R9 290X.

Benchmarks done after September 2015 show pretty much ALL AMD cards, back to at least the 6000 series, gaining about 7-8 FPS in all titles. A flat increase.

But I'm getting sidetracked here.

Nvidia still runs things over the SLI bridge, which in terms of bandwidth and latency is massively inferior to PCIe Gen2, let alone PCIe Gen3. This is probably why Nvidia is having scaling issues: they simply don't have the bandwidth at the high end to scale two powerful GPUs linearly.

Simply put, Nvidia is moving away from SLI, while AMD is moving towards CrossFire.

Honestly, if AMD can reduce the latency between two GPUs even further, then their proposed dual-GPU setups for VR will be really interesting.

Anyway, do I still get microstutters with a 295X2?

Yes.

How often?

Maybe five or fewer events within an hour of play, depending on the game.

Rise of the Tomb Raider doesn't even have optimized AMD drivers out, and it exhibits no microstuttering, although CrossFire does NOT work properly in it yet.

The Witcher 3 has more stutters now than before. But I'm confident that's due to one of the game patches, which brought with it a plethora of issues for AMD cards in particular. Dunno exactly what was changed, but AMD cards did NOT like it. It may be down to changes in GameWorks effects, as it was around the same time CDPR introduced more advanced HairWorks settings.


We'll see. Don said the same thing, and when, months later, AMD had caused more issues chasing FPS than they'd fixed, it was nothing but excuses for AMD. I couldn't give two shits who made my card; it better fucking work, no excuses. Almost flipped a bitch when I had input lag with the Hybrid; turned out to be something I did, lol.

http://linustechtips.com/main/topic/537024-mini-news-rise-of-the-tomb-raider-amd-drivers-coming-asap-in-driver-1611/

I guess it won't be that bad (just an update on the topic)

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


Like I said, we'll see. I spent months waiting for a patch that would be released "shortly" to fix Fallout 4, and it never came. Had to fix it myself by getting an Nvidia card.

If anyone asks, you never saw me.

