I don't get the benchmark results at all... I'm playing with a 4790K @ 4.4GHz and one 290X, and on everything ultra at 1080p I sit around 60-90fps at all times. (I am on the 14.7 driver, so not even the FC4-optimized one)... other reviewers show AMD being just under the 980 by a few frames.

Intel I9-9900k (5Ghz) Asus ROG Maximus XI Formula | Corsair Vengeance 16GB DDR4-4133mhz | ASUS ROG Strix 2080Ti | EVGA Supernova G2 1050w 80+Gold | Samsung 950 Pro M.2 (512GB) + (1TB) | Full EK custom water loop |IN-WIN S-Frame (No. 263/500)


So, it has been 16 hours since the release of the video and the AMD side of things is already obsolete... personally I wouldn't be satisfied with my work this way


So when did you guys stop doing benchmarks with overclocked GPUs? Has this been discussed before and I just missed it or did this just happen? I was personally a fan of you guys just overclocking everything. Was it just a concern over the "chip lottery" and having some cards overclock well and other cards not?

 

That's exactly right. If a person trying to decide on a video card sees a benchmark that says a 280X performs fine and then goes out and buys one, there is no guarantee that they'll be able to get their own 280X to actually be as fast as the one in the benchmark. Due to that discrepancy, it's just not a good idea to do benchmarks using custom overclocks *unless* you also include the factory clock rates in the benchmark for completeness.

 

As far as them overclocking in general, I agree, I like them doing it for an *overclocking* test to see just how high they can get a card to go, but not really for benchmarks (unless stock clocks are also included in the benchmarks).


So, it has been 16 hours since the release of the video and the AMD side of things is already obsolete... personally I wouldn't be satisfied with my work this way

 

I imagine they'll re-run at least some of the benchmarks using the new drivers, but given how quickly things are being updated right now (bugs, optimizations, etc.) I wouldn't blame them if they waited for things to stabilize a bit before redoing the entire benchmark. I'd guess we'll also see some Crossfire/SLI results eventually, but those *have* to wait for the drivers to get updated. (Multi-GPU setups often don't work well at all until custom driver profiles have been created for a given game.)


[Image: dxajVia.png]

Just gonna leave this here

They should always wait a little bit until both AMD and NVIDIA have released drivers for the game, not just NVIDIA.

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer Predator XB241H (1080p, 144Hz G-Sync), LG 1080p ultrawide (all mounted), directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear Defiant (solderless swappable switches), g600, mounted mic and other stuff. 

Laptop docking area- 2 1440p Korean monitors mounted, one AHVA matte, one Samsung PLS gloss (very annoying, yes). Trashy Razer Blackwidow Chroma... I mean like the J key doesn't click anymore. I got a Model M I use on it too, but it's time for a new keyboard. Some edgy Utechsmart mouse similar to the g600. Hooked to a laptop dock for both of my Dell Precision laptops. (not only a docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch


Wow, interesting that an R9 290 beats the 970 without driver optimization at 4k and that the R9 290x manages to keep up with the 980.

I guess this is because of the larger memory bus width of the R9 290 and R9 290X.

This makes the R9 290 a good buy for anyone who plans to run FC4 at 4K, considering that it's quite a bit cheaper than the GTX 970.


I did the EXACT same run as Luke with an R9 270 overclocked to a mere 1010MHz at ultra and got completely different numbers.

Ultra preset with motion blur off 4xMSAA: Min 7, Max 57, Average 31

Ultra preset with motion blur off SMAA: Min 0 (one random lag spike, generally around ~20), Max 60, Average 45

This is with the new drivers. For some reason I didn't hit over 60 fps despite V-Sync being turned off and doing the tab out 'trick'. Could just be a fluke.
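As an aside, that single 0 fps reading shows why one stray hitch can wreck a reported minimum while barely touching the average. A quick sketch with made-up numbers (not my actual logs):

```python
# Made-up per-second fps samples from a 2-minute run:
# steady ~45 fps with one random lag spike (0 fps).
samples = [45] * 119 + [0]

min_fps = min(samples)                 # the single spike dominates the minimum: 0
avg_fps = sum(samples) / len(samples)  # barely affected by the spike: 44.625
```

This is why some reviewers report a "1% low" figure alongside the raw minimum, so one hiccup doesn't define the whole run.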

If you want, I can record the run and provide more proof. 

edit: At 1920x1080

 

Settings: http://i.imgur.com/AqnjCsr.png

SMAA Run: http://i.imgur.com/ubgwaUp.png

4xMSAA Run: http://i.imgur.com/BFyvIqu.png

 

@Slick @nicklmg

RIP in pepperonis m8s


My card is a 560 Ti, and this game slaughters it. 

I am getting like 25-30fps on minimum with everything off at 1920x1080. 

I did the graphics driver update, so I'm not sure why the performance is so poor; the minimum requirement is a GTX 460. 

Does anyone else have this issue with their 560 Ti? 

At the moment I am playing the game at 1600x900 on medium @ 60fps, which is decent, but still...

 

Help would be appreciated. Thanks guys!

 

BJ

Usually, "minimum requirements" means really minimal settings (low, 720p); a GTX 460 definitely won't be able to run this game at 1080p with decent settings at an acceptable framerate.

You're probably getting low fps either because you don't have enough video memory, or just because the game is very demanding and the 560 Ti is a pretty old card at this point... I think it's time to save some money and upgrade, my friend :)

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


Are the benchmarking results shown in LTT's videos an average of the average FPS or an average of the max FPS?


Are the benchmarking results shown in LTT's videos an average of the average FPS or an average of the max FPS?

 

 

A chart showing the average fps should be showing an average of *all* of the fps measurements taken throughout a given run.  For example, if you have a 2 minute run you are doing, and your measurement tool samples the fps every second, then you'll have a total of 120 measurements taken during the run.  An "average" should be the average of *all* of those 120 measurements.  For consistency, it's best to do a run about 3 times, then average together those three averages.
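That procedure can be sketched out like this (hypothetical numbers, runs shortened for illustration; a real 2-minute run would have ~120 samples each):

```python
# Hypothetical per-second fps samples from three runs of the same benchmark.
runs = [
    [58, 61, 60, 59],  # run 1
    [57, 62, 61, 60],  # run 2
    [59, 60, 58, 63],  # run 3
]

# Average of *all* samples within each run...
per_run_avg = [sum(r) / len(r) for r in runs]

# ...then average those run averages for the reported figure.
reported_avg = sum(per_run_avg) / len(per_run_avg)

print(per_run_avg)   # [59.5, 60.0, 60.0]
print(reported_avg)  # 59.83...
```

One detail worth noting: averaging the run averages only equals the average over every single sample when all runs are the same length, which is why benchmark runs should be timed consistently.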

 

I might be misunderstanding your question, though...


 

Which driver did you guys use? Released on the 17th.

 

Highlights of AMD Catalyst™ 14.11.2 Windows® Beta Driver Performance Improvements
Dragon Age: Inquisition performance optimizations
- Up to 5% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.
- Optimized AMD CrossFire™ profile
Far Cry 4 performance optimizations
- Up to 50% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.

 

 

Most irresponsible, misleading benchmark LTT has ever done. They refused to use an AMD driver that came out BEFORE the release of the game and BEFORE the release of the Nvidia Game Ready driver. Slick also implies that he used the new Nvidia driver (he says it's extremely recent, and the video was released on the 20th). Then Slick misinforms the audience by saying that AMD may have a new driver out soon, when their driver was available before the Nvidia driver.

 

In addition, MSAA was the worst possible scenario for the AMD card when not using the new driver.

 

I would like Slick to answer the following.

 

1) Why did you not use a driver from AMD that was out before Nvidia's driver was? In what way was that close to a legitimate benchmark?

2) Why did you test the worst possible scenario for the AMD card, without the proper driver, when this game stutters on a GTX 980 with SMAA at an unlocked 60 while doing things like driving a vehicle, let alone with MSAA? Is a CPU core usage benchmark beyond LTT, to show whether this game is well optimized and whether it is GPU brand or CPU optimization that is causing problems? If so, I suggest linking to or just posting more informative benchmarks.

3) Why did you misinform the public regarding drivers?

 

Was clicking "update driver" too hard in Catalyst? Are you simply incompetent? Then update this benchmark. If you don't update it, LTT appears to be intentionally misleading people and pushing one video card maker over another. These are very valid questions. Could LTT point out any occurrence where a benchmark was done this irresponsibly in favor of AMD? This is an Nvidia GameWorks title where AMD beat Nvidia on driver release date, and LTT is still selling a huge advantage for the Nvidia cards.

 

Why should anyone care about further LTT benchmarks when we see one company heavily favored and the benchmarks resorting to deception (you can blame incompetence if you want) to achieve a desired advantage for one card maker?

 

This is the kind of thing that will kill your website. People came here because it was fair and LTT had not sold out to manufacturers and is not owned by large firms, like Tom's Hardware now is. If this benchmark were on my page, I would delete or update it immediately and apologize for the incompetence (if it was incompetence).

 

LTT should be ashamed this benchmark even made it on air...

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


snip

Yea this benchmark is a joke; the numbers aren't even close to what you actually get after the driver update. Either they're trying to make it seem like the game runs like complete shit so that you don't buy it (Ubisoft bandwagon) or they just don't give a damn. Obviously I know it takes time and money to benchmark/edit/upload a video, but ffs at the very least add an annotation or just flat out remove the video.

RIP in pepperonis m8s


Oh god deathjester, you just won't stop. What, are you going to call @Slick an astroturfing shill as well, or what? I'm genuinely curious why you think this test was so nefarious.

I'm curious @Slick and @LinusTech, what are your thoughts on this "conspiracy theory"? DJ's comments have certainly given several members cause for concern, and he really seems to have a bone to pick lately.

 

Also, LMG has flat out stated they don't use beta drivers; they use the latest stable release there is. So... yea. Foot, meet mouth. I'm curious, do you think calling out LMG head honchos with no proof is a good move? 


Yea this benchmark is a joke; the numbers aren't even close to what you actually get after the driver update. Either they're trying to make it seem like the game runs like complete shit so that you don't buy it (Ubisoft bandwagon) or they just don't give a damn. Obviously I know it takes time and money to benchmark/edit/upload a video, but ffs at the very least add an annotation or just flat out remove the video.

Yea... You might want to check that again. That "new update" from AMD is a beta driver. Slick has already made it very clear that LMG refuses to use beta drivers for benchmarks, regardless of whether it is from AMD or Nvidia. They only use WHQL. You should have known better!

And a note to DJ: that was a bad move on so many levels! Making a direct insult to the LTT staff is no way to bring attention to an issue. If that matter (which is already debunked) angers you so much as to call them sell-outs, then what are you even doing here? That blatant lie of "helping people" is not going to work here.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


Yea... You might want to check that again. That "new update" from AMD is a beta driver. Slick has already made it very clear that LMG refuses to use beta drivers for benchmarks, regardless of whether it is from AMD or Nvidia. They only use WHQL. You should have known better! And a note to DJ: that was a bad move on so many levels! Making a direct insult to the LTT staff is no way to bring attention to an issue. If that matter (which is already debunked) angers you so much as to call them sell-outs, then what are you even doing here? That blatant lie of "helping people" is not going to work here.

Why wouldn't they use a beta driver when it gives a 50% increase in FPS (also can you link me to where they say this about not using them)? Be realistic for a second. If you're going to benchmark something, you should give numbers that people, who smartly update their drivers, will achieve. Not numbers that you will achieve by not wanting to update to new drivers. 

RIP in pepperonis m8s


Why wouldn't they use a beta driver when it gives a 50% increase in FPS (also can you link me to where they say this about not using them)? Be realistic for a second. If you're going to benchmark something, you should give numbers that people, who smartly update their drivers, will achieve. Not numbers that you will achieve by not wanting to update to new drivers. 

 

Because what percentage of the normal gaming population touches beta drivers? Benchmarking games under drivers that are inherently unstable doesn't make much sense. Overclocking is understandable; there is always that minimum amount you can squeeze out. But beta drivers? How does using those make any sense when the performance gains can be up or down or all over the place just because the code is wonky? 

This is a non-issue that Deathjester needed to try and turn into an issue, nothing more. There is no conspiracy here. There is no intentional wrong doing. To think that is just sad and misguided. 


Because what percentage of the normal gaming population touches beta drivers? Benchmarking games under drivers that are inherently unstable doesn't make much sense. Overclocking is understandable; there is always that minimum amount you can squeeze out. But beta drivers? How does using those make any sense when the performance gains can be up or down or all over the place just because the code is wonky? 

This is a non-issue that Deathjester needed to try and turn into an issue, nothing more. There is no conspiracy here. There is no intentional wrong doing. To think that is just sad and misguided. 

I just don't see the point in doing a benchmark when there's a stable driver that gives a whopping 50% fps increase that probably just about every AMD user who plays FC4 will use. 

At the very least they should add an annotation that says, "Hey AMD users! The driver is out! Go download it so you can actually play the game!"

RIP in pepperonis m8s


Why wouldn't they use a beta driver when it gives a 50% increase in FPS (also can you link me to where they say this about not using them)? Be realistic for a second. If you're going to benchmark something, you should give numbers that people, who smartly update their drivers, will achieve. Not numbers that you will achieve by not wanting to update to new drivers.

Because the 50% FPS improvement cannot be guaranteed under a beta driver. A beta driver is what it is: unfinished software that carries the risk of performance fluctuations and crashes. So you are essentially giving up game stability to get performance gains in specific scenarios. Not every PC gamer is willing to use beta drivers, and even among those who do, very few know how to use them properly.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


I just don't see the point in doing a benchmark when there's a stable driver that gives a whopping 50% fps increase that probably just about every AMD user who plays FC4 will use. 

At the very least they should add an annotation that says, "Hey AMD users! The driver is out! Go download it so you can actually play the game!"

The problem is that it is a BETA driver. It's not ready for release yet (otherwise it wouldn't be a beta driver). It might be stable for YOUR setup, but beta drivers can be incredibly unstable for many users. "Use at your own risk" and all that. It makes NO SENSE for them to benchmark using beta drivers, let alone using beta drivers for one side and stable WHQL drivers for the other side. That's a blatantly unfair comparison, which gives an advantage to the side that happens to have the WHQL driver out already.

 

In this case, AMD had a severe disadvantage (this being an NVIDIA title), so of course their driver was going to be later (NVIDIA is also notorious for not letting AMD get early access to NVIDIA games). This is not some conspiracy against AMD. @Slick doesn't have an anti-AMD agenda.

 

Once the official WHQL Driver is released from AMD, then yes, an annotation would be good. Until then, Beta is Beta, and should not be part of a standardized test suite for benchmarks.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Because the 50% FPS improvement cannot be guaranteed under a beta driver. A beta driver is what it is: unfinished software that carries the risk of performance fluctuations and crashes. So you are essentially giving up game stability to get performance gains in specific scenarios. Not every PC gamer is willing to use beta drivers, and even among those who do, very few know how to use them properly.

Very few seem to understand the inherent risks of them either. Even here, where you'd think, as a tech enthusiast forum, we'd be more educated about this. I think the games industry has spoiled them a little bit with the word "beta", which now stands for "glorified tech demo". Actual Beta can and does mean very risky potential instability. As I said, no good for a standardized test suite for benchmarks.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Very few seem to understand the inherent risks of them either. Even here, where you'd think, as a tech enthusiast forum, we'd be more educated about this. I think the games industry has spoiled them a little bit with the word "beta", which now stands for "glorified tech demo". Actual Beta can and does mean very risky potential instability. As I said, no good for a standardized test suite for benchmarks.

Yea you're right, with all the 'beta' games recently I've forgotten what it actually means. 

RIP in pepperonis m8s


Yea you're right, with all the 'beta' games recently I've forgotten what it actually means. 

And that's not even your fault. It's the fault of the fucking games industry *cough* EA and Ubisoft *cough*

 

Although EA has been so much better lately. If only by comparison ;)

 

But we must keep a rational mind about this. AMD (And NVIDIA for that matter) uses "Beta" in the correct context. Think about how many "game breaking" or "system breaking" Beta Drivers we've heard about from either side in the last several years. A Beta driver can literally kill your Windows install (Though to be clear, this is incredibly rare, and the extreme end of the spectrum).

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


And that's not even your fault. It's the fault of the fucking games industry *cough* EA and Ubisoft *cough*

 

Although EA has been so much better lately. If only by comparison ;)

 

But we must keep a rational mind about this. AMD (And NVIDIA for that matter) uses "Beta" in the correct context. Think about how many "game breaking" or "system breaking" Beta Drivers we've heard about from either side in the last several years. A Beta driver can literally kill your Windows install (Though to be clear, this is incredibly rare, and the extreme end of the spectrum).

 

A note to add:

 

The terms "stable" and "beta" are mutually exclusive. There is simply no such thing as a "stable beta driver". It is either a beta build (a work in progress) or a stable build (a public release).

 

While the term is being misconstrued in Steam Early Access, this principle holds true for most software. If you check, for example, the Google Chrome browser, you will notice that it has not two but four development channels: canary, dev, beta, and stable. What does Google download for you by default? The latest software in the stable channel. You have to search for the other channels to get them. From beta down, you get faster updates and more bleeding-edge features, but you sacrifice stability and simplicity as a relative cost. If an issue arises, you have to spend more time fixing it, and there is no telling whether the same issue will crop up again.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


Oh god deathjester, you just won't stop. What, are you going to call @Slick an astroturfing shill as well, or what? I'm genuinely curious why you think this test was so nefarious.

I'm curious @Slick and @LinusTech, what are your thoughts on this "conspiracy theory"? DJ's comments have certainly given several members cause for concern, and he really seems to have a bone to pick lately.

 

Also, LMG has flat out stated they don't use beta drivers; they use the latest stable release there is. So... yea. Foot, meet mouth. I'm curious, do you think calling out LMG head honchos with no proof is a good move? 

 

I see no reply to the questions I asked in this statement.

 

I see no REASON for Slick to have done the test the way he did. Just more personal attacks. 

 

There are two reasonable conclusions: Slick is incompetent, or he wanted one card to perform better than the other by running the benchmark with a newer driver on one card and an older driver on the other. 

 

Also, as far as the "beta driver" excuse goes: the driver was easily installable with a click inside Catalyst, and there were zero issues with anything else listed in the driver release notes. All it was was an improvement to Far Cry 4. Everyone who is playing the game on an AMD card is using that driver, and it has had zero issues and broken nothing. 

 

This would be like saying never to use a new Nvidia driver because people have had issues with Nvidia WHQL drivers that have broken other games (and issues have often been listed on WHQL drivers) or overheated cards in the past, etc. They are just as much a "beta", and I often had to switch between WHQL drivers as an Nvidia card owner on my GTX 770. The drivers would fix one game and break another.

 

Also, I can call out BS any time I see it. This benchmark was a total sham. I do not deify people like some of the fanboys in this thread. Slick performed an unfair, unrealistic benchmark that no one is going to experience in their homes. They are going to go to the forum, see the new AMD driver (probably stickied), click to download it, and be gaming at much higher FPS than his benchmark shows. That driver was out BEFORE the Nvidia driver.

 

Also, DA:I has many Nvidia users complaining on their forums, and so does Far Cry 4. Does that make the WHQL drivers betas?

 

If Slick wants to comment on it? Let him. I want to hear from him, not from idiotic white knights who personally attack me for asking valid questions that he should answer. Slick means nothing to me. He is a young tech at LTT who performed an irresponsible benchmark. If he wants to claim incompetence? Fine. Rerun the test and make it clear that the driver was out before the Nvidia driver. Problem solved. ATM his benchmark is selling a fairy tale that will be viewed many times by people buying a video card for Xmas, which can swing a ton of money in the direction of one company.

 

You want to be an internet personality? Don't do shady or incompetent stuff and then expect not to be called out on it.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.

