(Updated) Star Wars Battlefront Benchmarked on Nvidia and AMD, Runs Best on Radeon – Exceptionally Well Optimized Across the Board

Mr_Troll

Honestly, it would've been so much easier if they had just written which editions of the cards they used, since we can see one WF card in the bench. Then I could argue whether the cards in the benchmarks represent the best or worst possible scenario for each specific card. :ph34r:

The AMD cards are all Sapphire Tri-X, and of course the Fury X is its water-cooled reference design. Nvidia's are reference designs with GPU Boost disabled. Biased article is biased. <-- If you reached this point without realizing the sarcasm, I feel sorry for you.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

The AMD cards are all Sapphire Tri-X, and of course the Fury X is its water-cooled reference design. Nvidia's are reference designs with GPU Boost disabled. Biased article is biased. <-- If you reached this point without realizing the sarcasm, I feel sorry for you.

Do you work at Guru3D? Show me the line where they say they use reference Nvidia cards with GPU Boost disabled... please.

Intel Core i7 7800X @ 5.0 GHz with 1.305 volts (really good chip), Mesh OC @ 3.3 GHz, Fractal Design Celsius S36, ASRock X299 Killer SLI/ac, 16 GB ADATA XPG Z1 OCed to 3600 MHz, Aorus RX 580 XTR 8G, Samsung 950 Evo, Win 10 Home - loving it :D

Had a Ryzen before... but a bad BIOS flash killed it :(

MSI GT72S Dominator Pro G - i7 6820HK, 980M SLI, G-Sync, 1080p, 16 GB RAM, 2x128 GB SSD + 1TB HDD, Win 10 Home

 

Do you work at Guru3D? Show me the line where they say they use reference Nvidia cards with GPU Boost disabled... please.

I work at Alphabet. All knowledge is belong to me. No other citation required.

The AMD cards are all Sapphire Tri-X, and of course the Fury X is its water-cooled reference design. Nvidia's are reference designs with GPU Boost disabled. Biased article is biased. <-- If you reached this point without realizing the sarcasm, I feel sorry for you.

Much Fanboy, Such Salty, So Drama

Archangel (Desktop) CPU: i5 4590 GPU: Asus R9 280 3GB RAM: HyperX Beast 2x4GB PSU: SeaSonic S12G 750W Mobo: GA-H97M-HD3 Case: CM Silencio 650 Storage: 1 TB WD Red
Celestial (Laptop 1) CPU: i7 4720HQ GPU: GTX 860M 4GB RAM: 2x4GB SK Hynix DDR3 Storage: 250GB 850 EVO Model: Lenovo Y50-70
Seraph (Laptop 2) CPU: i7 6700HQ GPU: GTX 970M 3GB RAM: 2x8GB DDR4 Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model: Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6

I work at Alphabet. All knowledge is belong to me. No other citation required.

"I am my own source - BELIEVE ME PEASANTS!" - you are starting to sound like a Cult Leader here.

I work at Alphabet. All knowledge is belong to me. No other citation required.

 

No, but really, do go ahead and back up your claim.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling

So instead of RIP graphics cards it would be RIP CPU? If that's the case, I'm glad I didn't stick with the i5 4440.

Well, at stock, yes.

If you OC the i7, it should hold out much longer.

I think the only games where you could hit CPU limits these days will be games that are heavily draw-call bound with massive numbers of units...

Games like Ashes, the Total War series, Civilization, etc. - RTS games in general - can hit CPU limits even today...
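The draw-call point above can be sketched with a toy model: frame time is roughly the larger of the CPU's submission work and the GPU's render work, so piling on units (and therefore draw calls) eventually moves the bottleneck to the CPU. All numbers here are made-up illustrations, not measurements from any real game.

```python
# Toy model: frame time is the max of CPU draw-call submission time
# and GPU render time. Whichever is larger is the bottleneck.
def fps(draw_calls, cpu_us_per_call=5.0, gpu_ms=10.0):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0  # total CPU submission time
    frame_ms = max(cpu_ms, gpu_ms)                  # the slower side sets the pace
    return 1000.0 / frame_ms

# A shooter with a couple thousand draw calls stays GPU-bound...
print(fps(2_000))   # 100.0 fps, limited by the 10 ms of GPU work
# ...while an RTS with tens of thousands of units becomes CPU-bound.
print(fps(40_000))  # 5.0 fps, limited by 200 ms of CPU submission
```

A faster CPU (or an OC) shrinks `cpu_us_per_call`, which is why the i7 holds out longer in exactly these drawcall-heavy games.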

Well, at stock, yes.

If you OC the i7, it should hold out much longer.

I think the only games where you could hit CPU limits these days will be games that are heavily draw-call bound with massive numbers of units...

Games like Ashes, the Total War series, Civilization, etc. - RTS games in general - can hit CPU limits even today...

I can run my 4790K safely at 4.8 GHz, so it will be fine for a long time. The i5, however, can only be pushed to 3.25 GHz base... and its version of Turbo Boost is useless, unlike Turbo Boost on Devil's Canyon - the boost speed might as well be the base clock.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

No, but really, do go ahead and back up your claim.

As for the consistency, I already did. The card-models claim was sarcasm, and highlighted as such.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

As for the consistency, I already did. The card-models claim was sarcasm, and highlighted as such.

https://en.wikipedia.org/wiki/Sarcasm

Silicon lottery or not, 1400 MHz on GPU Boost 2.0 is not happening. Please back up your statement - you were just calling out patrickjp93 for that. I don't remember ever seeing it give more than 100-150 MHz, let alone 300 or 400 MHz (because that needs playing with voltages, and AFAIK 2.0 can't do that).

My GTX 970 boosts up to 1430 MHz out of the box... A friend of mine has a 980 and it's almost the same; it boosts up similarly.

I guess it's luck. Anyway, AMD is stronger in this game than Nvidia. It would be interesting to see results with Maxwell overclocked; sometimes the extra 100-200 MHz makes a difference...

Why can't all games be as optimised as this one?

Longboarders/ skaters message me!

Yes - for the small group of hardware enthusiasts like us who OC our GPUs. You'd be surprised how many people don't overclock or give a fuck about overclocking, even when I offered to do it for them.

Those people also don't buy a $300-500 GPU; the people that buy those GPUs are mostly enthusiasts.

And almost every benchmark ignores GPU features like MFAA, which increases performance when using MSAA by almost 10 fps on Maxwell.

A benchmark should be done the way the cards are actually used; if a game has Mantle, for example, it should be in the benchmark as well.

 

Remember that benchmarks turn on all the eye candy that normal people don't use. I'm pretty sure most settings are on Ultra or Very High, and AA is at 16x.

I'm already playing the game maxed out; it has no MSAA, only FXAA/TAA and a resolution scale for downsampling.

RTX2070OC 

Would it be wrong to assume Nvidia will release better drivers once the game is released?

 

They may. I think all of EA's Frostbite 3 engine games have been associated with AMD, dating back to Battlefield 4, and some of the performance discrepancies got ironed out over time.

 

This isn't really anything to worry about, though. Yeah, Nvidia's cards are a few FPS behind where we'd expect them to be, but they're all achieving very good performance regardless.

If not, Nvidia wouldn't have that 82% market share.

 

 

:lol:

Sort of true, really...

If driver "magic" wasn't ever a thing, the GTX 900 series would be battling to even get a foothold over the R9 200 series...

Guru3D has updated the benchmark with the Titan X, Titan Black, GTX 770, GTX 950 and R9 280X.

The 770 beats the 960.

The 280X is tied with the 380; knowing that GPU, it will probably pull well ahead of the 380 when overclocked.

The 950, strangely, is matching the 960, which suggests to me that Nvidia needs to tune the 960 driver specifically.

 

I am happy with my 290's 1440p numbers; I want to buy another for one-GPU-per-eye VR.

Those people also don't buy a $300-500 GPU; the people that buy those GPUs are mostly enthusiasts.

And almost every benchmark ignores GPU features like MFAA, which increases performance when using MSAA by almost 10 fps on Maxwell.

A benchmark should be done the way the cards are actually used; if a game has Mantle, for example, it should be in the benchmark as well.

 

I agree with you to an extent. I have a small sample size (because I'm one person), but I have friends with a 780 Ti, an R9 390, two 660 Tis in SLI, a 680 Classified, another 390, and 290s in CF, and none of them overclocked their GPUs. I have more friends with custom PCs, but I don't know if they overclock. Not everyone who spends money on a GPU is a hardware enthusiast; they might just be gamers who want to be sure they can max out their games and play at 1080p 60 fps all the time.

 

Also, the reason they don't use features like MFAA is that it would be comparing apples to oranges. AMD cards don't even support MFAA; Nvidia cards don't support Mantle; etc. Benchmarks have to be apples-to-apples comparisons, otherwise we can't objectively compare the cards. Furthermore, games have to support features like MFAA, and not many do, just as a lot of games don't support TXAA or AMD's EQAA.

Everyone is well aware that Nvidia does not use the same methodology with its technologies that AMD does. I'm not going to go all "oh, black-boxed middleware"; it's like TressFX vs. HairWorks. While HairWorks looks better IMO, TressFX is open source, and as a programmer it's much easier to optimize code you can actually look at (at least in my experience as a developer).

CPU: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro WiFi – RAM: 4 x 16 GB G.Skill Trident Z @ 3200 MHz – GPU: ASUS Strix GeForce GTX 1080 Ti – Case: Phanteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB – PSU: EVGA 1000 P2 – Display(s): ASUS PB238Q, AOC 4K, Korean 1440p 144Hz monitor – Cooling: NH-U12S, 2 Gentle Typhoons and 3 Noiseblocker eLoops – Keyboard: Corsair K95 Platinum RGB – Mouse: G502 RGB & G Pro Wireless – Sound: Logitech Z623 & AKG K240

It is - GPU Boost 2.0 can get a Maxwell card to 1400 out of the box. That is not stock.

 

Sorry, but this is just not true for most Maxwell cards. You are citing factory-overclocked cards that boost higher. Understand: the reference GTX 960 is clocked at 1127 MHz with a rated boost clock of 1178 MHz, but real-world boost is somewhere in the ballpark of 1200-1225 MHz depending on multiple factors (thermals, silicon lottery, etc.). Each of the cards you linked has a factory OC of 100 MHz or more, plus an aftermarket cooler designed to handle the extra heat. They are not only clocked higher out of the box, but the improved coolers allow them to boost even higher too.

 

Looking at Maxwell across the full spectrum of the lineup (from the GTX 750 all the way to the GTX Titan X), it's safe to say your 1400 MHz number is not accurate for all Maxwell cards.
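The clock figures above can be put into a small sketch. The reference GTX 960 numbers are the ones quoted in the post; the +100 MHz factory OC and the extra cooler headroom are assumptions chosen purely to illustrate how an AIB card could land near 1400 MHz while a reference card does not.

```python
# Reference GTX 960 clocks as quoted above (MHz): base, rated boost,
# and the real-world boost actually observed under load.
REFERENCE_960 = {"base": 1127, "rated_boost": 1178, "real_boost": 1215}

def real_world_boost(card, factory_oc=0, cooler_headroom=0):
    """Observed boost = reference real-world boost + factory OC offset
    + any extra headroom an aftermarket cooler unlocks (both assumed)."""
    return card["real_boost"] + factory_oc + cooler_headroom

ref = real_world_boost(REFERENCE_960)            # reference card: ~1215 MHz
aib = real_world_boost(REFERENCE_960,
                       factory_oc=100,           # assumed +100 MHz factory OC
                       cooler_headroom=85)       # assumed thermal headroom
print(ref, aib)                                  # 1215 1400
```

The point of the sketch is that "my card boosts to 1400 out of the box" and "GPU Boost on a reference card adds 1400" are different claims: the first bakes in the factory OC and cooler, the second doesn't.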

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 

I don't see the Titan X on any of those graphs.

 

How is its performance?

IMO the Titan X was a poorly designed card. Aside from that, it's way overpriced; the 980 Ti gets about the same performance for $350 less. The Titan X actually got worse frame rates than the 980 Ti running this game.

Project: JETTE, Custom workstation/gaming PC build http://pcpartpicker.com/b/XZTH99 i7-5820K | Asus SABERTOOTH X99 ATX | Corsair Vengeance LPX 32GB DDR4-2400 RAM | Zotac GeForce GTX 980 Ti 6GB AMP! Extreme | Phanteks EVOLV ATX | Samsung 850 EVO-Series 500GB + Seagate 3TB HDD | Corsair AX760 | LG 34UC87C 3440x1440 34" Ultrawide Curved Monitor | Corsair H110i GTX | CM Storm QuickFire Ultimate Gaming Keyboard | Logitech MX Master | Bose C2 Speakers | Windows 10

I can't wait to get home and test this game out on my 3440x1440 monitor! I'm so hyped! Star Wars Battlefront II for PS2 was my childhood!

IMO the Titan X was a poorly designed card. Aside from that, it's way overpriced; the 980 Ti gets about the same performance for $350 less. The Titan X actually got worse frame rates than the 980 Ti running this game.

 

Usually when a 980 Ti outperforms a Titan X, there's some thermal throttling at play. Though I suppose that is consistent with the "poorly designed" point.

Sorry, but this is just not true for most Maxwell cards. You are showing factory OC'd cards that boost higher. Understand, reference GTX 960 is clocked in at 1127mhz, boost clock is 1178, but real world boost is somewhere in the ball park of 1200-1225 depending on multiple factors (thermals, lottery, etc). Each of the cards you linked, has a factory OC of 100mhz or more, and after market coolers designed to handle that extra cooling. They are not only clocked higher out of the box, but the improved coolers allow them to boost even higher too. 

 

Looking at Maxwell across the full spectrum of their lineup (from the GTX 750, all the way to the GTX Titan X) it's safe to say your 1400mhz number is not accurate for all Maxwell cards. 

A reference 970/980 can't OC for shit - THERMAL GATE PARTY!

Really though - if you compare reference vs. reference, then the Fury X > 980 Ti, as the GM200 core throttles like a boss.

If you want to compare reference vs. reference, then Maxwell cards are not worth a damn, considering they MUST be OCed to stand their ground.

And a page or so back a guy got 1450 on his Maxwell GPU without touching shit. Go educate yourself; run along now.
