
(Updated) Star Wars Battlefront Benchmarked On Nvidia And AMD, Runs Best On Radeon – Exceptionally Well Optimized Across The Board

Mr_Troll

IMO the Titan X was a poorly designed card. Aside from that, it's way overpriced: the 980 Ti gets about the same performance for $350 less. The Titan X actually got worse frame rates than the 980 Ti running this game.

 

Usually when a 980 Ti outperforms a Titan X, there's some thermal throttling at play. Though I suppose that is consistent with the "poorly designed" point.

Titan X thermal throttles like a true champ. It NEEDS a good cooler in order to get ahead. 980 Ti reference is the same.

Normally this is where I'd point out how much more practical using an AIO for a flagship card is, but I don't want to start a flame war or anything, so yeah :P

Archangel (Desktop) CPU: i5 4590 GPU: Asus R9 280 3GB RAM: HyperX Beast 2x4GB PSU: SeaSonic S12G 750W Mobo: GA-H97m-HD3 Case: CM Silencio 650 Storage: 1TB WD Red
Celestial (Laptop 1) CPU: i7 4720HQ GPU: GTX 860M 4GB RAM: 2x4GB SK Hynix DDR3 Storage: 250GB 850 EVO Model: Lenovo Y50-70
Seraph (Laptop 2) CPU: i7 6700HQ GPU: GTX 970M 3GB RAM: 2x8GB DDR4 Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model: Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Sorry, but this is just not true for most Maxwell cards. You are showing factory OC'd cards that boost higher. Understand, the reference GTX 960 has a base clock of 1127 MHz and a boost clock of 1178 MHz, but real-world boost lands somewhere in the ballpark of 1200-1225 MHz depending on multiple factors (thermals, silicon lottery, etc.). Each of the cards you linked has a factory OC of 100 MHz or more, plus an aftermarket cooler designed to handle the extra heat. They are not only clocked higher out of the box, but the improved coolers allow them to boost even higher too.

 

Looking at Maxwell across the full spectrum of the lineup (from the GTX 750 all the way to the GTX Titan X), it's safe to say your 1400 MHz number is not accurate for all Maxwell cards.
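
If anyone wants to check what their own card actually boosts to, here's a rough Python sketch using the pynvml package (my assumption, not from the article; it needs pynvml installed and an Nvidia driver present). Run a game or benchmark in another window while it samples:

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
try:
    # Sample the core clock once a second for ~30 seconds under load
    for _ in range(30):
        mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        samples.append(mhz)
        print(f"core: {mhz} MHz @ {temp} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

print(f"peak: {max(samples)} MHz, average: {sum(samples) / len(samples):.0f} MHz")

The average under sustained load, not the momentary peak, is the number worth holding against that 1400 MHz claim.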

Then again, the best-selling cards ARE custom versions, not reference versions, so his point is more relevant anyway.

 

You do not go out to buy a reference 960 or whatever; hell, I don't know if you can even find reference cards anymore, or if they're even sold. You go out to buy an Asus, EVGA, Gigabyte, MSI, Zotac, Palit or Gainward card... which in 99.9% of cases is not using a reference blower-type cooler.


 

 

Also, the reason they don't use features like MFAA is that it would be comparing apples to oranges. AMD cards don't even support MFAA, and Nvidia cards don't support Mantle, etc. Benchmarks have to be an apples-to-apples comparison, otherwise we can't objectively compare the cards. Furthermore, games have to support features like MFAA, and not many do, just as a lot of games don't support TXAA or AMD's EQAA.

Everyone is well aware that Nvidia does not use the same methodology with their technologies that AMD does. I'm not gonna go "oh, black-boxed middleware," but it's like TressFX vs. HairWorks: while HairWorks looks better IMO, TressFX is open source, and as a programmer it's much easier to optimize code you can actually look at (at least in my experience as a developer).

The majority of cards are sold well above stock clocks by third-party vendors.

MFAA doesn't need game support: you turn it on in the control panel and it changes the way the GPU handles MSAA. It should be 100% included in every benchmark that has AA turned on, otherwise the benchmark is showing false performance.
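
For anyone wondering what MFAA actually does, here's a purely conceptual Python toy of the idea as I understand it: alternate the MSAA sample positions every frame and blend with the previous frame, so 2 samples per frame approximate 4x coverage. The offsets and blend weights here are made up for illustration, not Nvidia's actual driver behavior:

# Toy model: 2 sub-pixel taps per frame, alternating patterns, temporally blended
EVEN_FRAME_OFFSETS = [(0.25, 0.25), (0.75, 0.75)]  # hypothetical sample positions
ODD_FRAME_OFFSETS = [(0.75, 0.25), (0.25, 0.75)]

def shade(x, y):
    # Stand-in for rasterizing/shading one sub-pixel sample of a triangle edge
    return 1.0 if (x + y) < 1.0 else 0.0

def mfaa_like(frame_index, px, py, history):
    offsets = EVEN_FRAME_OFFSETS if frame_index % 2 == 0 else ODD_FRAME_OFFSETS
    current = sum(shade(px + ox, py + oy) for ox, oy in offsets) / len(offsets)
    # Blending this frame's 2 taps with last frame's 2 taps ~ 4 effective taps
    return current if history is None else 0.5 * (current + history)

history = None
for frame in range(4):
    history = mfaa_like(frame, 0.0, 0.0, history)
    print(f"frame {frame}: resolved coverage = {history:.2f}")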

 

RTX2070OC 


I can't believe the bench is with DX11.

I would like to see DX12 benches as well though.

I'm gonna enjoy showing my flatmate just how much better it looks on my 285 than his Xbone


The majority of cards are sold well above stock clocks by third-party vendors.

MFAA doesn't need game support: you turn it on in the control panel and it changes the way the GPU handles MSAA. It should be 100% included in every benchmark that has AA turned on, otherwise the benchmark is showing false performance.

 

 

Exactly. If more people took the time to manually OC their graphics cards, we wouldn't need GPU Boost.

I did not know that bit about MFAA, since I don't own an NV graphics card, but my point about apples-to-apples comparisons still stands. No one would take benchmarks like that seriously, unless those benchmarks aren't being used to compare AMD and NV cards against each other.


CPU: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi – RAM: 4x16GB G.Skill Trident Z @ 3200MHz – GPU: ASUS Strix GeForce GTX 1080 Ti – Case: Phanteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1TB – PSU: EVGA 1000 P2 – Display(s): ASUS PB238Q, AOC 4K, Korean 1440p 144Hz monitor – Cooling: NH-U12S, 2 Gentle Typhoons and 3 Noiseblocker eLoops – Keyboard: Corsair K95 Platinum RGB – Mouse: G502 RGB & G Pro Wireless – Sound: Logitech Z623 & AKG K240


I'm sure custom 980 Tis will lead the field.

I wish there were R9 Fury Xs overclocked by 30%+ too


I wish there were R9 Fury Xs overclocked by 30%+ too

Yeah, strange that AMD doesn't try something like that.

CPU i7 6700k MB  MSI Z170A Pro Carbon GPU Zotac GTX980Ti amp!extreme RAM 16GB DDR4 Corsair Vengeance 3k CASE Corsair 760T PSU Corsair RM750i MOUSE Logitech G9x KB Logitech G910 HS Sennheiser GSP 500 SC Asus Xonar 7.1 MONITOR Acer Predator xb270hu Storage 1x1TB + 2x500GB Samsung 7200U/m - 2x500GB SSD Samsung 850EVO


That's Nvidia for you: planned obsolescence and weak architecture.

It's the same story with their older generations of cards.

Tesla was basically "you want DX11? Then buy a new card."

Fermi and Kepler lack enough VRAM to be usable now.

And Maxwell is just a power-efficiency generation; Nvidia didn't do a lot to improve raw performance.


It's the same story with their older generations of cards.

Tesla was basically "you want DX11? Then buy a new card."

Fermi and Kepler lack enough VRAM to be usable now.

And Maxwell is just a power-efficiency generation; Nvidia didn't do a lot to improve raw performance.

 

Maxwell's power efficiency came at great cost: poor compute, weak hardware scheduling, no concurrent async compute, etc.

 

I've just tried this beta and can confirm the numbers. With everything on ultra at 1080p, my 290 beats a 780 Ti. A 780 Ti!! The card that used to beat a 290X and came with a hefty price premium. It's a joke now, because it won't get the necessary driver optimizations anymore; these cards depend more on driver updates/optimizations than on proper hardware.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


It's the same story with their older generations of cards.

Tesla was basically "you want DX11? Then buy a new card."

Fermi and Kepler lack enough VRAM to be usable now.

And Maxwell is just a power-efficiency generation; Nvidia didn't do a lot to improve raw performance.

 

Maxwell was also a huge cost-cutting measure by Nvidia. If I recall, they had to sell two Kepler cards to make the same profit as one equally priced Maxwell card.

 

The efficiency was a one-two punch with Maxwell. They implemented their 3rd-gen delta color compression (comparable to AMD's Tonga and Fiji color compression), but they also went all out with shader recompiling, where the CPU recompiles/compresses graphics shaders, lowering the number of draw calls needed per frame, since the GPU contains the hardware necessary to work with the recompiled shaders. Nvidia removed the schedulers from the Maxwell GPU and used the spare CPU cycles saved from reduced draw calls to schedule the GPU's graphics pipeline. Really clever, but it also means Nvidia has to optimize their drivers for every single game to get the best performance.
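
To make the draw-call point concrete, here's a generic Python toy of draw-call batching; it shows the general idea of merging draws that share a shader/state into fewer submissions, not a claim about Nvidia's actual driver internals:

from collections import defaultdict

# Hypothetical frame: each dict is one draw the engine wants to issue
draws = [
    {"shader": "rock", "mesh": "rock_01"},
    {"shader": "rock", "mesh": "rock_02"},
    {"shader": "tree", "mesh": "tree_01"},
    {"shader": "rock", "mesh": "rock_03"},
]

def submit_naive(draws):
    # One driver submission (and potential state change) per draw
    return len(draws)

def submit_batched(draws):
    # Group draws by shader; each bucket becomes a single combined call
    buckets = defaultdict(list)
    for d in draws:
        buckets[d["shader"]].append(d["mesh"])
    return len(buckets)

print("naive draw calls:  ", submit_naive(draws))    # 4
print("batched draw calls:", submit_batched(draws))  # 2

Every submission the CPU doesn't have to make is a CPU cycle freed up, which is exactly the budget the driver can then spend on scheduling work for the GPU.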

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Maxwell's power efficiency came at great cost: poor compute, weak hardware scheduling, no concurrent async compute, etc.

I've just tried this beta and can confirm the numbers. With everything on ultra at 1080p, my 290 beats a 780 Ti. A 780 Ti!! The card that used to beat a 290X and came with a hefty price premium. It's a joke now, because it won't get the necessary driver optimizations anymore; these cards depend more on driver updates/optimizations than on proper hardware.

My GTX 280 is an old card and I can't even use it now. I feel like Nvidia gave up on pre-Maxwell cards and just focused on optimizing Maxwell to have similar performance to GCN cards...

Maxwell was also a huge cost-cutting measure by Nvidia. If I recall, they had to sell two Kepler cards to make the same profit as one equally priced Maxwell card.

The efficiency was a one-two punch with Maxwell. They implemented their 3rd-gen delta color compression (comparable to AMD's Tonga and Fiji color compression), but they also went all out with shader recompiling, where the CPU recompiles/compresses graphics shaders, lowering the number of draw calls needed per frame, since the GPU contains the hardware necessary to work with the recompiled shaders. Nvidia removed the schedulers from the Maxwell GPU and used the spare CPU cycles saved from reduced draw calls to schedule the GPU's graphics pipeline. Really clever, but it also means Nvidia has to optimize their drivers for every single game to get the best performance.

Since they removed some features/scheduling hardware to improve power efficiency, they had to optimize the card well in drivers, or else it would be a dead card that wouldn't perform. That's why I hate Maxwell. Basically, what you posted is what I'm trying to say about Maxwell.

I'm glad it had color though... if you get what I mean, look up James Clerk Maxwell and color photography. [emoji6]


Maxwell's power efficiency came at great cost: poor compute, weak hardware scheduling, no concurrent async compute, etc.

 

I've just tried this beta and can confirm the numbers. With everything on ultra at 1080p, my 290 beats a 780 Ti. A 780 Ti!! The card that used to beat a 290X and came with a hefty price premium. It's a joke now, because it won't get the necessary driver optimizations anymore; these cards depend more on driver updates/optimizations than on proper hardware.

I did some testing of the beta too...

 

The R9 295X2 fires up to 100% load on both GPUs...

CrossFire works in fullscreen... my overlay is broken so I cannot see my FPS atm ;(


Fraps?

 

Edit: Oh wait... 295X2... EA (DICE)... Mantle, yeah... sorry.

MARS_PROJECT V2 --- RYZEN RIG


 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


Fraps?

 

Edit: Oh wait... 295X2... EA (DICE)... Mantle, yeah... sorry.

 

Battlefront is DX11 only.

Watching Intel have competition is like watching a headless chicken trying to get out of a minefield

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


A reference 970/980 cannot OC for shit - THERMAL GATE PARTY!

Really though - if you compare reference vs. reference, then the Fury X > 980 Ti, as the GM200 core throttles like a boss.

If you want to compare reference vs. reference, then Maxwell is not worth a damn, considering it MUST be OCed to stand its ground.

And a page or so back a guy got 1450 on his Maxwell GPU without touching shit. Go educate yourself, run along now.

You may want to treat people who are not hostile towards you with a little more respect; being arrogant only ends badly around me. You specifically said "GPU Boost 2.0 can get a Maxwell card to 1400 out of the box." It's simply not true, even if we look at non-reference cards. Show me a GTX 750, 750 Ti, 950, etc. that can hit 1400 out of the box. The only cards that do are the exotic factory-overclocked ones. You are also the uneducated one, because the reference coolers CAN OC quite well. Your "Thermal Gate" nonsense is just you being a biased fanboy.

 

You see, the stock reference cooler's fan curve is so conservative that it barely makes any noise, even at 80-84 C. You can set it to 70% fan speed and surpass the advertised boost speed without ever hitting the 84 C boost-throttle point.
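
If you want to verify that on your own card, here's a rough pynvml sketch (my addition, assuming the pynvml package is installed; the fan readout also needs a card/driver that reports it) to watch fan % and core temp while gaming:

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    # ~1 minute of samples; watch whether temp ever parks at the 84 C limit
    for _ in range(30):
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # percent of max fan speed
        print(f"temp: {temp} C | fan: {fan}%")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()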

 

Then again, the best-selling cards ARE custom versions, not reference versions, so his point is more relevant anyway.

 

You do not go out to buy a reference 960 or whatever; hell, I don't know if you can even find reference cards anymore, or if they're even sold. You go out to buy an Asus, EVGA, Gigabyte, MSI, Zotac, Palit or Gainward card... which in 99.9% of cases is not using a reference blower-type cooler.

I don't know where you get your statistics, but they are also not true. You can find reference Nvidia cards anywhere online, and at stores such as Best Buy and even Staples here in the US. Remember, reference x60 cards do not have the magnesium shroud that the x70, x80, and Titan-class cards have.

 

960 reference cards (they are reference boards, just manufactured by MSI and PNY)

http://www.bestbuy.com/site/pny-gtx-960-2048mb-gddr5-pci-e-3-0-graphics-card-multi/1313112985.p?id=mp1313112985&skuId=1313112985

http://www.bestbuy.com/site/msi-geforce-gtx-960-graphic-card-multi/1313264851.p?id=mp1313264851&skuId=1313264851

 

970 reference card

http://www.bestbuy.com/site/nvidia-geforce-gtx-970-4gb-gddr5-pci-express-3-0-graphics-card-silver-black/9855169.p?id=1219441201895&skuId=9855169

 

Plethora of reference cards on newegg:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127842

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133579

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133563

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487067

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487068 (Superclocked edition with stock cooler? but..but... thermal problems!?!?!)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133611

http://www.newegg.com/Product/Product.aspx?Item=N82E16814500376

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487139

 

Face it: reference cards exist, and they work. You can't find a better card to handle SLI than reference blowers (unless you put a water block on them, or invest in those ASUS Turbo cards, which they don't make for the 980 or 980 Ti yet). Yes, they run hotter; it's a known fact. The reason they run hotter is that their fan curves are very tame. If you don't mind your card sounding like a jet engine, you can set your fan speed to 100% and OC just as well as even the best G1 or Strix cards (Maxwell scales well with high ASIC numbers, so keep that in mind).

 

Point is, not every Maxwell card boosts to 1400 out of the box. If you OC them, yes, 1400 is quite easily achievable. 1300 would be a much safer and more accurate figure when averaging across the whole Maxwell family. Remember, Maxwell starts with the GTX 750 and ends with the Titan X. To say all of those can boost to 1400 out of the box is just silly, don't you think?

 

Now @don_svetlio, be sure to really dig into me. Just remember to ask me for sources for my claims; I'll happily oblige. Just make sure you have the time to deal with me. The last two people who tried gave up before I could even stretch my legs.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 



I actually like how DICE is optimizing their games. BF4 runs smoothly on my 780. Yes, they're an EA studio; however, they have some top-notch developers. One of the best in the gaming industry, I would say.

CPU: AMD Ryzen 7 3800X Motherboard: MSI B550 Tomahawk RAM: Kingston HyperX Predator RGB 32 GB (4x8GB) DDR4 GPU: EVGA RTX3090 FTW3 SSD: ADATA XPG SX8200 Pro 512 GB NVME | Samsung QVO 1TB SSD  HDD: Seagate Barracuda 4TB | Seagate Barracuda 8TB Case: Phanteks ECLIPSE P600S PSU: Corsair RM850x

 

 

 

 

I am a gamer, not because I don't have a life, but because I choose to have many.

 


I actually like how DICE is optimizing their games. BF4 runs smoothly on my 780. Yes, they're an EA studio; however, they have some top-notch developers. One of the best in the gaming industry, I would say.

Yeah, I'd say they're one of the best for devs (they have some legit bloody awesome studios), and one of the best for gamers too.

 

An EA game that runs well? No way. /s

Your username suits you.

I've been playing it on both maps. More or less the same performance.
1080p @ Ultra
FX 8350 stock clock
R9 290  stock clock
16GB

It was 60+ FPS
 

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


It's in beta and it already runs like BF4 does now, two years after release :)

33-56 FPS on my PC, 1080p Ultra.

I wonder if at 4K the game uses more than 8GB of RAM. On my system at ultra it only uses a little under 4GB.
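
If anyone else wants to check, here's a quick psutil sketch (my addition, assuming psutil is installed; "starwarsbattlefront" is a guess at the process name, check Task Manager for the real one):

import psutil

TARGET = "starwarsbattlefront"  # hypothetical process name

for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if TARGET in name and mem is not None:
        # rss = resident set size, i.e. physical RAM the process holds
        print(f"{proc.info['name']}: {mem.rss / 1024**3:.2f} GB resident")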

Connection: 200mbps / 12mbps, 5GHz wifi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


The 780 Ti beating a 970 in all 3 tests; poor 970s. I guess technically the 780 Ti is more powerful than a 970, but I never knew the gap was so large.

Sounds like someone was feeding you steaming piles off the Maxwell bandwagon. In reality, the 980 is only a bit better than the 780 Ti if you overclock both cards. Maxwell only looks so good because of its really aggressive stock clocks, which also leave a lower percentage of overclocking headroom compared to the 700 series.

Ryzen 3700x -Evga RTX 2080 Super- Msi x570 Gaming Edge - G.Skill Ripjaws 3600Mhz RAM - EVGA SuperNova G3 750W -500gb 970 Evo - 250Gb Samsung 850 Evo - 250Gb Samsung 840 Evo  - 4Tb WD Blue- NZXT h500 - ROG Swift PG348Q

