(Updated) Star Wars Battlefront Benchmarked On Nvidia And AMD, Runs Best On Radeon – Exceptionally Well Optimized Across The Board

Mr_Troll

There are people who got early access beta keys....

So this performance should be in line with what people will experience tomorrow.

 

Also, game-wise, it is just over a month to the final release. Any major bugs that show up during this beta will probably not be fixed in time. Expect the final product to have roughly the same performance as now.

 

I hope DICE fixed their netcode though... if not, prepare for shots that never register.

 

Driver and game updates could be released after launch. It's happened before.

 

Ah here we go, I was looking for this excuse and found it odd that no one posted it.

 

Ah here we go, bogus being bogus as usual.


The 960 is getting wrecked so hard, and the R9 290 beats a 970.

These results are kind of questionable, even though Guru3D is a reliable source. We'll see if things change after Nvidia rolls out their "game optimised" driver.

From salty to bath salty in 2.9 seconds

 


Driver and game updates could be released after launch. It's happened before.

 

 

Ah here we go, bogus being bogus as usual.

Guru3D benchmarked the NV cards with the latest Game Ready driver for Battlefront, which was released early this morning.

Intel Core i7 7800X @ 5.0 GHz with 1.305 V (really good chip), Mesh OC @ 3.3 GHz, Fractal Design Celsius S36, ASRock X299 Killer SLI/ac, 16 GB ADATA XPG Z1 OCed to 3600 MHz, Aorus RX 580 XTR 8G, Samsung 950 EVO, Win 10 Home - loving it :D

Had a Ryzen before... but a bad BIOS flash killed it :(

MSI GT72S Dominator Pro G - i7 6820HK, 980m SLI, Gsync, 1080p, 16 GB RAM, 2x128 GB SSD + 1TB HDD, Win 10 home

 


Ah here we go, bogus being bogus as usual.

 

Yes, here I am, stating the obvious... it happened with Star Swarm, it happened with every DX12 benchmark, of course it would happen with this - the only pattern is AMD, rather than NVIDIA, actually doing better in some segments, so one must deny the validity of a perfectly valid benchmark.

What's next, NVIDIA didn't have the time to make better WHQL drivers while AMD has beta ones?


Guru3D benchmarked the NV cards with the latest Game Ready driver for Battlefront, which was released early this morning.

 

I'm aware of that.


Yes, here I am, stating the obvious... it happened with Star Swarm, it happened with every DX12 benchmark, of course it would happen with this - the only pattern is AMD, rather than NVIDIA, actually doing better in some segments, so one must deny the validity of a perfectly valid benchmark.

What's next, NVIDIA didn't have the time to make better WHQL drivers while AMD has beta ones?

 

I don't know where you're pulling that from.

 

All I said was that benchmarking software that isn't finalized isn't going to reflect the final product. I'd say the same thing if Nvidia were topping AMD.

 

Stop being an asshole.


Jesus, people, put your brand-sponsored weapons aside for a moment. @Kloaked is just saying that games are known to receive updates that change performance down the line. This has happened historically for almost every AAA game ever released, whether in the form of a graphics vendor driver update or a game patch that improves (or decreases) performance.

 

Examples of improvement include games tweaking the engine to have broader hardware support, games that introduce less taxing AA implementations, or games that upgrade to a new API (WoW is a great example).

 

Examples of performance decreases come from games that try to force more content onto older engines, or that introduce patches containing memory leaks or additional particles for a certain object (Perfect World is a great example of this, going from 1.3.6 to 1.4.4).

 

It is not hard to agree with him that these results are not concrete just yet, and that anything can happen (for better or for worse) for all sides of the brand spectrum. I swear, you people get out of control over your brand names and are willing to crucify people for what should be obvious facts by now.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Can people stop already with the "fact" that Mantle had anything to do with DX12 and Vulkan being developed? Mantle at the time of its announcement had nothing to do with either DX12 or Vulkan development. Let it rest already.

QUOTE ME IN A REPLY SO I CAN SEE THE NOTIFICATION!

When there is no danger of failure there is no pleasure in success.


I don't know where you're pulling that from.

 

All I said was that benchmarking software that isn't finalized isn't going to reflect the final product. I'd say the same thing if Nvidia were topping AMD.

 

Stop being an asshole.

What I'm saying also applies no matter the outcome of the result. The software was benchmarked as it is. Period.

Both IHVs put time and effort into making drivers for it. Period.

This is the purpose of a benchmark. It's not stating how shit will be in the future; it states how shit is at the precise moment someone benchmarked it.

Not to mention that BETA and FINAL PRODUCT these days mean different things from developer to developer.

 

I'm not being an asshole.


What I'm saying also applies no matter the outcome of the result. The software was benchmarked as it is. Period.

Both IHVs put time and effort into making drivers for it. Period.

This is the purpose of a benchmark. It's not stating how shit will be in the future; it states how shit is at the precise moment someone benchmarked it.

Not to mention that BETA and FINAL PRODUCT these days mean different things from developer to developer.

 

I'm not being an asshole.

 

What you're saying is irrelevant to what I said, and all you're trying to do is stir up some shit. So yes, you are being an asshole.


This isn't really a surprise for someone who knows how BF4 performs.

GPU: Gigabyte GTX 970 G1 Gaming | CPU: i5-4570 | RAM: 2x4 GB Crucial Ballistix Sport 1600 MHz | Motherboard: ASRock Z87 Extreme3 | PSU: EVGA GS 650 | CPU cooler: Be quiet! Shadow Rock 2 | Case: Define R5 | Storage: Crucial MX100 512GB

What you're saying is irrelevant to what I said, and all you're trying to do is stir up some shit. So yes, you are being an asshole.

You said this benchmark isn't trustworthy; I'm calling you out on that nonsense.

You are trying to dodge this by insulting me, saying I'm an asshole.

 


The 960 is getting wrecked so hard, and the R9 290 beats a 970.

These results are kind of questionable, even though Guru3D is a reliable source. We'll see if things change after Nvidia rolls out their "game optimised" driver.

http://anandtech.com/show/9698/nvidia-releases-35850-game-ready-drivers-for-star-wars-battlefront

 

This article uses this exact driver....

which is the "game optimized driver".....

 

Want to concoct any more excuses?


You said this benchmark isn't trustworthy; I'm calling you out on that nonsense.

You are trying to dodge this by insulting me, saying I'm an asshole.

 

Right, because the game could change come launch; it could perform worse (unlikely) or better depending on which vendor's GPU you own, since the two see different performance gains.

 

You don't have to call me out for saying an unfinished product's benchmark isn't trustworthy for the final product. I don't see how that's nonsense.


What I'm saying also applies no matter the outcome of the result. The software was benchmarked as it is. Period.

Both IHVs put time and effort into making drivers for it. Period.

This is the purpose of a benchmark. It's not stating how shit will be in the future; it states how shit is at the precise moment someone benchmarked it.

Not to mention that BETA and FINAL PRODUCT these days mean different things from developer to developer.

 

I'm not being an asshole.

 

What you're saying is irrelevant to what I said, and all you're trying to do is stir up some shit. So yes, you are being an asshole.

I think what @bogus is saying is that all benchmarks are to be viewed with the mindset of "as of this moment in time". Meaning, at this moment in time (10/7/2015) these AMD cards are performing X faster than Nvidia cards. He seems to agree that things are subject to change; at least, that's the vibe I am getting from the context of his posts.

 

Neither of you is wrong. Though his calling your questioning of a beta game an "excuse" might be a little uncalled for, it's still his opinion to make. The important thing is that we have a point of reference. Now we know what these cards are capable of in this game right now. We can come back later, see whether any progress has been made, and have something to compare it to. It also helps people know that their card can handle the game now (assuming a patch does not degrade performance, on the rare occasion that it could) and that things should only get better from here.

 

My point is, benchmarking at its core is just comparing things against each other for a proper frame of reference. Benchmarks come in all shapes and sizes, from synthetic to real world. Even if it is a beta, that does not make it any less valuable as a frame of reference. We can use it to compare hardware now, and even compare it against future iterations of the software to see how far it has come over time (a rough sketch of that idea is below).
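To make the "frame of reference" idea concrete, here is a minimal sketch in Python. The card names, driver versions, and FPS figures are made up purely for illustration (they are not taken from the Guru3D article); it just shows how you could store one benchmark snapshot and compare it against a later run of the same scene:

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """One benchmark run: a fixed scene, a specific driver, a point in time."""
    date: str
    driver: str
    avg_fps: dict  # card name -> average FPS for the run

def compare(old: Snapshot, new: Snapshot) -> None:
    """Print the per-card change between two snapshots of the same scene."""
    for card in sorted(old.avg_fps.keys() & new.avg_fps.keys()):
        delta = new.avg_fps[card] - old.avg_fps[card]
        pct = 100.0 * delta / old.avg_fps[card]
        print(f"{card}: {old.avg_fps[card]:.0f} -> {new.avg_fps[card]:.0f} FPS "
              f"({pct:+.1f}%) [{old.driver} -> {new.driver}]")

# Hypothetical numbers, purely to illustrate the comparison.
beta = Snapshot("2015-10-07", "358.50", {"R9 290": 70.0, "GTX 970": 66.0})
launch = Snapshot("2015-11-17", "359.xx", {"R9 290": 72.0, "GTX 970": 71.0})
compare(beta, launch)
```

Each snapshot only tells you how things stood on that date with that driver; only by keeping several of them around can you start talking about a trend, which is exactly the point being argued here.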

 

TL;DR? @bogus is right that it's a valid benchmark. @Kloaked is right that these results are subject to change in the future. Both of you are right. Focus your talents on something far more important, like ending world hunger. Or getting the Kardashians off the air.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


TL;DR? @bogus is right that it's a valid benchmark. @Kloaked is right that these results are subject to change in the future. Both of you are right. Focus your talents on something far more important, like ending world hunger. Or getting the Kardashians off the air.

 

I'm too triggered to be productive. I have to become the opposite of a tumblrina.


http://anandtech.com/show/9698/nvidia-releases-35850-game-ready-drivers-for-star-wars-battlefront

 

This article uses this exact driver....

which is the "game optimized driver".....

 

Want to concoct any more excuses?

I didn't read the whole benchmark and I thought that the "optimised" drivers were going to be released later. My apologies.

I'm not concocting any excuses; I just find it a bit surprising, as games have tended to be heavily Nvidia-optimised lately. As a Radeon user I'm glad that AMD cards' overall performance isn't gimped by crap like GameWorks.

From salty to bath salty in 2.9 seconds

 


Right, because the game could change come launch; it could perform worse (unlikely) or better depending on which vendor's GPU you own, since the two see different performance gains.

 

You don't have to call me out for saying an unfinished product's benchmark isn't trustworthy for the final product. I don't see how that's nonsense.

A game changes throughout its product life cycle - it's software, and software is (usually) a target for updates aimed at improvements. So I don't get where you are trying to go with this, because by that logic no game benchmark is trustworthy until its final update (both game and drivers).

Now I see that the problem seems to be your definition of a benchmark and what kind of information you are trying to get from it.

A benchmark is a picture at some point in time. You cannot project it into the future, especially with just one data point... you can't see a trend in this benchmark to claim "AMD will perform better overall in SW BF out of beta" or vice versa. It's impossible to know that.

And because of that you are trying to discredit a benchmark of pretty much one of the most optimized game engines we have, probably more up to date than BF4.

Here are some examples of what kind of information you can get from this benchmark:

- AMD seems to be stepping up their driver game (also seen in other early benchmarks);

- WHQL and Game Optimized drivers vs. beta drivers doesn't seem to make much of a difference, and currently it seems to play more on the "marketing" side of things... a seal of approval that, after all, doesn't seem to mean much (at least performance-wise);

- the 980 Ti seems to be strong either by its nature, or NVIDIA is putting a lot of effort into making it the top dog, maybe at the cost of neglecting other cards;

- etc.

This is a completely valid benchmark - if you want to make more of it than what it is, that's your problem.


 

 


How the hell is the 290X beating the 980, and the Fury X is within 5 FPS either way of the 980 Ti?

Cuz AMD...

4690K // 212 EVO // Z97-PRO // Vengeance 16GB // GTX 770 GTX 970 // MX100 128GB // Toshiba 1TB // Air 540 // HX650

Logitech G502 RGB // Corsair K65 RGB (MX Red)


Is the beta DX12?

The weird kid in the corner eating glue
“People think that I must be a very strange person. This is not correct. I have the heart of a small boy. It is in a glass jar on my desk.” - Stephen King


AMD is coming up more and more as games are launched... What is Nvidia doing? First DX12, then async compute, now on-par performance in SW (for now).

I mean, yeah, the 980 Ti is a great card, but expensive as f**k. For the mainstream user, the R9 380 and 390 are becoming the better choices hands down against the 960 and 970.

MARS_PROJECT V2 --- RYZEN RIG

Spoiler

 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


The 290X beating the GTX 980 is awesome! 980s are like $500, and a used 290X in the USA is $220.

 

And these are DX11 benchmarks; when DX12 is out, shit is gonna get crazy.

CPU: i7-4770K | GPU: MSI reference R9 290X, liquid cooled with H55 and HG10 A1 | Motherboard: Z97X Gaming 5 | RAM: G.Skill Sniper 8 GB


What issues?

Yes, the CF profile may or may not be 100% ready atm...

Microstutter -> that is a thing of the past...

Seriously, ever since AMD stopped using the SLI/CF-style bridge (CrossFire now runs over PCIe via XDMA), microstutter has been nearly eliminated.

I only ever get microstutter during Valley with a heavy OC.

 

Old setup: 2x HD 7950 in CF

Current setup: R9 295X2

 

Must have been a faulty clock then, or maybe I just over-OC'd my 295X2. She was doing a 10% OC fine but had microstuttering issues when launching apps; I'm guessing it was power draw making the OC unstable.
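For what it's worth, microstutter is about frame-to-frame variance rather than average FPS, so it's easy to sanity-check. Below is a minimal sketch in Python, assuming a hypothetical plain-text log with one per-frame render time in milliseconds per line (not any specific tool's format), showing the kind of numbers you'd look at on a CrossFire/295X2 setup:

```python
import statistics
from pathlib import Path

def microstutter_report(path: str, spike_ratio: float = 1.5) -> None:
    """Summarise frame-time smoothness from a per-frame-time (ms) log."""
    times = [float(token) for token in Path(path).read_text().split()]
    avg = statistics.mean(times)
    p99 = sorted(times)[int(0.99 * (len(times) - 1))]
    # Frames that take much longer than the frame right before them:
    # large frame-to-frame swings are what you perceive as microstutter,
    # even when the average FPS looks perfectly fine.
    spikes = sum(1 for prev, cur in zip(times, times[1:]) if cur > spike_ratio * prev)
    print(f"frames: {len(times)}, average FPS: {1000.0 / avg:.1f}")
    print(f"99th percentile frame time: {p99:.1f} ms")
    print(f"frame-to-frame spikes (> {spike_ratio}x the previous frame): {spikes}")

# Hypothetical file name, purely for illustration.
# microstutter_report("battlefront_frametimes.txt")
```

A good average FPS with lots of those spikes is the classic AFR microstutter signature; smooth frame times that only fall apart under a heavy overclock would point at the unstable-OC explanation instead.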

ROG X570-F Strix AMD R9 5900X | EK Elite 360 | EVGA 3080 FTW3 Ultra | G.Skill Trident Z Neo 64gb | Samsung 980 PRO 
ROG Strix XG349C Corsair 4000 | Bose C5 | ROG Swift PG279Q

Logitech G810 Orion Sennheiser HD 518 |  Logitech 502 Hero

 

