(Official?) Battlefield V with RTX benchmark

Zandvliet
30 minutes ago, M.Yurizaki said:

Maybe related maybe not. But something from the past to remember going forward:

 


 

 

Yeah, those of us who aren't complete idiots fully expected this. We've seen this before whenever a new version of DX or some other improvement came out. One to two years from now, when NVIDIA has finished optimizing their drivers for the 20-series platform and game devs have a better idea of how to get the best performance out of the tools they have, we'll see 1440p 60Hz RT being doable on a 2080 Ti.

 

But no one with any memory of history expected playable performance at anything but the lowest resolutions with it enabled on release day.


31 minutes ago, CarlBar said:

 

Yeah, those of us who aren't complete idiots fully expected this. We've seen this before whenever a new version of DX or some other improvement came out. One to two years from now, when NVIDIA has finished optimizing their drivers for the 20-series platform and game devs have a better idea of how to get the best performance out of the tools they have, we'll see 1440p 60Hz RT being doable on a 2080 Ti.

 

But no one with any memory of history expected playable performance at anything but the lowest resolutions with it enabled on release day.

I also found benchmarks of the same game, same settings, but a year later when benchmarking the GeForce 580 and Radeon HD 6970. There was a huge positive difference.

 

Though I'm too lazy to find it right now


6 hours ago, CTR640 said:

Ye sorry but 1080p is ugly when you have experienced 1440p and up. Never gonna buy a €1400 GPU just to go back from 1440p to 1080p lmfao.

I regularly bounce between 1440p and 900p. It's rather easy.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


Some interesting results in this video. It seems that while the peak frame rate drops a lot, the lows aren't as far off the peak. They're still low enough to be concerning, but getting a consistent 60 FPS may be a lot easier than we thought, optimization-wise.
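The gap between average frame rate and the lows is usually quantified as a "1% low": the average FPS over the slowest 1% of frames. A minimal sketch of that calculation, using hypothetical frame-time numbers (not taken from the video):

```python
from statistics import mean

# Hypothetical per-frame render times in milliseconds from a benchmark run:
# mostly ~60 FPS frames with a few ~30 FPS spikes.
frame_times_ms = [16.7] * 95 + [33.3] * 5

# Average FPS over the whole run.
avg_fps = 1000 / mean(frame_times_ms)

# "1% low" FPS: average over the slowest 1% of frames.
slowest = sorted(frame_times_ms, reverse=True)[: max(1, len(frame_times_ms) // 100)]
low_1pct_fps = 1000 / mean(slowest)
```

When the 1% low sits close to the average, the frame rate is being dragged down by occasional dips rather than a sustained slowdown, which is the pattern the video suggests.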

 

 


7 hours ago, The Benjamins said:

IDK, but it appears that it's bottlenecked by the portion of the GPU that handles ray tracing, which also points to the hardware being too new and needing work.

This is just DXR so no ray tracing at all, just up sampling on Tensor cores.


7 hours ago, Morgan MLGman said:

This graph is a lot more interesting IMO. I mean, who's going to buy an RTX 2080Ti and play on a 1080p monitor? That's like buying a supercar and driving it with a 60MPH speed limiter of some sort.

Even at 1440p, the performance is barely over 40 FPS with anything over DX12 + DXR Low enabled. At 4K, it's unplayable beyond that threshold. Note that this is a $1,000+ GPU that's marketed as a 4K GPU ;) I don't remember Nvidia mentioning that it's only a 4K GPU if you don't use the ray-tracing features that they also market these cards with.

 

Edit: Dumb comment contained below, was thinking of DLSS lol.

Spoiler

 

One of the prevailing arguments for DXR is to upscale the image to 4K with an internal render at 1080p, for higher FPS. That means you are still putting native-resolution output to the monitor.

 

However, this graph shows us something very interesting: 1080p DXR on any setting is worse than native 4K rendering, so that's RIP to that DXR playbook.

 

 

Edited by leadeater

32 minutes ago, leadeater said:

.

So DXR is the whole "miracle" DLSS that would turn 1440p into 4K with no performance impact? I'm confused about what's going on... Despite getting an RTX card myself, I never really got into ray tracing; I just wanted the rasterization performance haha. I also don't play Battlefield, so I have nothing to base this on... but damn, it's looking worse than I could've expected...

 

Feels odd knowing that roughly half of my GPU is pretty much useless.

Personal Desktop:

CPU: Intel Core i7 10700K @5ghz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

34 minutes ago, leadeater said:

This is just DXR so no ray tracing at all, just up sampling on Tensor cores.

"Included with this update will be the first release of DXR ray-traced reflections, which add lifelike, cutting-edge real-time reflections to surfaces and objects in Battlefield V’s epic 64-player multiplayer maps and cinematic War Stories. With DXR enabled, graphical fidelity, realism and immersion are elevated to previously unobtainable levels"

https://www.nvidia.com/en-us/geforce/news/battlefield-v-rtx-ray-tracing-out-now/

 

I'm not saying you're wrong, but I'm confused.

.


46 minutes ago, leadeater said:

This is just DXR so no ray tracing at all, just up sampling on Tensor cores.

DXR is ray tracing.

https://www.nvidia.com/en-us/geforce/news/battlefield-v-rtx-ray-tracing-out-now/

DXR is DirectX Raytracing.

DLSS is the upscaling tech.

 

if you want to annoy me, then join my teamspeak server ts.benja.cc


9 hours ago, voiha said:

DXR Medium: 64.5 fps
DXR High: 66.4 fps
DXR Ultra: 65.3 fps

How is this possible? Am I misunderstanding something?

 

 


It's probably within the margin of error of the test. These scores are usually obtained by having the journalist run through a stage, trying to replicate the same movements as closely as possible on every run; 2-3 FPS differences can generally be attributed to the journalist being a mere human. What's more likely is that there isn't much difference at all between Medium/High/Ultra DXR, or that the card is hitting a specific bottleneck that prevents it from going higher, even though the RT cores could handle the higher detail levels.
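The margin-of-error argument is easy to sanity-check with the three numbers quoted above, treating the Medium/High/Ultra results as if they were repeated runs of a near-identical workload (which is exactly the hypothesis here):

```python
from statistics import mean, stdev

# DXR results quoted above, in FPS.
runs = {"Medium": 64.5, "High": 66.4, "Ultra": 65.3}

avg = mean(runs.values())
spread = max(runs.values()) - min(runs.values())
sd = stdev(runs.values())
# A spread under ~2 FPS on a manual run-through is plausibly just
# human run-to-run variance, not a real difference between settings.
```

A spread that small, relative to a ~65 FPS average, is well within what you'd expect from a human-driven benchmark pass.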

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


4 hours ago, M.Yurizaki said:

Maybe related maybe not. But something from the past to remember going forward:

 


 

Yep, I don't know why anyone bothered to push forward with AA; clearly it was a failure of a technology that should never have been pursued. Oh wait...   9_9

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Benchmarking at 1920x1080 with THAT card? They gotta be fucking kidding; even my 14-year-old monitor has a higher resolution.

 

No sane person would spend $2000 on a card to play at 1080p; I think whoever owns a 2080Ti has AT LEAST a 2160p screen.

 

 

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


2 hours ago, asus killer said:

"Included with this update will be the first release of DXR ray-traced reflections, which add lifelike, cutting-edge real-time reflections to surfaces and objects in Battlefield V’s epic 64-player multiplayer maps and cinematic War Stories. With DXR enabled, graphical fidelity, realism and immersion are elevated to previously unobtainable levels"

https://www.nvidia.com/en-us/geforce/news/battlefield-v-rtx-ray-tracing-out-now/

 

I'm not saying you're wrong, but I'm confused.

No, I was confused lol, was thinking of DLSS. Need more sleep at night haha.

 

1 hour ago, The Benjamins said:

DXR is ray tracing.

https://www.nvidia.com/en-us/geforce/news/battlefield-v-rtx-ray-tracing-out-now/

DXR is Direct X Ray tracing.

DLSS is the up scaling tech

See above, 🤦‍♂️


1 hour ago, aezakmi said:

Benchmarking at 1920x1080 with THAT card? They gotta be fucking kidding; even my 14-year-old monitor has a higher resolution.

 

No sane person would spend $2000 on a card to play at 1080p; I think whoever owns a 2080Ti has AT LEAST a 2160p screen.

 

 

 

Resolution isn't the only way to improve visuals, if you actually pay attention and compare what RT really does visually.

 


Didn't they market it as 1080p 60fps ray tracing OR 4K 60fps back in the original presentation?

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


That's also a benchmark, folks; we all know real games will be worse than this.

At 40 FPS, an FPS becomes more of a turn-based game. Ouch. I'm assuming it's a driver issue, not what these cards are capable of.

That would really kill RTX before it got off the ground.

I would expect 65-70 FPS to be the spot they want to hit, given that's fast enough to feel fluid; obviously not 100 FPS fluid, but smooth enough.

If 40 FPS is what we can expect, ouch: no one will be turning on ray tracing.

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM 2x16GB X5 6000Mhz CL32 MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | 
STORAGE 
| 2x Samsung Evo 970 256GB NVME  | COOLING 
| Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo


Well, in light of @mr moose's comment, I did want to see what was a factor in DirectX 11 performance vs. DirectX 10, since I've pulled up more benchmarks that paint a better picture. The only correlation I can find is that one of DirectX 11's most touted features, tessellation, could wreak havoc on performance.

 

(benchmark charts: DirectX 10 vs. DirectX 11 performance with tessellation)

 

Though to be fair, AMD's tessellator wasn't as good as NVIDIA's at the time. And some games that did use tessellation, like Deus Ex: Human Revolution, applied it to fewer elements, so the performance impact wasn't that bad at all. Also, NVIDIA did build special hardware for tessellation, which they called the PolyMorph Engine.

 

Ah well, I don't know what I'm trying to get at here, other than that we've seen this before with a different feature.


1 hour ago, M.Yurizaki said:

I don't know how to react to this :c

My point was more just that AA in the early days caused a big drop in FPS. Now, if people are claiming that RTX is a failure of a technology because it has such a massive hit to FPS, then shouldn't the same logic apply to AA? Did these people honestly think AA was going to fail because it wasn't perfect out of the gate?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


2 minutes ago, mr moose said:

My point was more just that AA in the early days caused a big drop in FPS. Now, if people are claiming that RTX is a failure of a technology because it has such a massive hit to FPS, then shouldn't the same logic apply to AA? Did these people honestly think AA was going to fail because it wasn't perfect out of the gate?

Well, AA was used, but the graphs were also showing the difference between DX10 and DX11.

 

That's why I also went out and found some other benchmarks.


2 minutes ago, M.Yurizaki said:

Well, AA was used, but the graphs were also showing the difference between DX10 and DX11.

 

That's why I also went out and found some other benchmarks.

I definitely wasn't trying to say there was some intrinsic error in your post; quite the contrary, I found it a very relevant look at how new GPU technology becomes a thing, and how well it performs initially versus after better optimization and uptake.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


7 minutes ago, mr moose said:

I definitely wasn't trying to say there was some intrinsic error in your post; quite the contrary, I found it a very relevant look at how new GPU technology becomes a thing, and how well it performs initially versus after better optimization and uptake.

Literally, at this point the only complaint I'll accept is that NVIDIA's charging too much for this tech. Which I think they are.

 

Everything else is just repeating history.

