(Official?) Battlefield V with RTX benchmark

Zandvliet
31 minutes ago, M.Yurizaki said:

Literally at this point, the only complaint I'll accept is NVIDIA's charging too much for this tech. Which I think they are.

 

Everything else is just repeating history.

To be honest, we've had a $1,200 card for over two years (and a $1,000 card for six).

I'm sure their market analysis tells them they can sell at this price, just like Apple with the iPhone.


Everyone: "WE WANT A NEW CRYSIS, SOMETHING THAT PUSHES BOUNDARIES"

 

Game Devs: *adds Raytracing* 

 

Everyone: "HOW CAN YOU ADD A TECHNOLOGY THAT OUR CARDS STRUGGLE TO RUN EVEN THOUGH IT LOOKS GOOD" 

That's an F in the profile pic

 

 


16 minutes ago, Froody129 said:

Everyone: "WE WANT A NEW CRYSIS, SOMETHING THAT PUSHES BOUNDARIES"

 

Game Devs: *adds Raytracing* 

 

Everyone: "HOW CAN YOU ADD A TECHNOLOGY THAT OUR CARDS STRUGGLE TO RUN EVEN THOUGH IT LOOKS GOOD" 

But this:

[Image: "it's crazy how Crysis still hold…"]


My opinion is the same as before: ray tracing isn't going mainstream yet, but it should evolve over time.

If you're after pure FPS, I don't think you should be excited just yet.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


14 hours ago, sazrocks said:

1. This isn't playable at the resolutions people with 2080 Tis have (1440p, 4K).

2. Honestly, that doesn't look any better than normal rendering techniques with reflections.

I have a 240Hz 1080p monitor and a 2080 Ti, so I'd have to disagree. Granted, I also have a 4K monitor.


11 hours ago, Morgan MLGman said:

This graph is a lot more interesting IMO. I mean, who's going to buy an RTX 2080Ti and play on a 1080p monitor? That's like buying a supercar and driving it with a 60MPH speed limiter of some sort.

Even at 1440p, the performance is barely over 40 FPS with anything above DX12 + DXR Low enabled. At 4K, it's unplayable beyond that threshold. Note that this is a $1,000+ GPU that's marketed as a 4K GPU ;) I don't remember Nvidia mentioning that it's only a 4K GPU as long as you don't use the ray-tracing features they also market these cards with.

That would be me. Got my 1080p monitor ready to try it out. 


5 hours ago, aezakmi said:

Benchmarking at 1920x1080 with THAT card? They've gotta be fucking kidding; even my 14-year-old monitor has a higher resolution.

 

No sane person would spend $2,000 on a card to play at 1080p; I think whoever owns a 2080 Ti has AT LEAST a 2160p screen.

 

 

The card is $1,200, not $2,000, and I for one will be playing at 1080p.


15 hours ago, Zandvliet said:

Hi all,

 

Sorry if this has already been posted somewhere, but I couldn't find it.

 

As far as I'm concerned, this is fine. I see ray tracing more as a beauty feature, better suited to single-player, or to people who are perfectly happy playing online at 60 FPS (that includes me, by the way). I'm still blown away by the tech and can't wait to see how developers put it to use. I mean, BFV is the latest in game-engine technology, and we have a card that can run it at 60 FPS using ray tracing... ray tracing! That's nuts. That's film territory, and it's only Nvidia's first crack at it. I personally couldn't give a crap about >1080p gaming or high-refresh-rate gaming; I don't need it, and I don't want it. I'll take the accurate lighting and reflections. Just my thoughts, though; I know 90% of you will disagree.

There are 10 types of people in the world: those who understand binary numbers and those who don’t

bulgara, oh nono

Multipass


Four pages and zero RTX On vs RTX Off images. I read it looks AMAAAZEEEBAWLLL with RTX On.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Isn't it exactly because the cards struggle to go beyond 1080p at acceptable frame rates with RTX on that they decided to pull SLI out of the coffin? I'd say they're thinking of using SLI to get 4K with RTX.

.


17 hours ago, TOMPPIX said:

 

17 hours ago, voiha said:

DXR Medium: 64.5 FPS
DXR High: 66.4 FPS
DXR Ultra: 65.3 FPS

How is this possible? Am I misunderstanding something?

 

Looks like there's an issue with how the settings are applied. I found a test on a German site which says the medium setting does not always apply properly.

From PC Games Hardware (German), in my own translation:

Quote

[...] It is important to note that, according to Nvidia, there is a bug when switching to the medium setting that results in it not being applied properly. To successfully change to the medium preset, the 'low' preset has to be selected first before changing to 'medium'. [...]

Possibly Tom's didn't follow this procedure, because their results look rather different (interactive statistic, can't embed it properly here).

 

 

"We cannot change the cards we're dealt - just how we play the hand" - R. Pausch

 

CPU: Ryzen 7 3700X , Cooler: BeQuiet Dark Rock 3 Motherboard: MSI B450 Mortar Titanium RAM: 16 GB Corsair LPX 3200 GPU: EVGA RTX2070 XC Storage: Adata 120GB SSD, SanDisk 1TB SDD, 2TB WD GreenHDD Case: Fractal Design Define Mini C PSU: EVGA Supernova 650GS Peripherals: Master Keys Pro S, Logitech G402 Audio: Schiit Fulla 2 + Sennheiser HD 650. Laptop: Asus Zenbook UX 302


8 minutes ago, asus killer said:

Isn't it exactly because the cards struggle to go beyond 1080p at acceptable frame rates with RTX on that they decided to pull SLI out of the coffin? I'd say they're thinking of using SLI to get 4K with RTX.

But if you buy two 2080 Tis and use NVLink, don't you waste around 600 bucks because of NVLink's shitty limit? Why can't they improve that first?

CPU: Intel I9-9900K Motherboard: Asus ROG Maximus XI CODE RAM: 32 GB Corsair Vengeance PRO 3200MHZ GPU: Asus ROG 2080ti OC Storage: Samsung 970pro 2x1TB, 2Tb HDD Case: Lian Li PC O11 dynamic, Cooling:custom loop incoming, for now 360 rog ryujin!


3 minutes ago, Bayonett Priest said:

But if you buy two 2080 Tis and use NVLink, don't you waste around 600 bucks because of NVLink's shitty limit? Why can't they improve that first?

Like RTX in games, I expect that's something they'll keep working on over time. Let's face it: whoever buys an RTX card is a beta tester.

.


7 minutes ago, Bayonett Priest said:

But if you buy two 2080 Tis and use NVLink, don't you waste around 600 bucks because of NVLink's shitty limit? Why can't they improve that first?

NVLink is the improvement over the even shittier HB SLI bridge.


Just now, asus killer said:

Like RTX in games, I expect that's something they'll keep working on over time. Let's face it: whoever buys an RTX card is a beta tester.

First: I know I am, but I was hitting 30 FPS with my 980 Ti in CoD.

Second: true, Battlefield was already in development when Nvidia came along and said, "Hey, we have new technology, implement it quickly!" Let's see how it looks when games come out that have RTX integrated into the game rather than slapped on.

Third: maybe it will run better with updates!



Just now, leadeater said:

NVLink is the improvement over the even shittier HB SLI bridge.

Still, you're throwing half the second GPU out the window!



2 minutes ago, Bayonett Priest said:

Still, you're throwing half the second GPU out the window!

Better than throwing 66% of it out the window. Multi-GPU really needs to be handled at the game-engine and graphics-API level to make proper use of more than one GPU.


4 minutes ago, leadeater said:

Better than throwing 66% of it out the window. Multi-GPU really needs to be handled at the game-engine and graphics-API level to make proper use of more than one GPU.

I don't know too much about GPUs, but if both GPUs could share their full power, like NVLink on crack scaling to 100%, wouldn't that be better for games as well (I mean for implementation rather than FPS)? Would that even be possible? I think I remember the new Quadros having more NVLink connectors, right? Why not on the RTX cards?



28 minutes ago, Bayonett Priest said:

I don't know too much about GPUs, but if both GPUs could share their full power, like NVLink on crack scaling to 100%, wouldn't that be better for games as well (I mean for implementation rather than FPS)? Would that even be possible? I think I remember the new Quadros having more NVLink connectors, right? Why not on the RTX cards?

It's not actually a bandwidth problem; NVLink proper can't be used for games (as it stands). SLI runs across the NVLink connection, and the technique is unchanged even though the connector and fabric have changed. SLI always has a master and a slave GPU; the master does significantly more work, and certain tasks have to finish before work on the second GPU can start, so there's a lot of loss because of that.

 

If you move multi-GPU up to the API/game-engine level, you can just treat GPUs as work units and throw work at them however you like: no master/slave, and no dependencies beyond what's in your own code.
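Purely as a toy sketch of that scheduling difference (Python threads standing in for GPUs; none of this is real graphics-API code, and the function names are made up for illustration):

```python
# Toy illustration: SLI-style master/slave scheduling, where the master
# GPU's share gates the second GPU, versus API-level multi-GPU, where
# every tile is an independent work unit on whichever "GPU" is free.

from concurrent.futures import ThreadPoolExecutor

def render_tile(gpu_id: int, tile: int) -> tuple:
    """Stand-in for submitting one tile's command list to a GPU."""
    return (gpu_id, tile)

def master_slave(tiles):
    # SLI-style: the master (GPU 0) finishes its share before the
    # slave (GPU 1) starts, so GPU 1 idles during that dependency.
    half = len(tiles) // 2
    done = [render_tile(0, t) for t in tiles[:half]]   # master works first
    done += [render_tile(1, t) for t in tiles[half:]]  # then the slave
    return done

def explicit_mgpu(tiles):
    # API-level multi-GPU: tiles are independent work units with no
    # ordering dependency between GPUs beyond the code's own logic.
    with ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(lambda t: render_tile(t % 2, t), tiles))

tiles = list(range(8))
print(sorted(t for _, t in master_slave(tiles)))   # all 8 tiles rendered
print(sorted(t for _, t in explicit_mgpu(tiles)))  # all 8 tiles rendered
```

The point of the second version is only that nothing forces one device to wait on the other; real explicit multi-adapter code (e.g. DX12) would manage per-device queues and resources explicitly.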


4 hours ago, Brooksie359 said:

I have a 240Hz 1080p monitor and a 2080 Ti, so I'd have to disagree. Granted, I also have a 4K monitor.

And are you happy with the image quality increase that ray tracing has brought, especially for the performance cost? Does it warrant the price jump from $700 to $1200?

CPU - Ryzen Threadripper 2950X | Motherboard - X399 GAMING PRO CARBON AC | RAM - G.Skill Trident Z RGB 4x8GB DDR4-3200 14-13-13-21 | GPU - Aorus GTX 1080 Ti Waterforce WB Xtreme Edition | Case - Inwin 909 (Silver) | Storage - Samsung 950 Pro 500GB, Samsung 970 Evo 500GB, Samsung 840 Evo 500GB, HGST DeskStar 6TB, WD Black 2TB | PSU - Corsair AX1600i | Display - DELL ULTRASHARP U3415W |


12 hours ago, leadeater said:

If you move multi-GPU up to the API/game-engine level, you can just treat GPUs as work units and throw work at them however you like: no master/slave, and no dependencies beyond what's in your own code.

Which is what will happen in this case, assuming DICE decides to support it. DXR won't be done with SLI; it'll be done with mGPU, because DXR is a DX12-only thing and SLI no workie in DX12.

 

We have to wait and see what their roadmap is when it comes to mGPU.  Assuming there is a roadmap for it...

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz displayBenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


15 hours ago, Brooksie359 said:

The card is $1,200, not $2,000, and I for one will be playing at 1080p.

The card is 4,000 where I live, so there's no difference IMO.

 

A kidney is 3,300 on the black market.

 

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


2 hours ago, jasonvp said:

and SLI no workie in DX12

SLI does work with DX12. DX12 has many multi-GPU modes, which in my view isn't the best thing, as it quickly gets confusing as to which mode is being talked about.


13 hours ago, Carclis said:

And are you happy with the image quality increase that ray tracing has brought, especially for the performance cost? Does it warrant the price jump from $700 to $1200?

It depends on who you ask. Honestly, 4K doesn't really seem worth it to a lot of people either, yet many will say it's well worth the extra cost.


2 hours ago, aezakmi said:

The card is 4,000 where I live, so there's no difference IMO.

 

A kidney is 3,300 on the black market.

 

Yes, there is. If there isn't, then the price difference between a 1080 Ti and a 2080 Ti is insignificant. I mean, if price doesn't matter, then the 2080 Ti is the only GPU worth buying, and apparently you're saying it doesn't matter.

