Battlefield V with DXR on, tested by TechSpot (Hardware Unboxed)

Source:

https://www.techspot.com/review/1749-battlefield-ray-tracing-benchmarks/

Quote

So how does DXR actually look in Battlefield V? Above is a direct comparison of the four presets, and the first thing you should notice is there’s a large visual difference between Ultra and Low, but there’s not as much of a difference between Ultra and Medium. In fact, there is basically no difference between the Ultra, High and Medium settings for DXR reflections, and it’s not just in this scene. We tested several other sections of the game and couldn’t spot any difference between the three higher settings.

This leads me to conclude that there are only two DXR modes: Ultra, which applies the full complement of ray tracing reflections in the game, and a cut down Low mode for better performance. Both the low and ultra modes apply reflections to surfaces like water, puddles and shiny objects, but it’s only the ultra mode that also applies these reflections to more matte surfaces like mud and your gun.

You’ll spot in this scene for example that the ultra mode reflects the fiery vehicle wreck off the muddy surface in the front, as well as on the barrel of your weapon at some distance away. Switch down to low, and the reflections disappear on your gun and from the mud, leaving just reflections from the water’s surface.

From what we can tell, this is the main difference between the two modes. The quality of reflections in terms of resolution, accuracy and draw distance is unaffected, so you’ll get the same sort of experience looking at those shiny surfaces whether you’re playing on Ultra or Low. Switching it down a notch only affects the amount of materials and surfaces that RTX reflections apply to. Ultra, in my opinion, gives a far better and more realistic presentation and really shows off the quality of ray tracing.


So cut performance by about 60% or more to get more realistic reflections... I think not, at least for an FPS. And it certainly doesn't reflect the price.


But what's the noise level like?


I guess people will skip the DXR option in-game entirely. Maybe if it had a story mode you could enjoy with DXR on, then yes, but in fast-paced multiplayer with a 60% fps cut... I doubt it. Battlefield yet again looks good even on lower settings, so it's a no-brainer for me - if only they could present the game properly without the stupid stuff.


oh, yeah. below 50 fps it's starting to become kinda "meh" in terms of framerate.

 

i mean, hopefully optimizations help in the future, but i doubt we will ever see more than 70 fps on the 2080 Ti. and at that point i'd call it pretty good.

 

atm it's good, but not quite there yet.


Like I said in another topic, I think Nvidia's plan to get around this is to implement good SLI support with DXR. And that's why SLI is now a thing again.


Spoiler alert, Raytracing is really really really REALLY fucking hard to render in realtime.

The fact it's doable at all is amazing.

19 minutes ago, Nexxus said:

Spoiler alert, Raytracing is really really really REALLY fucking hard to render in realtime.

The fact it's doable at all is amazing.

people don't get this though ._. they see 60 fps evaporating and go OMG RAYTRACING SUX

2 minutes ago, VegetableStu said:

people don't get this though ._. they see 60 fps evaporating and go OMG RAYTRACING SUX

Frankly, I was expecting 30-ish at 1920x1080; the fact it's getting about 20 over that is blowing my fucking mind lol

13 minutes ago, Nexxus said:

Spoiler alert, Raytracing is really really really REALLY fucking hard to render in realtime.

The fact it's doable at all is amazing.

But it only works on some reflections, and even then it's not done properly

If the scene has no reflections, it is identical to having raytracing disabled

The hit to performance is unjustifiable for what little it gives

It is really fucking annoying during livestreams of the game, with everyone constantly asking if RTX is turned on

Because if it does so little that it isn't visually identifiable and you have to ask if it's on, then it's pointless to care about the feature at all

You don't know how shitty it feels when you cut your framerate to a third and still nobody can tell if it's enabled

5 minutes ago, westport17 said:

Just hope the next 7nm nVidia GPUs can handle it well enough..

With a new architecture and node shrink it would be nice to hope for something kick-arse and compelling. It's starting to get rather dull with "Nvidia did this" and "Nvidia did that". Having something to compare and talk about, rather than simply addressing pointless hate, would make for a nice change on these forums.

2 minutes ago, huilun02 said:

But it only works on some reflections, and even then it's not done properly

If the scene has no reflections, it is identical to having raytracing disabled

The hit to performance is unjustifiable for what little it gives

It is really fucking annoying during livestreams of the game, with everyone constantly asking if RTX is turned on

Because if it does so little that it isn't visually identifiable and you have to ask if it's on, then it's pointless to care about the feature at all

You don't know how shitty it feels when you cut your framerate to a third and still nobody can tell if it's enabled

Since it's physics-based, it's not unreasonable to assume that it's still working behind the scenes even if nothing is utilizing it at that moment, because you just don't know when it's going to need to kick in - in multiplayer, for example.

And I haven't watched streams, but the demos frankly blew my mind. I suspect a metric ton of the people asking "is RTX on?" are trying to be funny and reciting memes, or frankly don't know what they're looking for.

I wish I were getting BF5 to try RTX on my own, but sadly I'm not buying BF games anymore; I play them a ton on day one and never touch them again, so I'm off that treadmill now.


I'm just surprised that a shooter - where people care a lot more about FPS - is the first to implement it. I would've expected Nvidia to throw money at e.g. Tomb Raider and have a cinematic game show off the feature. 1080p Ultra @ 60 fps is what I expected ray tracing could do for us now, simply because it is the "sweet spot" for many gamers.


For early, most likely somewhat rushed software (being that this is added in a post-launch patch rather than being integrated as part of the base game) running on first generation hardware, this is actually better than I would have expected. Sure, it's not practical right now. I doubt any implementation of RTX coming out in the near future is going to have an FPS hit that most would consider practical (outside of stuff like puzzle or horror games that don't need quick reactions). But as a proof of concept it shows that there are tangible benefits to the tech that are immediately noticeable for the performance cost (lest we forget the trials and tribulations of tessellation, where we paid a huge performance tax for bumpier rocks and the occasional underground ocean).

 

I'm frankly excited for what comes out of this in 2-3 GPU generations. Node shrinks (more than likely just 1-1.5 though), combined with refinements to the RT cores to boost efficiency and throughput and more optimized software, could easily bring us to a point where this becomes a standard feature (assuming AMD comes up with something comparable that doesn't end up needing an entirely separate implementation; I'd expect them to have something working by at least Navi gen 2).

41 minutes ago, mr moose said:

With a new architecture and node shrink it would be nice to hope for something kick-arse and compelling. It's starting to get rather dull with "Nvidia did this" and "Nvidia did that". Having something to compare and talk about, rather than simply addressing pointless hate, would make for a nice change on these forums.

This is the way technology progresses. Inventions always come first, and hardware improvements follow.

Just be positive about what the future will bring us.

51 minutes ago, Nexxus said:

Frankly, I was expecting 30-ish at 1920x1080; the fact it's getting about 20 over that is blowing my fucking mind lol

Well, it's Hybrid Rendering and it's only handling some reflections. It's amazing they've gotten that far, but this is more "Can it run Crysis?" territory for at least 2 more generations of GPUs. It's going to either be the 5nm generation of GPUs or some sort of super Vector Math Unit that's built into the GPUs.


To make the performance hit acceptable for this limited implementation, it looks like at least 100 RT cores are going to be necessary. To make it really viable, it's probably closer to 200 RT cores.
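For a rough sense of where those numbers land: the RTX 2080 Ti has 68 RT cores, and posters above put it at roughly 50 fps at 1080p with DXR Ultra. A minimal sketch, assuming (naively) that frame rate scales linearly with RT core count:

```python
# Naive linear-scaling sketch behind the "100 / 200 RT cores" estimate.
# Assumptions (mine, not the poster's): 68 RT cores on the RTX 2080 Ti,
# ~50 fps at 1080p DXR Ultra as discussed in this thread, and fps that
# scales linearly with RT core count - which ignores shading, memory
# bandwidth and denoising limits.
baseline_cores = 68
baseline_fps = 50.0

for cores in (100, 200):
    scaled_fps = baseline_fps * cores / baseline_cores
    print(f"{cores} RT cores -> ~{scaled_fps:.0f} fps")
```

That naive scaling gives roughly 74 fps at 100 cores and 147 fps at 200, which is about where the "acceptable" and "really viable" guesses above sit.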


Not going to lie, those scenes with RTX on Ultra looked pretty sick, so in a few years' time I'm really looking forward to playing ultra-realistic AAA titles.


all those people who bought 2080 Tis expecting at least 60 fps...


I am hoping that I can use a combination of technologies to arrive at 1440p/60 with my i7-8700K/RTX 2080:

 

1080p rendering, ray tracing enabled on Medium or High, and DLSS to upscale the resolution back to 1440p.

 

That'll do me for a first-gen product.
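For what it's worth, the pixel math behind that plan: rendering internally at 1080p and letting DLSS reconstruct 1440p means the GPU only ray traces and shades about 56% as many pixels per frame as native 1440p. A quick sketch, assuming a 1080p internal resolution (the actual DLSS render resolution varies by game and quality profile):

```python
# Pixel-count comparison: native 1440p vs. a 1080p internal render
# upscaled with DLSS. The 1080p internal resolution is the plan stated
# above, not a confirmed DLSS profile for BF V.
native_pixels = 2560 * 1440      # 3,686,400
internal_pixels = 1920 * 1080    # 2,073,600

print(f"native 1440p pixels:     {native_pixels:,}")
print(f"1080p internal pixels:   {internal_pixels:,}")
print(f"work per frame vs native: {internal_pixels / native_pixels:.0%}")
```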


oh man, this is gonna look so sick on my PC Classic! Everything shining and glistening... RTX rubbing baby oil on everything... you ain't gaming unless you shoot a man with a glistening, baby-oiled gun, fool...

 

EDIT: I meant shoot a person... I did not mean it to come out that way, I believe in equality. Please let me keep my job...

8 hours ago, Nexxus said:

Frankly, I was expecting 30ish at 1920x1080, the fact its getting about 20 over that is blowing my fucking mind lol

I remember trying ray tracing in Blender back in 2012-ish? It cut a render from 2-3 minutes down to 20 seconds... 20 seconds for one frame.
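That gap is worth spelling out: even at 20 seconds per fully ray-traced frame, an offline renderer is three orders of magnitude away from a realtime budget. A quick bit of arithmetic using the 20 s figure from the post above (BF V is hybrid rather than fully path-traced, so this is an upper bound on the comparison, not a direct one):

```python
# How far a 20-second offline ray-traced frame is from a realtime budget.
# The 20 s/frame figure comes from the post above; 30 and 60 fps are just
# common realtime targets.
offline_frame_seconds = 20.0

for target_fps in (30, 60):
    budget_ms = 1000.0 / target_fps
    speedup_needed = offline_frame_seconds * target_fps
    print(f"{target_fps} fps budget: {budget_ms:.1f} ms per frame "
          f"-> roughly {speedup_needed:,.0f}x faster than the offline render")
```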


Question for anyone here who knows enough about programming with the new RTX in game engines: how hard is it to implement, in terms of time, compared to traditional ways of doing the related parts? Honestly, the way it sounds, it comes across as if it uses all of the existing art asset textures, geometry and basic lighting sources to do it all automatically, bypassing the need to do more than build an on switch into the engine, at which point NVIDIA's hardware and drivers take over and build the ray trace, de-noise it and integrate it. But I've no idea if that's accurate.

Also, for anyone familiar with the hardware computational costs: how accurate do you think NVIDIA's 10 Giga Rays per second number for the 2080 Ti actually is, in a pure "has the processing power" sense? Is it pure marketing of an utterly perfect scenario, or just a modest overestimation of the capabilities (the way AMD's claim about EPYC 2's 4x floating-point performance is: accurate, and we'll see a real benefit close to that, but not a number you'll hit 100% of the time)? Because I've been doing some math, and 1920x1080 @ 60 FPS with 3 rays per pixel and just the reflection rays on bounce only comes to around 760 million rays per second. It's making me wonder if maybe something else is bottlenecking the Tensor cores real hard (it's obviously not the rasterization, but it could be PCIe, could be memory, could be denoising - hard to tell).
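For reference, here is that back-of-envelope math written out. The 3 rays per pixel and the single reflection bounce per ray are assumptions for illustration; DICE hasn't published BF V's actual per-pixel ray counts:

```python
# Back-of-envelope rays-per-second estimate for 1080p at 60 fps.
# rays_per_pixel and bounces_per_ray are illustrative assumptions,
# not figures from DICE or NVIDIA.
width, height = 1920, 1080
fps = 60
rays_per_pixel = 3
bounces_per_ray = 1

primary_rays = width * height * fps * rays_per_pixel   # ~373 million/s
total_rays = primary_rays * (1 + bounces_per_ray)      # ~746 million/s

print(f"primary rays/s:      {primary_rays / 1e6:.0f} million")
print(f"with one bounce:     {total_rays / 1e6:.0f} million")
print(f"share of 10 GRays/s: {total_rays / 10e9:.1%}")
```

Even with a bounce per ray, that lands around 750 million rays per second, well under a tenth of the quoted 10 Giga Rays/s, which is what makes the "something else is the bottleneck" suspicion plausible.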

Link to post
Share on other sites
20 minutes ago, CarlBar said:

Question for anyone here who knows enough about programming with the new RTX in game engines: how hard is it to implement, in terms of time, compared to traditional ways of doing the related parts? Honestly, the way it sounds, it comes across as if it uses all of the existing art asset textures, geometry and basic lighting sources to do it all automatically, bypassing the need to do more than build an on switch into the engine, at which point NVIDIA's hardware and drivers take over and build the ray trace, de-noise it and integrate it. But I've no idea if that's accurate.

Also, for anyone familiar with the hardware computational costs: how accurate do you think NVIDIA's 10 Giga Rays per second number for the 2080 Ti actually is, in a pure "has the processing power" sense? Is it pure marketing of an utterly perfect scenario, or just a modest overestimation of the capabilities (the way AMD's claim about EPYC 2's 4x floating-point performance is: accurate, and we'll see a real benefit close to that, but not a number you'll hit 100% of the time)? Because I've been doing some math, and 1920x1080 @ 60 FPS with 3 rays per pixel and just the reflection rays on bounce only comes to around 760 million rays per second. It's making me wonder if maybe something else is bottlenecking the Tensor cores real hard (it's obviously not the rasterization, but it could be PCIe, could be memory, could be denoising - hard to tell).

 

I'm still waiting for someone to release some information on the power consumption when the RT and Tensor cores are turned on, because the PCBs were already hitting max wattage OC'd without those cores enabled. Maybe that's creating the bottleneck, as it's not performing on the Turing processor like it should.

6 minutes ago, Tristerin said:

 

I'm still waiting for someone to release some information on the power consumption when the RT and Tensor cores are turned on, because the PCBs were already hitting max wattage OC'd without those cores enabled. Maybe that's creating the bottleneck, as it's not performing on the Turing processor like it should.

 

It's lower with them on, give me a bit to dig info up.

