
Battlefield V with DXR on tested by Techspot (Hardware Unboxed)

kiska3
4 minutes ago, Tristerin said:

If that's the case, then to me that sounds like intentional gimping of that GPU so that the next generation has an easy fix: more power OR a smaller node (first easy, second in the works), or a combination of both.

Why would it be intentional?  The whole AI side of the card might just use less power in total and already be pushing as hard as it can, or it might be more sensitive to heat and have to be kept at a lower power level to prevent errors.

1 minute ago, PhantomHawk11 said:

It doesn't really matter whether it's hard to render or not; when paying that much for a GPU, people expect more than ~50 frames, and no amount of special shadows is going to change that. The fact is it's an exciting new development, but it hasn't been refined enough.

That would indicate an error with expectation and lack of understanding of the product more than the inherent challenges of the problem it is solving.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, mr moose said:

Why would it be intentional?  The whole AI side of the card might just use less power in total and already be pushing as hard as it can, or it might be more sensitive to heat and have to be kept at a lower power level to prevent errors.

That would indicate an error with expectation and lack of understanding of the product more than the inherent challenges of the problem it is solving.

Because if it did everything and its mother on initial release, you would not need to upgrade for quite some time, which is not in line with profit margin growth...this is a Corporation we are talking about.

 

Beyond that, it's because we need to pay for the research leading up to the next-next generation that's already in development while they release Turing, a small margin of performance above its aged predecessor. The RT cores and Tensor cores are doing their job, no? If Turing is losing the power it needs to run for that to happen, well, it leads me to believe this is intentional - there are plenty of high-TDP GPUs in existence, so this wouldn't be a first.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wifes Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASrock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull, 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsung 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case.

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 


5 minutes ago, Tristerin said:

Because if it did everything and its mother on initial release, you would not need to upgrade for quite some time, which is not in line with profit margin growth...this is a Corporation we are talking about.

 

Beyond that, it's because we need to pay for the research leading up to the next-next generation that's already in development while they release Turing, a small margin of performance above its aged predecessor. The RT cores and Tensor cores are doing their job, no? If Turing is losing the power it needs to run for that to happen, well, it leads me to believe this is intentional - there are plenty of high-TDP GPUs in existence, so this wouldn't be a first.

So, no actual technical reason just accusations of sandbagging?


5 minutes ago, Tristerin said:

Because if it did everything and its mother on initial release, you would not need to upgrade for quite some time, which is not in line with profit margin growth...this is a Corporation we are talking about.

 

Beyond that, it's because we need to pay for the research leading up to the next-next generation that's already in development while they release Turing, a small margin of performance above its aged predecessor. The RT cores and Tensor cores are doing their job, no? If Turing is losing the power it needs to run for that to happen, well, it leads me to believe this is intentional - there are plenty of high-TDP GPUs in existence, so this wouldn't be a first.

 

Or, you know, it's drawing less power because it's doing less work. Remember, the traditional rasterization side is only running at about 40% of what it was previously. I'm still digging for the video I watched, alas.


16 minutes ago, mr moose said:

So, no actual technical reason just accusations of sandbagging?

This is a Corporation - who's at the top? - so of course accusations of sandbagging, lol. The same one giving you 2 2070 chips. They are doing this for money, not fun.

9 minutes ago, CarlBar said:

 

Yoink, 9:30 if the link fails.

There is a HUGE delta between what I understand and how it actually works, so I'll have to take smarter people's words as truth - if it doesn't need that power because it can't render the frame while it's waiting on the RT and Tensor cores (correct me if I am wrong, I'm just a newb), and that is the reason the power consumption (of the CUDA cores mentioned in the video) goes down, then that makes sense.


1 hour ago, Tristerin said:

This is a Corporation - who's at the top? - so of course accusations of sandbagging, lol. The same one giving you 2 2070 chips. They are doing this for money, not fun.

 

If being a corporation that does things for money and not for fun is the only metric you need, then we don't have to concern ourselves with facts, because we can just jump to any conclusion regarding their motive and ignore the fact we don't even know why the condition exists.  And that is before we even establish if the condition does exist. 

 

 


RTX on these cards is snake oil marketing to justify a big premium for 20 series cards.

 

Is it impressive? Yes without a doubt, that can't be argued.

 

However, they marketed on ray tracing, and your best-case scenario is around 60ish FPS...at 1080p. Which is amazing until you put it in context with pricing and product tier.

 

Buy an RTX 2080ti for $1199.99+ to be able to ray trace at the lowest setting at 1080p. 

 

I don't get it, and I am the target market for the 2080ti. Why would I spend $1199.99 on a graphics card and run at 1080p, 60fps-ish?

 

 

CPU | Intel i9-10850K | GPU | EVGA 3080ti FTW3 HYBRID  | CASE | Phanteks Enthoo Evolv ATX | PSU | Corsair HX850i | RAM | 2x8GB G.skill Trident RGB 3000MHz | MOTHERBOARD | Asus Z490E Strix | STORAGE | Adata XPG 256GB NVME + Adata XPG 1T + WD Blue 1TB + Adata 480GB SSD | COOLING | Evga CLC280 | MONITOR | Acer Predator XB271HU | OS | Windows 10 |



4 minutes ago, jasonc_01 said:

RTX on these cards is snake oil marketing to justify a big premium for 20 series cards.

 

Is it impressive? Yes without a doubt, that can't be argued.

 

However, they marketed on ray tracing, and your best-case scenario is around 60ish FPS...at 1080p. Which is amazing until you put it in context with pricing and product tier.

 

Buy an RTX 2080ti for $1199.99+ to be able to ray trace at the lowest setting at 1080p. 

 

I don't get it, and I am the target market for the 2080ti. Why would I spend $1199.99 on a graphics card and run at 1080p, 60fps-ish?

 

 

Why pay $700 to play at 144Hz 4K ultra with no RT rather than paying $300 and playing at low settings, to the same effect?

 

Ray tracing is just as much a reason as resolution or texture quality. Just because YOU see things differently doesn't change reality.

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


15 minutes ago, Tristerin said:

This is a Corporation - who's at the top? - so of course accusations of sandbagging, lol. The same one giving you 2 2070 chips. They are doing this for money, not fun.

There is a HUGE delta between what I understand and how it actually works, so I'll have to take smarter people's words as truth - if it doesn't need that power because it can't render the frame while it's waiting on the RT and Tensor cores (correct me if I am wrong, I'm just a newb), and that is the reason the power consumption (of the CUDA cores mentioned in the video) goes down, then that makes sense.

 

Modern microarchitectures turn off functional blocks when they're not in use, so once it's done rendering the rasterization part, the entire rasterization block can literally power off for however many clock cycles it takes for the ray tracing to do its thing.
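As a toy illustration of that power-gating effect (all wattages and busy fractions here are invented for the sketch, not measured Turing figures), average board power falls when the big rasterization block spends most of the frame gated while waiting on ray results:

```python
# Toy model of per-unit power gating (illustrative numbers only).
# Each functional block draws its active power for the fraction of
# cycles it is busy; gated (idle) blocks draw a small leakage floor.

ACTIVE_WATTS = {"raster": 180.0, "rt_cores": 40.0, "tensor": 30.0}
LEAKAGE_WATTS = 5.0  # assumed floor for a gated block

def average_power(busy_fraction: dict) -> float:
    """Average board power given each block's fraction of busy cycles."""
    total = 0.0
    for unit, active_w in ACTIVE_WATTS.items():
        busy = busy_fraction.get(unit, 0.0)
        total += busy * active_w + (1.0 - busy) * LEAKAGE_WATTS
    return total

# Pure rasterization: shaders near 100% busy, RT/tensor blocks gated off.
raster_only = average_power({"raster": 1.0})

# Hybrid DXR frame: the raster block is only ~40% busy (as mentioned
# above) while the RT and tensor blocks do their work.
hybrid = average_power({"raster": 0.4, "rt_cores": 0.9, "tensor": 0.5})
```

With these made-up numbers, `hybrid` comes out well below `raster_only`, even though more of the chip is nominally "in use" - the expensive shader array simply isn't toggling most of the time.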


Well, those results are not great. What's even the point of having it on the 2070 when it can't reach 1080p60 with RTX set to low...

Like, yeah, G-Sync could help, but still, that's not a good result.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


After looking at the video closely, I have to say that it's not a major difference compared to physics and textures. I was looking at the scene with the big pond and a tree branch falling smack onto it. I expected a more realistic rendering and physics reaction, but nothing happened.


4 hours ago, 79wjd said:

Why pay $700 to play at 144Hz 4K ultra with no RT rather than paying $300 and playing at low settings, to the same effect?

 

Ray tracing is just as much a reason as resolution or texture quality. Just because YOU see things differently doesn't change reality.

I just think it's counterintuitive: a halo product for the high-end market, with its key feature, DXR, needing to be run at a mid-to-low-end resolution on the lowest DXR setting to hit 60fps.

It just doesn't make sense, especially when you bring the 2080 and 2070 into the equation. What's the point of RTX features on the 2080 and 2070 when the 2080ti is around 60fps on DXR low? Are we talking around 45fps and 30fps respectively on DXR low at 1080p? We are one generation away from an 80ti hitting 1080p60 DXR ultra and two away from 1440p60 DXR ultra.

Ray tracing is great; it's the future. RTX today, though, is just marketing to justify the price increase.


31 minutes ago, jasonc_01 said:

I just think it's counterintuitive: a halo product for the high-end market, with its key feature, DXR, needing to be run at a mid-to-low-end resolution on the lowest DXR setting to hit 60fps.

It just doesn't make sense, especially when you bring the 2080 and 2070 into the equation. What's the point of RTX features on the 2080 and 2070 when the 2080ti is around 60fps on DXR low? Are we talking around 45fps and 30fps respectively on DXR low at 1080p? We are one generation away from an 80ti hitting 1080p60 DXR ultra and two away from 1440p60 DXR ultra.

Ray tracing is great; it's the future. RTX today, though, is just marketing to justify the price increase.

RTX probably is, yes. But you need hardware to drive interest from developers, to drive interest from consumers and more hardware, and so on. It's really hard getting that stuff off the ground.

 

On the plus side, the ideas and performance improvements behind DLSS should be a pretty big deal once games support it.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


That's.....actually not that bad. 

 

I wouldn't ever enable it for multiplayer because I need the FPS but for single player games, it's actually somewhat playable. 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Oh god, this looks absolutely horrible. And I'm not even going to touch ray tracing. People running through puddles of shiny water, tree parts falling into puddles of water. And what happens? NOTHING. ABSOLUTELY NOTHING. No ripples, no splashes, no water movement. I've seen more dynamic water in freaking Unreal-based games from 1999... How did someone at EA sign this off as ready to go out to gamers? HOW!? I'm more bothered by how static the water is than by fancy reflections, which frankly look perfectly fine even if entirely faked. At least the water there is dynamic, reflections are still realistic enough to fool you when action is happening, and performance was superb with old rasterization. As predicted before, RTX is a gimmick, and it will remain this way until we're actually ray tracing entire games. See you then. Now get us back the fast rasterizing graphics cards, thank you very much.


RTX looks to be Nvidia's next PhysX.

Not really a good idea now; maybe in the future, when the tech has improved to allow ray tracing to be done in a quarter of the time, then in real time...

I bet Nvidia may revert to the old GTX branding because of the problems and controversy it's had since its announcement.

Not to mention that users who bought it had to wait until their OS and games supported it, which took a couple of months; then, just when it was ready, Microsoft pulled the update due to issues. Those who didn't get it had to wait another month - even though their game supported it, they couldn't use it...

To be fair, that's more Microsoft's fault, but they could have released a downloadable patch that enabled this without having to upgrade Windows. But they didn't...

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


9 hours ago, jasonc_01 said:

I just think it's counterintuitive: a halo product for the high-end market, with its key feature, DXR, needing to be run at a mid-to-low-end resolution on the lowest DXR setting to hit 60fps.

It just doesn't make sense, especially when you bring the 2080 and 2070 into the equation. What's the point of RTX features on the 2080 and 2070 when the 2080ti is around 60fps on DXR low? Are we talking around 45fps and 30fps respectively on DXR low at 1080p? We are one generation away from an 80ti hitting 1080p60 DXR ultra and two away from 1440p60 DXR ultra.

Ray tracing is great; it's the future. RTX today, though, is just marketing to justify the price increase.

It doesn't make sense if you view resolution as the only important factor. That's an opinion, not a universal truth. Some people favor resolution, others favor texture quality, and others only care about frame rate. That doesn't make any of them wrong. Ray tracing is no different.


I have 3 out of 4 things with my setup.

 

+ great details because I can run anything at Ultra

+ high framerate

+ high refresh rate

- resolution

 

Frankly, resolution really isn't an issue, because slamming anything with even crappy FXAA makes the image smooth enough at 1080p that individual pixels aren't a problem. If a game gives me SMAA, even better. And frankly, real-time reflections are so convincing in modern rasterized games that I really don't see any point in ray tracing until it's ready to be used at full scale at high framerates. Right now it's great for a tech demo under ideal conditions, but not really usable. Maybe in a slow-paced horror game, where the action is slow and lighting and shadows play a more important role.

 

Eventually I'll be running 4K monitor, but that'll probably happen like 5 years in the future if not further, when 4K becomes as mainstream as 1080p is right now.


https://forums.guru3d.com/threads/rx-vega-owners-thread-tests-mods-bios-tweaks.416287/page-48#post-5607107

 

Quote

BF V MS RTX first look (Today's patch)
I need to dig into Rotterdam Map ;) (Multi w/RTX is very unstable)

RayTrace enabled & working on Vega at 1440p 70FPS (RTX at High preset, screens from Campaign Mission 1)
I can play the game NP.

In the BFV directory in MyDocuments -> PROFSAVE_profile
GstRender.Dx12Enabled 1
GstRender.DxREnabled 1
GstRender.RaytraceReflectionQuality 2

Note:
I'm not sure if this is it - but with those settings the game looks a little better on reflections.

See the link for the pictures posted. The ray tracing in BFV is a DX12 (DXR) feature, not explicitly an RTX function, and it's really an async compute calculation. Not sure if it's actually working, but this might get really interesting if it works on AMD hardware as well.


14 minutes ago, Taf the Ghost said:

https://forums.guru3d.com/threads/rx-vega-owners-thread-tests-mods-bios-tweaks.416287/page-48#post-5607107

 

Can see the link for the pictures posted. The Ray Tracing in BFV is a DX12 feature, not explicitly a RTX function, and it's really an Async Compute calculation. Not sure if it's actually working, but this might get really interesting if AMD hardware is working as well.

Would be hilarious if Vega 64 ends up being faster at it hehe


13 minutes ago, RejZoR said:

Would be hilarious if Vega 64 ends up being faster at it hehe

It actually wouldn't surprise me. Any tech that's based on either DX12 or Vulkan tends to favor AMD's GPUs, since those APIs were designed around them more; AMD was significantly involved in all of the low-level API work. Hybrid-rendering ray tracing is an asynchronous computation. Given the very early, unoptimized state of RT Core utilization, AMD's much more mature async compute driver handling, and (at least it seemed to me) some sort of bottleneck with the denoiser on the RTX cards, it is entirely possible that the BFV ray tracing, if it works roughly the same way, runs better on RX Vega cards at the moment.

Now, that isn't to say it does, and it definitely needs to be tested, but we saw games get upwards of 20% performance improvements on Ryzen from some very minor patches, just because the way the cores were handled was adjusted. Weird things happen.


Apparently you can force ray tracing on any GPU by editing the .ini files, even when using a Vega GPU.

I wonder how the performance holds up.

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


23 hours ago, mr moose said:

That would indicate an error with expectation and lack of understanding of the product more than the inherent challenges of the problem it is solving.

People expected a sizable increase in performance, if not from ray tracing (since it is so hard in real time) then from normal rendering. What they got was RTX 2080 performance being similar to 1080ti performance for the same price, and the RTX 2080ti being so expensive that it's not viable for the majority of consumers.


I must admit, based on the Digital Foundry video DICE did (where they explained it was built on the old Titan V, not even using RT cores, and already running at up to 60 FPS), I was expecting much better.

 

 

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 

