AMD RX 6000 Ray tracing

12 minutes ago, letsplaysims3 said:

For me that extra $500 can be spent on other things, like a more luxurious case and a monitor.

Of course. The 3090 is terrible value for gaming. For gaming you should go with 6800XT/6900XT or 3080 at most. Anything higher is not worth it anymore.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


3 minutes ago, Stahlmann said:

Of course. The 3090 is terrible value for gaming. For gaming you should go with 6800XT/6900XT or 3080 at most. Anything higher is not worth it anymore.

Absolutely.

Let's wait and see the Raytracing of the 6000 cards.


24 minutes ago, Stahlmann said:

6900XT

Even that is quite overkill for gaming.

 

There's no reason to get a $999+ graphics card (like the 3090) that's only about 10-15% faster than the RX 6800XT and RTX 3080.

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


4 minutes ago, Hymenopus_Coronatus said:

Even that is quite overkill for gaming.

 

There's no reason to get a $999+ graphics card (like the 3090) that's only about 10-15% faster than the RX 6800XT and RTX 3080.

I included this one for the people who think they have to have the best of the best. For those people the 6900XT still makes more sense than a 3090. For most gamers the 6800XT or 3080 will be the obtainable top end.



4 hours ago, Hymenopus_Coronatus said:

Even that is quite overkill for gaming.

 

There's no reason to get a $999+ graphics card (like the 3090) that's only about 10-15% faster than the RX 6800XT and RTX 3080.

You do have a point.


4 hours ago, Stahlmann said:

I included this one for the people who think they have to have the best of the best. For those people the 6900XT still makes more sense than a 3090. For most gamers the 6800XT or 3080 will be the obtainable top end.

You're right.

 

By the way, speaking of these high-end cards: say I play DOOM Eternal and get over 200 FPS. How does my 144Hz monitor keep up with that frame rate? Will I get any screen tearing?


1 minute ago, letsplaysims3 said:

You're right.

 

By the way, speaking of these high-end cards: say I play DOOM Eternal and get over 200 FPS. How does my 144Hz monitor keep up with that frame rate? Will I get any screen tearing?

If you have G-Sync or FreeSync enabled, the FPS will cap at 144. Normally you won't get tearing with higher FPS than your refresh rate; tearing is mostly an issue when you are below your refresh rate. Either way, with FreeSync or G-Sync enabled you won't get any tearing at all.



1 minute ago, Stahlmann said:

If you have G-Sync or FreeSync enabled, the FPS will cap at 144. Normally you won't get tearing with higher FPS than your refresh rate; tearing is mostly an issue when you are below your refresh rate. Either way, with FreeSync or G-Sync enabled you won't get any tearing at all.

But if the game is sending 243 frames per second to the monitor, then 243 - 144 = 99 frames are lost in transit, aren't they?


1 minute ago, letsplaysims3 said:

But if the game is sending 243 frames per second to the monitor, then 243 - 144 = 99 frames are lost in transit, aren't they?

They're not lost: you still get lower input lag than when running at 144 FPS. But in games like Doom it's not really noticeable. The only games I'd actually run unlocked are competitive games like CS:GO, Overwatch, Valorant, etc. Any other game is just fine running in a VRR mode capped to 144Hz/FPS to completely remove any tearing.
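As a rough back-of-envelope illustration of why the extra frames still help (the 243 FPS figure is just the example from this thread, not a measurement):

```python
# At higher uncapped FPS, a newer frame is available whenever the
# monitor refreshes, so perceived input lag drops even though the
# 144Hz panel still only shows 144 frames per second.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

capped = frame_time_ms(144)    # ~6.9 ms per frame
uncapped = frame_time_ms(243)  # ~4.1 ms per frame

# Worst case, the frame the monitor grabs is one full frame time old,
# so the uncapped game hands the display a frame that is ~2.8 ms fresher.
print(f"144 FPS frame time: {capped:.1f} ms")
print(f"243 FPS frame time: {uncapped:.1f} ms")
print(f"Latency advantage:  {capped - uncapped:.1f} ms")
```

A ~3 ms difference is why it only really matters in twitchy competitive games.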



1 minute ago, Stahlmann said:

They're not lost: you still get lower input lag than when running at 144 FPS. But in games like Doom it's not really noticeable. The only games I'd actually run unlocked are competitive games like CS:GO, Overwatch, Valorant, etc. Any other game is just fine running in a VRR mode capped to 144Hz/FPS to completely remove any tearing.

I have some old games like The Sims 3 and The Sims 4. With the forthcoming 6800XT they could easily soar to 300+ FPS; I'm just worried about the old games I play.

 

Isn't the whole idea of getting a variable refresh rate, high refresh rate monitor to cope with the situation I just mentioned?


4 hours ago, letsplaysims3 said:

I have some old games like The Sims 3 and The Sims 4. With the forthcoming 6800XT they could easily soar to 300+ FPS; I'm just worried about the old games I play.

 

Isn't the whole idea of getting a variable refresh rate, high refresh rate monitor to cope with the situation I just mentioned?

Not really. The "real" purpose of a 144Hz variable refresh rate monitor is to deliver a tear- and stutter-free experience between 40-144Hz/FPS while your game's frame rate fluctuates within that range. Of course you can run games at higher FPS than that, but then you're not using VRR anymore. Like I said, when you have VRR enabled on a 144Hz monitor, games will be capped at 144Hz. When your FPS is higher than your refresh rate, you are NOT using VRR.

 

And if you already have a GPU that can max out your monitor's refresh rate, then you have no reason to upgrade whatsoever, unless you want to use newer features like ray tracing.
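The VRR-window behavior described above can be sketched roughly like this (the 40-144Hz window is an assumption for a typical 144Hz FreeSync/G-Sync panel; check your monitor's spec sheet for its actual range):

```python
# Sketch of how a VRR monitor responds to different frame rates.
# The 40-144 window is assumed, not universal.

VRR_MIN_HZ = 40
VRR_MAX_HZ = 144

def vrr_state(fps: float) -> str:
    """Describe how a VRR monitor handles a given frame rate."""
    if fps > VRR_MAX_HZ:
        return "above range: VRR inactive, tearing or a vsync-style cap"
    if fps >= VRR_MIN_HZ:
        return "in range: refresh tracks FPS, no tearing"
    return "below range: frame doubling (LFC) or stutter"

for fps in (30, 90, 144, 243):
    print(fps, "->", vrr_state(fps))
```

This is why an uncapped 243 FPS game is outside what VRR can help with, while a game fluctuating between 40 and 144 FPS is exactly the use case.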



From the Gamers Nexus video, RX 6000 has 1 HARDWARE ray tracing accelerator per CU, which provides about a 10x improvement over software-only solutions. Nobody knows the exact performance, but the expectation is that the fastest card is on par with a 2080 Ti.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


14 hours ago, ewitte said:

Nobody knows the exact performance

This is the most important part of what was posted, and everything else is pure speculation.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


19 hours ago, Stahlmann said:

Not really. The "real" purpose of a 144Hz variable refresh rate monitor is to deliver a tear- and stutter-free experience between 40-144Hz/FPS while your game's frame rate fluctuates within that range. Of course you can run games at higher FPS than that, but then you're not using VRR anymore. Like I said, when you have VRR enabled on a 144Hz monitor, games will be capped at 144Hz. When your FPS is higher than your refresh rate, you are NOT using VRR.

 

And if you already have a GPU that can max out your monitor's refresh rate, then you have no reason to upgrade whatsoever, unless you want to use newer features like ray tracing.

Sorry, I want to get this right: 

 

1) So is there a setting in the OSD or in the NVIDIA panel for me to enable VRR?

2) When the game maxes out the 144Hz, I don't need to enable VRR because it's already over 144Hz?

 


19 hours ago, ewitte said:

From the Gamers Nexus video, RX 6000 has 1 HARDWARE ray tracing accelerator per CU, which provides about a 10x improvement over software-only solutions. Nobody knows the exact performance, but the expectation is that the fastest card is on par with a 2080 Ti.

So can we presume that both NVIDIA and AMD in fact implement ray tracing through the DX12 libraries, and that on top of that, the card's job is to speed up the ray tracing rendering in hardware?


4 hours ago, BTGbullseye said:

This is the most important part of what was posted, and everything else is pure speculation.

What if ray tracing is not important to me? Then I will surely go for AMD because it is way cheaper.

 

But if ray tracing is implemented in every game in the coming future, then I will certainly need RT and will go for RTX because NVIDIA has DLSS as an alternative.


58 minutes ago, letsplaysims3 said:

Sorry, I want to get this right: 

 

1) So is there a setting in the OSD or in the NVIDIA panel for me to enable VRR?

2) When the game maxes out the 144Hz, I don't need to enable VRR because it's already over 144Hz?

These are the things you have to do to properly make use of G-Sync or Freesync monitors with an NVIDIA GPU:

 

1. Enable VRR in your monitor's OSD if there's a setting for it.

 

2. Enable G-Sync in your NVIDIA Control Panel

 

3. Go to your global 3D-App settings in NVIDIA Control Panel and set the following:

- Enable Vsync

- Set preferred monitor technology to G-Sync or G-Sync Compatible

- Enable FPS cap and set to 1 lower than your monitor's refresh rate (for example 143 for a 144Hz monitor)

(- For good measure, set the power management mode to "max performance" and texture filtering to "high performance". This gives a bit more performance without any downsides other than slightly higher power consumption at idle.)

 

4. Go to your ingame settings and set the following:

- DISABLE V-Sync ingame (leave it enabled only in NVIDIA control panel)

- Set to Fullscreen (not borderless)

 

5. Now you're using G-Sync / FreeSync. As you can see, you should cap the frame rate to 143 because it can vary slightly, and with that headroom you will never fall out of your G-Sync range (which is 40-144Hz for most 144Hz monitors).
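The cap-one-below-refresh rule from step 5 can be sketched as a quick sanity check (the ±1 FPS jitter below is made up purely for illustration):

```python
# Why cap at 143 on a 144Hz panel: frame limiters are not perfectly
# precise, so a cap set exactly at the refresh rate can momentarily
# overshoot the VRR window. Capping one below leaves headroom.

REFRESH_HZ = 144

def fps_cap(refresh_hz: int) -> int:
    # The rule of thumb from the post: one below the refresh rate.
    return refresh_hz - 1

cap = fps_cap(REFRESH_HZ)  # 143

# Simulated instantaneous FPS around the cap with +/-1 FPS of jitter:
jittery = [cap - 1, cap, cap + 1]
assert all(f <= REFRESH_HZ for f in jittery)  # never leaves the window
print(f"Cap at {cap} FPS keeps worst-case {max(jittery)} FPS <= {REFRESH_HZ}Hz")
```

The same arithmetic gives 164 for a 165Hz panel, 239 for 240Hz, and so on.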



On 10/29/2020 at 6:00 AM, Stahlmann said:

Yes. But they haven't yet talked about the performance impact from turning it on.

In AMD's footnotes for the recent presentation, they mention that with their hardware Ray Accelerators in use they got a 13.8x FPS uplift (1,380% of the software-only figure) over just using software DXR.

 

I can't recall who, but someone on YouTube extrapolated that this would put the 6800XT on par with the 3070 in ray tracing capability. If it's true that some AIB cards have boost clocks past 2.5GHz for the 6800XT, then maybe a little faster in a best-case scenario.

 

The performance hit, IIRC (again, I can't remember the source), was said to be similar to Turing's.
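As a side note on reading that multiplier (the 10 FPS baseline below is hypothetical, not AMD's number): 13.8x means 1,380% OF the software-only rate, which is a 1,280% INCREASE over it.

```python
# Multiplier vs. percentage arithmetic for the quoted 13.8x figure.

speedup = 13.8
baseline_fps = 10.0  # hypothetical software-only DXR frame rate

hw_fps = baseline_fps * speedup          # hardware-accelerated rate
percent_of_baseline = speedup * 100      # "percent of" the baseline
percent_increase = (speedup - 1) * 100   # "percent increase" over it

print(f"{hw_fps:.0f} FPS = {percent_of_baseline:.0f}% of baseline "
      f"(a {percent_increase:.0f}% increase)")
```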

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


10 hours ago, letsplaysims3 said:

But if ray tracing is implemented in every game in the coming future, then I will certainly need RT and will go for RTX because NVIDIA has DLSS as an alternative.

DLSS has nothing to do with ray tracing. It's just an upscaling technique, and it has to be implemented by the game to function. AMD's upcoming Super Resolution setting, if it works as I have been led to believe, will not require per-game implementation and will have a similar result (though it's going to be 2021 or later before it's released).



20 hours ago, Stahlmann said:

These are the things you have to do to properly make use of G-Sync or Freesync monitors with an NVIDIA GPU:

 

1. Enable VRR in your monitor's OSD if there's a setting for it.

 

2. Enable G-Sync in your NVIDIA Control Panel

 

3. Go to your global 3D-App settings in NVIDIA Control Panel and set the following:

- Enable Vsync

- Set preferred monitor technology to G-Sync or G-Sync Compatible

- Enable FPS cap and set to 1 lower than your monitor's refresh rate (for example 143 for a 144Hz monitor)

(- For good measure, set the power management mode to "max performance" and texture filtering to "high performance". This gives a bit more performance without any downsides other than slightly higher power consumption at idle.)

 

4. Go to your ingame settings and set the following:

- DISABLE V-Sync ingame (leave it enabled only in NVIDIA control panel)

- Set to Fullscreen (not borderless)

 

5. Now you're using G-Sync / FreeSync. As you can see, you should cap the frame rate to 143 because it can vary slightly, and with that headroom you will never fall out of your G-Sync range (which is 40-144Hz for most 144Hz monitors).

I have another question. 

 

I'm currently using a cheap BenQ GW2270.

 

When I played Mafia II (the original) with all settings maxed out, the MSI Afterburner monitoring overlay showed the FPS at 60 ALL THE TIME.

 

My setup is 9600K+RX570(OC). 

 

On my old LG (already dead), the FPS went above 100.

 

Is my current issue related to Windows 10, or is it because this cheap BenQ monitor maxes out at 60Hz?


1 hour ago, letsplaysims3 said:

I have another question. 

 

I'm currently using a cheap BenQ GW2270.

 

When I played Mafia II (the original) with all settings maxed out, the MSI Afterburner monitoring overlay showed the FPS at 60 ALL THE TIME.

 

My setup is 9600K+RX570(OC). 

 

On my old LG (already dead), the FPS went above 100.

 

Is my current issue related to Windows 10, or is it because this cheap BenQ monitor maxes out at 60Hz?

That monitor only goes to 60Hz refresh, and it sounds like you're using vsync, so it will lock to whatever your monitor refresh rate is. The limit is caused by you using vsync on a 60Hz monitor.



15 hours ago, BTGbullseye said:

That monitor only goes to 60Hz refresh, and it sounds like you're using vsync, so it will lock to whatever your monitor refresh rate is. The limit is caused by you using vsync on a 60Hz monitor.

Yes, thanks a lot. It was turned on in-game. But why would the game turn it on by default?


10 hours ago, letsplaysims3 said:

Yes, thanks a lot. It was turned on in-game. But why would the game turn it on by default?

Because it removes screen tearing.


