Gigabyte RTX 2060 Super Gaming OC vs Sapphire RX 5700 XT Nitro+

So I was almost positive that I would be getting the 5700 XT (either Nitro or Pulse, depending on the price) until I saw Tech Deals' video yesterday, which made some strong arguments in favor of Nvidia's card:

- better for screen capture

- stable drivers

- only a marginal (2-3 frame) difference in performance

 

When I checked comparisons between the two cards, I found conflicting results: some claim that the 5700 XT is the superior card overall, while others say it is only slightly better in gaming and recommend the 2060 Super for work. Since I do need the card for Blender, Unreal Engine, screen-capture, and video-editing, the 2060 Super is starting to look like the better deal, especially as the gaming performance of the two seems almost identical.

 

What also has me confused are the clockspeeds of the Gigabyte 2060 Super.

Apparently, its core clock is 1815 MHz.

Does that make it better than the XT's 1770 MHz?

(The Nitro+ has a higher game clock, but I'm not sure whether that is the same as core clock.)

 

Are these numbers misleading, or is the Gigabyte Gaming OC 2060 Super viable, and arguably better?

 

How about screen-capture, video-editing, and Blender?

At present, I make heavy use of Nvidia Shadowplay's Instant Replay feature with my 1060. I've been told on this forum that AMD has its own equivalent, but Tech Deals made it sound pretty bad. Is that really the case?

The same goes for video-editing software preferring CUDA cores.

 

And what are these horror stories about AMD drivers? I have never used an AMD card, and I've been seeing commenters complain about "headache-inducing" driver issues with AMD.

 

_____

Both cards are the same price (XT may be slightly more expensive).

How about the Sapphire Pulse compared to Gigabyte's Gaming 2060 Super as a less expensive AMD card (or the Gigabyte Gaming 5700 XT)?

_____

I will be ordering one of these days, and this dilemma seems like a huge wrench in my initial build plan.

 

Thank you for reading and replying,

Katarn


They should perform equally.

The clock speeds don't matter as they're based on completely different platforms/processors.
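To put rough numbers on why clocks alone don't compare across architectures, here is a peak-throughput sketch using the commonly cited reference shader counts and clocks (illustrative only; real-game performance depends on far more than peak FP32):

```python
# Clock speed alone says nothing across architectures, because shader
# counts (and per-clock behavior) differ. Reference specs for illustration.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = 2 ops (FMA) x shaders x clock (GHz) / 1000."""
    return 2 * shaders * clock_ghz / 1000

# RTX 2060 Super: 2176 CUDA cores, ~1650 MHz reference boost
rtx_2060_super = fp32_tflops(2176, 1.65)
# RX 5700 XT: 2560 stream processors, ~1755 MHz game clock
rx_5700_xt = fp32_tflops(2560, 1.755)

print(f"RTX 2060 Super: {rtx_2060_super:.1f} TFLOPS")  # ~7.2
print(f"RX 5700 XT:     {rx_5700_xt:.1f} TFLOPS")      # ~9.0
```

Despite a lower advertised clock, the 5700 XT has the higher theoretical throughput, which is why the 1815 MHz on the Gigabyte card doesn't automatically make it faster.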

The AMD driver issues are minor but slightly annoying things that happen every couple of weeks or months. They're common but not that bad; otherwise nobody would recommend AMD cards.

 

However, even if the 2060 Super is the better option, I'd still stay with AMD because Gigabyte is a trash GPU manufacturer, in my opinion.

Ryzen 7 3700X / 16GB RAM / Optane SSD / GTX 1650 / Solus Linux


I've been running a 5700 XT Pulse since pretty much launch day, and the only problem I've ever encountered has been with HDR. Drivers are rock solid and never crash. What I can't comment on is Radeon ReLive (AMD's ShadowPlay equivalent), as I've never used it.

 

Also, the 5700 XT is closer to the 2070 than to the 2060 Super in most titles.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


For performance, the 5700 XT's lead isn't as small as it seems. It's just that no 5700 XT ships with nearly as much factory overclock as Nvidia cards generally do, which makes them look slower.

 

Screen capture acceleration with Radeons is terrible; you'll have to rely on the CPU for that. That said, you could totally capture lossless and compress it later (even after editing).
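One possible sketch of that capture-now, compress-later workflow using ffmpeg on the CPU (assumes a Windows machine with ffmpeg installed; the filenames are placeholders):

```shell
# 1) Grab the desktop losslessly on the CPU. x264 at -qp 0 is
#    mathematically lossless; gdigrab is ffmpeg's Windows screen-capture
#    input (on Linux you'd use x11grab instead).
ffmpeg -f gdigrab -framerate 60 -i desktop -c:v libx264 -preset ultrafast -qp 0 capture.mkv

# 2) Later, when frame rate no longer matters, compress the huge
#    lossless file down to something shareable.
ffmpeg -i capture.mkv -c:v libx264 -preset slow -crf 18 capture_small.mp4
```

The tradeoff is disk space during the session: lossless files are enormous (see the sizes discussed below in the thread), but the game only pays the cost of a very cheap encode while you play.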

 

Drivers-wise, I'd say they're both just as likely to go wrong. AMD is just more known for this because they screw it up at every launch and fix it later.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


5 hours ago, NunoLava1998 said:

even if the 2060 Super is the better option, I'd still stay with AMD because Gigabyte is a trash GPU manufacturer in my opinion anyway

What makes you say that?

From what I've seen, both their 2060 Super and RX 5700 XT models perform rather nicely. The XT model is somewhere between the Pulse and the Nitro+.

 

5 hours ago, Master Disaster said:

I've been running a 5700 XT Pulse since pretty much launch day, and the only problem I've ever encountered has been with HDR. Drivers are rock solid and never crash. What I can't comment on is Radeon ReLive (AMD's ShadowPlay equivalent), as I've never used it.

I might go for the Pulse if the price difference between it and the XT is over $60.

What can you tell me about its temps and sound? Hardware Unboxed said that it was way better than the reference one, but the benchmarks showed that it goes up to 82°C under load, which is suboptimal.

5 hours ago, Master Disaster said:

Also, the 5700 XT is closer to the 2070 than to the 2060 Super in most titles.

I thought it was better than the 2070, and closer to the 2070 Super.

 

4 hours ago, Jurrunio said:

For performance, the 5700 XT's lead isn't as small as it seems. It's just that no 5700 XT ships with nearly as much factory overclock as Nvidia cards generally do, which makes them look slower.

I don't understand. Do you mean that they will run much faster if a game demands it?

4 hours ago, Jurrunio said:

Screen capture acceleration with Radeons is terrible; you'll have to rely on the CPU for that. That said, you could totally capture lossless and compress it later (even after editing).

That's a major concern, actually.

I will be using a Ryzen 5 3600. Is that good enough?

The way I use Shadowplay is that I have Instant Replay set to record a span of the last 2 minutes, and if something interesting happens, I save the clip.

How big are these lossless files?

And what about OBS (or similar software)?
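The "last 2 minutes" mechanic described above is essentially a rolling buffer: encoded chunks are appended continuously and the oldest ones fall off, so only the most recent window is ever held. A toy sketch (purely illustrative; not how ShadowPlay or ReLive is actually implemented):

```python
from collections import deque

class ReplayBuffer:
    """Keeps only the last `window_seconds` worth of video chunks."""

    def __init__(self, window_seconds: int, chunks_per_second: int = 1):
        # deque with maxlen evicts the oldest chunk automatically
        self.buffer = deque(maxlen=window_seconds * chunks_per_second)

    def push(self, chunk: bytes) -> None:
        self.buffer.append(chunk)

    def save_clip(self) -> bytes:
        # "Something interesting happened" -> dump the whole window
        return b"".join(self.buffer)

buf = ReplayBuffer(window_seconds=120)
for i in range(300):                 # simulate 5 minutes of 1-second chunks
    buf.push(f"chunk{i};".encode())
clip = buf.save_clip()               # only chunks 180..299 survive
print(clip[:9])                      # b'chunk180;'
```

The point is that memory/disk cost is bounded by the window length, not the session length, which is why the encoder's per-frame cost (GPU vs CPU) matters so much for this use case.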


26 minutes ago, Katarn said:

I don't understand. Do you mean that they will run much faster if a game demands it?

No, I mean the custom Nvidia cards gain less from overclocking because the manufacturer used more of that headroom on more aggressive factory settings to make their products look better.

 

28 minutes ago, Katarn said:

I will be using a Ryzen 5 3600. Is that good enough?

Enough for lossless and x264 high preset in most games (1080p 60fps), but

 

28 minutes ago, Katarn said:

The way I use Shadowplay is that I have Instant Replay set to record a span of the last 2 minutes, and if something interesting happens, I save the clip.

definitely not good for use like this.

 

29 minutes ago, Katarn said:

How big are these lossless files?

I think I got 30GB of lossless video from 6 mins of BeamNG.drive gameplay, so lossless is really only an option when you know for certain how long the footage will last.
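A back-of-envelope check makes that 30 GB figure plausible. Raw uncompressed 1080p60 RGB is an upper bound; real lossless encoders (e.g. x264 at `-qp 0`) land well below it:

```python
# Upper bound: raw uncompressed RGB at 1080p60 for a 6-minute clip.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60
seconds = 6 * 60  # the 6-minute BeamNG clip mentioned above

raw_bytes = width * height * bytes_per_pixel * fps * seconds
raw_gb = raw_bytes / 1e9

print(f"Raw 1080p60 for 6 min: {raw_gb:.0f} GB")        # ~134 GB
print(f"Implied ratio for a 30 GB file: {raw_gb/30:.1f}:1")  # ~4.5:1
```

So a 30 GB lossless file for 6 minutes corresponds to only about 4.5:1 compression over raw video, which is why lossless capture is impractical unless you know the footage will be short.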

 

30 minutes ago, Katarn said:

And what about OBS (or similar software)?

Yes, I do mean recording with OBS. To be fair, I don't like Shadowplay's UI either, but at least the tech itself works.



9 hours ago, Katarn said:

What can you tell me about its temps and sound? Hardware Unboxed said that it was way better than the reference one, but the benchmarks showed that it goes up to 82°C under load, which is suboptimal.

I have mine OCed to its limit and I still can't hear the fan over the noise my case already makes. I should point out that I do have one fan with squeaky bearings that makes way more noise than it should, so it's not a fair comparison. I was going to replace the fan this month, but I've decided to go full custom loop after Christmas, so I can live with it for now.

 

As for temps, it can get quite toasty; I've seen 85°C under a stress test, but for average gaming it normally stays below 80°C.

 

Here's a Time Spy run at stock - https://www.3dmark.com/spy/8323766

 

And another with both the CPU & GPU OCed - https://www.3dmark.com/spy/8534454



17 hours ago, Katarn said:

What makes you say that?

From what I've seen, both their 2060 Super and RX 5700 XT models perform rather nicely. The XT model is somewhere between the Pulse and the Nitro+.

It's not about the performance; it's just that the warranty, and sometimes the quality, is pretty low.



40 minutes ago, NunoLava1998 said:

It's not about the performance; it's just that the warranty, and sometimes the quality, is pretty low.

Yeah, but let's forget about MSI's crappy 5700/5700 XT Evoke, EVGA's huge GTX 900 series heatsink screw-up, and ASUS's overpriced cards that just look "cooler" than the others.

 

I've never had an issue with Gigabyte, and neither has a single one of my friends. It's just a senseless hate train from people claiming they screw up more than other manufacturers (which is, uh... false...).


Just now, PHYLO said:

Yeah, but let's forget about MSI's crappy 5700/5700 XT Evoke, EVGA's huge GTX 900 series heatsink screw-up, and ASUS's overpriced cards that just look "cooler" than the others.

 

I've never had an issue with Gigabyte, and neither has a single one of my friends. It's just a senseless hate train from people claiming they screw up more than other manufacturers (which is, uh... false...).

I do know that; it's just that warranty is honestly really important to me.



2 minutes ago, NunoLava1998 said:

I do know that; it's just that warranty is honestly really important to me.

Gigabyte and AORUS cards come with a full extra year of warranty (4 years total, compared to the competition's 3) after registering the product online. That is a huge selling factor, and I'm surprised so few reviews (if any) mention it.


Just now, PHYLO said:

Gigabyte and AORUS cards come with a full extra year of warranty (4 years total, compared to the competition's 3) after registering the product online. That is a huge selling factor, and I'm surprised so few reviews (if any) mention it.

It's the warranty quality that's not good at all.



32 minutes ago, NunoLava1998 said:

It's the warranty quality that's not good at all

Again, never had an issue with their customer service, nor have my acquaintances. YMMV.


20 hours ago, Jurrunio said:

definitely not good for use like this.

That is a big red flag, since I've been relying on Instant Replay a lot. But why are you saying something so contradictory to the reviews? https://www.pcworld.com/article/3319476/amd-relive-review.html

 

11 hours ago, Master Disaster said:

As for temps, it can get quite toasty; I've seen 85°C under a stress test, but for average gaming it normally stays below 80°C.

By average gaming, do you mean simply without overclocking, or playing non-demanding games?

Because I do like to play on Ultra, and I want to play some fairly demanding titles, such as Metro: Exodus, Mankind Divided, and the recent Ghost Recons.

 

2 hours ago, NunoLava1998 said:

It's the warranty quality that's not good at all

What do you mean by that?

 

2 hours ago, PHYLO said:

and ASUS's overpriced cards that just look "cooler" than the others.

This is a very underappreciated truth.


10 minutes ago, Katarn said:

By average gaming, do you mean simply without overclocking, or playing non-demanding games?

Because I do like to play on Ultra, and I want to play some fairly demanding titles, such as Metro: Exodus, Mankind Divided, and the recent Ghost Recons.

I mean playing games rather than benchmarking or stress testing. IIRC I was playing AC Odyssey at 1440p Ultra when I tested temperatures.



27 minutes ago, Katarn said:

That is a big red flag, since I've been relying on Instant Replay a lot. But why are you saying something so contradictory to the reviews? https://www.pcworld.com/article/3319476/amd-relive-review.html

A 5-10 fps drop is big. And no, Nvidia GPUs, even the Pascal 1060 and 1070 I have, don't drop this much in a 60 fps game.



On 9/30/2019 at 9:56 PM, Jurrunio said:

A 5-10 fps drop is big. And no, Nvidia GPUs, even the Pascal 1060 and 1070 I have, don't drop this much in a 60 fps game.

True, I've never had that kind of performance loss with my 1060, except in Shadow Warrior 2 (which weirdly tanked performance) and Rising Storm 2, where the loss is around 5-10 fps.

However, in the performance section the site mentions, "That’s about the same as we saw with Nvidia’s ShadowPlay, built into the company’s GeForce Experience software for GeForce graphics cards." That leads me to believe the testing methods might have been more intense than the recording experience most users would have, which the review confirms one paragraph above: "we used a test rig with constrained resources—the point being to see how the software performed in less than ideal conditions."

Considering that I record at 22 Mbps, close to YouTube's recommended bitrate (15 Mbps), I don't expect to have the same trouble as most performance tests, which evaluate the maximum of 100 Mbps.
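For a sense of what those bitrates mean for the 2-minute Instant Replay buffer, the arithmetic is just bits per second times seconds divided by 8:

```python
# Size of a saved replay clip at a given constant bitrate.
def clip_size_mb(bitrate_mbps: float, seconds: float) -> float:
    """bits/s * seconds / 8 bits-per-byte, reported in decimal MB."""
    return bitrate_mbps * 1e6 * seconds / 8 / 1e6

print(clip_size_mb(22, 120))   # 2-min clip at 22 Mbps  -> 330.0 MB
print(clip_size_mb(100, 120))  # same clip at 100 Mbps  -> 1500.0 MB
```

So the 100 Mbps stress scenario moves more than four times the data of my actual 22 Mbps use case, which supports the idea that the review's numbers are a worst case.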

 

Also, even if ReLive taxes the system more than ShadowPlay does and hence causes a bigger performance loss, there remains the fact that the 5700 XT puts out 5-10 more frames than the 2060 Super in games, so the alleged loss while recording would merely bring the XT down to the 2060 Super's non-recording performance.

 

I don't know if I'm missing some important detail here, but it seems like the setbacks of AMD's recording software have been exaggerated in favor of Nvidia. The only thing that would nullify this, I suppose, is proof that ReLive is simply incompatible with some games and therefore incapable of recording them (crashes, visual/audio glitches, refusing to record...).

 

Otherwise, I suppose that the next step in making the decision between the two cards would be performance in video editing software and, above all, Blender 3D and Unreal Engine 4.


On 9/29/2019 at 10:02 AM, NunoLava1998 said:

They should perform equally.

The clock speeds don't matter as they're based on completely different platforms/processors.

The AMD driver issues are minor but slightly annoying things that happen every couple of weeks or months. They're common but not that bad; otherwise nobody would recommend AMD cards.

 

However, even if the 2060 Super is the better option, I'd still stay with AMD because Gigabyte is a trash GPU manufacturer, in my opinion.

How is Gigabyte bad? I was gonna get a Gaming OC for the 5700 XT.

 


1 hour ago, Intel 119980XE said:

How is Gigabyte bad? I was gonna get a Gaming OC for the 5700 XT.

 

They tend to have a higher default overclock, lower-quality components, and a shorter warranty period, in addition to worse RMA reviews overall. ASRock has the best warranty (3 years, same as MSI, for the 5700 XT) and RMA process right now, though you have to pay one-way shipping if you return something under warranty.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


I will probably be getting either the Sapphire Pulse or the Sapphire Nitro+ (depending on the price difference), but this has sparked my curiosity:

 

On 10/2/2019 at 3:36 AM, BTGbullseye said:

They tend to have a higher default overclock

Isn't that a good thing? Overclocking on the user's part voids the warranty, so a higher default overclock means better performance without having to risk your warranty.


On 10/3/2019 at 4:43 PM, Katarn said:

Isn't that a good thing? Overclocking on the user's part voids the warranty, so a higher default overclock means better performance without having to risk your warranty.

It hasn't affected warranty status (in the USA, at least) for quite some time, partly because there's really no way for the manufacturer to know you've done it without illegal monitoring hardware built into the card, or without you telling them outright.

 

Also, the default clocks have no effect on the maximum clocks the GPU achieves, as the 5700 series doesn't have a hard limit, even when you set one. (I set my 5700 XT to 2064 MHz, and I often see spikes to 2175+ when it stays under 85°C at the junction.)



On 10/5/2019 at 10:53 AM, BTGbullseye said:

It hasn't affected warranty status (in the USA, at least) for quite some time, partly because there's really no way for the manufacturer to know you've done it without illegal monitoring hardware built into the card, or without you telling them outright.

 

Also, the default clocks have no effect on the maximum clocks the GPU achieves, as the 5700 series doesn't have a hard limit, even when you set one. (I set my 5700 XT to 2064 MHz, and I often see spikes to 2175+ when it stays under 85°C at the junction.)

I'm surprised to hear that. I thought tinkering with a GPU's power voided the warranty by default. I remember that being the case around 2014.

 

But can't they tell you've done it with software that triggers when the GPU runs at a higher clock than intended, and/or by checking your system for specific OC software? Why would that be controversial among GPU manufacturers?

 

But for someone who doesn't want to manually overclock, isn't a higher default overclock better?

 

Also, those are impressive numbers. Just be careful not to burn it up in a few years; there is a reason it used to void the warranty.


42 minutes ago, Katarn said:

I'm surprised to hear that. I thought tinkering with a GPU's power voided the warranty by default. I remember that being the case around 2014.

A court case made it invalid to void a warranty over anything that shouldn't destroy the component or change its characteristics beyond what it's designed to do. Basically, you can mod your GPU in any way you want, both physically and in software, but flashing the wrong firmware might invalidate the warranty (depending on what the firmware does).

47 minutes ago, Katarn said:

But can't they tell you've done it with software that triggers when the GPU runs at a higher clock than intended, and/or by checking your system for specific OC software? Why would that be controversial among GPU manufacturers?

Because it's illegal to use that kind of monitoring.

48 minutes ago, Katarn said:

But for someone who doesn't want to manually overclock, isn't a higher default overclock better?

Only if it wasn't just a simple slider adjustment in the manufacturer's software that gets installed with the driver by default. AMD makes it incredibly easy to overclock your stuff safely.

50 minutes ago, Katarn said:

Also, those are impressive numbers. Just be careful not to burn it up in a few years; there is a reason it used to void the warranty.

It's not that risky anymore. The GPU uses self-throttling to keep temps safe, and you can't turn it off. As long as it stays under 105°C at the junction, no damage can occur. Also, those numbers aren't all that impressive for a 5700 XT, even if they seem like it. (This is on the reference blower cooler, but I replaced the thermal pads with good thermal paste.)


