
Help me decide which AMD gpu

torbenscharling

Hey, so earlier this year I built a potential hackintosh out of these parts: a delidded 8700K with the biggest Noctua cooler, an ASRock Z370 Pro, 24 GB of 2666 MHz RAM, 4x 860 EVO 1 TB SSDs, and an H.264 AverMedia capture card.

 

I aim to use this as a music and video production machine, mainly running Ableton Live, but also doing some OBS streaming, video editing in Premiere Pro/Final Cut Pro, and casual gaming. I will, however, be running up to 4-5 displays, potentially with a 4K main one. I'm not looking to max out FPS in games, as I'm only a very casual gamer, but I do like an ultra-quiet build (no coil whine, no fan noise under normal conditions). One thing holding me back is that I'd lose a PCIe slot by going with any of the newer AMD cards, since they take up 2.2, 2.5 or 3 slots. Perhaps I'm making too big an issue of this, since I could just get a PCIe riser cable and mount the GPU elsewhere in the case (I guess?). I also hear that going liquid-cooled still means both fan noise and pump noise, so an ASUS Vega 64 ROG Strix, one of the few cards that actually has 5 display outputs, is where I've currently ended up. Now I see the new 5xxx variants are right around the corner, but my mobo doesn't have PCIe 4.0 (not that big a deal to upgrade the mobo, I guess, but is it worth it?), and a used Vega 64 is still cheaper than a 5700 XT and has faster(?) memory for rendering and such...

 

If anyone has clues, please chime in. Perhaps all the negative publicity AMD has gotten for their GPUs from gamers has skewed the picture of how these cards actually perform, and I should just go ahead and get one?! I have offers on a used 64 Strix and a 64 Sapphire Nitro+; the first has 2 years of use, the other half a year. I'm leaning towards the Strix, though, due to its 5 display outputs, and I can always redo the paste and add more thermal pads if needed... The seller hasn't tried undervolting, though.


is "go nvidia" valid?

AMD has only had good CPUs; they haven't made a good GPU since the HD series of Radeon cards, tbh.

I used to be an AMD fanboy, but I've had enough of their GPUs.

If I had one wish, I would ask for a big enough ass for the whole world to kiss

 


5 minutes ago, torbenscharling said:

I have offers on a used 64 Strix and a 64 Sapphire Nitro+

If they cost the same, then the Sapphire Nitro+ is definitively the better card.

 

Just avoid Navi for hackintoshing; it really doesn't add anything while being worse at rendering.

 

and this is for a hackintosh, correct?


4 hours ago, GoldenLag said:

If they cost the same, then the Sapphire Nitro+ is definitively the better card.

 

Just avoid Navi for hackintoshing; it really doesn't add anything while being worse at rendering.

 

and this is for a hackintosh, correct?

Ya, I gathered that much from countless reviews and research, but the bare bones of it is that I don't get the DVI out on the Sapphire... Also, what else am I missing by going Strix? (I realize I may have to look at the cooling aspect of the card, but in terms of features they should be more or less identical, right? Hardware encoding and all; I mean it's more or less "just" the heatsink and fans differentiating those two cards?) Ya, same price, and the Sapphire is 1.5 years younger with a receipt; the other has no receipt but is claimed to never have been overclocked, and I can come check it in use before buying (as with the other card, if I choose). Also looking at eBay, but these local classifieds prices beat it so far.

 

Yes, the intent is a High Sierra/Mojave hackintosh, but it may stay on Windows regardless; it will be for both Windows and macOS depending on the use of the day, as I plan to run it in conjunction with my MacBook Pro: one will handle most of the video streaming work while the other handles live audio duties, and this desktop is meant to do all the video editing and rendering. Ya, looks like it's a tick or a tock too early for me to worry about Navi, PCIe 4.0, Z390, etc. I just need this last piece of the puzzle, perhaps along with a UAD DSP accelerator card, and I should be good to go :)


4 hours ago, RelativeMono said:

is "go nvidia" valid?

AMD has only had good CPUs; they haven't made a good GPU since the HD series of Radeon cards, tbh.

I used to be an AMD fanboy, but I've had enough of their GPUs.

It would be shooting myself in the foot, as I specifically bought this hardware to be a Mac Pro substitute: at the time the new Mac Pro wasn't around, newer MacBook Pros throttle, and the newly released Mac Pros are way over my budget. Could/should have just bought a Mini, but it's too late for that now... I can't rely on a Windows PC for the type of audio I'm doing, because Windows doesn't have baked-in MIDI or audio; there's no Core Audio, which sucks. So ya, "go Nvidia" isn't valid unless there's been news on the hackintosh front since I last researched it.


Oh, I guess I wasn't paying attention.

Yeah, go with whatever AMD card if you want a non-operating system; it wouldn't be the worst thing in the build at all.



4 hours ago, RelativeMono said:

is "go nvidia" valid?

AMD has only had good CPUs; they haven't made a good GPU since the HD series of Radeon cards, tbh.

I used to be an AMD fanboy, but I've had enough of their GPUs.

The Vega 56, Vega 64, Vega Frontier Edition, Radeon VII (kinda), RX 570, RX 580, RX 590 (sometimes), RX 5700, and RX 5700 XT are all solid cards for various use cases.

 

50 minutes ago, torbenscharling said:

Ya, I gathered that much from countless reviews and research, but the bare bones of it is that I don't get the DVI out on the Sapphire... Also, what else am I missing by going Strix? (I realize I may have to look at the cooling aspect of the card, but in terms of features they should be more or less identical, right? Hardware encoding and all; I mean it's more or less "just" the heatsink and fans differentiating those two cards?

Nitro+ cools better and boosts a bit higher: 

In some tests the Red Devil runs cooler, but also at a noticeably higher fan RPM (so it'll be louder). 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM: 4x8GB HyperX Predator DDR4 @3200MHz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


55 minutes ago, torbenscharling said:

Ya, I gathered that much from countless reviews and research, but the bare bones of it is that I don't get the DVI out on the Sapphire.

If you really need DVI, just go for the Strix then. 

51 minutes ago, torbenscharling said:

so ya, "go Nvidia" isn't valid unless there's been news on the hackintosh front since I last researched it.

Nothing newer than Pascal is supported in any way, AFAIK. 

 

So Vega is the newest card worth getting. Navi is just trash at productivity. 


Hmm, looks like I can only connect up to 4 monitors at a time?! That's a bummer when the card has 5 monitor outputs XD


On 10/3/2019 at 6:09 AM, GoldenLag said:

it really doesn't add anything while being worse at rendering. 

Do you have any benchmarks for this? All the ones I've seen point at the 5700 and 5700 XT being comparable to the Vega 56 and 64 respectively for rendering...

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


Just now, BTGbullseye said:

Do you have any benchmarks for this? All the ones I've seen point at the 5700 and 5700 XT being comparable to the Vega 56 and 64 respectively for rendering...

Going by the brute-force calculation strength of the GPU, the 5700 and 5700 XT are slower. 

 

Also, the architectural difference between GCN and RDNA is that it trades the compute structure of GCN for a more gaming-oriented pipeline. 

 

And IIRC, in the few GPU render benchmarks I've seen, the GCN cards were faster. 


2 minutes ago, GoldenLag said:

Going by the brute-force calculation strength of the GPU, the 5700 and 5700 XT are slower. 

Yes, but that has nothing to do with it. The actual real-world benchmarks matter far more than theoretical limitations.

3 minutes ago, GoldenLag said:

Also, the architectural difference between GCN and RDNA is that it trades the compute structure of GCN for a more gaming-oriented pipeline. 

That is a major selling point of the GPU, and says nothing about actual performance in compute tasks.

4 minutes ago, GoldenLag said:

And IIRC, in the few GPU render benchmarks I've seen, the GCN cards were faster. 

And those benchmarks are what I want to see.



With all due respect, I couldn't care less about the gaming performance of either card. Common sense, even for someone somewhat PC-illiterate (me), says the 64 is obviously better at rendering tasks than the 5700 (XT). I don't know how much or little PCIe 4.0 vs 3.0 matters, but the memory bandwidth of the 64 should matter a lot, as should the compute units(?!). All in all, I get why the 64 is the better, faster card at pro tasks (disregarding gaming completely here; it's totally not relevant to the topic).
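For what it's worth, the memory-bandwidth point can be sanity-checked from the published specs (bus width times per-pin data rate; the clocks below are the reference-design figures, so treat this as a rough sketch rather than gospel for any particular partner card):

```python
# Rough peak memory bandwidth from bus width and per-pin data rate.
# Reference-spec figures; partner cards may clock slightly differently.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

# Vega 64: 2048-bit HBM2 at 945 MHz (1.89 Gbps effective per pin)
vega64 = peak_bandwidth_gbs(2048, 1.89)    # ~484 GB/s

# RX 5700 XT: 256-bit GDDR6 at 14 Gbps per pin
rx5700xt = peak_bandwidth_gbs(256, 14.0)   # 448 GB/s

print(f"Vega 64:    {vega64:.0f} GB/s")
print(f"RX 5700 XT: {rx5700xt:.0f} GB/s")
```

So on raw bandwidth the Vega 64 does edge out the 5700 XT, though by less than 10%; whether that translates into faster rendering depends on the workload.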

 

Now, on another rant/note: it looks like AMD's official statement is that a max of two non-DP displays can be used at the same time, plus two DP displays, totalling 4 displays, even though the Strix has an additional DVI port. That would make that port, and the fact that it has 5 outputs, totally useless, since anyone can get an adapter if they need DVI. And since it can't even run all 5 outputs at the same time, I might as well (and would be much better off) buy a liquid-cooled 64; then I won't lose a PCIe slot either, and I'm not losing an output since it couldn't utilize that stupid DVI port alongside the other 4 at the same time anyway, right?! I wonder why NO ONE states this anywhere... I guess not many people need more than 4 displays, so most people don't know about this limitation? Or am I missing something, and it is possible to drive 5 displays from all 5 ports, and I should stick with the Strix?!


9 hours ago, BTGbullseye said:

Yes, but that has nothing to do with it. The actual real-world benchmarks matter far more than theoretical limitations.

https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/13

 

The architecture change does matter a whole lot. (And yes, I know it's old, but it falls in line with what we would expect.)

9 hours ago, BTGbullseye said:

That is a major selling point of the GPU, and says nothing about actual performance in compute tasks.

It's a selling point when it comes to gaming and power consumption; sadly not when it comes to compute workloads. 

9 hours ago, BTGbullseye said:

And those benchmarks are what I want to see.

I could sadly only find one that covered compute, but from what we know architecturally, it falls in line with what we would expect. 


4 hours ago, GoldenLag said:

(And yes, I know it's old, but it falls in line with what we would expect.)

So old, in fact, that it was using pre-launch drivers... the ones well known to cause strange issues on the compute side of things.

4 hours ago, GoldenLag said:

I could sadly only find one that covered compute, but from what we know architecturally, it falls in line with what we would expect. 

And since it was using pre-launch drivers, drivers known to cause problems with compute, it's not really a good benchmark to use.



So what's the verdict?

A liquid-cooled Vega 64, since you can't use the 5th (DVI) output anyway(?), compute units matter more to me than gaming performance, and I don't lose a PCIe slot since it's only 2 slots wide?

 

Or a Strix, because you can use all 5 outputs at once for a 5-monitor setup(?)

 

Or a 5700 (XT), or perhaps a business/pro card?


Also, I guess having 3 DP ports and 1 HDMI on the liquid-cooled 64 vs. 2 + 2 (+ DVI) on the Strix means the liquid-cooled card wins in terms of outputs?! I guess 3 DP ports with active hubs could mean 6-7 monitors, potentially?! (Given you can have 2 monitors per DisplayPort with the right adapter, as I understand it?!) Or at least it would give me more potential outputs than the 2 DP + 2 HDMI + DVI variants, due to the limit of two non-DP ports, correct?


1 hour ago, torbenscharling said:

(Given you can have 2 monitors per DisplayPort with the right adapter, as I understand it?!)

Considering all the GPUs in question have DisplayPort 1.4, each port can handle two 4K 60Hz monitors (or four 1080p monitors).
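As a rough sanity check on those per-port figures, here's the bandwidth math (a sketch with approximate numbers: the DP 1.4 HBR3 link rate and an assumed ~7% reduced-blanking overhead at 24-bit color; real timings vary):

```python
# Approximate DisplayPort 1.4 (HBR3) link budget vs. uncompressed video streams.
# Sketch only: assumes 24-bit color and ~7% reduced-blanking (CVT-R2-like) overhead.

LINK_GBPS = 8.1 * 4 * (8 / 10)  # 4 lanes @ 8.1 Gbps, minus 8b/10b coding -> 25.92 Gbps

def stream_gbps(width: int, height: int, hz: int,
                bpp: int = 24, overhead: float = 1.07) -> float:
    """Approximate bandwidth of one uncompressed video stream in Gbps."""
    return width * height * hz * bpp * overhead / 1e9

uhd_60 = stream_gbps(3840, 2160, 60)  # ~12.8 Gbps
fhd_60 = stream_gbps(1920, 1080, 60)  # ~3.2 Gbps

print(f"link budget:       {LINK_GBPS:.2f} Gbps")
print(f"two 4K60 streams:  {2 * uhd_60:.1f} Gbps (fits: {2 * uhd_60 <= LINK_GBPS})")
print(f"four 1080p60:      {4 * fhd_60:.1f} Gbps (fits: {4 * fhd_60 <= LINK_GBPS})")
```

A third 4K60 stream (~38 Gbps total) would blow well past the budget, which is why MST hubs top out at two 4K panels per DP 1.4 port.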



The confusion thickens, and people are not replying with correct statements, leading to no help and more confusion.

 

DP 1.4 can handle up to FOUR displays PER PORT with an active hub.

 

MSI's card with the stock cooler states max 4 displays (it has 3x DP, 1x HDMI),

 

yet:

 

XFX's, with the stock cooler and what are to the naked eye identical specs, states 6 monitors max (it also has 3x DP, 1x HDMI).

 

Why?! And who can I trust? This is a jungle, kids :(

 

And why has nobody even mentioned this: https://www.gigabyte.com/Graphics-Card/GV-RXVEGA64GAMING-OC-8GD#kf  It has THREE HDMI OUTPUTS AND THREE DISPLAYPORT OUTPUTS!!!! Could anyone explain whether I can then get more than 6 with active hubs, or is it capped at 6 monitors for some other bizarre reason?

Also, what would the benefit then be of having 6 dedicated outputs, as opposed to using active DP hubs with the 3x DP variants? (Because the stock ones still win, given that I get to keep my PCIe slot since they don't take up more than 2 slots, and given that I can just use active hubs to reach the same bizarrely limited max number of monitors.)


19 hours ago, torbenscharling said:

The confusion thickens, and people are not replying with correct statements, leading to no help and more confusion.

 

DP 1.4 can handle up to FOUR displays PER PORT with an active hub.

 

MSI's card with the stock cooler states max 4 displays (it has 3x DP, 1x HDMI),

 

yet:

 

XFX's, with the stock cooler and what are to the naked eye identical specs, states 6 monitors max (it also has 3x DP, 1x HDMI).

 

Why?! And who can I trust? This is a jungle, kids :(

 

And why has nobody even mentioned this: https://www.gigabyte.com/Graphics-Card/GV-RXVEGA64GAMING-OC-8GD#kf  It has THREE HDMI OUTPUTS AND THREE DISPLAYPORT OUTPUTS!!!! Could anyone explain whether I can then get more than 6 with active hubs, or is it capped at 6 monitors for some other bizarre reason?

Also, what would the benefit then be of having 6 dedicated outputs, as opposed to using active DP hubs with the 3x DP variants? (Because the stock ones still win, given that I get to keep my PCIe slot since they don't take up more than 2 slots, and given that I can just use active hubs to reach the same bizarrely limited max number of monitors.)

The DisplayPort standard doesn't care what the manufacturer of the card says; it will still support four 1080p60 or two 4K60 monitors on each DP 1.4 port. No manufacturer can call it "DisplayPort 1.4" if it doesn't support 100% of that specification. (They would get sued into bankruptcy if they did.)

 

The port number and type are all that matter; ignore everything else the manufacturers say about them.

 

The reason there is a limit at all is that the specification has bandwidth limits, and that is what restricts the monitor selection.





If you are looking for performance, a tweaked Vega 64 is actually a good card, and you have RIS now to support it.

If you get one for cheap, it's a good deal. HBCC is a must-have: The Division 2 goes up to 9 GB of VRAM on some occasions at 1440p, and if I edit with Resolve at 4K it jumps to 10 GB of VRAM used, with HBCC at 12 GB.

 

The new 5700 XT has the advantage, but if you can get a Vega for half the price of a 5700 XT, just go for the Vega.


4 hours ago, BTGbullseye said:

The DisplayPort standard doesn't care what the manufacturer of the card says; it will still support four 1080p60 or two 4K60 monitors on each DP 1.4 port. No manufacturer can call it "DisplayPort 1.4" if it doesn't support 100% of that specification. (They would get sued into bankruptcy if they did.)

 

The port number and type are all that matter; ignore everything else the manufacturers say about them.

 

The reason there is a limit at all is that the specification has bandwidth limits, and that is what restricts the monitor selection.

THANK YOU for making sense of this. I have no idea why these companies would market their products this way when they're actually capable of more. Glad to know I can just go with a 2-slot card with 3 DP ports and get 13 monitors, or get the one with 3 DP and 3 HDMI, for a potential 15 monitors hooked up to it lol :)


4 hours ago, Xkillerpn said:



If you are looking for performance, a tweaked Vega 64 is actually a good card, and you have RIS now to support it.

If you get one for cheap, it's a good deal. HBCC is a must-have: The Division 2 goes up to 9 GB of VRAM on some occasions at 1440p, and if I edit with Resolve at 4K it jumps to 10 GB of VRAM used, with HBCC at 12 GB.

 

The new 5700 XT has the advantage, but if you can get a Vega for half the price of a 5700 XT, just go for the Vega.

Ya, I think I'll hold off on that, because I'd get more out of upgrading the mobo to a better one, a Z390 and a 9900K. But I think I can make do with what I've got, and if I stick to a 2-slot card I can add some PCIe DSP cards to handle part of the audio processing; I've also got a capture card in one of the PCIe slots.


16 hours ago, torbenscharling said:

Ya, I think I'll hold off on that, because I'd get more out of upgrading the mobo to a better one, a Z390 and a 9900K. But I think I can make do with what I've got, and if I stick to a 2-slot card I can add some PCIe DSP cards to handle part of the audio processing; I've also got a capture card in one of the PCIe slots.

If you're upgrading the motherboard and CPU, you can save a LOT of money by going AMD without sacrificing any performance. If you need single-thread performance (the most important thing for gaming and most other activities, as heavily multithreaded programs are few and far between), nothing in the world beats a Ryzen 7 3800X (at least until you can get hold of the Ryzen 7 PRO 3700 or Ryzen 9 PRO 3900). That with a nice X570 motherboard would run you about $200-$250 less than a comparable Intel system. Just an idea...



On 10/10/2019 at 8:26 AM, BTGbullseye said:

If you're upgrading the motherboard and CPU, you can save a LOT of money by going AMD without sacrificing any performance. If you need single-thread performance (the most important thing for gaming and most other activities, as heavily multithreaded programs are few and far between), nothing in the world beats a Ryzen 7 3800X (at least until you can get hold of the Ryzen 7 PRO 3700 or Ryzen 9 PRO 3900). That with a nice X570 motherboard would run you about $200-$250 less than a comparable Intel system. Just an idea...

Thanks, ya, I guess I have to think again about going AMD+Nvidia vs. keeping what I've got for a hackintosh... Maybe I'd be better off getting rid of this delidded 8700K and the mobo and getting a better board and that Ryzen CPU, as clearly single-core performance is my biggest bottleneck when doing low-latency live audio.

