Taf the Ghost

AMD Radeon VII Benchmark/Launch Mega Thread


Posted · Original Poster
15 minutes ago, cj09beira said:

Interesting, I had not noticed that. It really does seem like they planned on talking more about it. My guess is that it's Navi features, as Navi was supposed to come sooner.

GDC seems to be the big info drop. There's going to be more across-the-board DXR stuff, which might have more to do with compute-based lighting engines than necessarily raytracing, if I've read the tea leaves properly.


AMD should just focus on making CrossFire really good. Then they don't have to make chips as good as Nvidia's; just make two cheap cards perform like one mid-range card, make two mid-range cards perform like one high-end card, and price them lower.

1 minute ago, ToneStar said:

AMD should just focus on making CrossFire really good. Then they don't have to make chips as good as Nvidia's; just make two cheap cards perform like one mid-range card, make two mid-range cards perform like one high-end card, and price them lower.

The problem is it would be very difficult to do that at the driver level. It would end up requiring per-game profiles again, with large performance differences between games; it would be a mess.

When games are fully raytraced it will be quite a bit easier.

Just now, cj09beira said:

The problem is it would be very difficult to do that at the driver level. It would end up requiring per-game profiles again, with large performance differences between games; it would be a mess.

When games are fully raytraced it will be quite a bit easier.

Try to make it so games don't need to be optimized for it. There have been technologies in the past that didn't require software support, such as 3dfx's version of SLI, where the cards would render every other scanline. PowerVR and S3 also had different hardware approaches to it.
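A rough sketch of that scan-line interleaving idea (not 3dfx's actual implementation; render_row is a hypothetical stand-in for the real rasterizer):

```python
# Sketch of 3dfx-style scan-line interleaving: each GPU owns every
# other row, so the split needs no per-game work from the developer.

def render_row(gpu_id: int, y: int, width: int = 640) -> list:
    # Placeholder for real rasterization: tag each pixel with its GPU.
    return [gpu_id] * width

def render_frame(height: int = 480, num_gpus: int = 2) -> list:
    frame = []
    for y in range(height):
        gpu = y % num_gpus  # row ownership alternates per scan line
        frame.append(render_row(gpu, y))
    return frame

frame = render_frame()
assert frame[0][0] == 0 and frame[1][0] == 1  # rows alternate between GPUs
```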


S3, I believe, had a technology that would render different sections of the screen, so you could have four chips, divide the screen vertically, and the software would sync them together.
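A minimal sketch of that split-frame idea, assuming four chips and a fixed vertical division (real hardware would balance the strips dynamically):

```python
# Each chip renders one vertical strip; software composites the strips.

WIDTH, CHIPS = 1024, 4

def strip_bounds(chip: int, width: int = WIDTH, chips: int = CHIPS):
    base = width // chips
    x0 = chip * base
    x1 = width if chip == chips - 1 else x0 + base
    return x0, x1  # chip renders columns [x0, x1)

for chip in range(CHIPS):
    print(chip, strip_bounds(chip))  # (0, 256), (256, 512), (512, 768), (768, 1024)
```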

Posted · Original Poster
3 minutes ago, ToneStar said:

AMD should just focus on making CrossFire really good. Then they don't have to make chips as good as Nvidia's; just make two cheap cards perform like one mid-range card, make two mid-range cards perform like one high-end card, and price them lower.

No, they need to get their efficiency to Nvidia's level first. Vega launched in 2017, so it's mostly a 2014 design, given the way silicon engineering timelines work. mGPU as we know it will always require developer support, and no one bought cards that way even when developers provided it.

Just now, cj09beira said:

The problem is it would be very difficult to do that at the driver level. It would end up requiring per-game profiles again, with large performance differences between games; it would be a mess.

When games are fully raytraced it will be quite a bit easier.

AMD will have to rework the entire front-end to be able to run MCMs the way we all think they're going to. The problems will be latency and coherence. We'll see what happens when they drop their MCM MI100/120 card at some point; that'll be at least their first generation of on-PCB MCM setups.

 

I honestly think it'll be after Zen4 and the in-package chiplets before we see MCM GPUs that can handle gaming. It's going to take AMD a while to get the architecture to a state where it's seamless. CCIX becoming much more common will help them along.

5 minutes ago, ToneStar said:

Try to make it so games don't need to be optimized for it. There have been technologies in the past that didn't require software support, such as 3dfx's version of SLI, where the cards would render every other scanline. PowerVR and S3 also had different hardware approaches to it.

The problem is games have become much more complex since then; things like temporal anti-aliasing, for example, are quite a pain in the butt for CrossFire.

 

4 minutes ago, ToneStar said:

S3, I believe, had a technology that would render different sections of the screen, so you could have four chips, divide the screen vertically, and the software would sync them together.

The problem with that is anti-aliasing, where the cards would have to quickly share the results for pixels near the divide, or have anti-aliasing done on a single die.
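A toy example of why the divide is painful, assuming a simple 3-tap box filter standing in for the AA resolve:

```python
# Filtering a pixel near the seam reads pixels the other GPU rendered,
# so each GPU needs a "halo" of neighbour pixels before it can filter.

ROW = list(range(8))   # one scan line, pixel values 0..7
SPLIT = 4              # GPU 0 owns columns [0, 4); GPU 1 owns [4, 8)

def blur(pixels, x):
    # 3-tap box filter, clamped at the frame edges.
    taps = [pixels[max(0, min(len(pixels) - 1, x + d))] for d in (-1, 0, 1)]
    return sum(taps) / 3

# Which columns outside GPU 0's strip does it need from GPU 1?
halo = [x + d for x in range(SPLIT) for d in (-1, 1)
        if not 0 <= x + d < SPLIT and 0 <= x + d < len(ROW)]
print(halo)          # [4] -- must arrive from GPU 1 before filtering
print(blur(ROW, 3))  # 3.0, only computable once column 4 is available
```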

Posted · Original Poster
1 minute ago, cj09beira said:

The problem is games have become much more complex since then; things like temporal anti-aliasing, for example, are quite a pain in the butt for CrossFire.

 

The problem with that is anti-aliasing, where the cards would have to quickly share the results for pixels near the divide, or have anti-aliasing done on a single die.

They need cache & memory coherency to work with modern games. Original SLI was great because it actually rendered alternating lines on separate GPUs, but modern graphics doesn't allow for that approach anymore, which is why we'll see the Radeon Duo with on-PCB PCIe 4 coherent connections at some point soon. As a compute card, it'll act as one GPU.

 

Getting a gaming GPU to that point is a ways off, but it's pretty obvious why AMD will go that way. They can eventually drop the need for a "big" design and simply double up (or more) on their smaller designs.

1 minute ago, Taf the Ghost said:

They need cache & memory coherency to work with modern games. Original SLI was great because it actually rendered alternating lines on separate GPUs, but modern graphics doesn't allow for that approach anymore, which is why we'll see the Radeon Duo with on-PCB PCIe 4 coherent connections at some point soon. As a compute card, it'll act as one GPU.

 

Getting a gaming GPU to that point is a ways off, but it's pretty obvious why AMD will go that way. They can eventually drop the need for a "big" design and simply double up (or more) on their smaller designs.

Well, they did it with the CPUs and Infinity Fabric; Threadripper and Epyc are more like 2-8 CPUs stuck together.

7 hours ago, i_build_nanosuits said:

1) "how many reviews have you watched?"

 

2) "...consumes more power than a 2080ti...in fact more than any nvidia cards..."

 

3) "...doesn't have ray tracing, no tensor core, no DLSS..."

 

4) "You'd have to be NUTS...and I mean it...completely NUTS to blow $700+ on this crap of a card instead of an RTX 2080."

1) How many videos have you watched? Did you not hear the part about the drivers being broken? GN and Hardware Unboxed have already said they are going to re-shoot the review of this card when working drivers are available. Obviously a bad launch is a major black eye and must be corrected for future launches, but AMD apparently doesn't want to sell many of these cards anyway!

 

2) Not if you believe the Hardware Unboxed review. It actually draws 51 watts less than the 2080 Ti, and has a similar power draw to the GTX 1080 Ti. I agree with the premise that power consumption sucks, though. I was hoping 7nm would be better, but it looks like AMD decided to clock the GPU to its limit to compete with RTX. They didn't really have any other option. Still, this is effectively a very heavily overclocked Vega card, and anyone that's done any overclocking knows that this kills any kind of reasonable power draw. The only reason it matches current Vega power numbers is the node shrink to 7nm.

[Hardware Unboxed power-draw chart]
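As a back-of-envelope illustration of why chasing clocks wrecks efficiency (dynamic power scales roughly with V² × f; the wattages below are made up for the example, not measured Radeon VII figures):

```python
# Rough CMOS dynamic-power rule of thumb: P ~ V^2 * f.

def dynamic_power(base_w: float, v_scale: float, f_scale: float) -> float:
    return base_w * v_scale ** 2 * f_scale

base = 200.0                            # hypothetical baseline draw, watts
print(dynamic_power(base, 1.00, 1.00))  # 200.0 W at stock
print(dynamic_power(base, 1.10, 1.10))  # ~266 W for +10% clock needing +10% voltage
print(dynamic_power(base, 0.90, 1.00))  # ~162 W when undervolting at the same clock
```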

 

3) How many people have bought RTX cards for ray tracing and DLSS? Three, maybe? This reminds me of PhysX. The only difference between the gimmicks is that ray tracing may actually have some desirability to the gaming market nVidia is targeting once ray-tracing performance increases by an order of magnitude. As for the current cards, no, ray tracing is dumb. That's OK, because "Just buy it!" - Tom's Hardware. Almost no one is buying a gaming GPU to utilize the tensor cores either. Most people looking into AI development are looking at commercial cards, not gaming cards. You're just preaching a bunch of marketing crap.

 

4) Uhhh... no. The 1/4-rate FP64 performance (instead of the 1/16 nVidia gaming cards have vs. commercial cards) actually makes the Radeon VII much more appealing to people who are in the market for RTX 2080 gaming performance but also do some compute work. Admittedly, that is a very small market. Again, AMD really doesn't want to sell many Radeon VIIs, so no big deal in the first place. Calm down. Not everyone wants the same thing you want, and vice versa. Just because an RTX card may be better for you at the same price point as the Radeon VII doesn't mean it's better for everyone at the same price.
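For a rough sense of what those rate ratios mean in throughput (the FP32 figures are approximate spec-sheet numbers, and the 1/16 ratio is the one cited above):

```python
# Peak FP64 throughput implied by the FP32 numbers and rate ratios.

radeon_vii_fp32 = 13.8  # ~TFLOPS FP32
rtx_2080_fp32 = 10.1    # ~TFLOPS FP32

print(radeon_vii_fp32 / 4)  # ~3.45 TFLOPS FP64 at the 1/4 rate
print(rtx_2080_fp32 / 16)   # ~0.63 TFLOPS FP64 at the 1/16 rate cited above
```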


CPU: i7 4790k @ 4.7 GHz

GPU: XFX GTS RX580 4GB

Cooling: Corsair h100i

Mobo: Asus z97-A 

RAM: 4x8 GB 1600 MHz Corsair Vengeance

PSU: Corsair HX850

Case: NZXT S340 Elite Tempered glass edition

Display: LG 29UM68-P

Keyboard: Roccat Ryos MK FX RGB

Mouse: Logitech g900 Chaos Spectrum

Headphones: Sennheiser HD6XX

OS: Windows 10 Home

1 hour ago, mxk. said:

-snip-

 

So, I have watched about 3-4 more reviews of the card from professional third-party reviewers. It seems like the Radeon VII drivers are a key issue right now. Jay was running into application crashes and black screens, Paul was noticing some unusual things as well, as were a few others. Glad I'm waiting until August, because in Paul's test the Radeon VII lost to the FE of the 2080 in Maya with one type of test, but that could be due to other reasons as well which could favor Turing. But Linus did a Blender test which showed the Radeon VII winning in Blender with lower rendering times. Might get a 2080 Ti still depending on how things play out by August. XD


VashTheStampede 4.0:

CPU: AMD Threadripper 1950x | CPU Cooling: EKWB Liquid Cooling(EK-Supremacy sTR4 RGB - Nickel, EK-CoolStream SE 280, EK-Vardar EVO 140ER Black x 2, EK-XRES 100 SPC-60 MX PWM (incl. pump), EK-ACF Fitting 10/13mm - Red (6-pack), EK-DuraClear 9,5/12,7mm 3M, and Scarlet Red Premix) | Compound: Thermal Grizzly Kryonaut | Mobo: Asrock X399 Taichi | Ram: G.Skill Ripjaws V 32GBs (2x16) DDR4-3200 | Storage: Crucial MX500 500GB M.2-2280 SSD/PNY CS900 240GB SSD/Toshiba X300 4TB 7200RPM | GPU: Zotac Geforce GTX 1080 8GB AMP! Edition(Replacing with a Radeon VII) | Case: Fractal Define R5 Blackout Edition w/Window | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Operating System: Windows 10 Pro | Keyboard: Ducky Shine 7 Blackout Edition with Cherry MX Silent Reds | Mouse: Corsair M65 Pro RGB FPS | Headphones: AKG K7XX Mass Drop Editions(Replacing with k712s) | Mic: Audio-Technica ATR2500 | Speakers: Mackie MR624 Studio Monitors

 

Surtr:

CPU: AMD Ryzen 3 2200G(Temp/Upping to a Zen 2 Ryzen 7) | CPU Cooling: Wraith(Dark Rock Pro 4 when I get the 3700x or 3800x) | Compound: Thermal Grizzly Kryonaut | Mobo: Asrock x470 Taichi | Ram: G.Skill Ripjaws V 16GBs (2x8) DDR4-3200 | Storage: PNY - BX500 240 GB SSD+Seagate Constellation ES.3 1TB 7200RPM | GPU: EVGA - GeForce GTX 770 2GB /02G-P4-2770-KR(Temp/getting a Navi card later this year) | Case: Corsair - SPEC-DELTA RGB | PSU: EVGA SuperNOVA G2 750W 80+ Gold | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 | Keyboard: Corsair K70 with Cherry MX Reds | Mouse: Corsair M65 Pro RGB FPS | Speakers: JBL LSR 305 Studio Monitors(At some point)

 

Prince of Dark Rock:

CPU: AMD Ryzen 5 2600 | CPU Cooling: be quiet! - Dark Rock Pro 4 | Compound: Thermal Grizzly Kryonaut | Mobo: MSI B450 Tomahawk | Ram: G.Skill Ripjaws V 8GBs (2x4) DDR4-3200 | Storage: Crucial - BX200 240 GB SSD+Seagate Constellation ES.3 1TB 7200RPM | GPU: EVGA - GeForce GTX 1060 6GB SSC | Case: Cooler Master - MasterBox MB511 | PSU: Corsair - CXM 550W | Optical Drive: Random HP DVD Drive | Operating System: Windows 10 Home | Keyboard: Rosewill - NEON K85 RGB BR | Mouse: Razer DeathAdder Elite Destiny 2 Edition


 

8 minutes ago, ATFink said:

2) Not if you believe the Hardware Unboxed review. It actually draws 51 watts less than the 2080 Ti, and has a similar power draw to the GTX 1080 Ti. I agree with the premise that power consumption sucks, though. I was hoping 7nm would be better, but it looks like AMD decided to clock the GPU to its limit to compete with RTX. They didn't really have any other option. Still, this is effectively a very heavily overclocked Vega card, and anyone that's done any overclocking knows that this kills any kind of reasonable power draw. The only reason it matches current Vega power numbers is the node shrink to 7nm.

[Hardware Unboxed power-draw chart]

4) Uhhh... no. The 1/4-rate FP64 performance (instead of the 1/16 nVidia gaming cards have vs. commercial cards) actually makes the Radeon VII much more appealing to people who are in the market for RTX 2080 gaming performance but also do some compute work. Admittedly, that is a very small market. Again, AMD really doesn't want to sell many Radeon VIIs, so no big deal in the first place. Calm down. Not everyone wants the same thing you want, and vice versa. Just because an RTX card may be better for you at the same price point as the Radeon VII doesn't mean it's better for everyone at the same price.

BTW, AMD stopped where they did due to PCIe spec limitations; PCIe cards are limited to 300 W per card (stupid, as the limit should depend on the number of connectors, not a random value).

Looking at the card's clock speeds in some of the videos we got, it can clock quite a bit higher. The problem is its voltage is high because they want to be able to use the worst dies here (the good ones are going to the server market), which I think is a bad strategic move that just ends up making them sell even fewer cards (the same happened with the original Vega cards, when Apple was buying the good ones).
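For reference, that 300 W ceiling is just the PCIe spec allowances added up (75 W from the slot plus the connector budgets), which is exactly why tying the limit to the connectors would make more sense:

```python
# PCIe CEM power allowances, in watts.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

print(SLOT + EIGHT_PIN + SIX_PIN)  # 300 W: slot + 8-pin + 6-pin (the spec ceiling)
print(SLOT + 2 * EIGHT_PIN)        # 375 W: what a 2x 8-pin board can physically feed
```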

1 hour ago, Lathlaer said:

Clearly not entirely since Radeon's performance is higher than Vega 56 and it's not due to 8GB extra RAM.

Yes, it is.

You would have had to remove two HBM stacks, which gets you a Vega 56.

Though the new Vega 7 has a slightly higher HBM frequency, which would increase performance a bit.

BUT: that would negate most of the increases we've seen with the Radeon VII, as the Radeon VII has more than double the bandwidth of Vega 56...

So that doesn't make much sense.

There is another possibility: order 2-hi stacks. But that makes even less sense, as you'd have to make the chip specifically for the Radeon VII - which is not the plan or what they wanted.

They just used the server/workstation chip without many changes and put it onto a desktop card. Nothing more.
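Rough bandwidth math behind that argument (per-pin speeds are approximate retail figures, not official AMD numbers):

```python
# HBM2 bandwidth = stacks * bus width per stack * per-pin rate / 8.

def hbm_bandwidth_gbs(stacks: int, gbps_per_pin: float,
                      bits_per_stack: int = 1024) -> float:
    return stacks * bits_per_stack * gbps_per_pin / 8

print(hbm_bandwidth_gbs(2, 1.6))  # ~410 GB/s: Vega 56, two stacks
print(hbm_bandwidth_gbs(4, 2.0))  # ~1024 GB/s: Radeon VII, four stacks
```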

 

1 hour ago, Lathlaer said:

Because by the time I need 16GB for gaming, a top of the line current card with 16GB will be lower end mid range. 

So you're saying you're fine with planned obsolescence and would rather buy a new card than use your old one a bit longer??
That's essentially what "the other side" does...

 

If there wasn't an AMD, we'd still be on 2GiB VRAM cards, maybe 4GiB at most.

But right now the default is 8GiB in the mid-range...

 

Sorry, but no, that's just bullshit.

 

High-end cards SHOULD have more VRAM than the mid-range ones...

 

So it makes more sense to demand that a HIGH-END card sold for almost double the price of Vega 56 have double the VRAM.

Simple as that...

 

Paying 1000€ for a card with more than 8GiB of VRAM, with no cheaper alternative, is just insane.

Now we have a cheaper alternative for 700€ that has 16GiB of VRAM...

 

 

1 hour ago, Lathlaer said:

Again, I am not saying this is a bad card - it's just my take on why I think people are mostly meh about it. Rendering, compute aside - for consumer gaming it's mostly an OKayish premiere, nothing more.

...which shouldn't be the main focus for AMD, as the past has shown that it does not work.

They _HAVE_ to target compute users like renderers and other professional customers (which is why they limited the 64-bit rate to "only" 1/4 and not less)...

 

Why market the card to someone who wouldn't buy it anyway?
Someone who would use it only to drive down the price on "their side"??

 

1 hour ago, Lathlaer said:

Well yeah but unless AMD beats the fact into people's skulls nothing will ever change.

And how should they do that?!
By telling the overly biased "testers" to take a hike, with a cease-and-desist letter??

 

Because that's how nVidia got strong: propaganda from the press and bashing over features the other side didn't have - even if those features were useless at the time (a 32-bit framebuffer was not viable with SDR-SDRAM).

 

Just look at how the press treats AMD when there is a small oopsie.

Then look at how the press treated the GTX 970 issue and, lately, the GeForce Partner Programme...

Or how the press is ignoring the GameWorks shit...

They care about making babies in a video game.

They do not care about proprietary garbage in a video game that works to the advantage of the company making it.

1 hour ago, Lathlaer said:

Your solution is what, that they stop trying because "people will buy NVIDIA anyway"?

More or less, yes. Focus on a market that actually WANTS your products.

Many gamers don't want AMD cards. They only want AMD to pressure nVidia into lowering its prices.

But they would never, ever buy an AMD card if they have the choice.

Even if the AMD card were 20% faster and cost 25% less (or even more than that), they wouldn't bother.

 

1 hour ago, Lathlaer said:

That's not the way to take the market. They need to be relentless in their pursuit of market share. The alternative is to be angry, take your toys and drop the GPU segment altogether.

Wasting money on a market that doesn't want your products is less viable than looking for new markets that do.

Simple as that.


And the GPU market is going very well for AMD, even if you don't like it!

They've sold more than 200 million of their graphics chips in consoles. Even more than that...

 

The Xbox 360 had AMD graphics; the PlayStation 4 and the Xbox One do too.

The GameCube, Wii and Wii U had AMD graphics as well.

Add it up (rough sum below):
Wii: ~100 million

Wii U: ~15 million

360: ~85 million

PS4: probably the same as the 360

Xbox One: around 20-50 million

...and you have hundreds of millions of graphics chips...
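Taking those rough figures at face value (with midpoint guesses for the ranges, so this is illustrative, not a sourced total):

```python
# Millions of units, using the estimates from the post above.
consoles = {"Wii": 100, "Wii U": 15, "Xbox 360": 85, "PS4": 85, "Xbox One": 35}
print(sum(consoles.values()))  # ~320 million graphics chips
```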

 

And they sell into the embedded market. Ever heard of Matrox?? They sell cards with AMD chips!

 

With that going for them, why should they bother with high(er)-end graphics cards, where you'd just waste money on a product that hardly anyone wants??

 

Just look at the Steam Hardware Survey. That proves my point...

Vega is almost nonexistent there...

 

1 hour ago, Lathlaer said:

If they couldn't make the GPU cheaper with less RAM due to supply chains, then fine - it is what it is.

They COULDN'T!

That would not have made ANY sense!

Because it's entirely possible that Samsung and Hynix don't even make 2-hi HBM stacks...

And using only 2 stacks has two problems:
a) you'd basically have a Vega 56 in 7nm

b) it wouldn't be interchangeable with the MI50...

 

1 hour ago, Lathlaer said:

But they have a recipe for potential Ryzen-esque premiere: make the same card with less RAM but same performance.

....

Why don't you want to understand that it's not as easy as you believe?!
We aren't talking about memory on the PCB.

We're talking about on-package memory!

So it's not economically viable to reduce the memory on the Radeon VII, for the reasons I've mentioned above...

 

And they also didn't want to do the Radeon VII.

It was only possible because of the ridiculous prices from nVidia...

 


"Hell is full of good meanings, but Heaven is full of good works"

27 minutes ago, valdyrgramr said:

So, I have watched about 3-4 more reviews of the card from professional third-party reviewers. It seems like the Radeon VII drivers are a key issue right now. Jay was running into application crashes and black screens, Paul was noticing some unusual things as well, as were a few others. Glad I'm waiting until August, because in Paul's test the Radeon VII lost to the FE of the 2080 in Maya with one type of test, but that could be due to other reasons as well which could favor Turing. But Linus did a Blender test which showed the Radeon VII winning in Blender with lower rendering times. Might get a 2080 Ti still depending on how things play out by August. XD

This launch really feels like AMD trying to shout "We're here!" to the high-end GPU market. Even though it might not be worth it, it is impressive how AMD has really started trying to compete in high-end GPUs when they haven't for years.

47 minutes ago, ToneStar said:

AMD should just focus on making CrossFire really good

AMD should ignore that and scrap that shit, because hardly anybody uses it.

And they don't have the resources for 1% of the user base.

Yes, that is not a joke; that is how few people run multiple GPUs in their PCs...

And CrossFire support was not that bad - back in the Radeon X era, you didn't need a special GPU; just put two (for example) X1800XLs in a PC and call it a day.

47 minutes ago, cj09beira said:

The problem is it would be very difficult to do that at the driver level. It would end up requiring per-game profiles again, with large performance differences between games; it would be a mess.

When games are fully raytraced it will be quite a bit easier.

That as well.

It's a chicken-and-egg problem at the moment.

Because nobody uses it, nobody cares about it.

Because nobody cares about it, nobody uses it.

 

45 minutes ago, ToneStar said:

S3, I believe, had a technology that would render different sections of the screen, so you could have four chips, divide the screen vertically, and the software would sync them together.

AMD used a similar technique back in the day.

Look up "Super AA", or whatever it was called back then, when you could use both cards for FSAA and improve the picture further (IIRC up to 12x).

And also something called "Split Frame Rendering".

So what you're describing was already done.

It was phased out, however, because it collides with the full-screen effects of modern games...


"Hell is full of good meanings, but Heaven is full of good works"

Posted · Original Poster
9 minutes ago, Billy Pilgrim said:

This launch really feels like AMD trying to shout "We're here!" to the high-end GPU market. Even though it might not be worth it, it is impressive how AMD has really started trying to compete in high-end GPUs when they haven't for years.

Actually, it feels more like the driver team is knee-deep in Navi work and behind schedule, hence the "rushed" Radeon VII launch.

 

Still, this is the King of Science Departments for the next decade.

1 hour ago, Billy Pilgrim said:

This launch really feels like AMD trying to shout "We're here!" to the high-end GPU market. Even though it might not be worth it, it is impressive how AMD has really started trying to compete in high-end GPUs when they haven't for years.

Well, apparently the one thing I watched was just a shitty synthetic bench with no real application to the program itself. The point of Vega 20 is workstation cards that can game well. My guess is that Navi wasn't ready for CES, so they just took the best card that could game and was ready enough to be shown off, and that's what this is. Kinda what you said. The biggest issue with the reviews is not so much the card itself, but the fact that AMD focused on the card while not releasing anything close to a stable driver. Every reviewer keeps stating they had stability and performance issues when trying to benchmark, hence the limited reviews. Jay apparently couldn't do much at all due to the driver issues, because he either got app crashes or black screens.



2 hours ago, Stefan Payne said:

-snip-

Nobody uses it because they quit making it good, quit offering it on affordable GPUs, and quit making dual-GPU high-end cards to make up for the lack of power in a single chip.

2 minutes ago, ToneStar said:

Nobody uses it because they quit making it good, quit offering it on affordable GPUs, and quit making dual-GPU high-end cards to make up for the lack of power in a single chip.

Tell that to Square Enix (NieR:Automata), Koei Tecmo and the other developers that build their games in a way that doesn't allow multi-GPU.

And for DX12 and Vulkan it needs to be supported through the API. And Vulkan (until recently) didn't support mGPU at all...

So no, "nobody cares about it" is a true statement. It sucks for some people, but you need to tell the game developers first that you want multi-GPU. And then you have to convince people to do that as well.

And then it might eventually change. But I don't see it happening right now...

The chances for that stuff were best in the DX9 era - but even then it failed. And yes, it was pretty widely supported back then: the X1950GT supported it, and the nVidia 8600GT had SLI support. Still, it made more sense to sell the card and get the next best thing.

So that only leaves the high end, where it can make sense...

And yeah, I also have a CrossFire setup lying around.


"Hell is full of good meanings, but Heaven is full of good works"

11 hours ago, i_build_nanosuits said:

if you think there's even a "debate" to be had as to whether or not the RTX 2080 is a VASTLY superior offering as a $700 graphics card, then I'm sorry, I can't fix you ;)

 

It depends on the use case. Someone who has a limited budget and does compute work + gaming would DEFINITELY buy the Radeon VII over the RTX 2080 (provided proper driver support is implemented).



11 hours ago, i_build_nanosuits said:

They'll say anything... you won't win against the die-hard ones...
They'll say stuff like: "yeah, but you can undervolt it and underclock it and THEN it will be better and blah blah blah..." You'll never win.

Probably because the people you're arguing with aren't trying to win any argument. No one is saying the Radeon VII is better. EVERYONE agrees that at $700 the current state of the Radeon VII is worse than a $700 RTX 2080 (until drivers are fixed; then some people will leverage the FP64 compute capability). Even after driver fixes, if the prices are similar the RTX 2080 will still be better for everyone not utilizing FP64.

 

My point is there is no one-size-fits-all package. Even excluding fanboys, there are still reasons to buy the Radeon VII (presumably once drivers are fixed), albeit the market for that use case is very small. Maybe those reasons are not meant for you, but they will be for some people.



7 minutes ago, ATFink said:

Probably because the people you're arguing with aren't trying to win any argument. No one is saying the Radeon VII is better. EVERYONE agrees that at $700 the current state of the Radeon VII is worse than a $700 RTX 2080 (until drivers are fixed; then some people will leverage the FP64 compute capability). Even after driver fixes, if the prices are similar the RTX 2080 will still be better for everyone not utilizing FP64.

 

My point is there is no one-size-fits-all package. Even excluding fanboys, there are still reasons to buy the Radeon VII (presumably once drivers are fixed), albeit the market for that use case is very small. Maybe those reasons are not meant for you, but they will be for some people.

Well, I and even reviewers agree that it's a great card for specific workstation tasks and good enough for games. We even state that the RTX 2080 is the better pure gaming card. Hopefully Navi brings a better pure gaming card. But, to be fair, it is possible that the currently shit drivers are impacting the performance of the card. I'd rather wait for AMD to release a better driver and for further testing to be done before making a final claim against the Radeon VII. Jay couldn't even bench his due to how shit the drivers were. XD



1 minute ago, valdyrgramr said:

Well, I and even reviewers agree that it's a great card for specific workstation tasks and good enough for games. We even state that the RTX 2080 is the better pure gaming card. Hopefully Navi brings a better pure gaming card. But, to be fair, it is possible that the currently shit drivers are impacting the performance of the card. I'd rather wait for AMD to release a better driver and for further testing to be done before making a final claim against the Radeon VII. Jay couldn't even bench his due to how shit the drivers were. XD

My worry is that professional applications may have support comparable to games right now (that's to say, next to none). So many games couldn't even boot in many reviews that I'm worried many programs may not be operable either. That's why I emphasized driver updates.

 

With proper support I'm willing to bet performance will go up, but if the card can't run stably then it certainly can't run a compute load. Maybe I'm wrong; if so I'm happy to listen.



7 minutes ago, ATFink said:

My worry is that professional applications may have support comparable to games right now (that's to say, next to none). So many games couldn't even boot in many reviews that I'm worried many programs may not be operable either. That's why I emphasized driver updates.

 

With proper support I'm willing to bet performance will go up, but if the card can't run stably then it certainly can't run a compute load. Maybe I'm wrong; if so I'm happy to listen.

Well, it struggled with some editing applications, yet it was able to handle things like four 3D modelers in a synthetic bench, actual rendering in Blender as Linus tested, and a few others. But the reason it struggled in the editing tool, compared to the 2080 and 2080 Ti, was Nvidia's encoder thing or whatever that Linus talked about. The guy at Hardware Unboxed stated that it was a driver issue for games, and he could only get a handful to work rather than the dozens he wanted to bench. So he's waiting for AMD's driver update with actual support before he does more testing.

https://www.spec.org/gwpg/gpc.static/vp13info.html - this being the synthetic bench Paul used to cover Maya, 3ds Max, and a couple of others.



