
AMD Radeon VII Benchmark/Launch Mega Thread

Taf the Ghost
2 minutes ago, ThePD said:

Most 2080s are more expensive than 700 dollars. Some models go as high as 900 USD.

It's relatively easy to find 2080 models cheaper than $699 as well, so that's something to keep in mind.


Just now, maartendc said:

Well, that is one way to "win" a debate.

 

Good for you, you won.

If you think there's even a "debate" to be had as to whether or not the RTX 2080 is the vastly superior offering among $700 graphics cards, then I'm sorry, I can't fix you ;)

 

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


3 minutes ago, pas008 said:

No, ray tracing will not die off.

It's not proprietary.

You can't leap forward if the stepping stones aren't in place.

Nvidia be like:

 

[image]


3 minutes ago, maartendc said:

Hey, look at it this way: your 1080 Ti was the best investment ever, because it would still be great value even today. (Kind of sad, but good for you!)

 

Heck, I feel the same way about my 980 Ti, which I bought used for $350 about 2 years ago. At that price, it is still only 12% slower than an RTX 2060 (which is $349). Still does everything I could need it to do at 1440p/60.

Best GPU investment this generation was actually a pre-ordered RX Vega 64, since you could mine on it from day 1. There's a chunk of r/AMD that profited from preordering. Haha.

 

But used 1080 Tis are still the best deal right now. The Radeon VII will live a long life as an FP64 price/performance king among compute cards. It's an interesting card to have in the market, same as the 2080, but these never move large volume.
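For what it's worth, the FP64 claim is easy to sanity-check with a rough spec-sheet comparison (a sketch only: the TFLOPS figures are boost-clock peaks from public spec sheets and the prices are launch MSRPs, so treat the numbers as approximate):

```python
# Rough FP64 price/performance sketch from public spec-sheet numbers.
# FP64 peak = FP32 peak / rate divisor (Radeon VII is 1:4, RTX 2080 is 1:32).
cards = {
    "Radeon VII": {"fp32_tflops": 13.44, "fp64_divisor": 4,  "msrp_usd": 699},
    "RTX 2080":   {"fp32_tflops": 10.07, "fp64_divisor": 32, "msrp_usd": 699},
}

for name, c in cards.items():
    fp64_tflops = c["fp32_tflops"] / c["fp64_divisor"]
    gflops_per_dollar = fp64_tflops * 1000 / c["msrp_usd"]
    print(f"{name}: {fp64_tflops:.2f} TFLOPS FP64, {gflops_per_dollar:.2f} GFLOPS/$")
```

At those numbers the Radeon VII delivers roughly ten times the FP64 throughput per dollar of the 2080, which is the whole "compute card" argument in one line.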


My take on this: an OK premiere. It's nice to have a choice, but unfortunately with that price/performance ratio AMD will not be the savior of the GPU segment. There are zero reasons for NVIDIA to even contemplate price changes.

 

As such, I think most are disappointed not because of the GPU's worth in a vacuum but because they kinda expected AMD to do what they did in the CPU market. It's a bit frustrating, because it seems like they could have done that if they had relinquished that HBM-capacity fetish.

 

People who bought a 1080 Ti when it premiered are the real lottery winners here.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


20 minutes ago, Taf the Ghost said:

2080 Ti draws more power and no one cares about RTX.

2080 Ti is 250W TDP, Radeon VII is 295/300W TDP (depending on where I look)? Or does it not work out that way in comparable use cases? I still haven't had a good chance to check out reviews yet.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 minutes ago, porina said:

2080 Ti is 250W TDP, Radeon VII is 295/300W TDP (depending on where I look)? Or does it not work out that way in comparable use cases? I still haven't had a good chance to check out reviews yet.

They'll say anything... you won't win against the die-hard ones.
They'll say stuff like "yeah, but you can undervolt it and underclock it and THEN it will be better," blah blah blah... you'll never win.



3 minutes ago, porina said:

2080 Ti is 250W TDP, Radeon VII is 295/300W TDP (depending on where I look)? Or does it not work out that way in comparable use cases? I still haven't had a good chance to check out reviews yet.

See the time stamp for power under load. It varies between games, actually, but that's the rough ranking.


19 minutes ago, CTR640 said:

IMHO AMD should have dropped HBM and redesigned the Radeon VII around GDDR5X/6 to make it cheaper. 2+ year old performance for $700 or more is not something I would pay for. I mean, I already bought a 1080 Ti for €680 back in September last year. Hopefully Intel can develop better GPUs to tackle nVidia, as AMD is not doing very well in the high-end consumer segment. Or AMD should lower it to $499 to make it much more attractive.

 

nVidia is just cancer with their RTX and Tensor cores. Why make the RTX cards so damn expensive if they want to make ray tracing more common?

The Vega 64 and Vega 56 are both bandwidth-starved already, as shown by the huge performance gains from memory overclocking. Using GDDR6 would likely kill the card's performance, making it basically pointless given that the Vega 64 and Vega 56 exist.


AMD simply doesn't bother to compete in the gaming market anymore, imo. Buildzoid said this in one of his videos, and I agree with that.

 

10 minutes ago, porina said:

2080 Ti is 250W TDP, Radeon VII is 295/300W TDP (depending on where I look)? Or does it not work out that way in comparable use cases? I still haven't had a good chance to check out reviews yet.

[chart: power_average.png]

I can't find anyone measuring power consumption with RTX on, though; I doubt it stays the same.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


4 minutes ago, porina said:

2080 Ti is 250W TDP, Radeon VII is 295/300W TDP (depending on where I look)? Or does it not work out that way in comparable use cases? I still haven't had a good chance to check out reviews yet.

TDPs are not directly comparable. There is no universal standard for how to measure them.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


26 minutes ago, maartendc said:

Hey, look at it this way: your 1080 Ti was the best investment ever, because it would still be great value even today. (Kind of sad, but good for you!)

 

Heck, I feel the same way about my 980 Ti, which I bought used for $350 about 2 years ago. At that price, it is still only 12% slower than an RTX 2060 (which is $349). Still does everything I could need it to do at 1440p/60.

I kept my GTX 780 for 4 years and 7 months, pretty cool for such an old GPU. I plan to use my 1080 Ti for 4 years and 7 months too lol.

But I'm afraid that if this pricing bullshit keeps going like this, it could end up being 8 years or so. And a 12% difference is not much.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


Looks alright. It has some disadvantages compared to the RTX 2080, but advantages in other areas. It's pretty much how I expected it to stack up. If you don't give a flying fart about power and don't like NVIDIA, it's certainly a good option. I'd prefer it at 650€ (since $ = €), but I know HBM2 is still very expensive, and given that it comes with an aftermarket-grade cooler, the price is "reasonable".


19 minutes ago, xAcid9 said:

AMD simply doesn't bother to compete in the gaming market anymore, imo. Buildzoid said this in one of his videos, and I agree with that.

 

-snip-

I can't find anyone measuring power consumption with RTX on, though; I doubt it stays the same.

GN compared whole-system power consumption between the Radeon VII and the 2080, and they were really quite similar under an AotS benchmark.

USEFUL LINKS:

PSU Tier List F@H stats


20 minutes ago, Taf the Ghost said:

See the time stamp for power under load. It varies between games, actually, but that's the rough ranking.

Thanks, not in a position to watch a video right now (in a conference call at work :)).

16 minutes ago, Drak3 said:

TDPs are not directly comparable. There is no universal standard for how to measure them.

Good point, and one I keep forgetting on the GPU side. If the results above are at the system level, that is a fair comparison method.
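That method can be sketched in a few lines (the function and the numbers here are hypothetical, purely for illustration): measure whole-system draw at the wall on the same bench, subtract idle so only the load delta is attributed to the run, and divide performance by that delta.

```python
# Hypothetical sketch: ranking cards by whole-system power under the same
# benchmark, rather than by vendor TDP figures (which each vendor defines
# and measures differently).
def perf_per_watt(avg_fps: float, load_watts: float, idle_watts: float) -> float:
    """FPS per watt of load delta, measured at the wall on one test bench."""
    return avg_fps / (load_watts - idle_watts)

# Made-up wall measurements from the same hypothetical test system:
card_a = perf_per_watt(avg_fps=90, load_watts=380, idle_watts=80)
card_b = perf_per_watt(avg_fps=95, load_watts=430, idle_watts=80)
print(f"card A: {card_a:.3f} FPS/W, card B: {card_b:.3f} FPS/W")
```

Since both cards share the rest of the bench, the idle draw cancels out and the comparison stays apples-to-apples even without per-card power instrumentation.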



9 minutes ago, CTR640 said:

I kept my GTX 780 for 4 years and 7 months, pretty cool for such an old GPU. I plan to use my 1080 Ti for 4 years and 7 months too lol.

But I'm afraid that if this pricing bullshit keeps going like this, it could end up being 8 years or so. And a 12% difference is not much.

Bought my 1080 Ti before the mining craze at $800 for the FTW3. Then bought the 2080 Ti because I want to game at 4K and be able to do more than 60 fps. Gave the 1080 Ti to my brother, because why not. People can find reasons to upgrade or not, but it's up to the individual to decide.


52 minutes ago, Taf the Ghost said:

It's a re-purposed Compute card that's going to live a long life at Science Departments the world over and on the shelves of Tech Reviewers. We'll get an actual high-end card probably next year.

It actually reminds me more of the original Titan because of the compute chops this Radeon VII has. Good at gaming, but no slouch either when some compute work needs to be done.

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army:3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


I have no idea why someone would choose this over the 2080. In the best cases it barely edges out the 2080; in performance averages it is 10-14% behind, depending on resolution.

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB


1 minute ago, TahoeDust said:

I have no idea why someone would choose this over the 2080. In the best cases it barely edges out the 2080; in performance averages it is 10-14% behind, depending on resolution.

Yeah, and the Radeon is also more power-hungry, hotter, and louder, and it doesn't support any of the new tech coming with 2019 games. Developer support and optimization for AMD cards is few and far between and getting worse every day, and they removed CrossFire support. What else could you want? If only it were only down on performance... and if only this were a $499 card we were talking about.



This summer we will see a PCIe 4.0 version.

CPU i7 4960x Ivy Bridge Extreme | 64GB Quad DDR-3 RAM | MBD Asus x79-Deluxe | RTX 2080 ti FE 11GB |
Thermaltake 850w PWS | ASUS ROG 27" IPS 1440p | | Win 7 pro x64 |


Just now, THraShArD said:

This summer we will see a PCIe 4.0 version.

What do they expect to change with this? Is it saturating PCIe 3.0 now?



This is one huge dud of a product. I guess it's good for the Never-Nvidia'ers, but that's it. Same price, fewer features, more power-hungry, and less availability than the 2080.

 

This GPU generation is turning out to be the most disappointing one. Even with the 7nm advantage and a ton of HBM2, AMD could not top Nvidia's best or give consumers a price/performance champion.

 

 


I thought AMD was going to be our savior and save everyone from high GPU prices. Maybe Nvidia is not actually gouging the market, and it is just really expensive to make a GPU with the performance we are demanding? Who would have thunk it.



8 minutes ago, Chett_Manly said:

This is one huge dud of a product. I guess it's good for the Never-Nvidia'ers, but that's it. Same price, fewer features, more power-hungry, and less availability than the 2080.

 

This GPU generation is turning out to be the most disappointing one. Even with the 7nm advantage and a ton of HBM2, AMD could not top Nvidia's best or give consumers a price/performance champion.

 

 

It's a tweaked Vega 64. What were you expecting? It's a miracle they got performance mostly close to the RTX 2080, given that they used the old(er) core design.

