Squibbies18

2080 or radeon VII?

Recommended Posts

Posted · Original Poster (OP)

So, I've been looking for a new GPU to replace my R9 Fury, and I was seriously considering the Radeon VII. However, the only one I was willing to buy was the Sapphire-branded card, as it's the cheapest version of the card here in the UK at £650. Unfortunately, it doesn't seem like that version will be back in stock for at least another month at this point, so I started looking at the RTX 2080 instead. On Amazon they have EVGA's RTX 2080 Black Edition going for £694 and a basic Gigabyte Windforce card for £689. I've got a £50 Amazon voucher to use, so I could get either of these cards for less than any Radeon VII currently being sold. I was just wondering if getting the 2080 over the Radeon VII is worth it at the moment. Thanks in advance.

1 minute ago, Squibbies18 said:

I was just wondering if getting the 2080 over the Radeon VII is worth it at the moment.

That's a question for you to answer, not us.

We don't know if you can wait a month or not.


Both have nearly identical performance; it's up to you whether you think Nvidia's value-add features like ray tracing and DLSS are worth the extra $50.


¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-3820 | Corsair H100i | ASUS P9X79-LE | 16GB Patriot Viper 3 1866MHz DDR3 | MSI GTX 970 Gaming 4G | 2TB WD Blue M.2 SATA SSD | 2TB Hitachi Deskstar HDD | 1TB WD Black HDD | Corsair CX750M | Fractal Design Define R5 | Windows 10 Pro / Linux Mint 19 Cinnamon

 

Laptop:

Dell XPS 15 9560 4K Touch | Intel Core i5-7300HQ | 12GB Generic (Crucial?) 2133MHz DDR4 | Nvidia GTX 1050 | 256GB Toshiba M.2 NVMe SSD | Windows 10

2 minutes ago, Squibbies18 said:

So, I've been looking for a new GPU to replace my R9 Fury, and I was seriously considering the Radeon VII. However, the only one I was willing to buy was the Sapphire-branded card, as it's the cheapest version of the card here in the UK at £650. Unfortunately, it doesn't seem like that version will be back in stock for at least another month at this point, so I started looking at the RTX 2080 instead. On Amazon they have EVGA's RTX 2080 Black Edition going for £694 and a basic Gigabyte Windforce card for £689. I've got a £50 Amazon voucher to use, so I could get either of these cards for less than any Radeon VII currently being sold. I was just wondering if getting the 2080 over the Radeon VII is worth it at the moment. Thanks in advance.

If you have tasks that could put the Radeon VII to work, then that would be the one. If you're mostly gaming, then I'd argue the GeForce card is better.


I would wait, because of the 16GB of HBM2 and better drivers (and better compute), but it's up to you really. They are both good cards.

RTX and DLSS are useless/joke features anyway (on these cards at least; we will see in the future...).


CPU: Ryzen 5 1600 (3.6GHz; 1.25v) Cooler: Noctua NF-A9x14 on Wraith Spire heatsink Mobo: Asus Strix B350-I GPU: EVGA XC Gaming RTX 2070 RAM: G.Skill Trident Z 3200MHz 14CL 2x8GB SSD0: Crucial MX300 525GB SSD1: ADATA Ultimate SU800 128GB M.2 PSU: Corsair SF450 Case: Fractal Design Node 202 Monitors: ViewSonic XG2401 144Hz 24" / AOC G2460PF 144Hz 24"

Laptop

Microsoft Surface

I usually edit my posts immediately after posting them, as I don't check for typos before pressing the shiny SUBMIT button. 

Spoiler

Other Builds:

CPU: Ryzen 3 1200 Cooler: Wraith Stealth Mobo: Biostar X370GTN GPU: Sapphire Pulse RX 580 8GB RAM: Corsair LPX 3000MHz 15CL 2x8GB SSD0: Crucial BX500 480GB PSU: Corsair SF450 Case: Silverstone Sugo SG13B White

 

Retired Devices and Inventory:

  • Asus NV56 17" (i7-3360QM, NVIDIA GT 650M, DDR3 8GB) laptop
  • Lenovo Thinkpad X220 12" (i5 vPro 2520M, Intel HD Graphics 3000, Samsung DDR3 1300MHz 9CL 2x4GB) laptop

 


Yes, and also, no game needs 16GB of VRAM. I saw a chart where Quake Champions used the most VRAM at 7.8GB; the rest of the games were lower, like GTA 5 at 4 or 5GB, and the list goes lower from there. Battlefield V doesn't use as much as Quake either. The only reason to go over 8GB of VRAM is if you plan on gaming at 4K resolution, which the Ti can do at 60fps. Can the new Radeon do it well? Nobody knows. Go with the sure thing and grab the 2080, unless you can wait for benchmarks.


Asus Sabertooth x79 / 4930k @ 4505 @ 1.408v / Gigabyte WF 2080 RTX / Corsair VG 64GB @ 1911 @ 9CL & AX1600i & H115i Pro @ 2x Noctua NF-A14 / Carbide 330r Blackout

AOC 40" 4k Curved / Samsung 40" 4k TV 120 MR / LaCie Porsche Design 2TB & 500GB / Samsung 950 Pro 500GB / 850 Pro 500GB / Crucial m4 500GB / Asus m.2 Card

Scarlett 2i2 Audio Interface / KRK Rokits 10" / Sennheiser HD 650 / Logitech G910 & G700s & C920 / SL 88 Grand / Cakewalk By BandLab / NF-A14 Intake / NF-P12 Exhaust

 

34 minutes ago, Turtle Rig said:

Yes, and also, no game needs 16GB of VRAM.

Not true.
There are some games that can use more than 8GiB of VRAM right now.

Geothermal Valley (Tomb Raider) is one of those cases, for example.

And also look at today's video from Level1Techs (Wendell) about Adobe, where the 8GiB GeForce cards crash because of a lack of memory.

And there is also the "HBCC" feature, which lessens the pressure on VRAM by paging data out to main memory.

So yeah, for "the future" the Radeon VII might be a better choice because of that.

And to be blunt:
a high-end card for €700 that has the same amount of memory as a €150 one is just ridiculous.

Why do we accept this bullshit?!

A high-end card for €700 should have at least 12GiB of VRAM!

Why are we even discussing this?!


"Hell is full of good meanings, but Heaven is full of good works"

57 minutes ago, Stefan Payne said:

Not true.
There are some games that can use more than 8GiB of VRAM right now.

Geothermal Valley (Tomb Raider) is one of those cases, for example.

And also look at today's video from Level1Techs (Wendell) about Adobe, where the 8GiB GeForce cards crash because of a lack of memory.

And there is also the "HBCC" feature, which lessens the pressure on VRAM by paging data out to main memory.

So yeah, for "the future" the Radeon VII might be a better choice because of that.

And to be blunt:
a high-end card for €700 that has the same amount of memory as a €150 one is just ridiculous.

Why do we accept this bullshit?!

A high-end card for €700 should have at least 12GiB of VRAM!

Why are we even discussing this?!

I honestly don't remember where I saw the charts; I wish I did so I could show you. Some review site. Yes, games will use 8GB or more depending on textures and resolution, and that is where the Ti shines. Tomb Raider will definitely use 10GB of VRAM on a Ti card at 4K resolution, possibly a bit more, but it won't choke the card out; this is where the extra 3GB of VRAM kicks in. But if someone is playing at 1440p or 1080p, then honestly even 6GB is fine.


Asus Sabertooth x79 / 4930k @ 4505 @ 1.408v / Gigabyte WF 2080 RTX / Corsair VG 64GB @ 1911 @ 9CL & AX1600i & H115i Pro @ 2x Noctua NF-A14 / Carbide 330r Blackout

AOC 40" 4k Curved / Samsung 40" 4k TV 120 MR / LaCie Porsche Design 2TB & 500GB / Samsung 950 Pro 500GB / 850 Pro 500GB / Crucial m4 500GB / Asus m.2 Card

Scarlett 2i2 Audio Interface / KRK Rokits 10" / Sennheiser HD 650 / Logitech G910 & G700s & C920 / SL 88 Grand / Cakewalk By BandLab / NF-A14 Intake / NF-P12 Exhaust

 

On 3/1/2019 at 2:24 AM, Turtle Rig said:

I honestly don't remember where I saw the charts; I wish I did so I could show you. Some review site. Yes, games will use 8GB or more depending on textures and resolution, and that is where the Ti shines. Tomb Raider will definitely use 10GB of VRAM on a Ti card at 4K resolution, possibly a bit more, but it won't choke the card out; this is where the extra 3GB of VRAM kicks in. But if someone is playing at 1440p or 1080p, then honestly even 6GB is fine.

Yes, or look at the review of the Radeon VII from Adored (YouTube), where the 2080 totally crapped out because of a lack of VRAM, and Vega did not, thanks to the HBCC feature.


"Hell is full of good meanings, but Heaven is full of good works"


The Radeon VII is available on AMD's website right now in the US. If I literally hadn't just bought a 1080 Ti, I would have picked one over the 2080 as well. I didn't think they would be back in stock that quickly.


Ryzen 2600 (CH7) w/ Radeon VII + 3600 c16 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build

Posted (edited)

I think the VRAM thing is more complicated than "[Game] uses X amount of VRAM, therefore you need more than X amount of VRAM these days" for performance. I've been reading on the interwebs that games will request more VRAM than they actually need and may never use it, much like how apps may overshoot how much memory they request (yes, this is actually a thing: https://blogs.msdn.microsoft.com/oldnewthing/20091002-00/?p=16513). Though in a lot of cases where I've seen VRAM usage reported, the game tends to use the same amount regardless of how much VRAM is available; e.g., a game uses around 4.5GB of VRAM whether the card has 6GB, 8GB, or 16GB.

 

So what about the case where a game uses roughly the same amount of VRAM regardless, and there isn't enough? I'm not convinced there's a huge issue here. Here's an example (from https://www.techspot.com/article/1600-far-cry-5-benchmarks/):

 

VRAM.png

 

Given that Far Cry 5 uses around 3GB of VRAM at 1080p, this may not be a very interesting result to look at, but for the record:

Spoiler

1080p.png

 

Now the 1440p and 4K benchmarks should be more interesting. Clearly Far Cry 5 will use more VRAM than the GTX 1060 3GB has:

Spoiler

1440p.png

 

4K.png

Yet strangely enough, performance, even in the min FPS results, isn't tanking hard and remains in line with the expected performance delta from the GTX 1060 6GB. In fact, even the GT 1030 only sees a linear drop-off in performance despite having 2GB of VRAM (1440p is about 1.8x the pixels of 1080p, and 2160p is 4x).
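For reference, the resolution scaling in that comparison can be checked with a quick pixel-count calculation (plain Python; nothing here beyond the standard resolutions):

```python
# Pixel counts for the three test resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]  # 2,073,600 pixels

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
# 1080p: 1.00x, 1440p: 1.78x, 2160p: 4.00x
```

So 1440p is closer to 1.8x the pixel load of 1080p, while 4K really is exactly 4x.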

 

And for all the research I'm willing to do, I came across a PcPer article interviewing one of NVIDIA's VPs of engineering, with the most interesting bit being:

Quote

If a game has allocated 3GB of graphics memory it might be using only 500MB on a regular basis, with much of the rest only there for periodic, on-demand use. Things like compressed textures that are not as time-sensitive as other material require much less bandwidth and can be moved around to other memory locations with less performance penalty. Not all allocated graphics memory is the same, and inevitably there are large sections of this storage that are reserved but rarely used at any given point in time.

 

tl;dr: reported VRAM usage may not reflect what a game actually requires.
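To make the allocated-vs-used distinction concrete, here's a toy model in Python. All the numbers are hypothetical (they just mirror the 3GB-allocated / 500MB-used shape from the quote, not any real benchmark): a game reserves a big pool of textures up front, but each frame only samples a small, repeating subset — the working set.

```python
# Toy model: "allocated" VRAM counts everything reserved up front,
# while the working set is the distinct data actually touched per frame.

def working_set_mb(frames, texture_sizes_mb):
    """Distinct megabytes touched across the given frames."""
    touched = set()
    for frame in frames:
        touched.update(frame)
    return sum(texture_sizes_mb[name] for name in touched)

# Hypothetical pool: 48 textures of 64 MB each, reserved at load time.
texture_sizes_mb = {f"tex{i}": 64 for i in range(48)}
allocated_mb = sum(texture_sizes_mb.values())  # 3072 MB allocated

# Ten frames that keep hitting the same eight textures.
frames = [[f"tex{i}" for i in range(8)] for _ in range(10)]

print(f"allocated: {allocated_mb} MB, "
      f"working set: {working_set_mb(frames, texture_sizes_mb)} MB")
# allocated: 3072 MB, working set: 512 MB
```

In this sketch the monitoring tool would report ~3GB "in use" while only ~512MB actually matters for frame-to-frame performance, which is the gap the NVIDIA quote is describing.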

Edited by Mira Yurizaki

2080 by a landslide; the VII is a cash grab.


Stinkpc: i5 3550. DDR3 1600MHz 8GB. Gigabyte GA-H61N-USB3.0. Sapphire RX 570 Nitro 4GB OC. Noctua NH-L12. WD Black 600GB. Silverstone PSU 1kW. Advent 1440x900 75Hz VGA monitor, 1ms. Acer Veriton M464 chassis.

Self help guide.

 

