
AMD RX Vega 64 Outperforms NVIDIA GTX 1080 Ti By Up To 23% In DX12 Forza 7

Gdourado
2 minutes ago, valdyrgramr said:

Furys don't use GDDR5, they use HBM2.  The increased memory bandwidth makes up for it.  I will assume you are joking.

They don't use HBM2, they use original HBM. And higher bandwidth cannot fully compensate for lack of capacity.
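To put rough numbers on it (spec-sheet figures, so treat them as approximate), here's the usual peak-bandwidth arithmetic in Python for Fiji's HBM1 versus a contemporary GDDR5 card; the catch is that first-gen HBM tops out at 1 GB per stack, hence the 4 GB cap:

# Peak bandwidth (GB/s) = bus width in bits / 8 * effective data rate per pin (Gbps)
fury_x_hbm1  = 4096 / 8 * 1.0   # 4 HBM1 stacks @ 500 MHz DDR -> 512 GB/s, but only 4 x 1 GB
gtx_980ti_g5 = 384 / 8 * 7.0    # 384-bit GDDR5 @ 7 Gbps      -> 336 GB/s, with 6 GB
print(fury_x_hbm1, gtx_980ti_g5)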


Just now, valdyrgramr said:

Never said they fully did, but it makes up for a lot of it.  GDDR5 is slower, so it needs more VRAM.

5 minutes ago, valdyrgramr said:

Furys don't use GDDR5, they use HBM2.  The increased memory bandwidth makes up for it.  I will assume you are joking.

No, I'm dead serious: 4 GB isn't enough for 4K no matter how fast it is, because the game will likely need more than that resident over a single sequence of draw calls, which tanks performance.
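A rough back-of-the-envelope in Python to show how fast 4 GB disappears at 4K; the G-buffer layout and texture-pool size below are assumptions for illustration, not measurements from Forza 7 or any other game:

# Illustrative 4K VRAM budget (assumed numbers, decimal MB/GB)
width, height = 3840, 2160
bytes_per_pixel = 4                                  # RGBA8
target_mb = width * height * bytes_per_pixel / 1e6   # ~33 MB per render target
gbuffer_mb = 5 * target_mb                           # assume a 5-target deferred G-buffer
textures_gb = 3.5                                    # assumed high-res texture/asset pool
total_gb = gbuffer_mb / 1000 + textures_gb
print(round(target_mb), round(gbuffer_mb), round(total_gb, 2))  # 33, 166, 3.67

And that's before shadow maps, geometry and driver overhead, so a 4 GB card ends up swapping over PCIe.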

 

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


1 minute ago, valdyrgramr said:

That depends on various factors, not just VRAM: optimization, which vendor's tech (Nvidia vs. AMD) a game favors, and more. It's simply not true in every situation.

Well, in this case the Fury got something like 9 fps at 4K, so it's probably the case here.


2 minutes ago, valdyrgramr said:

That depends on various factors, not just VRAM: optimization, which vendor's tech (Nvidia vs. AMD) a game favors, and more. It's simply not true in every situation.

For typical AAA sandbox/open-world/large-environment games (an entire race track with lots of detailed cars counts as a large environment), 4 GB of VRAM isn't enough. It's not even enough for a 4K Fire Strike test.

 

Source: I have a Fury X that's OC'd.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


1 minute ago, valdyrgramr said:

 

The main issue with HBM tech is production cost, compounded by per-stack capacity limits.  Other factors can make up for it, but yes, in certain titles and genres it's problematic.  I'm actually more in favor of when they used GDDR5 rather than HBM.  But AMD is known for making bad decisions.  Just saying there are situations where it is perfectly fine for 4K.  However, unless a game is heavily optimized for AMD's tech, then yeah, it's problematic.  I wonder if AMD even optimizes its drivers much in favor of the Furys anymore compared to the Vega cards, because even the 480s were catching up with them in AMD titles.

When you are developing a product, you eventually have to release something so that problems which would only arise in the wild actually get found. HBM lacked capacity per stack, yes, but HBM2 improves on it so much it's crazy: we are seeing two HBM2 stacks with more capacity matching the bandwidth of eleven GDDR5X chips, and that is with lower-than-expected frequencies. Right now the low production volume and the need for an interposer push the price up a lot, but that's exactly why AMD is pushing HBM2: so that it reaches enough volume to eventually become cheaper than GDDR memory or its equivalent.
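For anyone who wants the "two HBM2 stacks match eleven GDDR5X chips" part in numbers, a quick Python sketch using the public spec-sheet figures for Vega 64 and the GTX 1080 Ti:

# Peak bandwidth (GB/s) = bus width in bits / 8 * effective data rate per pin (Gbps)
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

vega64_hbm2   = bandwidth_gbs(2048, 1.89)  # 2 stacks x 1024-bit, 945 MHz DDR -> ~484 GB/s
gtx1080ti_g5x = bandwidth_gbs(352, 11.0)   # 11 chips x 32-bit @ 11 Gbps      -> 484 GB/s
print(vega64_hbm2, gtx1080ti_g5x)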


15 minutes ago, valdyrgramr said:

Wasn't the intent of HBM based memory to give more room for the die and also to reduce power consumption?

High Bandwidth Memory had one purpose, which the name states: more memory bandwidth, and it has achieved that. Reduced chip count and lower power consumption are side effects. There is a limit on die size, so getting more room isn't really relevant. Also, HBM allows smaller boards, and die sizes haven't increased, which indicates there was no intention of using the saved space to enlarge the die.

 

Too bad it's expensive as hell and the interposer complicates things. Otherwise it'd likely replace GDDR.


34 minutes ago, cj09beira said:

When you are developing a product, you eventually have to release something so that problems which would only arise in the wild actually get found. HBM lacked capacity per stack, yes, but HBM2 improves on it so much it's crazy: we are seeing two HBM2 stacks with more capacity matching the bandwidth of eleven GDDR5X chips, and that is with lower-than-expected frequencies. Right now the low production volume and the need for an interposer push the price up a lot, but that's exactly why AMD is pushing HBM2: so that it reaches enough volume to eventually become cheaper than GDDR memory or its equivalent.

 

30 minutes ago, valdyrgramr said:

Wasn't the intent of HBM based memory to give more room for the die and also to reduce power consumption?

More PCB space. AMD is also researching HBM for an enterprise APU solution: they have a white paper detailing an APU with 8 CPU CCXs and 8 GPU dies, each with its own HBM stacks. The target is 100 TFLOPs per package, and they calculate it would actually exceed that (leaving headroom for thermals).

 

It will also help with custom SoC designs like future consoles.

 

Potential examples (if HBM becomes affordable): an SoC with HBM for a gaming ultrabook, a portable console (like the Switch), or a smaller Xbox/PlayStation.
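Back-of-the-envelope in Python, using only the numbers quoted above (so a sanity check on the scale, not the paper's actual breakdown):

# 100 TFLOPs-per-package target split over 8 GPU chiplets
target_tflops = 100
gpu_chiplets = 8
print(target_tflops / gpu_chiplets)  # 12.5 TFLOPs per chiplet, i.e. roughly a full Vega 64 (~12.7 TFLOPs FP32) each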

if you want to annoy me, then join my teamspeak server ts.benja.cc


1 hour ago, valdyrgramr said:

Wasn't the intent of HBM based memory to give more room for the die and also to reduce power consumption?

Yes, it does both.

 

54 minutes ago, The Benjamins said:

 

More PCB space. AMD is also researching HBM for an enterprise APU solution: they have a white paper detailing an APU with 8 CPU CCXs and 8 GPU dies, each with its own HBM stacks. The target is 100 TFLOPs per package, and they calculate it would actually exceed that (leaving headroom for thermals).

 

It will also help with custom SoC designs like future consoles.

 

Potential examples (if HBM becomes affordable): an SoC with HBM for a gaming ultrabook, a portable console (like the Switch), or a smaller Xbox/PlayStation.

Yeah, HBM has great potential; it's a much better approach to memory. That APU is crazy :)


It doesn't beat it at 4K, which is what these high-end 600+ cards are meant for. If you game at 1080p, the most you'd really want is a 1070; anything above that performance level is a waste. AMD, make something that competes with the 1080 Ti at a resolution we'd actually use these high-end cards at, please and thank you. If I wanted 1080p gaming I'd have stuck with the MSI 1060 6GB I had.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


3 minutes ago, MadyTehWolfie said:

It doesn't beat it at 4K, which is what these high-end 600+ cards are meant for. If you game at 1080p, the most you'd really want is a 1070; anything above that performance level is a waste. AMD, make something that competes with the 1080 Ti at a resolution we'd actually use these high-end cards at, please and thank you. If I wanted 1080p gaming I'd have stuck with the MSI 1060 6GB I had.

still has a lot better mins :P 


4 minutes ago, cj09beira said:

still has a lot better mins :P 

Don't really care about that, and the extra power/heat isn't worth it. Though to be honest, from my gameplay of Forza 7 I don't think it's really that well optimized; I get random frame drops below 60 fps at max settings at 3440x1440. Either way, the whole class and division restrictions kinda killed it for me. :/

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


1 minute ago, MadyTehWolfie said:

Don't really care about that, and the extra power/heat isn't worth it. Though to be honest, from my gameplay of Forza 7 I don't think it's really that well optimized; I get random frame drops below 60 fps at max settings at 3440x1440. Either way, the whole class and division restrictions kinda killed it for me. :/

Wait, you don't care about minimum frame rates, yet your complaint about Forza 7 is frame drops below 60 fps? Not that it goes against your feeling that the game isn't actually that well optimized; I just thought that was a bit counter to what you were saying.


Just now, leadeater said:

Wait, you don't care about minimum frame rates, yet your complaint about Forza 7 is frame drops below 60 fps? Not that it goes against your feeling that the game isn't actually that well optimized; I just thought that was a bit counter to what you were saying.

I care about it dipping below 60. I don't care if one GPU has a minimum fps a few fps higher than another. The dips recur at the exact same points on the tracks every time, so it doesn't seem like a GPU problem but an optimization problem. Average fps is more important to me than min or max fps.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


4 hours ago, MadyTehWolfie said:

I care about it dipping below 60. I don't care if one GPU has a minimum fps a few fps higher than another. The dips recur at the exact same points on the tracks every time, so it doesn't seem like a GPU problem but an optimization problem. Average fps is more important to me than min or max fps.

I don't get dips on a 470 at 2560x1440.

if you want to annoy me, then join my teamspeak server ts.benja.cc


17 hours ago, michaelocarroll007 said:

A lot more people than you think buy a 1080, a $600 GPU, for 1080p 144 Hz, as there are a lot of games the 1080 can't max out at that. Some just like it that way. I'm personally a fan of higher resolution over FPS, but that's just me.

 

You're right that 1080p is going to be dying sometime in the future. People will jump straight to 4K, as 4K monitors have been dropping a lot in price; you can get 27-inch IPS 4K panels for around $350, and GPUs have been doing okay at 4K. Once we get something like a 70-series card that can run 4K at high settings, I think adoption will jump.

I doubt 1080p is dying; as I mentioned somewhere, it's all to do with PRICING.

1080p 60 Hz monitors still cost around £100 or more and have for a few years, so unless 1440p or 4K gets to that price point, 1080p isn't gonna die. Monitor pricing hasn't come down for years. It's all about price/benefit and the economy: the economy hasn't been good, prices are where they have been for quite some years, and with Nvidia always increasing its prices, people aren't gonna be able to afford better monitors.

 

With laptops and portables being the norm, many of these have a 1080p resolution or less at the lower price point.

 

Compact LED projectors have also dropped in price. For £300 you can get a decent compact LED projector that displays 1080p, and projectors can actually offer much higher refresh rates for less money. Besides, gaming with a projector is much nicer, as it actually has lower response times and costs less than a screen of the size you can project in a room. LED projectors are already good enough that you can just project onto a wall, and decent ones come with colour correction for non-white walls.

 

If you think 4:3 has died out, you're quite wrong; I still have and use 4:3 monitors. I have an old flat CRT with a decent resolution and an LCD at 1280x960 that I use alongside my 1080p 60 Hz monitor, and I use the CRT for my servers when needed, which is rare.

 

So really, there's no such thing as resolutions and monitors dying; it's all about lifespan and costs. Costs are still the same, with £100 being the mainstream norm for a monitor, and 4K monitors, and even 1440p ones, are still out of reach for many budgets, so don't expect the higher-resolution monitors to become the norm soon. Not with important components like RAM and storage going up in price, Nvidia raising prices bit by bit, and RAM vendors using supply and demand to raise prices rather than lower them. With all these components being more costly, people can't afford pricier monitors, and as I keep saying, monitor prices haven't come down enough yet for mainstream adoption, so people will reuse their old monitors or buy lower-resolution or cheaper ones that are good enough for them.

 

 

To add insult to injury, I have Titan XPs and a Vega FE, and I game with them at 1080p 60 fps :P. You guys shouldn't really complain about the choice of components. The extra cost of the Titan and Vega benefits me more than paying for a 4K display and stepping down to a GTX 1070, GTX 1080 or Vega 56, because I do most of my work on my laptop and use my various PCs remotely for extra processing and storage, saving power and giving me portability and flexibility.

 


10 hours ago, The Benjamins said:

I don't get dips on a 470 at 2560x1440.

Nvidia or AMD? Could be a DX12 issue for Nvidia, or else the increased resolution of 3440x1440 and the graphical effects in that part of the track (particle effects and such). Idk, I shouldn't be getting drops below 60 fps, period. Benchmarks have it at 90-ish fps at 4K, and 3440x1440 usually gets me 10-15 fps more than gaming at 4K.
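The 10-15 fps gap lines up roughly with the raw pixel counts; quick Python sketch, simple arithmetic rather than anything measured:

# 3440x1440 ultrawide vs 3840x2160 (4K UHD)
uw_pixels  = 3440 * 1440   # ~4.95 million
uhd_pixels = 3840 * 2160   # ~8.29 million
print(uw_pixels / uhd_pixels)  # ~0.60 -> ultrawide pushes about 60% of the pixels of 4K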

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


3 minutes ago, MadyTehWolfie said:

Nvidia or AMD? Could be a DX12 issue for Nvidia, or else the increased resolution of 3440x1440 and the graphical effects in that part of the track (particle effects and such). Idk, I shouldn't be getting drops below 60 fps, period. Benchmarks have it at 90-ish fps at 4K, and 3440x1440 usually gets me 10-15 fps more than gaming at 4K.

It's an RX 470, so AMD. I have a 3440x1440 monitor, but my game will only do 16:9, so it runs at 2560x1440. I have the game set to that resolution with a target FPS of 60 and the rest of the settings on dynamic. Out of the gate I was getting a smooth experience, which is surprising; I thought my GPU power was low for the game, and I sometimes get stutters in other games due to my 100+ tabs in Chrome.

It feels very optimized on my GPU.

if you want to annoy me, then join my teamspeak server ts.benja.cc


On 10/1/2017 at 3:47 AM, TheOriginalHero said:

If this is true, then Vega 11 GPUs are going to be amazing for the midrange consumer in DX12.

No, they won't, because there are two fundamental problems with Vega and AMD cards in general, shown below:

On 10/1/2017 at 7:21 AM, cj09beira said:

Still would like to know what is holding Vega back. :/

It's starved for bandwidth.

 

AMD's drivers are terrible at keeping the core filled with work, because they're not developing an architecture for today, they're developing an architecture for tomorrow (i.e. "we can't make drivers, so let's just have the game engine sort that out for us!"), which will be underperforming and dead by the time tomorrow rolls around.

.


6 hours ago, System Error Message said:

I doubt 1080p is dying; as I mentioned somewhere, it's all to do with PRICING.

1080p 60 Hz monitors still cost around £100 or more and have for a few years, so unless 1440p or 4K gets to that price point, 1080p isn't gonna die. Monitor pricing hasn't come down for years. It's all about price/benefit and the economy: the economy hasn't been good, prices are where they have been for quite some years, and with Nvidia always increasing its prices, people aren't gonna be able to afford better monitors.

 

With laptops and portables being the norm, many of these have a 1080p resolution or less at the lower price point.

 

Compact LED projectors have also dropped in price. For £300 you can get a decent compact LED projector that displays 1080p, and projectors can actually offer much higher refresh rates for less money. Besides, gaming with a projector is much nicer, as it actually has lower response times and costs less than a screen of the size you can project in a room. LED projectors are already good enough that you can just project onto a wall, and decent ones come with colour correction for non-white walls.

 

If you think 4:3 has died out, you're quite wrong; I still have and use 4:3 monitors. I have an old flat CRT with a decent resolution and an LCD at 1280x960 that I use alongside my 1080p 60 Hz monitor, and I use the CRT for my servers when needed, which is rare.

 

So really, there's no such thing as resolutions and monitors dying; it's all about lifespan and costs. Costs are still the same, with £100 being the mainstream norm for a monitor, and 4K monitors, and even 1440p ones, are still out of reach for many budgets, so don't expect the higher-resolution monitors to become the norm soon. Not with important components like RAM and storage going up in price, Nvidia raising prices bit by bit, and RAM vendors using supply and demand to raise prices rather than lower them. With all these components being more costly, people can't afford pricier monitors, and as I keep saying, monitor prices haven't come down enough yet for mainstream adoption, so people will reuse their old monitors or buy lower-resolution or cheaper ones that are good enough for them.

 

 

To add insult to injury, I have Titan XPs and a Vega FE, and I game with them at 1080p 60 fps :P. You guys shouldn't really complain about the choice of components. The extra cost of the Titan and Vega benefits me more than paying for a 4K display and stepping down to a GTX 1070, GTX 1080 or Vega 56, because I do most of my work on my laptop and use my various PCs remotely for extra processing and storage, saving power and giving me portability and flexibility.

 

When I say dying, I mean it will be at a much lower share than today or in past years; it will stop growing and slowly decline into irrelevance. Yes, you're right that the reason 1080p is dominant now is price, and 4K monitors are dropping fast; they're basically cheaper than 1440p now, setting aside high-refresh-rate models. Just like people thought 720p would stay popular due to high 1080p monitor costs: prices drop and the newer resolution takes over. 4K is already 5x cheaper than just a few years ago and better looking; monitor pricing has come down, and so have TVs, by large margins on the 4K side. It used to be over $1500 for a decent one; now they're $400-ish, and you can find deals in the $250-350 range if you look around. There will be budget 4K panels around the $200 mark in the next two years, and I think you'll see the shift happen largely then, and slowly until then.

 

Yes, 4:3 and 720p still exist today, but I personally consider them pretty dead overall. They're not growing, and there are always those who just don't change or straggle behind in updating. I'd consider Windows XP dead, but people still use that too.

 

TVs and desktop 4K will continue to grow and eventually outpace 1080p. I can see laptops taking a while to gain traction overall, though, as a 13-inch 1080p panel has decent PPI, so you don't really need or benefit as much from 4K there.


10 minutes ago, michaelocarroll007 said:

When I say dying, I mean it will be at a much lower share than today or in past years; it will stop growing and slowly decline into irrelevance. Yes, you're right that the reason 1080p is dominant now is price, and 4K monitors are dropping fast; they're basically cheaper than 1440p now, setting aside high-refresh-rate models. Just like people thought 720p would stay popular due to high 1080p monitor costs: prices drop and the newer resolution takes over. 4K is already 5x cheaper than just a few years ago and better looking; monitor pricing has come down, and so have TVs, by large margins on the 4K side. It used to be over $1500 for a decent one; now they're $400-ish, and you can find deals in the $250-350 range if you look around. There will be budget 4K panels around the $200 mark in the next two years, and I think you'll see the shift happen largely then, and slowly until then.

 

Yes, 4:3 and 720p still exist today, but I personally consider them pretty dead overall. They're not growing, and there are always those who just don't change or straggle behind in updating. I'd consider Windows XP dead, but people still use that too.

 

TVs and desktop 4K will continue to grow and eventually outpace 1080p. I can see laptops taking a while to gain traction overall, though, as a 13-inch 1080p panel has decent PPI, so you don't really need or benefit as much from 4K there.

What I was getting at is that people who already have monitors aren't likely to upgrade. Using myself as an example, I already have three 1080p monitors and older monitors too, so I have too many and don't see any need for 4K yet.

 

But once 4K gets to the price of 1080p, and if I do need a monitor, that's when I'd buy one. Otherwise, even for gaming, I don't see much need for it, as I use my laptop for most things, and that 15-inch screen isn't going to benefit from being 4K if it costs a lot just to get it.

 

So new buyers are still gonna buy 1080p until the price of 4K drops to where 1080p is now. The other thing to consider is the price of other components: if they keep increasing as they have (Intel, Nvidia, RAM, etc.), then it will significantly delay 4K adoption.


18 hours ago, Sakkura said:

Take a look at the Steam hardware survey. It's going to take a long, long time for 1080p to die.

 

Currently 1080p is by far the most common resolution, but even more interesting is that the next 4 on the list are lower than 1080p. 1440p is the most common resolution above 1080p, but accounts for just 2.68% with a growth of 0.02%. 4K is at 0.79% and actually decreased in popularity.

 

And get this - the fastest-growing resolution, by far, is 1080p. It grew by 2.51%, nearly as much as the entire existing share of 1440p monitors.

It will take a long time for it to die, yes, but it will start to decrease just like 720p and 4:3 did years ago. In the next two years or so I think the shift will start to happen, as midrange GPUs become able to run 4K and budget 4K monitors start to cross into the $200 range. They have dropped drastically in price over the last few years and GPUs have been getting much stronger; the 700 series and even the 900 series struggled with 4K, and the monitors were above $1500. It is going to become much more viable in the near future, 1-2 years from now, and it's currently becoming viable for the enthusiast: a 1080 or 1080 Ti and a $500 monitor will play 4K pretty decently now.


On 10/1/2017 at 2:01 AM, Master Disaster said:

Amazing job, AMD: your top-of-the-line GPU manages to smash everything at a 10-year-old resolution.

 

All hail the new 1080p king.

You gotta give AMD kudos for trying, though...

 

They also gave us the RX 480. Remember that card?

Indus Monk = Indian+ Buddhist


2 minutes ago, michaelocarroll007 said:

It will take a long time for it to die, yes, but it will start to decrease just like 720p and 4:3 did years ago. In the next two years or so I think the shift will start to happen, as midrange GPUs become able to run 4K and budget 4K monitors start to cross into the $200 range. They have dropped drastically in price over the last few years and GPUs have been getting much stronger; the 700 series and even the 900 series struggled with 4K, and the monitors were above $1500. It is going to become much more viable in the near future, 1-2 years from now, and it's currently becoming viable for the enthusiast: a 1080 or 1080 Ti and a $500 monitor will play 4K pretty decently now.

The bulk of people are using laptops or sub-$150 monitors.


1 hour ago, AlwaysFSX said:

No, they won't, because there are two fundamental problems with Vega and AMD cards in general, shown below:

It's starved for bandwidth.

 

AMD's drivers are terrible at keeping the core filled with work, because they're not developing an architecture for today, they're developing an architecture for tomorrow (i.e. "we can't make drivers, so let's just have the game engine sort that out for us!"), which will be underperforming and dead by the time tomorrow rolls around.

AMD cards usually do the opposite of becoming slower with time. And before memory starvation there are other bottlenecks, like having the same number of ROPs, etc., as a 290X while packing twice the compute.


3 hours ago, The Benjamins said:

It's an RX 470, so AMD. I have a 3440x1440 monitor, but my game will only do 16:9, so it runs at 2560x1440. I have the game set to that resolution with a target FPS of 60 and the rest of the settings on dynamic. Out of the gate I was getting a smooth experience, which is surprising; I thought my GPU power was low for the game, and I sometimes get stutters in other games due to my 100+ tabs in Chrome.

It feels very optimized on my GPU.

You do know the game lowers and raises in-game assets to keep the frame rate up; that =/= good optimization. Also, I don't have anything set to dynamic and I'm playing at a higher resolution. If you have resolution set to dynamic as well, it'll lower that too. Turn off the dynamic settings and see what happens. AMD also performs better in DX12, which may be a factor as well.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.
