AMD's Vega architecture previewed at ve.ga

captain cactus
13 hours ago, Energycore said:

I think there's an option to change all voltages to the same? Otherwise what a chore.

Don't OC with Wattman, use some other program :P

I'll try :D (just remembered how you need to tick something to unlock voltages in Afterburner...whoops)


1 hour ago, Kumaresh said:

So far, I have only used C++ as a student at a very basic level, so I don't think using namespace std would be a big problem for me. If I become a programmer in a professional capacity, I will get used to it. I use arrays, stacks, queues, linked lists, classes, files and a little bit of graphics in C++; I'm still a newbie xD I have only compiled single-file programs with standard libraries so far....

Lol, it's not like I'm Bill Gates or anything hahaha. It's just that good practices are best learned early. Once you get used to something, you can't easily let it go ;) *holds back tears* lol
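Speaking of habits worth learning early: one of them is qualifying names with std:: instead of writing using namespace std. A minimal sketch of the difference (the snippet is just an illustration I made up, not anyone's real code):

    #include <iostream>
    #include <string>
    #include <vector>

    // Writing std:: explicitly keeps it obvious where each name comes
    // from and avoids collisions when another header defines its own
    // count, distance, swap, etc.
    int main() {
        std::vector<std::string> cards = {"RX 480", "GTX 1060"};
        for (const std::string& card : cards) {
            std::cout << card << '\n';
        }
        return 0;
    }

With using namespace std, any of those unqualified names could silently collide with something else in scope once a project grows past a single file.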


Ah, I see the post in the GPU section about the same thing has been merged with this one. I was wondering why I suddenly saw posts appear on pages 5-6 of this thread that weren't there before.

 

Thanks mods.


18 hours ago, Princess Cadence said:

I actually agree with nVidia making SLI available only on the 70- and 80-class cards, because it has always been an imperfect technology with a lot of problems for both ends, consumer and manufacturer. Going with a single higher-end GPU, like buying a 1070/1080 for the price you'd pay to SLI two 1060s, seems to me like the smartest thing.

Also, the 1060 6GB does have more raw power than the RX 480. The day Nvidia fixes its DX12 drivers, which might happen after some of these many incoming releases, the tables will turn completely, and AMD will likely depend on this new product to keep its share of the market... If it's just a very high-end, high-priced deal that still doesn't reach the 1080, they will find themselves in trouble finding market space.

Now, I have no reason not to want to see AMD succeed in this, for the competition will make nVidia better. But let's be honest: especially with Ryzen, AMD has been all about over-hyping without any solid backing so far.

@CostcoSamples I don't agree with you. I was using a GTX 560 Ti until a little while ago, and its performance in titles like CS:GO, League of Legends and the like was 100+ fps with everything maxed out. It did well enough in a lot of games, being able to almost max out even the latest 2013 titles. nVidia's old cards are as solid as AMD's old ones.

The 1060 does not have more raw power than the 480. The latest benchmarks show them to be equal in DX11, and the 480 is 6-8% faster in DX12. In Doom under Vulkan we saw the 480 gain a whopping 40%! The reason for the massive increase with better APIs has to do with AMD's high driver overhead in DX11, which eats into frame rates. DX12 and Vulkan allow game developers to access GPU resources more directly, which sidesteps a lot of the problems AMD suffered in DX11. In other words, when you take out the driver/API overhead, we see that the 480 has much higher performance than the 1060. At this point the number of DX12 or Vulkan games is very limited, but as they become more common we will see the 480 age like a fine wine, getting better with time. Nvidia doesn't have this problem and sees little or no gain in performance from DX12 or Vulkan. And the lack of gain is not indicative of broken drivers; rather, Nvidia cards are fairly well optimized in DX11, and their design just doesn't gain much from reduced driver/API overhead.
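To make the overhead argument concrete, here is a rough back-of-the-envelope frame-time model. The millisecond figures are invented for illustration, not measured from either card:

    #include <iostream>

    int main() {
        // Hypothetical per-frame costs, serialized for simplicity; real
        // drivers overlap CPU and GPU work, so treat this as a rough model.
        const double gpu_ms = 12.0;        // time the GPU spends rendering a frame
        const double driver_dx11_ms = 4.0; // assumed CPU-side driver cost under DX11
        const double driver_vk_ms = 1.0;   // assumed driver cost under Vulkan

        std::cout << "DX11:   " << 1000.0 / (gpu_ms + driver_dx11_ms) << " fps\n"; // ~62 fps
        std::cout << "Vulkan: " << 1000.0 / (gpu_ms + driver_vk_ms) << " fps\n";   // ~77 fps
        // Shaving 3 ms of driver overhead yields ~23% more fps without the
        // GPU itself getting any faster.
        return 0;
    }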

 

I also have a 560 Ti (still use it in my secondary PC) and it runs most games very well, with reduced settings as needed. BUT if you go back to around 2008, we saw AMD cards that were far superior to the Nvidia equivalents, yet AMD could not sell as many! Look at the HD 4870/4850, for example. My point is that Nvidia has stronger branding. Gamers will often buy Nvidia because they FEEL like it's better, because it's Nvidia, even if it's objectively worse.

 

AMD's Vega could turn out to be objectively superior to anything Nvidia offers and yet still be outsold.


7 hours ago, MadyTehWolfie said:

Hate on SLI is mostly justified, but it also depends on the games you play. At most you're only going to get a 50% increase, and that's the best-case scenario. Some games become unplayable with SLI... cough cough, Deus Ex: Mankind Divided. In my experience most games see a meh improvement with SLI, and very few get the full 50% increase. It's always better to just buy a card from the next gen than to double up. A trend with Nvidia is that their xx70 cards usually pack the same punch as the previous year's Titan model. We've seen it with the 970 and with the 1070, and Volta will be more of the same, if not slightly better.

So you don't SLI...

 

I do, and have Deus Ex. No issues. 


23 minutes ago, CostcoSamples said:

I also have a 560 Ti (still use it in my secondary PC) and it runs most games very well, with reduced settings as needed. BUT if you go back to around 2008, we saw AMD cards that were far superior to the Nvidia equivalents, yet AMD could not sell as many! Look at the HD 4870/4850, for example. My point is that Nvidia has stronger branding. Gamers will often buy Nvidia because they FEEL like it's better, because it's Nvidia, even if it's objectively worse.

AMD's Vega could turn out to be objectively superior to anything Nvidia offers and yet still be outsold.

Last time I said anything remotely close to this, I got called a fanboy, even though the proof was right there on the internet.


26 minutes ago, CostcoSamples said:

The latest benchmarks show them to be equal in DX11

[Citation Needed]

I haven't seen any extensive testing of this but I am very interested.

 

26 minutes ago, CostcoSamples said:

In Doom Vulkan we saw the 480 gain a whopping 40%!

[Citation Needed] and what was the actual FPS like?

I remember a thread not too long ago that claimed that the 480 beat the 1060 by 16% in DirectX, and as it turned out the 1060 won at 1080p and the 16% number was taken from a 4K benchmark where neither card could run the game (it was 23 FPS vs 26 FPS).

So just a heads up to everyone: percentages can be very misleading, and you need to consider more than one resolution when saying "X is better than Y".
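As a quick illustration of that point, here's the arithmetic (a sketch; note that the 23 vs 26 FPS quoted above actually works out to about 13%, which is itself a reason to always check the raw numbers behind a percentage):

    #include <iostream>

    // Percentage gain of `result` over `baseline`.
    double percent_gain(double baseline, double result) {
        return (result - baseline) / baseline * 100.0;
    }

    int main() {
        // A 3 fps lead at unplayable 4K frame rates still prints as a
        // double-digit "win" when quoted as a percentage.
        std::cout << percent_gain(23.0, 26.0) << "%\n"; // ~13%
        return 0;
    }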


58 minutes ago, LAwLz said:

[Citation Needed]

I haven't seen any extensive testing of this but I am very interested.

 

[Citation Needed] and what was the actual FPS like?

I remember a thread not too long ago that claimed that the 480 beat the 1060 by 16% in DirectX, and as it turned out the 1060 won at 1080p and the 16% number was taken from a 4K benchmark where neither card could run the game (it was 23 FPS vs 26 FPS).

So just a heads up to everyone: percentages can be very misleading, and you need to consider more than one resolution when saying "X is better than Y".

Citation 1: testing by Hardware Canucks, which I already posted.

 

Citation 2: testing by AdoredTV, Doom OpenGL vs Vulkan at 1080p. Similar results have been duplicated by others, with variation depending on what part of the game they used to test.


20 hours ago, TheRandomness said:

I wonder what a Fury X with GDDR5 and 1024 more shaders (removing the HBM and interposer would free up the 'budget' to do that, no?) would've been like...

It would have become too big to work, and the power draw would be way worse, because HBM uses less energy than GDDR5. But I have an idea: two dies and HBM stacks on an interposer, where each die has half the CUs and one is the master that controls both. That way we would get the benefits of lower costs and higher maximum performance, with none of the bad mumbo jumbo that comes with Crossfire.


1 minute ago, cj09beira said:

It would have become too big to work, and the power draw would be way worse, because HBM uses less energy than GDDR5. But I have an idea: two dies and HBM stacks on an interposer, where each die has half the CUs and one is the master that controls both. That way we would get the benefits of lower costs and higher maximum performance, with none of the bad mumbo jumbo that comes with Crossfire.

Well, this is AMD we're talking about; they didn't seem to care about power xD

But still, the performance would've been quite good c:


2 minutes ago, CostcoSamples said:

Citation 1: testing by Hardware Canucks, which I already posted.

Sorry, didn't feel like reading through over 150 replies.

Seems like you're right. For DX11 they are very even. Both cards win a few games, lose in some, and are even in the rest.

 

 

6 minutes ago, CostcoSamples said:

Citation 2: testing by AdoredTV, Doom OpenGL vs Vulkan at 1080p. Similar results have been duplicated by others, with variation depending on what part of the game they used to test.

AdoredTV is one of the biggest AMD fanboys I have ever seen. How anyone trusts anything he says is beyond me. I will gladly look at other reviews showing the same results though.

The 18% improvement HardwareCanucks got sounds way more reasonable than 40%.

 

When one reviewer sees an 18% improvement, and someone else claims 40%, then something fishy is going on (and my guess is that AdoredTV is the one being deliberately misleading).


1 hour ago, App4that said:

So you don't SLI...

 

I do, and have Deus Ex. No issues. 

I did. I had two Titan Xs and two 1080s at one point. Deus Ex had a freak show when I enabled SLI.


Just now, MadyTehWolfie said:

I did. I had two Titan Xs and two 1080s at one point. Deus Ex had a freak show when I enabled SLI.

Well, there are a few possibilities. Your cards weren't linked software-wise, so they didn't match frequencies. You had different models of cards. You used a shiet flex bridge rather than a hard bridge.

 

Or, more likely, you tried to SLI before the profile came out and didn't use Nvidia Inspector.

 

But please, don't bad-mouth something because of your own experience. Ask for help, and let those who asked for help when they had problems help you.


12 minutes ago, cj09beira said:

It would have become too big to work, and the power draw would be way worse, because HBM uses less energy than GDDR5. But I have an idea: two dies and HBM stacks on an interposer, where each die has half the CUs and one is the master that controls both. That way we would get the benefits of lower costs and higher maximum performance, with none of the bad mumbo jumbo that comes with Crossfire.

 

10 minutes ago, TheRandomness said:

Well, this is AMD we're talking about; they didn't seem to care about power xD

But still, the performance would've been quite good c:

Yo.

 

These are AMD's plans for Navi. Scalable GPUs means that instead of one big fat 4096-SP GPU like a Fury X, we'd have two 2048-SP GPUs (like two RX 470s) on one interposer, sharing all the memory in a master-slave setup. That way the cost is lower because of the smaller die sizes (and thus better yields), but you still have a high-end GPU in your system.
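Roughly, as a toy model of the master-slave idea (everything here is invented to illustrate the speculation, not how any real AMD part works):

    #include <iostream>
    #include <vector>

    // One die (the master) owns the scheduling logic and splits each
    // frame's work across every active die, so software only ever
    // sees a single GPU.
    struct Die {
        int shader_processors; // hypothetical SP count per die
        bool active = true;
    };

    struct MasterDie {
        std::vector<Die> dies; // dies[0] is the master itself

        void render_frame(int work_units) {
            int active = 0;
            for (const Die& d : dies)
                if (d.active) ++active;
            std::cout << "dispatching " << work_units / active
                      << " work units to each of " << active << " dies\n";
        }
    };

    int main() {
        MasterDie gpu{{{2048}, {2048}}}; // two 2048-SP dies, one package
        gpu.render_frame(4096);          // looks like one 4096-SP GPU
        return 0;
    }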


5 minutes ago, LAwLz said:

Sorry, didn't feel like reading through over 150 replies.

Seems like you're right. For DX11 they are very even. Both cards win a few games, lose in some, and are even in the rest.

 

 

AdoredTV is one of the biggest AMD fanboys I have ever seen. How anyone trusts anything he says is beyond me. I will gladly look at other reviews showing the same results though.

The 18% improvement HardwareCanucks got sounds way more reasonable than 40%.

 

When one reviewer sees an 18% improvement, and someone else claims 40%, then something fishy is going on (and my guess is that AdoredTV is the one being deliberately misleading).

Here are some reviews:

Hardware Unboxed: 480 gains 35.9%

https://www.hardwareunboxed.com/doom-vulkan-vs-opengl-benchmark-the-tide-turning-in-amds-favour/

GamersNexus: 480 gains 30%

http://www.gamersnexus.net/game-bench/2510-doom-vulkan-vs-opengl-benchmark-rx-480-gtx-1080

PC Gamer: 480 gains 26.8% on the averages and 30.6% on the minimums

http://www.pcgamer.com/doom-benchmarks-return-vulkan-vs-opengl/2/

Bottom line:

it's more than 18% :-P


1 minute ago, App4that said:

Well, there are a few possibilities. Your cards weren't linked software-wise, so they didn't match frequencies. You had different models of cards. You used a shiet flex bridge rather than a hard bridge.

 

Or, more likely, you tried to SLI before the profile came out and didn't use Nvidia Inspector.

 

But please, don't bad-mouth something because of your own experience. Ask for help, and let those who asked for help when they had problems help you.

Lol, both cards were ordered at the same time, with the same device ID and the same brand, with an EVGA hard bridge. You should do a quick Google search, as there are a crap ton of people who suffered SLI issues in that game. I wasn't asking for help; I know the facts about SLI and its shortcomings, and it does in fact run like poop in Deus Ex for a large number of people. You can do all the mental gymnastics you want; it won't prove your side of the argument. A single more powerful card is always preferable to SLI. While it worked in most of my games, it didn't give a huge bump in fps in most titles, hence why it's not really worth the price for maybe a 30% boost in fps if you're lucky, and 50% on the off chance the devs put really good SLI support in it.


4 minutes ago, lots of unexplainable lag said:

 

Yo.

 

These are AMD's plans for Navi. Scalable GPUs means that instead of one big fat 4096-SP GPU like a Fury X, we'd have two 2048-SP GPUs (like two RX 470s) on one interposer, sharing all the memory in a master-slave setup. That way the cost is lower because of the smaller die sizes (and thus better yields), but you still have a high-end GPU in your system.

Now imagine if the master could control more than one slave; it could get really powerful really fast :-)

(I know, Navi would do that)


1 minute ago, MadyTehWolfie said:

Lol, both cards were ordered at the same time, with the same device ID and the same brand, with an EVGA hard bridge. You should do a quick Google search, as there are a crap ton of people who suffered SLI issues in that game. I wasn't asking for help; I know the facts about SLI and its shortcomings, and it does in fact run like poop in Deus Ex for a large number of people. You can do all the mental gymnastics you want; it won't prove your side of the argument. A single more powerful card is always preferable to SLI. While it worked in most of my games, it didn't give a huge bump in fps in most titles, hence why it's not really worth the price for maybe a 30% boost in fps if you're lucky, and 50% on the off chance the devs put really good SLI support in it.

So you ignore any evidence offered and go purely based on beliefs... How unfortunate for you.

SLI works amazingly if configured correctly. This forum is filled with individuals, like myself, who run SLI and Crossfire daily. My two 980 Tis beat a Pascal Titan X, and even including the cost of the first 980 Ti back when it was the shizle, I haven't spent the 1,200 dollars Nvidia wants for a Titan X.

So, I'm asking again. Don't bad-mouth something you obviously don't understand.


1 minute ago, App4that said:

So you ignore any evidence offered and go purely based on beliefs... How unfortunate for you.

SLI works amazingly if configured correctly. This forum is filled with individuals, like myself, who run SLI and Crossfire daily. My two 980 Tis beat a Pascal Titan X, and even including the cost of the first 980 Ti back when it was the shizle, I haven't spent the 1,200 dollars Nvidia wants for a Titan X.

So, I'm asking again. Don't bad-mouth something you obviously don't understand.

Are you stupid? I just told you I've had multiple cards, twice before; I'm not talking without experience. I never said it was shit in all games; it's just not worth the usual 30% bump to deal with the games that run like shit with it or don't support it at all. It's far better to invest in a better single card than in multiple cards, unless you're calling Linus a liar too, as he also says this. SLI CAN work amazingly, but most of the time you're getting less than a 50% bump in fps, hence it's not worth it when you factor in the games that either don't support it or have issues with it (some get a patch with SLI profiles, some don't give a shit). Honestly, I'm not bad-mouthing it; it seems you're fanboying pretty hard, mate.

 

Sorry to say, but the facts are on my side more than yours.

 

Linus's word/personal experience with SLI and Crossfire at one point < your fanboy mental gymnastics


12 minutes ago, lots of unexplainable lag said:

 

Yo.

 

These are AMD's plans for Navi. Scalable GPUs means that instead of one big fat 4096-SP GPU like a Fury X, we'd have two 2048-SP GPUs (like two RX 470s) on one interposer, sharing all the memory in a master-slave setup. That way the cost is lower because of the smaller die sizes (and thus better yields), but you still have a high-end GPU in your system.

Wouldn't that essentially be just Crossfire on an interposer?


4 minutes ago, TheRandomness said:

Wouldn't that essentially be just Crossfire on an interposer?

Except games and other applications would see it as one GPU, pretty much eliminating the problems of both Crossfire and SLI.

 

You'd still have one heatsink, just with a split GPU underneath it, one part of which also carries the control logic for managing the two CU clusters (being the master in the master-slave configuration). That also simplifies heatsink design: instead of two vapour chambers/two sets of heatsinks, you'd just have one. The managing of GPU resources would be done at a BIOS/driver level, not on a case-by-case game level.


26 minutes ago, lots of unexplainable lag said:

Except games and other applications would see it as one GPU, pretty much eliminating the problems of both Crossfire and SLI.

You'd still have one heatsink, just with a split GPU underneath it, one part of which also carries the control logic for managing the two CU clusters (being the master in the master-slave configuration). That also simplifies heatsink design: instead of two vapour chambers/two sets of heatsinks, you'd just have one. The managing of GPU resources would be done at a BIOS/driver level, not on a case-by-case game level.

GPU resources would be managed mostly by the GPU itself, as happens now with any GCN GPU.


11 minutes ago, cj09beira said:

GPU resources would be managed mostly by the GPU itself, as happens now with any GCN GPU.

True. But now you have an opportunity where, if you don't need all the power, you can switch off half the GPU and save on power/heat output, which is something you'd do through drivers. And why stop at halving the GPU when 4x1024 is also possible?

 

But hey, we're not seeing this with Vega. This whole scaling thing is for Navi, and that's 2018 stuff. We're a looooong way from 2018.
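In that toy model from earlier, this kind of power gating could be as simple as the driver flipping a die's active flag. Again purely hypothetical, and it assumes the Die/MasterDie types sketched above:

    #include <cstddef>

    // Gate the slave dies when full power isn't needed. render_frame
    // already counts only active dies, so dispatch adapts automatically.
    void set_low_power(MasterDie& gpu, bool low_power) {
        for (std::size_t i = 1; i < gpu.dies.size(); ++i) // dies[0] is the master; keep it on
            gpu.dies[i].active = !low_power;
    }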


1 minute ago, lots of unexplainable lag said:

True. But now you have an opportunity where, if you don't need all the power, you can switch off half the GPU and save on power/heat output, which is something you'd do through drivers. And why stop at halving the GPU when 4x1024 is also possible?

 

But hey, we're not seeing this with Vega. This whole scaling thing is for Navi, and that's 2018 stuff. We're a looooong way from 2018.

True. But this is way more fun to talk about :-)

It seems like it might be a good idea to have the controller as a separate chip, so that you don't need two different dies to make a GPU, unless the controller on one of them could be turned off.

They will probably go with the latter; we will see.


1 hour ago, cj09beira said:

True. But this is way more fun to talk about :-)

It seems like it might be a good idea to have the controller as a separate chip, so that you don't need two different dies to make a GPU, unless the controller on one of them could be turned off.

They will probably go with the latter; we will see.

Speculating is fun, but it's also completely useless and may actually disappoint people if their expectations are set too high.

 

So we'll leave it at that. All we know about Navi is that AMD wants it to scale well and have this "NextGen" memory made in a forest by wizard dwarfs (without the "made in a forest by wizard dwarfs" part).

