
Vega FE - Placebo Mode

1 hour ago, PCGuy_5960 said:

No it doesn't, a fully working 7980XE die costs way more than $30. 


sure, but they don't have a fully working 7980XE, do they? (:

I was talking more about the quad cores

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


10 hours ago, DXMember said:

sure, but they don't have a fully working 7980XE, do they? (:

I was talking more about the quad cores

I eagerly await the release of your quad core, seeing as you have identified such an arbitrage opportunity and can produce them so much cheaper than Intel.


4 hours ago, Grinners said:

I eagerly await the release of your quad core, seeing as you have identified such an arbitrage opportunity and can produce them so much cheaper than Intel.

I can't, but Intel can; they just won't. Why don't they do it? Because free market, fucking capitalist pigs.


 


1 hour ago, DXMember said:

I can't, but Intel can; they just won't. Why don't they do it? Because free market, fucking capitalist pigs.

You can't. Absolutely you can't. Hardly anyone can. Who's best placed to do what Intel is doing right now? AMD, but are they doing a bang-up job? They have one line of CPUs worth buying right now, after how many years? You don't understand corporate economics; you think it's all about greed. It's not a product you can mock up in a few months in your back yard and then forget about once sold. Intel literally invests billions in R&D alone. They are not going to make that back selling their products at a little over cost.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


On 2017-7-8 at 5:23 AM, PCGuy_5960 said:

Can we all just agree that Vega FE was 100% rushed?

Nobody said it wasn't, to be fair :P 

 

Anyway, I do hope AMD pulls a Hawaii (again): a card that was considerably slower than the top Kepler (780 Ti) at launch, but after years of driver updates is now almost at 980 performance. The catch is they'll have to do it in about 3 months; AMD is known for the "fine wine" nature of its cards, but in this instance they don't have the time.

Basically what I'm saying is: hopefully the more mature drivers that come out over the next few months give Vega a nice big kick in performance :D 



7 hours ago, mr moose said:

You can't. Absolutely you can't. Hardly anyone can. Who's best placed to do what Intel is doing right now? AMD, but are they doing a bang-up job? They have one line of CPUs worth buying right now, after how many years? You don't understand corporate economics; you think it's all about greed. It's not a product you can mock up in a few months in your back yard and then forget about once sold. Intel literally invests billions in R&D alone. They are not going to make that back selling their products at a little over cost.

Maybe they should consider that ~10% performance over 6 years wasn't worth the billions upon billions in R&D after all?


 


7 hours ago, Mr.Meerkat said:

Nobody said it wasn't, to be fair :P 

 

Anyway, I do hope AMD pulls a Hawaii (again): a card that was considerably slower than the top Kepler (780 Ti) at launch, but after years of driver updates is now almost at 980 performance. The catch is they'll have to do it in about 3 months; AMD is known for the "fine wine" nature of its cards, but in this instance they don't have the time.

Basically what I'm saying is: hopefully the more mature drivers that come out over the next few months give Vega a nice big kick in performance :D 

Because there is a lot of stuff that doesn't seem to be working yet, it might get faster rapidly.


Not too far behind the 1080 here, considering the FE is running at 1440MHz and doesn't come close to its max boost of 1600MHz. I was surprised by the low temps on it though, so I wonder why it didn't boost higher.
 

 

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


2 minutes ago, Valentyn said:

Not too far behind the 1080 here, considering the FE is running at 1440MHz and doesn't come close to its max boost of 1600MHz. I was surprised by the low temps on it though, so I wonder why it didn't boost higher.
 

 

Might be power throttling.


11 minutes ago, tom_w141 said:

Anyone still hoping a driver is going to pull out 40-50% performance is high AF lol.

There might be some good gains, as Vega should have much higher geometry horsepower, and the tile-based rasterizer will help too; neither seems to be enabled right now.


On 2017-07-07 at 11:10 PM, tmcclelland455 said:

It's a brand new card so naturally the drivers aren't going to work as they should. Common sense, people.

And it's AMD, which never launches cards with good drivers either lol. The R9 Fury is a perfect example, and so is the entire ReLive driver update.

I spent $2500 on building my PC and all i do with it is play no games atm & watch anime at 1080p(finally) watch YT and write essays...  nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)

Spoiler

"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


17 minutes ago, Bananasplit_00 said:

And it's AMD, which never launches cards with good drivers either lol. The R9 Fury is a perfect example, and so is the entire ReLive driver update.

Plus let's not forget the AMD 480 launch, that awesome 970 competitor :D


2 minutes ago, JediFragger said:

Plus let's not forget the AMD 480 launch, that awesome 970 competitor :D

Hmm, it wasn't that bad, was it? I don't remember any major fixes beyond having it draw more than the rated current from the PCIe slot (which would have been safe tbh; the slots are massively underrated relative to what they can actually take).



1 minute ago, Bananasplit_00 said:

Hmm, it wasn't that bad, was it? I don't remember any major fixes beyond having it draw more than the rated current from the PCIe slot (which would have been safe tbh; the slots are massively underrated relative to what they can actually take).

It's lower because it's limited by what the 24-pin connector can deliver, so if you have multiple cards it's better to have lower PCIe slot power per card, or extra connectors dedicated to slot power.

Some overclockers have managed to blow their 24-pin because of that in the past :)
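The arithmetic behind that claim can be sketched roughly. All the figures below are illustrative assumptions (pin ratings and slot budgets vary by connector and board), not measured values:

```python
# Rough power-budget sketch: why slot power matters with multiple GPUs.
# Assumed numbers, for illustration only:
SLOT_12V_W = 66          # PCIe x16 slot 12V budget is roughly 66W of the 75W total
PIN_RATING_A = 6.0       # a Mini-Fit Jr style pin is commonly rated around 6A
ATX24_12V_PINS = 2       # the ATX 24-pin carries only two 12V pins
CONNECTOR_12V_W = PIN_RATING_A * ATX24_12V_PINS * 12  # ~144W of 12V capacity

def slot_draw_total(num_cards: int, per_card_w: float = SLOT_12V_W) -> float:
    """Total 12V the motherboard must source from the 24-pin for slot power."""
    return num_cards * per_card_w

for n in (1, 2, 3):
    total = slot_draw_total(n)
    print(f"{n} card(s): {total:.0f}W slot draw vs ~{CONNECTOR_12V_W:.0f}W 24-pin 12V capacity")
```

With these assumed numbers, one card fits comfortably, but three cards pulling full slot power already exceed what the two 12V pins can nominally deliver, which is the multi-card scenario described above.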


1 minute ago, cj09beira said:

It's lower because it's limited by what the 24-pin connector can deliver, so if you have multiple cards it's better to have lower PCIe slot power per card, or extra connectors dedicated to slot power.

Some overclockers have managed to blow their 24-pin because of that in the past :)

That's with 980 Tis and similar cards, mainly dual-GPU ones. For a single-GPU card at stock there was no way the slot wouldn't handle it; the only reason it was addressed was that it was outside the PCIe spec by a few watts. It wasn't a problem beyond that, and I doubt there was a recent consumer motherboard that couldn't handle it (OEM boards don't count because they're assholes, and I say recent because some older PCIe 2.0 motherboards do have the slot power cut down).



21 minutes ago, Bananasplit_00 said:

That's with 980 Tis and similar cards, mainly dual-GPU ones. For a single-GPU card at stock there was no way the slot wouldn't handle it; the only reason it was addressed was that it was outside the PCIe spec by a few watts. It wasn't a problem beyond that, and I doubt there was a recent consumer motherboard that couldn't handle it (OEM boards don't count because they're assholes, and I say recent because some older PCIe 2.0 motherboards do have the slot power cut down).

What I mean is that the PCIe slot spec itself is lower than it could be, to help reduce the chance of multiple cards (2, 3+) on the same motherboard blowing the 24-pin connector.


It honestly looks to me like AMD is in a bad spot with this one. It sucks, because I don't want to see AMD fall further behind in the GPU war right as they're getting their act together on the CPU front.

 

On one hand I don't care one bit about high-end hardware. But I hate to see people or groups who are never able to recover from a mistake. I hope that's not the case here, because it seems like NVIDIA has so many more resources that they can simply toy with AMD, launching their next product whenever AMD even dreams of catching up, and that's not really the fair fight I like to see.


1 hour ago, Valentyn said:

Not too far behind the 1080 here, considering the FE is running at 1440MHz and doesn't come close to its max boost of 1600MHz. I was surprised by the low temps on it though, so I wonder why it didn't boost higher.

Well, Vega is not a hot chip.

It's actually quite cool-running considering the high TDP and the blower coolers we have seen so far. People assume the chip runs hot when they see the TDP numbers.

The test below was done by Gamers Nexus to show what temperature various GPUs can maintain at a fixed noise level: the coolers were not allowed to spin up past 40dBA.

vega-fe-thermals-40dba-gpu.png

This is very good performance from a blower cooler on a high-TDP card. It means that, for whatever reason, Vega is very efficient at dissipating its heat...


1 minute ago, Okjoek said:

It honestly looks to me like AMD is in a bad spot with this one. It sucks, because I don't want to see AMD fall further behind in the GPU war right as they're getting their act together on the CPU front.

On one hand I don't care one bit about high-end hardware. But I hate to see people or groups who are never able to recover from a mistake. I hope that's not the case here, because it seems like NVIDIA has so many more resources that they can simply toy with AMD, launching their next product whenever AMD even dreams of catching up, and that's not really the fair fight I like to see.

Well, Vega will probably help them enter the server space because of its compute abilities, which is probably a much more important market for AMD since it's huge. That, plus all the Ryzen SKUs, should give AMD a larger R&D budget two generations from now (the next generation is probably nearing the end of development already).


Just now, Humbug said:

Well, Vega is not a hot chip.

It's actually quite cool-running considering the high TDP and the blower coolers we have seen so far. People assume the chip runs hot when they see the TDP numbers.

The test below was done by Gamers Nexus to show what temperature various GPUs can maintain at a fixed noise level: the coolers were not allowed to spin up past 40dBA.

vega-fe-thermals-40dba-gpu.png

This is very good performance from a blower cooler on a high-TDP card. It means that, for whatever reason, Vega is very efficient at dissipating its heat...

I saw that test alright. That the Frontier Edition blower is as good as the MSI Armor was a surprise!

It beats the pants off the NV reference, although it seems the concentrated heat from HBM+GPU is an issue under some loads.

PCPer saw it settle at 83°C, similar to the NV reference, but the FE was still not able to clock above 1440MHz. So it seems there's more to the boosting issue than just temps.

Also saw this posted on Overclockers.co.uk. If those power usage numbers for the 16GB of HBM2 are true, that's a nice extra 75W the 8GB RX Vega can spend on GPU frequency.

 

https://disqus.com/by/klaudiuszkaczmarzyk/comments/

 



 


1 hour ago, cj09beira said:

There might be some good gains, as Vega should have much higher geometry horsepower, and the tile-based rasterizer will help too; neither seems to be enabled right now.

1 hour ago, tom_w141 said:

Anyone still hoping a driver is going to pull out 40-50% performance is high AF lol.

I don't really see people expecting that anymore. That would put it ahead of a 1080 Ti.

I think now people are just expecting RX Vega to come out and beat the GTX 1080 with its gaming drivers, at a slightly lower price.


2 minutes ago, Humbug said:

Well, Vega is not a hot chip.

It's actually quite cool-running considering the high TDP and the blower coolers we have seen so far. People assume the chip runs hot when they see the TDP numbers.

The test below was done by Gamers Nexus to show what temperature various GPUs can maintain at a fixed noise level: the coolers were not allowed to spin up past 40dBA.

vega-fe-thermals-40dba-gpu.png

This is very good performance from a blower cooler on a high-TDP card. It means that, for whatever reason, Vega is very efficient at dissipating its heat...

AMD has reduced its CU density (CUs per mm²), which should help.

 

2 minutes ago, Valentyn said:

I saw that test alright. That the Frontier Edition blower is as good as the MSI Armor was a surprise!

It beats the pants off the NV reference, although it seems the concentrated heat from HBM+GPU is an issue under some loads.

PCPer saw it settle at 83°C, similar to the NV reference, but the FE was still not able to clock above 1440MHz. So it seems there's more to the boosting issue than just temps.

Also saw this posted on Overclockers.co.uk. If those power usage numbers for the 16GB of HBM2 are true, that's a nice extra 75W the 8GB RX Vega can spend on GPU frequency.

 

https://disqus.com/by/klaudiuszkaczmarzyk/comments/

 


HBM doesn't use 150W ahaha.
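A back-of-the-envelope estimate supports that. The energy-per-bit figure below is a rough number often quoted for HBM2, not a datasheet value, so treat the result as an order-of-magnitude sketch:

```python
# Back-of-the-envelope HBM2 power estimate at full memory bandwidth.
# ENERGY_PER_BIT_J is an assumed ballpark figure, not an official spec.

BANDWIDTH_BPS = 484e9       # Vega FE's quoted memory bandwidth, bytes/s
ENERGY_PER_BIT_J = 3.9e-12  # ~3.9 pJ/bit, a commonly cited HBM2 estimate

power_w = BANDWIDTH_BPS * 8 * ENERGY_PER_BIT_J
print(f"Estimated HBM2 power at full bandwidth: ~{power_w:.0f} W")
# Lands around 15 W with these assumptions: tens of watts, nowhere near 150W.
```

Even doubling the assumed energy per bit for I/O and controller overhead keeps the estimate far below 150W, which is the point being made above.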


6 minutes ago, cj09beira said:

AMD has reduced its CU density (CUs per mm²), which should help.

HBM doesn't use 150W ahaha.

Unless it's "overclocked" and overvolted to get to 1.89Gbps, since Hynix still only lists 1.6Gbps HBM2.

Buildzoid did that PCB teardown of the Frontier Edition and was concerned the HBM2 would be overvolted from what he saw.

 

Still lots to wait and see for.


 


According to Buildzoid, the card often reaches its 300W limit.

HBM2 consumption is super low, maybe 20-40W (rough numbers). There are reasons to believe the tile-based rasterizer reduces power consumption, since it reduces the amount of work the GPU has to do by discarding certain triangles much earlier in the pipeline, but that has to be tested later.
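The idea behind that early discard can be sketched in a toy binning pass. This is not AMD's actual DSBR (which also culls occluded triangles within a bin); it only shows the simplest case, where triangles that touch no on-screen tile generate zero per-pixel work:

```python
# Toy sketch of tile-based (binned) rasterization with early triangle rejection.
# Triangles whose bounding box overlaps no on-screen tile are dropped before
# any per-pixel shading would happen; that skipped work is the power saving.

from typing import Dict, List, Tuple

Tri = Tuple[Tuple[float, float], Tuple[float, float], Tuple[float, float]]

def bbox(tri: Tri) -> Tuple[float, float, float, float]:
    xs = [p[0] for p in tri]
    ys = [p[1] for p in tri]
    return min(xs), min(ys), max(xs), max(ys)

def bin_triangles(tris: List[Tri], screen=(64, 64), tile=16) -> Dict[tuple, list]:
    """Assign each triangle to the tiles its bounding box overlaps."""
    tiles_x, tiles_y = screen[0] // tile, screen[1] // tile
    bins: Dict[tuple, list] = {}
    for t in tris:
        x0, y0, x1, y1 = bbox(t)
        for ty in range(int(y0) // tile, int(y1) // tile + 1):
            for tx in range(int(x0) // tile, int(x1) // tile + 1):
                if 0 <= tx < tiles_x and 0 <= ty < tiles_y:
                    bins.setdefault((tx, ty), []).append(t)
    return bins

tris = [
    ((1, 1), (5, 1), (3, 4)),               # touches only tile (0, 0)
    ((200, 200), (210, 200), (205, 210)),   # entirely off-screen: never binned
]
bins = bin_triangles(tris)
print(len(bins), "tile(s) get any shading work")
```

Only the first triangle ever reaches a bin; the off-screen one is rejected during binning, long before rasterization or shading.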

