AMD employee confirms new GPU with HBM and 300W

Kowar

There, hyperlinked it and added another write-up with some other tidbits.

My whole point is that it should have been in the OP's post in the first place.

Connection: 200 Mbps / 12 Mbps, 5 GHz Wi-Fi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16GB, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120GB, MX100 256GB, WD Blue 1TB, Cooler - Hyper 212 Evo, Case - Corsair Carbide 200R, Monitor - BenQ XL2430T 144Hz, Mouse - FinalMouse, Keyboard - K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


Troll on, dude. At least you admit what you are. How about you go back under your bridge for a while, OK?

Too many ****ing games!  Back log 4 life! :S


Exactly what I said; I know all that already, man.

No, that's not what you said. You said even the Windforce and Vapor-X coolers are pushed to their limits, which is far from the truth. ;)

 

Unless I somehow misunderstood what you meant?

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


My whole point is that it should have been in the OP's post in the first place.

 

 

Oh, well then. Carry on, I suppose.

CPU: Intel i5-4690K w/ Noctua NH-D15 GPU: Gigabyte G1 980 Ti MOBO: MSI Z97 Gaming 5 RAM: 16GB Corsair Vengeance Boot drive: 500GB Samsung Evo Storage: 2x 500GB WD Blue, 1x 2TB WD Black, 1x 4TB WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 


No, that's not what you said. You said even the Windforce and Vapor-X coolers are pushed to their limits, which is far from the truth. ;)

 

Unless I somehow misunderstood what you meant?

Well... would you say the Windforce or Vapor-X cards are running at 60°C with 55% fan speed on a stock 290X?

No, they run at 79°C with 80% fan speed. If that's not pushing it at least "a little", then OK, I admit I'm clueless. You don't have much more room for heat dissipation even with the best air coolers available. But I honestly thought the 290X was around a 250W TDP GPU and that they were now shooting for 300W; that's why I said that. But some say it's about the same power consumption, in which case it's obviously going to be fine.

| CPU: Core i7-8700K @ 4.89GHz, 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080 Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR


Troll on, dude. At least you admit what you are. How about you go back under your bridge for a while, OK?

Wait, you were being serious? Wow, that's sad... The way you kept harping on about PhysX and cooking and ovens led me to genuinely believe you were trolling.

      


Wait, you were being serious? Wow, that's sad... The way you kept harping on about PhysX and cooking and ovens led me to genuinely believe you were trolling.

Going on about PhysX? Uh, all I did was respond to other posts. Man, people sometimes...


Going on about PhysX? Uh, all I did was respond to other posts. Man, people sometimes...

Yeah, you responded by rambling about PhysX and how it makes Nvidia oh so much better than AMD.

But nvm, moving on...

      


Yeah, 1000W of heat dumping into your room, totally acceptable. And you want a custom loop for that? 3x 970s should be around 450W, and with each of them at 1200-1400 RPM they would be significantly quieter than three 290Xs.

It doesn't bother me how much heat it throws out. In the winter it will keep me warm, and in the summer it's nowhere near enough to overpower my AC. I was still establishing the point of pricing. Three GTX 970s would cost mega bucks in comparison to three R9 290Xs. For what three GTX 970s cost, I could go with three R9 290Xs and have enough money to spare to put them all under water.


Well... would you say the Windforce or Vapor-X cards are running at 60°C with 55% fan speed on a stock 290X?

No, they run at 79°C with 80% fan speed. If that's not pushing it at least "a little", then OK, I admit I'm clueless. You don't have much more room for heat dissipation even with the best air coolers available. But I honestly thought the 290X was around a 250W TDP GPU and that they were now shooting for 300W; that's why I said that. But some say it's about the same power consumption, in which case it's obviously going to be fine.

 

The Vapor-X tested by TechPowerUp reached a max temp of 76°C, and 77°C with an OC. Sound under load was 37 dB, which indicates those three fans were nowhere near 80%. OC potential is related more to the power-delivery design and partly to the silicon lottery. The Hawaii GPUs were never strong overclockers to begin with, so they don't have much headroom anyway, and what headroom they do have is not limited by the cooler.

 

I never said you were clueless, just that some of your information and assumptions were incorrect/exaggerated. I'm only trying to set the record straight and clear up some of the "myths" that surround the Hawaii cards, that's all. ;)


The GTX 980 and 970 had aspects of the card cut back to help cut costs. Even then they landed in the market at $550 and $350 respectively, whereas today an R9 290X is only $269. You can start to see the trend where AMD has the upper hand in pricing. You can go CrossFire R9 290X over a single GTX 980 while being cheaper and easily doubling the game performance with proper scaling. Pricing has always been Nvidia's downfall, as it is for Intel when AMD has a competitive architecture. People will always go for the cheaper option if it offers even dead-equivalent performance.

 

 

Please enlighten me on what aspects of the 980 and 970 were cut back to cut costs. I really would like to hear this one. Yeah, R9 290X for $269, my balls: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=R9+290x&N=-1&isNodeId=1 All I see is $479, $389, $379, $359, $369, $379, $359; the lowest price is $319, and it has terrible reviews. Next to that are open-box items, which are in the $287 range.

 

AMD doesn't have an upper hand in pricing. They are priced similarly to the 970, which the 290X competes with. You can also go SLI 970s while being cheaper and easily doubling the performance with proper scaling.

 

NVIDIA prices their products as such because they can. AMD has no choice but to lower their prices. As we have seen in the past, when they try to mark their products up, they don't sell. If people always went for the cheaper option when it offered dead-equivalent performance, AMD would be number one in discrete GPU sales. Which they are not:

 

[discrete GPU market share chart]

 

And this is before the results from the GTX 980 and 970 launch.

 

Don't get me wrong, there are plenty of instances where an AMD card is preferred over an NVIDIA equivalent in many different price brackets. But if you want the best product, you have to pay for it, like anything in this world. Right now, the card that holds the price-to-performance crown is not an AMD card but an NVIDIA card, and that card is the GTX 970.


It doesn't bother me how much heat it throws out. In the winter it will keep me warm, and in the summer it's nowhere near enough to overpower my AC. I was still establishing the point of pricing. Three GTX 970s would cost mega bucks in comparison to three R9 290Xs. For what three GTX 970s cost, I could go with three R9 290Xs and have enough money to spare to put them all under water.

3x 290Xs: $900. 3x 970s: $1000. Good luck getting three of them custom watercooled, or with three AIOs, for $100.

 

 

The Vapor-X tested by TechPowerUp reached a max temp of 76°C, and 77°C with an OC. Sound under load was 37 dB, which indicates those three fans were nowhere near 80%. OC potential is related more to the power-delivery design and partly to the silicon lottery. The Hawaii GPUs were never strong overclockers to begin with, so they don't have much headroom anyway, and what headroom they do have is not limited by the cooler.

From a meter away, even the reference card can be considered quiet when it's 40 dBA.

http://nl.hardware.info/productinfo/217269/sapphire-radeon-r9-290x-vapor-x-oc-4gb#tab:testresultaten (under benchmarks)

That's 60 dBA from 4 inches. The 970 Strix is 38 dBA: http://nl.hardware.info/productinfo/253092/asus-geforce-gtx-970-strix-4gb#tab:testresultaten
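The 4-inch and 1-meter figures are roughly consistent with each other. Under a simple free-field point-source model (an idealization; real rooms reflect sound), level falls by 20·log10 of the distance ratio, about 6 dB per doubling of distance. A quick sketch of that conversion:

```python
import math

def spl_at_distance(spl_ref_db, d_ref_m, d_m):
    """Project a sound pressure level to another distance using the
    inverse-square law: level drops 20*log10(d/d_ref) dB."""
    return spl_ref_db - 20 * math.log10(d_m / d_ref_m)

# 60 dBA measured at 4 inches (~0.10 m), projected out to 1 m:
print(round(spl_at_distance(60, 4 * 0.0254, 1.0), 1))  # ~40.1 dBA
```

So a 60 dBA reading taken at 4 inches works out to roughly 40 dBA at a meter, which lines up with the "quiet at 40 dBA from a meter away" figure quoted in this thread.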


Please enlighten me on what aspects of the 980 and 970 were cut back to cut costs. I really would like to hear this one. Yeah, R9 290X for $269, my balls: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=R9+290x&N=-1&isNodeId=1 All I see is $479, $389, $379, $359, $369, $379, $359; the lowest price is $319, and it has terrible reviews. Next to that are open-box items, which are in the $287 range.

 

AMD doesn't have an upper hand in pricing. They are priced similarly to the 970, which the 290X competes with. You can also go SLI 970s while being cheaper and easily doubling the performance with proper scaling.

 

NVIDIA prices their products as such because they can. AMD has no choice but to lower their prices. As we have seen in the past, when they try to mark their products up, they don't sell. If people always went for the cheaper option when it offered dead-equivalent performance, AMD would be number one in discrete GPU sales. Which they are not:

 

[discrete GPU market share chart]

 

And this is before the results from the GTX 980 and 970 launch.

 

Don't get me wrong, there are plenty of instances where an AMD card is preferred over an NVIDIA equivalent in many different price brackets. But if you want the best product, you have to pay for it, like anything in this world. Right now, the card that holds the price-to-performance crown is not an AMD card but an NVIDIA card, and that card is the GTX 970.

First and foremost was the obvious bus-width reduction, made viable by delta color compression. Also, if you only shop Newegg, then you're not getting the best deal possible.

 

The R9 290X at $269 competes with the GTX 980 ($550) and performs better than the GTX 970 ($350). So AMD does have an upper hand in pricing.

 

The market changes too fast for either company to hold any kind of leadership title. AMD prices their cards quite high at initial launch and then saturates the market with price cuts, something Nvidia is not familiar with. AMD does this not only to push stock but also to give other classes of consumers the ability to purchase higher-end cards. Such large cuts on last-generation cards (the 200 series) are no surprise, as they are still GCN 1.0-based products. AMD didn't have to throw out a new architecture to remain competitive with Nvidia; they just had to cut pricing. As for market share, Nvidia will always remain the most purchased product, because nearly every retailer pushes Nvidia cards: their higher price point brings in higher revenue. Retailers literally don't care whether the card performs better or worse; they only care about dollar signs.

 

Their current performance crown is the GTX 980, though as stated, the R9 380X will more than likely become the new performance leader, and it will probably cost in the same ballpark considering all the new technologies. After a month or two, though, you'll likely notice their pricing fluctuate.

 

 

3x 290Xs: $900. 3x 970s: $1000. Good luck getting three of them custom watercooled, or with three AIOs, for $100.

 

 

From a meter away, even the reference card can be considered quiet when it's 40 dBA.

http://nl.hardware.info/productinfo/217269/sapphire-radeon-r9-290x-vapor-x-oc-4gb#tab:testresultaten (under benchmarks)

That's 60 dBA from 4 inches. The 970 Strix is 38 dBA: http://nl.hardware.info/productinfo/253092/asus-geforce-gtx-970-strix-4gb#tab:testresultaten

 

(3x R9 290X = $800) vs. (3x GTX 970 = $1050)

 

That's a $250 difference, which nearly covers the cost of another R9 290X, and it will definitely cover the cost of AIOs and brackets. Personally, I would never go that far; that's just overboard. CrossFire R9 290X for $540 is the sweet spot.


Yes, 300W is a lot, and I can't wait to see the temps it will do, but AMD is famous for this. Still, HBM with a 4096-bit bus is wonderful; right now, to get that kind of bandwidth we would need over 300W.

Personally, I think we have been using GDDR5 for too long, and AMD could certainly use the hand up.

"If a man has not discovered something that he will die for, he isn't fit to live."   -Martin Luther King, Jr


(3x R9 290X = $800) vs. (3x GTX 970 = $1050)

 

That's a $250 difference, which nearly covers the cost of another R9 290X, and it will definitely cover the cost of AIOs and brackets. Personally, I would never go that far; that's just overboard. CrossFire R9 290X for $540 is the sweet spot.

Cheapest 290X (no mail-in rebates):

 

 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Total: $914.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:15 EST-0500

 
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Total: $974.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:15 EST-0500

Just $60, which only covers a single AIO.


So it'll probably be 50 watts more power than a 980. 50 watts over a one-hour gaming session is an extra 180,000 joules, or the equivalent of having a 500-watt hair dryer on for 6 minutes. So that difference will probably be noticeable if your room is particularly small or your PC has poor ventilation.
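The energy figures in that comparison check out; a quick sketch of the arithmetic:

```python
# An extra 50 W sustained over a one-hour gaming session, in joules:
extra_joules = 50 * 3600             # watts x seconds = 180,000 J

# Equivalent run time of a 500 W hair dryer, in minutes:
dryer_minutes = extra_joules / 500 / 60

print(extra_joules, dryer_minutes)   # 180000 6.0
```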

I don't think 50 watts is enough to heat even a small room.

Just not enough BTUs.

Also, I don't think 100% of the energy of a GPU is converted to heat, but TBH I have no idea where the energy actually goes.
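On the BTU point: a watt is a rate, and the standard conversion is 1 W ≈ 3.412 BTU/hr (1 BTU ≈ 1055 J, 3600 seconds per hour). In steady state, essentially all of a GPU's electrical draw does end up as heat; the acoustic output is negligible. A quick sketch:

```python
def watts_to_btu_per_hr(watts):
    # 1 BTU is about 1055 J, so 1 W = 3600 J/hr ~= 3.412 BTU/hr
    return watts * 3.412

print(round(watts_to_btu_per_hr(50)))    # the 50 W difference: ~171 BTU/hr
print(round(watts_to_btu_per_hr(1000)))  # a 1000 W tri-GPU load: ~3412 BTU/hr
```

For scale, a small window AC is rated around 5,000 BTU/hr, so an extra 50 W is indeed marginal for a room, while a 1000 W rig is a meaningful heat source.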

A riddle wrapped in an enigma , shot to the moon and made in China


I don't think 50 watts is enough to heat even a small room.

Just not enough BTUs.

Also, I don't think 100% of the energy of a GPU is converted to heat, but TBH I have no idea where the energy actually goes.

It's heat or sound; those are the only options.


 

Cheapest 290X (no mail-in rebates):

 

 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Video Card: Gigabyte Radeon R9 290X 4GB WINDFORCE Video Card (3-Way CrossFire)  ($304.99 @ NCIX US) 
Total: $914.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:15 EST-0500

 
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron)
Total: $974.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:15 EST-0500

Just $60, which only covers a single AIO.

 

 

 
Video Card: PowerColor Radeon R9 290X 4GB Video Card (3-Way CrossFire)  ($269.99 @ NCIX US) 
Video Card: PowerColor Radeon R9 290X 4GB Video Card (3-Way CrossFire)  ($269.99 @ NCIX US) 
Video Card: PowerColor Radeon R9 290X 4GB Video Card (3-Way CrossFire)  ($269.99 @ NCIX US) 
Total: $809.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:26 EST-0500
 
 
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron) 
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron) 
Video Card: PNY GeForce GTX 970 4GB XLR8 Video Card (3-Way SLI)  ($324.99 @ Directron) 
Total: $974.97
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-01-14 20:26 EST-0500
 
A $165 difference, so my estimates are about right (close enough for not having looked). That's still enough to put every single one of them under an AIO as well.
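The totals in the two lists above reduce to simple arithmetic (using the per-card prices quoted in the thread; street prices obviously drift):

```python
def triple_total(unit_price):
    """Total for a 3-way multi-GPU setup at a given per-card price."""
    return round(3 * unit_price, 2)

amd = triple_total(269.99)     # PowerColor R9 290X, as quoted
nvidia = triple_total(324.99)  # PNY GTX 970 XLR8, as quoted
print(amd, nvidia, round(nvidia - amd, 2))  # 809.97 974.97 165.0
```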

A $165 difference, so my estimates are about right (close enough for not having looked). That's still enough to put every single one of them under an AIO as well.

A $60 mail-in rebate, so $100. Still one AIO, and two more to go.

You still pay more. 


First and foremost was the obvious bus-width reduction, made viable by delta color compression. Also, if you only shop Newegg, then you're not getting the best deal possible.

 

The R9 290X at $269 competes with the GTX 980 ($550) and performs better than the GTX 970 ($350). So AMD does have an upper hand in pricing.

 

The market changes too fast for either company to hold any kind of leadership title. AMD prices their cards quite high at initial launch and then saturates the market with price cuts, something Nvidia is not familiar with. AMD does this not only to push stock but also to give other classes of consumers the ability to purchase higher-end cards. Such large cuts on last-generation cards (the 200 series) are no surprise, as they are still GCN 1.0-based products. AMD didn't have to throw out a new architecture to remain competitive with Nvidia; they just had to cut pricing. As for market share, Nvidia will always remain the most purchased product, because nearly every retailer pushes Nvidia cards: their higher price point brings in higher revenue. Retailers literally don't care whether the card performs better or worse; they only care about dollar signs.

 

Their current performance crown is the GTX 980, though as stated, the R9 380X will more than likely become the new performance leader, and it will probably cost in the same ballpark considering all the new technologies. After a month or two, though, you'll likely notice their pricing fluctuate.

 

 

Look, your argument is flawed because you aren't even clicking through to the website itself; you are just linking PCPartPicker, which isn't even correct half the time. This is what happens when you actually click the PCPartPicker link that you keep saying is $269.99:

 

[screenshot of the PCPartPicker listing]

 

It's $324, so literally about $25 from a 970 (and there are even 970s under $350). So no, they don't have the upper hand in pricing. They price their products lower because they have no choice; it's obviously the outdated and inferior product, which is why it's priced competitively. How else are they going to sway customers to get that card over a 970? That's also why you sometimes find places dropping the price so low: to get rid of stock. You keep trying to make it seem like this is "Good Guy AMD." They have to use these tactics because they have no other means to make money. That's why AMD can't turn a profit. They are hanging by a thread at this point.

 

NVIDIA has held the leadership title for much longer than AMD. Anytime AMD releases a card, NVIDIA has an answer, and this is how it's going to be for some time to come. So you actually believe that NVIDIA holds market share not because they have the superior product, but because retailers push the cards? Most people buy their products online; what kind of marketing are online retailers doing to push NVIDIA products solely? Basically none. They have the highest market share because they make better products. Not to say that AMD cards don't perform well, because they do. But it's not just about pure performance; it's about the whole package: drivers, usability, reliability, infrequency of RMAs, etc. This is why NVIDIA holds market share: they give you the whole package, not just the card itself.

 

The 380X will take the performance crown, but then NVIDIA will drop GM200. Let's say the 380X is 10% faster than the 980; GM200 will then be 40% faster than the 380X. If AMD drops the 380X and NVIDIA answers with a GM200 card, you really think AMD is going to drop the 390X? Yeah, maybe like 8 months later, and the first GM200 part is not going to be fully unlocked from the get-go. So all NVIDIA has to do is release a fully unlocked version, and there you have it: AMD loses the crown again, just like that. If you want to understand the future, look back to the past.


So we're only getting the 380X at first? At least it will still have HBM. And holy balls, 300W? Is this on 20nm or still on 28nm? I have zero problems feeding them that kind of power as long as they give me more 4K performance.

AMD Ryzen 5900X, Nvidia RTX 3080 (MSI Gaming X Trio), ASRock X570 Extreme4, 32GB Corsair Vengeance RGB @ 3200MHz CL16, Corsair MP600 1TB, Intel 660p 1TB, Corsair HX1000, Corsair 680X, Corsair H100i Platinum

 

 

 


So we're only getting the 380X at first? At least it will still have HBM. And holy balls, 300W? Is this on 20nm or still on 28nm? I have zero problems feeding them that kind of power as long as they give me more 4K performance.

It depends on their strategy: if they can dethrone the GTX 970 and GTX 980 with the R9 380X, then they might launch it first. Then they could sit on Bermuda for as long as it takes Nvidia to retaliate with GM200. Unlike before, it seems like AMD has a response for every card that Nvidia throws out there; I think the little/big architecture release cycle that Nvidia uses has caught on with AMD. As for the node, we still don't know anything beyond rumors: some speculation suggests 20nm, other speculation suggests 28nm. Truth of the matter is, GloFo has been capable of mass production of 20nm 2.5D since Q4 2014, so it could be either.


It depends on their strategy: if they can dethrone the GTX 970 and GTX 980 with the R9 380X, then they might launch it first. Then they could sit on Bermuda for as long as it takes Nvidia to retaliate with GM200. Unlike before, it seems like AMD has a response for every card that Nvidia throws out there; I think the little/big architecture release cycle that Nvidia uses has caught on with AMD. As for the node, we still don't know anything beyond rumors: some speculation suggests 20nm, other speculation suggests 28nm. Truth of the matter is, GloFo has been capable of mass production of 20nm 2.5D since Q4 2014, so it could be either.

If it's 20nm and 300W (assuming power draw, not TDP), that's crazy sauce. But HBM is the main interest for me; playing at 4K (UHD), fast memory is sooo important. It's also going to annoy me that if I get CFX 380Xs, they'll then bring out the 390X and I'll feel out of the loop.



AMD's solution: let's make our cards more power efficient and output less heat after all the jokes people make about us. THEN CLOCK IT UP AND PUMP AS MUCH POWAH AS POSSIBLE INTO IT TO MAKE IT TAKE OFF!

 

Isn't that what AMD always stood for? :D Personally, fuck the PSU, give me the POWAAAAAA


If it's 20nm and 300W (assuming power draw, not TDP), that's crazy sauce. But HBM is the main interest for me; playing at 4K (UHD), fast memory is sooo important. It's also going to annoy me that if I get CFX 380Xs, they'll then bring out the 390X and I'll feel out of the loop.

It really depends, as the R9 280X will pull 300W at full load as well with just 2048 SPs. The R9 380X is rumored to have 3072 SPs, so if it can operate at 300W, that would be quite impressive: an extra 1024 SPs at the cost of no extra power. Though I get the feeling he was in fact referring to TDP, as the R9 290X has a TDP of 290W, so the extra 256 SPs would account for the extra 10W of TDP. That's why you wait for the R9 390X to launch; we know it's coming, we just don't know when.

