R9 390X Coming in 2015. Featuring 20nm, Liquid Cooling and High Bandwidth Memory "HBM"

Don't forget to buy a fire extinguisher along with the card

:P

Just buy the fire brigade directly! Don't wait until the last moment. Get it right now.

This chip on the 20nm process will probably come with the same TDP as the 290X. Maybe it will also have an 'uber' mode.


I'll bet the power requirement will be at least a 750W PSU.

 

OK, so first we've got Tonga, which showed a massive performance-per-watt improvement over Tahiti.

Second, HBM consumes 30% less power than GDDR5.

Third, 20nm is a more power efficient node than 28nm.

 

So, assuming that the new cards will, in fact, be manufactured on 20nm and use HBM, is there any (logical) reason to think that the 390X is going to be more of a power-sucking monster than the 290X?
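
To put rough numbers on that argument, here's a minimal back-of-the-envelope sketch in Python. Only the "HBM uses ~30% less power than GDDR5" figure comes from the post; the board-power split and the 20nm core saving are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope estimate of a hypothetical 390X board power.
# Assumptions (not measurements): ~290 W board power for a reference 290X,
# ~15% of that going to the GDDR5 memory subsystem, and ~20% lower core
# power from the 28nm -> 20nm shrink. The 30% HBM saving is from the post.

BOARD_POWER_290X = 290.0   # W, rough ballpark under load (assumed)
MEM_SHARE        = 0.15    # fraction of board power spent on memory (assumed)
HBM_SAVINGS      = 0.30    # HBM vs GDDR5 power saving (figure cited above)
CORE_NODE_GAIN   = 0.20    # 20nm vs 28nm core power saving (assumed)

mem_power  = BOARD_POWER_290X * MEM_SHARE
core_power = BOARD_POWER_290X - mem_power

new_total = mem_power * (1 - HBM_SAVINGS) + core_power * (1 - CORE_NODE_GAIN)
print(f"290X-class: {BOARD_POWER_290X:.0f} W  ->  hypothetical 390X: {new_total:.0f} W")
# With these assumptions the new card lands near 230 W, i.e. lower, not higher.
```

Whatever the real numbers turn out to be, every term in that estimate pushes power down, not up.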


OK, so first we've got Tonga, which showed a massive performance-per-watt improvement over Tahiti.

Second, HBM consumes 30% less power than GDDR5.

Third, 20nm is a more power efficient node than 28nm.

 

So, assuming that the new cards will, in fact, be manufactured on 20nm and use HBM, is there any (logical) reason to think that the 390X is going to be more of a power-sucking monster than the 290X?

Anti-AMD fanboyism?

 

Because the 29x series cards ran particularly hot, people pretty much assume AMD is just gonna build heat expelling monsters now - whether that assumption is logical or not. Given the (rather limited) information we currently have, it appears like AMD has all the pieces to have a very cool and efficient card.


Anti-AMD fanboyism?

 

Because the 29x series cards ran particularly hot, people pretty much assume AMD is just gonna build heat expelling monsters now - whether that assumption is logical or not. Given the (rather limited) information we currently have, it appears like AMD has all the pieces to have a very cool and efficient card.

This.

Just because Nvidia's 480s were leaf blowers does not mean they cannot create efficient cards. *looks at Maxwell*

I'm really looking forward to getting the 390X if it doesn't suck.


This.

Just because Nvidia's 480s were leaf blowers does not mean they cannot create efficient cards. *looks at Maxwell*

I'm really looking forward to getting the 390X if it doesn't suck.

Agreed, you've got the perfect attitude on this. Hopeful optimism, yet waiting for the final results before judging.

 

The 4xx series cards were monsters and spewed out a ton of heat. But I believe the GTX 5xxx series were even worse, if you remember that far back.


Agreed, you've got the perfect attitude on this. Hopeful optimism, yet waiting for the final results before judging.

 

The 4xx series cards were monsters and spewed out a ton of heat. But I believe the GTX 5xxx series were even worse, if you remember that far back.

Actually, the architecture for the 580/570 was modified a bit.  The GTX 480 was a lot more leaky.


Actually, the architecture for the 580/570 was modified a bit.  The GTX 480 was a lot more leaky.

You misunderstand. I meant GTX 5xxx, not GTX 5xx.

 

http://www.gpureview.com/geforce-fx-5900-ultra-card-161.html

256MB of VRAM bitches...

 

A LOT older than the 580/570. The 580 (and so on) cards were actually quite a nice change from the 480, given that they share the same basic architecture.


Liquid cooled reference card = case compatibility problem :/


What case doesn't have a 120mm fan placement available?

 

I've got an old case that uses "clips" rather than screws, so some fans don't work in it depending on the clearance of the clips.


You misunderstand. I meant GTX 5xxx, not GTX 5xx.

I remember this generation quite well, as I was trying to research my first build (with my parents' money, LOL). I think the ATi cards that gen were great, with the 9800 Pro etc.; they had superior shader and DirectX 9 performance, so Nvidia was ramping up clock speed and TDP trying to stay in the game. But they came back strongly with the next-gen 6xxx series.

I remember this generation quite well, as I was trying to research my first build (with my parents' money, LOL). I think the ATi cards that gen were great, with the 9800 Pro etc.; they had superior shader and DirectX 9 performance, so Nvidia was ramping up clock speed and TDP trying to stay in the game. But they came back strongly with the next-gen 6xxx series.

Oh man the ATI 9800 Pro.... Jesus what a BEAST of a card that was. That thing could run anything back in the day, it was just so powerful. They didn't really have that same resurgence again until the x800XT and the x850XT series, and by then, the difference wasn't as drastic compared to what NVIDIA was offering at the time.


Oh man the ATI 9800 Pro.... Jesus what a BEAST of a card that was. That thing could run anything back in the day, it was just so powerful. They didn't really have that same resurgence again until the x800XT and the x850XT series, and by then, the difference wasn't as drastic compared to what NVIDIA was offering at the time.

For me it's the ATi 9550, the first computer part I ever owned, bought with my hard-saved money at the age of 12 or so. I'm not sure how it compared to other hardware at the time (we had no internet, just local shops), but to me it was the most powerful component ever :)


How many people who are actually getting a 390X are going to run into this problem?

 

Not many, to the point that it doesn't matter, to be honest. It's just that there will always be cases of new things not working with old things.

 

I'm not sure why you italicized "actually", considering that I'm using a full-tower case and a 700-watt power supply, so I could get one if I wanted to.


OK, so first we've got Tonga, which showed a massive performance-per-watt improvement over Tahiti.

Second, HBM consumes 30% less power than GDDR5.

Third, 20nm is a more power efficient node than 28nm.

 

So, assuming that the new cards will, in fact, be manufactured on 20nm and use HBM, is there any (logical) reason to think that the 390X is going to be more of a power-sucking monster than the 290X?

 

Don't think so. Also, won't HBM mean they won't need such a large bus to get the memory bandwidth they need?
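
For what it's worth, the bus actually goes the other way: first-generation HBM is far wider (1024 bits per stack) but runs each pin much slower and keeps the links on-package, which is where the power saving comes from. A quick sketch of the bandwidth arithmetic, with the 4-stack configuration being an assumption for illustration:

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8.
# Comparison figures: a 290X-style 512-bit GDDR5 bus at 5 Gbps per pin versus
# a hypothetical 4-stack HBM setup (1024 bits per stack at ~1 Gbps per pin).

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

gddr5 = bandwidth_gb_s(512, 5.0)        # ~320 GB/s, roughly a 290X
hbm   = bandwidth_gb_s(4 * 1024, 1.0)   # ~512 GB/s from a wider but slower bus

print(f"GDDR5: {gddr5:.0f} GB/s  vs  HBM: {hbm:.0f} GB/s")
```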


Not many, to the point that it doesn't matter, to be honest. It's just that there will always be cases of new things not working with old things.

 

I'm not sure why you italicized "actually", considering that I'm using a full-tower case and a 700-watt power supply, so I could get one if I wanted to.

 

I italicized it because the majority of people out there aren't running some old case that lacks a normal fan-mounting system or a 120mm fan placement anywhere inside. If they are, chances are they aren't looking at a 390X, because it doesn't seem very logical to spend 600-some-odd dollars on a new GPU when you haven't updated your case in 6 years.

 

Anyway, if AMD water-cools their next flagship GPU, it will revolutionize the industry. It would be a huge stepping stone for AMD and, given the progress AIO cooling units are making, having a flagship GPU water-cooled from the factory would carry all the positives of AIO cooling into another segment of the market. It would be a big step toward an industry standard that I see being the norm a decade from now (air cooling obsolete on the desktop PC).

Right now the biggest limiting factor for the main computer components, CPUs and GPUs, is heat, and water cooling addresses that problem very well. However, not everyone has the knowledge, patience, or money to dive into a custom loop themselves. It is such a niche section of the industry that it isn't really accessible to everyone; some people wouldn't even consider it because, to some extent, it's an expensive hassle unless you are passionate about it, which is exactly why it remains a niche. AIO cooling takes the same positives that custom loops have and simplifies them: maintenance is a breeze, installation is relatively simple, and the temperatures speak for themselves, especially once you put some nice static-pressure fans on the units. AIO units are going to make a huge leap in performance in the next five years, by then delivering custom-loop temps right out of the box.

If AMD successfully produces a flagship GPU that is water-cooled from the factory, it will take the industry to another level by showing other manufacturers that you can successfully release a product that is water-cooled from the get-go. That goes not just for GPUs but also for CPUs: higher clock speeds right out of the gate, which means faster products that are still quiet and stay cool. If AMD has a successful launch of a water-cooled reference GPU, they will carry that success over to their CPUs, meaning their next flagship CPU will also be water-cooled, which again means higher clock speeds and more performance. And if they have two successful product releases that are water-cooled from the factory, companies like Intel and Nvidia will follow suit, which again means faster factory products with higher clock speeds for everyone.

You don't see Intel releasing 5GHz-clocked CPUs yet, because the Haswell architecture runs hot as balls, especially compared to Sandy Bridge. The Haswell Refresh provided slightly better temps, but compared to Sandy Bridge it is still a huge increase in temperatures, and it has been that way since Ivy Bridge came along. These chips are only going to get hotter from here on out, which for the consumer means we aren't going to see a big increase in clock speeds unless the heat issue is addressed from the factory with proper cooling solutions instead of a standard heatsink. They can't guarantee faster products if they can't address the heat.

Even take Nvidia with Maxwell. Maxwell is more efficient than Kepler, but the most amazing thing about Maxwell is what it is able to overclock to. Of course, overclock performance doesn't scale as well as on previous-generation cards, but it's still amazing to see how high they can go out of the box. The only thing really holding Maxwell clocks back is locked voltages, with most people topping out in the 1450-1550MHz range. Increase the voltage, though, and you can start getting into the 1650-1750MHz range. Of course, when you increase the voltage and the clocks, it isn't so efficient anymore and you're back to square one with heat as the main problem. But put it under water and voila, problem solved: higher voltages, higher clocks, and heat is a non-issue. Everyone is happy because we get the performance and quality we should be expecting out of products in 2014 (I mean, the future is now).

 

Now, of course, this means higher prices for these products. But personally, I would rather have a better-performing, faster product that comes with proper cooling for a little more money than a gimped, slower product for less money. As the industry progresses, the prices of AIO units will come down as they rise in popularity, which means they will be able to be retrofitted for much less money. And since everyone in the industry would be working on AIOs simultaneously, their development will speed up dramatically, giving us vastly better, more reliable, and cheaper units much faster than they are progressing right now. It also means aftermarket units will be multitudes better than the ones shipped with the products from the factory.

 

All we can hope is that AMD goes ahead and does this for the 390 and 390X, because if they do it, and do it successfully, it will change everything.
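
On the Maxwell voltage-and-clocks point above: dynamic power scales roughly with frequency times voltage squared, which is why a modest bump in both adds a lot of heat. A rough sketch with assumed (not measured) clock and voltage figures:

```python
# Rough illustration of why pushing voltage for higher clocks heats up so fast:
# dynamic power scales approximately with frequency * voltage^2.
# The clock/voltage numbers below are assumptions for the example, not measurements.

def relative_power(f_mhz: float, v: float, f0_mhz: float, v0: float) -> float:
    """Dynamic power relative to a baseline, using P ~ f * V^2."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

stock      = (1450, 1.21)   # assumed stock-ish boost clock (MHz) and voltage (V)
overvolted = (1700, 1.31)   # assumed pushed clock with extra voltage

factor = relative_power(*overvolted, *stock)
print(f"~{(factor - 1) * 100:.0f}% more heat to get rid of")   # ~37% in this example
```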


I italicized it because the majority of people out there aren't running some old case that lacks a normal fan-mounting system or a 120mm fan placement anywhere inside. If they are, chances are they aren't looking at a 390X, because it doesn't seem very logical to spend 600-some-odd dollars on a new GPU when you haven't updated your case in 6 years.

 

I haven't updated my case in over 8 years and am having no trouble :P There is no reason to update a case, as the only changes made over the past few years are cable management, easily accessible dust filters, SSD slots, tool-less designs, and other small things that don't matter but are nice to have. Even with the clips in my case, I could just take a drill and make some screw holes if I really wanted to.

 

Anyway, I agree with you that AMD should water-cool their high-end cards. They are a graphics card company, and doing everything they can to make the best card is exactly what they should be doing.


Watercooled reference card

 

Definitely won't have any problems keeping them at a reasonable temperature and noise level then /s


AMD already ships AIOs with their flagship CPUs (the FX-9370 and FX-9590).


I see some really, really stupid comments up in this bish; it's kinda sad.

For your money, AMD gives you more; if you have more money to spend, Intel gives you more performance, because hey, AMD doesn't go that high, so it's common sense. It's simple: up to the AMD 9590, NOTHING Intel offers at THAT PRICE POINT beats the AMD, period. But of course, if you've got more money to spend, there are plenty of boss hoss Intel CPUs.

The 9590 right now on Newegg is $260. FOR THAT SAME PRICE, but NO MORE, you are looking at the i5-4690K: http://cpuboss.com/cpus/Intel-4690K-vs-AMD-FX-9590. You HAVE to compare price points. There are no ifs, ands, or buts.

Now, IF you had an extra 50 bucks, which most people might depending on their budget (but very likely), you could easily get the 3770K, which I would then agree beats the 9590. For gaming this isn't a huge deal; the extra 50 bucks isn't going to reward you with uber amounts of FPS. BUT EVEN THEN, you are still looking at the cost of the other parts as a whole. Intel motherboards are generally more expensive, same with memory. Not all the time, but it's there. Now, I got my 9590 for only $215 on a combo deal that totaled out to $380 for the CPU/mobo, which was a hella good deal (thanks, Microcenter). Of course this might label me a "fanboy", but who cares. Truth hurts. Also, the 9590 doesn't "run hot" by any means. On my Kraken X60 with proper fans (not that Noctua garbage Linus is so anal about), with the house A/C set to 70°F my 9590 runs 33°C idle and 38°C under load; with the heat set to 72°F, it runs 35°C idle and 44°C under load. Still not "uber hot" like people spew, and even worse are the reviews that insist on running it with an air cooler because "air coolers are still better than liquid cooling." SMH.

The worst comment ever is "Intel has better single-core performance." That is just stupid. Even single-core applications get a boost from multi-core setups; sure, the program isn't written to take full advantage of them, but it still gets a nice kick in the ass. I KNOW this for a fact, since I was an early true dual-core supporter when I purchased an AMD Athlon X2 4400 (2.2GHz x2). My father got the P4, and the ONLY thing he was able to do "faster" was encoding and decoding videos: it took his PC 20 minutes where it took mine 30, with the same memory speeds and timings. Creating zip files, though, I beat him by 15 seconds. He was pissed about that. He even said, "10 minutes on a video isn't all that bad, I should have saved money with the AMD." Ever since, we have been building extreme-budget AMD rigs and always getting more performance for our dollar. Now, at the time, reviews kept saying "Intel has better single-core performance, nothing takes advantage of dual core, so there is no point buying AMD, don't be stupid, get Intel," and so the bullshit went. THE WORST PART is that when the reviews claimed a result, and I actually owned the part and copied their test, I found that my part didn't do as badly as they claimed. I couldn't explain it back then, but over the years I've learned.

Most review sites, including Linus here, get money and perks for spreading love for a particular brand. In Linus's world that's Intel, and I guarantee he gets perks and cheaper, if not free, parts. I've seen him do it many times; any person with a modicum of common sense would see it too. Now, I am not hating on Linus overall, he does a decent job. But the "paid off" mentality is still there, and I hear it in his voice all the time when he compares the two.

If you think this is bullshit, and you've been around a long, long time like myself, ask yourself this: do you remember when GPUs were still being reviewed at 800x600 even though monitors supported all the way up to 1280x1024? Nvidia used to dominate the 800x600 realm, and the mentality was "no real gamer uses more than 800x600, so we won't review any higher." Being a smart-ass, I spent money on both brands to do my own tests at real resolutions, and I found that at the time ATI crushed Nvidia in the higher-resolution market. At 800x600 the reviews were spot-on, sure, but nobody I knew played at such a low resolution. This bullshit went on right up to the point where monitors were LCDs supporting 1680x1050. It was really sad. Then one day most review sites changed; some had already changed but weren't very popular, but eventually everyone started to release proper reviews. Well, somewhat proper.

Then you had the horrifying problem of reviews not turning on AA or AF, knowing that AMD would outperform Nvidia when maxing out AA and AF the way most gamers would. Once again I bought both brands to compare, and my results were spot-on. For the longest time I had noticed that with ATI/AMD, maxing AA and AF didn't matter; I always seemed to get the same FPS with it on, off, and everywhere in between (2x, 4x, etc.), whereas my Nvidia chips would drop FPS each time I added more AA and AF. Then I realized review sites didn't turn on AA and AF past a certain point, per review, just to make Nvidia look good. Some would do 4x, others 2x, even though the games supported much higher settings. This too went on for a long time.

Out of it all, I learned one good thing: never trust a review, take it with a grain of salt. Which, in itself, means you should take what I say with a grain of salt too, but hey, you can do your own research and find out for yourself. My biggest "hurrah" for AMD GPUs was that if my FPS didn't change when adding AA and AF, why not leave them on and gain picture quality, which let me see games better and more clearly, thus wrecking newbs. But hey, to each their own (talking to you, 800x600 kids).
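
If you want to reproduce that kind of AA/AF comparison yourself, here is a tiny sketch of the idea: capture a frame-time log with the settings off and another with them maxed, then compare average FPS. The file names and the one-value-per-line log format are assumptions for the example, not any specific tool's output.

```python
# Compare average FPS from two frame-time logs (milliseconds per frame,
# one value per line): one captured with AA/AF off, one with AA/AF maxed.
# File names and format are placeholders, not a specific benchmarking tool.

def avg_fps(path: str) -> float:
    with open(path) as f:
        frame_times_ms = [float(line) for line in f if line.strip()]
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

fps_off = avg_fps("frametimes_aa_off.csv")
fps_max = avg_fps("frametimes_aa_8x_af_16x.csv")

delta = (fps_max / fps_off - 1) * 100
print(f"AA/AF off: {fps_off:.1f} FPS, maxed: {fps_max:.1f} FPS ({delta:+.1f}%)")
```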

Now, onto the topic of this forum post: the new 300-series AMD cards with HBM and 20nm. Someone said "if it has HBM and 20nm, it will just suck up more power, right?" Either you were being facetious as hell or you are really stupid. HBM uses less power than GDDR5 by a large percentage, and 20nm would also use, ding ding, less power. Therefore, even if AMD were "power hungry," it would still be less than a 28nm GDDR5 counterpart if they went that route. Honestly, the 200-series AMD cards weren't THAT much better than my 7970, as my 7970 is basically a 270X, so there's no point in upgrading. Even if the new Nvidia chips are at great prices, they are early releases. I will wait to see what AMD offers with the 300 series.

ON THAT NOTE: some of you are saying "AMD always waits for Nvidia to release just so they have a fighting chance at releasing a proper GPU," and this is so ignorant I can't even... First off, go back to the dawn of time (maybe you have a decent memory, but probably not, so Google it): if you count release for release, starting from the first GPU to the latest, Nvidia is currently one generation ahead of AMD. RELEASE for RELEASE, REGARDLESS OF "DATE," Nvidia is ahead by one generation; it has been like this for a long time, with Nvidia always being the first to release a new generation of cards. They have the money and fan support to do so. Of course fanboys will scream "but AMD is slow and it gives them time and blah blah blah." Take it for what it is: AMD has yet to release a card in the same generation as Nvidia's current one. So go Nvidia for being on point and quick with their research and build times. WHEN the 300 series hits, I will be pleased to see how it compares to Nvidia's 900-series generation. It's going to be quite the battle indeed.

In the end, honestly, get what you can afford and enjoy it. I have been building computers for 17 years and I enjoy all builds: Intel and AMD, Nvidia and AMD, it's all great in my book. All in all, we are all PC master race and must fight to destroy the petty consoles!


-snip-

... Honestly, the 200-series AMD cards weren't THAT much better than my 7970, as my 7970 is basically a 270X. ...

-snip-

Actually this statement is factually incorrect.

 

The R9 280 is essentially an HD 7950 with tweaked clock speeds. The R9 280X is essentially an HD 7970 GHz Edition with some minor tweaks. They did make some very minor architectural and hardware changes, but it's basically the same core. Both are based on the Tahiti core.

 

The R9 270X, on the other hand, is not much more than a rebranded HD 7870 GHz Edition. It is based on the Pitcairn core.

 

Here are some links if you'd like to learn more:

http://en.wikipedia.org/wiki/AMD_Radeon_Rx_200_Series

http://en.wikipedia.org/wiki/Radeon_HD_7000_Series

 

Unfortunately, that little mistake erodes the credibility of your statements a bit. I find it highly unlikely that Linus is biased towards Intel because he's being paid off. Nor do I believe that there is a CONSPIRACY within the entire PC-enthusiast journalism community. One or two review sites being bought off? Sure. No problem. All of them? Hell no.

 

Where's the proof? Show me hard evidence that is replicable, that the reviews are biased.

 

Also, FYI, if you actually bothered to look at that CPU Boss link comparing the AMD 9590 and the i5-4690K, the i5 actually wins... There were only four benchmarks shown where the AMD took the lead. In all the rest (that's seven additional benchmarks), the i5 took the lead, sometimes drastically.

 

Everyone knows that in gaming an AMD 83xx is fast enough and will be fine, IF that is all you are doing. Most people recommend the i5 over AMD because it's more well-rounded. Gaming is very GPU-intensive, and most games are relatively light on CPU demand. However, if you are doing any CPU-bound work, then the i5 is most definitely a better value in most (but not all) circumstances.

 

As for the rest of your post, I have no idea what you're ranting about.  Yes there are tons of NVIDIA fanboys posting, but there are also tons of AMD fanboys posting.

 

FANBOYS CAN GO FUCK THEMSELVES, NO MATTER WHAT SIDE THEY ARE ON.


 

 

FANBOYS CAN GO FUCK THEMSELVES, NO MATTER WHAT SIDE THEY ARE ON.

QFT,  they are the reason every single thread gets derailed with stupid fucking graphs that prove nothing.


QFT,  they are the reason every single thread gets derailed with stupid fucking graphs that prove nothing.

[attached image: 9fa113b777.png]


[attached image: 9fa113b777.png]

That actually makes this thread much more bearable. And to be honest, your argument is a lot more rational than all the BS about power consumption and lies, etc.

