AMD Lists The Radeon RX 490 Flagship – Polaris based Dual GPU Graphics Card For 4K Ready Gaming

Mr_Troll
11 minutes ago, -BirdiE- said:

As much as I'd love to think they're just waiting... I'm worried the issue is deeper than that.

If it takes them 165w (or whatever it is) with Polaris just to get 970 performance, what would it need to draw to compete with something like the 1080?

 

Maybe I'm just Jon Snow (I know nothing), but I feel like they were trying for Polaris to be the 490, and Vega to be the next iteration of the Fury series, but Polaris was just incredibly underwhelming so they had to sell the full chip for $239. There were rumors that Polaris was a big disappointment within AMD, which I dismissed at the time, but the more I see of Polaris, the more it seems as if those were true.

 

They need Vega ASAP.

Nvidia's GPUs are leaner. That much is clear but they do sacrifice some things to get there. I'm not gonna argue which approach is better.

 

Polaris was always meant to be a small entry-to-mid range chip. It's the media and consumers who have had this skewed perception of what it was. There is a reason Vega is coming so soon on the heels of Polaris, it's a bigger Polaris with some tweaks (probably); you can't just make a new architecture in six months. AMD just has different code names than Nvidia. There is of course also the fact that AMD doesn't have the same resources as Nvidia does. It slows them down and hinders them time and time again. That's probably why Vega isn't launching sooner. They need to fit it within their budget, so that means a very staggered release.


Polaris OC'd to death with GDDR5X or HBM to boost performance

Processor: Intel core i7 930 @3.6  Mobo: Asus P6TSE  GPU: EVGA GTX 680 SC  RAM:12 GB G-skill Ripjaws 2133@1333  SSD: Intel 335 240gb  HDD: Seagate 500gb


Monitors: 2x Samsung 245B  Keyboard: Blackwidow Ultimate   Mouse: Zowie EC1 Evo   Mousepad: Goliathus Alpha  Headphones: MMX300  Case: Antec DF-85


12 minutes ago, goodtofufriday said:

I've said it before and I'll say it here: why does power draw even matter these days? I'm thinking back to my 6990, which used up 450W, and other cards of the day. A couple of watts' difference literally means nothing today. Even in places where electricity is expensive, 40W is a marginal difference in price.

It has nothing to do with the cost of electricity. As chips get smaller and faster, they get harder to cool as the heat gets more concentrated. The more power draw, the more heat, the lower the clock speeds for a similar chip.

 

Do you think it's coincidence that today's architectures are much faster than your 6990? Probably not. Greater efficiency allows them to make faster chips.

 

What does this mean for consumers? Not a whole lot as long as price/performance is the same. (Stated it was fine for consumers in my original post)

 

What does this mean for the companies? Higher cost to produce a GPU at any given performance.


Very bad news if they turn out to be true. Dual GPU cards are simply not as good as single GPU ones.

What they need in order to compete with Nvidia is Vega. They should not be making a dual GPU card to compete with them. A dual GPU card using mid-range GPUs doesn't even make any sense.

 

 

42 minutes ago, samcool55 said:

Wut? Doesn't make sense.

They should call it the RX 480X2 if it's a dual-GPU Polaris-based card.

AMD has historically not used x2 for their dual GPU cards.

Examples:

5870 was single GPU. 5970 was dual GPU.

6970 was single GPU. 6990 was dual GPU.

7970 was single GPU. 7990 was dual GPU.


16 minutes ago, -BirdiE- said:

It has nothing to do with the cost of electricity. As chips get smaller and faster, they get harder to cool as the heat gets more concentrated. The more power draw, the more heat, the lower the clock speeds for a similar chip.

 

Do you think it's coincidence that today's architectures are much faster than your 6990? Probably not. Greater efficiency allows them to make faster chips.

 

What does this mean for consumers? Not a whole lot as long as price/performance is the same. (Stated it was fine for consumers in my original post)

 

What does this mean for the companies? Higher cost to produce a GPU at any given performance.

So then, as long as the cooler can do the job, my point stands that power consumption is a non-issue.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


3 minutes ago, LAwLz said:

7970 was single GPU. 7990 was dual GPU.

Wasn't it an unofficial card made by like Powercolor or Club3D?


1 minute ago, goodtofufriday said:

So then, as long as the cooler can do the job, my point stands that power consumption is a non-issue.

As I said, it's a non-issue for consumers given the same price/performance... Never did I claim it was some huge issue for consumers...

 

It IS an issue for the companies, since more cores and a more robust cooler will eat into their margins.


31 minutes ago, goodtofufriday said:

I've said it before and I'll say it here: why does power draw even matter these days?

Power draw means heat, and excessive heat kills components.

 

proud owner of a, now defective, HD4870X2


39 minutes ago, goodtofufriday said:

I've said it before and I'll say it here: why does power draw even matter these days? I'm thinking back to my 6990, which used up 450W, and other cards of the day. A couple of watts' difference literally means nothing today. Even in places where electricity is expensive, 40W is a marginal difference in price.

Power draw is important because rampant power draw increases your thermal envelope, which means you need a better cooling solution (that is, loud fans or an AIO water cooler). Not to mention you won't have a shot in the mobile and server markets if you're a higher power consumer.

 

For mobile, every minute on the battery counts. Sure you won't be gaming on battery, but if an NVIDIA laptop can squeeze out an hour more for the same or slightly worse performance, I'd take that over AMD. In servers and workstations, sure, 40W may not seem like much to you, but when you multiply that by hundreds if not thousands of units, it starts to add up real quick. And that's not even factoring what it would cost to keep those computers cool.
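To put rough numbers on that multiplication (the fleet size and electricity rate here are illustrative assumptions, not figures from anyone's actual deployment):

```python
# Back-of-the-envelope cost of a 40W per-card difference across a fleet.
# The fleet size and electricity rate are assumed for illustration.
extra_watts = 40            # extra draw per card, W
cards = 1000                # number of units in the fleet
hours_per_year = 24 * 365   # running around the clock
rate_per_kwh = 0.12         # assumed electricity price, USD/kWh

kwh_per_year = extra_watts * cards * hours_per_year / 1000
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:,.0f}/year, before cooling overhead")
```

Even at a modest rate, a 40W gap works out to tens of thousands of dollars a year at fleet scale, and the cost of cooling that extra heat stacks on top.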

 

Intel and later NVIDIA learned the best way to go forward is to build from the bottom up, because it's much easier to scale up than it is to scale down. AMD seems to have just realized this, sort of.


22 minutes ago, LAwLz said:

AMD has historically not used x2 for their dual GPU cards

Well, the HD4870X2 exists under the AMD portfolio.

Although, ATi used to put X2 in the names of their dual-GPU cards: the HD3870X2, and maybe others ... don't recall.

 

Later edit: the HD3870X2 was under AMD too 9_9

Edited by zMeul

Considering how much time they needed for the last dual GPU, I am expecting this when Win11 is released.

The ability to google properly is a skill of its own. 


8 minutes ago, M.Yurizaki said:

Power draw is important because rampant power draw increases your thermal envelope, which means you need a better cooling solution (that is, loud fans or an AIO water cooler). Not to mention you won't have a shot in the mobile and server markets if you're a higher power consumer.

 

For mobile, every minute on the battery counts. Sure you won't be gaming on battery, but if an NVIDIA laptop can squeeze out an hour more for the same or slightly worse performance, I'd take that over AMD. In servers and workstations, sure, 40W may not seem like much to you, but when you multiply that by hundreds if not thousands of units, it starts to add up real quick. And that's not even factoring what it would cost to keep those computers cool.

 

Intel and later NVIDIA learned the best way to go forward is to build from the bottom up, because it's much easier to scale up than it is to scale down. AMD seems to have just realized this, sort of.

My comment is for the desktop consumer enthusiast market. For other markets it's a different story. I just get upset when people compare power draw as though it's a deciding factor. In desktops for games it just doesn't matter anymore, besides maybe an ITX machine.



2 minutes ago, goodtofufriday said:

My comment is for the desktop consumer enthusiast market. For other markets it's a different story.

Any enthusiast consumer with half a brain would buy a GTX 1080 instead of a dual Polaris 10 card.


1 minute ago, zMeul said:

Any enthusiast consumer with half a brain would buy a GTX 1080 instead of a dual Polaris 10 card.

I don't disagree. My first comment on this thread was basically this. Single GPU options are always better.



Mini-ITX master race represent

Anyone calling this 'a waste' is an ATX user stuck in the past.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


6 minutes ago, goodtofufriday said:

I don't disagree. My first comment on this thread was basically this. Single GPU options are always better.

Yes, and this is exactly why a dual RX 480 video card doesn't make much sense.


I'm gonna wait and see whether AMD is going to start doing their multi-GPU single-die stuff now, instead of judging them based on the assumption that these will be two separate dies joined by Crossfire.

 

I'm also going to wait for reviewers to get a look at what frametimes are like, both in DX11 and DX12.

We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:


Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: EVGA GTX 1080 Ti SC2 Hybrid

Attachcorethingy: Gigabyte H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone RL06 "Lookalike"

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Razer Viper Mini + Huion H430P drawing Tablet

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: LG 27UK850-W 4K HDR

 


Hilarious. Dual 480s don't even match a 1080 in a number of titles, and even where they do, 45fps is not quite 4K-ready.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


27 minutes ago, Energycore said:

I'm gonna wait and see whether AMD is going to start doing their multi-GPU single-die stuff now, instead of judging them based on the assumption that these will be two separate dies joined by Crossfire.

 

I'm also going to wait for reviewers to get a look at what frametimes are like, both in DX11 and DX12.

Multi-Logic interposers won't be ready for prime time for at least another year. It will be XFire. Of that you can be certain.



50 minutes ago, zMeul said:

Any enthusiast consumer with half a brain would buy a GTX 1080 instead of a dual Polaris 10 card.

 

49 minutes ago, goodtofufriday said:

I don't disagree. My first comment on this thread was basically this. Single GPU options are always better.

I think maybe if it was out right now it wouldn't be the worst option. Seeing 8GB cards going for $240, if it was $500 and had similar performance to a 1080 (which realistically is $700 at this moment in the US), that $200 less might be worth the headache of a dual GPU. But as always, one GPU is better than two, though other factors can be taken into account: price and performance is a big one, along with power draw.


52 minutes ago, goodtofufriday said:

My comment is for the desktop consumer enthusiast market. For other markets it's a different story. I just get upset when people compare power draw as though it's a deciding factor. In desktops for games it just doesn't matter anymore, besides maybe an ITX machine.

Let's see if this is true. The 480 is 970-level performance at 165W, and the 1070 is 980 Ti-level performance at 150W. For AMD to have a card with 1070-level performance, I would guess a 200-ish watt power draw. Sure, 40W is no biggie.

However, with the same architecture, what will the power draw be for 1080-level performance? We now know 2x RX 480 won't outperform a 1080; that's 165 x 2 = 330W vs 180W for the 1080. Now that's a huge deal (a 150W difference). You need a way better PSU and a way better cooling system. Also, the overclock headroom is way thinner with that heat output under the same cooling conditions.

Now, how much power will AMD need to reach Titan P performance? God knows. 500W? That's even more heat output and even thinner overclock headroom. And THIS is why I am very disappointed with the Polaris architecture.
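To lay the comparison out explicitly (the wattages are the rough figures quoted in this thread, not measured numbers):

```python
# Rough power-draw comparison using the approximate figures from the thread.
cards_w = {
    "RX 480": 165,         # roughly GTX 970-level performance
    "GTX 1070": 150,       # roughly 980 Ti-level performance
    "2x RX 480": 2 * 165,  # dual-card setup
    "GTX 1080": 180,
}
gap = cards_w["2x RX 480"] - cards_w["GTX 1080"]
print(f"Dual RX 480: {cards_w['2x RX 480']}W vs GTX 1080: {cards_w['GTX 1080']}W ({gap}W more)")
```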


2 hours ago, Trixanity said:

Wasn't it an unofficial card made by like Powercolor or Club3D?

Nope. Official card. Although Powercolor might have made their own version.

 

 

1 hour ago, That Norwegian Guy said:

Mini-ITX master race represent

Anyone calling this 'a waste' is an ATX user stuck in the past.

This card probably won't be smaller than the GTX 1080. Most likely won't outperform it either (overall), and in games where scaling isn't that great you might go all the way down to 1/2 of the performance. Not to mention microstuttering and all the other headaches multi-GPU setups might have.

Oh and let's not forget that having two 480 GPUs will probably produce a lot more heat than a single 1080 GPU.

 

If this is a dual 480 card, then it will not be better than a GTX 1080 for a mini-ITX build.


2 hours ago, LAwLz said:

Very bad news if they turn out to be true. Dual GPU cards are simply not as good as single GPU ones.

What they need in order to compete with Nvidia is Vega. They should not be making a dual GPU card to compete with them. A dual GPU card using mid-range GPUs doesn't even make any sense.

 

 

AMD has historically not used x2 for their dual GPU cards.

Examples:

5870 was single GPU. 5970 was dual GPU.

6970 was single GPU. 6990 was dual GPU.

7970 was single GPU. 7990 was dual GPU.

You forgot the R9 295X2, so it would make sense imo...

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


19 minutes ago, LAwLz said:

-snip-

I think he meant the HD 7990.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


3 hours ago, Misanthrope said:

No chance in hell it's just a dual polaris card unless they've run into catastrophic issues with Vega. 

Or unless it has 4608 stream processors on a single GPU core! Polaris 1? :D

 

But seriously, it either has to be 2x RX 480s on a single PCB, or it has to be an R9 Fury/Fury X with a 14nm die shrink.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),

