AMD releases the Radeon Pro Duo

DocSwag

Source: http://www.anandtech.com/show/10279/amd-releases-radeon-pro-duo-fiji-350w-vr

 
Quote

The AMD Radeon Pro Duo was first announced back in March, with the card being marketed for VR content creation first and foremost. With this card, AMD is promoting the ability to allocate one GPU per eye while powering VR experiences, opening the door to performance beyond what any other single card can offer today. Another use case for developers is offloading compute work to the second GPU while the first is used for graphics work, which can make for a much smoother experience during a demanding workflow.
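To see why one-GPU-per-eye matters, here is a toy frame-budget calculation. The render times are hypothetical placeholders, not measurements; only the 90 Hz refresh figure comes from typical VR headsets of the era:

```python
# Toy model of VR frame budgets (hypothetical numbers, not measured data).
# A 90 Hz headset allows ~11.1 ms per frame. If rendering one eye takes 9 ms,
# a single GPU doing both eyes serially (18 ms) misses the budget, while
# assigning one GPU per eye (9 ms, rendered in parallel) meets it.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000 / REFRESH_HZ        # ~11.1 ms per frame
PER_EYE_RENDER_MS = 9.0                    # assumed per-eye workload

single_gpu_ms = PER_EYE_RENDER_MS * 2      # both eyes serially on one GPU
dual_gpu_ms = PER_EYE_RENDER_MS            # one eye per GPU, in parallel

for label, ms in [("single GPU", single_gpu_ms), ("one GPU per eye", dual_gpu_ms)]:
    verdict = "ok" if ms <= FRAME_BUDGET_MS else "missed frame"
    print(f"{label}: {ms:.1f} ms vs {FRAME_BUDGET_MS:.1f} ms budget -> {verdict}")
```

With these assumed numbers, splitting the eyes across the two Fiji GPUs is the difference between hitting 90 Hz and dropping frames.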

 
AMD GPU Specification Comparison

                        Radeon Pro Duo   R9 Fury X    R9 Fury      R9 295X2
Stream Processors       2 x 4096         4096         3584         2 x 2816
Texture Units           2 x 256          256          224          2 x 176
ROPs                    2 x 64           64           64           2 x 64
Boost Clock             1000MHz          1050MHz      1000MHz      1018MHz
Memory Clock            1Gbps HBM        1Gbps HBM    1Gbps HBM    5Gbps GDDR5
Memory Bus Width        2 x 4096-bit     4096-bit     4096-bit     2 x 512-bit
VRAM                    2 x 4GB          4GB          4GB          2 x 4GB
FP64 Rate               1/16             1/16         1/16         1/8
TrueAudio               Y                Y            Y            Y
Transistor Count        2 x 8.9B         8.9B         8.9B         2 x 6.2B
Typical Board Power     350W             275W         275W         500W
Manufacturing Process   TSMC 28nm        TSMC 28nm    TSMC 28nm    TSMC 28nm
Architecture            GCN 1.2          GCN 1.2      GCN 1.2      GCN 1.1
GPU                     Fiji             Fiji         Fiji         Hawaii
Launch Date             Q2 2016          06/24/2015   07/14/2015   04/21/2014
Launch Price            $1499            $649         $549         $1499
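Some back-of-the-envelope math from the spec table, using the standard formulas (FP32 rate = shaders × 2 ops per FMA × clock; HBM bandwidth = bus width × data rate ÷ 8). These are theoretical peaks, not benchmark results:

```python
# Theoretical peak figures derived from the spec table above.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    """Peak FP32 throughput: shaders x 2 ops (fused multiply-add) x clock."""
    return shaders * 2 * boost_ghz / 1000

def hbm_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth: bus width x per-pin data rate, converted bits -> bytes."""
    return bus_bits * gbps_per_pin / 8

# Per Fiji GPU on the Pro Duo: 4096 shaders at 1000 MHz, 4096-bit HBM at 1 Gbps
per_gpu_tflops = fp32_tflops(4096, 1.0)    # 8.192 TFLOPS
per_gpu_bw = hbm_bandwidth_gbs(4096, 1.0)  # 512 GB/s

print(f"per GPU:    {per_gpu_tflops:.1f} TFLOPS FP32, {per_gpu_bw:.0f} GB/s")
print(f"card total: {2 * per_gpu_tflops:.1f} TFLOPS FP32, {2 * per_gpu_bw:.0f} GB/s")
```

That 16.4 TFLOPS combined figure is why AMD pitched this as the fastest single card of its day, with the usual caveat that both GPUs only contribute when the workload scales across them.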
 

 

Quote

The Radeon Pro Duo is essentially two Radeon R9 Nanos on a single PCB. At a high level, the Pro Duo should give us up to twice the performance at twice the power consumption (plus a bit extra for PCIe switches). To remove heat, the card comes with a closed loop cooler similar to that found on AMD’s Radeon R9 Fury X. This cooler, unlike the one found on the R9 295X2, provides a complete liquid cooling solution covering the VRMs on both GPUs along with the GPUs themselves. For reference, the pipes on this one are 540 mm long, and the double-thick radiator with fan comes in at 63 mm.

Quote

Moving past the cooling solution, we get three full-sized DisplayPort connectors and one full-size HDMI port. On the side of the card there are three 8-pin PCIe power connectors, which will do a more than adequate job of supplying the rated 350W power draw. Note that 350W is the equivalent of dual R9 Nano cards (rated at 175W apiece), and the card will be clocked similarly. By inference, the reactive frequency adjustments under heavy load are likely to be similar too, though we expect AMD to be using low-power binned parts for its new high-end card.

So AMD has finally released the Pro Duo, and it looks like they're making it two watercooled Nanos. My guess is that the power limit is 525 watts; all we've learned today is that the power draw is 350 watts. Tbh, I don't think this card is worth it at all at $1500. If it decreases in price like the 295X2 did, however, it could become a card worth buying.

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Seagate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


2 minutes ago, DocSwag said:

So AMD has finally released the Pro Duo, and it looks like they're making it two watercooled Nanos. My guess is that the power limit is 525 watts; all we've learned today is that the power draw is 350 watts. Tbh, I don't think this card is worth it at all at $1500. If it decreases in price like the 295X2 did, however, it could become a card worth buying.

It's not a card worth buying right now, and it definitely won't be once Pascal and Polaris get released in a few months...

NEW PC build: Blank Heaven   minimalist white and black PC     Old S340 build log "White Heaven"        The "LIGHTCANON" flashlight build log        Project AntiRoll (prototype)        Custom speaker project

Spoiler

Ryzen 3950X | AMD Vega Frontier Edition | ASUS X570 Pro WS | Corsair Vengeance LPX 64GB | NZXT H500 | Seasonic Prime Fanless TX-700 | Custom loop | Coolermaster SK630 White | Logitech MX Master 2S | Samsung 980 Pro 1TB + 970 Pro 512GB | Samsung 58" 4k TV | Scarlett 2i4 | 2x AT2020

 


4 minutes ago, Enderman said:

It's not a card worth buying right now, and it definitely won't be once Pascal and Polaris get released in a few months...

Unless you're a rich bastard.

 

Shot through the heart and you're to blame, 30fps and i'll pirate your game - Bon Jovi

Take me down to the console city where the games are blurry and the frames are thirty - Guns N' Roses

Arguing with religious people is like explaining to your mother that online games can't be paused...


4 minutes ago, Fulgrim said:

Unless you're a rich bastard.

 

in which case you would buy two Pro Duos for a few months and then upgrade :P


50 minutes ago, Enderman said:

It's not a card worth buying right now, and it definitely won't be once Pascal and Polaris get released in a few months...

For gamers, yeah, probably not a good idea.

But it is marketed as a card that can do both content creation and gaming.

The only thing I can think of that's similar is the Titan series.

And it has certified drivers designed for content creation, like the FirePro cards have.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


53 minutes ago, Fulgrim said:

Unless you're a rich bastard.

 

I still WANT IT.


21 minutes ago, kcaBsIsuxeN said:

I mean, seriously. Look at it... Have to wonder about the single 120mm fan, though. Even if the rad is twice as thick...

 

 

[Image: Radeon Pro Duo briefing deck slide]

If it's a HardwareLabs Black Ice GTX, or an Alphacool Monsta, it can handle two Nanos pretty well.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


11 minutes ago, patrickjp93 said:

If it's a HardwareLabs Black Ice GTX, or an Alphacool Monsta, it can handle two Nanos pretty well.

Fair enough. I wonder about noise too, though, given the pressure the fan has to deal with pushing/pulling air through that very thick rad.


Just now, kcaBsIsuxeN said:

Fair enough. I wonder about noise too, though, given the pressure the fan has to deal with pushing/pulling air through that very thick rad.

Given the fan is a Gentle Typhoon, it's the best that you can get, at least until Noctua figures out its active noise dampening tech for its fans.


It's basically 2 Nanos, the only case where it would make some kind of sense is in an mITX build. Still a cool card, just not worth the asking price, so if that goes down eventually, it will be a lot more sensible.


3 hours ago, Enderman said:

It's not a card worth buying right now, and it definitely won't be once Pascal and Polaris get released in a few months...

Well, if it drops to like 600-700 dollars after they release....

Then again, that seems a bit unlikely. 


2 minutes ago, mikat said:

As Linus said on the WAN Show, it doesn't have a place in the current market: developers aren't always optimizing for SLI, it's 300 bucks extra for an extra PCIe slot, and developers want to use what their customers (gamers) are using.

Number one: it's CrossFire, not SLI :D

Number two: AMD has stated that this card is targeted mainly at the VR developer/workstation market, not so much at gamers. If you ask me, this makes sense (except for the high price tag). In a workstation, the two GPUs will definitely help accelerate work. For VR developers, rendering one eye per GPU has been talked about for a long time, and this card would let developers optimize for that technique.

Still, if you ask me, I don't see why people can't just buy two Fury Xs. It costs less, too.


6 minutes ago, ninninon said:

AMD do dual-GPU cards properly...

With a closed loop using a (likely) noisy, slow pump and a single radiator that likely will allow 0 overclocking? Thanks but no thanks. Give me a cheap air solution, shave off $200, and let me do it myself.


7 minutes ago, DocSwag said:

Number one: it's CrossFire, not SLI :D

Number two: AMD has stated that this card is targeted mainly at the VR developer/workstation market, not so much at gamers. If you ask me, this makes sense (except for the high price tag). In a workstation, the two GPUs will definitely help accelerate work. For VR developers, rendering one eye per GPU has been talked about for a long time, and this card would let developers optimize for that technique.

Still, if you ask me, I don't see why people can't just buy two Fury Xs. It costs less, too.

A developer in Twitch chat said that he wants to use the hardware that gamers use, because he wants to know how it runs on a gamer's PC :)

And SLI stands for Scalable Link Interface, and CrossFire is a scalable link interface :)


1 minute ago, mikat said:

A developer in Twitch chat said that he wants to use the hardware that gamers use, because he wants to know how it runs on a gamer's PC :)

And SLI stands for Scalable Link Interface, and CrossFire is a scalable link interface :)

Ah, I see.


18 minutes ago, DocSwag said:

Number two: AMD has stated that this card is targeted mainly at the VR developer/workstation market, not so much at gamers. If you ask me, this makes sense (except for the high price tag). In a workstation, the two GPUs will definitely help accelerate work. For VR developers, rendering one eye per GPU has been talked about for a long time, and this card would let developers optimize for that technique.

Still, if you ask me, I don't see why people can't just buy two Fury Xs. It costs less, too.

This seems to go over people's heads. They have clearly stated this will be a prosumer card targeted at VR developers. They even delayed its launch to wait for VR gear to be ready. The issue might be that the message doesn't reach everyone intact.

 

9 minutes ago, mikat said:

And SLI stands for Scalable Link Interface, and CrossFire is a scalable link interface :)

SLI is Nvidia's technology for multi-GPU.

CrossFire is AMD's technology for multi-GPU.

You could essentially just switch the roles and call Nvidia's version CrossFire if you wanted.

EDIT: To explain further, just as GTX is now considered a synonym for GPUs, so is SLI, at least for the masses.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


2 minutes ago, Tomsen said:

 

SLI is Nvidia's technology for multi-GPU.

CrossFire is AMD's technology for multi-GPU.

You could essentially just switch the roles and call Nvidia's version CrossFire if you wanted.

I know, but I forgot to write CrossFire. You know what I meant :)


19 minutes ago, ninninon said:

True, but they do put two fully fledged GPUs on a single board.

So was the Titan Z.


26 minutes ago, mikat said:

A developer in Twitch chat said that he wants to use the hardware that gamers use, because he wants to know how it runs on a gamer's PC :)

And SLI stands for Scalable Link Interface, and CrossFire is a scalable link interface :)

Then the developer should use a 970 or other mid/low-end hardware, which is what 99% of gamers use, not a $1500 GPU.

Plus, if a developer optimized for a $1500 GPU, nobody would buy the game because it would run like crap on everyone's hardware.

BTW, SLI is for Nvidia and CFX is for AMD. CFX is not Scalable Link Interface.


2 minutes ago, Enderman said:

Then the developer should use a 970 or other mid/low-end hardware, which is what 99% of gamers use, not a $1500 GPU.

Plus, if a developer optimized for a $1500 GPU, nobody would buy the game because it would run like crap on everyone's hardware.

BTW, SLI is for Nvidia and CFX is for AMD.

That's what I meant (the Pro Duo is useless), and I know the difference between SLI and CFX, I just forgot to write CFX :)


3 hours ago, Tomsen said:

This seems to go over people's heads. They have clearly stated this will be a prosumer card targeted at VR developers. They even delayed its launch to wait for VR gear to be ready. The issue might be that the message doesn't reach everyone intact.

SLI is Nvidia's technology for multi-GPU.

CrossFire is AMD's technology for multi-GPU.

You could essentially just switch the roles and call Nvidia's version CrossFire if you wanted.

EDIT: To explain further, just as GTX is now considered a synonym for GPUs, so is SLI, at least for the masses.

The technology is called SLI, like how CUDA cores are stream processors.
