AMD RX 5700 Navi New arch

Firewrath9
10 minutes ago, TheDankKoosh said:

DDR means DOUBLE data rate; that doesn't change just because it's GDDR. Just because AMD calculates it that way does not mean that it's the true way of calculating it.

PS: GDDR5X is quad data rate


10 minutes ago, cj09beira said:

PS: GDDR5X is quad data rate

Well, that's a small exception; most memory is double data rate.



51 minutes ago, Firewrath9 said:

sigh

 

+500 MHz on the base memory clock is 2500 MHz, which is multiplied by 4 to get the effective clock, 10 GHz.

Can you do some research on things before saying them?

12 minutes ago, cj09beira said:

PS: GDDR5X is quad data rate

No, GDDR is still a double data rate design. @TheDankKoosh is correct.
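
For anyone following the numbers, here is a minimal sketch of the arithmetic being argued over. The 2000 MHz base is only implied by the "+500 MHz -> 2500 MHz" figure quoted above, and the multipliers simply illustrate the 2x vs. 4x counting conventions, not any particular card's spec:

```python
# Illustrative only: the 2000 MHz base is inferred from the "+500 MHz -> 2500 MHz"
# figure quoted above; the multipliers show the 2x vs 4x counting conventions.

def effective_rate_mhz(base_clock_mhz: float, transfers_per_clock: int) -> float:
    """Effective ("marketing") rate = reference clock x transfers per clock."""
    return base_clock_mhz * transfers_per_clock

base = 2000 + 500  # assumed base clock plus the +500 MHz offset from the post above

print(effective_rate_mhz(base, 2))  # DDR counting ->  5000 MHz (5 Gbps per pin)
print(effective_rate_mhz(base, 4))  # QDR counting -> 10000 MHz (the "10 GHz" figure)
```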

 



2 hours ago, Drak3 said:

No, GDDR is still a double data rate design. @TheDankKoosh is correct.

 

GDDR5X is QDR; GDDR5 isn't, though some OC tools measure/display it as such.


4 hours ago, TheDankKoosh said:

DDR means DOUBLE data rate; that doesn't change just because it's GDDR. Just because AMD calculates it that way does not mean that it's the true way of calculating it.

GDDR5X/GDDR6 is 2x GDDR5, which is 2x GDDR3, which is double data rate.


4 hours ago, Drak3 said:

No, GDDR is still a double data rate design. @TheDankKoosh is correct.

 

It's calculated as 4x in Afterburner though, IIRC.


2 minutes ago, Firewrath9 said:

It's calculated as 4x in Afterburner though, IIRC.

And? Software engineers can make mistakes with labeling things.



5 minutes ago, Drak3 said:

And? Software engineers can make mistakes with labeling things.

So if he had +500 MHz on the memory, does he have a 10 GHz actual clock?

Isn't that the argument?


2 minutes ago, Firewrath9 said:

So if he had +500 MHz on the memory, does he have a 10 GHz actual clock?

Isn't that the argument?

Don't know, don't really care. I'm not one of the software engineers behind products like Afterburner.

 

But I know this: GDDR5X has what is called quad data rate, but it actually changes the prefetch size. To the end consumer the effect is the same: you're doubling the data transferred relative to the clock.

It's a very misleading naming scheme, and if we applied it to every DDR revision that doubled the prefetch size, we'd be at some kind of 'hexadeca-rate' (16x rate) by now.
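
To put rough numbers on the "data transferred relative to the clock" point, here is a hedged sketch. The 2000 MHz clock and 256-bit bus are illustrative assumptions, not the spec of any particular card:

```python
# Illustrative only: clock and bus width are assumptions, chosen to show how the
# per-clock transfer count (what the "double"/"quad" labels describe) scales bandwidth.

def peak_bandwidth_gbs(clock_mhz: float, transfers_per_clock: int, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = clock x transfers per clock x bus width / 8 bits per byte."""
    return clock_mhz * 1e6 * transfers_per_clock * bus_width_bits / 8 / 1e9

# Same reference clock, one generation moving twice the data per clock:
print(peak_bandwidth_gbs(2000, 2, 256))  # 128.0 GB/s
print(peak_bandwidth_gbs(2000, 4, 256))  # 256.0 GB/s - double the data, same clock
```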



7 hours ago, Firewrath9 said:

Well, first of all, it uses 16Gb (2GB) GDDR6 chips, just two of them for every slot. It's basically an RTX Titan with an extra set of 2GB chips and Quadro drivers.

The Titan RTX is $2,500, so at MOST the price of doubling the VRAM is $2,500 (ripping the GDDR6 off a second Titan and putting it onto a Titan, plus the Quadro drivers, basically gets you this card).

Most likely it's ~$200, as 12 GB of 8Gb GDDR6 is ~$50-80.

 

HBM is fine for datacenter usage, but it does not belong on mainstream cards.

Also, there were engineering problems due to the differing stack heights, which is why several Vegas had TJunction issues.

 

 

Interesting, I didn't think GDDR could be set up as two chips per "channel" (not sure how else to describe it). I assumed that, given the 384-bit bus, it had to be 12 chips. That explains why I've never heard of 4GB chips. I'd love a shot of the PCB to see how they handled the traces on that.

 

That said, the traces mean a 24-chip PCB is going to be really, really expensive. That's why consumer cards are limited to 12 chips; the PCB cost of going beyond that is just unfeasible. To be fair, as I implied in my prior post, I'm betting a lot of the cost is just the Quadro branding. But I'd bet it's also freakishly expensive compared to a consumer card, so there's no way they could do that on a consumer-level card without majorly inflating the price, just like HBM does.
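
For what it's worth, a rough sketch of that chip-count math. The 32-bit-per-chip interface is standard for GDDR; the clamshell doubling and 16Gb density are assumptions about how the Quadro gets to 24 chips, so treat the outputs as illustrative:

```python
# Rough illustration: GDDR chips normally use a 32-bit interface, so bus width / 32
# gives the chip count; "clamshell" mode shares each 32-bit channel between two chips.

def chip_count(bus_width_bits: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32
    return chips * 2 if clamshell else chips

def capacity_gb(bus_width_bits: int, chip_density_gbit: int, clamshell: bool = False) -> float:
    return chip_count(bus_width_bits, clamshell) * chip_density_gbit / 8

print(chip_count(384))                       # 12 chips on a 384-bit bus
print(capacity_gb(384, 16))                  # 12 x 16Gb (2GB) chips -> 24 GB
print(chip_count(384, clamshell=True))       # 24 chips, two per channel
print(capacity_gb(384, 16, clamshell=True))  # -> 48 GB, hence the expensive PCB
```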


Let me guess, is NVIDIA going to counter the RX 5700 with an RTX 2070 Ti? They did that with the 1070 Ti vs. Vega 56.
 



On 5/31/2019 at 5:15 PM, RejZoR said:

HBM is great actually, but AMD was just so far ahead of its time that it wasn't feasible from a supply and price standpoint. You can be assured it'll be a standard sometime in the future, like GDDR5/5X/6 is now, because it's really the only way forward: a PCB simply cannot carry enough traces to all the memory modules once capacity goes beyond 16GB.

Everybody expected HBM prices to come down, and supply to go up over time.

 

But it didn't happen (so far), and hence AMD was screwed, because they had two successive generations of high-end GPUs (Fiji and Vega) that were designed around HBM and became supply-constrained and more expensive to produce than planned.

 

Not sure if that was bad luck or bad planning by AMD.


1 hour ago, Humbug said:

Everybody expected HBM prices to come down, and supply to go up over time.

 

But it didn't happen (so far), and hence AMD was screwed, because they had two successive generations of high-end GPUs (Fiji and Vega) that were designed around HBM and became supply-constrained and more expensive to produce than planned.

 

Not sure if that was bad luck or bad planning by AMD.

Mostly bad luck. The problem was that the FPGAs launching around the same time were using quite a bit of HBM (and they were expensive, so their makers could afford to pay a premium for it). Add the fact that the memory vendors were price-fixing all memory at the time, and it's really a number of unpredictable things happening at once.


1 hour ago, Humbug said:

Everybody expected HBM prices to come down, and supply to go up over time.

 

But it didn't happen (so far), and hence AMD was screwed, because they had two successive generations of high-end GPUs (Fiji and Vega) that were designed around HBM and became supply-constrained and more expensive to produce than planned.

 

Not sure if that was bad luck or bad planning by AMD.

Bad planning. It would have been lucky if it actually panned out.


2 hours ago, Humbug said:

Not sure if that was bad luck or bad planning by AMD.

 

A bit of both. They really should have had a backup plan ready in case GDDR improved enough, but at the time they started the move there was a lot of concern about hitting the limits of GDDR (NVIDIA used HBM on some of its last-generation professional cards for this reason; it couldn't get the capability it needed out of then-available GDDR). However, GDDR5X and GDDR6 have extended things further than anyone expected, which reduced the need for HBM and kept its prices high. What AMD was gambling on was that NVIDIA would have to switch to HBM for all its cards as well, and that didn't happen (though NVIDIA using HBM in a limited fashion last gen shows it was a close thing, and what happens in the future is anyone's guess).


For those who absolutely cannot wait until tomorrow's official announcement of the Radeon RX 5700 series:

Quote

 

[Image: AMD Radeon RX 5700 XT Navi specifications]

 

AMD Radeon RX 5700 XT features 40 Compute Units (2560 stream processors). The Navi GPU is clocked at 1605 MHz in base mode, 1755 MHz in Game Mode and 1905 MHz in boost mode. Of course, the new addition here is the Game Clock.

 

The card is confirmed to feature 8GB of GDDR6 memory. At least two variants will be launching, a 180W TDP (225W TBP) priced around $499 US and a 150W TDP (180W TBP) SKU priced around $399 US.

 

Source:  https://videocardz.com/80966/amd-radeon-rx-5700-xt-picture-and-specs-leaked
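
As a rough sanity check on those leaked figures, here is a sketch of the theoretical FP32 throughput they would imply, assuming the usual 2 FLOPs per stream processor per clock from fused multiply-add. These are the leaked numbers, not confirmed specs:

```python
# Theoretical peak FP32 throughput from the leaked figures; 2 FLOPs per SP per clock (FMA).
def peak_tflops(stream_processors: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
    return stream_processors * clock_mhz * 1e6 * flops_per_clock / 1e12

sps = 2560  # 40 CUs x 64 stream processors per CU
for label, clock_mhz in [("base", 1605), ("game", 1755), ("boost", 1905)]:
    print(f"{label}: {peak_tflops(sps, clock_mhz):.2f} TFLOPS")
# roughly 8.2, 9.0 and 9.8 TFLOPS respectively
```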


X - doubt.

 

Let's just wait until tomorrow and see what happens...

"Hell is full of good meanings, but Heaven is full of good works"


I wonder what they mean by the "game clock", and what the boost clock is for then. I mean, in which applications? Or maybe the base clock is the guaranteed clock under the heaviest strain, the game clock is the guaranteed minimum clock in games, and the boost clock is the maximum the card can achieve under good conditions.


4 hours ago, Humbug said:

Look at that hot and loud blower cooler! 

Just what we have all been waiting for.

It's not like the usual set don't have production models ready to go. Or did you actually think ASRock's 'concept' coolers were anything but 1-2 slightly tweaked parts away from whatever they ended up putting into production?


Blower coolers blow. I know they've improved the efficiency, but a blower won't be quiet no matter what. The RX 580 and RX 590 were cool because they arrived with aftermarket coolers by default.


9 minutes ago, RejZoR said:

Blower coolers blow. I know they've improved the efficiency, but a blower won't be quiet no matter what. The RX 580 and RX 590 were cool because they arrived with aftermarket coolers by default.

Only because the RX 480 launched first. Blowers would probably be the main thing I would bring up if I ever met with Lisa.

