
Micron confirms NVIDIA GeForce RTX 3090 gets 21Gbps GDDR6X memory

illegalwater

Summary

GDDR6X is real and runs at 19–21 Gbps, developed by Micron. This is also the first official mention of the RTX 3090 naming scheme.

[Image: NVIDIA GeForce RTX 3090 memory specification]

Quote

In a technology brief, Micron has confirmed that the GeForce RTX 3090 will sport 21 Gbps memory based on GDDR6X technology. Micron has supplied memory modules for GeForce graphics cards for years now; the GDDR6X partnership takes that relationship to the next step.

The NVIDIA GeForce RTX 3090 graphics card is listed on the Micron website with 12GB of GDDR6X memory. Micron estimates that the GeForce RTX 3090 will be able to break the 1 TB/s barrier, with bandwidth expected between 912 and 1008 GB/s.

It has long been speculated that the new high-end NVIDIA graphics cards might feature GDDR6X memory; however, not a single memory manufacturer confirmed it until now. This was not the case with GDDR6 memory, which was announced much sooner.

The NVIDIA GeForce RTX 3090 is expected to feature the GA102-300 GPU with 5248 CUDA cores, but this part of the specification has not yet been confirmed. While Micron claims that the RTX 3090 will launch with 12 GB of GDDR6X memory, we have also seen numerous leaks featuring 22, 20, and even 16GB configurations.
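
The 912–1008 GB/s range is just arithmetic on the per-pin data rate. A minimal sketch of that math, assuming the usual 32-bit interface per GDDR6X package and the 12-package configuration listed by Micron (the 32-bit figure is my assumption, not something the brief spells out):

# Back-of-envelope GDDR6X bandwidth math.
# Assumes each memory package has a 32-bit interface, so 12 packages = 384-bit bus.

def memory_bandwidth_gb_s(gbps_per_pin, packages=12, pins_per_package=32):
    """Total bandwidth in GB/s = per-pin data rate (Gb/s) * total pins / 8 bits per byte."""
    return gbps_per_pin * packages * pins_per_package / 8

print(memory_bandwidth_gb_s(19.0))  # 912.0 GB/s  -- lower end of Micron's estimate
print(memory_bandwidth_gb_s(21.0))  # 1008.0 GB/s -- breaks the 1 TB/s barrier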

Micron:

Micron GDDR6X is the world’s fastest graphics memory, delivering legendary performance to help drive ray tracing, shadow mapping and silky-smooth animation for an immersive PC gaming experience.

Micron:

In the summer of 2020, Micron announced the next evolution of Ultra-Bandwidth Solutions in GDDR6X. Working closely with NVIDIA on their Ampere generation of graphics cards, Micron's 8Gb GDDR6X will deliver up to 21Gb/s (data rate per pin) in 2020. At 21Gb/s, a graphics card with 12 GDDR6X packages will be able to break the 1TB/s system bandwidth barrier! Micron's roadmap also highlights the potential for a 16Gb GDDR6X in 2021 with the ability to reach up to 24Gb/s. GDDR6X is powered by a revolutionary new PAM4 modulation technology for Ultra-Bandwidth Solutions. PAM4 has the potential to drive even more improvements in data rate.
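
The PAM4 part is what makes that per-pin number possible: instead of two voltage levels carrying one bit per symbol (NRZ, as in regular GDDR6), PAM4 uses four levels carrying two bits per symbol, so the data rate doubles at a given toggling rate. A toy sketch of the idea (conceptual only, not the actual GDDR6X PHY):

# NRZ vs PAM4: bits carried per transferred symbol.
# Illustration of the signalling concept only; real GDDR6X encoding is more involved.
import math

NRZ_LEVELS = 2    # two voltage levels  -> 1 bit per symbol
PAM4_LEVELS = 4   # four voltage levels -> 2 bits per symbol (00, 01, 10, 11)

def data_rate_gbps(symbol_rate_gbaud, levels):
    return symbol_rate_gbaud * math.log2(levels)

# Same 10.5 GBaud toggling rate on the pin:
print(data_rate_gbps(10.5, NRZ_LEVELS))   # 10.5 Gb/s with NRZ signalling
print(data_rate_gbps(10.5, PAM4_LEVELS))  # 21.0 Gb/s with PAM4 signalling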

I'm surprised we're getting GDDR6X this soon; it feels like we just got GDDR6 yesterday.

 

Sources

https://videocardz.com/newz/micron-confirms-nvidia-geforce-rtx-3090-gets-21gbps-gddr6x-memory

Official document PDFs from Micron (taken down):

https://media-www.micron.com/-/media/client/global/documents/products/technical-marketing-brief/ultra_bandwidth_solutions_tech_brief.pdf?rev=16ecd1bb494f4a958810146858c02bab

https://media-www.micron.com/-/media/client/global/documents/products/technical-marketing-brief/gddr6x_pam4_2x_speed_tech_brief

Micron document mirror (thanks to NuLuumo on Reddit):

https://drive.google.com/file/d/17cSkg9RzLne74FGN5jvcLGopS79p4b_m/view


I swear, with every new article about the next generation of Nvidia graphics cards, I get more and more confused about what we are actually getting. I feel like there have been more conflicting leaks this generation than in any generation prior.


3 minutes ago, MageTank said:

I swear, with every new article about the next generation of Nvidia graphics cards, I get more and more confused about what we are actually getting. I feel like there have been more conflicting leaks this generation than in any generation prior.

Well, one thing is for sure: Micron did not confirm the model naming, nor an RTX 3090.


3 minutes ago, MageTank said:

I swear, with every new article about the next generation of Nvidia graphics cards, I get more and more confused about what we are actually getting. I feel like there have been more conflicting leaks this generation than in any generation prior.

It may well be that many of them are correct, but for different models and market placements. Anyway, hopefully in a little over two weeks we'll have something more solid to go on.


Well, that's a lot of memory.


1 minute ago, porina said:

It may well be that many of them are correct, but for different models and market placements. Anyway, hopefully in a little over two weeks we'll have something more solid to go on.

At this point, I've given up on caring about the performance numbers; I just want a card as fast as my current RTX 2080 Ti with HDMI 2.1 support. Oddly enough, that is the one specification that has yet to be confirmed in any of these leaks, and it's the only one I care about.


Unpopular opinion: I hope this is called the RTX 2190. "3090" sounds way too stupid and makes no sense.


Like I would be able to afford it.


1 minute ago, VegetableStu said:

i mean that would be ideal, but unfortunately there are people out there in places of actionable change who feel if the large end of the important digit doesn't go up every generation it'll feel like it's not as big of a generational leap ._.
 

hopefully i'll be dead before i get to see nvidia release the 10080ti, then the 11080ti...

now SIKE! we decided to call the new Ampere cards 110xx because Jensen's grandma is 110


Nvidia launched the 2000 series with GDDR6 running at speeds well below the factory spec, then released the "Super" cards at higher frequencies. Something tells me they will do the same with this generation; they've got to find some way to sell more Super cards.
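
For a sense of scale, the same bandwidth formula as above puts numbers on that gap. The figures below are the commonly cited Turing ones, quoted from memory rather than from the linked Micron brief, so treat them as approximate: the RTX 2080 shipped with 14 Gbps GDDR6 even though the memory was rated for up to 16 Gbps, and the 2080 Super bumped it to 15.5 Gbps, both on a 256-bit bus.

# Bandwidth impact of the Turing "Super" GDDR6 speed bump (approximate, widely
# reported figures; not taken from the Micron brief linked above).

def bandwidth_gb_s(gbps_per_pin, bus_width_bits=256):
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(14.0))   # 448.0 GB/s - RTX 2080 at launch
print(bandwidth_gb_s(15.5))   # 496.0 GB/s - RTX 2080 Super
print(bandwidth_gb_s(16.0))   # 512.0 GB/s - what top-rated 16 Gbps GDDR6 would give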


5 minutes ago, VegetableStu said:

hopefully i'll be dead before i get to see nvidia release the 10080ti, then the 11080ti...

I've still got a 2011 MacBook with a Radeon 6700M-series graphics card, which will once again be up to date in 2021, if everything goes as planned.

We will probably see some recycling of old numbering schemes before we get to the shameful five digits. And I'm pretty sure Intel is just shaming themselves with their numbering scheme because they haven't gotten to 10 nm yet.


This whole thing has been a mess, rumor-wise. There are stats based on the 7nm architecture, which it seems consumer cards won't be getting; there are naming confusions; there are "which kind of memory is which card getting" confusions. All kinds of stuff. The only thing I can figure to do is not believe any of it and wait for cards to be in the hands of reviewers for actual testing. Even that may not settle the 7nm/8nm question, because a reviewer cannot measure what is underneath the heat sink, and if they pre-release 7nm cards to reviewers but only sell 8nm cards to consumers, the numbers may still be wrong. The only thing I can do at the moment is not buy until everything shakes out, which will be after the cards actually release. So October, I guess.


I'll wait 4 years before I upgrade my 1080Ti. The prices these days are simply out of proportion.


We'll have to wait until launch day and reviews to really get a proper picture.

All this has been quite a mess, really.


1 minute ago, CTR640 said:

I'll wait 4 years before I upgrade my 1080Ti. The prices these days are simply out of proportion.

To be fair, we don't know what prices are going to be for either the new Nvidia line or Big Navi yet. If they want $500 or more for a will-play-everything gaming card again, I may be in the same boat as you. I'm not trusting rumors much these days; too much is conflicting, and this 8nm/7nm shell game is making things worse.


2 minutes ago, Bombastinator said:

To be fair, we don't know what prices are going to be for either the new Nvidia line or Big Navi yet. If they want $500 or more for a will-play-everything gaming card again, I may be in the same boat as you. I'm not trusting rumors much these days; too much is conflicting, and this 8nm/7nm shell game is making things worse.

Exactly. I also have the impression we are being played: both Nvidia and AMD will wait and see how the other handles VRAM, while we as customers are treated as nothing but morons waiting to have our wallets ripped open. AMD has been pricing their GPUs close to Nvidia's pricing as well, and if that's really the case with the so-called "Big Navi", then I'll gladly keep my money in my pocket and use it for something other than a GPU upgrade.

 

I gamed for 4 years and 7 months on my GTX 780, and I have no problem doing the same with my current 1080Ti.


5 minutes ago, CTR640 said:

Exactly. I also have the impression we are being played: both Nvidia and AMD will wait and see how the other handles VRAM, while we as customers are treated as nothing but morons waiting to have our wallets ripped open. AMD has been pricing their GPUs close to Nvidia's pricing as well, and if that's really the case with the so-called "Big Navi", then I'll gladly keep my money in my pocket and use it for something other than a GPU upgrade.

 

I gamed for 4 years and 7 months on my GTX 780, and I have no problem doing the same with my current 1080Ti.

AMD specifically doesn't seem to have been doing much of that. OEMs may be; resellers do seem to be. I don't know exactly where the price rises are coming from. I did notice that back when the 5700 XT came out, the non-reference cards were expected to be cheaper than the reference ones, but the opposite happened. There was a brief dip during Black Friday when the very cheapest non-reference cards actually brushed the upper end of the originally projected numbers, but it disappeared fast. Back when both Ampere and Big Navi were far from release, it was hoped that AMD and Nvidia would actually compete in a meaningful way, but feared that they would simply avoid each other and do the standard duopoly thing. The August/September quasi-but-not-actual release that seems to be coming for Ampere still leaves whether that will happen up in the air.


All I know (which isn't much) is that all these 'leaks' seem to be tech specs that have no real bearing on what the performance will be IRL.

 

I'm hearing terms that make no sense to me or can't even be quantified through comparison.

 

What I do know is..... 

 


1 minute ago, Bombastinator said:

AMD specifically doesn't seem to have been doing much of that. OEMs may be; resellers do seem to be. I don't know exactly where the price rises are coming from. I did notice that back when the 5700 XT came out, the non-reference cards were expected to be cheaper than the reference ones, but the opposite happened. There was a brief dip during Black Friday when the very cheapest non-reference cards actually brushed the upper end of the originally projected numbers, but it disappeared fast.

I see. Maybe it depends on the country. I never buy a reference GPU, even if it's the cheapest, as I value temps, and I have no regrets about having the AIB 1080Ti.


6 minutes ago, Kinder said:

All I know (which isn't much) is that all these 'leaks' seem to be tech specs that have no real bearing on what the performance will be IRL.

 

I'm hearing terms that make no sense to me or can't even be quantified through comparison.

 

What I do know is..... 

 

The one that is killing me is that the actual internals of the cards seem to be shifting as well. There are some very limited performance numbers for 7nm Ampere cards, but I've seen nothing for 8nm cards except an expectation that they will be "slower", though how much slower is unspecified.


3 hours ago, MageTank said:

At this point, I've given up on caring about the performance numbers; I just want a card as fast as my current RTX 2080 Ti with HDMI 2.1 support. Oddly enough, that is the one specification that has yet to be confirmed in any of these leaks, and it's the only one I care about.

 

Not from Nvidia or a supply-chain partner, but according to Tom from MLID (who claims to have half a dozen insiders up his sleeve), it's going to be supported right out of the box:

 

[Screenshot: MLID on HDMI 2.1 support]


Updated the sources in the OP with the technical brief on GDDR6X.


We have a visual:

Quote

Today, an alleged PCB of NVIDIA's upcoming GeForce RTX 3090 has been pictured and posted on social media.

The PCB appears to be a third-party design coming from one of NVIDIA's add-in board (AIB) partners, Colorful.

Most of the PCB in the picture is blurred out, and an Intel CPU has been placed over the GPU die area to hide information.

There are 11 GDDR6X memory modules visible surrounding the GPU, placed very close to it.

Another notable difference is the NVLink finger, which appears to use a new design.

Check out the screenshot of the Reddit thread and the PCB pictures below:

https://www.techpowerup.com/270986/nvidia-geforce-rtx-3090-ampere-alleged-pcb-picture-surfaces?amp=
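
Quick sanity check on that module count: if the full board actually carries 12 of Micron's 8Gb GDDR6X packages (the twelfth being hidden or cropped in the photo is my assumption; only 11 are visible), the numbers line up with the 12 GB listing from the Micron brief in the OP:

# Capacity and bus width implied by 12 x 8Gb GDDR6X packages.
packages = 12                    # 11 visible in the photo; the 12th is an assumption
density_gbit = 8                 # 8Gb per package, per Micron's 2020 GDDR6X parts
interface_bits_per_package = 32  # standard per-package interface width (assumption)

capacity_gb = packages * density_gbit / 8                # 12.0 GB of VRAM
bus_width_bits = packages * interface_bits_per_package   # 384-bit memory bus

print(capacity_gb, bus_width_bits)  # 12.0 384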

 

 

[Images: alleged NVIDIA GeForce RTX 3090 PCB]

