
RTX 3090 Ti to use GDDR6X memory, approaches HBM2 bandwidth

da na
2 minutes ago, J-from-Nucleon said:

So, seeing as the 3090 HAS GDDR6X, it would've been concerning if the 3090ti (if it even exists) does NOT come with GDDR6X or HBM2(/3)

The clock speeds are the interesting part, the clocks and the bandwidth. The fact that it approaches HBM2 speeds...
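As a rough sketch of where those bandwidth numbers come from (using the rumored 21 Gbps on GA102's 384-bit bus, 19.5 Gbps for the stock 3090, and the Radeon VII's 2.0 Gbps HBM2 on a 4096-bit bus; these are spec-sheet assumptions, not measurements):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
# Figures below are the rumored/stock numbers discussed in this thread, not benchmarks.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

rtx_3090    = bandwidth_gbs(19.5, 384)   # ~936 GB/s
rtx_3090_ti = bandwidth_gbs(21.0, 384)   # ~1008 GB/s (rumored)
radeon_vii  = bandwidth_gbs(2.0, 4096)   # ~1024 GB/s (HBM2)

print(f"3090:       {rtx_3090:.0f} GB/s")
print(f"3090 Ti:    {rtx_3090_ti:.0f} GB/s (+{100 * (rtx_3090_ti / rtx_3090 - 1):.1f}% over 3090)")
print(f"Radeon VII: {radeon_vii:.0f} GB/s")
```

That +7.7% is exactly the memory transfer rate increase the rumor describes, and it puts GDDR6X within about 16 GB/s of the Radeon VII's HBM2.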


17 minutes ago, leadeater said:

You really can't; the downside to HBM (at least right now) is that it pretty much only clocks up to its rated speed, and that's its hard limit.


 

 

Though I wonder if overclocked GDDR6X will be able to beat the Radeon VII with its HBM2 VRAM.


1 hour ago, CTR640 said:

Oh, oops. I went a bit overboard then lol. I still wanna get my 301st confirmed kill!

well, here's my 301st confirmed message

1 hour ago, Mel0nMan said:

Summary

Nvidia's RTX 3090 Ti is rumored to feature 21 Gbps GDDR6X memory when it launches. According to TechPowerUp this "maxes out the 384-bit bus width of GA102" at a 7.7% memory transfer rate increase over the 3090, making it the fastest memory on any team green card with 1,008 GB/s of bandwidth. The award for fastest gaming GPU memory still goes to the Radeon VII's HBM2, which achieves 1,024 GB/s.

Additionally, all 84 streaming multiprocessors (SMs) of GA102 will be enabled in the 3090 Ti.

 


 

My thoughts

It's incredible that ordinary GDDR memory is starting to approach the bandwidth of HBM2. However, it will also make the card quite expensive, which again raises the question of who it is for. It will likely be too expensive to appeal to 99% of gamers, while most creative professionals will still want to go with an RTX A6000 for their workflows.

 

Sources

https://www.techpowerup.com/289430/nvidia-geforce-rtx-3090-ti-to-feature-21-gbps-gddr6x-memory

as for my thoughts on it:

 

GPUs are already so powerful that it doesn't even make sense to upgrade

I do think that the speeds are quite impressive, but I also think they're quite... for lack of a better word, overkill

 

Who knows, Intel and AMD are about to play the most intense game of tag we've ever seen, so maybe we'll see GPU bottlenecks again when playing at 4K resolution.

imagine that.

 

12 minutes ago, Mel0nMan said:

I've used both Nvidia and AMD cards (granted, furthest I've gotten with each is 900 series and Polaris) but honestly I prefer Nvidia cards (especially Quadro) when it comes to drivers, support, and features.

I personally have only used NVIDIA cards, but judging by reviews, benchmarks, and my knowledge of the past: NVIDIA is known for making GPUs, AMD is known for making cheap alternative CPUs to Intel, Intel is known for making the best CPUs, and I don't know too much about the storage/RAM/peripheral world.

░█▀▀█ ▒█░░░ ▒█▀▀▄ ▒█▀▀▀ ▒█▀▀█   ▒█░░░ ░█▀▀█ ▒█░▄▀ ▒█▀▀▀ 
▒█▄▄█ ▒█░░░ ▒█░▒█ ▒█▀▀▀ ▒█▄▄▀   ▒█░░░ ▒█▄▄█ ▒█▀▄░ ▒█▀▀▀ 
▒█░▒█ ▒█▄▄█ ▒█▄▄▀ ▒█▄▄▄ ▒█░▒█   ▒█▄▄█ ▒█░▒█ ▒█░▒█ ▒█▄▄▄


1 hour ago, Spotty said:

Great, just what we need. More overly expensive graphics cards that nobody will be able to buy. I really have to ask why this even exists. Was the 3090 not fast enough? Does this really offer significant improvements over the existing 3090 to justify a 3090Ti card?

The release of a 3090 Ti makes sense:

NVIDIA are capitalizing on the current situation of the market.

They can launch the 3090 Ti at a higher price for better profit margins, and in this economy people will buy every one of them.

And of course the production of the 3090 Ti comes at the expense of the 3090 and lower-tier cards that have lower profit margins (because NVIDIA has limited fab capacity).

 

NVIDIA doesn't have to make 3060 cards when it can make just 3090 Ti cards, and every one of them will sell.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

16 minutes ago, yesyes said:

I personally have only used NVIDIA cards, but judging by reviews, benchmarks, and my knowledge of the past: NVIDIA is known for making GPUs, AMD is known for making cheap alternative CPUs to Intel, Intel is known for making the best CPUs, and I don't know too much about the storage/RAM/peripheral world.

Nvidia being in such a technologically and market-dominant position is more of a recent/medium-term thing. If you look back past the GTX 700 series, there were many times Nvidia did not have the fastest card, or had cards that were not the fastest and were also the least efficient.

 

ATI hard fumbled the HD 2000 and HD 3000 generations, got well back on track with the HD 4000, and took the lead again with the HD 5000 series; the HD 6000 series was back to a meh increase, then HD 7000 was back to the best again, and then... well, things from there didn't go as well. I still really liked my 290X's, and they were very competitive, but that was the last time AMD/ATI had anything overall "good" at the high end of the market until now with the RX 6800/6900 series.

 

Similar story on the CPU side with AMD vs Intel, although you have to go back slightly further, so the gap between the last actually good AMD CPUs and the good stuff they have now is even wider than on the GPU front. AMD CPUs up until 2005 were king, helped by the fact that Intel's P4 archs were all turds, but hey, you can't release winners every single time.


1 minute ago, leadeater said:

Similar story on the CPU side with AMD vs Intel, although you have to go back slightly further, so the gap between the last actually good AMD CPUs and the good stuff they have now is even wider than on the GPU front. AMD CPUs up until 2005 were king, helped by the fact that Intel's P4 archs were all turds, but hey, you can't release winners every single time.

AMD was more of a copycat for the first 20 years; then they offered cheaper alternatives to Intel, eventually beat them to 64-bit, and then lost for 10 years.

 

To get an idea of how exceptional AMD's current performance is, their profit from 2019 to 2020 spiked by 650%...

░█▀▀█ ▒█░░░ ▒█▀▀▄ ▒█▀▀▀ ▒█▀▀█   ▒█░░░ ░█▀▀█ ▒█░▄▀ ▒█▀▀▀ 
▒█▄▄█ ▒█░░░ ▒█░▒█ ▒█▀▀▀ ▒█▄▄▀   ▒█░░░ ▒█▄▄█ ▒█▀▄░ ▒█▀▀▀ 
▒█░▒█ ▒█▄▄█ ▒█▄▄▀ ▒█▄▄▄ ▒█░▒█   ▒█▄▄█ ▒█░▒█ ▒█░▒█ ▒█▄▄▄


17 minutes ago, leadeater said:

Nvidia being in such a technologically and market-dominant position is more of a recent/medium-term thing. If you look back past the GTX 700 series, there were many times Nvidia did not have the fastest card, or had cards that were not the fastest and were also the least efficient.

 

ATI hard fumbled the HD 2000 and HD 3000 generations, got well back on track with the HD 4000, and took the lead again with the HD 5000 series; the HD 6000 series was back to a meh increase, then HD 7000 was back to the best again, and then... well, things from there didn't go as well. I still really liked my 290X's, and they were very competitive, but that was the last time AMD/ATI had anything overall "good" at the high end of the market until now with the RX 6800/6900 series.

 

Similar story on the CPU side with AMD vs Intel, although you have to go back slightly further, so the gap between the last actually good AMD CPUs and the good stuff they have now is even wider than on the GPU front. AMD CPUs up until 2005 were king, helped by the fact that Intel's P4 archs were all turds, but hey, you can't release winners every single time.

The R9 was AMD's most innovative, competitive card IMO


36 minutes ago, Vishera said:

The release of a 3090 Ti makes sense:

NVIDIA are capitalizing on the current situation of the market.

 

NVidia is doing what NVidia typically does with each generation: capitalizing on the available silicon.  The market really has very little to do with their decision.  With each generation, they usually pop out a "better than the best" card, at the very tippy-top of the GeForce line, during the last year of that generation.  And whaddya know: 2022 is going to be a new generation.

 

Anyone (I'm not counting you) who is surprised, annoyed, or otherwise put off by this is ignoring what NVidia have done forever.  They're just following their usual pattern.

 

 

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz displayBenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


2 minutes ago, jasonvp said:

 

NVidia is doing what NVidia typically does with each generation: capitalizing on the available silicon.  The market really has very little to do with their decision.  With each generation, they usually pop out a "better than the best" card, at the very tippy-top of the GeForce line, during the last year of that generation.  And whaddya know: 2022 is going to be a new generation.

 

Anyone (I'm not counting you) who is surprised, annoyed, or otherwise put off by this is ignoring what NVidia have done forever.  They're just following their usual pattern.

 

 

But in the past there have been reasons to buy the best card. In this case, there are significantly fewer, IMO. For example, look at the 980 vs 980 Ti: slightly higher clocks, 6GB instead of 4GB of VRAM, better power delivery. Same with the 1080/1080 Ti and 2080/2080 Ti: more shaders, overall more powerful, 11GB instead of 8GB of VRAM. However, the 3090 is already so expensive, powerful, and overkill that a more powerful card doesn't have a place.


42 minutes ago, Vishera said:

They can launch the 3090 Ti at a higher price for better profit margins, and in this economy people will buy every one of them.

Just on this point, it isn't that clear cut. As I said in my previous post, I do feel the 3090 has been the most consistently available card since Ampere launched. Its high price probably doesn't help, and means only the deepest pockets pick them up. I feel the price is balancing with availability for now. Still, they do sell. The 3090 Ti will be above that.

 

How many have a 3090 now and will upgrade to the Ti? Probably few, unless they're the "must have the best available" crowd. It certainly exists, even if not in massive numbers.

 

How many did not choose a 3090 already, but the perf increase of the Ti will make them get it? Again, I don't feel it is going to be a great number. If you needed a 3090 class card, you could have got a 3090 already.

 

So who is left? There will be ongoing sales of course, since not everyone buys at the same time. I'm going to put a prediction out here: this may be the limit. They will sell, but in the longer term they won't push pricing much further than the 3090 already has. In other words, pricing will adapt to the market and may have to adjust to keep them moving.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


2 minutes ago, jasonvp said:

With each generation, they usually pop out a "better than the best" card, at the very tippy-top of the GeForce line

NVIDIA are using the Tick-tock strategy that Intel used in the past.

2014: 980 - Maxwell

2015: 980 Ti - Maxwell refresh

2016: 1080 - Pascal

2017: 1080 Ti - Pascal refresh

2018: 2080 - Turing

2019: 2080 Super - Turing refresh

2020: 3080 - Ampere

2021: 3080 Ti - Ampere refresh

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

9 minutes ago, Mel0nMan said:

But in the past there have been reasons to buy the best card. In this case, there are significantly fewer IMO.

 

And you ended that statement correctly: it's your opinion.  It may or may not be shared with other prospective buyers.  Remember there are folks out there who do the whole "FPS at any cost" instead of "$/FPS".  Those are the folks NVidia would be aiming this card at (assuming it actually materializes).

 

See: previous generation Titans.  It's the same thing.

 

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz displayBenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


31 minutes ago, yesyes said:

AMD was more of a copycat for the first 20 years

Interestingly, that was exactly what they were founded for. 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 minute ago, porina said:

How many have a 3090 now and will upgrade to the Ti?

Miners and scalpers buy loads of them.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

1 minute ago, Mark Kaine said:

Interestingly, that was exactly what they were founded for. 

“outsource the manufacturing.”

░█▀▀█ ▒█░░░ ▒█▀▀▄ ▒█▀▀▀ ▒█▀▀█   ▒█░░░ ░█▀▀█ ▒█░▄▀ ▒█▀▀▀ 
▒█▄▄█ ▒█░░░ ▒█░▒█ ▒█▀▀▀ ▒█▄▄▀   ▒█░░░ ▒█▄▄█ ▒█▀▄░ ▒█▀▀▀ 
▒█░▒█ ▒█▄▄█ ▒█▄▄▀ ▒█▄▄▄ ▒█░▒█   ▒█▄▄█ ▒█░▒█ ▒█░▒█ ▒█▄▄▄


1 minute ago, yesyes said:

“outsource the manufacturing.”

AMD reverse-engineered Intel chips, improved them a bit, and then sold them on the market.

That's how they started.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Just now, Vishera said:

AMD reverse-engineered Intel chips, improved them a bit, and then sold them on the market.

That's how they started.

But by that point Intel would have better CPUs, so AMD ended up being more of a cheaper alternative to Intel.

░█▀▀█ ▒█░░░ ▒█▀▀▄ ▒█▀▀▀ ▒█▀▀█   ▒█░░░ ░█▀▀█ ▒█░▄▀ ▒█▀▀▀ 
▒█▄▄█ ▒█░░░ ▒█░▒█ ▒█▀▀▀ ▒█▄▄▀   ▒█░░░ ▒█▄▄█ ▒█▀▄░ ▒█▀▀▀ 
▒█░▒█ ▒█▄▄█ ▒█▄▄▀ ▒█▄▄▄ ▒█░▒█   ▒█▄▄█ ▒█░▒█ ▒█░▒█ ▒█▄▄▄


6 minutes ago, Vishera said:

NVIDIA are using the Tick-tock strategy that Intel used in the past.

Intel's plan was to alternate between process-focused generations and architecture-focused generations. Nvidia I see as doing a new architecture every generation, often with a process change. Unlike Intel, I don't believe there is any notable process or architecture change in those refreshes. It is just a new configuration offering of the same generational tech.

 

3 minutes ago, Vishera said:

Miners and scalpers buy loads of them.

Miners, maybe, although I'm not sure the ROI makes much sense at the 3090 level. Scalpers still have to have someone to sell them on to, and this is higher-risk money they're looking at. Lower cards will shift more easily.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 minute ago, porina said:

Unlike Intel, I don't believe there is any notable process or architecture change in those refreshes. It is just a new configuration offering of the same generational tech.

 

Correct.  There's no "tick-tock" going on here.  It's just pressing higher and higher performance silicon out of the same generation at lower prices than their data center/compute cards.

 

Lather.  Rinse.  Repeat.  They'll do this again at the tail end of '23 leading into the '24 generation change.  Betcherass on it.

 

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz displayBenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


4 minutes ago, Vishera said:

AMD reverse-engineered Intel chips, improved them a bit, and then sold them on the market.

That's how they started.

I need to look at the history to check, but my understanding was that AMD was initially contracted by Intel to make Intel CPUs, as one of Intel's major customers at the time wanted a second source. This was fine until AMD started making enhancements to the CPUs, which Intel saw as AMD exceeding what they were there for. Lawsuits happened, AMD essentially won, and we have what we have today.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


27 minutes ago, porina said:

I need to look at the history to check, but my understanding was that AMD was initially contracted by Intel to make Intel CPUs, as one of Intel's major customers at the time wanted a second source. This was fine until AMD started making enhancements to the CPUs, which Intel saw as AMD exceeding what they were there for. Lawsuits happened, AMD essentially won, and we have what we have today.

I just looked it up: both of us are wrong.

 

They started as a second source not to Intel, but to Fairchild and National Semiconductor.

Quote

In September 1969, AMD moved from its temporary location in Santa Clara to Sunnyvale, California.[7] To immediately secure a customer base, AMD initially became a second source supplier of microchips designed by Fairchild and National Semiconductor.

 

And at some point later they were a second source to Intel like you said:

Quote

The company was a second source for Intel MOS/LSI circuits by 1973,

 

And they did reverse-engineer Intel processors and sell them, like I said:

Quote

By 1975, AMD entered the microprocessor market with the Am9080, a reverse-engineered clone of the Intel 8080

Source: https://en.wikipedia.org/wiki/Advanced_Micro_Devices

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

1 hour ago, Vishera said:

I just looked it up: both of us are wrong.

I was thinking of the x86 era, which seems to be not far off (same source):

Quote

In February 1982, AMD signed a contract with Intel, becoming a licensed second-source manufacturer of 8086 and 8088 processors.

I was not aware of their history going much further back than that.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Nvidia is milking the miners hard now and good for them.  One day us poor gamers and creators will be able to pick up the leftovers. 

 

Maybe it'll run Cyberpunk?


10 hours ago, Mel0nMan said:

Summary

Nvidia's RTX 3090 Ti is rumored to feature 21 Gbps GDDR6X memory when it launches. According to TechPowerUp this "maxes out the 384-bit bus width of GA102" at a 7.7% memory transfer rate increase over the 3090, making it the fastest memory on any team green card with 1,008 GB/s of bandwidth. The award for fastest gaming GPU memory still goes to the Radeon VII's HBM2, which achieves 1,024 GB/s.

Additionally, all 84 streaming multiprocessors (SMs) of GA102 will be enabled in the 3090 Ti.

 


 

My thoughts

It's incredible that ordinary GDDR memory is starting to approach the bandwidth of HBM2. However, it will also make the card quite expensive, which again raises the question of who it is for. It will likely be too expensive to appeal to 99% of gamers, while most creative professionals will still want to go with an RTX A6000 for their workflows.

 

Sources

https://www.techpowerup.com/289430/nvidia-geforce-rtx-3090-ti-to-feature-21-gbps-gddr6x-memory

People are so short-sighted when they ask "who is this card for?"

 

There are basically three categories of users:

1. Gamers

2. Machine Learning

3. GPU-accelerated workloads (e.g. Blender, Photoshop, Premiere, DaVinci Resolve, etc.)

 

Gamers do not need a GPU more powerful than their setup requires. So for most users with a 1080p display, an 8GB 3070 is likely the most they will be able to effectively use. However, if someone legitimately has a 4K or 8K setup, then a 3090 probably will be insufficient under specific conditions, and not future-proof enough. Many people on Windows aren't even aware their "4K" setups are really 1080p upscales, because unless a game lets you select 4K, it's likely running at 1080p and the desktop compositor is the one upscaling it to 4K. Then you have DLSS and similar tech, which upscale with the express purpose of putting a lesser load on the GPU.

 

Machine learning is very memory-bandwidth sensitive, and also performance sensitive. A desktop GPU usually falls well short of what is needed, depending on the workload, and in some cases things that need real-time performance require a much higher level of performance than something that can run in batch mode to maximize GPU utilization.
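To make the bandwidth-sensitivity point concrete, here is a minimal roofline-style sketch (the GPU figures and the layer shapes are illustrative assumptions, not benchmarks) that estimates whether a matrix multiply is limited by memory bandwidth or by compute:

```python
# Minimal roofline-style estimate: is a GEMM bandwidth-bound or compute-bound?
# All figures are illustrative assumptions, not measured numbers.

def gemm_arithmetic_intensity(m: int, n: int, k: int, bytes_per_elem: int = 2) -> float:
    """FLOPs per byte moved for C = A (m x k) @ B (k x n) with FP16 operands."""
    flops = 2 * m * n * k                                     # one multiply + one add per output element
    bytes_moved = bytes_per_elem * (m * k + k * n + m * n)    # read A and B, write C (ideal caching)
    return flops / bytes_moved

# Rough peak figures assumed for a 3090-class card: ~36 TFLOP/s FP16, ~0.94 TB/s bandwidth.
peak_flops = 36e12
peak_bw = 0.94e12
ridge_point = peak_flops / peak_bw  # FLOPs/byte needed before compute becomes the limit

for (m, n, k) in [(1, 4096, 4096),      # batch-1 inference: tiny arithmetic intensity
                  (512, 4096, 4096)]:   # large batch: much higher intensity
    ai = gemm_arithmetic_intensity(m, n, k)
    bound = "bandwidth-bound" if ai < ridge_point else "compute-bound"
    print(f"GEMM {m}x{k} @ {k}x{n}: {ai:.1f} FLOP/byte vs ridge {ridge_point:.1f} -> {bound}")
```

The batch-1 case lands around 1 FLOP/byte, far below the ridge point, which is why memory bandwidth rather than raw compute tends to dominate small-batch ML workloads.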

 

GPU-accelerated workloads, such as video work, can impose an extreme burden on GPUs, but it will be a mixed bag of "stuff that takes advantage of CUDA" and "stuff that takes advantage of NVENC/NVDEC", where a weaker GPU lessens your capability to work efficiently. With CAD projects, this could mean the difference between being able to load the entire project or only being able to load it one floor, or one room, at a time.
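As a small illustration of the NVENC side of that split, a sketch like the following hands the H.264 encode off to the GPU's fixed-function encoder via ffmpeg (the file names are placeholders, and it assumes an ffmpeg build with NVENC support and an NVIDIA driver present):

```python
import subprocess

# Offload H.264 encoding to the GPU's NVENC block via ffmpeg.
# "input.mp4"/"output.mp4" are placeholder file names.
cmd = [
    "ffmpeg",
    "-y",                    # overwrite the output without asking
    "-hwaccel", "cuda",      # decode on the GPU as well (NVDEC), when supported
    "-i", "input.mp4",
    "-c:v", "h264_nvenc",    # hardware encoder instead of libx264 on the CPU
    "-b:v", "8M",            # target video bitrate
    "-c:a", "copy",          # leave the audio stream untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```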

 

In general, the desktop GPUs and workstation (Quadro) GPUs are the exact same part; only different driver optimizations (like unlimited video encode sessions on Quadros) differentiate them. If you don't need CUDA, you don't need an NVIDIA GPU, and most people can, and probably should, pick the AMD option. If you need CUDA (e.g. ML), then you don't have a choice.

 

So the 3090 Ti is likely the same part as the A6000, which presently comes with 48GB of GDDR6 and costs over $7,000.

 

 

