
NVIDIA Project Beyond GTC Keynote with CEO Jensen Huang: RTX 4090 + RTX 4080 Revealed

2 minutes ago, MageTank said:

GK104 (Kepler) with the GTX 770. Kepler as a generation is still within that 8 year timeframe (they technically released Kepler cards in 2016). What do I win?

 

Man, gotta love that we already had Kepler on the brain from earlier in this thread, lol.

 

Yeah, but the 770 was released in 2013.
But yes, that was the correct answer. 9 years, soon to be 10. God damn, those are old cards.
And outside of driver support, they still run OK-ish.


4 minutes ago, starsmine said:

Yeah, but the 770 was released in 2013.
But yes, that was the correct answer. 9 years, soon to be 10. God damn, those are old cards.
And outside of driver support, they still run OK-ish.

Hey, you said and I quote:

3 hours ago, starsmine said:

find me any generation in the last 8 years where the x70/xx70 was the FULL 104/204 chip

Generationally speaking, Kepler is totally within the last 8 years, even if the x70 cards were not, lol.

 

I still have a handful of Kepler cards that work (GTX 760x2 Mars, 1x GTX 770 MSI (N770-2GD5/OC), 2x GTX 780 Ti reference, 1x GTX Titan Black). Still one of my favorite generations of GPUs.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


1080 Ti beats 4060.

JK, but some of the recent cards seem slow compared to old generations.

More so given both the price and all the 2x speed-increase talk, plus nearly 10x the core count in the newer generation.


2 hours ago, MageTank said:

Man, gotta love that we already had Kepler on the brain from earlier in this thread, lol.

I didn't think the GTX 700 series was worth mentioning lol, but oh well, you went there 🙃


I still have my GTX 780 stored in case my 1080 Ti no longer works, lol.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


You guys see the new pics of the 4090 FE?

 

Has a fan on each side.

 

Just wish I knew if the FE can go to 600 watts.


 

[image: RTX 4090 Founders Edition]

 

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: G.Skill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


4 minutes ago, Kinda Bottlenecked said:

https://opendata.blender.org/ 

 

It looks like someone ran the Blender benchmarks on the 4090.

 

[image: Blender Open Data benchmark results]

 

90% increase in performance compared to a 3090 Ti.

 

99.5% increase in performance compared to a 3090.
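Those uplift figures are just relative medians. A minimal sketch of the arithmetic (the scores below are placeholders, not the actual Open Data numbers):

```python
# Percentage uplift between two Blender Open Data median scores.
# Scores here are hypothetical placeholders for illustration only.
def pct_increase(new_score: float, old_score: float) -> float:
    """Relative performance increase, in percent."""
    return (new_score / old_score - 1.0) * 100.0

rtx_4090 = 199.0  # hypothetical median samples/min
rtx_3090 = 100.0  # hypothetical

print(f"{pct_increase(rtx_4090, rtx_3090):.1f}%")  # 99.0%
```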


6 hours ago, Shzzit said:

Just wish I knew if the FE can go to 600 watts. 

Probably 520W. +15% power is fairly standard and the TDP is 450W. If you want higher you'll need to go with an AIB card.
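The ~520W figure falls straight out of the slider math, assuming a +15% cap on a 450W TDP:

```python
# Max board power from TDP and an OC power-target slider cap.
# 450 W TDP with a +15% cap works out to ~520 W, as noted above.
def max_power(tdp_watts: float, slider_pct: float) -> float:
    return tdp_watts * (1.0 + slider_pct / 100.0)

print(max_power(450, 15))  # 517.5
```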


1 hour ago, leadeater said:

Probably 520W. +15% power is fairly standard and the TDP is 450W. If you want higher you'll need to go with an AIB card.

I've read you can slide the power target over to get 600W, but I'm not sure.

 

If the FE can't go to 600W, they are really screwing themselves vs AIB cards.
 

Guessing AIB cards will be dual BIOS, 450W/600W?



1 hour ago, Shzzit said:

I've read you can slide the power target over to get 600W, but I'm not sure.

 

If the FE can't go to 600W, they are really screwing themselves vs AIB cards.
 

Guessing AIB cards will be dual BIOS, 450W/600W?

Getting up to 600W will certainly be AIB only.

 

One thing you have to be careful of is that some AIB cards might only allow a +12% OC power target but start from a 480W TDP, while another might allow +15% but start from a 450W TDP, etc.

 

There will likely be AIB cards that allow 600W, but you'll have to do a bit of research to find the ones that can. That said, just because you can increase the power target to 600W doesn't mean the GPU will actually use 600W; temperature and voltage have to support it, and I suspect it'll be rather hard to sustain 600W on the stock air coolers. Not that they can't cool that amount of heat, but that they can't keep the Tj low enough.

 

The RTX 3090 Ti got up to the mid-500W range; the RTX 4090 has the same TDP but is a much bigger/more complex chip, so I have no doubt it'll do more than the chart below shows.

[image: RTX 3090 Ti power-draw chart]
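The TDP-vs-slider caveat is easy to see with numbers. A sketch comparing two made-up cards (both hypothetical, not real SKUs):

```python
# Effective power ceiling depends on BOTH the starting TDP and the
# allowed power-target offset. Both cards below are hypothetical.
def power_ceiling(tdp_watts: float, slider_pct: float) -> float:
    return tdp_watts * (1.0 + slider_pct / 100.0)

cards = {
    "Card A (hypothetical)": (480, 12),  # higher TDP, smaller slider
    "Card B (hypothetical)": (450, 15),  # lower TDP, bigger slider
}
for name, (tdp, pct) in cards.items():
    print(f"{name}: {power_ceiling(tdp, pct):.1f} W")
# Card A ends up with the higher ceiling (537.6 W vs 517.5 W)
```

So the card with the smaller slider can still have the higher absolute ceiling.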


6 minutes ago, leadeater said:

Getting up to 600W will certainly be AIB only.

 

One thing you have to be careful of is that some AIB cards might only allow a +12% OC power target but start from a 480W TDP, while another might allow +15% but start from a 450W TDP, etc.

 

There will likely be AIB cards that allow 600W, but you'll have to do a bit of research to find the ones that can. That said, just because you can increase the power target to 600W doesn't mean the GPU will actually use 600W; temperature and voltage have to support it, and I suspect it'll be rather hard to sustain 600W on the stock air coolers. Not that they can't cool that amount of heat, but that they can't keep the Tj low enough.

 

The RTX 3090 Ti got up to the mid-500W range; the RTX 4090 has the same TDP but is a much bigger/more complex chip, so I have no doubt it'll do more than the chart below shows.

[image: RTX 3090 Ti power-draw chart]

That makes sense, thanks. Ugh, this waiting is killing me, I have a million questions lol.



7 hours ago, BiG StroOnZ said:

90% increase in performance compared to a 3090 Ti.

 

99.5% increase in performance compared to a 3090.

So at least one place where we get (almost) 2x.


I'm probably going to be replacing my GTX 980 with an RTX 4090; still waiting for independent reviews, but I'm cautiously optimistic.

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Nvidia is saying their dual AV1 encoders are 40% more efficient than NVENC H.264, comparing 60fps at 6Mbps, and the cards get a 35% overhead reduction with the AI tools (visuals, audio, etc.) built in partnership with OBS (posted before); Discord is getting AV1 end-to-end support.

Streaming overhead drops from the previous 8-10% to 5-6%.
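For a rough sense of what that overhead drop means in-game, assuming overhead translates directly to lost fps (the 144 fps baseline is made up for illustration):

```python
# What an encoder-overhead drop (8-10% down to 5-6%, per the post)
# roughly means for in-game fps. The 144 fps baseline is hypothetical.
def fps_with_overhead(base_fps: float, overhead_pct: float) -> float:
    return base_fps * (1.0 - overhead_pct / 100.0)

base = 144.0
print(f"old overhead (9%):   {fps_with_overhead(base, 9.0):.1f} fps")
print(f"new overhead (5.5%): {fps_with_overhead(base, 5.5):.1f} fps")
```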


On 10/6/2022 at 12:19 PM, Shzzit said:

uggg this waiting is killing me, I have a million questions lol.

The NDA should lift tomorrow or the day after. You won't have to wait for too long.


On 10/5/2022 at 9:24 AM, starsmine said:

find me any generation in the last 8 years where the x70/xx70 was the FULL 104/204 chip.

I know the answer to this question, so it's rhetorical, but I want you to find it yourself.
So no, it's not simply a 4070, and it likely never was considered to be.

There are full 104 chips used for the 70 Ti-class cards, but never 70. So sure, argue it could have been called the 4070 Ti. But again, it was likely never going to be a 4070.

The 70-class card has always been a cut-down of the 80-class card. In the before days when things were good, the 80-class card was the top-dog flagship for $500-$700 depending on the generation, and as such usually used a x00 or x10 chip if you go back far enough. Before that, GeForce had different code names like G80, G92, NV30, etc., but I digress.

 

Then with Kepler, the 80-class card was moved down the stack to the x04 chip, which prior had always been used for the mid-range 60-class products. As such, the 70-class card became a cut-down version of the x04 chip.

 

Turing was even worse. The original 2070 was pushed down to the TU106 chip, a chip class that since Kepler had been used for 50- and 60-class products. This was rectified in the Super refresh, with the 2070 Super being a cut-down TU104-based card.

 

Ampere was the anomaly, since the 3080 was put back on a x02-class chip. But let's be real, the only reason this happened was that Samsung 8nm (it's really 10nm) was an ancient node by that point (hence why it was cheap) and they really had to get everything they could out of it. A GA104-based 3080 was not going to fly after Turing. So the top-end x04 went to the 70-class cards.

 

Now with Lovelace, we see somewhat of a return to form for Nvidia, except that now we have this new x03 die. I did fall into the initial camp of thinking the 4080 12GB was a 60-class card because of the 192-bit bus, but I think that was wrong. It should have been the 4070 or 4070 Ti. I get the architectural changes with the much larger L2 cache, but I will wait to see how that turns out. RDNA2 with Infinity Cache still suffered at 4K due to the 256-bit bus holding it back. That said, if I recall, Infinity Cache is more like L3 than L2. So we will see.
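The bus-width concern is easy to quantify: peak memory bandwidth is just bus width times per-pin data rate. A sketch, where the data rates are my assumptions for illustration (21 Gbps GDDR6X for a 192-bit "4080 12GB"-style config, 16 Gbps GDDR6 for an RDNA2-style 256-bit one):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps.
# Data rates below are assumed for illustration, not confirmed specs.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 21.0))  # 504.0 GB/s (192-bit, assumed 21 Gbps GDDR6X)
print(bandwidth_gbs(256, 16.0))  # 512.0 GB/s (256-bit, assumed 16 Gbps GDDR6)
```

Which is why the large L2 matters: the raw bandwidth of a 192-bit bus is in the same ballpark as RDNA2's 256-bit bus, and both lean on cache to make up the difference at 4K.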

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), x2 HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST

