
NVIDIA Project Beyond GTC Keynote with CEO Jensen Huang: RTX 4090 + RTX 4080 Revealed

6 minutes ago, Blademaster91 said:

It depends on whether you know you're not getting a true Mustang; an average driver wouldn't care, but anyone who pays attention to what they're buying will know it's just not the same. I think the 4080 12GB is deceptive marketing. Model names have meant something within the product stack, and now Nvidia has moved its cards up a tier again, so a real x80 card is going to be around $1000 if the FE cards are going to undercut the AIBs.

See, nah man, the 2.3 EcoBoost is still a true Mustang. Those EcoBoosts have a lot of go-fast in them.


  • Price increased
  • Uses more power
  • EVGA sailed away
  • Ugly AIB cards
  • 4080 12GB and 16GB. Sketchy move.

This seems like everything we didn't need.

A recession is hitting and energy prices are way up. This just makes no sense.

Would have liked to see the prices remain the same, with a performance bump and an efficiency bump.

 

I wonder if they are going to let AIBs buy the chips. Keep prices high to sell off 3000 stock, then drop the prices and screw over AIBs with the 4000 chips they bought at the launch rate. Not sure if that's how that works.


25 minutes ago, StarsMars said:
  • Price increased
  • Uses more power
  • EVGA sailed away
  • Ugly AIB cards
  • 4080 12GB and 16GB. Sketchy move.

This seems like everything we didn't need.

A recession is hitting and energy prices are way up. This just makes no sense.

Would have liked to see the prices remain the same, with a performance bump and an efficiency bump.

 

I wonder if they are going to let AIBs buy the chips. Keep prices high to sell off 3000 stock, then drop the prices and screw over AIBs with the 4000 chips they bought at the launch rate. Not sure if that's how that works.

Usually there are rebates, but that hinges on Nvidia approving them. The aggressive rate at which MSI is undercutting the other AIBs is also insane.

 

The 16GB AD103 and 12GB AD104 "4080"s are a slap in the face to whoever buys them for $1,200 and $900.

 

Whoever's holding the most GPUs is getting screwed the hardest, and that's probably Nvidia; they likely ordered a bunch of AD102 from TSMC with mining demand in mind. Even the reveal itself today, with how crappy the "4080"s looked, is practically begging people to buy the 4090 (only if I don't have to get a new PSU for it).

 

Nvidia's probably shoving GPUs down AIBs' throats to offset some of the losses. No one's making money at the moment; it's just a matter of where the losses go.



So if I'm just going to be getting an EK water block as soon as they come out, WHICH card would be best?

FE, or maybe the highest-wattage AIB card like the Asus Strix?

Or does it not matter because they will all do the same MHz at the same low temp?


5 minutes ago, Shzzit said:

So if I'm just going to be getting an EK water block as soon as they come out, WHICH card would be best?

FE, or maybe the highest-wattage AIB card like the Asus Strix?

Or does it not matter because they will all do the same MHz at the same low temp?

Wait for the EK blocks to come out. Typically they do a block for reference-PCB cards and then a few others like the Asus Strix. Anyway, it would be a really bad idea to buy a GPU with a custom PCB now with the intention of putting a water block on it without even knowing if EK will make a block for it.


5 minutes ago, leadeater said:

Wait for the EK blocks to come out. Typically they do a block for reference-PCB cards and then a few others like the Asus Strix. Anyway, it would be a really bad idea to buy a GPU with a custom PCB now with the intention of putting a water block on it without even knowing if EK will make a block for it.

I'm just going by what EK said; it looks like they already have one ready for the FE and are making more for the most popular AIB cards.

 

EK-Quantum Vector² FE RTX 4090 water blocks, backplates, and active backplates are compatible with NVIDIA GeForce RTX 4090 Founders Edition GPU. The EK Cooling Configurator will be updated regularly with AIB partner PCBs and models as new info comes in. EK plans to provide all popular AIB models with their own water blocks to ensure customers have a wide range of choices depending on their preferred brand or requirements in the graphics card size.

 



4 minutes ago, Shzzit said:

EK plans to provide all popular AIB models with their own water blocks to ensure customers have a wide range of choices depending on their preferred brand or requirements in the graphics card size.

Was ages ago, but I got burned when I got a card that I was going to water cool; I didn't jump on it soon enough and EK stopped making the blocks for the card, only stocked the reference-PCB ones. So yeah, get it while it's in stock, don't be me and wait until it's too late haha.

 

My current one came pre-installed with an EK block, a Powercolor Liquid Devil. Have to say, even though it's not hard to put a water block on, it sure was nice not to have to do it.


17 minutes ago, Shzzit said:

So if I'm just going to be getting an EK water block as soon as they come out, WHICH card would be best?

FE, or maybe the highest-wattage AIB card like the Asus Strix?

Or does it not matter because they will all do the same MHz at the same low temp?

It's only ever really mattered if you're trying to get every last MHz out of the card, which is getting harder and harder to do considering that Nvidia is cracking down on things like BIOS editing and external voltage control.



4 minutes ago, leadeater said:

Was ages ago, but I got burned when I got a card that I was going to water cool; I didn't jump on it soon enough and EK stopped making the blocks for the card, only stocked the reference-PCB ones. So yeah, get it while it's in stock, don't be me and wait until it's too late haha.

 

My current one came pre-installed with an EK block, a Powercolor Liquid Devil. Have to say, even though it's not hard to put a water block on, it sure was nice not to have to do it.

Haha nice, can't wait to upgrade. Going for the new Asus 48" OLED that's 138Hz. By the looks of the leaks, the 4090 should do 4K 138Hz pretty easily in most games.


30 minutes ago, Shzzit said:

So if I'm just going to be getting an EK water block as soon as they come out, WHICH card would be best?

FE, or maybe the highest-wattage AIB card like the Asus Strix?

Or does it not matter because they will all do the same MHz at the same low temp?

The FE is usually more limited. Its 450W is going to be the same as the TUF; the Strix is going to be higher according to their launch info.

 

FE you also won't likely be able to flash without messing up video outputs or something.

 

FE also sucks more for power modding like shunts.

 

TLDR: buy a Strix if you want the best one.


1 minute ago, AnonymousGuy said:

The FE is usually more limited. Its 450W is going to be the same as the TUF; the Strix is going to be higher according to their launch info.

 

FE you also won't likely be able to flash without messing up video outputs or something.

 

FE also sucks more for power modding like shunts.

 

TLDR: buy a Strix if you want the best one.

Aww, that makes sense, thanks a lot. Yeah, I totally forgot about dual BIOS too, definitely should get that; I love OCing and tinkering. Man, I miss EVGA already, was gonna get an FTW card but 8(.

 

I have been eyeballing the Strix. And yeah, I'm guessing it will be 500 or so watts vs the stock 450.

 

I'm so excited.


There are some seriously questionable numbers in the Cyberpunk 2077 RTX on/DLSS demo. A 3090 Ti runs 4K ultra with RT at 90+ fps with DLSS and around 40-ish with RT off, so for that 22 to 90+ fps jump I have to ask what card and settings they are even running when a 3090 Ti is already doing the FPS they showed. Hopefully it's not the 4090, because a 3090 Ti at 8K ultra with RT does around 30 fps, so if either the 4080 or the 4090 can only do 22 fps without DLSS 3.0, how can they possibly be making the claims they are about performance uplifts?

 

Their Flight Simulator demo has what appears to be the same fps as a 3080 at 4K with DLSS off (based on their own website's numbers: https://www.nvidia.com/en-us/geforce/news/microsoft)... not a great look when it comes to "2-4x the performance" of the 30 series.

 

It's looking more and more like the 2-4x increase (based on the numbers shown) comes from comparing the 30 series without DLSS to the 40 series with DLSS 3.0, which absolutely misrepresents the performance of the cards.

Guess we'll see during the upcoming independent reviews, but I'm actively in the market for a GPU, and the current showing isn't pushing me to want to stand in line for a 40 series if these are the claims without the substance of numbers behind them. If Nvidia was so confident about their performance uplift in games, they would put up the fps numbers instead of playing the Apple charts game with zero qualifiers. This looks just like their 30-series launch, where it was claimed the 3090 was 50% faster than the Titan RTX but that amounted to 23 fps on the Titan versus 29 fps on the 3090 in Death Stranding, or a 1 fps delta in RDR2.

 

That's not even getting into Fortnite at 1440p at over 600 fps on a 4090 with "e-sports high" settings (which isn't even a thing!). It's more likely they were using performance mode, which, more realistically, goes from 470-500 fps on a 3090 Ti to 600 fps on the 4090.

 

Honestly, at the end of the day, 2-4x isn't going to be a realistic performance bump within the game engines just by throwing more cores at them. Without comparing DLSS to non-DLSS, or scummy trickery with settings changes to misrepresent the improvement, it's just not possible to make that big of an improvement simply by changing nodes or chip structures between generations. This is far too common, and it frankly should be classified as false advertising if they aren't willing to put real-world numbers to their claims in their own launch slides, as if they can't benchmark their products against the previous generation. (Or are the 30-series cards all still stuck in warehouses, being held to manipulate the market like their investor meeting mentioned?)
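For what it's worth, a rough sanity check on those numbers (just arithmetic on the ballpark figures quoted in this post, nothing from Nvidia's own slides):

# Back-of-envelope check using only the rough numbers mentioned above (assumed, not measured)
native_fps_shown = 22        # the "before" fps Nvidia showed with DLSS off
dlss3_fps_shown = 90         # the "after" fps with DLSS 3 frame generation
fps_3090ti_native = 40       # rough 3090 Ti figure at 4K ultra RT without DLSS

print(dlss3_fps_shown / native_fps_shown)    # ~4.1x - the "2-4x" claim only works comparing DLSS off vs DLSS 3 on
print(native_fps_shown / fps_3090ti_native)  # ~0.55x - the 22 fps baseline is below what a 3090 Ti already does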



That US $900 12GB "4080" card is a joke. You can get a used 3090 card from eBay for $100 less than that, which has twice as much VRAM and will probably outperform it in most games if you don't have the RTX settings maxed out.


16 minutes ago, GhostRoadieBL said:

 

Guess we'll see during the upcoming independent reviews, but I'm actively in the market for a GPU, and the current showing isn't pushing me to want to stand in line for a 40 series if these are the claims without the substance of numbers behind them. If Nvidia was so confident about their performance uplift in games, they would put up the fps numbers instead of playing the Apple charts game with zero qualifiers. This looks just like their 30-series launch, where it was claimed the 3090 was 50% faster than the Titan RTX but that amounted to 23 fps on the Titan versus 29 fps on the 3090 in Death Stranding, or a 1 fps delta in RDR2.

 

They did the same thing with 3000 vs. 2000, where it was "here's DLSS on with the 3090, and DLSS off with the Titan RTX" and they got to some bullshit conclusion like "it's 8K-ready graphics with RTX on".

 

Realistically then it was "fuck off nvidia, 30 fps at 8K with DLSS set to speed over quality doesn't mean shit"

 

To this day I don't play anything with RTX on with the 3090 because it craters FPS.


46 minutes ago, AnonymousGuy said:

They did the same thing with 3000 vs. 2000, where it was "here's DLSS on with the 3090, and DLSS off with the Titan RTX" and they got to some bullshit conclusion like "it's 8K-ready graphics with RTX on".

 

Realistically then it was "fuck off nvidia, 30 fps at 8K with DLSS set to speed over quality doesn't mean shit"

 

To this day I don't play anything with RTX on with the 3090 because it craters FPS.

Yep, ray tracing is a cool tech demo but rasterized performance is still king. Even enabling ray traced shadows in WoW tanks my framerate. The most played games do not support ray tracing at all.

 

This announcement did exactly what it intended to do: make the 30 series seem like a good deal.



6 minutes ago, vetali said:

Yep, ray tracing is a cool tech demo but rasterized performance is still king. Even enabling ray traced shadows in WoW tanks my framerate. The most played games do not support ray tracing at all.

 

This announcement did exactly what it intended to do: make the 30 series seem like a good deal.

Yeah, right now I'd say that if someone is thinking "maybe I should get a 4070"... no... go buy a higher-tier 3080 Ti or 3090 right now. Prices of 3000-series cards are only going to go up or stay flat from here as the new inventory burns out and the mining dump comes to an end.


1 minute ago, AnonymousGuy said:

Yeah, right now I'd say that if someone is thinking "maybe I should get a 4070"... no... go buy a higher-tier 3080 Ti or 3090 right now. Prices of 3000-series cards are only going to go up or stay flat from here as the new inventory burns out and the mining dump comes to an end.

They still have a long way to go down before going up. And we don't have a clue how aggressive AMD will be with RDNA3.



4 minutes ago, ZetZet said:

They still have a long way to go down before going up. And we don't have a clue how aggressive AMD will be with RDNA3.

The price is already going up from where it was a couple of weeks ago. Everyone thinks an event like the Ethereum merge killing mining, or a new release, is the exact date the price is best; really, it's mostly priced in at that point, and the lowest point is a couple of weeks before. Something to keep in mind is that every GPU that was sold to miners would have been sold to a gamer otherwise, so there's not going to be a huge volume of GPUs on the market with no buyers driving the price down. And judging by the high prices on the 4000 series, that's going to vacuum up even more of the used 3000-series inventory out there right now, when it's "hey, you can spend $700 on a 3080 Ti right now and probably do better than a $900 4080 12GB".

 

AMD is a non-factor, really. 15% of the Steam survey is AMD. That tells me most people straight up don't care what AMD releases.


I think people are focusing too much on the specs without knowing how the performance really stacks up. We don't know if they're using the same die just cut down, or whether they're all using different dies.

 

From last gen, performance versus CUDA cores varied depending on which die was being used, even with the same architecture.

 

Just from a few quick calculations at 1080p, the GA102 (3080/3080 Ti/3090/3090 Ti) averages about 70-75 CUDA cores per frame, whereas the GA104 (3060 Ti/3070/3070 Ti) averages around 50-55 CUDA cores per frame.

 

So, if the 4080 12GB is really just a '4070' in disguise, then it would use the next die down the stack.

For argument's sake, let's just say they have exactly the same performance per CUDA core as the 30 series. That could actually put the 12GB about 7% ahead of the 16GB in performance despite having fewer CUDA cores!

 

Obviously that won't be the case, but it's exactly why you should never just look at the specs and think more=better!
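Here's that back-of-envelope math spelled out (purely a sketch, assuming Ada keeps the Ampere-like cores-per-frame figures above, which is a big assumption):

# Rough sketch: assumes the 40-series matches the 30-series "CUDA cores per frame" averages at 1080p
cores_4080_16gb = 9728   # AD103-based card
cores_4080_12gb = 7680   # AD104-based card
cores_per_frame_big_die = 72.5    # GA102-class midpoint (~70-75)
cores_per_frame_small_die = 52.5  # GA104-class midpoint (~50-55)

fps_16gb = cores_4080_16gb / cores_per_frame_big_die    # ~134 fps
fps_12gb = cores_4080_12gb / cores_per_frame_small_die  # ~146 fps
print((fps_12gb / fps_16gb - 1) * 100)  # ~9% in the 12GB card's favour with these midpoints, same ballpark as the ~7% above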

 

And just to add to the argument: IMO, it isn't that the 4080 16GB is the 'true' 4080 and the 4080 12GB the 4070. To me, the 4080 16GB is the 4080 Ti and the 4080 12GB is the 'true' 4080.


Would love to see what kind of review guide they're sending to reviewers.

Nvidia: We've included some edibles with the review cards; feel free to chew them while doing long hours of benchmarks, we love you. =)

 

AMD will price RDNA3 according to 4000-series performance IMO; they stopped caring about market share a long time ago, because you guys will buy Nvidia no matter how high the price is anyway.



1 hour ago, yolosnail said:

We don't know if they're using the same die just cut down, or whether they're all using different dies.

AD104 spec: 7680 CUDA cores, 12GB GDDR6X, 192-bit memory interface

4080 12GB spec: 7680 CUDA cores, 12GB GDDR6X, 192-bit memory interface

 

AD103 spec: 10752 CUDA cores, 16GB GDDR6X, 256-bit memory interface

4080 16GB spec: 9728 CUDA cores, 16GB GDDR6X, 256-bit memory interface
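A quick ratio check on those figures (just dividing the numbers above; this assumes both cards run the same GDDR6X data rate):

# Raw spec deltas between the two "4080" SKUs, taken from the figures listed above
cores_16gb, cores_12gb = 9728, 7680
bus_16gb, bus_12gb = 256, 192   # memory bus width in bits

print(cores_16gb / cores_12gb)  # ~1.27x the CUDA cores
print(bus_16gb / bus_12gb)      # ~1.33x the bus width, so ~1.33x the bandwidth at the same memory clock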


10 minutes ago, HenrySalayne said:

AD104 spec: 7680 CUDA cores, 12GB GDDR6X, 192-bit memory interface

4080 12GB spec: 7680 CUDA cores, 12GB GDDR6X, 192-bit memory interface

 

AD103 spec: 10752 CUDA cores, 16GB GDDR6X, 256-bit memory interface

4080 16GB spec: 9728 CUDA cores, 16GB GDDR6X, 256-bit memory interface

 

So, they are using different dies, which could very well mean the performance gap between them is a lot smaller than the numbers would suggest


2 minutes ago, yolosnail said:

 

So, they are using different dies, which could very well mean the performance gap between them is a lot smaller than the numbers would suggest

Why would that be? How is something with roughly 20% fewer CUDA cores on a narrower bus supposed to make up the difference?


22 minutes ago, AnonymousGuy said:

Why would that be? How is something with roughly 20% fewer CUDA cores on a narrower bus supposed to make up the difference?

 

Because, as I demonstrated in my previous post, the smaller dies in the past have typically provided more FPS per CUDA core than the higher-end dies, so performance does not scale linearly when you're changing dies. If all the cards were on the same die, then the performance scaling would be pretty linear.

 

Of course, with the new architecture, maybe things will scale more linearly across dies, but we just don't know until reviews are in!


1 hour ago, AnonymousGuy said:

AMD is a non-factor, really. 15% of the Steam survey is AMD. That tells me most people straight up don't care what AMD releases.

I'd wager it's the same as it was in the CPU space: AMD GPUs used to be so bad that nobody cared about them, and there was no point in buying them either. Now they are back in the game and could very well become relevant in the GPU market, especially now that Nvidia is screwing customers over with confusing naming schemes and bloated prices. Of course, market share is not going to change overnight, but right now AMD could compete on pricing quite easily.

 

I'd say DLSS 3 is a gimmick, and with the information we got in the keynote I feel like it's only going to work well in cases where the frame rate was already good enough. You can't fix input lag by generating completely new frames out of thin air. So yes, it might trick some customers shopping at the lower end of the GPU range into choosing Nvidia (and maybe even paying a bit more) because of DLSS 3 marketing, but overall it's not going to be as much of an advantage for Nvidia as DLSS 1 and 2 were against AMD.
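A rough illustration of that point (made-up numbers; assumes frame generation roughly doubles displayed fps while input is still only sampled on rendered frames):

# Illustrative only: why generated frames raise displayed fps but not responsiveness
rendered_fps = 30                    # what the GPU actually renders
displayed_fps = rendered_fps * 2     # with one generated frame inserted between each rendered pair

print(1000 / rendered_fps)   # ~33 ms between frames that can actually react to input
print(1000 / displayed_fps)  # ~17 ms between frames you see - smoother motion, but input lag still tracks ~33 ms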

 

The bright side here is that at least now I'm waiting for the RDNA3 release date with much greater interest than I was just yesterday 😄 I have an RTX 3090 but am in need of another high-end PC, and it looks like I'm going to buy AMD this time around and see how they work nowadays. The last Radeon I had was an HD 6950, so it's been a while...

