
NVIDIA Project Beyond GTC Keynote with CEO Jensen Huang: RTX 4090 + RTX 4080 Revealed

2 hours ago, leadeater said:

Pre 900 series there was no x100 die, 102 was the largest and 104 (or refresh of it) were second step down. I wouldn't compare back past 900 series though, too long ago and too different GPU die development process, but to nit pick that means it's only been 8 years not 10 (decade) 🙃

https://www.techpowerup.com/gpu-specs/nvidia-gf100.g85 
https://www.techpowerup.com/gpu-specs/nvidia-gk110.g136
https://www.techpowerup.com/gpu-specs/nvidia-gt200.g61

But point taken that the die development process was different back then.

  

2 hours ago, xg32 said:

the 80 cards have always used the 102 die with the 90 cards except this time


My dude, I just listed them all. The only x80 card using a 102 die was the 3080.

Like the 2080 was a 104 at a $700 MSRP, and it wasn't even the full die; you had to wait for the 2080 Super for that. So a $900 full 104 is not really a weird price point.


18 minutes ago, xg32 said:

The 12-pin adapters that come with the cards are rushed; I would not use one in my house, as it is a fire hazard, and the new ATX 3.0 PSUs required to run these cards aren't out yet. This is the main reason I'm not getting a 4090, and while I wait I can see reviews and what AMD has to offer.

I'm guessing you watched Jayz2cents video?

The issue reported with the 12VHPWR connector is in the mating between the 12VHPWR connectors on the cable and the graphics card. It makes no difference if it's an adapter cable or an original cable to the PSU. The same issue with cables overheating can still occur on ATX 3.0 power supplies if the cable is bent at the connector or if the connector is damaged from excessive mating cycles. 

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


12 minutes ago, starsmine said:

But point taken that the die development process was different back then.

Yes it was, but that doesn't make your point any better regarding the recent x80 cards. We've got 8 years of precedent, and that precedent is being broken. Not that I personally care much about that; my only issue is an x80 product with an x03 die and an x80 product with an x04 die existing at the same time, in the same product generation, under the same naming scheme, with clearly different product specifications in multiple aspects. Differences like that should, and traditionally do, denote a different product with a different product name.


48 minutes ago, Spotty said:

I'm guessing you watched Jayz2cents video?

The issue reported with the 12VHPWR connector is in the mating between the 12VHPWR connectors on the cable and the graphics card. It makes no difference if it's an adapter cable or an original cable to the PSU. The same issue with cables overheating can still occur on ATX 3.0 power supplies if the cable is bent at the connector or if the connector is damaged from excessive mating cycles. 

The original GN video, actually. I just don't trust the adapters for extended gaming sessions, let alone years; they act as an additional point of failure.

 

Maybe the psus will come out before the cards and GN can do a comprehensive video with the cards and new psus. Not going in blind. I'm skeptical it's limited to post 30-cycles and bent cables once consumers get their hands on the cards.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


17 hours ago, leadeater said:

Then simply don't. When something is simply so obviously anti consumer then it doesn't matter if certain people do not fully understand the situation, it actually changes nothing at all. Shitty business practices at the compromise of consumer transparency deserve all the criticisms they get, well founded or not. 

Yep, the whole problem is misrepresentation to increase profits. It has nothing to do with naming, etc. It doesn't matter which company it is; consumers shouldn't put up with it and just let it go.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


17 hours ago, xg32 said:

The original GN video, actually. I just don't trust the adapters for extended gaming sessions, let alone years; they act as an additional point of failure.

 

Maybe the psus will come out before the cards and GN can do a comprehensive video with the cards and new psus. Not going in blind. I'm skeptical it's limited to post 30-cycles and bent cables once consumers get their hands on the cards.

Apparently existing PCIe graphics power cables already have a 30-cycle mating spec.

 

https://www.molex.com/pdm_docs/ps/PS-43045-001.pdf

 

I've probably surpassed that just from cleaning, case swaps, and testing.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


This is pretty cool ~

 

RTX 4090 performance caused Blizzard to increase Overwatch 2's framerate cap. RTX 4090 hits over 500fps at native 1440p:

 

Quote

 

Nvidia recently told PC Gamer that Blizzard increased Overwatch 2's framerate cap to 600fps after seeing how well the upcoming title ran on the GeForce RTX 4090. Nvidia also confirmed Overwatch 2 would support its latency-reducing feature, Nvidia Reflex.

 

In a new demo, the RTX 4090 ran Overwatch 2 with framerates consistently over 360fps at 1440p, often topping 400fps and sometimes reaching 500. More impressive is that the demo likely ran at native 1440p without any resolution upscaling, instead displaying the 4090's potential in raw performance.

 

https://www.techspot.com/news/96108-rtx-4090-performance-caused-blizzard-increase-overwatch-2.html


Anyone seen any PCB pics besides FE? 
 

Does the Founders Edition PCB being a compact design affect the card versus a full-size PCB, aside from cooling?

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: G.Skill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


The thing about Moore's Law is that it hasn't been taken as biblical fact about transistor count for many years now. In fact, that reading pretty much stopped around the time dual-core CPUs first became a thing.

 

Instead, it's been taken to mean that performance will double about every 18 months, and this has largely held true. So when Nvidia says "Moore's Law is dead", I take it as a tacit admission that performance cannot be doubled every 18 months, because they are bringing the suck. But also that cost will not go down, likewise because they're bringing the suck.
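As a toy illustration of that 18-month cadence (the doubling period here is just the rule-of-thumb figure above, not a measured value):

```python
# Toy model: relative performance under an assumed 18-month doubling cadence.
def relative_performance(months, doubling_period=18):
    """Performance multiplier after `months`, if it doubles every `doubling_period` months."""
    return 2 ** (months / doubling_period)

print(relative_performance(18))            # 2.0 (one doubling)
print(relative_performance(36))            # 4.0 (two doublings)
print(round(relative_performance(24), 2))  # 2.52 (between doublings)
```

Three years of that cadence is a 4x jump, which is roughly the bar a "generational leap" gets judged against.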


15 hours ago, IPD said:

Clearly we need a new power plug form factor, something much more durable and better rated to handle the power.

I agree. Stop expecting companies to make their garbage more efficient. Just release new standards that can draw 99999999999999999 times more electricity. I mean that's how the world works anyway, no? Looking forward to my 9999999999999999999999999 watt mobile phone 🙂 That is called progress! That is the future! Of course in the future everything has to use more electricity! It's common sense... 🙂 

 

How to spot the American.


3 hours ago, Gamer Schnitzel said:

How to spot the American.

Ironically European Schuko plugs can already deliver way over 3kW (230V * 16A more or less, even 10A circuits can deliver 2.3kW).
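Those figures are just nominal voltage times rated current; a quick sanity check (assuming nominal 230V and 120V mains):

```python
# Peak circuit power = nominal voltage * rated current.
def max_circuit_watts(volts, amps):
    return volts * amps

print(max_circuit_watts(230, 16))  # 3680 -> 16A Schuko circuit, ~3.7kW
print(max_circuit_watts(230, 10))  # 2300 -> 10A circuit, 2.3kW
print(max_circuit_watts(120, 15))  # 1800 -> typical North American 15A circuit
```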


Are there any reasons why the US has 110V and the EU 220-240V?

This pretty much sucks, because I can't order some Philips Hue lamps from the US, for example.

 

1 hour ago, Dracarris said:

Ironically European Schuko plugs can already deliver way over 3kW (230V * 16A more or less, even 10A circuits can deliver 2.3kW).

Funnily enough, when I was searching for a power strip to reorganize my desk, the one I found was rated for about 3.6kW. So plenty of headroom.

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


1 hour ago, CTR640 said:

Are there any reasons why the US has 110V and the EU 220-240V?

This pretty much sucks, because I can't order some Philips Hue lamps from the US, for example.

 

Funnily enough, when I was searching for a power strip to reorganize my desk, the one I found was rated for about 3.6kW. So plenty of headroom.

Because the USA is in the dark ages and believes that 240 will somehow electrocute half the population.  Nevermind that the plug form-factor and many other things are far safer for Eurospec (not british spec).

 

Nor do I think it's a silly idea to create connectors and cabling less susceptible to bending/cracking/arcing/fires--than the current shite.


13 minutes ago, IPD said:

Because the USA is in the dark ages and believes that 240 will somehow electrocute half the population.  Nevermind that the plug form-factor and many other things are far safer for Eurospec (not british spec).

 

Nor do I think it's a silly idea to create connectors and cabling less susceptible to bending/cracking/arcing/fires--than the current shite.

Is it too late for the US to snap out of that bullshit? The only human being stupid enough to get electrocuted is one who sticks a metal fork in the wall outlet.

 

I wonder how the US will deal with ever-increasing power consumption, like when there are AIB GPUs that require at least 600W. There will even be professional RTX GPUs that require 700W.



14 minutes ago, CTR640 said:

Is it too late for the US to snap out of that bullshit? The only human being stupid enough to get electrocuted is one who sticks a metal fork in the wall outlet.

Just put your fingers between a US plug and outlet when pulling/inserting the plug. Plenty of room to get electrocuted there. But since it's only 110V, it's safe and they can keep the utter abomination of a plug/socket system that they currently have.


I was a consumer ho too many times in my life. I love me some big-die chips; I even got the Titan V prior to the 3090.
Not a hope I'm touching the 4090 this time round. Caseking prices are already over MSRP, and there's no longer any option of getting FE cards in Ireland. Never mind that the per-kWh price of electricity is going up another 49% here next month.

Hoping AMD does well, and I'm likely going to grab an Intel Arc to muck about with.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


50 minutes ago, Dracarris said:

Just put your fingers between a US plug and outlet when pulling/inserting the plug. Plenty of room to get electrocuted there. But since it's only 110V, it's safe and they can keep the utter abomination of a plug/socket system that they currently have.

Seems like a silly thing to do, but our plugs are insulated so that wouldn't be a concern. If it's plugged in far enough to be live the part of the pins exposed will be insulated.

https://en.m.wikipedia.org/wiki/AS/NZS_3112



40 minutes ago, Spotty said:

Seems like a silly thing to do, but our plugs are insulated so that wouldn't be a concern. If it's plugged in far enough to be live the part of the pins exposed will be insulated.

Pretty much how it is almost everywhere else, with systems like Schuko additionally recessing the socket so the described problem can't occur even with older, full-metal plugs. The recess also provides much better mechanical stability against loose contacts and broken-off prongs.


40 minutes ago, Spotty said:

Seems like a silly thing to do, but our plugs are insulated so that wouldn't be a concern. If it's plugged in far enough to be live the part of the pins exposed will be insulated.

https://en.m.wikipedia.org/wiki/AS/NZS_3112

https://en.m.wikipedia.org/wiki/CEE_7_standard_AC_plugs_and_sockets

 

Far simpler.  Recess means you can't accidentally contact anything.


On 9/26/2022 at 2:14 PM, IPD said:

Clearly we need a new power plug form factor, something much more durable and better rated to handle the power.

There are two angles to address:

 

a) How to deliver increasing power to GPUs. We've already hit the wall on this. I don't know how many people have a dedicated 15A circuit just for their computer, but my apartment only has two general-purpose circuits (the other two are the bathroom and kitchen). There is no plugging "everything else" into another outlet on another circuit.

b) How to ensure the PSU can deliver increasing power. The GPU alone gets 75W from the PCIe slot and 150W per 8-pin connector, so the maximum right now is 375W with two connectors or 525W with three. You can't just keep adding more connectors.
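The arithmetic behind those limits, sketched out (75W and 150W are the slot and 8-pin figures quoted above):

```python
# PCIe board power budget: slot power plus auxiliary 8-pin connectors.
PCIE_SLOT_W = 75    # from the x16 slot itself
EIGHT_PIN_W = 150   # per 8-pin PCIe power connector

def max_board_power(num_8pin_connectors):
    return PCIE_SLOT_W + num_8pin_connectors * EIGHT_PIN_W

print(max_board_power(2))  # 375
print(max_board_power(3))  # 525
```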

 

What should have happened, instead of making a smaller connector that ends up at the top of the card, is a thicker connector with a screw-type contact (think NEMA twist-locking connectors), with several wires bundled together to form a single flexible 1200W-capable rail. That way, if a wire fails, it doesn't individually degrade the connector ends. Even breaking this into two 600W-capable rails and two grounds is still better than "if one pin gets torqued the wrong way, it melts the entire connector."

 

I believe we're going to see an ATX 3.0A or 3.1 standard pretty quickly, where this connector is abandoned for one with a different style of contact requiring 12AWG instead of 18-16AWG wire.

 



Well, granted, if the 110V countries would migrate out of the goddamn dark ages, this wouldn't be as much of an issue. As it is, I'm irritated that 15-amp breakers are even a thing; 20 should be the minimum standard for everything.


14 hours ago, Kisai said:

I don't know how many people have a dedicated 15A circuit

This is pretty much my limiting factor. Our home office is on a 15A circuit with two computer systems on it, along with some LED home lighting (not much power, but still). My understanding is that you don't want to push more than 80% of the circuit's rated load sustained; for 15A that rated load is 1800W, so roughly 1440W continuous, call it 1200W once you account for all the other lighting on the circuit.

 

That means both systems should not pull more than 600w continuous without possibly damaging the wiring. 

 

Now, during most gaming sessions, my PC's entire power strip reads around 600W on a Kill A Watt meter; that's a 10900KF OC + 3080, 3 displays, plus a fan and speakers. My wife's system is less powerful: a stock 9900K, a 3070, 3 displays, and speakers.

 

Now, double that, and I'm already at 1200w continuous. If games start using more CPU load, then that's 1400w already.

 

I can't upgrade to a more powerful GPU in either PC unless I upgrade to a 20A circuit. I even moved our Wi-Fi Brother laser printer to another circuit, because it can pull over 400W when it fires up.
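The budget math above, written out (the 80% rule and 120V nominal are assumptions; the 200W for lighting and other loads is a made-up placeholder):

```python
# Continuous load budget for a household branch circuit (80% rule of thumb).
VOLTS = 120

def continuous_budget_watts(breaker_amps, derate=0.8):
    return VOLTS * breaker_amps * derate

budget = continuous_budget_watts(15)   # 1440.0 W on a 15A circuit
other_loads = 200                      # hypothetical lighting/misc on the same circuit
per_pc = (budget - other_loads) / 2    # two systems sharing what's left
print(budget, per_pc)                  # 1440.0 620.0
```

A 20A circuit bumps the continuous budget to 1920W, which is why that upgrade keeps coming up.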



1 hour ago, Shimmy Gummi said:

This is pretty much my limiting factor. Our home office is on a 15A circuit with two computer systems on it, along with some LED home lighting (not much power, but still). My understanding is that you don't want to push more than 80% of the circuit's rated load sustained; for 15A that rated load is 1800W, so roughly 1440W continuous, call it 1200W once you account for all the other lighting on the circuit.

 

That means both systems should not pull more than 600w continuous without possibly damaging the wiring. 

 

Now, during most gaming sessions, my PC's entire power strip reads around 600W on a Kill A Watt meter; that's a 10900KF OC + 3080, 3 displays, plus a fan and speakers. My wife's system is less powerful: a stock 9900K, a 3070, 3 displays, and speakers.

 

Now, double that, and I'm already at 1200w continuous. If games start using more CPU load, then that's 1400w already.

 

I can't upgrade to a more powerful GPU in either PC unless I upgrade to a 20A circuit. I even moved our Wi-Fi Brother laser printer to another circuit, because it can pull over 400W when it fires up.

Yep. The danger signs should have been flagged when the combined power draw of two computers exceeds 1200w.

 

If people are going to have top-end computers, at least in North America, unless they dedicate the entire room for it, there's going to be problems. Apartment renters won't have the option to rewire the unit, and it will pretty much be the "rapid boiling kettle" problem again where the high end parts end up having to be imported from a country that has a higher power standard, and you have to make concessions to your own home to operate it.

 

At least with a typical house, ripping out wiring and adding a circuit is an option. But most apartments and condos in North America are the cheapest, laziest "luxury" shoebox designs that put style before function. Granite countertops and stainless steel appliances, yay, but where the hell do I plug in my computer that pulls as much power as a table saw (either one pulls down the entire 15A circuit)? You gonna plug it into the bedroom and blow the circuit every time you turn the lights on?

 

 


4 minutes ago, Kisai said:

Yep. The danger signs should have been flagged when the combined power draw of two computers exceeds 1200w.

 

If people are going to have top-end computers, at least in North America, unless they dedicate the entire room for it, there's going to be problems. Apartment renters won't have the option to rewire the unit, and it will pretty much be the "rapid boiling kettle" problem again where the high end parts end up having to be imported from a country that has a higher power standard, and you have to make concessions to your own home to operate it.

 

At least with a typical house, ripping out wiring and adding a circuit is an option. But most apartments and condos in North America are the cheapest, laziest "luxury" shoebox designs that put style before function. Granite countertops and stainless steel appliances, yay, but where the hell do I plug in my computer that pulls as much power as a table saw (either one pulls down the entire 15A circuit)? You gonna plug it into the bedroom and blow the circuit every time you turn the lights on?

 

 

My cousin actually has a similar setup: his girlfriend has a 3080 and he has a 3090 in their office. He's reported his breakers tripping sometimes.

I was like... well, at least you know the breakers work. Better than a fire, lol. But consider upgrading the wiring, or downgrading the hardware in case they don't trip.


