Tongue-tied Ti(e) - Nvidia Allegedly Orders Partners to Halt RTX 3090 Ti Production

Lightwreather

Summary

Nvidia has allegedly requested that its add-in-board (AIB) partners temporarily halt production of its upcoming flagship GeForce RTX 3090 Ti graphics card. Two websites reported on Friday that the halt is due to issues with hardware and firmware.


 

Quotes

Quote

TweakTown reported the news without disclosing reasons for the temporary halt, the time when Nvidia's request was made or whether Nvidia reached out to all of its partners. This report is supported by VideoCardz, which claims that Nvidia wanted to pause production of its next flagship consumer product due to issues with the BIOS and hardware.  

We have no idea what kind of 'BIOS and hardware' issues could lead to a temporary production halt after volume manufacturing was initiated. Typically, overheating due to insufficient cooling and/or overvoltaging, choice of wrong/weak components, problems with select applications, and incompatibility with certain hardware are among issues that plague newly released parts. Developers try to avoid such situations, which sometimes can cause delays.

 

My thoughts

So uhh, for all 5000 of you who were going to buy the RTX 3090 Tie, we have no date for you. But for the 6 who were going to buy the RTX 3090 Ti, well, it's apparently been delayed. Though the reason given is vague, everything about this card is vague, so technically it hasn't even been delayed. There isn't much we can speculate on as to when it'll actually arrive, but hey, at least we'll still have ties.

 

Sources

Tom's Hardware

Tweaktown

Videocardz

"A high ideal missed by a little is far better than a low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way, tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; being wrong helps you learn what's right.


Probably rethinking things and renaming the card to the 3090 Titan Black Titanium Ti 69 GTZ.

 

Or it's another caps/power-delivery-related issue.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


I wonder if the binning QA wasn't good enough and they're finding out that some of the dies they've sent out are glitching under extended load; the power draw on these cards is insane!


How am I going to play Magic Arena now?

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


4 hours ago, J-from-Nucleon said:

-snip-

IMO "oh crap, we forgot the LHR code"

 

Here's the thing: when all GPUs are unobtainium, you buy whatever is available if you can use it.

 

For gamers, anything higher than a 3060 is usually overkill, and those with the right monitors can get away with a 3080 Ti at most. If you have a 3090, you're likely doing something that requires the extra overhead (e.g. streaming, 4K monitors, HDR) rather than being the more typical 1080p60 monitor user. Hell, at this point I'd suggest buying AMD GPUs if you're not a streamer/ML user, because they seem to be moderately more available.

 

Nvidia parts have not been available for over a year at this point, and the vast majority of what is available is incapable of being used for Bitcoin or Ethereum mining.

 


4 hours ago, Kisai said:

For gamers, anything higher than a 3060 is usually overkill,

Wut? 

 

You have to break the chains of 1080p.  

 

Anything higher than a 3080 is maybe overkill.  

 

 


This thing for $1999 (or 1999€) would be a steal. 🙃

In Germany, retail pricing for the 3090 (the non-formal edition, without the tie) has reached somewhere between 2500€ and 3000€ in the last few months (3080s are around 1400€, 3080 Tis are in the neighbourhood of 1900€). It's getting really stupid.


12 hours ago, Heliian said:

Wut? 

 

You have to break the chains of 1080p.  

 

Anything higher than a 3080 is maybe overkill.  

 

 

[Steam Hardware Survey screenshot: top GPUs by share]

There are no 3080s, 3070 Tis or 3090s there. The most common are all parts that are just under the 3060 spec.

[Steam Hardware Survey screenshot: GPU share, next page]

Go down one more page and you see the 3080, right under Intel UHD Graphics. There may be more 3080s out there, but not that many when it's being ranked alongside the GT 1030 and Intel iGPUs.

 

[Steam Hardware Survey screenshot: primary display resolution]

 

67% of players are using 1080p

[Steam Hardware Survey screenshot: resolution breakdown]

No other resolution is even in the double digits.

 

So go down to the video memory:

[Steam Hardware Survey screenshot: VRAM breakdown]

6GB and 8GB GPUs make up nearly half the installed base, with the 4GB ones (e.g. xx50 parts) coming in third.

 

So most PC gamers have something slightly more capable than 1080p60, but not much better.


2 hours ago, Kisai said:

-snip-

That isn't really proof, or much of a supporting argument, that anything above an RTX 3060 is overkill. x60-class cards are and have always been extremely popular, but that is a better indicator of what the majority are able to afford, or of a willingness to forgo the highest settings and/or resolutions in the latest games.

 

Nobody is ever going to have a "bad" gaming experience with an x60-series card; however, that doesn't mean cards above it are overkill.

 

GPU performance classes are quite a lot like a bell curve, x50 through to x80 all sit around the various different positions of the upper middle point of the bell.  Below x50 they are to the left side and below the middle point, above x80 are to the right and below the middle point.

 

By the time buying the highest-end GPU gives you any meaningful performance advantage over the models below it, the upper-midrange cards of the current generation are performing the same, and two of those cost less than one of the highest. The logic here would be to buy more often rather than less often; the problem is that many buy the top end every generation anyway, so it's really not worth worrying about those people, let them do what they want.
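The buy-more-often logic above can be sketched as a toy cost-per-frame comparison. All the prices and frame rates below are made-up placeholders purely for illustration, not benchmark data:

```python
# Toy illustration of the "two upper-midrange vs one flagship" argument.
# All numbers are hypothetical placeholders, not real prices or benchmarks.
def cost_per_frame(price, avg_fps):
    """Dollars paid per average frame per second."""
    return price / avg_fps

flagship = cost_per_frame(1500, 120)   # hypothetical flagship card
midrange = cost_per_frame(600, 90)     # hypothetical upper-midrange card

print(f"flagship: ${flagship:.2f} per fps")   # $12.50 per fps
print(f"midrange: ${midrange:.2f} per fps")   # $6.67 per fps
# Two midrange purchases spread across two generations still total less
# than one flagship, while tracking the performance curve more closely.
```

With these placeholder numbers, even buying the midrange card twice as often costs less per frame than one flagship purchase, which is the cadence argument in a nutshell.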

 

If Crossfire was still a thing I'd have two 6800 XT's instead of one, I'd also be the first to say it's unnecessary.


14 hours ago, leadeater said:

 

Nobody is going to ever have a "bad" gaming experience with a x60 series card however that doesn't mean cards above it are overkill.

 

GPU performance classes are quite a lot like a bell curve, x50 through to x80 all sit around the various different positions of the upper middle point of the bell.  Below x50 they are to the left side and below the middle point, above x80 are to the right and below the middle point.

 

Historically, the only benefit to buying the x80-equivalent part was to run higher refresh rates, not higher resolutions, because 4K "gaming monitors" have only been a thing for about 5 years, around the same time the GTX 1080 came out.

 

So my stance here is, again: if someone buys a 3060 or equivalent, that's likely all you need for gaming. Most people are not playing 1080p120, nor are they playing 4Kp120. Most games just don't work at 120fps to begin with, and those that do require additional tweaking beyond stock settings.

 

Then you have people who have larger monitors who actually use a resolution between 1080p and 4k, where the 3060 tier is not good enough. 

 

Like it's pretty obvious to see the target user of each tier:

 

x50 = 1080p60 Medium

x60 = 1080p60 Ultra

x70 = 1080p90, 2560x1440p60

x80 = 1080p120, 2560x1440p90, 4kp60

 

The 10xx, 20xx, and 30xx parts are all aimed at the exact same tiers. 4Kp120 is not achievable by any tier of existing card.

[Tom's Hardware GPU benchmark chart]

Previous tiers of 1080 cards were advertised as 4K cards, but "ultra" settings? Not achievable unless you want to play at 30fps. Which fits, given 4K monitors of the same vintage were either 4Kp24 (HDMI) or 4Kp60 (DP) max.

 

The prices for GPUs do not reflect gaming performance at present. If they were still at MSRP, the x80 parts would be viable, but at present prices the x60 parts are a bad value, and everything above them isn't worth the cost for the performance increase.

 

So it sucks to be a 4K gamer right now, paying $2500 for a GPU that should cost less than $800.


48 minutes ago, Kisai said:

-snip-

so from your own examples and conclusion, none of the current cards are overkill ... not even a 3090ti


28 minutes ago, Exty said:

so from your own examples and conclusion, none of the current cards are overkill ... not even a 3090ti

Anything higher than a 3060 is overkill for 1080p60 gaming, which is 67% of the steam survey base, and what 100% of console users use. Most people simply do not have monitors or televisions to make it worth buying hardware they can't use.

 


2 hours ago, Kisai said:

Anything higher than a 3060 is overkill for 1080p60 gaming, which is 67% of the steam survey base, and what 100% of console users use. Most people simply do not have monitors or televisions to make it worth buying hardware they can't use.

 

100% of console users? The new generation of consoles is targeting 4K60; even the PS4 Pro was targeting 4K30. I don't know what cave you live in that 1080p60 is still the norm, but can you even buy 1080p TVs nowadays? Also, wouldn't the Steam survey include bots too? Those would just default the res to 1080p.


3 hours ago, Kisai said:

Most people are not playing 1080p120,

Doesn't mean cards that are capable of doing such a thing are overkill as a whole, as you seem to be arguing.

^-^

1 hour ago, Elisis said:

Doesn't mean cards that are capable of doing such a thing are overkill as a whole, as you seem to be arguing.

It's just the commonality trap again: something being common doesn't mean the less common things are overkill. It just so happens that some things are more common than others.

 

Similarly, something being expensive, or "too expensive", doesn't make it overkill.

 

Overkill is using a flamethrower to kill an ant, or a 16 lb sledgehammer to drive a nail, etc.


On 1/16/2022 at 4:47 AM, Kisai said:

-SNIP-

Thank you for confirming that I'm part of the 1% actually owning a 3080. I feel validated now. /s

 

12 hours ago, Kisai said:

Anything higher than a 3060 is overkill for 1080p60 gaming, which is 67% of the steam survey base, and what 100% of console users use. Most people simply do not have monitors or televisions to make it worth buying hardware they can't use.

4K actually surpassed 1080p as the majority of TV "market share" in 2019. Seeing as all current consoles support 4K, I'd argue that 1080p60 is pretty much dead or dying on consoles, not what 100% use.

 

14 hours ago, Kisai said:

x50 = 1080p60 Medium

x60 = 1080p60 Ultra

x70 = 1080p90, 2560x1440p60

x80 = 1080p120, 2560x1440p90, 4kp60

This "tiering" doesn't make sense at all. There is so much variation in what games are actually played: a 3050 could be anywhere from a 1080p 390Hz high-settings GPU to a 1080p 60Hz low-settings GPU depending on whether you play Valorant or Cyberpunk. Looking at an average across dozens of games doesn't even begin to tell the whole story.

 

But even then, the numbers you talked about don't end up being applicable. These days, 60/70/80-tier GPUs are much more capable than what you think their target is. It's not as simple as "get a 3060 for 1080p60".

[GPU benchmark screenshots]

 

In the end these numbers or comparisons don't really matter either way atm. People will buy whatever GPU they can get their hands on. The GPU market sucks. Not just for 4K gamers. For everyone.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


Some interesting discussion. I'd throw in: just because you're not sitting at the max refresh doesn't mean you can't benefit from it. VRR helps a lot.

 

I currently run a 3070 for 4K60+ gaming, as it was all I could reasonably get. Would I love a 3080? Hell yes. Do I want to pay for one at current market prices? Pass. Also, the 3070 is near enough equivalent to a 2080 Ti unless you hunt for extremely specific edge cases, and removing the 60 Hz limit over HDMI helps a lot. At the end of the day, people will adjust graphical settings to get the balance of performance and quality they want out of the hardware. I know I'm not running the latest titles at 4K Ultra native at >>60fps; I might need to back off quality to very high, or use DLSS if available.

 

Also on the Steam stats: there is another way of looking at them that might be more insightful, but it would take more work. What is the trend? "New" tech will always lag, as it is diluted by the existing older install base. So, for example, is the 4K % going up faster than the 1080p %? The 3080's share went up from 0.84% to 1.1% in 5 months. They are, slowly, getting out there.
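That 0.84% → 1.1% figure can be turned into a rough trend line. This is a naive constant-growth extrapolation from just two data points; real adoption curves won't behave this neatly, so treat it as a sketch:

```python
# Naive exponential extrapolation of the 3080's Steam survey share,
# from the 0.84% -> 1.1% over 5 months figure quoted above.
# Assumes constant relative growth, which is a big assumption.
start_share, end_share, months = 0.84, 1.10, 5

# Per-month growth factor implied by the two survey points (~1.055x).
monthly_growth = (end_share / start_share) ** (1 / months)

# Project the share 12 months past the 1.1% data point.
projected_12mo = end_share * monthly_growth ** 12

print(f"implied monthly growth: {monthly_growth:.3f}x")
print(f"projected share in 12 months: {projected_12mo:.2f}%")
```

Under that (optimistic) assumption the 3080 would still only be around the 2% mark a year later, which is consistent with the point that new tech trickles into the install base slowly.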

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


I'm so disappointed in NVIDIA. I had $5000 burning a hole in my pocket and I couldn't wait to pay a scalper for one of the 10 units released to market. Jokes aside, I wonder if anyone is actually, truly affected by this?


On 1/16/2022 at 1:57 PM, leadeater said:

-snip-

What’s wrong with your spelling and grammar?

Jesus Christ.

 

I’m waiting for the 3090 Ti kingpin or the strix white.

 

Hopefully there’s no delay whatsoever.


9 hours ago, Stahlmann said:

 

 

In the end these numbers or comparisons don't really matter either way atm. People will buy whatever GPU they can get their hands on. The GPU market sucks. Not just for 4K gamers. For everyone.

Those graphs still indicate a 3060 is 1080p60 performance, so I don't know what you're trying to prove; they don't disprove what I've said.

 

Unless you replaced your computer and television screens in the last year with top-of-the-line HDMI 2.1 HDR10 models, you also aren't getting a 4Kp60, let alone 4Kp120, experience, as all those early 4K televisions were 4Kp24 on HDMI 1.4b and rarely 4Kp60 on HDMI 2.0. Also, I don't know anyone silly enough to spend more on a monitor than their GPU or console. You buy a monitor or television and keep it until it dies, which can be as long as 15 years. When I decided to buy a 4K monitor to replace my existing Samsung 1080p monitor, there were only about two monitors out there that checked all the boxes (4Kp60, DisplayPort, and HDMI 2.0 capable of 4Kp60); yes, there were more expensive options, but doubling the price for G-Sync was not worth the cost.

 

And yes, 100% of consoles are still doing 1080p unless you've somehow got a magic unobtainium wand and created a PS5 out of thin air. Yes, they can do 4Kp120 if you have a monitor or television released in the last two years with HDMI 2.1 on it. But guess what: all those existing 4K TVs and monitors? 4Kp60 if they only support HDMI 2.0, and 4Kp120 only with 4:2:0 content, but good luck, as no films or TV shows are released like that.
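The HDMI limits being argued about here can be sanity-checked with a little arithmetic. This is a rough sketch: the ~14.4 and ~42.7 Gbit/s effective data rates for HDMI 2.0 and 2.1 are assumed figures, and blanking intervals are ignored, so real signaling needs somewhat more than these active-pixel numbers:

```python
# Rough bandwidth check of the HDMI claims above (active pixels only,
# blanking ignored). Effective data rates are assumptions:
# ~14.4 Gbit/s for HDMI 2.0 (18 Gbit/s raw minus 8b/10b overhead),
# ~42.7 Gbit/s for HDMI 2.1 FRL.
def video_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI_2_0, HDMI_2_1 = 14.4, 42.7

modes = {
    "4K60  RGB 8-bit":    video_gbps(3840, 2160, 60, 24),
    "4K120 RGB 8-bit":    video_gbps(3840, 2160, 120, 24),
    "4K120 4:2:0 8-bit":  video_gbps(3840, 2160, 120, 12),  # chroma subsampled
}
for name, gbps in modes.items():
    print(f"{name}: {gbps:5.1f} Gbit/s  "
          f"HDMI 2.0 ok={gbps <= HDMI_2_0}  HDMI 2.1 ok={gbps <= HDMI_2_1}")
```

Even this simplified math lands where the post does: 4K60 8-bit RGB squeezes into HDMI 2.0, 4K120 RGB does not, and 4K120 only fits HDMI 2.0 by dropping to 4:2:0 chroma subsampling.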

 

And the PS4 Pro? Most games are 1080p60, even fewer are HDR, and those that claim to be 4K are actually upsampled to 4K or only running at 4Kp30; they aren't 4K60.

https://www.eurogamer.net/articles/digitalfoundry-2016-4k-gaming-on-ps4-pro-tech-analysis

 

 


9 hours ago, porina said:

Some interesting discussion. I'd throw in, just because you're not sitting at the max refresh doesn't mean you can't benefit from it. VRR helps a lot.

VRR is an HDMI 2.1 spec, and again, unless you have a recent monitor or television, you likely don't even have it.

9 hours ago, porina said:

 

Also on the Steam stats, there is another way that might be more insightful, but would take more work. What is the trend? "new" tech will always lag as it is diluted by the existing older install base. So for example, is the 4k % going up faster than 1080 %? The 3080 example went up from 0.84% to 1.1% in 5 months. They are, slowly, getting out there.

Unfortunately, the Steam stats never track screen resolution over time, only GPU models and CPU models. You know, the stuff people like to fight about.

[Steam Hardware Survey screenshot]

 

Though it may be interesting to point out how quickly people are adopting Windows 11, since that acts as a proxy for new computer sales.

 


21 minutes ago, Kisai said:

Unfortunately, the steam stats never track screen resolution

Guess we can only do it indirectly, by looking at how the reported values change over time.


