Nvidia: Unlaunching the 12gb 4080

Athan Immortal
8 minutes ago, StDragon said:

So, the box art, plastics on the card, and VBIOS will all report a 4080. But the relaunch will have a 40xx sticker slapped on the box at sale??

 

I hope they affix it at an angle, in a font similar to a rejection notice 🤣

AMD changed card specifics after launch last generation too; on one card they just updated the BIOS.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


4 minutes ago, Brooksie359 said:

And they are going to buy the 4080 16gb at 1200? I am not sure why you would assume that they wouldn't sell the 4070 for 900 when they are selling the 4080 for 1200. 

Selling a 4070 at $900 would be as bad as a fake 4080, but I wouldn't be surprised if they're going to sell the 4080 relabeled as a 4070 at the same price.

25 minutes ago, Shimejii said:

I don't think the AIBs got screwed over as much as everyone thinks with the coolers and such: if they didn't label them "4080 12GB" but just "4080", they can just slap those coolers on the 4080 16GB models and call it a day. The ones they did buy, yeah, they got screwed on, which sucks, but they're used to that by now.

 

Nvidia still can't be trusted when it comes to this stuff, so I expect either a dump of GPUs in China or a completely different product in a year, once it's out of people's minds.

The AIBs are getting screwed pretty badly if they already paid Nvidia for the 4080 12GB dies; they'd be forced to sell the cards at a lower price and also have to relabel all the cards and boxes.

But I don't trust Nvidia on this at all: they're either going to rebrand the 4080 into a 4070 and sell it at the same price, or the 4080 12GB might end up as some OEM card.


25 minutes ago, Blademaster91 said:

Selling a 4070 at $900 would be as bad as a fake 4080, but I wouldn't be surprised if they're going to sell the 4080 relabeled as a 4070 at the same price.

The AIBs are getting screwed pretty badly if they already paid Nvidia for the 4080 12GB dies; they'd be forced to sell the cards at a lower price and also have to relabel all the cards and boxes.

But I don't trust Nvidia on this at all: they're either going to rebrand the 4080 into a 4070 and sell it at the same price, or the 4080 12GB might end up as some OEM card.

I think if you look at their post, they mostly emphasize that the confusing naming convention is bad, not that the price or performance of the 12GB version wasn't good enough. I think they'll very likely just rebrand it as a 4070 or something and keep the same pricing, since that wouldn't disrupt any plans while still addressing the bad naming convention issue.


4 hours ago, Blademaster91 said:

Nvidia probably realized they screwed up, either they thought they might get sued for having 2 80 tier cards with completely different specs, or AMD might have a competitive RDNA3 card.

Nah, because Nvidia has been doing this for ages with their lower tier models.

 

The only screw-up they realized is that the 4080 12GB would be bashed by the media for the life of the 4000 series and probably beyond, much like Fermi with the GTX 480. They probably expected the backlash, but expected it to die down quickly; it didn't.

 

Also, the board partners and retail stores probably all complained out of the fear, most likely justified, of high numbers of returns and customer service calls (which are actually very costly for companies). Unlike buyers of cheap cards, people buying these 80-class cards are after high performance, not Facebook games or whatever. People will buy it, realize the performance is not what they expected, return it, and probably blame the manufacturer for making a crap card that doesn't deliver. Yes, they'll pay more for the 16GB, but in their mind it won't be Nvidia's fault, but ASUS's, or Gigabyte's, or whoever's.

 

I wouldn't be surprised if AIBs made only the strict minimum number of 12GB cards required by Nvidia to comply with their agreement (if any), and retailers just refused to buy them. OEMs, though, would probably love the 12GB variant: they'd be selling a PC with a "GeForce RTX 4080" sticker on it, but it's the 12GB model.

 

It also opens the door for AMD to go "our new RDNA3 is more powerful than the 4080!", even though it's the 12GB model they're targeting.

 

4 hours ago, Blademaster91 said:

Although the product stack makes no sense now, with the 4080 16GB at $1200, there isn't any room for a 4070 at $800-900. And the AIB's are getting screwed so hard if there were already 4080 12GB cards.

Prices can be dropped; it will depend on sales figures. If the 4090 continues to sell strongly, then expect the 4070 at $800-900. If the only people buying right now are businesses (for their employees), desperate people, and the rich who need the best of the best to show off to their friends, and sales go down afterwards, then great! Expect a price drop.

 

But so far, it looks like Nvidia could sell a $200-per-month "10% boost pack" subscription for the 4090, and it would be a huge success.

 

 

Don't get your hopes up for AMD to come in and sell their RDNA3 GPUs at $500 with performance beating the 4080 16GB. AMD is a publicly traded company; their promise to shareholders and investors is to make money. They'll just price their GPUs to performance, and that's all. Maybe $50 cheaper, which won't make much of a difference on an $850-$900 GPU. They don't need to be price aggressive unless they really want to gain market share over margins.

AMD only pulled that crazy move with Zen 1 because they had been forgotten after an unimpressive line of CPUs for 14+ years.


They could go with the 4070 ti super 12gb 

My Folding Stats - Join the fight against COVID-19 with FOLDING! - If someone has helped you out on the forum don't forget to give them a reaction to say thank you!

 

The only true wisdom is in knowing you know nothing. - Socrates
 

Please put as much effort into your question as you expect me to put into answering it. 

 

  • CPU
    Ryzen 9 5950X
  • Motherboard
    Gigabyte Aorus GA-AX370-GAMING 5
  • RAM
    32GB DDR4 3200
  • GPU
    Inno3D 4070 Ti
  • Case
    Cooler Master - MasterCase H500P
  • Storage
    Western Digital Black 250GB, Seagate BarraCuda 1TB x2
  • PSU
    EVGA Supernova 1000w 
  • Display(s)
    Lenovo L29w-30 29 Inch UltraWide Full HD, BenQ - XL2430(portrait), Dell P2311Hb(portrait)
  • Cooling
    MasterLiquid Lite 240

2 minutes ago, GOTSpectrum said:

They could go with the 4070 ti super 12gb 

Yeah that's my bet

I wonder how this decision played out in the Nvidia offices. Did Jensen just "OK" this, or did he orchestrate it?
Maybe we will never know.


Nvidia: whoopsie.gif
Community: 😑
Investors: 😡

CORSAIR RIPPER: AMD 3970X - 3080TI & 2080TI - 64GB Ram - 2.5TB NVME SSD's - 35" G-Sync 120hz 1440P
MFB (Mining/Folding/Boinc): AMD 1600 - 3080 & 1080Ti - 16GB Ram - 240GB SSD
Dell OPTIPLEX:  Intel i5 6500 - 8GB Ram - 256GB SSD

PC & CONSOLE GAMER

The amount of conspiracy theories and spins people will put on any Nvidia story is astonishing. 

 

 

Good on Nvidia for renaming the card, I guess. People seemed to care a lot about the sticker on the box, so now they are changing it. All's well that ends well?

 

 

In before accusations of me being a braindead fanboy and shill for Nvidia because I care more about things like price, performance and features rather than which number it says on the box. 


how many "Houston, abort abort abort" jokes are coming?

Claiming that having 2 cards with the same name is confusing ignores the 3 versions of the 2060, the multiple Titan cards, the "superclocked / super superclocked / super duper mega clocked" cards from third parties (whose naming schemes Nvidia would be approving, given the rest of the control they exert over AIBs), laptop vs. desktop GPU naming schemes (probably the most egregious chip-naming scam), and the sheer volume of GT 1030 cards rebadged from different-series chips each generation.

 

This definitely follows the very public lashing Nvidia took for trying to pass off a card with 30% less performance as an XX80-tier product when it was clearly an XX70 card from the start, at a ludicrous price hike from $499 for the 3070 to $899 for the "4070".

The best gaming PC is the PC you like to game on, how you like to game on it


34 minutes ago, LAwLz said:

Good on Nvidia for renaming the card, I guess. People seemed to care a lot about the sticker on the box, so now they are changing it. All's well that ends well?

The existence of consumer protection laws and agencies, and the decades upon decades of lawsuits across all industries and products relating to product naming, would indicate many things; here are just a few: yes, people do care about product names; yes, product names matter; and it matters more than you think it does.

 

If Nvidia changed it, especially this late, then yes, it matters, a lot. This move is in no way cheap, and it's most likely quite the logistical and strategic problem.

 

34 minutes ago, LAwLz said:

In before accusations of me being a braindead fanboy and shill for Nvidia because I care more about things like price, performance and features rather than which number it says on the box. 

They all matter. You can care about all of these AND also care about misleading product naming. You simply do not have to choose between them. You can also care more about other things and still care about misleading product naming.

 

Need I go on?

 

I think it's rather unneeded and borderline poor form to try and downplay an actual legitimate issue. If you don't care, then one great way of showing it is not vocalizing (via text) that the naming doesn't matter to you. Like, once is enough.


8 hours ago, porina said:

I'd be very curious what the real reasons are behind this. If you take their statement at face value, does that mean the 4080 12GB just disappears? They must have made them to build up stock for launch. Some must already physically exist. So the question then is, are they going to re-launch it (later) as a 4070-something?

For real everyone gonna be so hyped and Nvidia gonna wait like a month and then magically be like surprise we are launching the 4070 early for only $899 $850 and it's got 12gb of RAM!!!


9 hours ago, Athan Immortal said:

Summary

Nvidia are pulling the 4080 12GB card.

 

Quotes

 

My thoughts

For a company as headstrong as Nvidia, this is welcome news. Everyone and their granny could see through the Fauxty80.

This does mean however they're left with the only XX80 card being unapologetically $1200, which is ridiculous.

 

Sources

https://www.nvidia.com/en-us/geforce/news/12gb-4080-unlaunch/

Well, it was the right thing to do.

 

What I expect is that they will tell the AIBs to rename it RTX 4070 or 4070 Ti, or wherever it would correctly slot in based on memory bandwidth. By bus width, 192-bit would actually make it a "60" card if this were the 30 series, whereas the 256-bit 16GB card would be a 70 Ti. However, we need to look at the actual bandwidth number, not the chip configuration: 557 GB/s (12GB), below the RTX 3070 Ti (608 GB/s), vs 742 GB/s (16GB), below the RTX 3080 (760 GB/s).

 

So the correct label would be 4070 Ti, assuming you accept the slight nerfs to bandwidth. If you go by memory chip configuration, however, none of the cards are labeled correctly and should be bumped down, which makes me wonder what compromise was made with the memory chips.
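The slotting-by-bandwidth idea above can be sketched as a quick script. Note the assumptions: the RTX 3070's 448 GB/s figure is my addition (the "(608)" figure quoted in this thread actually matches the 3070 Ti), and the "classify by bandwidth" heuristic is the post's, not any official classification.

```python
# Slot a card into Ampere territory by memory bandwidth (GB/s).
# Figures: 3070 = 448, 3070 Ti = 608, 3080 = 760 (stock specs).
ampere = [("RTX 3070", 448), ("RTX 3070 Ti", 608), ("RTX 3080", 760)]

def slot(bandwidth_gbs: float) -> str:
    """Return the fastest Ampere card this bandwidth matches or beats."""
    best = "below RTX 3070"
    for name, bw in ampere:
        if bandwidth_gbs >= bw:
            best = name
    return best

print(slot(557))  # RTX 3070    -> the 12GB card lands in 3070 territory
print(slot(742))  # RTX 3070 Ti -> the 16GB card sits just below a 3080
```

By this yardstick the 12GB card lands a full tier below the 16GB one, which is the whole argument for a 4070-class label.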

 


14 minutes ago, Kisai said:

What I expect is that they will tell the AIBs to rename it RTX 4070 or 4070 Ti, or wherever it would correctly slot in based on memory bandwidth. By bus width, 192-bit would actually make it a "60" card if this were the 30 series, whereas the 256-bit 16GB card would be a 70 Ti. However, we need to look at the actual bandwidth number, not the chip configuration: 557 GB/s (12GB), below the RTX 3070 Ti (608 GB/s), vs 742 GB/s (16GB), below the RTX 3080 (760 GB/s).

But looking at just the memory bandwidth isn't correct either. The whole point of adding larger caches to GPUs is to make it so large memory buses and bandwidths are not necessary. If we are going to act this way for Nvidia then we should have done so for AMD, but we didn't, did we...

 

The number of SMs, the size of the die (cache uses tons of space), TDP/TGP, etc. all support it being a high-end product. The fact that it has a 192-bit GDDR6 bus, taken in a vacuum, is not informative at all.

 

Edit:

An x60-series GPU die would be around 42 SMs, not the 60 SMs of the now-defunct RTX 4080 12GB.


20 minutes ago, leadeater said:

But looking at just the memory bandwidth isn't correct either. The whole point of adding larger caches to GPUs is to make it so large memory buses and bandwidths are not necessary. If we are going to act this way for Nvidia then we should have done so for AMD, but we didn't, did we...

The number of SMs, the size of the die (cache uses tons of space), TDP/TGP, etc. all support it being a high-end product. The fact that it has a 192-bit GDDR6 bus, taken in a vacuum, is not informative at all.

Edit:

An x60-series GPU die would be around 42 SMs, not the 60 SMs of the now-defunct RTX 4080 12GB.

If we do it that way, then we're accepting that the entire 40 series traded memory bandwidth away, and thus the difference between the 12GB and 16GB cards should be even less. It isn't.

 

SMs between two different generations of cards are not equal.

[Image: Nvidia Editor's Day slide (nvidia-editors-day-53.jpg)]

The fact that the 12GB and 16GB 4080s were not even the same chip justifies not calling them by the same part number.

The 3070 and 3060 Ti are GA104 parts; the GA103 part shows up as the mobile 3080 Ti.

 

So there is some heavy dishonesty here by Nvidia, possibly trying to sell lower-end parts intended for lower-end lines under higher-end part names. We just don't know which. Either the "4080 12GB" was really intended as a 4060 12GB, or the "4080 16GB" was intended as a 4070, and somewhere along the line they looked at the power requirements and realized they would get massacred in the media for selling lower-end parts with higher-end power requirements.

 

I'll reiterate that a die shrink usually results in power savings. So going from 8nm to 4nm, the exact same "part" should, at minimum, cut power in half for the same performance, or double the performance for the same power input.

 

So if we applied that to the 30-series as a pure die shrink:

3090 Ti: 450 W @ 8nm → 225 W

3090: 350 W @ 8nm → 175 W

3080 Ti: 350 W @ 8nm → 175 W

3080: 320 W @ 8nm → 160 W

3070 Ti: 290 W @ 8nm → 145 W

3070: 220 W @ 8nm → 110 W

3060 Ti: 200 W @ 8nm → 100 W

3060: 170 W @ 8nm → 85 W

 

So from a power point of view, 3090 Ti performance should land on a 225 W 4070. That seems to align with the "4080 12GB" as shown in the image above, or maybe the 70 Ti.
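The halving arithmetic above is easy to reproduce. To be clear about the assumption: the TDPs are the ones listed in the post, and the idea that an 8nm→4nm shrink halves power at equal performance is the poster's rule of thumb, not a measured result.

```python
# "Die shrink halves power" estimate from the post above.
# Assumes 8nm -> 4nm cuts power in half at equal performance,
# an optimistic rule of thumb rather than a measured fact.
ampere_tdp_w = {
    "3090 Ti": 450, "3090": 350, "3080 Ti": 350, "3080": 320,
    "3070 Ti": 290, "3070": 220, "3060 Ti": 200, "3060": 170,
}

# Hypothetical 4nm power for the same performance tier.
shrunk_w = {card: watts // 2 for card, watts in ampere_tdp_w.items()}

print(shrunk_w["3090 Ti"])  # 225 -- the power class of the "4080 12GB"
```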

 

But at any rate everyone called out Nvidia here for trying to market a weaker part as something it's not.

 


17 minutes ago, Kisai said:

If we do it that way, then we're accepting that the entire 40 series traded memory bandwidth away, and thus the difference between the 12GB and 16GB cards should be even less. It isn't.

No it doesn't mean that at all. It means the difference between Ampere and Ada is less for memory bandwidth. Like for like in the same architecture is still the same, less is still less.

 

Ada simply does not necessitate the same memory bandwidth as Ampere does, it's as simple as that. That's why looking at bus widths and bandwidths cannot and will not tell you the whole story.

 

17 minutes ago, Kisai said:

SMs between two different generations of cards are not equal.

I was referring to the ratio, or percentage of difference, between the number of SMs across the different product tiers. An RTX 40-series x60-class product is likely to come in around that 42 SM count in the Ada architecture, assuming the ratios of difference stay roughly the same.

 

17 minutes ago, Kisai said:

The fact that the 12GB and 16GB 4080s were not even the same chip justifies not calling them by the same part number.

That is not part of this at all.

 

The GPU die and its configuration in the defunct RTX 4080 12GB simply are not equivalent to an x60-series product. Your reasoning didn't encompass enough factors, and that's what I was pointing out. No one metric makes a product.

 

Edit:

Like, we both agree on it being called an RTX 4070 Ti, but in general I do not care what it's called so long as it's not RTX 4080. I just wanted to caution against pinpoint assessments like that; given such a large change in architecture, they just exacerbate the problem.

 

"These are just names" point is valid itself, they are just names and it applies to the GPU die names, the graphics cards names etc. What matters is they aren't misleading. Just something else I'd advise not getting too hung up on.


11 hours ago, CHICKSLAYA said:

I just don't know how they position the stack now. No one is going to buy a 4070 at $899, or even $799. The most expensive they could get away with would be like $649. Then they look like they were just trying to scam us 

Exactly this. This screws up the rest of their stack now. It appears they believed the cards were too powerful to offer within their typical tiers and would provide too much value, hindering them from squeezing more out of the consumer. They would have been better off using a new naming tier, or naming the 4090 an official Titan and pushing every card up a tier.


As others have said, this will likely just be the same card with a different name. The one that makes the most sense is 4070 Ti, but charging $900 for it is too steep even for Nvidia, because it makes their 4080 16GB look bad at $1200. So my guess is they'll wait until they're forced to cut the 4080 16GB's price before introducing this card as a 4070 Ti for $699 (presumably if/when the 4080 16GB comes down to $899 or $999).

 

That would leave the 4070 enough room to exist between $499 and $599, and enough room for all the other cards down the stack to follow (4060 Ti for $399, 4060 for $329, 4050 for $249).

 

Though those numbers are speculation on my part.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Die per tier:

Series   60      70      80      80 Ti   90
900      GM206   GM204   GM204   GM200   -
1000     GP106   GP104   GP104   GP102   -
2000     TU106   TU106   TU104   TU102   -
3000     GA106   GA104   GA102   GA102   GA102
4000     -       -       AD103   -       AD102

Memory bus width (bits) per tier:

Series   60    70    80    80 Ti   90
900      128   256   256   384     -
1000     192   256   256   352     -
2000     192   256   256   352     -
3000     192   256   320   384     384
4000     -     -     256   -       384

 

Since comparisons have been made to past series, I thought I'd throw up the tables above. Before you ask about the lower Ti and Super models: I wanted to keep it somewhat simple, so I'm leaning towards each series' earlier mainline GPUs. The 3080 is the 10GB model; the 1060 is the 6GB model.

 

I'm only listing the die and memory width. As a longer-term trend, the _02 dies have been used for 80 Ti and higher products; Ampere was the exception, where one also made it to the 80 tier. Historically, the 70-80 tiers are _04 dies. With Ada we see a _03 slotting in. The now-defunct 4080 12GB with AD104 is historically consistent with the 70 to 80 tier. It only feels like a drop compared to Ampere because Ampere punched way above expectations.

 

We see something similar for memory width. The 80 Ti and higher tiers tended to be 352-bit and up; the 70-80 tiers were 256-bit, with Ampere's 80s again punching above that. This is the only area where the defunct 4080 12GB may appear below par at 192-bit, historically 60-tier territory, but we should make some allowance for the bigger cache: effectively, the memory system may perform closer to the levels the historic trends suggest.
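The historical pattern described here can be captured as a tiny lookup. The suffix→tier mapping below is my transcription of the trend in the tables above (with Ampere's 80 as the noted exception), not any official Nvidia rule.

```python
def historical_tier(die: str) -> str:
    """Map a die-name suffix to its historical product tier, per the
    trend described above: _02 -> 80 Ti and up (plus Ampere's 80),
    _03/_04 -> 70-80 tier, _06 -> 60 tier. A heuristic only."""
    suffix = die[-2:]
    return {"02": "80 Ti / 90", "03": "70-80", "04": "70-80", "06": "60"}[suffix]

# AD104 (the defunct "4080 12GB") maps to the historical 70-80 tier,
# consistent with relaunching it as a 4070 / 4070 Ti.
print(historical_tier("AD104"))  # 70-80
```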

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


It would be hilarious if they simply rename it to 4070 Ti and sell it for the exact same price.

I want to see everyone that said it should've been a 4070 get mad again, cause this is what they said they wanted. 

 

 

 

 


9 hours ago, leadeater said:

The existence of consumer protection laws and agencies, and the decades upon decades of lawsuits across all industries and products relating to product naming, would indicate many things; here are just a few: yes, people do care about product names; yes, product names matter; and it matters more than you think it does.

If Nvidia changed it, especially this late, then yes, it matters, a lot. This move is in no way cheap, and it's most likely quite the logistical and strategic problem.

They all matter. You can care about all of these AND also care about misleading product naming. You simply do not have to choose between them. You can also care more about other things and still care about misleading product naming.

Need I go on?

I think it's rather unneeded and borderline poor form to try and downplay an actual legitimate issue. If you don't care, then one great way of showing it is not vocalizing (via text) that the naming doesn't matter to you. Like, once is enough.

But if we are being realistic, how many people would actually have been misled?

Most people don't know what graphics card their PC even has. So that group is safe regardless of what the card is called.

 

The people who do know a little bit are safe because even if they see 4080 12GB and 4080 16GB, they will understand that the one that says 16GB and is a lot more expensive will be faster. Even if they aren't technical they should be able to understand that a card with "16" in its name that costs 300 dollars more than the card with "12" in the name will be faster.

I mean, is it really that far fetched to assume people understand that a card with 16 in its name is faster than a card with 12 in its name? Even if we ignore the big price difference? 16 > 12. 

 

The people who are more educated should, and probably will, look up benchmarks and they will see the difference as well.

 

 

I think the amount of people who would actually get ripped off by the 4080 12GB 4080 16GB naming is pretty close to 0. The issue was way overblown.

 

 

I understand that names do matter to some degree, but I think in this case it doesn't really matter. I think things like rebadging (something AMD has been very guilty of if we go back a couple of generations) are a much bigger issue where people actually get ripped off. Renaming something and pretending it's a new product actually results in people not getting what they thought they paid for. Yet we saw very little backlash when that happened... I remember some of the same people who defended that practice are in this very thread, name-calling other forum users and throwing shit at Nvidia for something that in practice didn't really impact anyone.

 

 

If you can give me examples of how this naming scheme would actually have resulted in a significant number of people getting ripped off, then I might change my mind. But until then, the three groups of potential buyers I can think of would all have been safe, and therefore I don't see this as an issue.

 

 

Also, I don't appreciate the attempt at silencing me for having a different opinion. You're basically telling me to fuck off if I don't jump on the bandwagon.

I don't care about the name because I see it as a non-issue. I commented in this thread because I care about the way people reacted to the name and I wanted to comment on that.


Since there were leaks and hints of the 4090 being in production since July to build up stock, my bet is that both 4080 versions have already been in production for at least a month. That means there's quite some stock of 4080 12GB cards in various warehouses. While you could make use of cheap, underpaid, exploited workers in China to repackage everything, switch out all the promo gear inside the boxes and put stickers over the silk screens, the cheapest bet would be to sell these (as someone else mentioned) as an Asia-only limited run (maybe even specifically for gaming/internet cafés; didn't they do that at least once before?) or sell them to system integrators. The amount of work and money going into repackaging everything, just to be forced to sell the cards for cheaper, hardly seems worth it (because, let's face it: the chips are the expensive part, and I don't think NVIDIA will reimburse AIBs if they suddenly decide to mark the chips as 4070).

 

The question is: what happens with the rest of the stock of 4080 12GB chips? That probably depends on whether the actual 4070 uses the same chip or not. If it does: congratulations, more 4070 chips. But then NVIDIA would probably need to lower the price of those chips, and I bet AIBs have already paid for several supply runs, so NVIDIA would need to partially reimburse AIBs for those deliveries.

 

If it's not the same chip, then it will probably become a 4070 Ti or 4070 Super down the line (which raises the question: aren't those boards already in design?), or they just cancel the whole chip production and find a solution (as suggested before) for a special sale.

 

That leaves the question: was the 4080 12GB always intended as a 4080, or was it renamed? Many folks speculated that it started life as a 4070. What does the chip lineup look like? How much needs to be changed? How many chips have already been made?

Use the quote function when answering! Mark people directly if you want an answer from them!


2 hours ago, LAwLz said:

But if we are being realistic, how many people would actually have been misled?

Most people don't know what graphics card their PC even has. So that group is safe regardless of what the card is called.

 

The people who do know a little bit are safe because even if they see 4080 12GB and 4080 16GB, they will understand that the one that says 16GB and is a lot more expensive will be faster. Even if they aren't technical they should be able to understand that a card with "16" in its name that costs 300 dollars more than the card with "12" in the name will be faster.

I mean, is it really that far fetched to assume people understand that a card with 16 in its name is faster than a card with 12 in its name? Even if we ignore the big price difference? 16 > 12. 

 

Anyone that doesn't follow computers (your view is biased). There's absolutely nothing that will tell the average consumer there's a performance difference, just a VRAM difference.


4 minutes ago, ewitte said:

Anyone that doesn't follow computers (your view is biased). There's absolutely nothing that will tell the average consumer there's a performance difference, just a VRAM difference.

I'll pose the same argument again as I did in the previous thread: anyone who doesn't follow computers would also conclude that a 40-series card is better than any 30-series card, because a higher number clearly means better performance. I have yet to see anyone actually refute this notion. If two 4080s differentiated by VRAM are opaque to non-techies, so is the typical clinical numbering, because nothing about those names gives any indication of their respective performance differences.


2 hours ago, ewitte said:

Anyone that doesn't follow computers (your view is biased). There's absolutely nothing that will tell the average consumer there's a performance difference, just a VRAM difference.

And you don't think that consumer will assume a difference in VRAM will result in a difference in performance?

 

If you ask the average consumer who doesn't follow hardware "which card is faster: the one with 12GB of RAM, or the one with 16GB of RAM that costs 300 dollars more?", chances are they'll say the latter.

