
Nvidia: Unlaunching the 12GB 4080

Athan Immortal

I'm very much getting Star Wars microtransaction vibes from this situation.

NV really is the EA of the hardware world.

They're just trying their luck with some really shitty business practices/products and seeing what they can get away with at the consumer's expense.

If there's backlash, they backtrack and act like they did something great and they're now the good guys.

Businesses are min-maxing consumers now; it's great for the shareholders, but terrible for the consumers.


On 10/16/2022 at 6:13 AM, leadeater said:

That's assuming people actually look that far. Your average gamer, now and in the past, talks about things like the RTX 3080 being a good graphics card, and I simply do not think it's a safe assumption that these people are going to read past, or take much notice of, the 12GB or 16GB being there.

I think people who buy a product by name would probably go for the cheaper one: "wow, I can get a 4080 for this price!" Then they'll grab the cheaper one and still game happily.

If Nvidia were to name it 4070 or something lower, it would make some people who shop for "an i7 and an xx80 Nvidia GPU" actually spend more.
It also opens the gate for AMD to compare their x700 card against Nvidia's "4080", which might make Nvidia look bad on paper, because some people think the xx80 and x800 products should be on par. -shrug-

Does the consumer lose out on anything? No, not really: they paid less, they got less.

The only thing I can see that would be confusing is if, before purchasing, they google "4080 benchmark" and get the 16GB version while they're looking to buy the 12GB version. That could be misleading, but it's a PEBCAK thing, imo.



59 minutes ago, Moonzy said:

I think people who buy a product by name would probably go for the cheaper one: "wow, I can get a 4080 for this price!" Then they'll grab the cheaper one and still game happily.

I don't think that's a good assumption at all, since there are many people who quite openly say they bought the top-end system from X company, etc.

 

And it doesn't matter if they will game happily; buyers are still being misled as to what they are getting in those types of situations. Had it said something else, it would have prompted a pause of contemplation: "Do I want to go down to a lower model/class of product?" Then you have to add in whether they would have purchased at all, or waited longer until they could afford what they wanted. Or, when more of the product line comes out: why buy a 4070 for X when a 4080 is only Y more? A "4080" is so much better... (which 4080?).

 

Paying less and expecting less is simply a bad argument when you're being misled as to what the difference actually is. If you cannot make a fair assessment because of misleading product names, then you can't fairly evaluate those expectations.

 

59 minutes ago, Moonzy said:

The only thing I can see that would be confusing is if, before purchasing, they google "4080 benchmark" and get the 16GB version while they're looking to buy the 12GB version. That could be misleading, but it's a PEBCAK thing, imo.

That is still a problem, and it's a very worrying mindset to have, compared to expecting a company to not create an anti-consumer situation in the first place.

 

Tell me: how hard would it have been for Nvidia to avoid this in the first place?

 

Nvidia named the two products the way they did specifically to benefit themselves; to think anything else is naïve. Does that mean they were doing it out of malice? Not really. But let's not kid ourselves: the reason it was named an RTX 4080 rather than something else is that Nvidia actively thought about how it would affect buying decisions and determined the potential sales prospects would be better this way than with a different major model name.

 

Something doesn't have to be a bad product, or bad value, or a lot of other things for consumers to be misled. The only requirement for being misled is being misled. Normalizing misleading practices because a situation isn't "omg harmful" simply isn't a good idea; big or small, it should not be happening.


Just now, leadeater said:

buyers are still being misled as to what they are getting in those types of situations.

Would you care to explain what you mean by "misled"? They still got a 4080; it is (was) called a 4080.

I think it would help to see what they lose out on, which I can't really see: pay less, get less.

 

Don't get me wrong, I would prefer they call it something else myself, due to the "4080 benchmark" reason alone, but I don't see what there is to make such a huge fuss about.



15 minutes ago, Moonzy said:

Would you care to explain what you mean by "misled"? They still got a 4080; it is (was) called a 4080.

I think it would help to see what they lose out on, which I can't really see: pay less, get less.

There are two 4080s; which one can you say they were expecting?

 

"Pay less, expect less" is simply a bad argument when you are being put in a situation where there is confusion about the product. Paying less and getting a lower product model is much more obvious, agreed?

 

Do you think a consumer would expect more from a $900 RTX 4080 than from a $900 RTX 4070? I think yes, they would, since model reputation is a thing and can be observed, even among the less informed.

 

When boiling an entire situation and argument down to a single factor like price, be aware that it could be a narrow or flimsy argument. It might not be, but single-factor arguments in complicated situations are rarely that good.


8 minutes ago, Moonzy said:

Don't get me wrong, I would prefer they call it something else myself, due to the "4080 benchmark" reason alone, but I don't see what there is to make such a huge fuss about.

There is literally no huge fuss at all. It's those, now like yourself, who want to argue the point that are actually making it out to be worse than it is.

 

The naming was bad, it was anti-consumer in nature, and Nvidia has now agreed it was not a good situation and changed it. The issue is literally over...

 

I honestly do not get the point in discussing it further.


13 minutes ago, leadeater said:

There are two 4080s; which one can you say they were expecting?

A 4080; a layman wouldn't care and would just game happily, if I have to guess. Ignorance is bliss, after all.

Again, I agree that it's unnecessarily confusing, but the consumer wouldn't lose out on much.
 

13 minutes ago, leadeater said:

"Pay less, expect less" is simply a bad argument when you are being put in a situation where there is confusion about the product. Paying less and getting a lower product model is much more obvious, agreed?

Having more indication that it's a lower-tier product would be more helpful indeed, but they do have 12GB vs 16GB, so idk.

 

13 minutes ago, leadeater said:

Do you think a consumer would expect more from a $900 RTX 4080 than from a $900 RTX 4070? I think yes, they would, since model reputation is a thing and can be observed, even among the less informed.

Hypothetically, if this were the case, I would first question "why" and look into it further.

Just like back when the 3060 Ti was the same price as the 3070 for a period of time (spoiler alert: they had the same hash rate and efficiency).
 

11 minutes ago, leadeater said:

There is literally no huge fuss at all. It's those, now like yourself, who want to argue the point that are actually making it out to be worse than it is.

It isn't? Literally every techtuber and every place I go was talking about how the 4080 12GB was supposed to be called a 4070.

 

11 minutes ago, leadeater said:

Nvidia has now agreed

Due to backlash.
 

11 minutes ago, leadeater said:

I honestly do not get the point in discussing it further.

Regarding this particular case, I agree; it's pretty much case closed.
 

But now I'm wondering where (or how) we draw the line.
Is having multiple i5s confusing? The i5-10400 vs the i5-10600, for example: they're both i5s.
Or laptop GPUs vs desktop GPUs? I think that should also be contested (more so, if you ask me).



Incredible how some still defend nGreedia and their anti-consumer practices.

 


 

Why are we still discussing this? There won't be a 4080 12GB anymore; it was a mistake of a product. nGreedia got the backlash they deserve.



8 hours ago, LAwLz said:

The bus width should not determine what the name of the product is, especially not when they have reworked the cache the way they have.

The bus width is pretty irrelevant, just like how horsepower does not necessarily determine how fast a car is around a track.

The 3060 Ti and 3080 Super have the same memory bus width. Does that mean Nvidia should have called the 3060 Ti a 3080-class card? Of course not, because bus width is only a tiny part of the equation that determines performance. On its own it is fairly useless and not at all an indication of performance.

The 3060 Ti is 256-bit, but the non-Ti 3060 is at 192-bit. The xx70 cards' minimum bus width was always 256-bit, so what do you mean it's irrelevant? Even if, on today's cards, bus width doesn't determine how fast a card is, it's still better to have a wider bus. If the 12GB had been 256-bit, total memory bandwidth would have been a lot better:

  • RTX 4080 16GB - 735.7 GB/s (256-bit)
  • RTX 4080 12GB - 503.8 GB/s (192-bit)
  • RTX 4080 12GB - 671.7 GB/s (if it were 256-bit)
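For reference, those figures fall straight out of arithmetic: peak bandwidth is the per-pin data rate times the bus width in bits, divided by 8 bits per byte. A minimal sketch in Python (the 21 and 23 Gbps data rates are assumptions inferred from the launch-spec figures quoted above):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbit/s) * bus width (bits) / 8 (bits per byte)."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(256, 23.0))  # RTX 4080 16GB: 736.0 GB/s (~735.7 quoted above)
print(peak_bandwidth_gbs(192, 21.0))  # RTX 4080 12GB: 504.0 GB/s (~503.8 quoted above)
print(peak_bandwidth_gbs(256, 21.0))  # hypothetical 256-bit 12GB card: 672.0 GB/s
```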

 



Sorry for double-posting, but I've heard all of their cards are getting a cut in bus width, and there might be an RTX 4050, 4060/Ti, and 4070:

4050 and 4060 get 128-bit

4060 Ti gets 160-bit

4070 gets 192-bit

 



17 minutes ago, NumLock21 said:

Even if, on today's cards, bus width doesn't determine how fast a card is, it's still better to have a wider bus. If the 12GB had been 256-bit, total memory bandwidth would have been a lot better.

We've had a product generation demonstrating getting more out of a narrower bus; it's just that it's from team Red. Ada does have a LOT more cache than previous-gen Nvidia cards, and something similar probably applies here. Effective bandwidth is much higher than the raw VRAM numbers alone show. Overfocusing on one detail doesn't give the complete picture.
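To illustrate porina's point with a toy model (my own sketch with made-up hit rates, not porina's numbers): if a fraction of memory requests hit a large on-die cache, only the misses consume VRAM bus cycles, so the bandwidth the shaders effectively see exceeds the raw bus figure.

```python
def effective_bandwidth_gbs(vram_bw_gbs: float, cache_hit_rate: float) -> float:
    """Toy model: with hit rate h, only (1 - h) of requests reach VRAM,
    so the same bus sustains 1 / (1 - h) times the shader-side traffic
    (ignoring the cache's own bandwidth ceiling)."""
    assert 0.0 <= cache_hit_rate < 1.0
    return vram_bw_gbs / (1.0 - cache_hit_rate)

# Hypothetical hit rates, purely for illustration, on the 12GB card's 504 GB/s bus:
print(effective_bandwidth_gbs(504, 0.0))  # 504.0 GB/s: no cache help
print(effective_bandwidth_gbs(504, 0.5))  # 1008.0 GB/s: half the requests served on-die
```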

 

 



11 minutes ago, porina said:

We've had a product generation demonstrating getting more out of a narrower bus; it's just that it's from team Red. Ada does have a LOT more cache than previous-gen Nvidia cards, and something similar probably applies here. Effective bandwidth is much higher than the raw VRAM numbers alone show. Overfocusing on one detail doesn't give the complete picture.

Even if it can get more out of a narrower bus with whatever tricks they have up their sleeves, a 4070 deserves a 256-bit bus. lol



26 minutes ago, NumLock21 said:

Even if it can get more out of a narrower bus with whatever tricks they have up their sleeves, a 4070 deserves a 256-bit bus. lol

Why?

Isn't it about how it performs? lol


40 minutes ago, pas008 said:

Why?

Isn't it about how it performs? lol

But it's 30% slower than the 16GB. If the memory bus doesn't mean a thing to Nvidia now, then they can cut their 4080 16GB from 256-bit down to 69-bit, for all I care.



18 hours ago, NumLock21 said:

But it's 30% slower than the 16GB. If the memory bus doesn't mean a thing to Nvidia now, then they can cut their 4080 16GB from 256-bit down to 69-bit, for all I care.

The 12GB had about 2000 fewer CUDA cores than the 16GB. You don't think that made a huge difference, hence all the uproar?

Along with fewer tensor and RT cores.


2 hours ago, pas008 said:

The 12GB had about 2000 fewer CUDA cores than the 16GB. You don't think that made a huge difference, hence all the uproar?

Along with fewer tensor and RT cores.

I know that, but does that also give them the right to slash the memory bus from 256-bit to 192-bit? Stop defending nGreedia and the leather jacket man for releasing crap products at ridiculous prices.



1 hour ago, NumLock21 said:

I know that, but does that also give them the right to slash the memory bus from 256-bit to 192-bit? Stop defending nGreedia and the leather jacket man for releasing crap products at ridiculous prices.

Where am I defending that card, except for its performance? You're getting carried away with the bus width, which isn't really that big of a deal.

 


Reminder that Nvidia has committed far worse war crimes before and gotten away with it, because low-tier GPUs are for some reason invisible.

For example, the GT 1030 GDDR5 vs DDR4, where the latter model is over 2 times slower and COSTS THE SAME, or like 5€ less.

The GT 730 was even worse: you might have gotten a ten-year-older GPU with the same name and over 3x less performance, because they used different architectures under the same name. So you may have gotten a GT 730 that wasn't supported by a newer GT 730 driver, because it was actually a 630-series GPU with ended driver support.

And with the low-tier cards it is almost impossible to figure out from the box which card it is; the GT 730 has 3 versions with wildly different specs. https://www.tomshardware.com/news/nvidia-gt-730-announcement,27087.html

So I don't think Nvidia did this due to the backlash, as reviewers have expressed outrage over the GT 730 and GT 1030 too.

I think it was more due to AMD's upcoming GPUs, and how a really slow but expensive "RTX 4080" would be easy cannon fodder for AMD's marketing team.

 

 



23 hours ago, NumLock21 said:

But it's 30% slower than the 16GB. If the memory bus doesn't mean a thing to Nvidia now, then they can cut their 4080 16GB from 256-bit down to 69-bit, for all I care.

Most if not all of that performance difference is because of the fewer cores, not the memory bus width.

The memory bus width does not really matter; it's just part of the equation. Staring blindly at the memory bus width and ignoring everything else is like looking only at horsepower to determine how fast something will go around a track.

Horsepower is just one measurement out of many that determine how something performs.

If car A goes around a track 10% faster than car B, then it doesn't matter that car A has 500 horsepower vs 600 in car B.

Car A is still faster. The reasons for that might be weight, or handling, or a long list of things unrelated to horsepower. If what matters is how fast something goes around a track, then it's silly to say car A isn't allowed to be called a race car because "race cars have at least 600 horsepower", which is a completely made-up criterion.

Not all 80-class cards have had a 256-bit bus. Bus width has varied from generation to generation. What matters is performance.

The GTX 280 had a 512-bit bus. Does that mean the 3080 wasn't a true 80-class card because all 80-class cards should have a 512-bit bus?

The 1070 had a 256-bit bus. Does that mean it should have been called the 1080? In fact, the 1070 had the same memory bus width as the 1080. Maybe they should both have been called the 1080 by this "memory bus determines tier" logic.
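Lining up bus widths from public spec sheets makes the point plainly; a quick sketch (figures quoted from memory, so treat them as approximate):

```python
# Memory bus width by card (bits); note that it does not track product tier:
bus_width_bits = {
    "GTX 280":     512,  # 80-class, 2008
    "GTX 1070":    256,  # 70-class, same bus as the 1080
    "GTX 1080":    256,  # 80-class
    "RTX 3060 Ti": 256,  # 60-class
    "RTX 3080":    320,  # 80-class
}

for card, bits in bus_width_bits.items():
    print(f"{card}: {bits}-bit")
```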


On 10/17/2022 at 5:22 AM, Arika S said:

welp

 

Can't really defend Intel's boxes here, but at least on the AMD retail boxes, they have a tamper-proof sticker that lists the CPU model:

[Image: AMD retail box with the model sticker]

Still hard to see it from this angle, but it's definitely there.

5 minutes ago, LAwLz said:

Most if not all of that performance difference is because of the fewer cores, not the memory bus width. [...] Bus width has varied from generation to generation. What matters is performance.

I can help settle this. Memory bandwidth/capacity only matters if you do not have enough of it. Having more capacity and bandwidth than you need has zero impact on performance. It works exactly the same way as typical system memory in that regard: adding another 32GB of memory to your system doesn't automatically make it faster; it only matters if you were running out and swapping to something slower.

 

Also, bus width is but a small fraction of the equation to factor in. GPU memory hierarchy has evolved a ton over the past few years, and not all of it for the better. Pascal had lower memory latency with GDDR5X compared to Ampere, Volta, and Turing. However, Pascal could not utilize L1 cache in OpenCL, while the newer architectures can. That L1 cache is significantly faster and of much lower latency, which allowed Nvidia to mask the latency penalty of GDDR6/X, leaving people none the wiser to this deficit. This is completely agnostic of bus widths and memory frequencies; we're talking strictly design here.

 

The 4080 12GB and 4080 16GB having different VRAM capacities was inconsequential to the overall performance of the cards. It's no different from when Nvidia did this in the past, going back to Kepler with the 780 Ti and Titan Black, all the way to the RTX 2060 and its VRAM "rebrands". What does make a significant difference is the SM structure and the difference in cores/streaming processors. With the decision to change the fundamental SM count (each SM having 64 FP32 and 64 FP32/INT32 cores, totaling 128 cores per SM), you end up with a massive 2048-core deficit, as well as an additional L1 cache penalty of 2048KB. And this isn't even counting the differences in TMU/ROP counts, or the tensor/RT cores for those who use them.

 

All of that is going to result in the 30% performance difference we saw in Nvidia's first-party benchmarks (and that was likely a best-case scenario). It has been a very long time since we've seen VRAM bandwidth/capacity alone result in a 30% performance difference. I am talking GTX 970 Witcher 3 "3.5GB" meme days, lol.
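A quick back-of-the-envelope check of that core deficit (a sketch; the 9728 and 7680 CUDA core counts are taken from Nvidia's announcement specs, so verify against the final spec sheets):

```python
CORES_PER_SM = 128  # Ada SM: 128 FP32 cores, as described above

cores_16gb = 9728  # RTX 4080 16GB (AD103) announced core count
cores_12gb = 7680  # RTX 4080 12GB (AD104) announced core count

deficit = cores_16gb - cores_12gb
print(deficit)                                # 2048 cores, matching the post
print(deficit // CORES_PER_SM)                # 16 fewer SMs
print(f"{1 - cores_12gb / cores_16gb:.0%}")   # ~21% fewer cores, before clock/bandwidth differences
```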



On 10/18/2022 at 4:03 AM, Moonzy said:

Hypothetically, if this were the case, I would first question "why" and look into it further.

Just to make that clear, it's one or the other. You can't look into it if you mean comparing these two different naming options. Plus that's already a problem: you would look into it, and might understand the difference better, but someone else might assume that an RTX 4080 can't be that much different from the other RTX 4080 because they have a similar name. Whereas it's already well established over multiple generations that an RTX 3070/4070 etc. is different from, and slower than, an RTX 3080/4080. Slapping 12GB or 16GB on the end and then assuming people will actually take notice is a much bigger, worse, and more problematic assumption than assuming people can differentiate between an RTX 4080 and an RTX 4070.

 

And even if we go with the assumption that people will note the 12GB vs 16GB and the price difference, and expect less because it's cheaper and there are some extra numbers attached, we are still going to have to debate what expectations they might actually have, versus what an established, different product model name would set.

 

So would an average person expect more from a $900 RTX 4070, or from a $900 RTX 4080? The two never exist together; also include in this that a $1200 RTX 4080 exists at the same time.

 

Situation 1:

  • RTX 4070 - $900
  • RTX 4080 - $1200

Situation 2:

  • RTX 4080 12GB - $900
  • RTX 4080 16GB - $1200

One of these has a lesser risk of misleading, a lesser chance of confusion, and more clearly signifies how significant the difference is. It is neither unfair nor a "big fuss" to point this out.

 

Neither do I really think an impact assessment is personally necessary; the name is a problem, and we all know it. It makes zero difference how much you, I, or anyone thinks about whether people will be dissatisfied with their purchase, because that doesn't negate the potential, and none of us can actually be sure, see the future, or know how everyone thinks. What we can do is ask that the product name be changed to something less confusing, with less risk, following an existing and established product naming scheme that signifies such differences. We do not need a new problem introduced, even after factoring in the people who already don't understand.

 

As for the CPUs, those have model numbers, those are listed on OEM/SI product specs, and the concept of these model numbers is not entirely foreign either. Of course you can't save everyone, and I've certainly heard people talking solely about the i5 or i7 part without any idea there are different models; yet I would put forward that these are also the same people more likely to not understand the RTX 4080 situation, or to buy the wrong one regardless, because it's cheaper.

 

Like I said, just because you or someone else is happy with the product and how it's performing doesn't mean they have not been misled. Satisfaction with the product is, strictly speaking, not a requirement for being misled at all; typically, however, it's the end result of being misled.

 

This is why I'm not really entertaining these discussions by addressing them point by point; I do not think it changes anything. "Ignorance is bliss" is a bad argument, and it is effectively what you and others are saying is fine: that protections for the ignorant aren't necessary because "they are happy".

 

On 10/18/2022 at 4:03 AM, Moonzy said:

It isn't? Literally every techtuber and every place I go was talking about how the 4080 12GB was supposed to be called a 4070.

But does that mean it's a "big fuss"? How else do they, or anyone else, point out that a company has released a product that has the potential to mislead, or is otherwise named in a way it should not be?

 

Protest in silence, a.k.a. say nothing at all? How do you think that would have turned out?

Link to comment
Share on other sites

Link to post
Share on other sites

10 hours ago, leadeater said:

assuming

I guess this is the crux of the problem: people assume.

 

I guess we can agree that a name change to something more obvious is good.

What we don't agree on is whether the impact matters or not, and it's OK to agree to disagree.

 

10 hours ago, leadeater said:

"Ignorance is bliss" is a bad argument, and it is effectively what you and others are saying is fine: that protections for the ignorant aren't necessary because "they are happy".

This whole thing reminds me of this video

 

10 hours ago, leadeater said:

But does that mean it's a "big fuss"? How else do they, or anyone else, point out that a company has released a product that has the potential to mislead, or is otherwise named in a way it should not be?

The way to get companies (or politicians) to change using public opinion is by making a big fuss about it.

 

 

From a business standpoint, I still think it was dumb for Nvidia to name it a 4080 anyway, for the reasons I listed in my first post.

I literally don't see how that naming makes sense from a business standpoint.

Trying to raise the prices of "tiers", perhaps? As long as price-to-performance is improving, I have no complaints. But it may affect the "i7 and xx80" buyers, which I believe is most consumers (based on how many friends I've helped shop for a PC).

 

I hope more companies come under this scrutiny.

For example, the Acer XF240Q monitor: there are a bajillion spec variants under that one model number, from 240Hz down to 144Hz. Or the XV272U, same thing.


