NVIDIA Project Beyond GTC Keynote with CEO Jensen Huang: RTX 4090 + RTX 4080 Revealed

8 minutes ago, Avocado Diaboli said:

Except your point flies out of the window when including the iPhone 13, because it's the same thing under a different name but offers no performance improvement.

Yeah, because a phone has a few more features than just the CPU, and Apple was crystal clear about what changed and what did not. Maybe the phone was a bad analogy to start with, although everyone clearly sees how it serves to make my point. Which you stubbornly refuse to accept, although it's pretty clear to everyone else.

8 minutes ago, Avocado Diaboli said:

Also, nice of you to ignore my MacBook Air example where it's the same name but different performance on actually different storage sizes, which maps directly onto your example of iPhone 14 storage sizes.

Yeah, cool. Look, the performance you are referring to here makes little to no difference in the typical overall usage of the MBA and changes the average user experience very little, if at all. Meanwhile, the number of CUs, SMs and whatnot is the literal core feature of a freakin graphics card and the very reason you buy a higher-tier GPU.

 


5 minutes ago, Shimmy Gummi said:

Look my man, you're putting a lot of effort into "Ackchyually" -ing your replies. You get a gold star.


Bottom line - it's confusing and deceptive to name two things with vastly different configurations the same thing. 

 

If Nvidia felt the memory capacity difference was enough of a change to justify a name change to specifically identify that difference...why not specifically identify the more important change, the CUDA core configuration? Oh, right, to be deceptive.

 

I wish this forum had downvotes sometimes.

Is it wrong that both the 2.3 and the 5.0 Mustang are called Mustangs?
I mean, one has a 2.3L four-banger, the other a 5L V8?
 

  

29 minutes ago, Avocado Diaboli said:

The iPhone 14 is basically the same as the iPhone 13; they just called it something new because they have to release a new series every year. For all intents and purposes it's the same phone. You didn't make the point you intended to make. You are being misled everywhere. That's the point of marketing. And naming is part of that. Apple also released that new MacBook Air with a much slower SSD on the bottom-tier machine after switching to 256 GB flash storage chips. On paper, all of the machines are called the same, but one undeniably has worse performance than the others. As I keep repeating, names don't matter.

 



Also, saying the iPhone 13 is the same as the 14 is just a weird take. They are built completely differently even if they share a CPU.


3 minutes ago, starsmine said:

Is it wrong that both the 2.3 and the 5.0 Mustang are called Mustangs?
I mean, one has a 2.3L four-banger, the other a 5L V8?

Also, saying the iPhone 13 is the same as the 14 is just a weird take. They are built completely differently even if they share a CPU.

It would be wrong if you called them both GT lol, which is what is happening here.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


8 minutes ago, Dracarris said:

Yeah, because a phone has a few more features than just the CPU, and Apple was crystal clear about what changed and what did not. Maybe the phone was a bad analogy to start with, although everyone clearly sees how it serves to make my point. Which you stubbornly refuse to accept, although it's pretty clear to everyone else.

 

Yeah, cool. Look, the performance you are referring to here makes little to no difference in the overall usage of the MBA. Meanwhile, the number of CUs, SMs and whatnot is the literal core feature of a freakin graphics card and the very reason you buy a higher-tier GPU.

So you made a bad comparison with the iPhone 14, and now you're upset that I took the bait and gave you an example of exactly what you were trying to use against my argument, with the MacBook Air? And now it suddenly doesn't matter anymore because Apple has set a precedent? You're just trying to downplay actual differences in performance on the MacBook Air to feebly create a distinction, so you don't have to admit that your example was bad.

 

Nvidia is also clear about what the differences between the 4080 12GB and the 4080 16GB are. They're listed in the specs front and center, even before the difference in memory capacity is mentioned (other than in the name itself):

[image: Nvidia's spec comparison for the RTX 4080 12GB and 16GB]

And now a word from our sponsor: 💩

-.-. --- --- .-.. --..-- / -.-- --- ..- / -.- -. --- .-- / -- --- .-. ... . / -.-. --- -.. .

ᑐᑌᑐᑢ

Spoiler

    ▄██████                                                      ▄██▀

  ▄█▀   ███                                                      ██

▄██     ███                                                      ██

███   ▄████  ▄█▀  ▀██▄    ▄████▄     ▄████▄     ▄████▄     ▄████▄██   ▄████▄

███████████ ███     ███ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀████ ▄██▀ ▀███▄

████▀   ███ ▀██▄   ▄██▀ ███    ███ ███        ███    ███ ███    ███ ███    ███

 ██▄    ███ ▄ ▀██▄██▀    ███▄ ▄██   ███▄ ▄██   ███▄ ▄███  ███▄ ▄███▄ ███▄ ▄██

  ▀█▄    ▀█ ██▄ ▀█▀     ▄ ▀████▀     ▀████▀     ▀████▀▀██▄ ▀████▀▀██▄ ▀████▀

       ▄█ ▄▄      ▄█▄  █▀            █▄                   ▄██  ▄▀

       ▀  ██      ███                ██                    ▄█

          ██      ███   ▄   ▄████▄   ██▄████▄     ▄████▄   ██   ▄

          ██      ███ ▄██ ▄██▀ ▀███▄ ███▀ ▀███▄ ▄██▀ ▀███▄ ██ ▄██

          ██     ███▀  ▄█ ███    ███ ███    ███ ███    ███ ██  ▄█

        █▄██  ▄▄██▀    ██  ███▄ ▄███▄ ███▄ ▄██   ███▄ ▄██  ██  ██

        ▀███████▀    ▄████▄ ▀████▀▀██▄ ▀████▀     ▀████▀ ▄█████████▄

 


6 minutes ago, Avocado Diaboli said:

So you made a bad comparison with the iPhone 14, and now you're upset that I took the bait and gave you an example of exactly what you were trying to use against my argument, with the MacBook Air? And now it suddenly doesn't matter anymore because Apple has set a precedent? You're just trying to downplay actual differences in performance on the MacBook Air to feebly create a distinction, so you don't have to admit that your example was bad.

 

Nvidia is also clear about what the differences between the 4080 12GB and the 4080 16GB are. They're listed in the specs front and center, even before the difference in memory capacity is mentioned (other than in the name itself):

[image: Nvidia's spec comparison for the RTX 4080 12GB and 16GB]

This is buried on their website. In fact, looking at my physical boxes for my ASUS RTX 3070 KO and ASUS RTX 3080 TUF, they don't even list the specifications. 

 

A name means a lot in marketing, as it's what you see in stores, in an SI's system spec list, and on the shelves. And in the case of the two 3000-series cards I own, on the box. There's nothing to differentiate my 3080 10GB from the 3080 12GB on the physical box.

 

The fact that they carry the same name, with the only identifiable difference being memory capacity while omitting the much more significant core difference, is clearly deceptive marketing.



Let's also take a look at what a typical consumer would actually see when comparing these two cards at the point of sale:

 

https://www.bestbuy.com/site/asus-nvidia-geforce-rtx-3080-v2-10gb-gddr6x-pci-express-4-0-strix-graphics-card-black/6475238.p?skuId=6475238

 

vs 

 

https://www.bestbuy.com/site/asus-nvidia-geforce-rtx-3080-12gb-gddr6x-pci-express-4-0-strix-graphics-card-black/6501110.p?skuId=6501110

 

They do NOT list core count. A typical user would look at these and come to the conclusion that other than the memory difference, they are the same. Lucky for them, the difference is marginal, albeit present.

 

I wonder how likely it is that the 4080 12GB and 16GB will have similar marketing material.



It is kinda weird they're changing core counts and VRAM under the "4080" banner. You'd think they would have called it a 4080 Super or something, since that's more appropriate. The gap isn't quite big enough to call it a Ti, though, since they're still leaving a (huge) gap for a 4080 Ti between the 4080 16GB and the 4090.

Workstation:  14700nonk || Asus Z790 ProArt Creator || MSI Gaming Trio 4090 Shunt || Crucial Pro Overclocking 32GB @ 5600 || Corsair AX1600i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


4 minutes ago, AnonymousGuy said:

It is kinda weird they're changing core counts and VRAM under the "4080" banner. You'd think they would have called it a 4080 Super or something, since that's more appropriate. The gap isn't quite big enough to call it a Ti, though, since they're still leaving a (huge) gap for a 4080 Ti between the 4080 16GB and the 4090.

I mean, the 3080 Ti only has about 18% more CUDA cores than the 3080 10GB, and a 20% wider memory bus and capacity.


The 4080 16GB has 27% more CUDA cores, and a 33% wider memory bus and capacity, than the 4080 12GB model.


They really should not be called the same thing, at all.
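For anyone who wants to sanity-check those percentages, here's a quick sketch using the published core counts and bus widths (10240 vs. 8704 cores and 384 vs. 320-bit for the 3080 Ti and 3080 10GB; 9728 vs. 7680 cores and 256 vs. 192-bit for the two 4080s). `pct_more` is just a throwaway helper, not anything official:

```python
def pct_more(a: int, b: int) -> float:
    """How much larger a is than b, as a percentage."""
    return (a / b - 1) * 100

# 3080 Ti vs. 3080 10GB
print(round(pct_more(10240, 8704)))  # CUDA cores: ~18% more
print(round(pct_more(384, 320)))     # memory bus: 20% wider

# 4080 16GB vs. 4080 12GB
print(round(pct_more(9728, 7680)))   # CUDA cores: ~27% more
print(round(pct_more(256, 192)))     # memory bus: ~33% wider
```

So the gap between the two cards sharing the "4080" name really is wider than the gap between the 3080 and the 3080 Ti, which got different names.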



10 minutes ago, Shimmy Gummi said:

This is buried on their website. In fact, looking at my physical boxes for my ASUS RTX 3070 KO and ASUS RTX 3080 TUF, they don't even list the specifications. 

 

A name means a lot in marketing, as it's what you see in stores, in an SI's system spec list, and on the shelves. And in the case of the two 3000-series cards I own, on the box. There's nothing to differentiate my 3080 10GB from the 3080 12GB on the physical box.

 

The fact that they carry the same name, with the only identifiable difference being memory capacity while omitting the much more significant core difference, is clearly deceptive marketing.

 

Most tech products don't list extensive specifications on their boxes, so this is hardly a point against Nvidia in particular and more a point against tech manufacturers in general.

 

4 minutes ago, Shimmy Gummi said:

Let's also take a look at what a typical consumer would actually see when comparing these two cards at the point of sale:

 

https://www.bestbuy.com/site/asus-nvidia-geforce-rtx-3080-v2-10gb-gddr6x-pci-express-4-0-strix-graphics-card-black/6475238.p?skuId=6475238

 

vs 

 

https://www.bestbuy.com/site/asus-nvidia-geforce-rtx-3080-12gb-gddr6x-pci-express-4-0-strix-graphics-card-black/6501110.p?skuId=6501110

 

They do NOT list core count. A typical user would look at these and come to the conclusion that other than the memory difference, they are the same. Lucky for them, the difference is marginal, albeit present.

 

I wonder how likely it is that the 4080 12GB and 16GB will have similar marketing material.

 

Newegg lists the core count under the specs tab:

https://www.newegg.com/evga-geforce-rtx-3080-10g-p5-3897-kl/p/N82E16814487541?Description=rtx 3080&cm_re=rtx_3080-_-14-487-541-_-Product

https://www.newegg.com/evga-geforce-rtx-3080-12g-p5-4877-kl/p/N82E16814487553?Description=rtx 3080&cm_re=rtx_3080-_-14-487-553-_-Product

 

As does any reputable tech merchant in my neck of the woods.



18 hours ago, Avocado Diaboli said:

 

Most tech products don't list extensive specifications on their boxes, so this is hardly a point against Nvidia in particular and more a point against tech manufacturers in general.

 

 

Newegg lists the core count under the specs tab:

https://www.newegg.com/evga-geforce-rtx-3080-10g-p5-3897-kl/p/N82E16814487541?Description=rtx 3080&cm_re=rtx_3080-_-14-487-541-_-Product

https://www.newegg.com/evga-geforce-rtx-3080-12g-p5-4877-kl/p/N82E16814487553?Description=rtx 3080&cm_re=rtx_3080-_-14-487-553-_-Product

 

As does any reputable tech merchant in my neck of the woods.

Comment removed by me



Just now, Shimmy Gummi said:

How much is nvidia paying you? I hope they are.

Ah, nice, when you run out of arguments, accuse your opponent of being a paid shill. Totally makes your arguments more credible.



1 minute ago, Avocado Diaboli said:

Ah, nice, when you run out of arguments, accuse your opponent of being a paid shill. Totally makes your arguments more credible.

What are you actually arguing? All I get out of this is:

You are listing examples of existing bad business practices in an attempt to explain why it is okay.

 

Just because bad business practices exist does not validate them.



9 minutes ago, Shimmy Gummi said:

I mean, the 3080 Ti only has about 18% more CUDA cores than the 3080 10GB, and a 20% wider memory bus and capacity.


The 4080 16GB has 27% more CUDA cores, and a 33% wider memory bus and capacity, than the 4080 12GB model.


They really should not be called the same thing, at all.

Yeah, on reconsideration, the 4080 12GB is pretty much a 4070 Ti or something.

 

They're *really* trying to make the 4090 look like a good value. 3090 vs. 3080 was 10.5k cores vs. 8.7k (about 21% more). 4090 vs. 4080 16GB is 16.4k vs. 9.7k (about 68% more).

 

But there's obviously going to be a 4080 Ti in there with something like 12k cores, to fit between 16.4k and 9.7k.
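Sketching that "value" comparison in code, using Nvidia's published CUDA core counts (10496 for the 3090, 8704 for the 3080 10GB, 16384 for the 4090, 9728 for the 4080 16GB; the exact percentages come out slightly different from the figures quoted above depending on rounding):

```python
def core_gap(flagship: int, next_card: int) -> int:
    """Percentage core-count gap from the flagship down to the next card."""
    return round((flagship / next_card - 1) * 100)

# Ampere: 3090 vs. 3080 10GB
print(core_gap(10496, 8704))   # ~21% more cores

# Ada: 4090 vs. 4080 16GB
print(core_gap(16384, 9728))   # ~68% more cores
```

The hole left under the 4090 really is unusually large compared to last generation, which is why a 4080 Ti slotting in between seems so likely.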



1 hour ago, Sir Beregond said:

They also said 3080 was 2X 2080 2 years ago and that wasn't true. Wait for 3rd party benchmarks.

Of course we should wait for third-party benchmarks, but I am wondering where the person I replied to got the information that it's only 30% faster.

If Nvidia says it's about 100% faster in their testing, and some random person says they are lying and it's only 30%, of course I'll ask where they got that info from.

 

Even if we assume that Nvidia's numbers are misleading and only accurate in a handful of games (like the 2080 to 3080 numbers were), the last time they claimed double rasterization performance we got somewhere around 70-80% higher performance. I am just wondering where the 30% number comes from. 


13 minutes ago, Avocado Diaboli said:

Most tech products don't list extensive specifications on their boxes, so this is hardly a point against Nvidia in particular and more a point against tech manufacturers in general.

This is exactly why the model name means so much. Also just because everyone else does a shitty thing does not make it okay.


I do wonder if this will end up screwing up the 4080 name down the line, 'cause these "4080s" are weak af 🤣

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


17 minutes ago, Shimmy Gummi said:

What are you actually arguing? All I get out of this is:

You are listing examples of existing bad business practices in an attempt to explain why it is okay.

No, I've been arguing this entire time that you should never take the name of a tech product in isolation and should always look at the specs. All you've been doing is confirming that argument, but you've been under the assumption that I defend the practice, for some, to me, wholly obscure reason. All I've done is put it into proper context. @Dracarris tried to pretend that this is something unique to Nvidia and put forward the Apple example, which I pointed out did the exact same thing with identically named products with different specs and performance. You tried to pretend that because Best Buy doesn't list specs, they're not available to customers anywhere else, and when that failed, you called me a shill. And since the posts by @AnonymousGuy are still going on about how the 4080 12GB is a 4070 Ti in disguise, or that the 4080 16GB should've been a 4080 Super, my point still stands: names are meaningless. It doesn't matter what either of the cards is called. I don't defend the practice of opaque naming schemes. But at some point, as a customer, you have to face the music and do a bit of research about what you're actually buying.

 

Think of it this way: to a clueless dipshit customer, is it really clear that a 3050 is worse than a 2080? After all, 3050 is a higher number than 2080. You and I know how to decode that number and make sense of it. But if you're really trying to argue from the perspective of some random non-techie, you'd have to conclude that the way these products are named doesn't make a lick of sense by any metric, and therefore it is always imperative to do your research.



So far Nvidia is just Nvidia, nothing new under the sun, just using the same name for different things to create confusion. Just remember how many versions of the GTX 1060 there were, all with the same damn name to the consumer: some had a different amount of VRAM, but you might also have gotten a GTX 1060 with a GP106 or with a GP104, with GDDR5 or with an X on the end, with effective 9 Gbps memory or just 8 Gbps.

 

Nvidia could have made that simpler and more consumer-friendly, but did they? Of course not. After all, how the hell are you supposed to sell shit if the consumer could actually see from the package that they are being fooled?
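To illustrate just how many distinct products shipped under that one name, here's a quick sketch. The variant list is from memory of the commonly documented spec sheets, so treat the exact figures as approximate:

```python
# Every retail box said "GTX 1060"; what was inside varied.
# Specs as commonly documented; double-check before relying on them.
gtx_1060_variants = [
    {"name": "GTX 1060", "vram": "3GB GDDR5 (8 Gbps)", "gpu": "GP106", "cores": 1152},
    {"name": "GTX 1060", "vram": "6GB GDDR5 (8 Gbps)", "gpu": "GP106", "cores": 1280},
    {"name": "GTX 1060", "vram": "6GB GDDR5 (9 Gbps)", "gpu": "GP106", "cores": 1280},
    {"name": "GTX 1060", "vram": "6GB GDDR5X",         "gpu": "GP104", "cores": 1280},
    {"name": "GTX 1060", "vram": "5GB GDDR5 (8 Gbps)", "gpu": "GP106", "cores": 1280},
]

names = {v["name"] for v in gtx_1060_variants}
configs = {(v["gpu"], v["cores"], v["vram"]) for v in gtx_1060_variants}
print(len(names), len(configs))  # 1 name, 5 different configurations
```

One shelf name, five hardware configurations, and nothing on the box to tell them apart.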


23 minutes ago, AnonymousGuy said:

Yeah, on reconsideration, the 4080 12GB is pretty much a 4070 Ti or something.

 

They're *really* trying to make the 4090 look like a good value. 3090 vs. 3080 was 10.5k cores vs. 8.7k (about 21% more). 4090 vs. 4080 16GB is 16.4k vs. 9.7k (about 68% more).

 

But there's obviously going to be a 4080 Ti in there with something like 12k cores, to fit between 16.4k and 9.7k.

And they didn't even price it as high as they "could". Safe to assume they made more than they can sell due to mining forecasts, and it HAS to sell at $1,600. I smell a bit of desperation.

 

Everything screams "please buy the 4090."

 

Where's the old Jensen? Price it at 2k 🤣


 


13 minutes ago, BabaGanuche said:

This is exactly why the model name means so much. Also just because everyone else does a shitty thing does not make it okay.

This, so much this. Even outside of tech, we're constantly held back by this way of thinking: things have always been bad, therefore it's okay to keep doing bad, or to just turn a blind eye and accept it as "the way things are."


It's sickening.



Two different GPUs with different specs and performance both named 4080, on top of an outrageous pricing model...

NVIDIA can stick those GPUs...(you know where)

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

39 minutes ago, Avocado Diaboli said:

tried to pretend that this is something unique to Nvidia and put forward the Apple example, which I pointed out did the exact same thing with identically named products with different specs and performance.

Just to be crystal clear here: I never did the "unique to Nvidia" part, at all. This is all your "special" interpretation.

 

And yes, since you seem to be twisting my words and arguments, please note the following:

- If the iPhone 13 and 14 are a bad example, take any other phone with storage tiers in the name and a clearer distinction from its predecessor. I already said this; you chose to ignore it, apparently for the sake of being an Nvidia apologist.

- The differences between the iPhone 13 and 14 were made crystal clear in the keynote and marketing material; not so with Nvidia. => Nvidia did objectively worse.

- The storage speed in a MacBook Air with 128 or 256 GB of SSD matters much, much less than a 30% difference in execution units on a GPU. => Nvidia did objectively worse.

 

Edit: Besides the difference being made very clear from the beginning, the difference between the A15 and A16 in the iPhone 14 similarly has a much smaller impact on the end-user experience than the significant difference in the number of execution units between the 4080 tiers.

 

The naming of the 4080 tiers is clearly deceptive with none of the examples you brought up being anywhere close in severity.

And if you ignore all these points: Even if others do as bad, it is still absolutely no excuse for Nvidia to follow suit.


1 hour ago, starsmine said:

Is it wrong that both the 2.3 and the 5.0 Mustang are called Mustangs?
I mean, one has a 2.3L four-banger, the other a 5L V8?

It depends on whether you know you're not getting a true Mustang or not. An average driver wouldn't care, but anyone who pays attention to what they're buying will know it's just not the same. I think the 4080 12GB is deceptive marketing; model names have meant something within the product stack. Now Nvidia moves its cards up a tier again, and a real x80 card is gonna be around $1000 if the FE cards are going to undercut the AIBs.

