
OMG Nvidia GPU Models/Names Ridiculous!

bigjoncoop
49 minutes ago, DreamCat04 said:

Ah a 1660(Ti) supports raytracing? Well in a raytraced game where I get around 30-40FPS on my 2060, I get like 7-9 on my 1660Ti (although it's the 115W laptop version and it's drawing like 90W)

Pascal supports real-time ray tracing. 
It's a DX12 protocol, not an Nvidia protocol.
So you can run all your "RTX" games on a GTX 1080 if you wanted to. 


1 minute ago, starsmine said:

Pascal supports real-time ray tracing. 
It's a DX12 protocol, not an Nvidia protocol.
So you can run all your "RTX" games on a GTX 1080 if you wanted to. 

sweet, how do I turn that on?

                          Ryzen 5800X3D(Because who doesn't like a phat stack of cache?) GPU - 7700Xt

                                                           X470 Strix f gaming, 32GB Corsair vengeance, WD Blue 500GB NVME-WD Blue2TB HDD, 700watts EVGA Br

 ~Extra L3 cache is exciting, every time you load up a new game or program you never know what you're going to get, will it perform like a 5700x or are we beating the 14900k today? 😅~


1 hour ago, Fasauceome said:

I find it funny that you are completely right about your point of bad naming, but you chose the worst example...

It makes perfect sense that the 1080 is way faster than the 1650 because the 1650 is the bottom of its generation, while the 1080 is the top of its own.

 

Now, you want to really complain? The GTX 1060 is a stunning example. The 3GB and 6GB differ not just in memory, but in CUDA cores! The 1060 6GB should be called the 1060 Ti. And the GT 1030... don't even get me started. Let alone the MX130 for laptops!

 

More recently we're seeing major complaints about the RTX 4080/4070 Ti. They wanted to release two 4080 models with different RAM and CUDA core configurations! It would have been a nightmare; community backlash made them create the 4070 Ti. But honestly they should have just called it the 4070 and made it cheaper.

What about the 3060 8GB vs 12GB?  Fuckin' totally different cards!

8GB is 3060 Light, or something....


17 minutes ago, lotus10101 said:

sweet, how do I turn that on?

if the game supports software RT?  It's not hard.

 

But you'll nerf the ever living hell outta your framerate.


23 minutes ago, starsmine said:

Pascal supports real-time ray tracing. 
It's a DX12 protocol, not an Nvidia protocol.
So you can run all your "RTX" games on a GTX 1080 if you wanted to. 

Sure, the API calls are there for any card to use, but only RTX cards (on the nvidia side) have the RT cores that allow those operations to be done at a reasonable speed. Otherwise the GPU's ALUs are the ones doing the work, slowing things down considerably.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


1 hour ago, bigjoncoop said:

For instance a GTX 1080 scores at least 120% higher in all major categories versus a GTX 1650... can someone explain how that's supposed to make sense?

The 16 series is a low-end refresh.

it's a 1000-generation card; the 6 just indicates a refresh of a 50-level card. It's a bit faster than a 1050 Ti because it's a refresh line. 
 

 

it's not terrible; Nvidia actually has the most straightforward names.

the first digit is the generation, e.g. 3

the second digit is almost never used now; it sticks to 0 unless it's a 1650 or 1660 (so refreshes use it)

the third digit is the class of card, e.g. 8

the fourth digit is always 0

there's also Ti or Super at the end, which means it's a better version of the non-Ti one (1080 vs 1080 Ti)

 

 

so you might have a 3080 Ti: a 3000-generation card, an 80-class card, and a bit faster than a normal 3080. A higher number doesn't always win, though: a 2080 is faster than a 3060, because it's an 80-class card, even though the 3060 uses a newer chip architecture
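For fun, the scheme above can be sketched as a tiny decoder (my own toy sketch, nothing official from Nvidia; the function and field names are made up):

```python
# Toy decoder for GeForce model names following the scheme described above.
# Assumes the last two digits are the class ("tier") and everything before
# them is the generation marker; Ti/Super ride along as a suffix.
def decode_geforce(name: str) -> dict:
    parts = name.split()
    number = parts[0]
    suffix = " ".join(parts[1:]) or None  # e.g. "Ti", "Super", or None
    return {
        "generation": number[:-2],  # "3080" -> "30", "1660" -> "16"
        "tier": number[-2:],        # "3080" -> "80"
        "suffix": suffix,
    }

print(decode_geforce("3080 Ti"))     # {'generation': '30', 'tier': '80', 'suffix': 'Ti'}
print(decode_geforce("1660 Super"))  # {'generation': '16', 'tier': '60', 'suffix': 'Super'}
```

Of course, the decoder only tells you that a 2080 and a 3060 sit in different tiers of different generations; which one is actually faster still needs a benchmark.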

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database



1 hour ago, Dean0919 said:

Microsoft also skipped Windows 9 after Windows 8 and went straight to 10. Also, another weird egg they laid was Windows 8.1... Why name it 8.1, when every other OS of theirs had normal numbers?...

There are actually good reasons for this, though the first one is a little dumb.

1. When Windows programs do version checks by matching the OS name, anything starting with "Windows 9" gets treated as Windows 95/98.

2. Windows 8.1 wasn't a whole new OS. It was a Service Pack for Windows 8 that made enough changes to bump the version number.
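A minimal sketch of the kind of prefix check that reportedly caused the problem (hypothetical code, not from any actual shipped program):

```python
# Legacy apps sometimes detected Windows 95/98 by prefix-matching the OS
# name string. A hypothetical "Windows 9" would have matched the same check.
def looks_like_windows_9x(os_name: str) -> bool:
    return os_name.startswith("Windows 9")

print(looks_like_windows_9x("Windows 95"))  # True
print(looks_like_windows_9x("Windows 98"))  # True
print(looks_like_windows_9x("Windows 9"))   # True -- the problem
print(looks_like_windows_9x("Windows 10"))  # False
```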

elephants


5 minutes ago, Helpful Tech Witch said:

The 16 series is a low-end refresh.

it's a 1000-generation card; the 6 just indicates a refresh of a 50-level card. It's a bit faster than a 1050 Ti because it's a refresh line. 

No, it's not

they are not Pascal cards; they are TU117 or TU116. It's just Turing without RT cores. 


1 hour ago, Dean0919 said:

Also, another weird egg they laid was Windows 8.1... Why name it 8.1, when every other OS of theirs had normal numbers?...

Because it was still Windows 8, just not crap. So it's Windows 8.1

1 hour ago, Dean0919 said:

 

I also find it stupid to rename GTX into RTX just because they invented ray tracing, which is just another feature in games and nothing much.

It's a hardware feature, not a feature in games. The RT cores to do it are present only in RTX cards, though a faster GTX can do it through software imitation of RT cores.

from a marketing sense it's an amazing idea



Just now, starsmine said:

No, it's not

they are not Pascal cards; they are TU117 or TU116. It's just Turing without RT cores. 

Huh. I didn't know that. I just knew they were bad deals most of the time and assumed Nvidia had refreshed the 10 series.

though I never said it was a Pascal card. I said it was a 1000 series card… which it is.

 



26 minutes ago, Helpful Tech Witch said:

Huh. I didn't know that. I just knew they were bad deals most of the time and assumed Nvidia had refreshed the 10 series.

though I never said it was a Pascal card. I said it was a 1000 series card… which it is.

 

No, they are NOT 1000 series cards, which is what I'm pointing out. They are NOT a refresh. The 1660 has zero relation to the 1060, and the 1650 has zero relation to the 1050. They are the same architecture as the 2000 cards. You DID say they were Pascal cards; that's what the 1060 and 1050 are.
Nvidia did not want people confusing them with RTX cards in terms of RTX performance. 

Nor was the 1660, 1660 Ti, or 1660 Super ever considered a bad deal.

The TU117 used for the first batches of the 1650 is weird: no NVENC (new) support, just normal NVENC, but if you don't care then you don't care.
TU116 supports NVENC (new)


17 minutes ago, Helpful Tech Witch said:

It's a hardware feature, not a feature in games. The RT cores to do it are present only in RTX cards, though a faster GTX can do it through software imitation of RT cores.

from a marketing sense it's an amazing idea

It's a DX12 protocol. RTX just indicates there are essentially RT ASICs to handle those calls, not that ray tracing is a hardware feature.
It is very much a feature in games. 

https://www.techpowerup.com/gpu-specs/docs/nvidia-turing-architecture.pdf

46 minutes ago, lotus10101 said:

sweet, how do I turn that on?

by... turning it on? I'm not sure I understand what you are asking. The game goes, "hey, I can use this subset of DX12 calls (DXR)" (DX12 is weird like this, which is why almost all DX11 cards support a subset of DX12 instructions and get slapped with the label of supporting DX12), and asks the driver, "can I do that?"
The driver goes "ya,"
and the toggle option appears in game.
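That handshake can be modelled as a toy in a few lines (all names here are made up for illustration; the real mechanism is the game querying the Direct3D 12 driver for DXR support):

```python
# Toy model of the game/driver negotiation described above: the game asks
# whether the DXR subset of DX12 calls is available, and only shows the
# ray tracing toggle if the driver says yes.
class FakeDriver:
    def __init__(self, supports_dxr: bool):
        self.supports_dxr = supports_dxr

def ray_tracing_toggle_visible(driver: FakeDriver) -> bool:
    # Game: "can I use DXR calls?" -> Driver: "ya" / "nope"
    return driver.supports_dxr

print(ray_tracing_toggle_visible(FakeDriver(True)))   # True: toggle appears
print(ray_tracing_toggle_visible(FakeDriver(False)))  # False: option hidden
```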

 

Quote

Turing ray tracing performance with RT Cores is significantly faster than ray tracing in Pascal GPUs. Turing can deliver far more Giga Rays/Sec than Pascal on different workloads, as shown in Figure 21. Pascal is spending approximately 1.1 Giga Rays/Sec, or 10 TFLOPS / Giga Ray to do ray tracing in software, whereas Turing can do 10+ Giga Rays/Sec using RT Cores, and run ray tracing 10 times faster. Note: This paper does not cover developer details for implementing ray tracing in games or applications with RTX, DXR, or other APIs, but many resources exist with such information. Good initial information sources include Introduction to NVIDIA RTX and DirectX Ray Tracing blog post, the NVIDIA RTX Technology developer site, and a publicly accessible GDC 2018 course on RTX presented by NVIDIA called Ray Tracing in Games with NVIDIA RTX. Also refer to Microsoft’s blog on DXR.
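Taking the whitepaper's figures at face value, the claimed speedup works out like this:

```python
# Sanity check on the quoted numbers: ~1.1 Giga Rays/s in software on
# Pascal shaders vs 10+ Giga Rays/s with Turing's RT cores.
pascal_gigarays = 1.1
turing_gigarays = 10.0
speedup = turing_gigarays / pascal_gigarays
print(round(speedup, 1))  # ~9.1x, i.e. "10 times faster" as a round number
```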



2 hours ago, PDifolco said:

Haha you're right! What you should do is swap the first two and last two digits,

so 1650 => 5016

1080 => 8010 

Then the second is obviously better 😄

 

 

so what are you going to do with the 3070?
rename it to 7030?
is that obviously better than the 8010?
cause it is better.

Nvidia's names are pretty solid as is. Being confused that the flagship of the previous generation beats a mid-tier card of the next generation is not really justifiable. 
a 1990s NSX is still faster than a 2010 Civic.
 


3 hours ago, bigjoncoop said:

For instance a GTX 1080 scores at least 120% higher in all major categories versus a GTX 1650....  can someone explain how that's supposed to make sense?

The 3rd number indicates the class of GPU. You are talking an 80 class vs a 50 class. What really irritates me is using the same name for completely different things.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


7 minutes ago, ewitte said:

The 3rd number indicates the class of GPU. You are talking an 80 class vs a 50 class. What really irritates me is using the same name for completely different things.

What, like the 5700X and 5700 XT, or the RX 590 and Z590? 🤣



19 minutes ago, ewitte said:

The 3rd number indicates the class of GPU. You are talking an 80 class vs a 50 class. What really irritates me is using the same name for completely different things.

I especially enjoy the B650 and B660. 🤔 Is this AMD or Intel? I know AMD has pins, of course (orders a B650 and an Intel 12th gen) DAMNIT ☠️ better head to the LTT forum to sort this matter out dur dur dur 



1 hour ago, Dean0919 said:

They could have just released it as a service pack for Windows 8 and fixed its crappiness instead of renaming it to 8.1, which was dumb.

It was a service pack for Windows 8. 
From a marketing perspective it makes sense. They needed people to know it was majorly overhauled.

 

2 hours ago, starsmine said:

It's a DX12 protocol. RTX just indicates there are essentially RT ASICs to handle those calls, not that ray tracing is a hardware feature.
It is very much a feature in games. 

https://www.techpowerup.com/gpu-specs/docs/nvidia-turing-architecture.pdf

by... turning it on? I'm not sure I understand what you are asking. The game goes, "hey, I can use this subset of DX12 calls (DXR)" (DX12 is weird like this, which is why almost all DX11 cards support a subset of DX12 instructions and get slapped with the label of supporting DX12), and asks the driver, "can I do that?"
The driver goes "ya,"
and the toggle option appears in game.

 


Ray tracing is a DX12 feature. 

RTX is the hardware feature (RT cores) on Nvidia cards that allows for high-speed ray tracing. Which is what I said.

 



2 hours ago, lotus10101 said:

What like 5700x and 5700xt  or  Rx590 and Z590? 🤣

Things like releasing a DDR3 "version" of a card when it drops performance by 40+%. Or laptop GPU naming. Any time they change dies or lose more than 10-ish percent, it isn't the same product; it's basically a lie.



4 minutes ago, ewitte said:

Things like releasing a DDR3 "version" of a card when it drops performance by 40+%. Or laptop GPU naming. Any time they change dies or lose more than 10-ish percent, it isn't the same product; it's basically a lie.

oh, well everyone knows that; if it wasn't clear before, just look at the 4080 Ti, wait I meant 4070 Ti... 4060 Super? idfk lol



2 minutes ago, lotus10101 said:

oh, well everyone knows that; if it wasn't clear before, just look at the 4080 Ti, wait I meant 4070 Ti... 4060 Super? idfk lol

It isn't an excuse; they deserve to be called out each and every time.



Just now, ewitte said:

It isn't an excuse; they deserve to be called out each and every time.

Agreed, I wish all of them would just set down some guidelines for classification and stick to them 

 

~ If you're going to be a rock, be a rock, not a bug - Hendricksson - Screamers (1995)



On 1/13/2023 at 12:46 AM, bigjoncoop said:

 

For that reason alone I am an AMD fanboy!

That's a pretty weak and pathetic reason if I'm honest.

🌲🌲🌲

 

 

 

◒ ◒ 


I like to buy XFX for the extra X's


