Could the RTX 2000 series names be misleading?

1 minute ago, Dunzaus said:

Hmm, I guess you were right about that; I must have had mostly CPU-demanding games in mind. I still doubt you'll see the cards you fear next year. I hope we will, though. I want there to be leaps like that, even though my wallet doesn't like it. But I don't see what would make Nvidia do that unless there's competition.

Honestly, I'm more hoping that AMD will make a comeback. The prices are insane right now, and I have a feeling Nvidia will keep inflating them if AMD does nothing :/

And I want Nvidia to release cards every 2 years :D I think that's more reasonable. Yearly upgrades are crap; it makes you feel like shit when they release something so much better so soon. At least let me get good use out of my card before the whole industry advances :/ They can still make big leaps every 2 years.

Quote or Tag people so they know that you've replied.


| CPU: Core i7-8700K @ 4.89GHz - 1.21V  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS Gaming monitor | Oculus Quest 2 VR



1 hour ago, syn2112 said:

That's what I thought at first, but then I wondered why they didn't show ANY benchmark results. All they kept going on about is ray tracing and how the new cards perform 6x better at it than the older cards, which is a completely unfair comparison.

Maybe the reason they didn't talk about actual performance is that the new cards are lacking, and it's not worth showing benchmarks because they'd be similar.

So IF the performance is lacking, then I guess they'll release 7nm next year. Not many people are going to buy an RTX 2080 Ti or 2080 if the 1080 Ti is so much cheaper and offers similar results, minus ray tracing.

I really, really hope the reason they didn't show benchmarks is that they have a bunch of 10-series cards left over that they want to sell (which we kind of know) and fear that people wouldn't buy them at the current price if they knew the new generation beats them in price-to-performance (which could be a reason for the price increase too).

 

But this is really just hoping.

 

Your theory is more likely but a man can hope.


26 minutes ago, i_build_nanosuits said:

The RTX 2070 performs better than a Titan Xp

The Titan Xp is about as good in games as the 1080 Ti, so that's not really surprising; we saw the same with the last two generations. The 1070 was slightly better than the 980 Ti. It doesn't have to be much better for that statement to be true.


18 minutes ago, i_build_nanosuits said:

 

I asked for benchmarks, not Nvidia's marketing. They did claim that Turing is 6 times faster than Pascal at ray tracing, which is no surprise.
Yes, the 2070 will be better than the Titan Xp (Pascal) in RTX loads. We knew that all along, but RTX loads are pretty limited at this point.

On a side note, how does Nvidia's ray tracing compare against DX12's?

Ex-EX build: Liquidfy C+... R.I.P.

Ex-build:

Meshify C – sold

Ryzen 5 1600x @4.0 GHz/1.4V – sold

Gigabyte X370 Aorus Gaming K7 – sold

Corsair Vengeance LPX 2x8 GB @3200 MHz – sold

Alpenfoehn Brocken 3 Black Edition – it's somewhere

Sapphire Vega 56 Pulse – ded

Intel SSD 660p 1TB – sold

be quiet! Straight Power 11 750W – sold


6 minutes ago, Dunzaus said:

The Titan Xp is about as good in games as the 1080 Ti, so that's not really surprising; we saw the same with the last two generations. The 1070 was slightly better than the 980 Ti. It doesn't have to be much better for that statement to be true.

yeah, and that's very impressive imo...

 

970 > 780ti

1070 > or = 980ti

2070 > or = 1080ti

 

I'm fine with that metric, absolutely.


1 minute ago, Quadriplegic said:

On a side not, how does Nvidia's ray tracing compare against DX 12's?

DX12?!

DX12 is trash; that's why no games were ever really successful using it. And the new upcoming games supporting these new technologies are mostly DX11 games, if I'm not mistaken. Maybe they'll have that faulty semi-DX12 support bolted on to them; I don't know, I hope not.


4 minutes ago, i_build_nanosuits said:

yeah, and that's very impressive imo...

 

970 > 780ti

1070 > or = 980ti

2070 > or = 1080ti

 

I'm fine with that metric, absolutely.

Except it has been 28 months since the last cards came out, and we even skipped Volta, so the 2070 really shouldn't just be = 1080 Ti. It should be way higher.


1 minute ago, Dunzaus said:

Except it has been 28 months since the last cards came out, and we even skipped Volta, so the 2070 really shouldn't just be = 1080 Ti

Notice how he put >= 1080 Ti. That means greater than or equal to, not that it will definitely be just equal to it.

There are 10 types of people in this world. Those that understand binary and those that don't.

Current Rig (Dominator II): 8GB Corsair Vengeance LPX DDR4 3133 C15, AMD Ryzen 3 1200 at 4GHz, Coolermaster MasterLiquid Lite 120, ASRock B450M Pro4, AMD R9 280X, 120GB TCSunBow SSD, 3TB Seagate ST3000DM001-9YN166 HDD, Corsair CX750M Grey Label, Windows 10 Pro, 2x CoolerMaster MasterFan Pro 120, Thermaltake Versa H18 Tempered Glass.

 

Previous Rig (Black Magic): 8GB DDR3 1600, AMD FX6300 OC'd to 4.5GHz, Zalman CNPS5X Performa, Asus M5A78L-M PLUS /USB3, GTX 950 SC (former, it blew my PCIe lane so now on mobo graphics which is Radeon HD 3000 Series), 1TB Samsung Spinpoint F3 7200RPM HDD, 3TB Seagate ST3000DM001-9YN166 HDD (secondary), Corsair CX750M, Windows 8.1 Pro, 2x 120mm Red LED fans, Deepcool SMARTER case

 

My secondary rig (The Oldie): 4GB DDR2 800, Intel Core 2 Duo E8400 @ 3GHz, Stock Dell Cooler, Foxconn 0RY007, AMD Radeon HD 5450, 250GB Samsung Spinpoint 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management. UPDATE: SPECS UPGRADED DUE TO CASEMOD, 8GB DDR2 800, AMD Phenom X4 9650, Zalman CNPS5X Performa, Biostar GF8200C M2+, AMD Radeon HD 7450 GDDR5 edition, Samsung Spinpoint 250GB 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management and support for non Dell boards.

 

Retired/Dead Rigs: The OG (retired) (First ever PC I used at 3 years old back in 2005) Current Specs: 2GB DDR2, Pentium M 770 @ 2.13GHz, 60GB IDE laptop HDD, ZorinOS 12 Ultimate x86. Originally 512mb DDR2, Pentium M 740 @ 1.73GHzm 60GB IDE laptop HDD and single boot XP Pro. The Craptop (dead), 2gb DDR3, Celeron n2840 @ 2.1GHz, 50GB eMMC chip, Windows 10 Pro. Nightrider (dead and cannibalized for Dominator II): Ryzen 3 1200, Gigabyte A320M HD2, 8GB DDR4, XFX Ghost Core Radeon HD 7770, 1TB Samsung Spinpoint F3 (2010), 3TB Seagate Barracuda, Corsair CX750M Green, Deepcool SMARTER, Windows 10 Home.


7 minutes ago, Dunzaus said:

Except it has been 28 months since the last cards came out and we even skipped Volta, so 2070 should really not be = 1080ti

And maybe it will be even better, who knows... it's a brand-new architecture, and it's quite possible those CUDA cores are much more powerful. Also, you guys seem to discredit the ray tracing stuff, but it will become very relevant very soon. Nvidia has already shown 4 major titles that will support it (Assetto Corsa Competizione, Tomb Raider, Metro Exodus and Battlefield V), and they also showed this:
https://www.guru3d.com/news-story/nvidia-is-listing-21-games-with-rtx-support.html

Which adds Final Fantasy, PUBG, Hitman 2 and many others...

So this is to be taken into consideration as well... Pascal will do shit-all of this RTX stuff in those amazing games.


1 minute ago, xriqn said:

Notice how he put > = 1080ti. That means greater than or equal to, not that it will definitely be just equal to.

I know; that's just how it's been the past few generations. And again, Volta's version of the 2070 would also have been > or = the 1080 Ti, so that's not really saying much.


4 minutes ago, Dunzaus said:

I know; that's just how it's been the past few generations. And again, Volta's version of the 2070 would also have been > or = the 1080 Ti, so that's not really saying much.

Also, my prediction, seeing how different this new architecture is compared to Maxwell/Pascal: I think the 980 Ti and 1080 Ti will now start the "aging like milk" part of the process. As newer titles roll out and driver updates for the Turing cards roll out, the performance difference between them will grow, and people will start to complain that the older cards are falling apart, just like the 780/780 Ti did a few years ago.


1 minute ago, i_build_nanosuits said:

And maybe it will be even better, who knows... it's a brand-new architecture, and it's quite possible those CUDA cores are much more powerful. Also, you guys seem to discredit the ray tracing stuff, but it will become very relevant very soon. Nvidia has already shown 4 major titles that will support it (Assetto Corsa Competizione, Tomb Raider, Metro Exodus and Battlefield V), and they also showed this:
https://www.guru3d.com/index.php?ct=news&action=file&id=29081

Which adds Final Fantasy, PUBG, Hitman 2 and many others...

So this is to be taken into consideration as well... Pascal will do shit-all of this RTX stuff in those amazing games.

Everything is possible; we haven't seen the benchmarks yet. But all the ray tracing in the world won't make a card run 4K at 60fps, or heck, even at 144fps for a 4K 144Hz monitor.

Ray tracing looks impressive, but in the end we need more powerful cards for 4K monitors and VR.
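To put the 4K 144Hz ask into rough numbers, here is a back-of-the-envelope pixel-rate comparison (shading cost doesn't scale perfectly linearly with pixel count, so treat this as a ballpark only):

```python
# Pixel throughput needed for different display targets, relative to 1080p60.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

targets = {
    "1080p 60Hz": (1920, 1080, 60),
    "4K 60Hz": (3840, 2160, 60),
    "4K 144Hz": (3840, 2160, 144),
}
base = pixels_per_second(1920, 1080, 60)
for name, (w, h, hz) in targets.items():
    ratio = pixels_per_second(w, h, hz) / base
    print(f"{name}: {ratio:.1f}x the pixel rate of 1080p60")
```

4K at 144Hz works out to 9.6x the raw pixel rate of 1080p60, which is why ray tracing alone doesn't get a card there.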


7 minutes ago, i_build_nanosuits said:

and maybe it will be even better, who knows...it's a brand new architecture, it's greatly possible that those cuda cores are much more powerful...and also you guys seem to discredit the ray tracing stuff but it will become very relevant very soon, nvidia showed 4 major titles already that will suport it (asseto corsa competizione, tomb raider, Metro exodus and battlefield 5) and they also showed this:
https://www.guru3d.com/news-story/nvidia-is-listing-21-games-with-rtx-support.html

Which add final fantasy, PUBG, Hitman 2 and many others...

So this is to be taken into consideration as well...Pascal will do shit all of these RTX stuff in those amazing games.

Yeah, Pascal will do shit-all of the RTX stuff, but who honestly cares about RTX unless they're a graphics w***e? I feel like people who are pre-ordering are buying into technology whose real-world performance we can't yet be certain about, and to me that's a bad move: what if the real-world performance jump outside of the RTX stuff isn't that great? To be honest, I think the only reason Nvidia is even releasing Turing right now is that they found out AMD is bringing out a competitor to Pascal (Navi).


4 minutes ago, xriqn said:

Yeah, Pascal will do shit-all of the RTX stuff, but who honestly cares about RTX unless they're a graphics w***e? I feel like people who are pre-ordering are buying into technology whose real-world performance we can't yet be certain about, and to me that's a bad move: what if the real-world performance jump outside of the RTX stuff isn't that great? To be honest, I think the only reason Nvidia is even releasing Turing right now is that they found out AMD is bringing out a competitor to Pascal (Navi).

Or they just want to milk some money before next year, when manufacturing should move to 7nm.

 


2 hours ago, syn2112 said:

That's what I'm afraid of... 7nm is a big jump, so I'm afraid these cards will be obsolete when 7nm comes around, because 12nm is basically 16nm; it's not a true "12" nm.

I thought that the 2080 series was built on 7nm.


6 minutes ago, ryao said:

I thought that the 2080 series was built on 7nm.

It's not; it's actually TSMC's 12nm process, which is basically an optimized 16nm (16nm++).


17 minutes ago, xriqn said:

 what if the real world performance jump outside of RTX stuff isn't that great?

That's a 2-generation jump... the Titan V is already a shitload faster than the Titan Xp in games, so how could one more generation's leap deliver such awful performance?! Also, do you think they would charge $1,299 if the card wouldn't deliver? I don't.


33 minutes ago, i_build_nanosuits said:

DX12?!

DX12 is trash, that's why no games were ever really succesful using it...and new upcoming games supporting the new technologies are mostly DX11 games if i'm not mistaken...maybe they will have that faulty semi-DX12 support bolted on to them, i don't know, i hope not.

I honestly have not seen DX12 do anything better than DX11 so far... maybe when there are DX12-based engines it will perform as promised, but I have no idea.

Vulkan didn't need "Vulkan-based engines" to seriously impress us. I really wish everyone would start using Vulkan; Doom is so fucking amazing because of it.


1 minute ago, syn2112 said:

Vulkan didn't need "Vulkan-based engines" to seriously impress us. I really wish everyone would start using Vulkan; Doom is so fucking amazing because of it.

Doom runs fine on everything... Doom runs fine on a Nintendo Switch... and with an Nvidia card it runs fine in OpenGL and in Vulkan. Doom simply is not a very demanding game, and it's well optimized. The fact that it can run at 15% more FPS using Vulkan, which adds absolutely nothing to the experience except a few more frames here and there, doesn't make it impressive in the least... at least that's how I see it. Also, no new games use Vulkan.


10 minutes ago, i_build_nanosuits said:

Doom runs fine on everything... Doom runs fine on a Nintendo Switch... and with an Nvidia card it runs fine in OpenGL and in Vulkan. Doom simply is not a very demanding game, and it's well optimized. The fact that it can run at 15% more FPS using Vulkan, which adds absolutely nothing to the experience except a few more frames here and there, doesn't make it impressive in the least... at least that's how I see it. Also, no new games use Vulkan.

Actually, the advantage with Vulkan is about a 20-40% performance bump, but that only applies to AMD. Nvidia didn't get any advantage from Vulkan, and there's a reason for that: AMD uses asynchronous compute and Nvidia doesn't; that's where the advantage comes in.

But it's reported that RTX uses async compute, so I guess we'll see performance improvements there as well :)

And who says the frames are wasted? Either way, with a performance bump like that, you can basically run the game at higher settings. What's not to like?

Or, for example, devs can use the hardware more efficiently and build more complex scenes, etc.
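As a rough illustration of what a bump like that buys in frame-time terms (the baseline FPS here is made up, it's just arithmetic):

```python
# Turn an FPS uplift into the per-frame time budget it frees up.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

base_fps = 60.0
uplifted_fps = base_fps * 1.30  # 30% uplift, mid-range of the 20-40% figure

saved_ms = frame_time_ms(base_fps) - frame_time_ms(uplifted_fps)
print(f"{frame_time_ms(base_fps):.2f} ms -> {frame_time_ms(uplifted_fps):.2f} ms "
      f"({saved_ms:.2f} ms per frame freed up for heavier settings)")
```

Those freed milliseconds are exactly what a dev can spend on higher settings or more complex scenes.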


44 minutes ago, syn2112 said:

it's not, it's actually 12nm which is basically 16nm++

Has Nvidia or TSMC confirmed that?


33 minutes ago, syn2112 said:

Actually, the advantage with Vulkan is about a 20-40% performance bump, but that only applies to AMD. Nvidia didn't get any advantage from Vulkan, and there's a reason for that: AMD uses asynchronous compute and Nvidia doesn't; that's where the advantage comes in.

But it's reported that RTX uses async compute, so I guess we'll see performance improvements there as well :)

And who says the frames are wasted? Either way, with a performance bump like that, you can basically run the game at higher settings. What's not to like?

Or, for example, devs can use the hardware more efficiently and build more complex scenes, etc.

It depends on what is being rendered. Vulkan excels in situations where you have Amdahl's-law-style slowdowns, by reducing the time spent per draw call. It is not always going to make a difference.
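The Amdahl's law point can be sketched numerically: if Vulkan only speeds up the draw-call submission fraction of a frame, the overall gain is capped by everything else. A toy calculation with made-up fractions:

```python
# Amdahl's law: overall speedup when only a fraction p of the frame time
# (here, draw-call submission overhead) is accelerated by a factor s.
def amdahl_speedup(p: float, s: float) -> float:
    return 1.0 / ((1.0 - p) + p / s)

# Submission-bound scene: 60% of the frame is spent issuing draw calls.
print(f"{amdahl_speedup(0.60, 10.0):.2f}x")  # ~2.17x overall
# GPU-bound scene: only 5% of the frame is submission overhead.
print(f"{amdahl_speedup(0.05, 10.0):.2f}x")  # ~1.05x overall
```

So a draw-call-heavy game can see a big win while a shader-bound one barely moves, which matches the mixed Vulkan results people report.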

 

By the way, I am not a graphics developer (I do storage), but assuming there is no serialization mechanism like a mutex in place, it is possible to (ab)use threading to emulate asynchronous operations; QEMU does this when it is not using AIO, for example. Having a native asynchronous interface is definitely nicer, though, because it reduces boilerplate.
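A minimal sketch of that threads-emulating-async idea; the `blocking_read` helper here is hypothetical, standing in for any blocking storage call:

```python
import concurrent.futures

def blocking_read(path: str) -> bytes:
    """Stand-in for a blocking storage call (e.g. a synchronous read)."""
    return b"data from " + path.encode()

# Submitting the blocking call to a worker thread hands back a Future,
# giving the caller an asynchronous interface over a synchronous API;
# this is the same trick QEMU's thread-pool backend uses.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    future = pool.submit(blocking_read, "/dev/sda")
    # ...the caller is free to do other work while the read is in flight...
    result = future.result()  # blocks only when the data is actually needed
    print(result)
```

The boilerplate cost is visible: you manage the pool and futures yourself instead of just awaiting a native async call.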


20 hours ago, i_build_nanosuits said:

That's a 2-generation jump... the Titan V is already a shitload faster than the Titan Xp in games, so how could one more generation's leap deliver such awful performance?! Also, do you think they would charge $1,299 if the card wouldn't deliver? I don't.

We just have to wait for the benchmarks that come next month. But if Nvidia has shown anything, it's that they are willing to skyrocket prices as soon as they have a monopoly, which they now have again at the high end and have had for some years in the enthusiast market. Just like any company would; it's basic economics: corner the market and raise prices.

