Chinese outlet Teclab leaks RTX 3090 benchmarks

Recommended Posts

Just now, Suika said:

The actual CUDA core count isn't even 20% higher on the 3090, so I think those individuals need a bit of a reality check like this topic to get them back in line.

The biggest change seemed to me to be the bus width reduction for the 3080. I was expecting more out of that than seems to have appeared in this thread. Could be the increase in memory amount reduces or negates it, or something. I don’t know enough about modern chip design to say.
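For what it's worth, the bus-width gap is easy to put rough numbers on. Here's a napkin calculation from Nvidia's published memory specs; it says nothing about real-world performance, just peak bandwidth:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
# Spec values are Nvidia's published numbers; real games rarely hit the peak.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(320, 19.0))   # RTX 3080: 320-bit @ 19.0 Gbps -> 760.0 GB/s
print(peak_bandwidth_gbs(384, 19.5))   # RTX 3090: 384-bit @ 19.5 Gbps -> 936.0 GB/s
```

That works out to roughly a 23% peak-bandwidth advantage for the 3090.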


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

35 minutes ago, Bombastinator said:

 I don’t know enough about modern chip design to say.

Well, if it's any comfort to you, I know nothing about chip design. I wouldn't think the memory bus would play a significant role when the reported hardware difference between the GPUs is already pretty slight, but I don't know a whole lot anyway so this is ignorant speculation at best haha.


if you have to insist you think for yourself, i'm not going to believe you.

2 minutes ago, Suika said:

Well, if it's any comfort to you, I know nothing about chip design. I wouldn't think the memory bus would play a significant role when the reported hardware difference between the GPUs is pretty slight; I figured the reduced bus width wouldn't matter so much on the 3080 based solely on the significantly reduced amount of VRAM. But I don't know a whole lot anyway, so this is ignorant speculation at best haha.

I was listening to MLID for the reports on bus width increasing GPU speed. I understand the 3090 is also clocked slightly lower than the 3080. Seeing independent 3090 results should be interesting; in the end, independent testing will tell.


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

1 minute ago, Bombastinator said:

I was listening to MLID for the reports on bus width increasing GPU speed. I understand the 3090 is also clocked slightly lower than the 3080. Seeing independent 3090 results should be interesting; in the end, independent testing will tell.

I figured the clocks being lowered had to do with running near the limit of what PCIe power and dual 8-pin connectors should offer, so I'm excited to see more testing or even just clarification on how right or wrong I am on that. It'd be nice to see if there's additional headroom on cards that offer 3x 8-pin connectors for beginner and amateur overclockers.
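For reference, the connector-spec math behind that hunch (75 W from the PCIe slot, 150 W per 8-pin PCIe connector). These are spec ceilings, and actual cards enforce their own power limits, so treat them as upper bounds:

```python
# Theoretical board power ceilings from the PCIe spec:
# 75 W from the slot plus 150 W per 8-pin connector.
PCIE_SLOT_W = 75
EIGHT_PIN_W = 150

def max_board_power_w(num_8pin_connectors: int) -> int:
    return PCIE_SLOT_W + num_8pin_connectors * EIGHT_PIN_W

print(max_board_power_w(2))  # dual 8-pin:   375 W
print(max_board_power_w(3))  # triple 8-pin: 525 W
```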


if you have to insist you think for yourself, i'm not going to believe you.


Buy an RTX 3080 10 GB now in 2020 for $699, sell for $X when the RTX 3070 Ti/Super 16 GB comes out = Loss of $699 - $X

Buy an RTX 3080 Ti/Super 20 GB in 2021 for $A

Sell the RTX 3080 Ti/Super 20 GB in 2022 for $Y to buy an RTX 4080 = Loss of $A - $Y

Total Loss = $699 - $X + $A - $Y

 

Buy an RTX 3090 now in 2020 for $1,499

Sell the RTX 3090 in 2022 when the RTX 4090 is announced for $Z = Loss of $1,499 - $Z

 

Will this be true???

$699 - $X + $A - $Y > $1,499 - $Z
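To make the comparison concrete, here's a quick sketch of both paths with placeholder resale values; X, A, Y, and Z are pure guesses, so substitute your own:

```python
# Both upgrade paths from the post above. All resale/price values ($X, $A,
# $Y, $Z) are hypothetical placeholders, not predictions.
X = 450   # resale of the 3080 10 GB when the Ti/Super models land (guess)
A = 900   # price of an RTX 3080 Ti/Super 20 GB (guess)
Y = 550   # resale of the 20 GB card when the RTX 4080 arrives (guess)
Z = 700   # resale of the RTX 3090 when the RTX 4090 is announced (guess)

two_card_loss = (699 - X) + (A - Y)   # 3080 path: two buy/sell cycles
one_card_loss = 1499 - Z              # 3090 path: one buy/sell cycle

print(two_card_loss, one_card_loss)   # 599 vs 799 with these guesses
print(two_card_loss > one_card_loss)  # the inequality in question -> False here
```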

3 hours ago, Suika said:

I feel like most people considering a card like the RTX 3090 aren't terribly concerned about reviews, only that it's the top dog. It's just fun to speculate about performance before the embargo lifts or reviewers get their hands on the card. I've heard that reviewers aren't being seeded 3090 cards, though, so iunno.

I would argue it is relevant for people looking for "top dog" performance if the second card down the stack can get to that level with an overclock. That's what this leak suggests, but we don't know yet.


We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:


Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: PowerColor PCS+ R9 290

Attachcorethingy: GA-H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone Fara R1-based case for half the price

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Logitech G203 Prodigy

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: 2x Samsung SMB2030N (1600x900 VGA)

 

Posted · Original Poster (OP)
30 minutes ago, Suika said:

I figured the clocks being lowered had to do with running near the limit of what PCIe power and dual 8-pin connectors should offer, so I'm excited to see more testing or even just clarification on how right or wrong I am on that. It'd be nice to see if there's additional headroom on cards that offer 3x 8-pin connectors for beginner and amateur overclockers.

Maybe the 3090 will be different, but the Gamers Nexus overclocking marathon did not give me much hope. Using an EVGA FTW3 3080 with 3x 8-pin connectors and 100% fan speed, throwing everything he could at it, the performance was still basically on par with every other card.


Current build: AMD Ryzen 3600, ASUS PRIME X570-Pro, EVGA GTX 1080 Ti SC2, G.Skill 2x16GB 3600C16 DDR4, Samsung 850 Evo 250GB SATA (boot), Sabrent Rocket 1TB NVMe (games/work), Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


I suspect that Nvidia's claims of double the raw performance over the 2xxx series are close to the truth. It's frequently the case that raw compute performance (for example, FLOPS ratings) does not correlate to real-world performance gains.

Hence why:

On 9/19/2020 at 9:36 AM, FaxedForward said:

With the cards only delivering half of the performance promised by NVIDIA I am fairly certain we are going to get massive incremental performance improvements pushed via software.

Is probably correct.

These cards actually push out support for a few new features that games/engines have yet to take advantage of (for example true DirectStorage support). They likely also have support for things we don't even know about yet. Microsoft has really been hinting at some serious changes coming to the way that GPUs interact with the rest of the system. It's likely that these coming changes will bring noticeable performance increases in real world GPU loads.


I will never succumb to the New Cult and I reject the leadership of @Aelar_Nailo and his wicked parrot armies led by @FakeCIA and @DildorTheDecent. I will keep my eyes pure and remain dedicated to the path of the One True; IlLinusNati

2 hours ago, straight_stewie said:

I suspect that Nvidia's claims of double the raw performance over the 2xxx series are close to the truth. It's frequently the case that raw compute performance (for example, FLOPS ratings) does not correlate to real-world performance gains.

Hence why:

Is probably correct.

These cards actually push out support for a few new features that games/engines have yet to take advantage of (for example true DirectStorage support). They likely also have support for things we don't even know about yet. Microsoft has really been hinting at some serious changes coming to the way that GPUs interact with the rest of the system. It's likely that these coming changes will bring noticeable performance increases in real world GPU loads.

It already is happening. Look at the benchmarks of the RTX 3080 in newer games. Even at 1080p, it doubles the framerate of something like the GTX 1080 Ti and even the RTX 2080, whereas in mostly older games the uplift is much smaller. This tells me new games are designed better and are far less CPU-bound, resulting in an insane performance increase despite using the same CPU as before. I gathered this from TechPowerUp's RTX 3080 review, looking only at the 1080p scores, which are also the most relevant to me since I have a 1080p display but want to run games at a high framerate.
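If you want to check that kind of claim against a review yourself, the arithmetic is just relative uplift; the FPS figures below are placeholders, not TechPowerUp's actual data:

```python
# Relative uplift between two average-FPS results. The numbers are
# illustrative placeholders -- plug in the review's real 1080p averages.
def uplift_pct(fps_new: float, fps_old: float) -> float:
    return (fps_new / fps_old - 1) * 100

print(uplift_pct(200, 100))  # a literal doubling       -> 100.0 %
print(uplift_pct(160, 130))  # a CPU-bound older title  -> ~23.1 %
```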


For me it's a difficult decision. You just know they're going to come out with a 3080 Super in the next year too that'll be the same performance as a 3090 for $400 less. 15-20% better than a 3080 isn't really that great of a value proposition... the 1080 Ti was 30% better than a 1080.

 

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.


Workstation: 9900KF @ 4.9Ghz || Gigabyte Z390 Aorus Master || Gigabyte Gaming G1 1080Ti || G.Skill DDR4-3800 @ 3600 4x8GB || Corsair AX1500i || 11 gallon whole-house loop.

LANRig/GuestGamingBox: 9900 nonK || ASRock Z390 Taichi Ultimate || EVGA Titan X (Maxwell) || Corsair SF600 || CPU+GPU watercooled 280 rad push only.

Server Router (Untangle): 8350K @ 4.7Ghz || ASRock Z370 ITX || 2x8GB || PicoPSU 250W, running on AX1200i from Server Storage || 11 gallon whole-house loop.

Server Compute: E5-2696v4 || Asus X99 mATX WS || LSI 9280i + Adaptec + Intel Expander || 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup, Corsair AX1200i (drives)  || 11 gallon whole-house loop.

Laptop: HP Elitebook 840 G3 (Intel 8350U), Razer Blade Stealth 13" 2017 (Intel 8550U)

2 minutes ago, AnonymousGuy said:

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.

 

Starting next year NVIDIA won't be making SLI profiles anymore. Unless the game developer uses DX12 or Vulkan and adds support natively, you won't get it.

 

I'll just do what I did previously. Get the top card now, have it 2-3 years, and sell just before the announcement. Repeat.

Sold my Titan V, covered my 3090 with it.


5820K 4.0GHz | NH D15S | 32 GB RAM | GTX 580 | ASUS PG348Q+MG278Q

 

Just now, Valentyn said:

 

Starting next year NVIDIA won't be making SLI profiles anymore. Unless the game developer uses DX12 or Vulkan and adds support natively, you won't get it.

Is there a TLDR of how much effort it is for developers to natively support it?  We talking it's plausible all AAA games are going to do it?


Workstation: 9900KF @ 4.9Ghz || Gigabyte Z390 Aorus Master || Gigabyte Gaming G1 1080Ti || G.Skill DDR4-3800 @ 3600 4x8GB || Corsair AX1500i || 11 gallon whole-house loop.

LANRig/GuestGamingBox: 9900 nonK || ASRock Z390 Taichi Ultimate || EVGA Titan X (Maxwell) || Corsair SF600 || CPU+GPU watercooled 280 rad push only.

Server Router (Untangle): 8350K @ 4.7Ghz || ASRock Z370 ITX || 2x8GB || PicoPSU 250W, running on AX1200i from Server Storage || 11 gallon whole-house loop.

Server Compute: E5-2696v4 || Asus X99 mATX WS || LSI 9280i + Adaptec + Intel Expander || 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup, Corsair AX1200i (drives)  || 11 gallon whole-house loop.

Laptop: HP Elitebook 840 G3 (Intel 8350U), Razer Blade Stealth 13" 2017 (Intel 8550U)

20 minutes ago, AnonymousGuy said:

Is there a TLDR of how much effort it is for developers to natively support it?  We talking it's plausible all AAA games are going to do it?

Quite a bit, sadly.

Some decent ones support it. Looks like Rockstar are doing it, if their games catch your fancy; although since NVIDIA are no longer bothering, I think developers might just not bother either, since they won't be getting much future dev support.

Not like the support NVIDIA provides for their GameWorks integrations.

DirectX 12 titles include Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, Ashes of the Singularity: Escalation, Strange Brigade, Rise of the Tomb Raider, Zombie Army 4: Dead War, Hitman, Deus Ex: Mankind Divided, Battlefield 1, and Halo Wars 2.

Vulkan titles include Red Dead Redemption 2, Quake 2 RTX, Ashes of the Singularity: Escalation, Strange Brigade, and Zombie Army 4: Dead War.

 

Source:
https://www.guru3d.com/news-story/nvidia-ends-sli-support-and-is-transitioning-to-native-game-integrations-(read-terminated).html


5820K 4.0GHz | NH D15S | 32 GB RAM | GTX 580 | ASUS PG348Q+MG278Q

 

1 hour ago, Valentyn said:

Review for the 3090 is out!
 

 

For some reason or other the opening pic for this video on my phone is the GN 3080 OC world record. I recently watched this, so it may just be YouTube. They apparently got 14 TF out of a 3080 on LN2. Is this relevant to your statement at all, or is YouTube just messing with me?


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

1 hour ago, AnonymousGuy said:

For me it's a difficult decision. You just know they're going to come out with a 3080 Super in the next year too that'll be the same performance as a 3090 for $400 less. 15-20% better than a 3080 isn't really that great of a value proposition... the 1080 Ti was 30% better than a 1080.

 

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.

Historically the Ti/Super/whatever stuff has only ever come out to smack AMD with. If AMD can’t make a card that competes with a 3090, there may not be a 3090 Ti/Super/whatever, and if they can’t compete with the 3080, I suspect there likely won’t be a 3080 Ti/Super/whatever.


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

47 minutes ago, Bombastinator said:

For some reason or other the opening pic for this video on my phone is the GN 3080 OC world record. I recently watched this, so it may just be YouTube. They apparently got 14 TF out of a 3080 on LN2. Is this relevant to your statement at all, or is YouTube just messing with me?

That's just YouTube messing with you, sadly.


5820K 4.0GHz | NH D15S | 32 GB RAM | GTX 580 | ASUS PG348Q+MG278Q

 


Cost of a 3080 10 GB in 2020 + cost of a 3080 20 GB in 2021 > RTX 3090 in 2020
This holds under the condition that the RTX 3080 20 GB is > $800 (an increase of only $100), which I think it will be; probably much more, like in the $1,000 range.

 

Enough said. Would you rather take the hit of buying, selling, and shipping on one graphics card or two? If the 3080 20 GB is north of $800, it would have been cheaper to just get a 3090.
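One way to frame that break-even, with all the resale and fee numbers as guesses rather than predictions:

```python
# "Buy a 3080 10 GB now, flip it for a 20 GB model later" vs. "buy a 3090 once".
# The two-step path costs more whenever (price_20gb - resale_10gb + fees)
# exceeds 1499 - 699 = 800. Sample values below are invented.
PRICE_3080_10GB = 699
PRICE_3090 = 1499

def two_step_cost(resale_10gb: float, price_20gb: float, fees: float = 0) -> float:
    return PRICE_3080_10GB - resale_10gb + price_20gb + fees

cost = two_step_cost(resale_10gb=500, price_20gb=999, fees=40)
print(cost, cost > PRICE_3090)  # 1238, False -- still cheaper with these guesses
```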

1 minute ago, Yoshi Moshi said:

Cost of a 3080 10 GB in 2020 + cost of a 3080 20 GB in 2021 > RTX 3090 in 2020
This holds under the condition that the RTX 3080 20 GB is > $800, which I think it will be; probably in the $1,000 range.

 

Enough said. Would you rather take the hit of buying, selling, and shipping on one graphics card or two? If the 3080 20 GB is north of $800, it would have been cheaper to just get a 3090.

"Would" is a hard one. Too many unknown variables. A big one is how much memory things will actually need, and that remains unknown. The consoles do weird things with memory. We need to see how their OS works to see how much of what kind of memory is even available to run games on.


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

8 minutes ago, Bombastinator said:

"Would" is a hard one. Too many unknown variables. A big one is how much memory things will actually need, and that remains unknown. The consoles do weird things with memory. We need to see how their OS works to see how much of what kind of memory is even available to run games on.

Even under the condition that the 3080 20 GB model equals the cost of the 3080 10 GB (which would be insane, but for the sake of argument), it would still be cheaper to get a 3090. By the time you finish buying and selling two cards versus one, it would have been cheaper to just get a 3090, even if a 3080 10 GB plus a 3080 20 GB were $100 cheaper than a 3090.

 

I really don't think that the 3080 20 GB is going to be cheaper than the 3080 10 GB, but I guess there's a small chance.

 

I reckon Nvidia knows what they are doing: they clearly see the need for cards with more memory and think they can make money off it. If 10 GB were enough, and Nvidia couldn't make money off cards with more VRAM, then those cards wouldn't exist. Even the 3070 16 GB version (the value card, the ##70 class, the most popular tier) will have 16 GB of VRAM, which is more than the 3080 10 GB. The 3070 16 GB is not for enthusiasts but the value option, and the value option will have more VRAM than the 3080 10 GB. They clearly see the need for more than 10 GB of memory when even the value option will have more than 10 GB.

2 minutes ago, Yoshi Moshi said:

Even under the condition that the 3080 20 GB model equals the cost of the 3080 10 GB (which would be insane, but for the sake of argument), it would still be cheaper to get a 3090. By the time you finish buying and selling two cards versus one, it would have been cheaper to just get a 3090, even if a 3080 10 GB plus a 3080 20 GB were $100 cheaper than a 3090.

 

I reckon Nvidia knows what they are doing: they clearly see the need for cards with more memory and think they can make money off it. If 10 GB were enough, and Nvidia couldn't make money off cards with more VRAM, then those cards wouldn't exist. Even the 3070 16 GB version (the value card, the ##70 class, the most popular tier) will have 16 GB of VRAM, which is more than the 3080 10 GB. The 3070 16 GB is not for enthusiasts but the value option, and the value option will have more VRAM than the 3080 10 GB.

But as of yet they don’t exist. As for the value option thing, apparently the consoles don’t have any regular memory; they use VRAM for everything. So it’s kind of unknown how much actual game memory will be available. One guy claimed as little as 8 GB. I personally find this doubtful, but I don’t know what the number is going to be, just that it has to be less than 16 GB. The whole 20 GB thing implies to me they may be covering against a 16 GB AMD card. “You’ve got 16 GB? We’ve got 20 GB! So there!” kinda thing. I’m not saying you’re wrong necessarily. I just don’t know that you’re right.


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

1 minute ago, Bombastinator said:

But as of yet they don’t exist. As for the value option thing, apparently the consoles don’t have any regular memory; they use VRAM for everything. So it’s kind of unknown how much actual game memory will be available. One guy claimed as little as 8 GB. I personally find this doubtful, but I don’t know what the number is going to be, just that it has to be less than 16 GB. The whole 20 GB thing implies to me they may be covering against a 16 GB AMD card. “You’ve got 16 GB? We’ve got 20 GB! So there!” kinda thing. I’m not saying you’re wrong necessarily. I just don’t know that you’re right.

Exactly. Even the consoles, the baseline, have more VRAM than the 3080 10 GB. We know that's not all for rendering images on the display, but how much is necessary for the OS and other tasks? Probably not 6 GB. Even a more complicated OS like Windows 10 doesn't use that much memory at idle. The consoles are expected to last a few years, not one generation of graphics cards, so they are built with that in mind, probably a 4-year life span. They have 16 GB to get through the next four years of 4K gaming.
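For what it's worth, the commonly reported split on the new consoles backs up the "probably not 6 GB" guess; treat these as approximate, reported-around-launch figures rather than gospel:

```python
# Reported Xbox Series X memory split: 16 GB GDDR6 total, with roughly
# 2.5 GB reserved for the OS (approximate figures as reported around launch).
TOTAL_GB = 16.0
OS_RESERVE_GB = 2.5

available_to_games = TOTAL_GB - OS_RESERVE_GB
print(available_to_games)  # ~13.5 GB shared by game code, data, and framebuffers
```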

42 minutes ago, Yoshi Moshi said:

Exactly. Even the consoles, the baseline, have more VRAM than the 3080 10 GB. We know that's not all for rendering images on the display, but how much is necessary for the OS and other tasks? Probably not 6 GB. Even a more complicated OS like Windows 10 doesn't use that much memory at idle. The consoles are expected to last a few years, not one generation of graphics cards, so they are built with that in mind, probably a 4-year life span. They have 16 GB to get through the next four years of 4K gaming.

It’s been 5 years for a console generation so far, more or less. The problem is we don’t know what devs are going to DO with that memory. We don’t even know how much is going to be available. I can’t see making a decision on a graphics card until I know how the consoles are going to work. Too much chance of winding up with a white elephant.


Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

On 9/19/2020 at 12:03 PM, CTR640 said:

I think it's the driver. They don't have the latest and newest driver yet for the 3090. Nvidia is not stupid; no one with healthy common sense would pay $1,500+ for just 10% more performance.

They said this is supposed to be a Titan replacement, and in the past the Titan has been about 10% faster than the Ti for about double the price, but with more VRAM, so this would be in line with that. Honestly, I would just buy a 3080 and call it a day, because the 3090 is looking like it is not worth the extra money unless you are a prosumer who needs the extra VRAM and/or NVLink capabilities.

