
Chinese outlet Teclab leaks RTX 3090 benchmarks

FaxedForward
1 minute ago, Bombastinator said:

I was listening to MLID for the reports on bus width increasing GPU speed. I understand the 3090 is also clocked slightly lower than the 3080. Seeing independent 3090 results may be interesting; whatever independent testing shows is what we'll have to go by.

I figured the clocks being lowered had to do with running near the limit of what PCIe power and dual 8-pin connectors should offer, so I'm excited to see more testing or even just clarification on how right or wrong I am on that. It'd be nice to see if there's additional headroom on cards that offer 3x 8-pin connectors for beginner and amateur overclockers.
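Purely as a back-of-the-envelope sketch of that power-limit idea: the 75 W slot and 150 W per 8-pin figures come from the PCIe spec, while the 350 W board power used below is the commonly quoted 3090 Founders Edition TDP, taken here as an assumption rather than a measurement.

```cpp
// Rough in-spec power budget for a dual vs. triple 8-pin card.
// Assumed figures: 75 W from the PCIe x16 slot, 150 W per 8-pin connector,
// and ~350 W rated board power for the reference 3090 (illustrative inputs only).
#include <cstdio>

int main() {
    const int slotWatts     = 75;   // PCIe x16 slot allowance
    const int eightPinWatts = 150;  // per 8-pin PCIe power connector
    const int boardPower    = 350;  // assumed reference-card power target

    for (int connectors = 2; connectors <= 3; ++connectors) {
        const int budget = slotWatts + connectors * eightPinWatts;
        std::printf("%dx 8-pin: %d W in-spec budget, %d W of headroom over %d W\n",
                    connectors, budget, budget - boardPower, boardPower);
    }
    return 0;
}
```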

if you have to insist you think for yourself, i'm not going to believe you.


Buy an RTX 3080 10 GB now in 2020 for $699, sell for $X when the RTX 3070 Ti/Super 16 GB comes out = loss of $699 - $X

Buy an RTX 3080 Ti/Super 20 GB in 2021 for $A

Sell the RTX 3080 Ti/Super 20 GB in 2022 for $Y to buy an RTX 4080 = loss of $A - $Y

Total loss = ($699 - $X) + ($A - $Y)

 

Buy a RTX 3090 now in 2020 for $1,499

Sell the RTX 3090 in 2022 when the RTX 4090 is announced for $Z = loss of $1,499 - $Z

 

Will this be true???

($699 - $X) + ($A - $Y) > $1,499 - $Z
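For what it's worth, here is a small sketch that just plugs numbers into that inequality. The $699 and $1,499 MSRPs are from the post above; every resale figure ($X, $A, $Y, $Z) is a made-up assumption purely to show how the comparison works, not a prediction.

```cpp
// Evaluate the two upgrade paths from the post above with hypothetical resale values.
#include <cstdio>

int main() {
    const double msrp3080 = 699.0;   // RTX 3080 10 GB MSRP (from the post)
    const double msrp3090 = 1499.0;  // RTX 3090 MSRP (from the post)

    // Hypothetical numbers for illustration only:
    const double X = 450.0;  // assumed resale of the 3080 10 GB
    const double A = 999.0;  // assumed price of a 3080 Ti/Super 20 GB
    const double Y = 600.0;  // assumed resale of that card in 2022
    const double Z = 750.0;  // assumed resale of the 3090 in 2022

    const double twoCardLoss = (msrp3080 - X) + (A - Y);  // ($699 - $X) + ($A - $Y)
    const double oneCardLoss = msrp3090 - Z;              // $1,499 - $Z

    std::printf("Two-card path loss: $%.0f, 3090 path loss: $%.0f -> %s\n",
                twoCardLoss, oneCardLoss,
                twoCardLoss > oneCardLoss ? "the 3090 comes out cheaper"
                                          : "the two-card path comes out cheaper");
    return 0;
}
```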


3 hours ago, Suika said:

I feel like most people considering a card like the RTX 3090 aren't terribly concerned about reviews, only that it's the top dog. It's just fun to speculate about performance before the embargo lifts or reviewers get their hands on the card. I've heard that reviewers aren't being seeded 3090 cards, though, so iunno.

I would argue it is relevant for people looking for the "top dog" performance if the second card down the stack can get to that level with an overclock. That's what this suggests, but we don't know yet.

We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:

Spoiler

Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: EVGA GTX 1080 Ti SC2 Hybrid

Attachcorethingy: Gigabyte H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone RL06 "Lookalike"

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Razer Viper Mini + Huion H430P drawing Tablet

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: LG 27UK850-W 4K HDR

 


30 minutes ago, Suika said:

I figured the clocks being lowered had to do with running near the limit of what PCIe power and dual 8-pin connectors should offer, so I'm excited to see more testing or even just clarification on how right or wrong I am on that. It'd be nice to see if there's additional headroom on cards that offer 3x 8-pin connectors for beginner and amateur overclockers.

Maybe the 3090 will be different, but the Gamers Nexus overclocking marathon did not give me much hope. Using an EVGA FTW3 3080 with 3x 8-pins and 100% fan speed, throwing everything he could at it, the performance was still basically on par with every other card.

Current build: AMD Ryzen 7 5800X, ASUS PRIME X570-Pro, EVGA RTX 3080 XC3 Ultra, G.Skill 2x16GB 3600C16 DDR4, Samsung 980 Pro 1TB, Sabrent Rocket 1TB, Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


I suspect that Nvidia's claims of double raw performance over the 2xxx series are close to the truth. It's frequently the case that raw compute performance (for example, FLOPS ratings) does not correlate with real-world performance gains.

Hence why:

On 9/19/2020 at 9:36 AM, FaxedForward said:

With the cards only delivering half of the performance promised by NVIDIA I am fairly certain we are going to get massive incremental performance improvements pushed via software.

Is probably correct.

These cards actually push out support for a few new features that games/engines have yet to take advantage of (for example true DirectStorage support). They likely also have support for things we don't even know about yet. Microsoft has really been hinting at some serious changes coming to the way that GPUs interact with the rest of the system. It's likely that these coming changes will bring noticeable performance increases in real world GPU loads.

ENCRYPTION IS NOT A CRIME


2 hours ago, straight_stewie said:

I suspect that Nvidia's claims of double raw performance over the 2xxx series are close to the truth. It's frequently the case that raw compute performance (for example, FLOPS ratings) does not correlate with real-world performance gains.

Hence why:

Is probably correct.

These cards actually push out support for a few new features that games/engines have yet to take advantage of (for example true DirectStorage support). They likely also have support for things we don't even know about yet. Microsoft has really been hinting at some serious changes coming to the way that GPUs interact with the rest of the system. It's likely that these coming changes will bring noticeable performance increases in real world GPU loads.

It already is happening. Look at the benchmarks of the RTX 3080 in newer games. Even at 1080p, it is doubling the framerate of something like the GTX 1080 Ti and even the RTX 2080, whereas in most older games the uplift is much smaller. This tells me new games are designed better and are far less CPU-bound, resulting in an insane performance increase despite using the same CPU as before. I gathered this from the TechPowerUp RTX 3080 review, looking only at the 1080p scores, which are also the most relevant to me since I have a 1080p display but want to run games at a high framerate.


 Review for 3090 is out!
 

 

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


For me it's a difficult decision. You just know they're going to come out with a 3080 Super in the next year too that'll be the same performance as a 3090 for $400 less. 15-20% better than a 3080 isn't really that great of a value proposition... the 1080 Ti was 30% better than a 1080.

 

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


2 minutes ago, AnonymousGuy said:

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.

 

Starting next year NVIDIA won't be making SLI profiles anymore. Unless the game developer uses DX12 or Vulkan and adds support natively, you won't get it.

 

I'll just do what I did previously. Get the top card now, have it 2-3 years, and sell just before the announcement. Repeat.

Sold my Titan V, covered my 3090 with it.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


Just now, Valentyn said:

 

Starting next year NVIDIA won't be making SLI profiles anymore. Unless the game developer uses DX12 or Vulkan and adds support natively, you won't get it.

Is there a TL;DR of how much effort it is for developers to support it natively? Are we talking it's plausible that all AAA games are going to do it?

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


20 minutes ago, AnonymousGuy said:

Is there a TL;DR of how much effort it is for developers to support it natively? Are we talking it's plausible that all AAA games are going to do it?

Quite a bit, sadly.

Some decent ones support it. It looks like Rockstar are doing it, if their games catch your fancy; although since NVIDIA are no longer bothering, I think developers might just not bother either, since they won't be getting much future dev support.

Not like what NVIDIA provides for their GameWorks integrations.

DirectX 12 titles include Shadow of the Tomb Raider, Civilization VI, Sniper Elite 4, Gears of War 4, Ashes of the Singularity: Escalation, Strange Brigade, Rise of the Tomb Raider, Zombie Army 4: Dead War, Hitman, Deus Ex: Mankind Divided, Battlefield 1, and Halo Wars 2.

Vulkan titles include Red Dead Redemption 2, Quake 2 RTX, Ashes of the Singularity: Escalation, Strange Brigade, and Zombie Army 4: Dead War.

 

Source:
https://www.guru3d.com/news-story/nvidia-ends-sli-support-and-is-transitioning-to-native-game-integrations-(read-terminated).html
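For anyone curious what "adds support natively" means in practice, here is a minimal sketch (my own illustration, not NVIDIA's or any engine's code) of the Vulkan 1.1 device-group path a developer would start from; DX12 has an analogous explicit multi-adapter route. It assumes a Vulkan loader/SDK is installed and skips error handling.

```cpp
// Enumerate Vulkan device groups: the explicit, developer-driven replacement
// for driver-side SLI profiles. A group with more than one physical device
// can be driven as a single logical device, but the engine itself has to
// split the work (alternate-frame, split-frame, etc.).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in Vulkan 1.1

    VkInstanceCreateInfo instanceInfo{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    instanceInfo.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&instanceInfo, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups) g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("group %u: %u GPU(s)%s\n", i, groups[i].physicalDeviceCount,
                    groups[i].physicalDeviceCount > 1 ? "  <- multi-GPU capable" : "");
        // To actually use the group, the developer chains VkDeviceGroupDeviceCreateInfo
        // into vkCreateDevice and distributes rendering work across the GPUs explicitly.
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```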

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


1 hour ago, Valentyn said:

 Review for 3090 is out!
 

 

For some reason or other the opening pic for this video on my phone is the GN 3080 OC world record. I recently watched this, so it may just be YouTube. They apparently got 14tf out of a 3080 on LN. Is this relevant to your statement at all, or is YouTube just messing with me?

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, AnonymousGuy said:

For me it's a difficult decision. You just know they're going to come out with a 3080 Super in the next year too that'll be the same performance as a 3090 for $400 less. 15-20% better than a 3080 isn't really that great of a value proposition... the 1080 Ti was 30% better than a 1080.

 

Maybe I should just say "fuck it" and SLI 3090s and not have to upgrade for the next 5 years. I'm satisfied if I can lay down 4k144.

Historically the Ti/Super/whatever stuff has only ever come out to smack AMD with. If AMD can't make a card that competes with a 3090, there may not be a 3090 Ti/Super/whatever, and if they can't compete with the 3080, I suspect there likely won't be a 3080 Ti/Super/whatever.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


47 minutes ago, Bombastinator said:

For some reason or other the opening pic for this video on my phone is the GN 3080 OC world record. I recently watched this, so it may just be YouTube. They apparently got 14tf out of a 3080 on LN. Is this relevant to your statement at all, or is YouTube just messing with me?

That's just youtube messing with you sadly.

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


Cost of a 3080 10 GB in 2020 + cost of a 3080 20 GB in 2021 > cost of an RTX 3090 in 2020,
under the condition that the RTX 3080 20 GB is over $800 (an increase of only $100, and since $699 + $800 = $1,499 that is exactly the break-even point), which I think it will be; probably much more, like in the $1,000 range.

Enough said. Would you rather take the hit of buying and selling (plus shipping) one graphics card or two? If the 3080 20 GB is north of $800, it would have been cheaper to just get a 3090.


1 minute ago, Yoshi Moshi said:

Cost of a 3080 10 GB in 2020 + cost of a 3080 20 GB in 2021 > RTX 3090 in 2020
Under the condition that the RTX 3080 20 GB is > $800, which I think it will be, probably in the $1000 range.

 

Enough said. Would you rather take the hit buying and selling + shipping one graphics card or two? If the 3080 is north of $800, it would have been cheaper to just get a 3090.

"Would" is a hard one. Too many unknown variables. A big one is how much memory things will actually need, and that remains unknown. The consoles do weird things with memory. We need to see how their OS works to see how much of what kind of memory is even available to run games on them.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


8 minutes ago, Bombastinator said:

"Would" is a hard one. Too many unknown variables. A big one is how much memory things will actually need, and that remains unknown. The consoles do weird things with memory. We need to see how their OS works to see how much of what kind of memory is even available to run games on them.

Even under the condition that the 3080 20 GB model costs the same as the 3080 10 GB (which would be insane, but for the sake of argument), it would still be cheaper to get a 3090. By the time you finish buying and selling two cards versus one, it would have been cheaper just to get a 3090, even if a 3080 10 GB plus a 3080 20 GB ends up $100 cheaper than a 3090.

 

I really don't think that the 3080 20 GB is going to be cheaper than the 3080 10 GB, but I guess there's a small chance.

 

I reckon Nvidia knows what they are doing: they clearly see the need for cards with more memory and think they can make money off it. If 10 GB were enough, and Nvidia couldn't make money off cards with more VRAM, then those cards wouldn't exist. Even the 3070 16 GB version (the value card, the ##70 class, the most popular tier) will have 16 GB of VRAM, which is more than the 3080 10 GB. The 3070 16 GB is not for enthusiasts but the value option, and the value option will have more VRAM than the 3080 10 GB. They clearly see the need for more than 10 GB of memory when even the value option will have more than 10 GB.


2 minutes ago, Yoshi Moshi said:

Even under the condition that the 3080 20 GB model costs the same as the 3080 10 GB (which would be insane, but for the sake of argument), it would still be cheaper to get a 3090. By the time you finish buying and selling two cards versus one, it would have been cheaper just to get a 3090, even if a 3080 10 GB plus a 3080 20 GB ends up $100 cheaper than a 3090.

 

I reckon Nvidia knows what they are doing: they clearly see the need for cards with more memory and think they can make money off it. If 10 GB were enough, and Nvidia couldn't make money off cards with more VRAM, then those cards wouldn't exist. Even the 3070 16 GB version (the value card, the ##70 class, the most popular tier) will have 16 GB of VRAM, which is more than the 3080 10 GB. The 3070 16 GB is not for enthusiasts but the value option, and the value option will have more VRAM than the 3080 10 GB.

But as of yet they don't exist. As for the value option thing, apparently the consoles don't have any regular memory; they use VRAM for everything. So it's kind of unknown how much actual game memory will be available. One guy claimed as little as 8 GB. I personally find this one doubtful, but I don't know what the number is going to be, just that it has to be less than 16 GB. The whole 20 GB thing implies to me they may be covering against a 16 GB AMD card. A "You've got 16 GB? We've got 20 GB! So there!" kind of thing. I'm not saying you're wrong necessarily. I just don't know that you're right.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 minute ago, Bombastinator said:

But as of yet they don't exist. As for the value option thing, apparently the consoles don't have any regular memory; they use VRAM for everything. So it's kind of unknown how much actual game memory will be available. One guy claimed as little as 8 GB. I personally find this one doubtful, but I don't know what the number is going to be, just that it has to be less than 16 GB. The whole 20 GB thing implies to me they may be covering against a 16 GB AMD card. A "You've got 16 GB? We've got 20 GB! So there!" kind of thing. I'm not saying you're wrong necessarily. I just don't know that you're right.

Exactly, even the consoles, the baseline, have more VRAM than the 3080 10 GB. We know that's not all for rendering images on the display, but how much is necessary for the OS and other tasks? Probably not 6 GB. Even a more complicated OS like Windows 10 doesn't use that much memory at idle. The consoles are expected to last a few years, not one generation of graphics cards, so they are built with that in mind, probably a 4-year life span. They have 16 GB as a result, to get through the next four years of 4K gaming.


So I could conceivably have more VRAM than regular RAM? 👀

Either @piratemonkey or quote me when responding to me. I won't see otherwise

Put a reaction on my post if I helped

My privacy guide | Why my name is piratemonkey PSU Tier List Motherboard VRM Tier List

What I say is from experience and the internet, and may not be 100% correct


42 minutes ago, Yoshi Moshi said:

Exactly, even the consoles, the baseline, have more VRAM than the 3080 10 GB. We know that's not all for rendering images on the display, but how much is necessary for the OS and other tasks? Probably not 6 GB. Even a more complicated OS like Windows 10 doesn't use that much memory at idle. The consoles are expected to last a few years, not one generation of graphics cards, so they are built with that in mind, probably a 4-year life span. They have 16 GB as a result, to get through the next four years of 4K gaming.

It's been 5 years for a console generation so far, more or less. The problem is we don't know what devs are going to DO with that memory. We don't even know how much is going to be available. I can't see making a decision on a graphics card until I know how the consoles are going to work. Too much chance of winding up with a white elephant.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


On 9/19/2020 at 12:03 PM, CTR640 said:

I think it's the driver. They don't have the latest and newest driver for the 3090 yet. nVidia is not stupid; for that 10%, no one with healthy common sense would pay $1,500+ for a 10% performance gain.

They said this is supposed to be a Titan replacement, and in the past the Titan has been about 10% faster than the Ti for about double the price but with more VRAM, so this would be in line with that. Honestly I would just buy a 3080 and call it a day, because the 3090 is looking like it is not worth the extra money unless you are a prosumer who needs the extra VRAM and/or NVLink capabilities.


Nvidia low-key confirms the 10% numbers in a news release: https://www.nvidia.com/en-gb/geforce/news/rtx-3090-out-september-24/

Quote

For 4K gaming, the GeForce RTX 3090 is about 10-15% faster on average than the GeForce RTX 3080, and up to 50% faster than the TITAN RTX.


 

Look forward to official reviewers searching for the elusive 15% gains.
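Just to put NVIDIA's own figure in perspective, a quick bit of arithmetic; the 100 fps baseline below is an arbitrary round number for illustration, not a benchmark result.

```cpp
// What a 10-15% uplift over a hypothetical RTX 3080 baseline looks like in frame rates.
#include <cstdio>

int main() {
    const double baselineFps = 100.0;          // assumed 3080 result at 4K, illustration only
    const double uplifts[]   = {0.10, 0.15};   // NVIDIA's stated 10-15% range vs. the 3080

    for (double u : uplifts) {
        std::printf("+%2.0f%% -> %.0f fps\n", u * 100.0, baselineFps * (1.0 + u));
    }
    return 0;
}
```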


So if I buy this for a whopping $1,500, one year later will the 3080 Ti 20 GB version be $700 and 20-30% faster than the 3090?

CPU: i9 19300k////GPU: RTX 4090////RAM: 64gb DDR5 5600mhz ////MOBO: Aorus z790 Elite////MONITORS: 3 LG 38" 3840x1600 WIDESCREEN MONITORS


40 minutes ago, kaylexmorgana said:

So if I buy this for a whopping $1,500, one year later will the 3080 Ti 20 GB version be $700 and 20-30% faster than the 3090?

I don't think that RTX 3080 Ti 20GB will be faster than RTX 3090 because "3090" must be faster than any "3080".

 

For example, GTX 1070 Ti is slower than GTX 1080. Same for RTX 2070 Super, it's slower than RTX 2080.

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 others computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 Apple classic and 2 PC pocket WinCE)

