
Could the RTX 4090 be considered "worth it" simply due to its extra VRAM?

LightningMachine
Solved by GuiltySpark_:

If you're not at 4K and don't plan to be, 4080. If you are, the difference between the 4080 and 4090 is significant, probably more than you think. The jump in performance from my 3090 to my 4090 is kind of insane. 
 

Price is subjective, but if PC gaming is one of your hobbies, or your only one, a $1600 purchase is cheap in the grand scheme of what you could spend on other hobbies over two years. 
 

All about perspective. 

I'm on the fence about buying a totally new rig, and my eyes are set on either the 4080 (1249EUR) or the 4090 (1699EUR). 

 

While the RTX 4090 costs 32% more over here, when I compare frame rates it only delivers about 23% more performance than the RTX 4080. So at first glance the RTX 4080 seems like the better, smarter choice. However, the comparison between the RTX 3070 and the 6800 XT really spoke volumes about just how important VRAM is in the long run. 
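
A quick back-of-the-envelope sketch of that cost-per-performance math, treating the 4080 as the baseline (prices are the EUR figures above; the +23% relative performance is the assumption taken from the reviews, not a measurement):

# Back-of-the-envelope cost-per-performance sketch (Python).
# Prices are the EUR figures quoted above; the +23% relative performance of the
# 4090 over the 4080 is an assumption taken from review averages.
cards = {
    "RTX 4080": {"price_eur": 1249, "relative_perf": 1.00},  # baseline
    "RTX 4090": {"price_eur": 1699, "relative_perf": 1.23},
}

baseline_price = cards["RTX 4080"]["price_eur"]
for name, c in cards.items():
    premium = c["price_eur"] / baseline_price - 1
    eur_per_perf = c["price_eur"] / c["relative_perf"]
    print(f"{name}: +{premium:.0%} price, {eur_per_perf:.0f} EUR per unit of relative performance")

# With these exact list prices the premium works out to ~36%, and the 4090 costs
# ~1381 EUR per "unit" of performance vs ~1249 EUR for the 4080, so on pure
# price-to-performance the 4080 wins; the question is whether the extra VRAM and
# headroom are worth paying above the curve.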

 

What do you guys think? Should I buy an RTX 4080 and swap it for a 50- or 60-series card later on, or should I buy the 4090 and "future-proof" my rig for the next few years?

 

 

 


5 minutes ago, LightningMachine said:

I'm on the fence about buying a totally new rig, and my eyes are set on either the 4080 (1249EUR) or the 4090 (1699EUR). 

 

While the RTX 4090 costs 32% more over here, when I compare frame rates it only delivers about 23% more performance than the RTX 4080. So at first glance the RTX 4080 seems like the better, smarter choice. However, the comparison between the RTX 3070 and the 6800 XT really spoke volumes about just how important VRAM is in the long run. 

 

What do you guys think? Should I buy an RTX 4080 and swap it for a 50- or 60-series card later on, or should I buy the 4090 and "future-proof" my rig for the next few years?

 

 

 

Do what I did and get a 7900 XTX: as much VRAM as a 4090 for less than the price of a 4080 🙂 

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


3 minutes ago, PDifolco said:

Do what I did and get a 7900 XTX: as much VRAM as a 4090 for less than the price of a 4080 🙂 

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 


3 minutes ago, LightningMachine said:

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 

I would get the 4080 and upgrade in a few years (when you feel like you need/want to). The extra VRAM is useless when you don't need it, and we don't know if you will actually need more than 16GB in the next few years.


6 minutes ago, LightningMachine said:

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 

Personally, I can't tell the difference between FSR and DLSS, and DLSS 3 is known to have its own set of issues.


Price-to-performance gets worse the higher up the stack you go, kind of like with a race car: you pay a lot more for smaller percentage gains. 

 

I'd get a 4080. The 4090 isn't worth the cost unless you just want the clout of owning one.

 


12 minutes ago, LightningMachine said:

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 

G-Sync and FreeSync are in essence the same thing, just adapted to different architectures. Besides, this technology will be standard in the near future. DLSS and FSR are pretty much the same thing: competitive answers to each other. What matters is value and longevity. Nvidia is going with the Apple approach, giving you less for more and requiring you to upgrade more often. And did you miss the part where AMD introduced software ray tracing, which gave RT capability to pretty much ANY card? The Vulkan API is full of such things, all ready to be applied in game engines. If AMD had the same game-developer support Nvidia has, we would have many more features than just Nvidia's "offerings".


24 minutes ago, LightningMachine said:

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 

The RTX 4090 is sort of overkill in most scenarios unless you're running 4K at a high refresh rate. So if whatever display you're currently using is G-Sync-only, I'd also look at upgrading it as part of a GPU upgrade.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


3 minutes ago, PriitM said:

G-Sync and FreeSync are in essence the same thing, just adapted to different architectures. Besides, this technology will be standard in the near future. DLSS and FSR are pretty much the same thing: competitive answers to each other. What matters is value and longevity. Nvidia is going with the Apple approach, giving you less for more and requiring you to upgrade more often. And did you miss the part where AMD introduced software ray tracing, which gave RT capability to pretty much ANY card? The Vulkan API is full of such things, all ready to be applied in game engines. If AMD had the same game-developer support Nvidia has, we would have many more features than just Nvidia's "offerings".

G-Sync ≠ FreeSync on many levels. Most newer displays will support both, especially since the G-Sync requirements have been loosened a lot. 

 

Even if you're mostly over-driving your display to that 1.5-2x level for minimal latency, VRR is still worth it. I just question upgrading your GPU without considering a monitor upgrade, especially if the display in question is old enough or budget enough to be G-Sync-only.

 

Generally, a FreeSync monitor in 2023 will support VRR on Nvidia cards, but a G-Sync monitor will typically do VRR only with Nvidia.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


21 minutes ago, emosun said:

Price-to-performance gets worse the higher up the stack you go, kind of like with a race car: you pay a lot more for smaller percentage gains. 

 

I'd get a 4080. The 4090 isn't worth the cost unless you just want the clout of owning one.

 

Last gen there was a tiny difference between a 3080 and a 3090. There is a massive difference in CUDA cores between the 4080 and the 4090. Whether you see that in-game depends on what you're playing; it's easy to miss how big the gap is because of how CPU-limited a lot of titles are today, even at 4K. As new titles and UE5 games roll out, the gap will likely widen.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


If you're not at 4K and don't plan to be, 4080. If you are, the difference between the 4080 and 4090 is significant, probably more than you think. The jump in performance from my 3090 to my 4090 is kind of insane. 
 

Price is subjective, but if PC gaming is one of your hobbies, or your only one, a $1600 purchase is cheap in the grand scheme of what you could spend on other hobbies over two years. 
 

All about perspective. 


57 minutes ago, ewitte said:

Last gen there was a tiny difference between a 3080 and a 3090. There is a massive difference in CUDA cores between the 4080 and the 4090. Whether you see that in-game depends on what you're playing; it's easy to miss how big the gap is because of how CPU-limited a lot of titles are today, even at 4K. As new titles and UE5 games roll out, the gap will likely widen.

It's because the RTX 3080 and 3090 used the same GA102 die, whereas the RTX 4080 uses a much smaller die that's more analogous to the RTX 3070 Ti in an intergenerational comparison. Performance deltas also track better if you compare the 3070 Ti to the 4080, in line with the 3090-to-4090 comparison, which shows a generational gain closer to 70%.

 

Practically speaking, the RTX 3080's Ada Lovelace counterpart doesn't exist, but if it did, it would look a lot like this speculative card. NVIDIA GeForce RTX 4080 Ti Specs | TechPowerUp GPU Database

 

Nvidia would have to have millions of AD102 dies that didn't make the cut for the RTX 4090 or the other production cards on the short list of GPUs using that die. NVIDIA AD102 GPU Specs | TechPowerUp GPU Database

 

Amusingly, by that standard the RTX 4090 sits at or below the RTX 3080 Ti in terms of binning scheme/silicon quality.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


5 minutes ago, GuiltySpark_ said:

If you're not at 4K and don't plan to be, 4080. If you are, the difference between the 4080 and 4090 is significant, probably more than you think. The jump in performance from my 3090 to my 4090 is kind of insane. 
 

Price is subjective, but if PC gaming is one of your hobbies, or your only one, a $1600 purchase is cheap in the grand scheme of what you could spend on other hobbies over two years. 
 

All about perspective. 

The RTX 4090 is just an absolute 4K monster; it's the only reason I bought one, since I already had my 4K 240Hz display and it roughly doubled the 6900 XT I had at the time. 16GB of VRAM was also limiting in some games: in The Division 2 I would have to run DX11 or lower the texture quality to keep it from maxing out the VRAM, which turned out to be about 19GB of actual use.

 

I've also seen speculation that the RTX 4090 will retain a lot more value than the 3090 simply due to the better tensor cores for AI. NVIDIA GeForce RTX 4090 vs RTX 3090 Deep Learning Benchmark (lambdalabs.com)

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


1 hour ago, Agall said:

The RTX 4090 is sort of overkill in most scenarios unless you're running 4K at a high refresh rate. So if whatever display you're currently using is G-Sync-only, I'd also look at upgrading it as part of a GPU upgrade.

The problem with a lot of modern AAA games is that they implement an absolutely shit-tier level of TAA. See Red Dead Redemption 2 on PC: at 1440p that game is unviewable for me, and setting the resolution scale to 150% relieves the problem so, so much. 

 

Why would you say that I'd need to upgrade my monitor too? 

 

 


11 minutes ago, Agall said:

The RTX 4090 is just an absolute 4K monster; it's the only reason I bought one, since I already had my 4K 240Hz display and it roughly doubled the 6900 XT I had at the time. 16GB of VRAM was also limiting in some games: in The Division 2 I would have to run DX11 or lower the texture quality to keep it from maxing out the VRAM, which turned out to be about 19GB of actual use.

 

I've also seen speculation that the RTX 4090 will retain a lot more value than the 3090 simply due to the better tensor cores for AI. NVIDIA GeForce RTX 4090 vs RTX 3090 Deep Learning Benchmark (lambdalabs.com)

The Division 2 is an odd one. On any card, even my 4090, it nearly maxes out VRAM, but there's the argument of used vs. allocated. MW2 seems to do the same thing. I'll look at VRAM usage in HWiNFO and they're always at 20-23GB "used". It's the same number for allocated, so 🤷‍♂️
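
For anyone who wants to poke at those numbers outside of HWiNFO, here's a minimal sketch that reads the device-level VRAM counters through NVML (assuming the nvidia-ml-py / pynvml Python bindings are installed). Note that this reports memory reserved on the card across all processes, which is closer to "allocated" than to what a game actively touches each frame, so it will show the same 20GB+ figures:

# Minimal sketch: read device-level VRAM counters via NVML (pynvml bindings).
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName, nvmlDeviceGetMemoryInfo,
)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        name = nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older pynvml builds return bytes
            name = name.decode()
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i} ({name}): {mem.used / 2**30:.1f} GiB in use "
              f"of {mem.total / 2**30:.1f} GiB")
finally:
    nvmlShutdown()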


2 hours ago, LightningMachine said:

I'm on the fence about buying a totally new rig, and my eyes are set on either the 4080 (1249EUR) or the 4090 (1699EUR). 

 

While the RTX 4090 costs 32% more over here, when I compare frame rates it only delivers about 23% more performance than the RTX 4080. So at first glance the RTX 4080 seems like the better, smarter choice. However, the comparison between the RTX 3070 and the 6800 XT really spoke volumes about just how important VRAM is in the long run. 

 

What do you guys think? Should I buy an RTX 4080 and swap it for a 50- or 60-series card later on, or should I buy the 4090 and "future-proof" my rig for the next few years?

The 4090 is worlds better than the 4080; they're not even close to the same class of card. That 23% difference probably reflects a lot of CPU limits in current games, since the 4090 has a staggering 68% more CUDA cores and benches 37% higher than the 4080 on the Time Spy Extreme graphics score. The 4080 is enormously cut down relative to the flagship compared to previous-gen 80-series cards. I wouldn't touch a 4080 with a ten-foot pole when it's so close in price to the far superior 4090. Though you'd be nuts not to upgrade to a 4K 144Hz monitor too.
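
To put rough numbers on that (the CUDA core counts are assumed from the public spec sheets, 9728 for the 16GB 4080 and 16384 for the 4090; the benchmark and in-game deltas are the figures quoted in this thread):

# Rough scaling check: spec-sheet CUDA core counts vs the synthetic and in-game
# deltas quoted in the thread. Core counts assumed from public spec listings.
cuda_cores = {"RTX 4080": 9728, "RTX 4090": 16384}

core_gap = cuda_cores["RTX 4090"] / cuda_cores["RTX 4080"] - 1
timespy_gap = 0.37   # Time Spy Extreme graphics-score delta quoted above
ingame_gap = 0.23    # average in-game delta from the original post

print(f"CUDA core advantage:     {core_gap:.0%}")   # ~68%
print(f"Time Spy Extreme (GPU):  {timespy_gap:.0%}")
print(f"Typical in-game average: {ingame_gap:.0%}")

# The further the in-game average falls below the synthetic GPU-bound result,
# the more likely the test is CPU- or engine-limited rather than actually
# measuring the gap between the two cards.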


3 minutes ago, GuiltySpark_ said:

The Division 2 is an odd one. On any card, even my 4090, it nearly maxes out VRAM, but there's the argument of used vs. allocated. MW2 seems to do the same thing. I'll look at VRAM usage in HWiNFO and they're always at 20-23GB "used". It's the same number for allocated, so 🤷‍♂️

I know The Division 2 was using more than 16GB since it would hit the typical 'single digit framerate' issue you get when you're dipping into shared memory.

 

But I agree on used vs. allocated, since MW2 will allocate 23GB as well but is unlikely to actually use that much.

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


1 hour ago, ewitte said:

Last gen there was a tiny difference between a 3080 and a 3090. There is a massive difference in CUDA cores between the 4080 and the 4090. Whether you see that in-game depends on what you're playing; it's easy to miss how big the gap is because of how CPU-limited a lot of titles are today, even at 4K. As new titles and UE5 games roll out, the gap will likely widen.

This is a topic of conversation that isn't mentioned nearly enough, imo. Up until the 4090, we had CPUs fast enough that GPUs were the bottleneck ~90% of the time. Now? We as a community BADLY need Arrow Lake/Zen 5 to be wicked fast. Even the 7800X3D can't keep up with a 4090 a lot of the time.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


2 hours ago, LightningMachine said:

I'm on the fence about buying a totally new rig, and my eyes are set on either the 4080 (1249EUR) or the 4090 (1699EUR). 

 

While the RTX 4090 costs 32% more over here, when I compare frame rates it only delivers about 23% more performance than the RTX 4080. So at first glance the RTX 4080 seems like the better, smarter choice. However, the comparison between the RTX 3070 and the 6800 XT really spoke volumes about just how important VRAM is in the long run. 

 

What do you guys think? Should I buy an RTX 4080 and swap it for a 50- or 60-series card later on, or should I buy the 4090 and "future-proof" my rig for the next few years?

 

 

 

I'd get neither of those. I'd get a 6950 XT or a 7900 XT.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


10 minutes ago, Deadpool2onBlu-Ray said:

This is a topic of conversation that isn't mentioned nearly enough, imo. Up until the 4090, we had CPUs fast enough that GPUs were the bottleneck ~90% of the time. Now? We as a community BADLY need Arrow Lake/Zen 5 to be wicked fast. Even the 7800X3D can't keep up with a 4090 a lot of the time.

I find it's more engine limitations than CPU limitations. For example, in Warframe I can get up to 1200 fps at 720p/1080p ultra, which is CPU/engine limited; at 4K it's in the 500 fps range.

 

I made a thread to demonstrate how limiting it can be even in the best-case scenario for an older chip, 4K ultra settings. At 4K, in enclosed single-player scenarios where my 7950X3D could reach 1200 fps (mind you, this was testing every CCD configuration, including pseudo-7800X3D and 7700X configs not shown in that post), the 4790K got basically the same 500-some fps as the 7950X3D.

 

"Cassini (saturn) solo staring in corner 1080p- 500 (same at 4K) vs 500 4k and 1200 at 1080p"

Ryzen 7950x3D PBO +200MHz / -15mV curve CPPC in 'prefer cache'

RTX 4090 @133%/+230/+1000

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


5 minutes ago, Agall said:

I find it's more engine limitations than CPU limitations. For example, in Warframe I can get up to 1200 fps at 720p/1080p ultra, which is CPU/engine limited; at 4K it's in the 500 fps range.

 

I made a thread to demonstrate how limiting it can be even in the best-case scenario for an older chip, 4K ultra settings. At 4K, in enclosed single-player scenarios where my 7950X3D could reach 1200 fps (mind you, this was testing every CCD configuration, including pseudo-7800X3D and 7700X configs not shown in that post), the 4790K got basically the same 500-some fps as the 7950X3D.

 

"Cassini (saturn) solo staring in corner 1080p- 500 (same at 4K) vs 500 4k and 1200 at 1080p"

I think it's a little of both, honestly. But you're right: game development is the REAL issue, not the hardware, CPU/GPU or otherwise. Look at the dumpster fires that are the TLOU remake on PC, Jedi Survivor, Hogwarts Legacy, etc. Although I think Nvidia needs to give us more VRAM, there is no reason why all of a sudden anything less than 16GB should struggle in new AAA games.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


2 hours ago, bezza... said:

Personally, I can't tell the difference between FSR and DLSS, and DLSS 3 is known to have its own set of issues.

I also have a G-Sync monitor, but since game FPS is close to or above its 120Hz I don't get any tearing or the like.

No real need for DLSS either.

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


4 hours ago, LightningMachine said:

I understand, truly I do, but I have a GSYNC monitor, and DLSS has been proven to be far, far more effective than FSR. I'm cheering AMD on for sure, but they haven't been putting out the innovation that Nvidia brings to the table. 

With a high-end card such as the 4080 or 4090 you're probably going to want to turn G-Sync off anyway unless you're playing at 4K 144Hz, and you won't need DLSS or FSR; those features are mainly for lower-end cards. 

The 7900 XTX seems like the clear option for you to go for imo.


20 hours ago, SteveGrabowski0 said:

I wouldn't touch a 4080 with a ten-foot pole when it's so close in price to the far superior 4090. Though you'd be nuts not to upgrade to a 4K 144Hz monitor too.

I was in the same boat: once they released the 4080 I didn't even consider it. I'm fine with the 120Hz I have on my 48" OLED, although it's probably comparable to a 144-200Hz LCD.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


20 hours ago, Deadpool2onBlu-Ray said:

I think it's a little of both, honestly. But you're right: game development is the REAL issue, not the hardware, CPU/GPU or otherwise. Look at the dumpster fires that are the TLOU remake on PC, Jedi Survivor, Hogwarts Legacy, etc. Although I think Nvidia needs to give us more VRAM, there is no reason why all of a sudden anything less than 16GB should struggle in new AAA games.

80% of the issues were VRAM-related (after about a month of being out, at least). The real issue is that UE4 has been pushed to its limits: it doesn't properly make use of modern processors, there are some inefficiencies, etc.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

