
Unreal Engine 5.1 - video game graphics have arrived at their destination

Delicieuxz

 

From lighting and reflection revolutions, to Nanite foliage eliminating the need for LODs, to environments spanning more than half the distance from the Earth to the Sun, and lots more, Unreal Engine 5.1 resolves a bunch of the final barriers to presenting realistic environments. It brings graphics to a level of realisation that probably fulfills a lot of fantasies from the earlier days of PC gaming about what the end goal for graphics could be. There's certainly more that can be done, but these developments bring the paradigm across the threshold from janky allusion and concessions into accurate-to-reality depiction in real time, on consumer hardware.

 

But witnessing graphics revolutions is more exciting than reading about them, so check out this video:

 

 

It's no wonder Epic is able to claim that more than half of all announced next-gen games are being made on Unreal Engine.

 

The next Witcher game will surely look phenomenal with unlimited foliage detail at all view distances. And I wonder what the consumer GPU and hardware industry will be like 10 years from now, given that this is already achievable on today's hardware. If civilisation still exists at that point, fully realistic graphics could be accessible to all for a cheap price. And once unlimited graphical fidelity is ubiquitous and normalised, so that higher-fidelity graphics are no longer an impressive marketing point, there will likely be a focal shift in game development towards gameplay, creative concepts, and writing, resulting in better games.

 

 

Unreal Engine 5.1 is now available!


 

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


Cool, but knowing how well developers have optimized and polished games recently, UE5 makes me shudder a little bit. I think I would prefer last-gen-looking games if they ran smoothly and without as much janky stuff.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.

Spoiler

12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


The biggest change was 5.0.

5.1 is mostly a much-improved version of what 5.0 introduced. Unity is also used for high-quality games.


1 hour ago, ZetZet said:

Cool, but knowing how well developers have optimized and polished games recently, UE5 makes me shudder a little bit. I think I would prefer last-gen-looking games if they ran smoothly and without as much janky stuff.

To the extent that I care, every time we're promised realism, it's always a mile short of where we were in 2001 with film.

 

Basically, film has had 2001's "The Spirits Within" from Squaresoft (which was 100% CG) and then 2009's Avatar, which was a hybrid of "realistic" CG and actual real characters.

 

Unreal Engine has always hit a middle point between "this is playable" and "this is pretty", and unfortunately the gap between "playable" and "pretty" is massive. You can see this in existing UE games like Fortnite and Dead by Daylight, but also the Final Fantasy 7 Remake. Something looking prettier doesn't necessarily make for a better-playing game, and in many cases the developer will not actually go through the required effort to make a competitive multiplayer game "pretty", because nobody actually plays it at maximum quality anyway; it costs them "frames" in the game.

 


32 minutes ago, Kisai said:

Something looking prettier doesn't necessarily make for a better-playing game, and in many cases the developer will not actually go through the required effort to make a competitive multiplayer game "pretty", because nobody actually plays it at maximum quality anyway; it costs them "frames" in the game.

I agree. It's so annoying when some developers push the graphics so far beyond what gamers' hardware can do and then hide behind "next-gen" cover. In my personal opinion, if you are a developer, you should scale your games to the most popular midrange card on the market. For example, for all games coming out in the next year or so, a 3060 should be able to get 60 fps comfortably at 1080p (maxed or near maxed). If you go beyond that, you are punishing the majority of the player base with a shit experience, just because they were too poor.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.

Spoiler

12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


1 hour ago, ZetZet said:

I agree. It's so annoying when some developers push the graphics so far beyond what gamers' hardware can do and then hide behind "next-gen" cover. In my personal opinion, if you are a developer, you should scale your games to the most popular midrange card on the market. For example, for all games coming out in the next year or so, a 3060 should be able to get 60 fps comfortably at 1080p (maxed or near maxed). If you go beyond that, you are punishing the majority of the player base with a shit experience, just because they were too poor.

They... do...
Y'all know you don't have to run a game on ultra, right?

In fact, that has historically been the norm: games would always have max settings that were unusable on the hardware of the day at release. It gives a game a longer tail and keeps it relevant longer, rather than this weird situation today where there isn't a single game you can't max out at 1440p and get 60 fps with a 3090, so why even make new GPUs?


From my understanding, Nanite can still make games look better at lower settings, especially when it comes to things in the distance and things popping in and out or switching between different quality models.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


14 minutes ago, starsmine said:

They... do...
Y'all know you don't have to run a game on ultra, right?

That misses the point.

 

You should not have to meddle with the settings to create a playable experience. If you are playing on an RTX 3090, a PS5, or a Series X, the output should be exactly the same. If you are playing competitively, you should not have to fiddle around to find the optimal settings. That's why you'll universally see people playing competitive FPS games with the quality settings at their lowest. If you see someone playing on Ultra, they're certainly not playing competitively (maybe for fun, or with Dixper on a stream).

 

Hell, Fortnite runs like utter trash if you turn on any RT feature. On a 3090, even.

 

There should only be two "standard" settings modes for PC games: "I'm playing with others" and "I'm streaming". "With others" profiles all the players' systems and sets the quality/framerate settings to the lowest common denominator (so yes, the player with the potato cripples everyone), while "streaming" turns off features that induce motion sickness and seizures and caps the display framerate at 60 or 120 (if supported by capture tools). If neither is true, the game should run with all the features turned on that maintain the intended frame rate, and the game should determine which features are less necessary (e.g. motion blur, anti-aliasing, bloom, shadows, fog) on a room-by-room basis to keep that frame rate. The only quality settings players should be able to toggle themselves are the screen resolution and accessibility feature groups (e.g. blur/flicker/colorblind).

 

To that end, "I'm playing with others" should benchmark the GPU until it finds a 30 fps+ or 60 fps+ setting for that "map" before dropping the player into a multiplayer game using that map. With a game console, that is a predictable, known value. With a PC, you will have people with 6-year-old CPUs/GPUs trying to play with people with 11th/12th/13th-gen Intel CPUs and 2080/3080/4080+ GPUs, who will have an effective advantage. In a private lobby, the hardware requirement is waived if the host has opted to allow it.
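Purely to illustrate that "benchmark until you find a 30/60 fps bucket for the map" idea, here is a hypothetical sketch in plain C++ (not an existing engine feature; runBenchmarkPass, the preset names, and the fake numbers are all stand-ins):

```cpp
#include <array>
#include <iostream>
#include <map>
#include <optional>
#include <string>

// Hypothetical quality buckets, highest first.
constexpr std::array<const char*, 4> kPresets{"Ultra", "High", "Medium", "Low"};

// Stand-in for rendering a few seconds of the target map at a preset and
// measuring the average fps; a real engine would time actual frames here.
double runBenchmarkPass(const std::string& /*map*/, const std::string& preset) {
    static const std::map<std::string, double> fakeResults{
        {"Ultra", 41.0}, {"High", 58.0}, {"Medium", 72.0}, {"Low", 110.0}};
    return fakeResults.at(preset);
}

// Walk down the presets until one sustains the target frame rate on this map,
// before the player is dropped into a match that uses it.
std::optional<std::string> pickPresetForMap(const std::string& map, double targetFps) {
    for (const char* preset : kPresets) {
        if (runBenchmarkPass(map, preset) >= targetFps) return std::string(preset);
    }
    return std::nullopt; // even Low can't hold the target
}

int main() {
    if (auto preset = pickPresetForMap("SomeMap", 60.0))
        std::cout << "Joining with preset: " << *preset << '\n';
    else
        std::cout << "Hardware below the minimum for this map\n";
}
```

With the fake numbers above, a 60 fps target settles on "Medium"; the interesting part is only the walk-down loop before matchmaking.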

 

But again, competitively, people who have the best hardware and live closest to the data center have an advantage. You level the playing field by making sure that these settings are, at minimum, identical. You want to keep performance-affecting features away from being tuned. Also keep in mind that ReShade is a thing, and there will always be players who use such tools to "make the quality even worse" to gain performance.

 


12 minutes ago, Kisai said:

That misses the point.

 

You should not have to meddle with the settings to create a playable experience. If you are playing on an RTX 3090, a PS5, or a Series X, the output should be exactly the same. If you are playing competitively, you should not have to fiddle around to find the optimal settings. That's why you'll universally see people playing competitive FPS games with the quality settings at their lowest. If you see someone playing on Ultra, they're certainly not playing competitively (maybe for fun, or with Dixper on a stream).

You... don't? Games auto-detect your hardware and pick a settings configuration; generally, they also have big buckets of low/med/high/ultra now that you can just use. GeForce Experience cloud-sources the most commonly used settings for you and will auto-apply them. You literally don't have to meddle with settings anymore.
I missed zero points here.

12 minutes ago, Kisai said:

There should only be two "standard" settings modes for PC games: "I'm playing with others" and "I'm streaming". "With others" profiles all the players' systems and sets the quality/framerate settings to the lowest common denominator (so yes, the player with the potato cripples everyone), while "streaming" turns off features that induce motion sickness and seizures and caps the display framerate at 60 or 120 (if supported by capture tools). If neither is true, the game should run with all the features turned on that maintain the intended frame rate, and the game should determine which features are less necessary (e.g. motion blur, anti-aliasing, bloom, shadows, fog) on a room-by-room basis to keep that frame rate. The only quality settings players should be able to toggle themselves are the screen resolution and accessibility feature groups (e.g. blur/flicker/colorblind).

What the actual fuck?

GO PLAY CONSOLE THEN.
Configurability is a massive draw for PC players.

 

12 minutes ago, Kisai said:

To that end, "I'm playing with others" should benchmark the GPU until it finds a 30 fps+ or 60 fps+ setting for that "map" before dropping the player into a multiplayer game using that map. With a game console, that is a predictable, known value. With a PC, you will have people with 6-year-old CPUs/GPUs trying to play with people with 11th/12th/13th-gen Intel CPUs and 2080/3080/4080+ GPUs, who will have an effective advantage. In a private lobby, the hardware requirement is waived if the host has opted to allow it.

 

But again, competitively, people who have the best hardware and live closest to the data center have an advantage. You level the playing field by making sure that these settings are, at minimum, identical. You want to keep performance-affecting features away from being tuned. Also keep in mind that ReShade is a thing, and there will always be players who use such tools to "make the quality even worse" to gain performance.

 

This has to be one of the worst hot takes I have seen in a while. It would halt all improvement to software and hardware.


This is certainly cool, but remember that for scenes to look as good as they possibly can, a lot of careful scene setup and modeling is required, and that's unlikely to be practical or even feasible in an actual interactive gameplay experience. We had this exact same discourse when UE4 came out, but as far as I can tell we still haven't seen actual games with the level of fidelity seen in the UE4 demo. Most studios can't afford to have 20 people work for 6 months to model and animate a single face, and even if they did, it would likely not be worth it when they could work on other aspects of the game people care more about instead.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


47 minutes ago, starsmine said:

You... don't? Games auto-detect your hardware and pick a settings configuration; generally, they also have big buckets of low/med/high/ultra now that you can just use. GeForce Experience cloud-sources the most commonly used settings for you and will auto-apply them. You literally don't have to meddle with settings anymore.
I missed zero points here.

You missed all of them. A game does not need 200 configurable variables. Nobody tweaks more than one or two of those variables, because your average "I just want to play the game" player wants it to look like the trailer, not like a PS2.

 

47 minutes ago, starsmine said:


GO PLAY CONSOLE THEN.
Configurability is a massive draw for PC players.

No it isn't. Being able to play at 4K with all the settings turned on is. Nobody gives a care about any values between "Ultra" and "Low"; it's either "Ultra" or "off" for the four most GPU-intensive values, and everything else is left alone.

 

47 minutes ago, starsmine said:

This has to be one of the worst hot takes I have seen in a while. It would halt all improvement to software and hardware.

No, you just don't understand it. People already tamper with their GPUs to gain advantages over other players in competitive games; this is just leveling the playing field from the beginning and removing the hacky "performance guide" crap you always end up following anyway.

 

You have the choice between a pretty game with a lower frame rate or a competitive game with a high frame rate; you only get to choose one. When you play a single-player game, those tunables can sometimes make the difference between "this room should have precompiled its shaders" and "this room is 5 fps every time I face the door, while the rest of the game is 120 fps".

 

When you play multiplayer with sweaty gitgud players, they're all playing on low settings, no matter what the game is, because they believe everyone else is. So just take that advantage away and set everyone to the same settings that guarantee 60 fps.

 

I can guarantee you that anyone who plays Fortnite is playing with the sound accessibility features on so you can see what direction people are in. So just turn them on for everyone.


Looking forward to playing at 60 fps again; now that Moore's law is dead, we won't be playing at high refresh rates anymore.


2 hours ago, Sauron said:

This is certainly cool, but remember that for scenes to look as good as they possibly can, a lot of careful scene setup and modeling is required, and that's unlikely to be practical or even feasible in an actual interactive gameplay experience. We had this exact same discourse when UE4 came out, but as far as I can tell we still haven't seen actual games with the level of fidelity seen in the UE4 demo. Most studios can't afford to have 20 people work for 6 months to model and animate a single face, and even if they did, it would likely not be worth it when they could work on other aspects of the game people care more about instead.

Even if max quality on models doesn't improve, from my understanding Nanite still benefits how the game looks. The reason is that instead of every item having, for example, four models made by the creators, each at a different level of detail for a different distance from the player, the engine can adjust the level of detail automatically, so it's less of a pop between states and more of a gradual transition based on distance. The creators also don't have to make all the different models, I think?
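For anyone who hasn't dealt with it, here's roughly what that hand-authored approach looks like (a minimal plain-C++ sketch, not Unreal's actual API; the mesh names and distance thresholds are made up for illustration):

```cpp
#include <array>
#include <iostream>

// A hand-authored LOD chain: artists traditionally export several versions
// of the same mesh at decreasing triangle counts.
struct LodLevel {
    const char* meshName;    // hypothetical assets, e.g. "rock_lod0".."rock_lod3"
    float       maxDistance; // camera distance up to which this level is used
};

// Discrete thresholds are what cause the visible "pop" when the camera crosses one.
constexpr std::array<LodLevel, 4> kRockLods{{
    {"rock_lod0",  25.0f},  // full detail
    {"rock_lod1",  75.0f},
    {"rock_lod2", 200.0f},
    {"rock_lod3", 1e9f},    // lowest detail / billboard for everything beyond
}};

// Classic selection: pick the first level whose range covers the camera distance.
const LodLevel& selectLod(float cameraDistance) {
    for (const LodLevel& lod : kRockLods) {
        if (cameraDistance <= lod.maxDistance) return lod;
    }
    return kRockLods.back();
}

int main() {
    for (float d : {10.0f, 60.0f, 150.0f, 500.0f})
        std::cout << "distance " << d << " -> " << selectLod(d).meshName << '\n';
}
```

Nanite's pitch is that this table (and the extra authored meshes behind it) goes away: detail is reduced per cluster of triangles, continuously, so there's no single threshold where a whole mesh swaps out.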

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


9 hours ago, Kisai said:

To the extent that I care, every time we're promised realism, it's always a mile short of where we were in 2001 with film.

 

Basically, film has had 2001's "The Spirits Within" from Squaresoft (which was 100% CG) and then 2009's Avatar, which was a hybrid of "realistic" CG and actual real characters.

 

Unreal Engine has always hit a middle point between "this is playable" and "this is pretty", and unfortunately the gap between "playable" and "pretty" is massive. You can see this in existing UE games like Fortnite and Dead by Daylight, but also the Final Fantasy 7 Remake. Something looking prettier doesn't necessarily make for a better-playing game, and in many cases the developer will not actually go through the required effort to make a competitive multiplayer game "pretty", because nobody actually plays it at maximum quality anyway; it costs them "frames" in the game.

 

Honestly, that is why I like games that are stylized: they can look beautiful without being photorealistic, because they're not supposed to look realistic. It's why I can go back and play the older Borderlands games and they don't look nearly as bad as some old games that went for realistic graphics. Even look at Valheim, which is a super popular game; it's not photorealistic at all, but because it's stylized it still looks good, and the gameplay is really solid. I wish more game developers realized that gameplay is king at the end of the day. My favorite games of all time just had solid gameplay and had little to do with graphics.


Great but how well does it run on mobile?

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctura Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


The engine looks great, but just remember that we're still relying on game devs not being lazy and cutting corners, actually using the engine to its fullest, and making games that don't suck.

🌲🌲🌲

 

 

 

◒ ◒ 


I hope improvements come for the shader compilation stutter issue.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Niiiice.

 

One of my biggest gripes about game graphics, since I was a kid even, has been 'pop-in'... I HATE it... I mean REALLY don't like it. I do everything I possibly can to try to stop it. If the game doesn't have an in-game option for LOD settings, I find a mod or an option file to tweak to increase LOD quality so that 'pop-in' is less noticeable.

 

While I'm sure it will take many years for this 'Nanite' tech to work its way into the majority of games in one form or another, I look forward to it.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


14 hours ago, Mihle said:

Even if max quality on models doesn't improve, from my understanding Nanite still benefits how the game looks. The reason is that instead of every item having, for example, four models made by the creators, each at a different level of detail for a different distance from the player, the engine can adjust the level of detail automatically, so it's less of a pop between states and more of a gradual transition based on distance. The creators also don't have to make all the different models, I think?

I'm sure it will help, but it seems like such a minor improvement that most people won't even notice. It's great for reducing the workload of developers who'd otherwise have to create lower-poly models, that's for sure.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Photorealism isn't worth the spec requirement. I've come to a point where I just prefer cel-shaded games and pixel art.

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


19 hours ago, Kisai said:

You missed all of them. A game does not need 200 configurable variables. Nobody tweaks more than one or two of those variables, because your average "I just want to play the game" player wants it to look like the trailer, not like a PS2.

Since when is the average player everyone? Da fuck? I tweaked a ton of the RDR2 settings including some hidden in "Advanced" to get a stable Ultra 4K experience with my 3080.

20 hours ago, Kisai said:

You should not have to meddle with the settings to create a playable experience. If you are playing on an RTX 3090, a PS5, or a Series X, the output should be exactly the same.

No, just no. Everyone should get an enjoyable experience but higher end hardware should most certainly provide a more premium one. And we are sure as hell not giving up anything on PC due to consoles.

19 hours ago, Kisai said:

Nobody gives a care about any values between "Ultra" and "Low"; it's either "Ultra" or "off" for the four most GPU-intensive values, and everything else is left alone.

Just where do you get this from... come on, that's just BS. "High" should be a sensible preset for state-of-the-art hardware that works right out of the box at 60+ fps, and AFAIK that's currently the way it is.


1 hour ago, Dracarris said:

Just where do you get this from... come on, that's just BS. "High" should be a sensible preset for state-of-the-art hardware that works right out of the box at 60+ fps, and AFAIK that's currently the way it is.

That's not how it works, and you know it.

 

Any benchmarkable game will show you that "everything turned on" is the setting the developer intended you to play the game at, and no games are actually intended to be run at any setting below the current-generation console settings, which are the defaults. Games don't profile the hardware to check whether you are running a capable iGPU; they just see "Intel" and go "nope, this game will not work on your toaster, I don't believe it."

 

Both Final Fantasy XIV's original version and the entire Cyberpunk fiasco also prove it's entirely possible to misjudge what hardware people will desperately try to play the game on. Despite meeting the BS "minimum" requirements, neither game was playable at any setting without top-of-the-line hardware at release. If you try to "tune down" the game, what you get is a broken PS2-visual experience that nobody wants to play. I've seen videos of glitches from Cyberpunk that look like glitches I still see in GTA V. When your computer is not as capable as a game console, that's when you stop using your PC to play games.

 

I've seen people tune down or mod "competitive" games to the point that all that's left is geometry, with no textures or shaders. At some point you have to say that the tradeoffs for running a game on a potato or toaster of a computer are not worth it and you'd have a better experience on a Nintendo Switch.

 

Anyway, you seem to be of the mind that people spend an hour tuning their games. Nobody does that. The average person adjusts nothing and hopes the developer didn't design their game to run on hardware that doesn't yet exist. Short of adjusting for motion sickness, seizures, and color-blind accessibility, leave everything else alone if it's at 60 fps. You just don't know if you're going to get 60 fps consistently, or if the lighting shaders will tank performance for a minute every time you switch locations or walk through a door.

 

Like, good grief, I'm still seeing games that are new and games that are 12 years old with the same problem, where an effect used in some places in the game turns an otherwise 60 fps game into 5 fps. Let's see: The Stanley Parable: Ultra Deluxe (released this year, and made in Unity) does that in one place. Ghostbusters: Spirits Unleashed does it every time you fire. Fortnite does it every time you start a new match; there will be something like a 10-second lag as the map pops in. The latter two are Unreal Engine 4. The games are running on NVMe SSDs. The GPU is an RTX 3090. The problem is not the hardware.

 

Short of toggling the resolution and DLSS features, you have no reason to meddle. Let the game engine figure out which features are most important to maintaining 60 fps. It can track the game's performance over the last 240 seconds and dial back features until 60 fps is hit for that map, or dial features forward if it sees headroom for it.
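As a rough sketch of that "watch recent performance and dial features back" idea (hypothetical plain C++, not an actual engine or Unreal API; the 240-second window comes from the paragraph above, and the five quality levels are invented):

```cpp
#include <deque>

// Tracks frame times over a sliding time window and nudges a quality level
// (0 = lowest, kMaxLevel = everything on) toward a target frame rate.
class DynamicQualityScaler {
public:
    explicit DynamicQualityScaler(double targetFps, double windowSeconds = 240.0)
        : targetFps_(targetFps), windowSeconds_(windowSeconds) {}

    // Call once per frame with the measured frame time in seconds.
    void onFrame(double frameSeconds) {
        frames_.push_back(frameSeconds);
        total_ += frameSeconds;
        if (total_ < windowSeconds_) return; // not enough history yet

        const double avgFps = frames_.size() / total_;
        if (avgFps < targetFps_ && level_ > 0) {
            --level_;      // below target: dial a feature back
            resetWindow(); // re-measure from scratch at the new level
        } else if (avgFps > targetFps_ * 1.2 && level_ < kMaxLevel) {
            ++level_;      // clear headroom: dial a feature forward
            resetWindow();
        } else {
            total_ -= frames_.front(); // keep the window sliding
            frames_.pop_front();
        }
    }

    // The game maps this level to concrete toggles (motion blur, bloom, shadows, fog, ...).
    int qualityLevel() const { return level_; }

private:
    void resetWindow() { frames_.clear(); total_ = 0.0; }

    static constexpr int kMaxLevel = 4;
    double targetFps_;
    double windowSeconds_;
    std::deque<double> frames_;
    double total_ = 0.0;
};
```

The engine would call onFrame() every frame and, whenever qualityLevel() drops, switch off whichever feature it considers least important for that map or room.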

 

People whine and complain about telemetry tracking in software, but this is literately a case where we know (eg from the steam survey) what people have and can establish the base settings that work for that hardware (which is what Geforce Experience does) but those settings are not adjusted on a per-room or per-map setting, which they should be. Sometimes you want a really large map, and that will absolutely tank even the best GPU if it has infinite draw distance.


People said this the generation before, and before that, and before that, all the way back to the days when CRT lines made up a spaceship.

It's hard to get excited when real-world video game graphics are basically the equivalent of Apple saying 'the future is here' while charging a grand for a monitor stand that's more or less a metal stick.

When these features can run on a 1070 at 4K, call me, because THEN we've made strides in real-world video game graphics.

*Insert Witty Signature here*

System Config: https://au.pcpartpicker.com/list/Tncs9N

 


5 hours ago, SolarNova said:

One of my biggest gripes about game graphics, since I was a kid even, has been 'pop-in'... I HATE it... I mean REALLY don't like it. I do everything I possibly can to try to stop it. If the game doesn't have an in-game option for LOD settings, I find a mod or an option file to tweak to increase LOD quality so that 'pop-in' is less noticeable.

 

While I'm sure it will take many years for this 'Nanite' tech to work its way into the majority of games in one form or another, I look forward to it.

there is still "pop in", only that its more of visual noise that is generated. Which is something nice about DLSS 3 to FSR 3 if both are using "frame generation" to "smooth/stabilize" an moving image. Also custom LODs can be more performative in certain games, although cost a lot more time and resources, also balancing the LODs to their settings can be another issue as you mention.

