
The Point of G-SYNC?

TheRuke

G-sync only really shines if you can't hold framerates above 60fps. Having v-sync on with no framerate dips below 60fps gives you the same experience as G-sync (unless you are sensitive to the inherent ~16ms of input lag you get with v-sync enabled). As Linus said, G-sync will shine when it's in 4K monitors, where maintaining playable framerates is an issue. The problem is that below 30fps, where it would go a long way towards smoothing out the stop-start effect you get at those framerates, G-sync doesn't work.

 

It's an exciting, innovative concept, but its benefits are being a little overstated. I'd like to see all monitors/displays use some kind of dynamic refresh rate in the future though; it's all pros and no cons. I'm confident scalers that output a dynamic refresh rate could be designed to work with any video input device (which will be needed for widespread adoption of this tech). The only problem I see is that there isn't much incentive for display manufacturers to do this across all their products (especially expensive high-res IPS panels aimed at professional or prosumer users, where G-sync shows no benefit, unless a manufacturer goes out on a limb to differentiate themselves), as the benefits are only there when the video input has a variable frame rate (i.e. games). Only when the cost isn't much more than a typical scaler will they become common.
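A quick sketch of the arithmetic behind the post above (plain Python, illustrative only; the helper names are my own, not any real API):

```python
import math

# A 60 Hz panel refreshes every 1/60 s, and with v-sync a frame can only
# appear on a refresh boundary, so its on-screen time rounds up to a
# whole number of refresh intervals.

def refresh_interval_ms(refresh_hz: float) -> float:
    """Time between monitor refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def vsync_frame_time_ms(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective on-screen time of one frame under v-sync: rendering
    that misses a refresh waits for the next one."""
    interval = refresh_interval_ms(refresh_hz)
    return math.ceil(render_ms / interval) * interval

print(refresh_interval_ms(60))   # ~16.7 ms: the 'inherent 16ms' figure
print(vsync_frame_time_ms(10))   # a fast render is still shown for ~16.7 ms
print(vsync_frame_time_ms(20))   # misses one refresh -> ~33.3 ms (30fps cadence)
```

This is why v-sync framerates on a 60Hz panel snap to 60, 30, 20... rather than degrading smoothly.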


Use normal v-sync for that game and it should fix the problem.

Seems kinda weird that would happen since Adaptive V-sync should be turning V-sync off since it's only running at 30FPS.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch



The tearing tells me that v-sync isn't enabled. This is because, as you know, adaptive v-sync turns v-sync off when framerates are under 60fps. With a game that is locked at 30fps you want to use normal v-sync, as it will sync at 30FPS (with each frame displayed for 2 refresh cycles), so you won't get any tearing.

 

 

 

Just to add because I feel like typing lol;

When normal v-sync drops below 60fps you get stutter, which also means added input lag (because a frame remains on screen for more than one refresh cycle while the monitor waits for the GPU to finish rendering the next frame).

When adaptive v-sync drops below 60fps you get tearing (the monitor stops waiting for a complete frame and displays whatever is ready at the next refresh cycle).

 

You'd want to use adaptive v-sync in most situations: if you remain at or near 60fps and only dip below during particularly GPU-intensive moments.
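The rule described above can be summed up in a tiny sketch (hypothetical helper, my naming, not any real driver API):

```python
# Adaptive v-sync as described: at or above the panel's refresh rate it
# behaves like v-sync on (no tearing); below it, v-sync is dropped,
# trading stutter for tearing. This is why a game hard-locked to 30fps
# tears under adaptive v-sync on a 60Hz panel.

def adaptive_vsync_state(fps: float, refresh_hz: float = 60.0) -> str:
    if fps >= refresh_hz:
        return "vsync on"   # frames wait for the next refresh; no tearing
    return "vsync off"      # frames shown immediately; tearing possible

print(adaptive_vsync_state(75))  # vsync on
print(adaptive_vsync_state(30))  # vsync off -> use normal v-sync instead
```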


Wow lol its G Sync for starters and it is not a gimmick its awesome it might not be in phones or tvs and linus never said it would be he said it could be get your facts straight and if u can honestly play with V-sync with the terrible input lag u deserve a medal! 

 

Its really funny how people still don't completely understand what this is. lol

 

1. Stop white-knighting. It's sad.

2. Use punctuation or any kind of structure in your text so people can read it.

3. You base your assumption of "what this is" on stuff you heard from Linus, who clearly lost track of reality in this matter. Which invalidates your, not sure what to call it, let's say "argument".

 

It's obvious you're so pro, only V-Synch is holding you back from becoming a CoD billionaire. because THIS IS SPARTA G.Synch.

 

And as V-Synch is holding back your "true potential". The whole input lag thing is bullshit. Your monitor will adapt to 30fps. You still only see half to 1/4 of the frames someone with a better config sees. Your mouse will have the same input lag, as your monitor will refresh 1/factor its scaling.

 

Don't embarrass yourself with such posts. How old are you, 9?!

Frost upon these cigarettes.... lipstick on the window pane...


And as V-Synch is holding back your "true potential". The whole input lag thing is bullshit. Your monitor will adapt to 30fps. You still only see half to 1/4 of the frames someone with a better config sees. Your mouse will have the same input lag, as your monitor will refresh 1/factor its scaling.

 

The input lag you get with v-sync is because the monitor has to wait for an entire frame to be drawn before it is displayed, so you are seeing your input from at least one refresh cycle ago. When framerates dip below 60fps with v-sync on you are getting at least 2 refresh cycles of input lag, and depending on how far along the GPU is in rendering a frame, the monitor may have to wait even longer. With perfect v-sync on a 60Hz panel (i.e. framerate >60fps at all times) you are getting at least 16ms of input lag, and it only gets worse in multiples of this. Monitors with higher refresh rates go some way towards alleviating this, as the monitor doesn't wait as long between refreshes. G-sync updates the monitor's image as soon as a frame is drawn by the GPU, so you don't get the input lag introduced by the monitor waiting for the next refresh cycle before that frame can be shown.

 

Input lag will always exist to some extent: a game engine receives input and then the GPU renders a frame based on that input, so rendering necessarily happens after the input is received. What G-sync does is eliminate the added latency that occurs further down the pipeline, in the monitor.
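A rough model of that monitor-side wait (illustrative Python; real pipelines have more stages, and the numbers are idealized):

```python
# With v-sync, a frame that finishes rendering mid-refresh sits in the
# buffer until the next refresh tick; with G-Sync the panel refreshes
# the moment the frame is ready, removing that wait entirely.

def vsync_wait_ms(finish_offset_ms: float, refresh_hz: float = 60.0) -> float:
    """Time a finished frame waits for the next refresh tick, given how
    far into the current refresh interval rendering finished."""
    interval = 1000.0 / refresh_hz
    return interval - (finish_offset_ms % interval)

def gsync_wait_ms(finish_offset_ms: float) -> float:
    return 0.0  # panel refreshes on demand; no monitor-side wait

# A frame that finishes 3.3 ms into a 60 Hz refresh interval:
print(vsync_wait_ms(3.3))   # ~13.4 ms of added monitor-side lag
print(gsync_wait_ms(3.3))   # 0 ms
```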


AMD: 'Here Nvidia, if you'd like to use Mantle, it's open source.'

NVIDIA: 'We are too proud to do that.'

I will care about this if they EVER show any real-game Mantle benchmarks. I would consider it worth it even if it were just Battlefield right now, but they have shown nothing.


If it were a free feature, it would be game-changing. $175 for less tearing in some scenes? Not worth it imo (at least compared to a 144Hz monitor)...

 

Also, input lag will always be limited eventually by the liquid crystal response time itself.

 

I don't get why people are comparing G-Sync to 60Hz V-Sync... It should be compared to at least 120Hz V-Sync; the same logic applies to comparing a 780 to a 7770.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ



 

It will reduce the input lag on your monitor, but the game itself will receive the mouse input at the same speed it would without G-Sync, so there is nothing but a visual change. That's what I meant to say.



I will care about this if they EVER show any real-game Mantle benchmarks. I would consider it worth it even if it were just Battlefield right now, but they have shown nothing.

Mantle is supposed to come out this month.

| Case: NZXT Tempest 210 | CPU: Intel Core i5 3570K @ 3.9 Ghz | GPU: ASUS ROG STRIX GTX 1070 | RAM: Crucial Ballistix Tactical 8GB |

| Mouse: Zowie FK1 | Monitor: Acer 21.5' | Keyboard: CoolerMaster Stealth w/ Brown Switches |

#KilledMyWife - #LinusButtPlug - #1080penis

 


It will reduce the input lag on your monitor, but the game itself will receive the mouse input at the same speed it would without G-Sync, so there is nothing but a visual change. That's what I meant to say.

Not exactly: G-sync ties input lag to the framerate (e.g. 40fps means 40Hz, and therefore a 25ms refresh cycle). The alternatives are v-sync off, where tearing means one frame spans more than one refresh cycle and your input is spread over 32ms, or v-sync locking framerates at 30fps, where you still get at least 16ms of input lag, and occasionally 32ms.

 

Again, all this just shows that gsync will be really good in some situations but pretty useless in others.
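The 25ms figure above falls straight out of the frame time (illustrative arithmetic; assumes a 60Hz panel for the v-sync cases):

```python
# With G-Sync one refresh cycle equals one frame time, so monitor-side
# latency tracks the framerate directly.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_time_ms(40))  # G-Sync at 40fps: panel runs at 40Hz -> 25.0 ms cycle
print(frame_time_ms(30))  # v-sync locked to 30fps on 60Hz: ~33.3 ms cycle
# v-sync off avoids the wait but tears: parts of two frames share one refresh.
```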


Wow you guys are hating on gsync hard! Why do you do this?

You have not tried it. Anyone here who says it's not good/waste of money/etc please try it somewhere and THEN hate on it.

This will be very important for VR, because currently you have to use v-sync for VR; stuttering or tearing in VR will make you vomit faster than putting a fist down your throat.

But the problem with v-sync is lag, so since G-sync has less lag than v-sync it is an instant win for VR. Also, you don't have to keep a constant 120FPS (yes, I believe 120Hz is essential for VR; I've tried the Oculus Rift at 60Hz, and a higher framerate is VERY important, even more so than a higher resolution), but you can drop a little without stuttering your way down to 60FPS.

I have not tried G-sync yet, but I am looking forward to doing so, and I don't plan on buying a video card or monitor that doesn't support it.

EDIT: Please find one person that has tried gsync and hasn't seen a difference.

AMD FX8320 @3.5ghz |  Gigabyte 990FXA-UD3  |  Corsair Vengeance 8gb 1600mhz  |  Hyper 412s  |  Gigabyte windforceR9 290  |  BeQuiet! 630w  |  Asus Xonar DGX  |  CoolerMast HAF 912+  |  Samsung 840 120gb


2 WD red 1tb RAID0  |  WD green 2tb(external, backup)  |  Asus VG278He  |  LG Flatron E2240  |  CMstorm Quickfire TK MXbrown  |  Sharkoon Fireglider  |  Audio Technica ATH700X


#KILLEDMYWIFE


Please find one person that has tried gsync and hasn't seen a difference.

Please find one demo where they had a good card running over 60FPS and locked it at 60FPS. There's probably no difference. G-SYNC just makes your experience smoother no matter how high your framerate is, while reducing lag (which only applies to weaker cards; I would suggest getting a better card instead of buying a whole new monitor).



This will be very important for VR... I don't plan on buying a video card or monitor that doesn't support gsync.

G-Sync isn't compatible with VR setups.

 

I was also wrong about the pricing. It's actually going to be $602.75, compared to $399 as previously rumored. That places it $335.76 more than a non-G-Sync enabled monitor, which costs $266.99. (Source)

 

If I wanted smooth motion, I'd just get the $266.99 144Hz monitor with a capable graphics card. And if that weren't enough... I'd add another graphics card, which rewards me with raw power that I can use to accelerate more than just graphics.

 

Also, I can understand the SSD-versus-HDD comparison being drawn with G-Sync, but G-Sync isn't tenfold better than a 144Hz monitor.



G-Sync isn't compatible with VR setups.

 

I was also wrong about the pricing. It's actually going to be $602.75, compared to $399 as previously rumored. That places it $335.76 more than a non-G-Sync enabled monitor, which costs $266.99. (Source)

 

If I wanted to get smooth motion, I'd just get the $266.99 144Hz monitor with a capable graphics card. And if that wasn't enough... I'd add another graphics card, which rewards me with raw power that I can use to accelerate more than just graphics.

 

Also, I can understand the comparison drawn between an SSD and HDD with G-Sync, but G-Sync isn't 10 folds better than a 144Hz monitor.

 

LOL that price.

So Nvidia is basically making you buy a new GPU. Except you don't get a gpu but a 30c chip.

I still don't get how this should be as awesome as Linus is praising it to be. My guess is he really wants that Nvidia meet up, get a higher tier of cooperation and get moar moneyz.

If you've seen Slick's reaction, it basically says "it's OK" while Linus is going apeshit for no reason. But then Anand is too fond of Nvidia to be 100% credible, and since Linus is too fond of Anand, this kind of goes hand in hand.



DISCLAIMER: When referring to gsync I include other technologies that might come up in the future that also sync the refresh rate of the panel to the framerate the GPU is outputting.

(I hope someone will make something that's open source so that AMD, nVidia and Intel can use it)

Please find one demo where they had a good card running over 60FPS and locked it at 60FPS. There's probably no difference. G-SYNC just makes your experience smoother no matter how high your framerate is, while reducing lag (which only applies to weaker cards; I would suggest getting a better card instead of buying a whole new monitor).

 

With 4K coming up we won't have performance as good as we have now, so it is very important to still get a fluid gaming experience. This is where G-sync comes in: it makes any framerate look smooth.

And if you can't feel the difference in input lag between v-sync on and v-sync off/G-sync at 60Hz, you are about as sensitive to lag as a brick.

 

G-Sync isn't compatible with VR setups... If I wanted to get smooth motion, I'd just get the $266.99 144Hz monitor with a capable graphics card.

It is meant for VR setups; why do you think John Carmack was at the Nvidia event? He even said that G-sync is great for VR.

Having a capable graphics card with a 144Hz monitor doesn't fix the issues.

Either you run 144fps with v-sync on without any stutter, but then you have input lag to worry about and you have to turn details down very far to keep the framerate that high.

Or you turn v-sync off and get tearing and stuttering no matter what framerate you are running at, but less input lag.

Dat price though xD

It'll come down in price once they can put it on a dedicated microchip; the current implementation is very ghetto and thus pretty expensive. We'll have to see how prices change.

Adding another GPU might add raw performance, but it also introduces incompatibility and stuttering beyond the stuttering that is there anyway due to the refresh rate not being synced to the framerate.

For more details on why G-sync is needed, here are some slow-mo videos of how monitors actually behave:

 

No v-sync (until 32:21; this is running below the refresh rate of the monitor, and as you can see you get tearing below the refresh rate too ;)):

http://youtu.be/KhLYYYvFp9A?t=31m1s

 

Triple-buffered v-sync:

http://youtu.be/KhLYYYvFp9A?t=35m14s



<snip>

G-Sync is a great technology. I cannot wait for AMD to come out with something similar. The price Nvidia is asking is way too high for it to be a "killer" feature.

 

Here are some (poorly) edited photos to show how 120Hz is better than 60Hz:

(images: frame-timing comparison at 60Hz vs 120Hz)

The point is, the torn and stuttered frames are only displayed for 8ms instead of 16ms. That's much more acceptable, as seen in the AnandTech review. There is also an input lag reduction, which is great.

Obviously, the ideal is zero torn or stuttered frames, but that isn't worth $335 (plus the MSRP premium that comes with an Nvidia card).
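The 8ms-versus-16ms point, spelled out (simple arithmetic; the helper name is my own):

```python
# A torn or repeated frame persists for one refresh interval, and that
# interval shrinks as the refresh rate climbs, so the same artifacts
# are on screen for less time on faster panels.

def persistence_ms(refresh_hz: float) -> float:
    """How long one refresh (and any artifact in it) stays on screen."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: ~{persistence_ms(hz):.1f} ms per refresh")
```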

 

I hope this technology comes back as a standard.



1. Stop white-knighting... 3. You base your assumption of "what this is" on stuff you heard from Linus... Don't embarrass yourself with such posts. How old are you, 9?!

Lmao!

 

1. Im not in school your not my teacher don't tell me what to do!!!

2. I don't base my assumption on what linus has said i base it on what ive seen and know! Nvidia has showed it off!

3. I don't even play Pc cod I play xbox for cod not enough people play pc competitively!

 

So once again wrong with your assumptions and im the one embarrassing myself lol! Im not forcing u to get believe in it or get it but im allowed to have my own opinion so quit crying like a little bitch shows how much of a pussy u really are in the real world lol. 


1. Im not in school your not my teacher don't tell me what to do!!!

 

"I'm not in school and you're not my teacher..."

Perhaps you should be.

 

 

2. I don't base my assumption on what linus has said i base it on what ive seen and know! Nvidia has showed it off!

 

You have seen G.Synch on a regular monitor. Magic exists after all.

 

 

3. I don't even play Pc cod I play xbox for cod not enough people play pc competitively!

 

You play CoD. And on a XboX.

My assumption was not only right but also got enforced.




omfg wrong again lol u still show how immature u are by still fighting me over things completely off topic and all because i disagree with you haha get a life! 


Torvalds on the new Linux kernel -> GTFO Nvidia :D +50% AMD, +0% Nvidia, yay SteamOS ^^

 

 

On Topic:

 

G.Synch is a gimmick.

 

Linus is fangirling out in his video. It's gross, unprofessional and sad. He's probably chasing his wife and his employee with a roll of tape and the G.Synch module as we speak.

 

G.Synch will go away as quickly as 3D did.

 

Linus' prediction that G.Synch will be everywhere from TVs to Smartphones to what ever is not only uneducated but simply wrong. The general computer user has no need for G.Synch. And therefore no company is going to mass produce G.Synch TVs, Phones and Monitors as neither the consumer nor they themselves are eager to pay Nvidia fees.

 

Might be a fun technology, but Linus really exaggerates. Hard.

 

Edit:

 

Slick's reaction says it all: "Ya... less tearing in that one room and.... ... ... "

 

 

Do you have Skype, Mumble, or Ventrilo installed? I want to talk to people like you who spread misinformation and confuse people by not knowing a thing about how anything works or what it's useful for. G-Sync is REVOLUTIONARY for games, particularly emulation.

Gamers don't LIKE tearing and input lag. Gamers WANT games like the arcade version of Mortal Kombat II to run smoothly at its 54.7Hz native refresh rate. Only a psychopath like you would HOPE for tearing and input lag.
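The emulation example above, made concrete (back-of-envelope model; assumes the panel simply repeats the latest frame when no new one is due):

```python
# Mortal Kombat II's arcade board runs at ~54.7 fps. A fixed 60 Hz panel
# refreshes more often than new frames arrive, so some refreshes must
# re-show the previous frame, producing periodic judder. A variable-
# refresh panel can instead run at 54.7 Hz and show every frame once.

def repeated_frames_per_second(content_fps: float, panel_hz: float = 60.0) -> float:
    """Refreshes each second that re-show an old frame when the panel
    outpaces the content."""
    return max(0.0, panel_hz - content_fps)

print(repeated_frames_per_second(54.7))  # ~5.3 duplicated frames every second
```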


G-sync only really shines if you can't hold framerates above 60fps... Only when the price reaches a point that the cost isn't much more than a typical scaler will they become common.

 

That does it. Pull over.

 

Give me your Skype information. I'm TIRED of people saying things like "G-sync is only good for x or x." WOW you need to get off of my planet. If Carl Sagan were alive he'd make an entire SERIES on how revolutionary G-Sync is because IT IS and if you deny it you might as well deny evolution or the CURVATURE OF THE EARTH. WOW I'm sure glad deadbolts were invented for doors.



 

I'm not saying it is 'only' good in certain situations, just that its benefits are more pronounced in certain situations.

