Three times the charm - New AMD CPU announcement + Big Navi teaser

williamcll
5 hours ago, LAwLz said:

Not sure what you mean. They lowered the resolution and image quality to reduce the GPU bottleneck. 

 

 

Of course price to performance matters. That's pretty much the only thing that should matter to 90% of all customers. 

That's why people loved Ryzen so much. It had good enough single core performance but way better multi-core performance for the price. 

If people start going "oh but single core performance is more important than price or multi-core performance" then I'll call them out as the hypocrites and fanboys they are if they have recommended any Ryzen CPU before this. 

Like Buildzoid said, even if an application had perfect scaling there would be very little difference because of the per-thread improvements. And how many applications actually scale perfectly?

 

Also, given that they beat Intel in everything now and probably won't have unlimited stock at launch, they would be crazy not to raise prices, IMO.

The CPU very much matters for gaming in certain titles and/or if you are aiming for high FPS.  A weaker CPU will bottleneck a strong GPU at higher FPS.

 

If you are aiming for 1080p at sub-100 FPS then yes, you'll probably be happy with a 3600 for a long time.  If you're aiming for 1080p 144Hz or 1440p 144Hz, I think there will be times you'll wish you had a top-of-the-line 8-core CPU.
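The bottleneck point above can be sketched with a toy model. All the FPS figures below are made up for illustration, not benchmarks; real games overlap CPU and GPU work in more complex ways:

```python
# Toy bottleneck model: the frame rate you actually get is capped by
# whichever component is slower. Numbers are hypothetical.
def achievable_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by the slower of the two components."""
    return min(cpu_fps, gpu_fps)

# 1080p high refresh: a strong GPU could render 200 FPS but a mid-range
# CPU caps out near 120 FPS, so the CPU is the limit.
print(achievable_fps(cpu_fps=120, gpu_fps=200))  # CPU-bound

# 1440p: GPU load rises, and the same CPU is no longer the limit.
print(achievable_fps(cpu_fps=120, gpu_fps=90))   # GPU-bound
```

This is why the same CPU can look "fine" at 60 Hz and become the limiting factor when chasing 144 Hz.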

 

Man, I wish AMD had released an 8-core 65W 5700X.  That's the part I was planning my build around, and I don't want to wait until next year to build.

 

 

 

 

 

 

Xeon E3-1241 @3.9GHz, 1.07V | Asus Z97-E/USB 3.1 | G.Skill Ripjaws X 8GB (2x4GB) DDR3-1600 | MSI RX 480 Gaming X 4GB @1350MHz/2150MHz, 1.09V/.975V | Crucial MX100 256GB | WD Blue 1TB 7200RPM | EVGA 750W G2 80+ Gold | CM Hyper 212+ w/ Noctua F12 | Phanteks Enthoo Pro M | Windows 10 Retail

27 minutes ago, gabrielcarvfer said:

And it does; that is why you can look for alternatives until the prices come down.

 

And still does.

 

Uh, everyone still recommended Intel for games/CAD/AI applications because they are/were waaaay faster, even though they have higher prices and multi-core performance isn't that great. You seem to be way too mad about something that doesn't make any sense. If you were going to buy for work, then buy it. If you were not, then save your money and buy something later with a better deal. 2020 isn't the best year to build/upgrade a PC anyway.

It still does as long as they keep making it.  If they want to kill the 3700X they've got to drop the price of the 5800X if they want to keep price/performance, which I think they do if they've got a brain in their head.  The major structural difference between the 3700X and the 5000 series is that the cache is a single pool: one big sheet of silicon, so yield might not be great.  With the cache as two separate chiplets they get much less defect waste.  What I might see them doing is making a version of the 5000 series with the split cache and selling it for a lower price before dropping the 3700X. It would be slower, but cheaper to make, and they could still keep price/performance.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

17 minutes ago, Bombastinator said:

It still does as long as they keep making it.  If they want to kill the 3700X they've got to drop the price of the 5800X if they want to keep price/performance, which I think they do if they've got a brain in their head.  The major structural difference between the 3700X and the 5000 series is that the cache is a single pool: one big sheet of silicon, so yield might not be great.  With the cache as two separate chiplets they get much less defect waste.  What I might see them doing is making a version of the 5000 series with the split cache and selling it for a lower price before dropping the 3700X. It would be slower, but cheaper to make, and they could still keep price/performance.

You can't attribute yield to CCX/cache structure.

3000 series: 8 cores per CCD, 32MB of L3 cache.

5000 series: 8 cores per CCD, 32MB of L3 cache.

A defect in the cache would affect both equally. Even the 6-core versions have the full cache. Dies with a core defect could be used to produce the 6-core and possible future lower-core-count versions. The unified CCX gives more flexibility there: with Zen 2, a die with good cores in a 4+2 configuration could only be offered as either 4+0 or 2+2, but with Zen 3 they can do 6 cores regardless.

 

A hypothetical 5700X could mirror the 3700X in lower clocks and lower power compared to the _800X parts. A major point of Zen 3 is the removal of the cache partitioning. Adding it back in makes no sense outside a scenario where they use two CCDs to do so, and that would only make sense if their CCD core yields were really bad and they had a load of dies with only 4 good cores.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

5 minutes ago, porina said:

You can't attribute yield to CCX/cache structure.

3000 series: 8 cores per CCD, 32MB of L3 cache.

5000 series: 8 cores per CCD, 32MB of L3 cache.

A defect in the cache would affect both equally. Even the 6-core versions have the full cache. Dies with a core defect could be used to produce the 6-core and possible future lower-core-count versions. The unified CCX gives more flexibility there: with Zen 2, a die with good cores in a 4+2 configuration could only be offered as either 4+0 or 2+2, but with Zen 3 they can do 6 cores regardless.

 

A hypothetical 5700X could mirror the 3700X in lower clocks and lower power compared to the _800X parts. A major point of Zen 3 is the removal of the cache partitioning. Adding it back in makes no sense outside a scenario where they use two CCDs to do so, and that would only make sense if their CCD core yields were really bad and they had a load of dies with only 4 good cores.

I guess I'm assuming that the cache for the 3700X is two pieces whereas the cache for the 5000 series is one piece. If you get a point fault on a 3700X you throw half the cache away, whereas with a 5000 you'd have to throw all of it away.  That's the cheapness gain of chiplets. If the 5000 series uses fewer or no chiplets it's going to be more expensive to make because there's more waste.


Hopefully the 5000 series gets discounted from MSRP as much and as quickly as the previous-generation Zen chips have been. Disregarding performance uplift and competition with Intel, the sticker shock for people moving from a previous x600 part to this generation's cheapest offering is gonna be rough.

6 hours ago, Bombastinator said:

I guess I'm assuming that the cache for the 3700X is two pieces whereas the cache for the 5000 series is one piece. If you get a point fault on a 3700X you throw half the cache away, whereas with a 5000 you'd have to throw all of it away.  That's the cheapness gain of chiplets. If the 5000 series uses fewer or no chiplets it's going to be more expensive to make because there's more waste.

Let's clear up the terminology first. A core chiplet is each piece of silicon containing the cores, also known as a CCD. In Zen 2 you have two CCXs, each with 4 cores and 16MB of L3 cache. The CCXs cannot talk to each other directly; they have to go off-chiplet and back again. As a result, each CCX has to be able to access its own share of local L3 cache to function effectively. In Zen 3 you have one CCX of 8 cores and 32MB of L3 cache. The 3600 and up CPUs are all offered with the full cache.

 

It gets more interesting in the lower parts. The China-special 3500 is apparently a 6-core with 16MB of total L3 cache. From what we know of CCXs, they have to be balanced or totally disabled. To get 6 cores, they would have to run two CCXs with 3 cores and 8MB of cache each, implying they disabled half the cache from each CCX rather than all the cache from one CCX while leaving the other untouched. Similar probably applies to the 3100, which runs 2 cores per CCX. The 3300X could be the scenario you describe, with 4 cores and the full cache of a single CCX only.

 

In short, it looks like AMD have the ability to partially disable the cache regardless of the CCX structure. Not having that CCX partition in Zen 3 is not a barrier to that. If anything, it gives them more flexibility in how they disable parts to produce offerings with each CCD below maximum core count, as there are fewer limitations in place.
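The salvage argument above can be illustrated with a toy model. The rule modelled here ("each CCX balanced, or one CCX fully disabled") is taken from the post itself; this is a sketch of that statement, not official AMD binning policy:

```python
# Zen 2 salvage rule (per the post above): active cores must be balanced
# across the two 4-core CCXs, or one CCX is disabled entirely.
# Zen 3 has a single 8-core CCX, so any good-core count can ship.
def zen2_max_sellable(good_ccx0: int, good_ccx1: int) -> int:
    """Best sellable core count from a Zen 2 die with the given good cores."""
    balanced = 2 * min(good_ccx0, good_ccx1)  # e.g. 2+2
    one_ccx = max(good_ccx0, good_ccx1)       # e.g. 4+0
    return max(balanced, one_ccx)

def zen3_max_sellable(good_cores: int) -> int:
    """Zen 3's unified CCX can ship however many cores tested good."""
    return good_cores

# The 4+2 die from the example: Zen 2 can only ship 4 cores (as 4+0 or
# 2+2), wasting two good cores; Zen 3 with 6 good cores ships all 6.
print(zen2_max_sellable(4, 2))  # 4
print(zen3_max_sellable(6))     # 6
```

The toy numbers show why the unified CCX wastes fewer good cores per harvested die, not how many dies actually fall into each bin.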


2 minutes ago, porina said:

Let's clear up the terminology first. A core chiplet is each piece of silicon containing the cores, also known as a CCD. In Zen 2 you have two CCXs, each with 4 cores and 16MB of L3 cache. The CCXs cannot talk to each other directly; they have to go off-chiplet and back again. As a result, each CCX has to be able to access its own share of local L3 cache to function effectively. In Zen 3 you have one CCX of 8 cores and 32MB of L3 cache. The 3600 and up CPUs are all offered with the full cache.

 

It gets more interesting in the lower parts. The China-special 3500 is apparently a 6-core with 16MB of total L3 cache. From what we know of CCXs, they have to be balanced or totally disabled. To get 6 cores, they would have to run two CCXs with 3 cores and 8MB of cache each, implying they disabled half the cache from each CCX rather than all the cache from one CCX while leaving the other untouched. Similar probably applies to the 3100, which runs 2 cores per CCX. The 3300X could be the scenario you describe, with 4 cores and the full cache of a single CCX only.

 

In short, it looks like AMD have the ability to partially disable the cache regardless of the CCX structure. Not having that CCX partition in Zen 3 is not a barrier to that. If anything, it gives them more flexibility in how they disable parts to produce offerings with each CCD below maximum core count, as there are fewer limitations in place.

But that does mean that a Zen 2 3700X is two chiplets and a Zen 3 5800 is one chiplet (or monolithic block?), so twice as big, and more silicon would be ruined if there is a flaw.  They could disable parts of flawed Zen 3 chips to make smaller-cache, lower-core-count chips like Intel does, but they wouldn't be the same chip any more.  There would be oddity chips.  It's possible the 5600 is one, I suppose.  The problem with the 5600 is it's 6/12 and slower because of the way the AMD boost function works.  The reason higher core count chips have higher boost clocks is that the boost function boosts more the more cores there are.


17 hours ago, LAwLz said:

That's a very clever way of typing "it happened once". The Nvidia 20 generation. But even that is at best a partial lie because some GPUs in the 20 generation did offer better price to performance than the 10 series.

True, and also because those had actual market competition. The GeForce 16 series was introduced precisely because of how bad a value the 20 series was in general. The problem, however, is the pricing you could actually buy these at: if you look at reviews and MSRP, it looks a lot better than it actually was. For someone actually looking to buy, the listed prices were dragging down performance per dollar the majority of the time, and for a generation where the improvement was borderline to begin with, it was more often below the previous generation than above it.

 

17 hours ago, LAwLz said:

stop limiting your comparisons to one specific thing (gaming) and you will see that this statement is untrue. We did get better price to performance. Gaming might have taken a small hit but nobody cared because the other performance metrics got a big boost. 

Why? It's very much what this forum cares about. There is zero reason for a customer looking to buy a CPU for gaming to care at all that the performance per dollar of, say, the 9900K is better than the 8700K's in Blender; for that customer it is irrelevant. And let's not ignore the primary target those CPUs' marketing is aimed at: gamers.

 

It's not like Intel doesn't have an entire product segment for professional and larger multi-threaded workloads. If I wanted to focus on that, I wouldn't be using the consumer desktop as my basis for discussion. However, to be fair, Intel's 10th gen and even 9th gen were for the most part competent at this, in part due to the high core counts from market competition and software making use of the iGPU.

 

But to say nobody cared? Heck no, a lot of people care about this workload; they just didn't have a better choice. Buy the best or don't; that doesn't leave much else for discussion, does it? So the people who were willing purchased the best.

 

17 hours ago, LAwLz said:

2) CPU is pretty much irrelevant for gaming so it's barely worth mentioning. Like I showed earlier, even a 200 dollar CPU is more than enough to keep up with a 1000 dollar GPU for gaming. If you are going to game on your pc then you do not want to buy an expensive CPU. The Ryzen 5000 series is way worse for gaming than the 3000 series because of the price being higher and the cheaper 3000 chips being more than enough for gaming. Your money is better spent on GPU than CPU if you only care about gaming. 

I never said anything counter to this, nor have I disagreed with your statement that the 3600 is still a better buy for most people. Simply buying a better GPU and/or turning up the graphics settings so the CPU is not a performance factor is basically always the better choice.

 

However, if you are not a current Ryzen system owner, or you are a 300-series owner, you really only have two options that actually make sense. The first is to not buy anything and wait for more Ryzen 5000 products to be announced; but if you need to purchase now for whatever reason, then a Ryzen 5000 purchase simply makes the most sense. It does not make sense to buy into the end of life of a platform and, while doing so, buy a generation-old product; you are never going to upgrade that CPU. The difference in total system cost is minimal, and unless you plan to upgrade as soon as Ryzen 6000 comes out, the 5000 will be the better purchase later, when it will matter.

 

17 hours ago, LAwLz said:

1) The 20 series got a lot of shit (rightfully so) for the prices. For some reason AMD isn't getting it for "doing the same thing" though. 

I will not pass judgement when I do not have the information on architecture improvements and I do not have independent reviews.

 

17 hours ago, LAwLz said:

With this Ryzen 5000 launch the price to performance has gone way down (if AMD's benchmarks are accurate). 

So far only significantly so for the 5600X, and doesn't it seem odd to declare the entire generation a dud off a single SKU?

 

5950X: ~7% increase for a claimed performance gain greater than this

5900X: ~10% increase for a claimed performance gain greater than this (3900 is OEM only)

5800X: ~13% increase (3800X) or 37% increase (3700X), nobody should be buying the 3800X so we'll go with the 3700X here. Most of AMD's comparisons for this one were with the 10700K but I'll go with value reduction on this one compared to what people were actually buying, the 3700X.

5600X: ~20% or 50% increase, both most likely a value reduction, one certainly.

 

Seems to me it's more accurate to say the 5600X is a worse value than the previous generation, not the entire 5000 series. But who really knows; I don't have reviews to look at yet.
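The arithmetic behind those percentages is just the break-even condition for price/performance: the performance uplift has to at least match the price increase. A quick sketch using the US launch MSRPs quoted from memory (treat the exact dollar figures as approximate):

```python
# To keep perf/$ flat: new_perf / new_price >= old_perf / old_price,
# so the required fractional performance gain is new_price/old_price - 1.
def required_gain(new_price: float, old_price: float) -> float:
    return new_price / old_price - 1

# Launch MSRPs in USD, from memory; may be slightly off.
pairs = {
    "5950X vs 3950X": (799, 749),
    "5900X vs 3900X": (549, 499),
    "5800X vs 3700X": (449, 329),
    "5600X vs 3600":  (299, 199),
}
for name, (new, old) in pairs.items():
    print(f"{name}: needs >{required_gain(new, old):.0%} more performance")
```

This is where the ~7%/~10%/~37%/~50% figures in the list above come from: each SKU has to beat its predecessor by at least that much just to break even on value.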

2 hours ago, Bombastinator said:

But that does mean that a zen2 3700x is two chiplets and a zen3 5800 is one chiplet (or monolithic block?)

The 3700X was a single CCD, so was the 3800X.

4 minutes ago, leadeater said:

The 3700X was a single CCD, so was the 3800X.

So it's monolithic?  Why even mess with Infinity Fabric?


9 minutes ago, Bombastinator said:

So it's monolithic?  Why even mess with Infinity Fabric?

No, the CPU cores are in the CCD; the memory controller, PCIe controller, etc. are in the IOD. You only need a second CCD if you go above 8 cores. You could do a two-CCD 8-core product, but AMD does not, because it would more likely be worse in performance due to cross-CCD communication; there would be some workloads where it would be better and some where it would be worse. Have a look at the 3100 and 3300X reviews to see this in action.

 

Edit:

You either have a 2-chip MCM product or a 3-chip MCM product. Only the APUs are monolithic.

 

The IOD is 12nm, on a much cheaper and extremely high-yield node, and its area is much larger than the CCDs'. The CCDs are 7nm on TSMC, which is more expensive and, from what I know, lower yielding than GloFo 12nm; you get more usable product this way. If you were to combine a single CCD with the IOD into an actual monolithic design, it would have lower yields and a higher cost to the consumer. And as you have just seen, it also allows AMD to update the CCDs with a new architecture without making any changes to the IOD at all; Ryzen 5000 uses the same IOD as the 3000 series with zero changes.
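The cost argument above can be made concrete with the classic Poisson die-yield model, yield = exp(-D0 × A). The defect densities and die areas below are illustrative guesses, not TSMC or GloFo figures:

```python
import math

# Poisson yield model: fraction of defect-free dies for a given defect
# density d0 (defects per mm^2) and die area (mm^2). All constants below
# are hypothetical, chosen only to show the shape of the trade-off.
def die_yield(d0: float, area_mm2: float) -> float:
    return math.exp(-d0 * area_mm2)

D0_7NM, D0_12NM = 0.003, 0.001    # assumed defect densities per node
CCD_AREA, IOD_AREA = 75.0, 125.0  # rough Zen 2-era die areas, mm^2

# Chiplet approach: a small 7nm CCD paired with a cheap 12nm IOD.
# (Reality is even more favourable: a bad CCD is discarded on its own
# instead of dragging a good IOD down with it.)
chiplet = die_yield(D0_7NM, CCD_AREA) * die_yield(D0_12NM, IOD_AREA)

# Hypothetical monolithic design: the whole area on expensive 7nm.
monolithic = die_yield(D0_7NM, CCD_AREA + IOD_AREA)

print(f"chiplet pair yield: {chiplet:.1%}")
print(f"monolithic yield:   {monolithic:.1%}")
```

Because yield falls exponentially with area, pushing the big, yield-insensitive I/O blocks onto a mature node wins even before counting wafer cost per mm².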

It will be interesting to add AMD's PassMark single/multi-core numbers in mid-November. I will update this post then.

 

[chart: Intel vs AMD CPU comparison]

 

The chart is in order of current (early October) UK price.

4 minutes ago, kingmustard said:

It will be interesting to add AMD's PassMark single/multi-core numbers in mid-November. I will update this post then.

 

[chart: Intel vs AMD CPU comparison]

 

The chart is in order of current (early October) UK price.

I thought Intel stopped providing coolers with their processors like several generations ago?

Just now, thechinchinsong said:

I thought Intel stopped providing coolers with their processors like several generations ago?

I'm happy to be corrected if anyone knows more?

3 hours ago, thechinchinsong said:

I thought Intel stopped providing coolers with their processors like several generations ago?

As a generalisation, it is my understanding that models with K or X in the name do not, but the ones lacking those letters do include a cooler.

 

Edit: I should have added, the above applies to the retail boxed product. OEM versions do not include a cooler.


9 minutes ago, porina said:

As a generalisation, it is my understanding that models with K or X in the name do not, but the ones lacking those letters do include a cooler.

For AMD it's 65W and below come with a cooler, at least for 5000 series anyway.

On 10/9/2020 at 4:56 PM, gabrielcarvfer said:

10900k is 650USD on Amazon. 

5900x is 550USD.

 

It beats the competition in basically everything, still 100USD cheaper and people are complaining... Not to mention the cheaper mobos.

 

I don't really get it.


 

That's listing a CPU that isn't even available and has been replaced (not officially, but still) by the i9-10850K, which can be had here in NLD for 450-465, around 100 bucks less than the 5900X; no doubt it will get even cheaper as time goes on. So comparing it to the 10900K was pretty stupid. And even then, here in NLD I can buy a 10900K for 515, which is still cheaper than the 5900X.

PC:
Motherboard: MSI B450 Gaming Pro Carbon AC | GPU: ASRock Radeon RX 6950 XT Phantom Gaming D 16G
CPU: Ryzen 7 5800X3D | Monitor: 2560x1440 144Hz (LG 32GK650F)
CPU cooler: Arctic Liquid Freezer II 240 A-RGB | PSU: Seasonic Focus Plus Gold 850W
Case: Cooler Master MasterBox MB511 RGB | Memory: Kingston Fury Beast 32GB (2x16GB) DDR4 @ 3600MHz
Keyboard: Corsair K95 RGB Platinum | Mouse: Razer Viper Ultimate

11 hours ago, kingmustard said:

I'm happy to be corrected if anyone knows more?

Check out what porina said here. Also, I was under the impression that models with F in the name also don't have coolers, but that might just be me remembering things wrong.

8 hours ago, porina said:

As a generalisation, it is my understanding that models with K or X in the name do not, but the ones lacking those letters do include a cooler.

 

Edit: I should have added, the above applies to the retail boxed product. OEM versions do not include a cooler.

If anyone else has something to add about box coolers, feel free to chime in.

1 hour ago, thechinchinsong said:

Check out what porina said here. Also, I was under the impression that models with F in the name also don't have coolers, but that might just be me remembering things wrong.

F only means no integrated graphics. 


7 minutes ago, divito said:

Has there been any word regarding them releasing something like a 3300x for Zen 3?

There was something like a year between the announcement of the 3700X and the 3300X, so I wouldn't hold my breath.


3 minutes ago, Bombastinator said:

There was something like a year between the announcement of the 3700X and the 3300X, so I wouldn't hold my breath.

Yeah, I kind of figured it might happen similarly to before as they flesh everything out; I just haven't been following the news very closely, so I wasn't sure if they had anything planned in that regard or announced in any of their materials.

7 minutes ago, divito said:

Yeah, I kind of figured it might happen similarly to before as they flesh everything out; I just haven't been following the news very closely, so I wasn't sure if they had anything planned in that regard or announced in any of their materials.

I personally would have taken the announcement of a 4/8 chip to imply they think 4/8 will be sufficient for future gaming.  I take the announcement of the 6/12 chip to imply they think that might be sufficient.  I still don't know, though.


2 hours ago, divito said:

Has there been any word regarding them releasing something like a 3300x for Zen 3?

As the lower-end chips don't really bring in profits, they are always the last ones.

Also, typically a lower-end chip is a collection of dies that can't deliver what is expected from the premium chips, cut down to be sold as a lower-end part. (Unless it is to meet demand; then it could be a fully functional one, purposefully cut down. But usually, if demand is expected to be more than temporary, they would just make a cheaper version.)

 

So, going with Zen 3:

We can guess that the 6-core variant is one 8-core CCX where 1 or 2 of its cores are broken (they don't work, or can't operate at their expected speeds); the faulty cores are disabled, and the die is sold as a 6-core part. If we compare the 5800X with the 5600X, we can see that the 5800X is an 8-core CPU and the 5600X is a 6-core one.

 

So unless they start making, by design, a native 6-core variant and a 4-core variant, which is possible... I mean, the 6-core version might meet demand as production gets better, but the faulty ones could be sold as a different kind of quad core with different speeds and cache, and the truly broken quad cores might be sold as dual cores branded as the Athlon series, say.

 

A lot of possibilities are open. But these are based on the current situation and what they predict the market will want to buy. For example, if in a month TSMC is outputting zero or too few faulty chips to be sold as quad cores, but the 6-core is still popular enough, maybe they'll make a true quad-core CCX, especially if they know it will sell like hot cakes and be worth producing.

Edited by GoodBytes
Correction to what I said.