
Leaked MacBook Air GB5 benchmark shows score higher than 16-inch MacBook Pro; single-core higher than 5950X

Solved by Spindel:

*DISCLAIMER* All pictures below are stolen from the Affinity forum.

 

Since apparently Geekbench is bad, let's look at the Affinity benchmark.

 

This is an i9-10900 with an RTX 2070 Super

[Affinity benchmark screenshot]

 

 

 

This is a 3900X with a GTX 1080

[Affinity benchmark screenshot]

 

 

This is the M1

[Affinity benchmark screenshot]

 

 

38 minutes ago, saltycaramel said:

I've already said it, but I think the battery life on these M1 laptops will blow people's minds, because it will be drained in a more "iPad-like" way...

Only heterogeneous cores and complete control over both the hardware and the OS afford that...

iPad Air A14 5nm - 10hrs of web browsing

MBP 13” M1 5nm - 17hrs of web browsing

 

Now that they are no longer "dog years and human years", the direct comparison of the above ratings provided by Apple could mean all-day battery life for the MBP... some people may even change how and where they use their laptop.

This, and the iPad's legendary battery saving mode. You can go weeks without using an iPad, open it one day and all your notifications are there with 20% battery still left. This type of power management is now coming to a laptop.


11 minutes ago, NotTheFirstDaniel said:

This, and the iPad's legendary battery saving mode. You can go weeks without using an iPad, open it one day and all your notifications are there with 20% battery still left. This type of power management is now coming to a laptop.

Honestly, that is far more useful than battery run times that realistically aren't required. The reason people like those longer run-time figures is that they also extend the heavy-usage run time, which is the thing people actually care about; you are never away from the ability to charge for that long except in extremely rare and infrequent circumstances.

 

Being able to put your laptop away fully charged and then open it after 24-48 hrs with it still at 99.9% charge? Hell yes, that is better and useful.


1 hour ago, saltycaramel said:

I've already said it, but I think the battery life on these M1 laptops will blow people's minds, because it will be drained in a more "iPad-like" way...

Only heterogeneous cores and complete control over both the hardware and the OS afford that...

iPad Air A14 5nm - 10hrs of web browsing

MBP 13” M1 5nm - 17hrs of web browsing

 

Now that they are no longer "dog years and human years", the direct comparison of the above ratings provided by Apple could mean all-day battery life for the MBP... some people may even change how and where they use their laptop.

If they made this into a 2-in-1 I could see the point, but tbh most of the time when I don't want to hassle with plugging something in, I also don't want to have a laptop, as it's hard to use while standing. I am more likely to use an iPad, and I'd rather have better battery life on that than on a laptop, simply because I can actually use an iPad on the move. Unfortunately, using a laptop while standing and moving around a lot is impractical even if it has the battery life to support it. I mean, the majority of my coworkers used iPad Pros when out in the field because you can use one while standing and moving.


10 minutes ago, Brooksie359 said:

If they made this into a 2-in-1 I could see the point, but tbh most of the time when I don't want to hassle with plugging something in, I also don't want to have a laptop, as it's hard to use while standing. I am more likely to use an iPad, and I'd rather have better battery life on that than on a laptop, simply because I can actually use an iPad on the move. Unfortunately, using a laptop while standing and moving around a lot is impractical even if it has the battery life to support it. I mean, the majority of my coworkers used iPad Pros when out in the field because you can use one while standing and moving.

They did make it into a 2-in-1: the iPad Pro with the new magnetic keyboard thing.


Just now, Bombastinator said:

They did make it into a 2-in-1: the iPad Pro with the new magnetic keyboard thing.

I guess that sorta works, but tbh it wouldn't change much for me then, because I would still opt for the iPad Pro in that case. I guess the one plus side is that maybe more apps and programs will be compatible with the iPad in the future due to Apple switching their laptops over to ARM-based processors.


1 hour ago, Brooksie359 said:

I guess that sorta works, but tbh it wouldn't change much for me then, because I would still opt for the iPad Pro in that case. I guess the one plus side is that maybe more apps and programs will be compatible with the iPad in the future due to Apple switching their laptops over to ARM-based processors.

Not sure what the one plus side is. Might be a typo. Might refer to a line of Android phones.


1 minute ago, Bombastinator said:

Not sure what the one plus side is. Might be a typo. Might refer to a line of Android phones.

I'm saying that the iPad might benefit indirectly from Apple's move to ARM-based chips in their laptops. More apps and programs will be made to run natively on ARM-based CPUs in the future, allowing them to run on the iPad as well.


43 minutes ago, Brooksie359 said:

I'm saying that the iPad might benefit indirectly from Apple's move to ARM-based chips in their laptops. More apps and programs will be made to run natively on ARM-based CPUs in the future, allowing them to run on the iPad as well.

I think they've already put Final Cut Pro and Xcode on the roadmap for the iPad, or at least that's what the leaks say. Maybe it will come at the next WWDC.


18 hours ago, Brooksie359 said:

Like I have said before, I fundamentally disagree with that statement. The majority of the people I know aren't complaining about their laptop's battery life and wanting a different laptop as a result. The reason being that most people plug their laptops in, or go without being plugged in for a couple of hours at a time, not 20 hours. Battery life for a tablet or a phone makes sense to me, as you typically use those on the go, but for a laptop you really need to be sitting down somewhere to use it, and oftentimes you can go and plug it in at that point.

Let's agree to disagree. You are in the minority here that doesn't care about battery life. And you don't seem to be getting the point. I've already told you (and many others have reiterated it since) that the battery life of the device will extend across the board. For me, I don't have to worry about ever bringing a charger over the weekend. For others, it's a little more uptime for the intensive tasks they do. Either way, it's a laptop, and people like to carry it around.

 

It is an important metric for a lot of people. I'd suggest someday you actually try using a laptop with good battery life. It will probably change the way you think.

Quote

Obviously having more battery life is good, but not when it comes at the expense of not running an x86 chip, and having to hassle with compatibility workarounds like Rosetta and potential performance penalties in x86-oriented workloads.

Not this again, sigh. Apple's entire pitch here is that their chips are going to be better than x86. They aren't splitting their Macs into ARM and Intel lines, each having its own strengths. Rather, they're aiming to transition completely and beat Intel with their own chips, meaning they expect their new Macs to be better in all ways than if Intel chips were put in them.

 

Sure, the transition period may be slightly rough, but with Apple's influence (something only Apple can pull off) most maintained apps will switch to run on ARM, and eventually this will become a non-issue in about a year or two. This is something like growing pains. And for x86 apps Apple has made a translation layer, and you as a user don't need to bother with any of this. If the app runs fast, great. If Rosetta performance doesn't live up to their claims and is a deal breaker for you, wait until the ARM version comes out before getting a new MacBook. Either way, all the issues you stated are things that we all know are going to get fixed. And hopefully it's all worth it in the end in terms of battery life and performance, which Apple seems very confident in.
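
As an aside, Rosetta 2 is transparent enough that a translated app generally can't tell unless it asks the OS. A minimal sketch in C of the check Apple documents for this: the sysctl.proc_translated sysctl returns 1 under translation, 0 when native, and fails where it doesn't exist (e.g. on Intel Macs running older macOS).

#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

/* Returns 1 if running under Rosetta 2 translation, 0 if native,
   -1 if the sysctl is unavailable on this system. */
static int process_is_translated(void) {
    int ret = 0;
    size_t size = sizeof(ret);
    if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1)
        return -1;
    return ret;
}

int main(void) {
    switch (process_is_translated()) {
        case 1:  printf("Running x86_64 code under Rosetta 2\n"); break;
        case 0:  printf("Running natively\n"); break;
        default: printf("Translation status unavailable\n"); break;
    }
    return 0;
}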


Oh now on to the next thing.

 

In a thread I made some weeks ago I was lamenting the death of the dedicated sound card. In that thread I said the next thing to go was the dedicated GPU.

 

Of course, I got more (angry) replies on my footnote about the coming death of the dGPU than on my lament for the dedicated sound card.

 

Looking at the performance numbers (that have come out) for the M1 iGPU, and at its power envelope, I guess I'll stand by the dGPU becoming something that will disappear.


33 minutes ago, Spindel said:

Oh now on to the next thing.

 

In a thread I made some weeks ago I was lamenting the death of the dedicated sound card. In that thread I said the next thing to go was the dedicated GPU.

 

Of course, I got more (angry) replies on my footnote about the coming death of the dGPU than on my lament for the dedicated sound card.

 

Looking at the performance numbers (that have come out) for the M1 iGPU, and at its power envelope, I guess I'll stand by the dGPU becoming something that will disappear.

It might disappear in laptops. I don't think it will on desktops, especially on the PC side. My reasoning is stated below.

 

As sound cards improved, the improvement was noticed by fewer and fewer people. These days most people are completely fine with a cheap pair of earphones and the built-in sound card in pretty much all devices, making a dedicated sound card superfluous to most people.

 

The GPU is a different thing. Every year we make advancements in silicon power efficiency, and what happens is iGPUs get more and more powerful. But at the same time, you also have the option to clock the architecture to eleven and throw power at it to get more performance (a rough sketch of that power cost follows this post). So as long as that is the case, we will have dedicated GPUs, on desktops at least.

 

Improvement in GPU speed is something of value for the foreseeable future. In gaming, I doubt we'll ever stop wanting better graphics until every game basically looks like a live-action movie (and we're far from that). For GPU-accelerated tasks, things that take days or hours now should ideally take seconds in the future. Even then, we'll come up with workloads even more demanding (like live path tracing) that just aren't possible with current technology.

 

In laptops, yes, I think it will disappear (the way it already has in most 13" devices). But on desktops, where that extreme performance is more likely to be used, we'll always want to overclock whatever new tech we have to the limit, thereby requiring massive coolers like the RTX 3090's.
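
To put a number on "clock it to eleven and throw power at it": the classic CMOS rule of thumb is that dynamic power scales roughly with frequency times voltage squared, and higher clocks usually require higher voltage, so power grows much faster than performance. A small illustrative sketch in C; the 1.3x clock / 1.15x voltage figures are made-up example values, not measurements of any real chip.

#include <stdio.h>

/* CMOS dynamic power rule of thumb: P ~ f * V^2.
   Returns relative power for a given clock and voltage scaling. */
static double rel_power(double f_ratio, double v_ratio) {
    return f_ratio * v_ratio * v_ratio;
}

int main(void) {
    /* Hypothetical: +30% clock needing +15% voltage costs ~72% more power. */
    printf("1.30x clock at 1.15x voltage -> %.2fx power\n",
           rel_power(1.30, 1.15));
    return 0;
}

Which is exactly the desktop trade-off described above: performance keeps scaling as long as you can afford the power and the cooling.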


1 hour ago, Spindel said:

Oh now on to the next thing.

 

In a thread I made some weeks ago I was lamenting the death of the dedicated sound card. In that thread I said the next thing to go was the dedicated GPU.

 

Of course, I got more (angry) replies on my footnote about the coming death of the dGPU than on my lament for the dedicated sound card.

 

Looking at the performance numbers (that have come out) for the M1 iGPU, and at its power envelope, I guess I'll stand by the dGPU becoming something that will disappear.

This has happened before. There was a period where there were math coprocessors to go along with CPUs. They eventually got absorbed into the CPU. If you consider a discrete card to be effectively a coprocessor, graphics cards are in some ways math coprocessors as well.


1 hour ago, Bombastinator said:

This has happened before. There was a period where there were math coprocessors to go along with CPUs. They eventually got absorbed into the CPU. If you consider a discrete card to be effectively a coprocessor, graphics cards are in some ways math coprocessors as well.

Most people get away just fine with an Intel iGPU or AMD APU as it is right now; my work PC with a 10900 and 32 GB RAM has no dGPU. The thing is, an APU or SoC is never going to replace a high-end dGPU for high-end usage; power and cooling just make it impractical to go that route in devices that are not space-constrained, and losing the ability to upgrade it as a PCIe device is less desirable.

 

You either need a dGPU or you don't. I seriously do not think people realize this has already been a thing for a long time; most computers do not have dGPUs, so this isn't something that is coming or might happen, it happened like 8 years ago or more. Lots of software already makes use of these integrated GPUs, and more is doing so; if GPU acceleration is required only for optimization reasons rather than raw performance, utilizing them is the optimal choice as they are already present.


1 hour ago, Bombastinator said:

This has happened before. There was a period where there were math coprocessors to go along with CPUs. They eventually got absorbed into the CPU. If you consider a discrete card to be effectively a coprocessor, graphics cards are in some ways math coprocessors as well.

They can wrangle my dedicated math coprocessor from my cold dead hands, along with my dedicated blitter, dedicated 2D card, dedicated 3D accelerator, dedicated physics card, and dedicated memory controllers :P


3 hours ago, Spindel said:

Looking at the performance numbers (that have come out) for the M1 iGPU, and at its power envelope, I guess I'll stand by the dGPU becoming something that will disappear.

That can only ever happen if demand for high-performance graphics and compute disappears, which it won't. High-end GPUs have 10x or greater performance than the M1, and this disparity isn't going to close up at all, not in low-power SoC implementations. Even in high-power SoC implementations it's still going to be a 5x or greater performance disparity. GPU performance and development are not static, so the M1X, M2, or whatever future product will be judged relative to future GPUs and future software demands.

 

The AMD Ryzen 4700G and even other lower-end and previous-generation APUs have roughly the same GPU performance as the Apple M1 (the M1 is a bit higher and also a newer arch than the mobile Vega used currently), theoretical performance anyway. When AMD gets their act together and gets Navi-based APUs onto the market, they will be a fair bit faster than current products. Intel Xe mobile is much faster than current AMD APUs and the Apple M1, btw.
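
For reference, those theoretical numbers fall out of a simple formula: FP32 TFLOPs ≈ shader ALUs × 2 ops per clock (an FMA counts as two) × clock speed. A back-of-the-envelope sketch in C; the ALU counts and clocks are commonly reported figures, not official specs.

#include <stdio.h>

/* Theoretical FP32 throughput: ALUs x 2 ops/clock (FMA) x clock in GHz,
   divided by 1000 to convert GFLOPs to TFLOPs. */
static double tflops(int alus, double clock_ghz) {
    return alus * 2.0 * clock_ghz / 1000.0;
}

int main(void) {
    /* Commonly reported: M1 has 8 GPU cores x 128 ALUs at ~1.278 GHz;
       the 4700G's Vega 8 has 512 shaders at 2.1 GHz. */
    printf("Apple M1 (8-core GPU): %.2f TFLOPs\n", tflops(1024, 1.278)); /* ~2.62 */
    printf("Ryzen 4700G (Vega 8):  %.2f TFLOPs\n", tflops(512, 2.100)); /* ~2.15 */
    return 0;
}

Which lands right on the ~2.6 TFLOP figure quoted later in the thread, and shows why the two parts look comparable on paper.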


5 minutes ago, leadeater said:

That can only ever happen if demand for high-performance graphics and compute disappears, which it won't. High-end GPUs have 10x or greater performance than the M1, and this disparity isn't going to close up at all, not in low-power SoC implementations. Even in high-power SoC implementations it's still going to be a 5x or greater performance disparity. GPU performance and development are not static, so the M1X, M2, or whatever future product will be judged relative to future GPUs and future software demands.

 

The AMD Ryzen 4700G and even other lower-end and previous-generation APUs have roughly the same GPU performance as the Apple M1 (the M1 is a bit higher and also a newer arch than the mobile Vega used currently), theoretical performance anyway. When AMD gets their act together and gets Navi-based APUs onto the market, they will be a fair bit faster than current products. Intel Xe mobile is much faster than current AMD APUs and the Apple M1, btw.

I agree that Apple silicon is going to need some sort of discrete coprocessor attachment system. The last one they had was eGPU via Thunderbolt, which wasn't great, but it was something.


5 minutes ago, Bombastinator said:

I agree that Apple silicon is going to need some sort of discrete coprocessor attachment system. The last one they had was eGPU via Thunderbolt, which wasn't great, but it was something.

Well, I think they only need that in Mac Pros; most software doesn't really require a very high-end GPU to get the desired benefit. You can get very poor scaling in applications outside of pure compute tasks; even a lot of rendering software doesn't see massive gains going from mid-tier to high-end GPUs. So no dGPU, with a competent SoC all the way up to the high-spec iMac, should be fine.

 

Edit:

A lot of the time, larger VRAM is what's actually required and why someone opts for a higher-end GPU. An SoC with larger or expandable unified memory solves that need.


2 minutes ago, leadeater said:

Well, I think they only need that in Mac Pros; most software doesn't really require a very high-end GPU to get the desired benefit. You can get very poor scaling in applications outside of pure compute tasks; even a lot of rendering software doesn't see massive gains going from mid-tier to high-end GPUs. So no dGPU, with a competent SoC all the way up to the high-spec iMac, should be fine.

The problem there is Mac Pros are wildly expensive. So expensive they're out of range for a lot of pros. I would say that about anything with "Pro" in the name, excepting the iPad Pro. That may be what you meant.


17 minutes ago, Bombastinator said:

The problem there is Mac Pros are wildly expensive. So expensive they're out of range for a lot of pros. I would say that about anything with "Pro" in the name, excepting the iPad Pro. That may be what you meant.

No, because the M1 SoC already has a 2.6 TFLOP GPU in it, and that's a very low-power part; an Apple SoC with a much higher power target, designed for higher-end systems, will have a much more performant GPU in it. And like I mentioned, a lot of software doesn't scale well with upper-end GPU performance; it just needs some kind of GPU acceleration, and then it's all about the VRAM. So there isn't really a reason for an iMac to have a dGPU option when you have, say, a 50W-80W SoC in it with 5-8 TFLOPs of GPU performance, or equivalent performance scaled with time.

 

Edit:

There is a very big difference between delivering ~5 TFLOPs of GPU and the current ~30 TFLOPs at the high end; it's not just power but the overall design and requirements, things like memory design. SoCs will lose their appeal if they end up giant in size or monster power users when fully loaded up.


4 minutes ago, leadeater said:

No, because the M1 SoC already has a 2.6 TFLOP GPU in it, and that's a very low-power part; an Apple SoC with a much higher power target, designed for higher-end systems, will have a much more performant GPU in it. And like I mentioned, a lot of software doesn't scale well with upper-end GPU performance; it just needs some kind of GPU acceleration, and then it's all about the VRAM. So there isn't really a reason for an iMac to have a dGPU option when you have, say, a 50W-80W SoC in it with 5-8 TFLOPs of GPU performance, or equivalent performance scaled with time.

If one needs more, they need more. There has never been a time I've seen where "this is all you will ever need" turned out to be true. Sure, if you want to keep power consumption low, such a design is required, but not everyone needs that. People may need more performance more than they need low power. Will this work for the M1? Maybe. It is enough to make me not want to buy an M1, even if I could use it for most things. Will it work for enough people, and am I simply to be turned out on the street as unacceptable? Perhaps. I don't see it as a guarantee though. Apple has a wider audience than just people who play on phones. Maybe they don't want that audience; it's kind of looking that way atm. If the M2 doesn't have it, it will need a variant that does.


9 minutes ago, Bombastinator said:

There has never been a time I’ve seen where “this is all you will ever need” turned out to be true.

Have a look at CAD applications and video production software; higher-end performance scaling on these is not good at all.

 

Example:

[chart: Adobe Premiere Pro GPU benchmark scores, Puget Systems]

 

https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro---NVIDIA-GeForce-RTX-3070-3080-3090-Performance-1951/

 

Yes, the RTX 3090 is faster, but have a look at the RTX 2060 Super. For these workloads, VRAM (if you need it) is far more important than GPU performance.

 

Edit:

Point being, an Apple SoC at 50W-75W could very well score fairly similarly to these GPUs in this application, making VRAM the only factor to consider. Apple isn't stopping at 15W SoCs. I'm not talking about the M1 or 15W; this is a point about Apple being able to make SoCs for higher-end products that mean a dGPU isn't required, which is already a thing now, Intel Quick Sync for example.


30 minutes ago, leadeater said:

Have a look at CAD applications and video production software; higher-end performance scaling on these is not good at all.

 

Example:

[chart: Adobe Premiere Pro GPU benchmark scores, Puget Systems]

 

https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pro---NVIDIA-GeForce-RTX-3070-3080-3090-Performance-1951/

 

Yes, the RTX 3090 is faster, but have a look at the RTX 2060 Super. For these workloads, VRAM (if you need it) is far more important than GPU performance.

 

Edit:

Point being, an Apple SoC at 50W-75W could very well score fairly similarly to these GPUs in this application, making VRAM the only factor to consider. Apple isn't stopping at 15W SoCs. I'm not talking about the M1 or 15W; this is a point about Apple being able to make SoCs for higher-end products that mean a dGPU isn't required, which is already a thing now, Intel Quick Sync for example.

Can the power of a given coprocessor be incorporated into a non-discrete system? Sure, it has been many times. The problem, though, is that capacity WILL eventually need to be expanded for something. If the capacity to extend is provided, it will not be used by the majority of people; the real problem is NO capacity. There remains the section of people who DO need that extension now, as well as the section of people who may not actually need that capacity but are unwilling to do without it. The question is whether those two groups together represent a large enough section of users to cause a problem for Apple. I think they do. It may even be a majority.


28 minutes ago, Bombastinator said:

Can the power of a given coprocessor be incorporated into a non-discrete system? Sure, it has been many times. The problem, though, is that capacity WILL eventually need to be expanded for something. If the capacity to extend is provided, it will not be used by the majority of people; the real problem is NO capacity. There remains the section of people who DO need that extension now, as well as the section of people who may not actually need that capacity but are unwilling to do without it. The question is whether those two groups together represent a large enough section of users to cause a problem for Apple. I think they do. It may even be a majority.

Have you forgotten we are talking about Apple, and my chosen example for the end point (product tier) was the iMac? You aren't upgrading GPUs in iMacs ever, or in any Apple product other than the Mac Pro. The only expandable Apple product is the Mac Pro.

 

Having an SoC doesn't stop support for external GPUs, but those are of questionable benefit and see very limited usage in reality.

 

By the time the hardware in your iMac starts to impede your workflow, it's time to upgrade the entire thing, not a component of it, which is generally impossible anyway.

 

So what capacity are you talking about? I've shown you that a GPU with 5 times the raw performance only results in 10% more application performance; this is not atypical of this kind of software (a quick Amdahl's-law sketch after this post shows why). So if by capacity you mean VRAM, an SoC doesn't stop expandable RAM from being implemented in the design, either solely off-package (unlike the M1) or as a hybrid of on-package RAM and off-package RAM (that you can expand as required).

 

The largest group of Apple users are laptop users, where expandability is already nonexistent. A sub-100W SoC can entirely replace everything found in all Apple Mac products from the iMac/iMac Pro down, with the single exception of anyone doing GPU compute, and I posit that to be a very, very small group of Apple users on Apple Mac devices.

 

A 100W SoC will not replace high-end dedicated GPUs, and the need for them will not go away, but those GPUs and Apple products outside of the Mac Pro never meet each other.
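
The scaling claim above has a simple explanation in Amdahl's law: if only a small fraction of the workload is actually GPU-bound, making the GPU faster barely moves the total. A sketch in C; the ~11% GPU-bound fraction is back-solved from the "5x raw performance, ~10% application gain" observation, not a measured figure.

#include <stdio.h>

/* Amdahl's law: overall speedup when a fraction p of the workload
   runs on a component that is s times faster. */
static double speedup(double p, double s) {
    return 1.0 / ((1.0 - p) + p / s);
}

int main(void) {
    /* Assumed: ~11% of the pipeline is GPU-bound. A 5x faster GPU then
       yields only ~1.10x overall, matching the scaling described above. */
    printf("5x GPU, p = 0.11 -> %.2fx overall\n", speedup(0.11, 5.0));
    return 0;
}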


Just now, leadeater said:

Have you forgotten we are talking about Apple, and my chosen example for the end point was the iMac? You aren't upgrading GPUs in iMacs ever, or in any Apple product other than the Mac Pro. The only expandable Apple product is the Mac Pro.

 

Having an SoC doesn't stop support for external GPUs, but those are of questionable benefit and see very limited usage in reality.

 

By the time the hardware in your iMac starts to impede your workflow, it's time to upgrade the entire thing, not a component of it, which is generally impossible anyway.

 

So what capacity are you talking about? I've shown you that a GPU with 5 times the raw performance only results in 10% more application performance; this is not atypical of this kind of software. So if by capacity you mean VRAM, an SoC doesn't stop expandable RAM from being implemented in the design, either solely off-package (unlike the M1) or as a hybrid of on-package RAM and off-package RAM (that you can expand as required).

 

The largest group of Apple users are laptop users, where expandability is already nonexistent.

That iMacs don't have upgradable GPUs is true. They're basically laptops pretending to be desktops. I think attempting to market iMacs to pros is a prime reason why the Mac computer (as opposed to phone) market has done so poorly. Pros look at those things and view them as shit. Apple as a whole has done well, because their phone thing worked out, but their computer business used to have a higher percentage market share than their phone business does now. These days it's still relatively very small. "We've gone from 2% to 6%! Everyone throw confetti!" is imho stupid. It used to be LOTS higher.


20 minutes ago, Bombastinator said:

Pros look at those things and view them as shit.

No they don't; how many have you talked to? We have an entire creative arts department and teach hundreds of students video production and 3D modeling every year, and they love the iMacs and iMac Pros. I know a few photographers; they love their iMacs and MacBook Pros. In fact, I currently know zero people who have the current-generation Mac Pro. We used to have a single lab of Mac Pros, the trash can model, but that was replaced with iMac Pros.

 

20 minutes ago, Bombastinator said:

"We've gone from 2% to 6%! Everyone throw confetti!" is imho stupid. It used to be LOTS higher.

 

20 minutes ago, Bombastinator said:

but their computer business used to have a higher percentage market share than their phone business does now

 

When?

[attached image]

