AMD Reorganises GPU Division - Forms 'Radeon Technologies Group'

HKZeroFive

From what I have been hearing, this is bigger than just GPUs: API implementation, collaboration, VR, software extensions, expanded design, and custom implementations. It feels more like rallying the troops for the initial landings on a whole new market.


They did let the mobile (as in phones) Radeon go. It's now known as Adreno (an anagram of Radeon), lol.

And as soon as they let it go, it became a massive success: Adreno GPUs ship in Qualcomm's Snapdragon SoCs, which power a huge share of smartphones.

It's kind of sad.

 

 

Adreno GPUs are in the Droid Turbo, the S5, and the Note 5, but I believe the Exynos GPUs in the S6/S6 Edge/S6 Edge+/Galaxy Note 5 are also from AMD.

Samsung's latest phones use Mali GPUs. They are from ARM, not AMD.

 

 

The Galaxy S6's are all ARM Mali cores. (530 iirc)

T760

With 8 cores to be precise.


-irrelevant snip-

 

 

T760

With 8 cores to be precise.

Yep. http://www.anandtech.com/show/9146/the-samsung-galaxy-s6-and-s6-edge-review/6

 

As previously mentioned, the Galaxy S6 uses a Mali T760MP8 clocked at 772 MHz

 

 

http://www.notebookcheck.net/ARM-Mali-T760-MP8.140006.0.html

 

The ARM Mali-T760 MP8 is a fast mobile graphics solution that can be found in ARM SoCs like the Samsung Exynos 7420 Octa (Galaxy S6)... Besides OpenGL ES 3.1, the GPU supports OpenCL 1.1 as well as DirectX 11. According to ARM, the Mali-T760 can be scaled from 1 - 16 cores/clusters.

 
The MP8 version offers 8 clusters clocked at up to 772 MHz (302 GFLOPS)

Ensure a job for life: https://github.com/Droogans/unmaintainable-code

Actual comment I found in legacy code: // WARNING! SQL injection here!


If AMD dissolves, either the graphics group will continue on its own or someone will absolutely buy it. The market position, graphics IP, and technology the Radeon Technologies Group holds are too important to too many; be it Intel, Valve, or Samsung, someone will want them.

 

But I doubt anyone would buy the desktop x86-64 CPU business. Someone might be interested in the IP rights to AMD's x86-64 instruction set, but I seriously doubt anyone would venture into the declining desktop CPU market. It would have been a great move years ago when the dual- and quad-core madness began, but today there is not much demand for new CPUs with more cores.

Think about it: as a casual user or gamer, do you need anything more than an i5 or an FX-8350? No (more would be nice, but you don't need it). What would a company gain from investing many millions in 8-16 core, 3-4 GHz desktop CPUs that applications don't use properly because of poorly parallelised code? Most people wouldn't upgrade, since they would gain no benefit in day-to-day tasks and gaming, and DX12/Vulkan will further reduce how much CPU games need, so there is no growth there. If CPUs were as efficient as massively parallel GPUs at rendering, using every drop of horsepower to push more frames, then more cores would be a huge improvement; sadly they are not, and probably won't be for some time, if ever.

 

We need a massive leap in desktop CPU architectures before this market sees significant growth again. The home multitasking/gaming PC has more or less reached its limits; it doesn't need many more cores.


If AMD dissolves, either the graphics group will continue on its own or someone will absolutely buy it... -snip-

I am by no means good at economics, but selling their GPU division seems like a bad move to me. Isn't it basically their only profitable division? Selling it would give them a short-term cash boost, but in the long run they would probably lose money.


If AMD dissolves, either the graphics group will continue on its own or someone will absolutely buy it... -snip-

Nvidia wants x86_64 bad. It needs a way to diversify and gain the ubiquity ARM just doesn't have.

 

I've done a fair bit of work on CPU-based physics engines (400 lines of Unreal 4's lighting engine are mine as of build 4.8, and another 600 in the reflection engine, most of that dedicated to liquid reflections). Algorithmically, it's quite possible to implement much of a rendering or compute pipeline with AVX instructions. It wouldn't be difficult, just meticulous.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Honestly, AMD is basically out of the CPU market at this point. If they're going to get back in, Zen is gonna have to be really fuckin' impressive.

Hell man, if Zen really is the next best thing, I'd have no problem recommending it to people; if it's that good, I might even get one. But obviously, we still don't know much at all.

4690K // 212 EVO // Z97-PRO // Vengeance 16GB // GTX 770 GTX 970 // MX100 128GB // Toshiba 1TB // Air 540 // HX650

Logitech G502 RGB // Corsair K65 RGB (MX Red)


This is probably the best news in a while for the graphics side of AMD. Ever since AMD acquired ATI, they have tied ATI's hands with red tape so AMD could merge the CPU and GPU together. Now that AMD has achieved its goal of hUMA and HSA, it can finally take the plastic bag off ATI's head and let it breathe again. They have Zen on the way, and now they need a graphics division working at full efficiency to push their next goals forward by 2020, both in embedded SoC solutions and in high-wattage APUs. I'm amazed at how well AMD has done, and is still doing, considering the decade-long game of Twister it has been playing internally. Can't say it enough: this is good news.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


But the desktop x86_64 CPU business i doubt anyone would buy it,

 

China or Russia, I'd imagine, could be interested. Not that I think AMD/ATI is splitting up, or that it would be a good idea for them to do so.


China or Russia I'd imagine could be interested.

 

AMD wouldn't be allowed to sell it to them. But it wouldn't be an issue, as some other company here (likely Nvidia) would buy it.


Intel would outbid them.

Are you sure? I think Intel could easily make its own high-power graphics cards if it really, really wanted to. Besides, Intel doesn't really have the... charisma that ATI and Nvidia do.

"LATEST NEXT GEN GRAPHICS FOR UNSTOPPABLE AWESOME GAMING, CRUSH YOUR FOES WITH THE LATEST (GT740) GRAPHICS."

"UNLEASH YOUR INNER GAMER!!! DESTROY THE OPPOSITION!!!" "INCREDIBLE SPEED WITH ALL NEW FX PROCESSORS. 5GHz OF RAW POWARRRR"

 

Intel is not like that.


Guys/Gals,

 

No... AMD didn't do this because they're preparing for bankruptcy; AMD just received a large injection of liquidity. This was done because AMD is preparing for 2016/2017, for a major comeback.

 

Raja Koduri was the chief architect of the R300 at ATI. He is now in charge of the Radeon graphics division. On the CPU side there's Jim Keller, the architect behind the K7 and K8 processors, working on Zen. (PS: Apple went to sh!t because these two left Apple, not because Steve Jobs died.)

 

Splitting the two gives an engineer, Raja, more control over the graphics division. The only person he reports to is Dr. Lisa Su, and Su is very much centred on engineering rather than marketing and PR. This gives Raja far more flexibility in pushing the Radeon division towards engineering feats.

 

2016/2017 should prove to be rather interesting years.

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)


Are you sure? I think Intel could easily make its own high-power graphics cards if it really wanted to... -snip-

No, Intel's problem is AMD's and Nvidia's troves of patents, which it has to dance around or buy access to (it pays Nvidia a hefty sum each year for access to its portfolio, though as DirectX 12 compatibility shows, Intel is building its own IP and isn't simply copying designs, because Skylake graphics have one or two features Maxwell doesn't). And high-powered dGPUs would present very easy profits for Intel and give it a much firmer position from which to fight IBM in supercomputing and scale-up server markets.

 

No, Intel is a lot like me. I know how good I am, but I'm keenly aware of my weaknesses. While you're tearing a rival apart, I'm sneaking behind you both to deliver the killing blow to whoever remains in the end. Meanwhile, I'm practicing my combat skills and honing my weapon.



No... AMD didn't do this because they're preparing for bankruptcy... -snip-

That is only a rumor, not confirmed.



Have you checked how much Valve makes? They stopped doing anything; Steam is a cash cow like none other.

Yes, and outside Steam they haven't done anything tangible since Portal 2.

CS:GO? Dota 2? The '90s called! -_-


Yes, and outside Steam they haven't done anything tangible since Portal 2... -snip-

If my market was making me $1 million in sales monthly, I wouldn't bother touching anything work-related ever again :D

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


If my market was making me $1 million in sales monthly, I wouldn't bother touching anything work-related ever again :D

Eventually GOG is going to become a serious problem for Valve. I wouldn't be so complacent in Gabe Newell's position if he intends to keep raking it in.



If my market was making me $1 million in sales monthly, I wouldn't bother touching anything work-related ever again :D

If, for some unexplained reason, Steam stopped working for a month, Valve would shit itself.

 

With that much cash, you would think they wouldn't need a company like HTC (a company in deep shit) to make the Vive VR headset.

Steam Machines? A total flop!

SteamOS? What are they waiting for, Windows 20?!


Yes, and outside Steam they haven't done anything tangible since Portal 2... -snip-

 

Yeah, it seems a lot of their IP has been left for dead. ;)


 


I am by no means good at economics, but selling their GPU division seems like a bad move to me. Isn't that basically their only profitable division? Selling it would give them a boost in cash short term but in the long run they would probably lose money from it.

It is, but if they keep throwing money from the GPU division into the dying CPU business, the GPU division will eventually go extinct; just look at the Nvidia vs Radeon market share, it's embarrassing really.

They need each division to sustain itself, the main two being x86 server/desktop and GPU.

 

Nvidia wants x86_64 bad. It needs a way to diversify and gain the ubiquity ARM just doesn't have.

I've done a fair bit of work on CPU-based physics engines... -snip-

 

Probably; I'm no expert, but I just don't see anyone providing great bang-for-the-buck desktop CPUs if they buy AMD's x86 business, since it has little growth potential. I might be wrong; I hope I'm wrong.

That sounds like great work; UE4 is a really great engine. I'm trying to teach myself C++ and programming in general, but failing hard at it.

As for AVX2, I've read about it and I'm really surprised it's not in every game engine (even the OS) already. Basically all the game code that deals with floating-point operations could benefit greatly, since everything you see on screen in a 3D game comes down to vector and matrix calculations, exactly what AVX2 accelerates most. AVX2 + DX12 would seriously boost CPU performance in games, but outside Haswell and newer there is little AVX2 support, so I guess that's why nobody uses those instructions in games so far. It's sad that new CPU tech (instructions) is the least adopted; we hardly even get adoption of new graphics APIs.


It is, but if they keep throwing money from the GPU division into the dying CPU business... -snip-

If you want to learn C++, there are a few good books you should go to. Self-teaching it is a bit of a fool's errand, given how much meticulous underpinning structure and complexity is left over from the language's inception (you must have a header file with declarations, or every function called must be declared above its caller). BRB with my favourite titles (hooray for Amazon).

 

Intro C++: http://www.amazon.com/gp/product/0321563840?ref_=wl_mb_recs_1_title

Intro C++: http://www.amazon.com/gp/product/0201704315?colid=1ATAWRFDY0G4T&coliid=I2XT6DCQVWSH9R&ref_=wl_it_dp_o_pd_nS_ttl

 

Those two books should be read in tandem to get both the technical and the practical, as well as common (but moderately difficult) problems solved in C++.

 

Engineering-Level Problems: http://www.amazon.com/gp/product/0201615622?colid=1ATAWRFDY0G4T&coliid=IBMAEIE2HM9UL&ref_=wl_it_dp_o_pd_nS_ttl

Technical, Code Patterns & Improvement: http://www.amazon.com/gp/product/1491903996?colid=1ATAWRFDY0G4T&coliid=I2QXP0WCDGCVPJ&ref_=wl_it_dp_o_pd_nS_ttl

 

Those two together once you're comfortable are a crash course in making the language work for you and training your brain to see algorithmic solutions.

 

Intro to Game Programming: http://www.amazon.com/gp/product/0990582906?colid=1ATAWRFDY0G4T&coliid=I3S1N0BK68WD0E&ref_=wl_it_dp_o_pC_nS_ttl

Mathematical Approaches to Programming: http://www.amazon.com/gp/product/0321942043?colid=1ATAWRFDY0G4T&coliid=I9HEYF68V603R&ref_=wl_it_dp_o_pC_nS_ttl

 

Those two give you a good taste of going toward a data or heuristics-intense path or towards a creative/"games" track.

 

Native C++ Parallelism: http://www.amazon.com/gp/product/1933988770?colid=1ATAWRFDY0G4T&coliid=I13LXYIZMNU2P3&ref_=wl_it_dp_o_pC_nS_ttl

Generic Parallel Algorithms/Programming Patterns: http://www.amazon.com/gp/product/0124159931?colid=1ATAWRFDY0G4T&coliid=I31Y9MWW1IECRL&ref_=wl_it_dp_o_pd_nS_ttl

OpenMP (no dedicated book, but you should know it exists and how to use it): http://bisqwit.iki.fi/story/howto/openmp/

Intel's Thread Building Blocks: http://www.amazon.com/gp/product/0596514808?colid=1ATAWRFDY0G4T&coliid=I3488PSBEHVU4G&ref_=wl_it_dp_o_pd_nS_ttl

 

Those four are tough but very useful reads, and they are where I learned the bulk of what I know about parallel design. OpenMP specialises in synchronous design (but can do both) and is supported by all the major compilers (MSVC 2015 is still only at OpenMP 2.0, even though 4.0 has been out since 2013). Intel's Threading Building Blocks specialises in asynchronous design and ships with Intel's C/C++ compiler, though it also works with GCC and Clang.

 

Multi and Many-Core Programming Patterns: http://www.amazon.com/dp/0128021187/ref=wl_it_dp_o_pd_nS_ttl?_encoding=UTF8&colid=1ATAWRFDY0G4T&coliid=I162N9RMPB7W2Z

Many-Core Parallel Programming (Xeon Phi focus, but applicable to much more, including Intel's integrated graphics architecture): http://www.amazon.com/gp/product/0124104142?colid=1ATAWRFDY0G4T&coliid=I230JFI6PH9RNW&ref_=wl_it_dp_o_pC_S_ttl

GPGPU in OpenCL 2.0 (2.1 edition should be out soon) http://www.amazon.com/gp/product/0128014148?colid=1ATAWRFDY0G4T&coliid=I2DMIH81D0LY5R&ref_=wl_it_dp_o_pC_nS_ttl

 

Those three are the most difficult reads and should only be undertaken when you are comfortable with parallel modes of thought.

 

And of course, no computer scientist is complete without a good algorithm book. http://www.amazon.com/Algorithm-Design-Jon-Kleinberg/dp/0321295358/ref=sr_1_1?s=books&ie=UTF8&qid=1441841431&sr=1-1&keywords=kleinberg+tardos

 

Sorry I don't have a good data structures book for you. I've never found one I liked.



It is, but if they keep throwing money from the GPU division into the dying CPU business... -snip-

 

You cannot factor in GPU market share without mentioning consoles anymore. The market share figures you're quoting are quarterly shipment reports, not an installed user base. You're correct to point out that over the past year or so, nVIDIA has been growing its share of the dGPU segment. You would think that this would translate into developers partnering with nVIDIA for upcoming DX12 titles... but that's not the case.

 

Developers have, in the majority, been partnering with AMD instead. Why? Because the money is in the consoles. Developers are looking to learn how to code close to the metal on GCN hardware on the console side, and that is translating into GCN-centric optimisations. On the PC side, any DX12-compatible card will do. The GCN-centric optimisations are staying, but separate vendor-ID paths are being written for nVIDIA hardware (based on certain features Kepler and Maxwell v1 lack).

 

It's not entirely embarrassing, because even when AMD had the Radeon HD 5870 and nVIDIA was "struggling" in benchmarks, AMD wasn't making a killing, and it wasn't winning developer support either. The issue centred on poor developer relations and poor AMD driver documentation (on the developer side). AMD's console design wins, along with DX12 and closer-to-metal programming, have rendered this handicap irrelevant.

 

When I look at 2016/2017, I see a completely different PC gaming environment on the horizon.

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)


Valve literally shits $1 million a day.

Intel makes much more than Valve.


Man, if they dissolve, it looks like your vision of the future for Intel/ATI and Nvidia/x86_64 could become a thing.

Intel redid the license, so whoever buys AMD won't get the x86 license, and Nvidia has no experience making desktop CPUs.


Are you sure? I think Intel could easily make its own high-power graphics cards if it really wanted to... -snip-

Intel will probably be interested in AMD's driver team.

