
Consoles bound to AMD products

Loscohones
1 hour ago, omarthegeek said:

They aren't bound to AMD completely, since Nintendo has NVIDIA making the SoC for the Switch - none of the big companies really make SoCs meant specifically for consoles. Anyway, Sony and Microsoft are stuck with AMD since, in x86, Intel is expensive and doesn't offer good value. AMD is cheaper and offers more power.

They aren't stuck with AMD, in the sense that there were certainly alternatives - they just chose that route because it seemed to make the most sense.

 

Even if they wanted to maintain x86 compatibility - which is a relatively new thing for consoles - they could have gone Intel/AMD or Intel/NVIDIA - or hell, even Intel/Intel, given that they've been really working on their graphics division lately.

 

They also could have chosen a non-x86 design, though that'd break backwards compatibility. They could have used ARM - a risky move. They could have gone PowerPC - though I think the PS3 was the last time we saw anything even remotely PPC-based.

 

Realistically though, AMD was always still the best choice, since in a console, you need to balance performance, cost, and power consumption. AMD hit a decent sweet spot for that before, and with their upcoming next-gen console SoCs for the PS5 and Xbox Two (I have no idea if they've given it a proper name yet), they've potentially nailed the trifecta completely.


It's easy to get lost in the "this brand performs better" talk. The reality is that at specific price points there isn't really much of a gap between any of them - whether that's AMD vs Intel or AMD vs NVIDIA. Even when AMD's CPUs were well below par, they were below par at significantly lower price points. With consoles there's also the fact that they can probably get a better deal going AMD for both CPU & GPU than they would going AMD/NVIDIA, Intel/NVIDIA, or Intel/AMD. So it kinda makes sense that in the generations since AMD acquired ATI (2006, after the R&D phase for the Wii/360/PS3 generation) it's been pretty much entirely AMD/AMD, with the exception of the Switch and its NVIDIA SoC.

 

3 hours ago, dalekphalm said:

They also could have chosen a non-x86 design, though that'd break backwards compatibility. They could have used ARM - a risky move. They could have gone PowerPC - though I think the PS3 was the last time we saw anything even remotely PPC-based.

The Wii U was PPC-based. It's easy to laugh at it now with 20/20 hindsight, but take your head back to ~2010, when they made that choice, and you can see why. By going PPC they maintained full backwards compatibility with the Wii. They also made it easier for companies to port from the 360/PS3 at a time when the PS4/XBOne were still a few years away. It was one of the many things that bit them in the end, but... still worth remembering.

Fools think they know everything, experts know they know nothing


9 hours ago, omarthegeek said:

Anyway, Sony and Microsoft are stuck with AMD since, in x86, Intel is expensive and doesn't offer good value. AMD is cheaper and offers more power.

That is, uhm, a lot of speculation.

 

No, the reason is that Intel doesn't offer a good graphics solution, if one at all.

And NVIDIA doesn't offer a good CPU solution (and are dicks).

"Hell is full of good meanings, but Heaven is full of good works"


8 hours ago, skywake said:

The Wii U was PPC-based. It's easy to laugh at it now with 20/20 hindsight, but take your head back to ~2010, when they made that choice, and you can see why. By going PPC they maintained full backwards compatibility with the Wii. They also made it easier for companies to port from the 360/PS3 at a time when the PS4/XBOne were still a few years away. It was one of the many things that bit them in the end, but... still worth remembering.

If anything, that would've only made ports from the 360 easier on the Wii U, given the superficial similarities (PPC-based CPU, three cores, ATI/AMD-based GPU) - but otherwise, porting is a much more complicated beast than that.


Well, not really...

The PPC versions of the two differ. The PPC in the Wii U is believed to be the old core that was also used in the GameCube.

The PPC in the Xbox 360 is a different version that comes with two-way SMT and is clocked at 3.2 GHz, though it's an in-order design.

But the OS itself is already a big hurdle that shouldn't be underestimated...


So let's discuss history from the 5th generation onward, when consoles moved into 3D territory, and speculate on why the hardware was chosen:

 

Sega Saturn

This was a mixture of sunk cost from developing a CPU with Hitachi (the SuperH chip) and finding out it wasn't powerful enough for where 3D games were likely to go. So they added a second one. Then they found out the capabilities of the PS1 and added a second Video Display Processor (VDP) on top of that. The VDPs were likely based on the Sega Model 1's graphics system, since Sega wanted to leverage hardware from their arcade presence at the time.

 

Nintendo 64

The Nintendo 64 was based on hardware from SGI because of Nintendo's relationship with the company at the time.

 

PlayStation

Apparently this was designed in-house by Kutaragi, who had a hand in developing the SNES's SPC700 sound processor and the scrapped SNES CD add-on. (https://www.ign.com/articles/1998/08/28/history-of-the-playstation)

 

Dreamcast

One of Sega's aims was to build the Dreamcast with COTS (commercial off-the-shelf) hardware. Originally the Dreamcast was to use a 3dfx GPU and a PowerPC 603e CPU, but at some point - due to Sega's relationship with NEC, and 3dfx publicly disclosing its deal with Sega - Sega switched over to the SuperH SH-4 and a VideoLogic (as they were known at the time) PowerVR GPU, both manufactured by NEC. (http://www.gamasutra.com/view/feature/4128/the_rise_and_fall_of_the_dreamcast.php?print=1)

 

People most often assume Sega went with NEC because NEC is a Japanese company. However, there were also practical considerations: both the CPU and GPU could be single-sourced, and the CPU was of a familiar architecture. The GPU was the only risk, since tile-based rendering was new.

 

PlayStation 2

Can't seem to find any development history on the Emotion Engine or the Graphics Synthesizer, but both were likely designed in-house.

 

GameCube

Nintendo went with ArtX for the GPU and system bus. What heavily points to why they chose them is because ArtX was founded by the same people who created the N64's graphics system. (https://www.ign.com/articles/1999/03/13/its-alive-3)

 

IBM was likely chosen because the PowerPC at the time was cheap enough, had enough performance, and had good thermal characteristics. It's also likely that Nintendo could leverage a strategy used in previous consoles to get development started before the hardware was finalized: use readily available computer systems as devkits.

 

Xbox

The Xbox started off as a side project in Microsoft's DirectX team to put together a console due to game developers moving away from Windows. It's likely the final hardware is what it is because the initial development kits were literally PCs and the biggest reason to choose NVIDIA at the time was because they were the forerunner in PC GPUs.

 

Xbox 360

Microsoft chose ATI over NVIDIA likely due to one thing: NVIDIA had soured the relationship. Because the Xbox was struggling to sell and Microsoft had overestimated how many units it would move, NVIDIA overproduced chips and couldn't sell them at the price they wanted. So NVIDIA started legal action against Microsoft (https://www.cnet.com/news/nvidia-microsoft-settle-xbox-spat/). (Which makes me wonder why they didn't sell the unsold NV2As as GeForce 3s or 4s.)

 

The CPU was chosen likely because Microsoft sent out a bid for the CPU and IBM showed them the specs of the PPE portion of the then upcoming Cell processor. (https://www.wsj.com/articles/SB123069467545545011)

 

PlayStation 3

The CPU came from a joint effort between Sony, Toshiba, and IBM, probably aimed at creating the next generation of supercomputer CPU. I'm going to guess Sony wanted to use the Cell in the PS3 to give it some purpose, and also because, by that point, the "bit wars" had shifted into the "GFLOP wars".

 

The reason to go with NVIDIA eludes me. Maybe they wanted another shot at the console business?

 

Wii

Nintendo didn't want to compete in the high end, so they appeared to simply give the GameCube an upgrade, however slight.

 

WiiU

Considering its backwards compatibility capabilities, Nintendo kept with the same base foundations.

 

PlayStation 4 and Xbox One

To reiterate, here's why AMD was in a really good position to provide Sony and Microsoft the hardware:

  • AMD had experience making APUs. This keeps costs down by not needing to manufacture two separate packages.
  • AMD developed HSA, which lets the CPU and GPU use the same physical memory to share data, rather than passing it back and forth across a system bus.
  • AMD didn't necessarily need to develop anything new. They were taking existing designs, making slight modifications, and marrying them together. There's a reason AMD calls this a "semi-custom" design.
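As a toy illustration of the HSA point - this is not real GPU code, just a sketch of where the copies happen - a discrete GPU has to shuttle buffers across a bus in both directions, while an HSA-style APU lets the "GPU" work directly on the memory the CPU allocated:

```python
# Toy model of the data flow HSA eliminates. The function names and the
# "kernel" (doubling each element) are made up purely for illustration.

def discrete_gpu_double(host_data):
    """Discrete GPU: data crosses the bus twice (host->device, device->host)."""
    device_copy = list(host_data)             # upload: host RAM -> VRAM
    result = [x * 2 for x in device_copy]     # "kernel" runs on the device copy
    return list(result)                       # download: VRAM -> host RAM

def hsa_apu_double(shared_data):
    """HSA-style APU: CPU and GPU address the same physical memory,
    so the "kernel" works in place and no bus copies are needed."""
    for i, x in enumerate(shared_data):
        shared_data[i] = x * 2
    return shared_data

buf = [1, 2, 3]
assert discrete_gpu_double(buf) == [2, 4, 6]  # buf itself is untouched
assert buf == [1, 2, 3]
assert hsa_apu_double(buf) == [2, 4, 6]       # buf was mutated in place
```

The copies themselves are cheap here, but on real hardware each one is a trip over PCIe - which is exactly the cost a shared-memory APU design avoids.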

But there's a lingering question: was NVIDIA ever considered in the bid? It turns out... no. NVIDIA voluntarily bowed out of making the GPUs for these consoles. Why? They didn't see it as a profitable enough business. You might think this is snobbish on NVIDIA's part, but NVIDIA was chasing a market that was growing at the time: mobile devices. On top of the HPC market they were establishing a foothold in, adding yet another thing to spend R&D on - even if all they had to do was recycle a consumer GPU - was likely not a good idea from a business point of view for long-term growth.

To vindicate NVIDIA's decision somewhat: AMD's Q3 FY2018 revenue for the "Enterprise, Embedded and Semi-Custom" segment - which is where revenue from console sales would land - was $750M in total (https://www.anandtech.com/show/13514/amd-announces-q3-fy-2018-earnings). NVIDIA made $792 million in its "Datacenter" category alone (https://www.anandtech.com/show/13607/nvidia-announces-q3-fy-2019-results-lots-of-stock)*

 

*Note the article date, not the "Fiscal Year" label.

 

Switch

Since Nintendo was interested in making something more like a gaming tablet, it made sense to look at the tablets already on the market and at who could provide the best hardware for their needs. Assuming ~2015 was when Nintendo was seriously looking, the "top-end" mobile GPUs at the time were:

  • Qualcomm's Adreno 530
  • Imagination Technologies' PowerVR GT7600
  • ARM's Mali T880
  • NVIDIA's Tegra X1

Out of all of these, the Tegra and the PowerVR easily outperformed the Adreno and the Mali. But as far as SoC availability goes, the PowerVR could only be found in Apple's SoCs, while NVIDIA's GPU shipped in its own Tegra X1 SoC. Since it's smart to pick something already available to minimize costs, NVIDIA seemed like the best candidate for a gaming-centric tablet - because there was no way in hell Nintendo would be able to mooch off of Apple.

 

EDIT: There's a recurring theme here among the console manufacturers: use COTS products whenever possible. Instead of burning through R&D money to develop a new core chip, use someone else's. That's especially true today, when the cost of developing a high-performance chip from the ground up can easily reach billions of dollars.


On 4/23/2019 at 12:37 AM, Mira Yurizaki said:

-snip-

Gonna add to the GameCube/Wii/Wii U saga: from what I remember, IBM and Nintendo developed a pretty decent partnership during development of the Dolphin project, which would explain why they kept PPC up until the Wii U. There are still some promotional renders floating around that have Mario and the IBM logo in them.


 

I forget where but I've also heard rumors that Sony initially intended for the Cell to handle graphics rendering as well as general processing of shit and that they likely ended up with Nvidia later in development. Not sure how true those are, but that would make a bit of sense, because I don't believe the GPU in the PS3 differs all that much from the 7900 or something like that.


On 3/24/2019 at 9:34 AM, Loscohones said:

Why do consoles (Xbox and PlayStation, mostly) use AMD CPUs and GPUs?

 

Intel and NVIDIA both make superior products when it comes to gaming. All these years, both Sony and Microsoft have used only AMD - why is that? Is it the pricing, or restrictions from the manufacturers?

My guess is that AMD is the only one willing to make custom-ish chips for the consoles, and able to make them at a price MS and Sony are willing to pay. For the record, NVIDIA does have its hardware in the Nintendo Switch. The reason they don't use NVIDIA GPUs is that the chips used in these consoles are effectively APUs - they have an iGPU built onto the same die as the CPU.

 

As far as why Intel hasn't gotten into the ring: I would assume it's because they won't, or can't, design a chip at the low price point that's needed. Historically, Intel has been the more expensive solution, even in times when they weren't necessarily the best.

I just want to sit back and watch the world burn. 

Link to comment
Share on other sites

Link to post
Share on other sites

6 hours ago, Dan Castellaneta said:

I forget where but I've also heard rumors that Sony initially intended for the Cell to handle graphics rendering as well as general processing of shit

Yeah, I heard that as well - that it was supposed to be some kind of "super processor" for both. But they failed...

 

6 hours ago, Dan Castellaneta said:

and that they likely ended up with Nvidia later in development. Not sure how true those are, but that would make a bit of sense, because I don't believe the GPU in the PS3 differs all that much from the 7900 or something like that.

I heard that they might have asked ATi, but ATi didn't have the resources or capacity to deliver a chip to them, so they went with NVIDIA...

 

But to be honest: with a Radeon X1900-based chip (preferably the X1950 GT, aka RV570, since it was already manufactured on a smaller process than the R580), the PS3 would have been much better, because CineFX had problems with pipeline stalls.


With the ATi architecture, texturing and shading had been independent since the R300.

With NVIDIA that wasn't the case - IIRC, texturing stalls the shader pipeline...

They fixed that with the G80 generation, though...
