What if... there were higher end consoles than present?

porina
Just now, Prqnk3d said:

if there were high end consoles...

people would be mining on them /s

There were high end consoles but they were too expensive so nobody bought them...

 

7 minutes ago, skywake said:

and that you'd run into issues with games that need more than two cores. So I don't know why you are saying that I didn't mention these things.

Because 'running into issues' and 'the game doesn't work at all' are two different things.

One can mean it runs like shit, but it does imply that you can play - which you cannot with only a 2 core/2 thread CPU!

 

You should have mentioned that some games don't run at all on 2 Core/2 Thread CPUs!

"Hell is full of good meanings, but Heaven is full of good works"

2 hours ago, Stefan Payne said:

Because 'running into issues' and 'the game doesn't work at all' are two different things.

One can mean it runs like shit, but it does imply that you can play - which you cannot with only a 2 core/2 thread CPU!

 

You should have mentioned that some games don't run at all on 2 Core/2 Thread CPUs!

Why? I wasn't trying to argue that people should go out and buy dual-core CPUs for gaming. The fact that some games fail to run at all with dual-core CPUs doesn't take anything away from the point I was trying to make. I was making a point about how far behind the curve the PS4/XBOne were even in 2013. Specifically I was trying to counter an argument from someone who was saying that all GPUs from that era are now garbage and CPUs were even further behind. Total BS.

The point I made about an overclocked Pentium was pushing the limits for a reason, I was proving a point. I said that even a machine like that, something that is pretty average compared to today's hardware, would be ok running most games at 1080p. And given that's the case someone who spent a bit more in 2013 and got an i5 and a 660Ti or better is still doing pretty well at 1080p. So saying 2013 hardware is garbage now isn't at all true. The XBOne/PS4 were much further behind the curve than previous consoles were and, to be frank, it didn't really matter because we're not pushing the hardware as hard as we used to.

Fools think they know everything, experts know they know nothing

17 hours ago, skywake said:

Why? I wasn't trying to argue that people should go out and buy dual-core CPUs for gaming.

Why the hell do you have a problem admitting that you forgot to mention this important detail?!
Almost sounds like you want to defend this shitty Intel CPU.

Again, what is the problem with admitting that you should have mentioned that dual cores can't run some/many games today?

 

17 hours ago, skywake said:

The fact that some games fail to run at all with dual-core CPUs doesn't take anything away from the point I was trying to make. 

Yes, because 'runs like shit' is better than 'doesn't start at all' on the escalation scale...

 

17 hours ago, skywake said:

I was making a point about how far behind the curve the PS4/XBOne were even in 2013.

No, they weren't.

Because the PS4/XBone have 8 core CPUs.

What 8 core CPU could you get in 2013??

The PS3 already had something like that, though not a general purpose CPU.

And before that something even more special (look at the PS2's Emotion Engine).

 

17 hours ago, skywake said:

Specifically I was trying to counter an argument from someone who was saying that all GPUs from that era are now garbage and CPUs were even further behind. Total BS.

Yes, there still are some strong GPUs from that era - like the HD 7970 or 7950 - but the 7870 already falls behind.

The problem with most of those GPUs is that they don't have enough memory for modern games.

 

17 hours ago, skywake said:

The point I made about an overclocked Pentium was pushing the limits for a reason, I was proving a point. I said that even a machine like that, something that is pretty average compared to today's hardware, would be ok running most games at 1080p.

Yes and "forgot" to mention that there are some/many games that don't run at all on that machine. That's lying by omission and you don't seem to want to accept the mistake you made. 

Don't you think that it is important to mention that there are some games that don't work at all on that machine?!

 

17 hours ago, skywake said:

And given that's the case someone who spent a bit more in 2013 and got an i5 and a 660Ti or better is still doing pretty well at 1080p. So saying 2013 hardware is garbage now isn't at all true.

Yeah, you are right here. But only if you got something with at least 4 threads (2C/4T or 4C/4T); if you got 2C/2T you can't run shit.

And the 660Ti is also somewhat borderline...

 

 

17 hours ago, skywake said:

The XBOne/PS4 were much further behind the curve than previous consoles were and, to be frank, it didn't really matter because we're not pushing the hardware as hard as we used to.

No, they weren't!

They were normal mid-range PC-like hardware.

Both had 8 core CPUs (where were they on PC in 2013??), and the XBox One had 8GiB (shared) memory and an 853MHz AMD GCN-based GPU with 768 cores.

Both had a 256bit memory interface, though the XBox used just DDR3 SDRAM while the PS4 got GDDR5 SDRAM. The XBox One also has 32MiB of eSRAM integrated into the GPU.

The PS4 had a similar chip, but with 1152 graphics cores and higher bandwidth due to GDDR5.
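For rough context, the theoretical peak bandwidths work out like this (a back-of-envelope sketch, assuming the commonly cited effective rates of 2133 MT/s for the XBox One's DDR3 and 5500 MT/s for the PS4's GDDR5):

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate.
# The 2133 MT/s and 5500 MT/s figures are the commonly cited effective
# rates for the XBox One (DDR3) and PS4 (GDDR5) respectively.

def peak_bandwidth_gbs(bus_bits: int, transfers_per_sec: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_bits / 8 * transfers_per_sec / 1e9

xbox_one = peak_bandwidth_gbs(256, 2133e6)  # ~68 GB/s (plus eSRAM on top)
ps4 = peak_bandwidth_gbs(256, 5500e6)       # ~176 GB/s
print(f"XBox One: {xbox_one:.1f} GB/s, PS4: {ps4:.1f} GB/s")
```

That gap is the main practical difference between the two otherwise similar chips.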

 

 

So they were somewhat in the mid-range PC area, but had a unified memory pool, which gives a bit of an advantage.

 

And another thing you don't think about is the operating system of the console and the APIs those consoles have...

For the consoles the manufacturers can make the tools and define the APIs, so it's not that easy to compare them to a PC, even if the hardware is similar.

"Hell is full of good meanings, but Heaven is full of good works"

If such a high end console could run things like Blender, as well as playing games, I might consider getting another console again. As gaming isn't my only hobby that requires compute (plus some of my gaming tasks, like emulation, can't even be done on consoles) and I've only a finite amount of money, a PC was a very easy choice.

 

2 hours ago, Stefan Payne said:

Why the hell do you have a problem admitting that you forgot to mention this important detail?!
Almost sounds like you want to defend this shitty Intel CPU.

Again, what is the problem with admitting that you should have mentioned that dual cores can't run some/many games today?

 

Yes, because 'runs like shit' is better than 'doesn't start at all' on the escalation scale...

 

No, they weren't.

Because the PS4/XBone have 8 core CPUs.

What 8 Core CPU could you get in 2013??

 

https://www.notebookcheck.net/Samsung-Exynos-5420-Octa-SoC.103633.0.html

 

While one could have included an AMD FX, it was debatable whether it is a true 8 core, or whether each module should only count as one "core". With the Exynos 5 Octa (released in mid-2013), it is a true 8 core no matter how one looks at it, and can use all 8 cores simultaneously. :)

 

Mind you, the Bobcat cores used in the consoles are very far from the larger cores used in Intel's Ivy Bridge and Haswell of the time. These cores would be more comparable to the Silvermont cores used in Intel's Atom line (specifically Bay Trail). Not knocking either Sony or Microsoft here, as AMD's only other alternative would have been terribly power hungry, but merely noting that the fact that they're using eight-core CPUs is of little relevance.

My eyes see the past…

My camera lens sees the present…

2 hours ago, Zodiark1593 said:

Ah shit, I forgot those. Yeah, there are a couple of octa cores.

And Intel had their first one in 2009, on 45nm.

Though only on LGA1567...

Just look up "Intel Beckton"

 

Quote

While one could have included an AMD FX, it was debatable whether it is a true 8 core, or whether each module should only count as one "core". With the Exynos 5 Octa (released in mid-2013), it is a true 8 core no matter how one looks at it, and can use all 8 cores simultaneously. :)

Yeah, you can debate that because of the shared decoder. Though the newer Bulldozer variants had two decoders per module, which means one per INT core.

 

 

Quote

Mind you, the Bobcat cores used in the consoles are very far from the larger cores used in Intel's Ivy Bridge and Haswell of the time.

They ain't no Bobcat.

They are Jaguar.

And it improved quite a bit over Bobcat.

And you can argue that it was the best AMD had at the time and that a Bulldozer based CPU wouldn't have been much better in the same TDP Budget.

And it does the job pretty well.

 

 

Quote

These cores would be more comparable to the Silvermont cores used in Intel's Atom line (specifically Bay Trail). Not knocking either Sony or Microsoft here, as AMD's only other alternative would have been terribly power hungry, but merely noting that the fact that they're using eight-core CPUs is of little relevance.

Well, no. Not really.

It's more powerful than Silvermont/Bay Trail:

https://www.anandtech.com/show/7933/the-desktop-kabini-review-part-1-athlon-5350-am1/3

http://www.tomshardware.com/reviews/athlon-5350-am1-platform-review,3801-5.html

 

And remember that the Athlon 5350 is crippled by its single channel memory interface. The consoles both have 256bit (= quad channel, like Threadripper/other HEDT platforms).

 

And you can also see the older Bobcat, aka the AMD E-350 - way behind where it is listed.

 

And remember that both console manufacturers wanted a cheaper and less power-hungry system than their predecessors!

"Hell is full of good meanings, but Heaven is full of good works"

44 minutes ago, Stefan Payne said:

Ah shit, I forgot those. Yeah, there are a couple of octa cores.

And Intel had their first one in 2009, on 45nm.

Though only on LGA1567...

Just look up "Intel Beckton"

 

Yeah, you can debate that because of the shared decoder. Though the newer Bulldozer variants had two decoders per module, which means one per INT core.

 

 

They ain't no Bobcat.

They are Jaguar.

And it improved quite a bit over Bobcat.

And you can argue that it was the best AMD had at the time and that a Bulldozer based CPU wouldn't have been much better in the same TDP Budget.

And it does the job pretty well.

 

 

Well, no. Not really.

It's more powerful than Silvermont/Bay Trail:

https://www.anandtech.com/show/7933/the-desktop-kabini-review-part-1-athlon-5350-am1/3

http://www.tomshardware.com/reviews/athlon-5350-am1-platform-review,3801-5.html

 

And remember that the Athlon 5350 is crippled by its single channel memory interface. The consoles both have 256bit (= quad channel, like Threadripper/other HEDT platforms).

 

And you can also see the older Bobcat, aka the AMD E-350 - way behind where it is listed.

 

And remember that both console manufacturers wanted a cheaper and less power-hungry system than their predecessors!

Well, I don't usually look into the low power AMD cores all that much, so I concede that Jaguar is used in the consoles, as well as being faster than Silvermont, albeit not that far ahead, at least compared to throwing a big core into the comparison. Having more, smaller cores is good if you've lots of lighter threads to juggle around, and to an extent should actually be faster than having fewer larger cores (SMT notwithstanding), though the lack of single thread performance makes them less versatile in a PC environment.

 

AMD also had their Stars architecture, which they were using in Llano. It was certainly more powerful than Jaguar, and probably not far off from Bulldozer at similar clocks. Not sure if power consumption would have been much better than a Bulldozer/Piledriver CPU at similar clocks though.

 

In the Steam forums before the PS4/One launched, I made the prediction that at least one of the two systems would use an APU-type design (CPU and GPU on one chip), citing power consumption, temperature and failure rate concerns from the previous consoles. Very few actually agreed. No need to mention how that turned out.

 

On a final note, I neglected to mention Intel's 8 core Nehalem-EX (Beckton) and other such server/HPC chips, as the price and performance/power consumption targets were entirely different from Jaguar's, and in general they were not easily available to the average consumer. I hardly thought it a fair comparison to bring up.

My eyes see the past…

My camera lens sees the present…

1 hour ago, Zodiark1593 said:

Well, I don't usually look into the low power AMD cores all that much, so I concede that Jaguar is used in the consoles, as well as being faster than Silvermont, albeit not that far ahead, at least compared to throwing a big core into the comparison.

Look at the benchmarks.

The AMD core is pretty far ahead in some situations.

And that's although the Intel has a boost, can clock up to 400MHz higher, and has dual channel memory...

 

So the difference can be quite big, especially with graphics:
https://us.hardware.info/reviews/7417/5/single-dual-and-quad-channel-memory-performance-more-lanes-more-speed-benchmarks-games

 

1 hour ago, Zodiark1593 said:

AMD also had their Stars architecture, which they were using in Llano. It was certainly more powerful than Jaguar, and probably not far off from Bulldozer at similar clocks. Not sure if power consumption would have been much better than a Bulldozer/Piledriver CPU at similar clocks though.

Was it, per watt??

And was it something AMD would have offered them?

Remember, Jaguar has support for AVX as well.

That's also something you should not forget.

 

So the only other option would have been Bulldozer.

And here is the question:
Did Sony/M$ want Bulldozer?
Did AMD offer them Bulldozer?

Would Bulldozer have offered an equal or better performance/watt ratio, or would it have been worse?

 

Given the timeframe, Jaguar was available; Kaveri was not.

 

And also don't forget the level of integration!
Kabini had everything they needed, no need for any external chips. Kaveri did not - it needs an external southbridge.

Trinity and Richland didn't have GCN, only the old TeraScale graphics. So nothing there...

 

Getting a Vishera wasn't an option either because of the chipsets needed - they didn't have PCIe in the CPU at the time.

 

It might be possible that one of them originally planned to use Kaveri - that would explain the GDDR5 interfaces it had (AFAIR 4x32bit) - but chose to go for Jaguar instead.

Because Jaguar was available and possibly could have had a better perf/W ratio.

 

 

1 hour ago, Zodiark1593 said:

In the Steam forums before the PS4/One launched, I made the prediction that at least one of the two systems would use an APU-type design (cpu and gpu on one chip), citing power consumption, temperature and failure rate concerns from the previous consoles. Very few actually agreed. No need to mention how that turned out.

Yeah, you were right, and many "in the know" also speculated that an APU was more likely than going for two separate chips.

But there was also the possibility of going to IBM again and letting them make a dual-die unit with AMD graphics. Or even a single die.

 

1 hour ago, Zodiark1593 said:

On a final note, I neglected to mention Intel's 8 core Nehalem EX (Beckton) and other such server/HPC chips as the price and performance/power consumption targets were entirely different from Jaguar, and in general were not easily available to the average consumer. I hardly thought it a fair comparison to bring up.

Yes, and Intel isn't an option either.

Microsoft got burned with the original XBox and probably doesn't want to do that again (remember, that was the Pentium III/GeForce 3 variant, IIRC).

They have stuck with ATi/AMD for two generations now -> 360 and XBox One.

Sony used nVidia last gen but also uses AMD right now.

 

Quite interesting how similar both consoles have become when they were both so different just one generation before...

 

 

But yeah, the intermediate consoles (Xbox One X and PS4 Pro) were made to give a little increase in performance because both companies were waiting for the Zen architecture.

And possibly also the Navi graphics architecture...

"Hell is full of good meanings, but Heaven is full of good works"

9 hours ago, Stefan Payne said:

Why the hell do you have a problem to admit that you forgot to mention this important detail?!
Almost sounds like you want to defend this shitty Intel CPU.

Again, where is the problem to admit that you should have mentioned that dual cores can't run some/many games today?

I agree entirely that a dual-core CPU is not a good option for gaming in 2018 given how games are optimised now. But I wasn't making that point and I don't know why you have a problem admitting you misunderstood where I was coming from. I was making a point about how far behind the curve the PS4/XBOne were at launch vs previous "high end" console launches. The only reason I brought up the Pentium was as an extreme example of what even fairly entry level 2013 gaming PCs could do.
 

9 hours ago, Stefan Payne said:

No, they weren't.

Because PS4/XBone has 8 Core CPU.

What 8 Core CPU could you get in 2013??

You do realise that both the PS4 and XBOne CPUs are Jaguar-based mobile chips with fairly low clock speeds, right? I mean, technically they're all two quad-core modules that are kinda similar to a pair of A6-5200s (PS4/XBOne slightly underclocked, Pro/X slightly overclocked). Not the highest end of CPUs even for 2013. For the sake of comparison, here are the PassMark numbers for one of those quad cores vs the stock Pentium I was talking about before. I actually looked this up while I was making my first post just to be sure that what I was saying made sense; I didn't think I needed to post it because I thought it was obvious...

[image: cpu.png (PassMark comparison)]
 

9 hours ago, Stefan Payne said:

Yeah, you are right here. But only if you got something with at least 4 Threads (2C/4T or 4C/4T), if you got 2C/2T you can't run shit.

And the fact that you can get away with an almost 5 year old quad core in 2018 does say something. 

Fools think they know everything, experts know they know nothing

The ultimate consoles are definitely called custom built computers. My i7-8700K, GTX 1080, 32GB 3200MHz RAM and 2x 1TB 960 Evo M.2 drives are all crammed into a <5L chassis.

Small as a console, can be wireless like a console (aside from power and HDMI/DP) and way more powerful.


The time is now #PCMR

Osmium: NFC Skyreach 4 // i7-8700k (delidded) // GTX 1080 // 32GB DDR4-3200MHz // 1TB 960 Evo M.2 // 1.1TB MX300 M.2
Peripherals: Razer Blackwidow // Razer Orbweaver // Razer Kraken // Logitech G502 // Logitech K830 // LG 34UC88-B

Usage: Adobe Lightroom // Adobe Photoshop // Web Dev // Recording Gameplay // Video Editing // Portable Gaming

I'm just going to side step everything that's been going on here and address the OP with my two cents.

On 1/26/2018 at 2:28 PM, porina said:

We have the PS4 serving as a base level, and PS4 Pro with a bit more poke in it. Why not continue going up? Variations of the image above have been going around since the PS4 Pro was announced. Just add another slice!

 

Anyway, the thinking is simple: add a higher model in the range above the Pro. They can keep the same requirements on developers as they do now, such that the base model still performs at a decent level and they're not allowed to make a Pro-only (or higher) game. But the extra power can also be used to make games better. Why not have a higher-end gaming quality tier in a console? It'll cost more, for sure, but you get more performance too. Would that make console gaming more appealing?

Not really. The market has pretty much settled on ~$400-$450 as about how much a person is willing to spend on a console. Sure, we had the $600 PS3, but the PS3 didn't really start gaining traction until the more affordable $300 PS3 Slim came out.

 

Two other problems with adding more SKUs are logistics and development. Before there was a mid-season upgrade, it was easy to add yet another SKU because you'd just stop making the previous one. It was all the same specs anyway. But now, with the mid-season upgrade and with the manufacturers promising and enforcing minimum system requirements, they have to build two versions. Which means two different test plans, two different manufacturing processes, two different everything. It might've been a different story if all the companies were doing was a chip swap, but nope, the parts (except the disk drives) can't be interchanged.

 

With development, any new SKU you add that has different specs means you have to make another build of the software for it. Which means you potentially have to maintain separate code paths which require their own testing. Just because the hardware is similar doesn't really mean the same exact test is still valid. I'm pretty sure developers were perturbed by the Xbox One X using GDDR5 as system memory rather than DDR3 + eSRAM. I mean, one memory pool is simpler, but they were developing with two memory pools in mind. And now they have to have another version of the software where it uses one pool.

 

On 1/26/2018 at 2:28 PM, porina said:

The upgrade doesn't necessarily have to be limited to CPU/GPU updates, but the storage is something I've found lacking in consoles. I don't expect TB+ SSDs, but it could be interesting if, for example, it came with a 512GB SSD and also maybe say 3TB HD, but as both are fitted at the same time, the OS could use the SSD to cache intelligently and help the HD out. For a given game, the more frequent accessed items can be cached on SSD with other stuff still streamed from HD/optical as needed.

Eurogamer did do a test putting an SSD in the PS4, and they found that substantial decreases in loading times were hit and miss. While they blamed the SATA 3Gbps interface, I don't really think that's the case. I haven't found many games, when recording their loading activity with a Windows I/O logger, that spike higher than 300MB/s. I'd argue that the CPU is slow enough that it can be a bottleneck at times.

25 minutes ago, M.Yurizaki said:

Eurogamer did do a test putting an SSD in the PS4, and they found that substantial decreases in loading times were hit and miss. While they blamed the SATA 3Gbps interface, I don't really think that's the case. I haven't found many games, when recording their loading activity with a Windows I/O logger, that spike higher than 300MB/s. I'd argue that the CPU is slow enough that it can be a bottleneck at times.

On the earlier complexity, yes, it will increase workload to an extent, but unless devs are console-exclusive they'd have to deal with that on the PC side anyway so it shouldn't be that big a hurdle.

 

As for storage, I unfortunately agree. Speedups vary a lot. I did swap the 1TB 5200rpm HD in my PS4 with a 480GB SSD. It did make the system feel a bit more responsive, but maybe not so much on game load times. I've also seen similar on PC... getting the most out of the storage speed potential requires the rest of the system to catch up, and/or a different storage strategy. For instance, if a dev knows they have to load a ton of stuff from the HD, they may work out that it is faster to compress it to reduce disk load, at the cost of increased CPU load. Removing the HD bottleneck then just pushes things straight into being CPU limited. It may be that a different storage optimisation would be faster for an SSD, although having said that, uncompressed data will fill up an expensive-per-capacity SSD faster... but maybe they could use a less CPU intensive compression method instead...
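That trade-off can be sketched with a toy model. All the numbers here are hypothetical, purely for illustration: a slow HD vs a fast SSD, with assets optionally stored compressed and inflated by the CPU on load:

```python
# Toy model of the load-time trade-off: hypothetical rates of
# HD ~100 MB/s, SSD ~500 MB/s, CPU decompression ~200 MB/s of output,
# and compressed assets at 40% of their original size.

def load_seconds(asset_bytes, disk_mbps, decompress_mbps=None, ratio=1.0):
    """Time to read an asset (and decompress it, if stored compressed)."""
    seconds = asset_bytes * ratio / (disk_mbps * 1e6)   # read from disk
    if decompress_mbps is not None:                     # CPU inflates it
        seconds += asset_bytes / (decompress_mbps * 1e6)
    return seconds

GB = 1e9
# On the HD, compression wins: less data to pull off the slow disk.
print(load_seconds(1 * GB, 100), "s raw vs",
      load_seconds(1 * GB, 100, decompress_mbps=200, ratio=0.4), "s compressed")
# On the SSD, the same compressed asset loads slower: the CPU is now the bottleneck.
print(load_seconds(1 * GB, 500), "s raw vs",
      load_seconds(1 * GB, 500, decompress_mbps=200, ratio=0.4), "s compressed")
```

Under these made-up numbers the compressed layout loads faster from the HD but slower from the SSD, which is exactly why a different storage optimisation can suit each medium.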

 

I know I'm a bit of a dreamer, and my proposed high end console is unlikely to happen, but at least from a tech level it is still interesting to discuss and get viewpoints on it.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

47 minutes ago, porina said:

On the earlier complexity, yes, it will increase workload to an extent, but unless devs are console-exclusive they'd have to deal with that on the PC side anyway so it shouldn't be that big a hurdle.

 

As for storage, I unfortunately agree. Speedups vary a lot. I did swap the 1TB 5200rpm HD in my PS4 with a 480GB SSD. It did make the system feel a bit more responsive, but maybe not so much on game load times. I've also seen similar on PC... getting the most out of the storage speed potential requires the rest of the system to catch up, and/or a different storage strategy. For instance, if a dev knows they have to load a ton of stuff from the HD, they may work out that it is faster to compress it to reduce disk load, at the cost of increased CPU load. Removing the HD bottleneck then just pushes things straight into being CPU limited. It may be that a different storage optimisation would be faster for an SSD, although having said that, uncompressed data will fill up an expensive-per-capacity SSD faster... but maybe they could use a less CPU intensive compression method instead...

 

I know I'm a bit of a dreamer, and my proposed high end console is unlikely to happen, but at least from a tech level it is still interesting to discuss and get viewpoints on it.

Oh, and even if we have higher end storage options, there's the issue that, again, you have a different hardware environment, so you have to plan accordingly.

 

The whole thing with consoles is that it's a locked-down system such that it's relatively easy to develop for because you know exactly what you're dealing with. That's the biggest plus to a console. The users may scoff at the idea of a locked-down system, but to a developer, it can be a freakin' godsend.

 

Also, good catch on data compression. I wouldn't have thought of that being a reason why storage performance doesn't automagically equate to faster loading performance on a console.

I think the higher cost would make it a niche market. Probably not a big enough market to justify having to develop games that work with 3 different console configurations. I assume developing a game that works nicely on both the base PS4 and PS4 Pro is a hurdle on its own. Part of the consoles' appeal to the masses has been that they've been somewhat affordable.

On 1/27/2018 at 12:49 AM, Stefan Payne said:

Been there, done that.

See XBox 360 and PS3.

 

Their sales in the first years were abysmal until they dropped the price dramatically.

 

 

So you can say that nobody wants a high end console because nobody bought it when the consoles were really high end...

No kidding, those consoles were sweet for their time, but darn, they were expensive at launch. Plus, if I remember right, the companies were still losing money on each unit for a while.

 

I remember some of those consoles going for close to or over 1,000 bucks on eBay.

If I am recalling correctly, the PS3 was 700 bucks when it launched. I just remember many of my co-workers were balking at the price of the Xbox 360 and PS3 at launch and refrained from buying either until the price was lowered. Most stuck to the PS2 and Xbox during that time.

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army:3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU

5 minutes ago, Ithanul said:

No kidding, those consoles were sweet for their time, but darn, they were expensive at launch. Plus, if I remember right, the companies were still losing money on each unit for a while.

I remember some of those consoles going for close to or over 1,000 bucks on eBay.

If I am recalling correctly, the PS3 was 700 bucks when it launched. I just remember many of my co-workers were balking at the price of the Xbox 360 and PS3 at launch and refrained from buying either until the price was lowered. Most stuck to the PS2 and Xbox during that time.

Yes, exactly!

That's what I meant...

 

Those were really high end consoles - the first time we've seen that and probably also the only time we will ever see that.

 

The PS3 used a lightly customized G70/G71, and the 360 used a fully custom ATi chip with unified vertex and pixel shaders...

 

"Hell is full of good meanings, but Heaven is full of good works"

15 hours ago, Stefan Payne said:

Yes, exactly!

That's what I meant...

 

Those were really high end consoles - the first time we've seen that and probably also the only time we will ever see that.

 

The PS3 used a lightly customized G70/G71, and the 360 used a fully custom ATi chip with unified vertex and pixel shaders...

 

Not to mention that the PS3 used what is essentially a server processor. Failure rates were also quite high for both early consoles, which led me to predict that their successors would not shoot quite so high. (Which turned out to be the case)

My eyes see the past…

My camera lens sees the present…

11 hours ago, Zodiark1593 said:

Not to mention that the PS3 used what is essentially a server processor.

Well, no, not really.

It had, IIRC, 1 PowerPC core (which has little to nothing to do with the POWER architecture) and 7 smaller in-order Cell engine ones.

 

The problem is that it's a bit complicated to get the max performance out of those...

"Hell is full of good meanings, but Heaven is full of good works"
