
Ubisoft GDC Presentation of PS4 & X1 GPU & CPU Performance

thewhitestig

1. For me, the only thing that is shitty in WD is the horrible mouse controls. They are indeed ported from controller input, in a very bad way.

SLI/Xfire is not the same as single card performance. A game being optimized for a few threads is not a shit port just because it cannot utilize a dual-CPU setup (yes, CPU). Out of the entire WD market, very few will have more than one GPU (considering that the majority market for the game is consoles). Often SLI/Xfire support is handled by Nvidia/AMD on the driver side.

 

2. What E3 settings? The ones that turned the dynamic weather patterns and day/night cycle into a static, chosen weather situation at a specific time of day? Look at the "proof" picture we all know; it's all weather settings. Even Linus pointed out this issue on the WAN Show about Shadow of Mordor: it was difficult to benchmark because of the dynamic weather, so they found a place where the dynamic weather was static.

If you are making a demo of a game, it is not odd to dictate the weather and time of day. That is what the "E3 proof" is.

 

1. In consumer-grade computers, multiple GPUs are not as uncommon as multiple CPUs.

Take a look around the booths of Nvidia and AMD; they are showing off multi-GPU systems, and the reason people love the reference design of the GTX (apart from the look) is that it performs well in multi-GPU setups. And it does not mean that with more GPUs the performance should scale BACKWARDS.

 

2. They were graphical settings that made the game prettier, not just "turn on rain".

Many people found these settings and used them, and although they found bugs, the settings were pretty much working, but had been taken out.



 

Of course AMD and Nvidia show SLI/Xfire setups at their booths. It is a feature of their cards after all, and of course they want to sell as many graphics cards as they can to the same customer. But I highly doubt the average PC gamer has more than one GPU. On dedicated tech sites like this one, where the tech enthusiasts are, you will obviously find not only higher-end PCs but also more multi-GPU systems, sure. Too bad Steam's hardware survey does not show the number of GPUs per computer the way it does CPUs (which are only listed as core counts). But it does show the average GPU having only 1 GB of VRAM, thus being smashed to the ground compared to even the Xbone.

 

True, you could activate a rain setting in the settings file (I did that myself). But it is not uncommon to "tweak" PC games in settings files, for things like FOV, resolution, and so on. That does not mean they are bad console ports, especially not in the case of PC-exclusive titles. Even Crysis 3 has advanced settings-file and console tweaks you can do, but that does not make it a poor port.

 

As for WD, the horrible Worse mod, aka the Mr. Magoo mod, turns you into a short-sighted idiot who could never pass a driver's license test. It hardly proved anything. The dynamic shadows from car headlights were never shown in any demo of the game. The extreme bloom was never shown like that either. Bokeh DoF IS in the game and is used for cutscenes and when aiming weapons. It has never been used as part of normal gameplay, and it would make ultra textures useless, as it just blurs everything out anyway.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1. Because we are talking about a comparison between a console and an alternative. Like I said to Luka, it is less relevant in a gaming sense, but in a proper comparison you might have to factor that in as well. Either way, it is a minor issue, with little relevance to the discussion.

So you agree part of your complaint has little relevance.

2. Nor have I ever claimed such a thing. My comments were about BF4, and ONLY BF4, being more GPU-limited than CPU-limited. Please understand the context in which my argument is written.

You don't understand what it means.  It's either GPU-bound OR CPU-bound, not more or less of either.

3. So the point that multithreading will leave a Pentium behind is accepted then? Glad you finally understood my point.

Please keep the ad hominem and name-calling away from me. It is frivolous and does not support your case.

A PS4 is not a PC. It does not have to run the same number of processes, APIs, middleware, drivers, software, tools and so on. Neither does the CPU have to process all the data the GPU needs from RAM/VRAM, because of HSA. If you do not understand this, then I'm not the problem here.

You just can't read.

More of HASWELL cores = better; not more of shitty AMD jaguar cores.

It's neither ad hominem nor name-calling; I was making a conclusion based on what I saw in your post.

A PS4 has a fairly costly operating system and background processes running...JUST like a PC(even though they are not running the same things, obviously)...that's why games do not get all 8gb of ram...

The CPU still has to work, you know?  And games are clearly suffering from its lack of power.

 

4. That is your opinion. Like I said, I would still like proof that WD is poorly optimized, as well as downgraded. The E3 settings are nothing but weather settings, making the weather static in the E3 demos. No downscaling of graphics seems to have occurred from those demos. The nightclub has been changed, and the human trafficking location moved elsewhere. The rest is cinematography, not graphics quality (low-hanging sun, long shadows, high contrast with many light sources, etc.). The game still has all that.

The proof is right there: http://www.guru3d.com/files-details/watch-dogs-theworse-mod-8-download.html

 

 

5. Indeed it is my burden of proof. That is why I'm very curious about GTA5 and The Division when they come out. Unless GTA5 has as many unique textures on the screen at the same time, I don't think it will need quite as much VRAM, but we will see. Either way, more textures at the same time = higher VRAM usage. I don't assume you would disagree with that statement?

We need PROOF; not your opinions.

 

6. What are you even talking about? I never said anything like that? Do you actually understand what I wrote, and did you even read it?

If devs have more than 10x the amount of VRAM available, obviously it will be possible for them to use it. GTA5 was very limited by the VRAM available. Now they can go "all in" on the new consoles. That will benefit PC as well, since these games will be designed to use more and better textures than before.

You can't read, that's what.

7. Designed for PC. Remember that the new gen of consoles hadn't even been announced when WD started development. In fact, WD had been in development for 3 years before the unfinished PS4 dev kit was released to devs in 2012. The real PS4 with finalized specs was not announced until February 2013. WD had been in development since 2009.

Like my link said, when you make new games that actually push hardware, they again get accused of being poorly optimized. Just like Crysis 1 did when it pushed the market. Oh, the jokes and accusations. Today we know it was well optimized, but it pushed the general PC more than it could handle back then.

Game development isn't all programming... etc., and there is nothing to suggest that it wasn't made for last-gen consoles in the first place (and it was actually released on PS3).

Crysis was revolutionary in graphical fidelity.  Watch-doges, however, is nothing.

 

The number of cores is less relevant. It is the number of threads, and how parallelized they need to run, that matters. What is your point about Nvidia's new cards? The fact that they increased the 780 Ti's VRAM by 33% on even the 970 should prove that even Nvidia is starting to see the need for more VRAM.

You failed to read again.

You were arguing for amount of cores in consoles; now they are less relevant?  Stop shooting yourself in the foot if you want to make an argument.

Stop stating the obvious; you are contradicting yourself with it.

That was in response to your obviously false statement: "and even worse, the GPUs had not evolved enough over the 5 years"

 

 

Ultra texture packs are becoming a common thing now. How exactly are ultra textures "poorly done"?

Ultra quality does not necessarily mean "pointlessly large."

It being pointlessly large is a sign that it is poorly done.

 

8. Yes it does, which is why it uses more VRAM than the normal-res textures that come with the game, right? But if you increase the number of textures drawn on the screen at the same time, that would in itself increase the amount of VRAM used. Do you agree? If not, where do you think all the extra textures are stored?

You are not making a point here.

I wonder where you think texture is stored; did you think texture can only be stored in VRAM?

High quality texture does not mean you need to store needlessly large amount of texture in vram.

 

9. It IS a thread about consoles. I don't own one, and I prefer to game on PC. But these consoles are relevant, as most AAA games will be developed for consoles, since they have a bigger market share than PC. So when these new consoles have 10x the amount of VRAM, it gets exciting to see what will happen to games on PC and how they will push PCs now.

OC is a feature on PC only. And like I said, for console players, and most PC players, OC is not interesting. It does not matter how easy it is if they can't or won't do it.

Consoles are only relevant as something poisonous to gaming as a whole.  

 

More "AAA" games made for consoles is a terrible thing; it means gaming as a whole is retarded for consoles.

There are still companies making games for either only PC or with PC as priority.

That still has nothing to do with OC.  That is something you cannot apply your poor opinion to everyone else in.  It's getting extra performance; not that stock performance isn't shitting on consoles already.

 

10. Several games can use more than 2 GB. Several games require more for ultra textures. As most games are still held back by last gen, it will take some time for new games to come out that utilize the extra VRAM. Keep in mind that open world games will always use more VRAM than corridor games or racing games, so that factors in too.

Not sure if consoles use ultra textures or not, but for PC gaming it does push the texture quality and the amount of texture data on the screen. PC gamers getting higher quality textures is just a good thing for us, as it takes advantage of the more expensive and powerful hardware we have.

Yes, but as games become more advanced, they will not only use more threads, but also more parallel threads that depend on each other's calculations. There is a reason we have multicore CPUs now. They simply could not get much better on a single core, because they could only increase the length of the pipeline, making it slower at processing a thread. PCs can still brute-force through the added threads on this gen of console games, but 2 cores is simply too little.

 

You are just dodging the burden of proof here.

You make a claim there are several games that can use more than 2gb of vram?  Post them.

GTAV is an open world and it can run on 256mb vram, Crysis 2 is a corridor shooter, it does not run on 256mb vram.

2 cores is not too little yet.  If you are going to whine about the Pentium G3258, keep in mind that an AMD build based on the Athlon X4 860K is about the same price (or less).

 

11. You know, just because you don't understand what I wrote, or you disagree, does not mean I shoot myself in the foot. Much derp!

Not only have I not claimed that PCs do not have low-level APIs, I even linked to Mantle myself to prove that low-level APIs result in higher performance. I'm not entirely sure how low level Mantle is compared to the consoles' APIs, but it does not get lower level than the consoles, as you only have one hardware setup to fully optimize for. Hopefully low(er) level APIs for PCs can give us more performance, but right now we have seen very little of what's to come.

You know, just because you missed the point, that does not mean you didn't shoot yourself in the foot.

The point here is that PC has low-level API already and that it works just fine.  Mantle now and Dx12 later.

Consoles also have low-level API and it's already showing pathetic 900p 30fps games.

 

 

12. Cool, you can scrap the cost of BD player software then.

There are still very few games that work on Linux, especially AAA games. Kudos to Valve for developing a tool to help devs port games from DX to OpenGL and generally to Linux, but right now Linux is not a proper alternative to Windows for consumers.

And there is still poor AMD drivers for linux.

The point is that it's free and SteamOS/Steam box will eventually get enough attention.

And windows 8.1 is cheap.

 

 

 

 

 

As for WD, the horrible Worse mod, aka the Mr. Magoo mod, turns you into a short-sighted idiot who could never pass a driver's license test. It hardly proved anything. The dynamic shadows from car headlights were never shown in any demo of the game. The extreme bloom was never shown like that either. Bokeh DoF IS in the game and is used for cutscenes and when aiming weapons. It has never been used as part of normal gameplay, and it would make ultra textures useless, as it just blurs everything out anyway.

 

Here you must mock whatever proves you wrong.

 

Btw, you can turn off DoF.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


Now let's compare to PC

Nuclear-explosion.gif

Well, shit, the benchmark exploded....

 

Ubi, sony, and microshit can all eat a dick.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


No that would be terrible..

 

This way code is more easily portable between PS4, Xbox One and PC. Making life difficult for devs is not good.

The Cell processor that was made 8 years ago is as powerful as the CPU in the PS4. That's just sad. Devs just need to get off their lazy asses. If they want to make millions, then they shouldn't complain.

Finally my Santa hat doesn't look out of place


@Notional, any GPU on the PCIe bus can load directly from system memory. It's in the CUDA 6 and OCL 1.2 standards. The CPU needs to pass a draw kernel telling the GPU where to start, but otherwise it's system memory and the PCIe bus doing the work of passing memory, not the CPU. On an SOC you have a different scenario where GPU cores are not independent, hence requiring a unified memory standard to let the CPU run free.
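If it helps make that concrete, here is a minimal host-side sketch in C++ against the OpenCL 1.2 C API (my own toy example, not from any game or engine): CL_MEM_USE_HOST_PTR asks the runtime to back a buffer with the application's own system memory, so a discrete GPU can fetch it over PCIe (typically via DMA) rather than the CPU copying it. How literally "zero-copy" this ends up being depends on the driver.

    #define CL_TARGET_OPENCL_VERSION 120
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    int main() {
        cl_platform_id platform;
        cl_device_id device;
        if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS) return 1;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr) != CL_SUCCESS) return 1;

        cl_int err;
        cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);

        // Plain host-side data living in ordinary system memory.
        std::vector<float> host(1024, 1.0f);

        // CL_MEM_USE_HOST_PTR: use our pages as the buffer's backing store. On a
        // discrete GPU the transfer happens over the PCIe bus (DMA); on an APU it
        // can be genuinely zero-copy. Exact behaviour is driver-dependent.
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_USE_HOST_PTR,
                                    host.size() * sizeof(float), host.data(), &err);
        if (err != CL_SUCCESS) { std::printf("buffer creation failed\n"); return 1; }

        std::printf("Created a %zu-byte buffer backed by host memory.\n",
                    host.size() * sizeof(float));

        clReleaseMemObject(buf);
        clReleaseContext(ctx);
        return 0;
    }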

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So you agree part of your complaint has little relevance.

From a gaming perspective it is irrelevant, like I told Luka. But ultimately its relevance is up to the consumer buying the console.

 

You don't understand what it means.  It's either GPU-bound OR CPU-bound, not more or less of either.

That depends on the setup. On a balanced system, BF4 is GPU-bound, but if you have a 295X2 and a Pentium dual core, it will be CPU-bound. But this is completely sidestepped. The point was that games utilize more threads in parallel now, because they are more complex, thus needing more done synchronized.

 

You just can't read.

More of HASWELL cores = better; not more of shitty AMD jaguar cores.

It's neither ad hominem nor name-calling; I was making a conclusion based on what I saw in your post.

More Haswell cores than threads makes no difference.

Saying someone went full retard is ad hominem.

A PS4 has a fairly costly operating system and background processes running...JUST like a PC(even though they are not running the same things, obviously)...that's why games do not get all 8gb of ram...

The CPU still has to work, you know?  And games are clearly suffering from its lack of power.

The operating system is a sunk cost. I doubt it is THAT expensive. Besides, it has 2 dedicated cores, so it is not really that relevant. The game has 6 dedicated cores. That is what we are discussing. Such a division is not present on PCs, sadly. (I do wish Windows came with a gaming mode that would prioritize everything game-related when playing.)

It indeed does look like the new consoles are CPU-bound. And like I stated earlier, that sucks, but it does not change the fact that games on PS4/XBONE do not have to run as many APIs, as much middleware, background software, etc.

The proof is right there :http://www.guru3d.co...8-download.html

 

You have to elaborate on that one. This is a mod that changes the look of the game by changing the effects already in the game. That is an artistic change, not a graphical fidelity change. On the contrary, it makes the game look different from what the devs decided on. This is worse than the TVs in stores that have boosted contrast and neon colours. It looks pretty, but it's completely distorted compared to the source.

Changing fog, people density, light intensity, etc. does not improve the graphics quality; it just mods the game to look and behave differently than designed.

However, the question was about poor optimization. The Worse mod does not change that, nor prove that the game is poorly optimized.

 

We need PROOF; not your opinions.

I want proof too, but I cannot show any that is not publicly announced yet. However, games like Shadow of Mordor, Watch Dogs, Titanfall and BF4 can all use more than 2 GB of VRAM @ 1080p; the first two even more than 3 GB. Again, it will take some time for newer games to take advantage of the massive increase in available VRAM, just like it takes time before a new version of DirectX is used in games.

You did not answer my question: Either way, more textures at the same time = higher vram usage. I don't assume you would disagree with that statement?

 

You can't read, that's what.

I can read, but you write gibberish. But let me ask you then: if a console gets 10x the amount of VRAM available for the game, what do you think will happen with textures in games designed for the new console?

 

Game development isn't all programming...etc. and there is nothing to suggest that it wasn't made for last gen console in first place(and it was actually released on PS3).

Crysis was revolutionary in graphical fidelity.  Watch-doges, however, is nothing.

Actually, building a graphics engine/game engine is pretty much the first thing you start working on. They had to gimp the game a lot to work on PS3, so it is quite clear it was not made for PS3. Either way, we can only choose to believe what has been officially said by Ubisoft, and they said what I am repeating on this matter.

 

I agree on Crysis, but disagree on Watch Dogs. No other GTA-style game looks as good, neither in the effects (weather, sunlight through trees, blinding as you exit a tunnel in daylight, etc.) nor the textures (both in quality and quantity, as in the number of unique textures shown on the screen, instead of just the same few textures copy-pasted all over).

 

You failed to read again.

You were arguing for amount of cores in consoles; now they are less relevant?  Stop shooting yourself in the foot if you want to make an argument.

Stop stating the obvious; you are contradicting yourself with it.

That was in response to your obviously false statement: "and even worse, the GPUs had not evolved enough over the 5 years"

 

This was your quote: "Unfortunately, PC can have as many cores as it needs." In that regard, no, having 48 cores does not matter. Software uses threads of data to be processed. Having more cores than threads has no benefit for that piece of software. Having FEWER cores than threads can be a problem, especially if the threads run in parallel and need data from each other during processing. It seems like you do not understand how CPUs work. I think that is where the confusion lies.
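A quick illustrative sketch of that cores-vs-threads point (toy C++ code, not from any game): spawning more software threads than hardware cores still works, the OS just time-slices them, which is exactly why a 2-core chip starts to hurt once a game's threads genuinely need to run in parallel.

    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;          // the query may return 0 on some systems
        unsigned workers = cores * 4;       // deliberately oversubscribe

        std::vector<std::thread> pool;
        for (unsigned i = 0; i < workers; ++i) {
            pool.emplace_back([] {
                volatile long sum = 0;      // a bit of fake "game thread" work
                for (long j = 0; j < 1000000; ++j) sum += j;
            });
        }
        for (auto& t : pool) t.join();

        std::printf("%u hardware threads, %u software threads: all finished, just time-sliced.\n",
                    cores, workers);
        return 0;
    }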

Linus said it himself that Nvidia postponed their top-tier cards one generation (probably because of AMD not following suit). GPUs should have been better by now. The 900 series (970/980) is nice proof: the 980 is not better than the 780 Ti for the most part (except the added VRAM), so yeah, Nvidia is holding back.

 

Ultra quality does not necessarily mean "pointlessly large."

It being pointlessly large is a sign that it is poorly done.

Please elaborate on "pointlessly large". Higher quality textures are higher quality, and thus also larger in size. I don't get your point here. Define "pointlessly large", please. Better graphics are not pointless, and will require more of the hardware.

 

You are not making a point here.

I wonder where you think texture is stored; did you think texture can only be stored in VRAM?

High quality texture does not mean you need to store needlessly large amount of texture in vram.

All textures being shown on screen are stored in VRAM; otherwise your FPS would collapse if textures shown on screen had to be read from system RAM. Let me make my point clearer then:

Game A has 50 unique textures on the screen, repeated all over.

Game B has 100 unique textures, at the same resolution as A, on the screen, repeated as necessary.

Which game would use more VRAM? In my example, B would use twice as much for textures. That is the point about the number of unique textures, and the reason Watch Dogs uses so much. Every facade, every entity has its own unique texture, so the diversity of textures shown in the same shot is much larger than in any other open world game on the market.
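For a rough sense of scale, here is a back-of-the-envelope calculation with made-up numbers (uncompressed RGBA8 textures at 2048x2048 with a full mip chain; real games use compressed formats and streaming, so actual figures will be lower), showing how doubling the unique texture count doubles the texture memory:

    #include <cstdio>

    // Uncompressed VRAM cost of N unique textures: width * height * 4 bytes (RGBA8),
    // plus roughly one third extra for the mip chain.
    double texture_vram_mib(int unique_textures, int width, int height) {
        double bytes = unique_textures * double(width) * height * 4.0 * (4.0 / 3.0);
        return bytes / (1024.0 * 1024.0);
    }

    int main() {
        // "Game A": 50 unique 2048x2048 textures, "Game B": 100 at the same resolution.
        std::printf("Game A: %.0f MiB\n", texture_vram_mib(50, 2048, 2048));   // ~1067 MiB
        std::printf("Game B: %.0f MiB\n", texture_vram_mib(100, 2048, 2048));  // ~2133 MiB
        return 0;
    }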

 

Consoles are only relevant as something poisonous to gaming as a whole.  

More "AAA" games made for consoles is a terrible thing; it means gaming as a whole is retarded for consoles.

There are still companies making games for either only PC or with PC as priority.

Considering consoles pay for most of the development of most AAA games, I deeply disagree. Unless you want worse games at a higher cost? Look at Ubisoft's market: less than 1/3 of sales are on PC. Actually, it might be less than 1/4 of all sales.

Since the current gen consoles are now x86-based and use PC GPUs, I don't really see the problem. In fact, the new consoles are better for PC, as devs can make the games for PC, then downscale them for consoles. Several devs claim to be doing so atm. But development is expensive; don't assume a dev would spend half the budget to make the game prettier for less than 25% of all the consumers.

That still has nothing to do with OC.  That is something you cannot apply your poor opinion to everyone else in.  It's getting extra performance; not that stock performance isn't shitting on consoles already.

And it still has no relevance for consoles, as they cannot OC. It still is not a valid point, as most gamers might not care or want to OC. Do you have any statistics about how many OC?

 

You are just dodging the burden of proof here.

You make a claim there are several games that can use more than 2gb of vram?  Post them.

Examples further up

 

GTAV is an open world and it can run on 256mb vram, Crysis 2 is a corridor shooter, it does not run on 256mb vram.

Yes, and Crysis 2 looks a lot better than GTA V on PS3. I think most people would lose their minds if GTA V looked on PC and PS4 the way it does on PS3. Your example does not disprove my point: open world games use more VRAM than closed corridor shooters.

 

2 cores is not too little yet.  If you are going to whine about Pentium G3258, keep in mind  that AMD build based on Athlon x4 860k is about the same price(or less).

Who said anything about Athlon CPUs? Haswell has twice as many ALUs and AGUs per core as that Athlon (afaik). We've already gone through benches showing BF4 being crippled by that Pentium, even though the game is primarily GPU-bound on normal balanced builds. Why would anyone buy an i5 or i7 instead of that chip then?

As far as I can tell from the source, all games run a few frames faster on the Athlon, but obviously any high-end multithreaded game would get bottlenecked by both of them.

 

You know, just because you missed the point, that does not mean you didn't shoot yourself in the foot.

The point here is that PC has low-level API already and that it works just fine.  Mantle now and Dx12 later.

Consoles also have low-level API and it's already showing pathetic 900p 30fps games.

You seem to assume that all low-level APIs are equally low level, equally efficient, and that both eliminate the need for a lot of APIs, middleware, drivers, etc. That is an assumption you don't have the facts to conclude on. Furthermore, only Mantle is a low-level API on the market, and it is specifically written to lower the overhead on the CPU and to be able to make more draw calls than DirectX. Few games use it, and they do not seem to be directly programmed for the GCN architecture; otherwise we would see a huge FPS difference.

However, none of this disproves my point that both current gen consoles can punch above their weight right now on the GPU side, compared to PC:

"And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec" - Oles Shishkovstov, 4A Games (Metro): http://www.eurogamer.net/articles/digitalfoundry-2014-metro-redux-what-its-really-like-to-make-a-multi-platform-game

 

And there is still poor AMD drivers for linux.

The point is that it's free and SteamOS/Steam box will eventually get enough attention.

And windows 8.1 is cheap.

Crappy AMD drivers or crappy Nvidia drivers are in themselves a good argument against Linux. The point is, you have to factor in all the costs of the PC to make it do what a console can do.

 

Here you must mock whatever proves you wrong. 

Btw, you can turn off DoF.

That does not change all the other crap they did. The point is that the Worse mod does not reveal "hidden" graphics effects supposedly cut away because of the consoles. The only effect being brought out is the glitchy headlight shadows, which have never been demoed.

 

 

@Notional, any GPU on the PCIe bus can load directly from system memory. It's in the CUDA 6 and OCL 1.2 standards. The CPU needs to pass a draw kernel telling the GPU where to start, but otherwise it's system memory and the PCIe bus doing the work of passing memory, not the CPU. On an SOC you have a different scenario where GPU cores are not independent, hence requiring a unified memory standard to let the CPU run free.

 

Do games actually utilize DMA? I thought it was just GPGPU applications. I cannot seem to find anything on Google about it. AFAIK only HSA-enabled APUs (or APUs in general) are capable of that. If I am wrong, my argument should be disregarded. But please find me a nice source so I can learn more about it (how games use it, that is).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


The PS3, when it launched, was stupidly overpowered, and developers had no idea how to take advantage of the custom Cell processor that the PS3 was running <_<

Cell processors suck... Because of them we don't have brute force emulators of the PS3... Geez.


Do games actually utilize DMA? I thought it was just GPGPU applications. I cannot seem to find anything on Google about it. AFAIK only HSA-enabled APUs (or APUs in general) are capable of that. If I am wrong, my argument should be disregarded. But please find me a nice source so I can learn more about it (how games use it, that is).

If games don't use it, the developers are stupid beyond reason. I'll put it this way: GPUs CAN access system memory via DMA in OpenGL the same way as in OpenCL, if the program is built correctly. All that should be passed is a starting address of the data (in system memory) and the instructions for how to manipulate it (OpenGL pragmas). If it's done differently, it's done incorrectly, hence why you find me bashing the old guard game programmers a lot. They're not good at parallel algorithms and they never keep up with new standards.

 

It can be done with OpenGL just as easily as OpenCL. Whether or not it is done is the responsibility of the programmers.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Cell processors suck... Because of them we don't have brute force emulators of the PS3... Geez.

They're ASICs. They're great at 1 or 2 things, but they're a massive pain to program for (the assembly makes x86 look good, and the existing public compilers for cell assembly are antiquated beyond being useful).

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


If games don't use it, the developers are stupid beyond reason. I'll put it this way: GPUs CAN access system memory via DMA in OpenGL the same way as in OpenCL, if the program is built correctly. All that should be passed is a starting address of the data (in system memory) and the instructions for how to manipulate it (OpenGL pragmas). If it's done differently, it's done incorrectly, hence why you find me bashing the old guard game programmers a lot. They're not good at parallel algorithms and they never keep up with new standards.

 

It can be done with OpenGL just as easily as OpenCL. Whether or not it is done is the responsibility of the programmers.

Well, that is good news for PCs in general. Does it work in DirectX as well then? I only read that it worked in OpenCL, hence the GPGPU thing. Nonetheless, it still needs a pass from the CPU, so it still needs a tiny bit of resources.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You don't need windows.

That was mostly because Sony was losing a lot of money per unit.

You do actually need Windows though,

If you intend to play a decent number of games.

And Windows is nicer than Linux for the user, since there are more guides, easier instructions, and people are generally used to it.

Not to forget more dev support and better drivers.

And let's face it, when you get ALL the peripherals and stuff , A console is cheaper.

Considering it has:

A (Genuine) OS that is user friendly.*

A Hdmi cable

A Very Good webcam

A Semi-Decent Headset

A Blu Ray Drive

WiFi Card w/ wireless AC (Not a shitty one either)

In reality , Xbox Ones are cheaper for what you get and is very competitive vs a pc that comes with just hardware and Keyboard and mouse.

Granted you do have 3 Free games

and 1 Tb vs a 500 Gig hard drive.

*** Based on Xbox one Shipments.

** You did make a very good build , Props.

*Yes Linux (Assume steam OS) is getting better , but it has a way to go before it gets usable as a gaming platform.

A riddle wrapped in an enigma , shot to the moon and made in China


Well, that is good news for PCs in general. Does it work in DirectX as well then? I only read that it worked in OpenCL, hence the GPGPU thing. Nonetheless, it still needs a pass from the CPU, so it still needs a tiny bit of resources.

With DX12, given how much it draws on Mantle, I'd be shocked to find otherwise, though for previous versions I can't say.

Emphasis on tiny bit (by comparison). In C++, if you want speed, passing references (the address of an object) and pointers is many times faster than passing a copy of an object, directly proportional to the size of said object.
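A tiny self-contained illustration of that point (toy code, not from any engine): the by-value call copies the whole struct, the by-reference call only passes an address.

    #include <array>
    #include <cstdio>

    struct Mesh {
        std::array<float, 1 << 16> vertices{};   // ~256 KB of data
    };

    // By value: the whole 256 KB struct is copied on every call.
    float first_by_value(Mesh m) { return m.vertices[0]; }

    // By const reference: only a pointer-sized handle is passed.
    float first_by_ref(const Mesh& m) { return m.vertices[0]; }

    int main() {
        static Mesh mesh;                        // static to keep it off the stack
        std::printf("%f %f\n", first_by_value(mesh), first_by_ref(mesh));
        return 0;
    }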

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nooo... it was insanely powerful back then.

It was cheaper to get a rack full of PS3s and hack them into a giant compute server than to buy a dedicated solution - that's how damn OP it was.

It doesn't matter how powerful it was for other purposes. It doesn't change the fact that game devs had zero knowledge about it and weren't efficient at programming for that platform.

It took time for them to figure it out, and as a result it was a "futureproof", gradually growing console for that time.


Made my own graph.

 

yJCNoyl.jpg

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


So you agree part of your complaint has little relevance.

From a gaming perspective it is irrelevant, like I told Luka. But ultimately its relevance is up to the consumer buying the console.

 

You don't understand what it means.  It's either GPU-bound OR CPU-bound, not more or less of either.

That depends on the setup. On a balanced system, BF4 is GPU-bound, but if you have a 295X2 and a Pentium dual core, it will be CPU-bound. But this is completely sidestepped. The point was that games utilize more threads in parallel now, because they are more complex, thus needing more done synchronized.

The point was that BF4 is one such game, and it runs just fine on a dual-core CPU.

 

You just can't read.

More of HASWELL cores = better; not more of shitty AMD jaguar cores.

It's neither ad hominem nor name-calling; I was making a conclusion based on what I saw in your post.

More Haswell cores than threads makes no difference.

Saying someone went full retard is ad hominem.

More Haswell cores in i5 and i7 DO make a difference; you saw the graph.  Stop lying.

It's not ad hominem; I wasn't insulting you by saying you went retard.

 

A PS4 has a fairly costly operating system and background processes running...JUST like a PC(even though they are not running the same things, obviously)...that's why games do not get all 8gb of ram...

The CPU still has to work, you know?  And games are clearly suffering from its lack of power.

The operating system is a sunk cost. I doubt it is THAT expensive. Besides, it has 2 dedicated cores, so it is not really that relevant. The game has 6 dedicated cores. That is what we are discussing. Such a division is not present on PCs, sadly. (I do wish Windows came with a gaming mode that would prioritize everything game-related when playing.)

It indeed does look like the new consoles are CPU-bound. And like I stated earlier, that sucks, but it does not change the fact that games on PS4/XBONE do not have to run as many APIs, as much middleware, background software, etc.

I guess you never heard of Process Lasso.

You have no fact; your opinion isn't a fact.  You have no proof here.
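For what it's worth, the core of what a "prioritize the game" tool does can be sketched in a few lines of Win32 (a hypothetical illustration, not Process Lasso's actual implementation): raise the process's priority class so the scheduler favors its threads over background work.

    #include <windows.h>
    #include <cstdio>

    int main() {
        // HIGH_PRIORITY_CLASS tells the Windows scheduler to prefer this process's
        // threads over normal-priority background work.
        if (SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS)) {
            std::printf("Process priority raised to HIGH_PRIORITY_CLASS.\n");
        } else {
            std::printf("SetPriorityClass failed: %lu\n", GetLastError());
        }
        return 0;
    }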

 

The proof is right there :http://www.guru3d.co...8-download.html

 

You have to elaborate on that one. This is a mod that changes the look of the game by changing the effects already in the game. That is an artistic change, not a graphical fidelity change. On the contrary, it makes the game look different from what the devs decided on. This is worse than the TVs in stores that have boosted contrast and neon colours. It looks pretty, but it's completely distorted compared to the source.

Changing fog, people density, light intensity, etc. does not improve the graphics quality; it just mods the game to look and behave differently than designed.

However, the question was about poor optimization. The Worse mod does not change that, nor prove that the game is poorly optimized.

Read the changelogs.

We need PROOF; not your opinions.

I want proof too, but I cannot show any that is not publicly announced yet. However, games like Shadow of Mordor, Watch Dogs, Titanfall and BF4 can all use more than 2 GB of VRAM @ 1080p; the first two even more than 3 GB. Again, it will take some time for newer games to take advantage of the massive increase in available VRAM, just like it takes time before a new version of DirectX is used in games.

You did not answer my question: Either way, more textures at the same time = higher vram usage. I don't assume you would disagree with that statement?

Then your words here have no value.

By the way, I do have Battlefield 4, and it never actually goes above 2 GB in usage.

And just because they can use more memory, that does not mean not having it will make the game unplayable.

I don't need to answer your irrelevant question.

You can't read, that's what.

I can read, but you write gibberish. But let me ask you then: if a console gets 10x the amount of VRAM available for the game, what do you think will happen with textures in games designed for the new console?

No, you can't read; at least you failed to read there.

Nothing meaningful (other than maybe lowering the requirement for texture caching) would happen, because consoles do not have the power elsewhere to take advantage of 10x more RAM.

Game development isn't all programming...etc. and there is nothing to suggest that it wasn't made for last gen console in first place(and it was actually released on PS3).

Crysis was revolutionary in graphical fidelity.  Watch-doges, however, is nothing.

Actually, building a graphics engine/game engine is pretty much the first thing you start working on. They had to gimp the game a lot to work on PS3, so it is quite clear it was not made for PS3. Either way, we can only choose to believe what has been officially said by Ubisoft, and they said what I am repeating on this matter.

I bet Ubisoft had the game in pre-production for years and counted it as development.

You can't choose to believe Ubisoft with all this crap going on; unless you like to be fooled by them.

 

I agree on Crysis, but disagree on Watch Dogs. No other GTA-style game looks as good, neither in the effects (weather, sunlight through trees, blinding as you exit a tunnel in daylight, etc.) nor the textures (both in quality and quantity, as in the number of unique textures shown on the screen, instead of just the same few textures copy-pasted all over).

Watch_doges is simply poorly optimized (ever heard of something called texture caching? And using the same texture for things is fine if done properly). GTA4 with the iCEnhancer mod is a lot better-looking.

You failed to read again.

You were arguing for amount of cores in consoles; now they are less relevant?  Stop shooting yourself in the foot if you want to make an argument.

Stop stating the obvious; you are contradicting yourself with it.

That was in response to your obviously false statement: "and even worse, the GPUs had not evolved enough over the 5 years"

 

This was your quote: "Unfortunately, PC can have as many cores as it needs." In that regard, no, having 48 cores does not matter. Software uses threads of data to be processed. Having more cores than threads has no benefit for that piece of software. Having FEWER cores than threads can be a problem, especially if the threads run in parallel and need data from each other during processing. It seems like you do not understand how CPUs work. I think that is where the confusion lies.

Linus said it himself that Nvidia postponed their top-tier cards one generation (probably because of AMD not following suit). GPUs should have been better by now. The 900 series (970/980) is nice proof: the 980 is not better than the 780 Ti for the most part (except the added VRAM), so yeah, Nvidia is holding back.

Joke's on you, it does matter, because I can run more instances of the game (or other games) if I wanted.

You do not understand how CPUs work. Just because something uses more threads than the number of CPU cores, that does not mean the CPU is going to be heavily slowed down.

 

Ultra quality does not necessarily mean "pointlessly large."

It being pointlessly large is a sign that it is poorly done.

Please elaborate on "pointlessly large". Higher quality textures are higher quality, and thus also larger in size. I don't get your point here. Define "pointlessly large", please. Better graphics are not pointless, and will require more of the hardware.

An example would be using an 8K texture for something at max view distance.

 

You are not making a point here.

I wonder where you think texture is stored; did you think texture can only be stored in VRAM?

High quality texture does not mean you need to store needlessly large amount of texture in vram.

All textures being shown on screen are stored in VRAM; otherwise your FPS would collapse if textures shown on screen had to be read from system RAM. Let me make my point clearer then:

Game A has 50 unique textures on the screen, repeated all over.

Game B has 100 unique textures, at the same resolution as A, on the screen, repeated as necessary.

Which game would use more VRAM? In my example, B would use twice as much for textures. That is the point about the number of unique textures, and the reason Watch Dogs uses so much. Every facade, every entity has its own unique texture, so the diversity of textures shown in the same shot is much larger than in any other open world game on the market.

You don't understand how it works.  Read about Texture Caching.

The game does not simply take all textures and dump them into VRAM.

And reusing textures without them looking out of place is part of optimization; there's no excuse for poor optimization.
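To make the texture caching point concrete, here is a toy LRU-style residency sketch (my own simplified example, not any engine's real streaming system): only the most recently used textures stay "resident", and older ones are evicted when the budget is exceeded, instead of everything being dumped into VRAM at once.

    #include <cstdio>
    #include <list>
    #include <string>
    #include <unordered_map>

    class TextureCache {
        size_t budget_mib_, used_mib_ = 0;
        std::list<std::pair<std::string, size_t>> lru_;                  // front = most recently used
        std::unordered_map<std::string, decltype(lru_)::iterator> map_;
    public:
        explicit TextureCache(size_t budget_mib) : budget_mib_(budget_mib) {}

        void request(const std::string& name, size_t size_mib) {
            auto it = map_.find(name);
            if (it != map_.end()) {                       // already resident: just mark as recently used
                lru_.splice(lru_.begin(), lru_, it->second);
                return;
            }
            while (used_mib_ + size_mib > budget_mib_ && !lru_.empty()) {
                auto& victim = lru_.back();               // evict the least recently used texture
                std::printf("evict %s (%zu MiB)\n", victim.first.c_str(), victim.second);
                used_mib_ -= victim.second;
                map_.erase(victim.first);
                lru_.pop_back();
            }
            lru_.emplace_front(name, size_mib);           // "upload" the texture
            map_[name] = lru_.begin();
            used_mib_ += size_mib;
            std::printf("load %s (%zu MiB), resident %zu/%zu MiB\n",
                        name.c_str(), size_mib, used_mib_, budget_mib_);
        }
    };

    int main() {
        TextureCache cache(64);                           // pretend 64 MiB texture budget
        cache.request("facade_a", 32);
        cache.request("facade_b", 32);
        cache.request("facade_c", 32);                    // forces eviction of facade_a
        cache.request("facade_b", 32);                    // still resident, no re-upload
        return 0;
    }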

 

Consoles are only relevant as something poisonous to gaming as a whole.  

More "AAA" games made for consoles is a terrible thing; it means gaming as a whole is retarded for consoles.

There are still companies making games for either only PC or with PC as priority.

Considering consoles pay for most of the development of most AAA games, I deeply disagree. Unless you want worse games at a higher cost? Look at Ubisoft's market: less than 1/3 of sales are on PC. Actually, it might be less than 1/4 of all sales.

Since the current gen consoles are now x86-based and use PC GPUs, I don't really see the problem. In fact, the new consoles are better for PC, as devs can make the games for PC, then downscale them for consoles. Several devs claim to be doing so atm. But development is expensive; don't assume a dev would spend half the budget to make the game prettier for less than 25% of all the consumers.

It was a business decision made by peasant companies to make exclusives and/or timed exclusives to SELL CONSOLES.

I won't be getting worse games at a higher cost if consoles all died right now; those peasant companies would either cut their gaming divisions or put money into making GAMES instead of making potatoes.

There would probably be less wasteful advertisement, which is nice.

That still has nothing to do with OC.  That is something you cannot apply your poor opinion to everyone else in.  It's getting extra performance; not that stock performance isn't shitting on consoles already.

And it still has no relevance for consoles, as they cannot OC. It still is not a valid point, as most gamers might not care or want to OC. Do you have any statistics about how many OC?

 

You are just dodging the burden of proof here.

You make a claim there are several games that can use more than 2gb of vram?  Post them.

Examples further up

Still no proof and you are just grasping at straws here by biting on the few games that have poor optimization.

GTAV is an open world and it can run on 256mb vram, Crysis 2 is a corridor shooter, it does not run on 256mb vram.

Yes, and Crysis 2 looks a lot better than GTA V on PS3. I think most people would lose their minds if GTA V looked on PC and PS4 the way it does on PS3. Your example does not disprove my point: open world games use more VRAM than closed corridor shooters.

It does prove you wrong; because you made a general statement that isn't correct.

 

2 cores is not too little yet.  If you are going to whine about the Pentium G3258, keep in mind that an AMD build based on the Athlon X4 860K is about the same price (or less).

Who said anything about Athlon CPUs? Haswell has twice as many ALUs and AGUs per core as that Athlon (afaik). We've already gone through benches showing BF4 being crippled by that Pentium, even though the game is primarily GPU-bound on normal balanced builds. Why would anyone buy an i5 or i7 instead of that chip then?

As far as I can tell from the source, all games run a few frames faster on the Athlon, but obviously any high-end multithreaded game would get bottlenecked by both of them.

BF4 is only somewhat bottlenecked by the Pentium G3258; it still manages to keep 60 fps for the most part in multiplayer.

Similar things happen with a QUAD-core Athlon.

Why would people buy i5 or i7?  Because they want a higher framerate for their higher refreshrate monitor or have framerate always kept above 60.

They don't run faster on the Athlon.

The point is that more cores != better.

 

You know, just because you missed the point, that does not mean you didn't shoot yourself in the foot.

The point here is that PC has low-level API already and that it works just fine.  Mantle now and Dx12 later.

Consoles also have low-level API and it's already showing pathetic 900p 30fps games.

You seem to assume that all low-level APIs are equally low level, equally efficient, and that both eliminate the need for a lot of APIs, middleware, drivers, etc. That is an assumption you don't have the facts to conclude on. Furthermore, only Mantle is a low-level API on the market, and it is specifically written to lower the overhead on the CPU and to be able to make more draw calls than DirectX. Few games use it, and they do not seem to be directly programmed for the GCN architecture; otherwise we would see a huge FPS difference.

However, none of this disproves my point that both current gen consoles can punch above their weight right now on the GPU side, compared to PC:

I didn't assume anything.

The assumption was that if console API was as meaningfully "low" as you say, it would have made a difference in games already.

Yet the facts are there: BF4 (a game that scales properly across 8 cores) runs terribly on consoles.

 


And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec - Oles Shishkovstov 4A games (metro): http://www.eurogamer...i-platform-game

 Don't tell me you believe that bullshit.

They are trying to SELL you the game and platform(which lets more people buy their games).

And there is still poor AMD drivers for linux.

The point is that it's free and SteamOS/Steam box will eventually get enough attention.

And windows 8.1 is cheap.

Crappy AMD drivers or crappy Nvidia drivers are in themselves a good argument against Linux. The point is, you have to factor in all the costs of the PC to make it do what a console can do.

And it does not matter.

The point is, I can factor in all the costs to replicate all the functions of a console and still end up in the same price range as just the platform.

 

Here you must mock whatever proves you wrong. 

Btw, you can turn off DoF.

That does not change all the other crap they did. The point is that the Worse mod does not reveal "hidden" graphics effects supposedly cut away because of the consoles. The only effect being brought out is the glitchy headlight shadows, which have never been demoed.

Read changelog again.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


You do actually need Windows though,

If you intend to play a decent number of games.

And Windows is nicer than Linux for the user, since there are more guides, easier instructions, and people are generally used to it.

Not to forget more dev support and better drivers.

And let's face it, when you get ALL the peripherals and stuff , A console is cheaper.

Considering it has:

A (Genuine) OS that is user friendly.*

A Hdmi cable

A Very Good webcam

A Semi-Decent Headset

A Blu Ray Drive

WiFi Card w/ wireless AC (Not a shitty one either)

In reality , Xbox Ones are cheaper for what you get and is very competitive vs a pc that comes with just hardware and Keyboard and mouse.

Granted you do have 3 Free games

and 1 Tb vs a 500 Gig hard drive.

*** Based on Xbox one Shipments.

** You did make a very good build , Props.

*Yes Linux (Assume steam OS) is getting better , but it has a way to go before it gets usable as a gaming platform.

Your opinion simply considers the numbers on Steam to not be decent.

You do realize that an Xbox One with the webcam is $100 more?

Lower the GPU to an R7 260X (not recommended due to it being underpowered like the Xbox... and not very cost-efficient either) and you can get all the webcam/semi-decent headset etc. you want for around the same price.

Wireless?  It's cheaper and better to just buy a cable.

An HDMI cable is about $2-3.

 

So it's still in the same price range even with a bunch of peripherals, and when you factor in games and the online subscription, PC is still cheaper by far. That's not even counting the utility of a PC.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...


me: cool!

 

*continues to game on 5000 GFLOP PC*

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


Considering it has:

A (Genuine) OS that is user friendly.*

A Hdmi cable

A Very Good webcam

A Semi-Decent Headset

A Blu Ray Drive

WiFi Card w/ wireless AC (Not a shitty one either)

In reality , Xbox Ones are cheaper for what you get and is very competitive vs a pc that comes with just hardware and Keyboard and mouse.

Granted you do have 3 Free games

and 1 Tb vs a 500 Gig hard drive.

*** Based on Xbox one Shipments.

** You did make a very good build , Props.

*Yes Linux (Assume steam OS) is getting better , but it has a way to go before it gets usable as a gaming platform.

I called it right here (almost at least):

 

Oh God not this shit again.

In before you get proven wrong and resort to "well, most people don't know how to build it!".

No matter how many times you get proven wrong or how many benefits they throw at PC gaming there will always be one or two things the console people scream over. It's the same as the Mac vs PC argument. No matter how good you make the PC, even if the PC has 10 benefits, if it lacks 1 feature the Mac has then the Mac people will see it as a victory.

 

Oh your PC has better controls, much faster CPU and GPU, free online, cheaper games and can be used for far more things? Fuck that! The console is better because it comes with a HDMI cable!

The Xbone doesn't have anywhere near wireless AC, by the way, and the PS4 is even worse (it doesn't support 5 GHz WiFi at all).

 

But let me guess, even if someone does post a build which matches the Xbone in price and specs, you will start whining about how the PC isn't as small. The problem is that you don't want a fair comparison. You don't want "here is the Xbone and here is what you get for roughly the same price on a PC". You want "here is the Xbone, and here is how much it would cost to build a PC that is exactly the same". That's not how you compare stuff. When you compare, you look at the price of both things and then see what benefits and drawbacks each option has. You are completely okay with the Xbone having a truckload of drawbacks, but if you find a single one on the PC you see that as a victory for the console.

 

And that's why I don't bother with console arguments or Mac vs PC arguments anymore. I have seen this tactic used a billion times before and it's just a waste of time to argue against it.


What on earth is that graph trying to tell me? Those numbers are meaningless... is that GFLOPS?

One graph has Gflops and the other is how many characters with cloth simulation they could get on screen.

 

RTX2070OC 


Prune your quotes, those damn walls of text are horrendous

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


Soooo, what is Ubi telling us here? They figured out the magic of GPU acceleration? Why haven't they done that in the first place?

 

I don't understand what this is about.


That seemed rather one sided.

Computing enthusiast. 
I use to be able to input a cheat code now I've got to input a credit card - Total Biscuit
 


Don't know where you are, but they are $400 in the U.S.

Consoles punch above their weight, that's why PS4 runs BF4 at 900p medium/high settings, right?

 

Except your "facts" are not facts.

PCPartPicker part list / Price breakdown by merchant


CPU: Intel Pentium G3258 3.2GHz Dual-Core Processor ($69.99 @ Amazon)

Motherboard: Gigabyte GA-H81M-H Micro ATX LGA1150 Motherboard ($51.34 @ Amazon)

Memory: G.Skill Value Series 8GB (1 x 8GB) DDR3-1333 Memory ($69.99 @ Newegg)

Storage: Seagate Constellation ES 1TB 3.5" 7200RPM Internal Hard Drive ($50.00 @ Amazon)

Video Card: HIS Radeon R9 280 3GB IceQ OC Video Card ($169.99 @ Newegg)

Case: BitFenix Merc Beta (Black) ATX Mid Tower Case

Power Supply: Corsair Builder 430W 80+ Bronze Certified ATX Power Supply ($19.99 @ Newegg)

Keyboard: Cooler Master CM Storm Devastator Gaming Bundle Wired Gaming Keyboard w/Optical Mouse ($27.66 @ NCIX US)

Total: $458.96

Prices include shipping, taxes, and discounts when available

Generated by PCPartPicker 2014-10-19 05:05 EDT-0400

A tiny bit more than a PS4/Xbox One with one year of subscription (required to play online, pure peasantry) ($450). It could go lower with an R9 270, but then it would not be as cost-effective (though still better than a PS4).

Plays BF4 on ultra at 1080p at around 60 fps.

 

This lacks a Blu-ray player and WiFi capability (unless it's built in).

 

And if you argue that this build is strictly for gaming: I don't see a $90 copy of Windows, unless you're promoting piracy or, even worse, asking the user to install a version of Linux and make ends meet that way.

 

Also, those peripherals are not wireless, so if we're trying to be as objective as possible here, we would remove the MSRP cost of both peripherals from the system cost: $458.96 - $27.66 = $431.30, and for the PS4, $399.99 - $49.99 = $350.

 

Adding the cost of Windows 7 Home Premium: http://www.amazon.com/Windows-Premium-System-Builder-Packaging/dp/B00H09BB16/ref=sr_1_3?ie=UTF8&qid=1413875732&sr=8-3&keywords=windows+xp

 

$431.30 + $96.89 = $528.19

 

So $528.19 for a PC capable of playing modern releases vs a PS4 for $350. Even without the cost of Windows you're looking at a price premium.

 

I realize that console gaming is looked down upon in this community but let's at least try and be objective about it. 

