Regarding The Witcher 3 on Consoles


[removed]

I think we're going to have to agree to disagree on this one: for one, I feel this has become personal, and two, it wasn't an argument to begin with. If you cannot see that real-world VR games aren't going to be able to run on consoles such as the PS4, then I'm really not sure what to say to move the conversation forward. And third, I don't care about this argument enough to rebut what you have written. It was well written, though, and I think the future will tell who is right. It was nice debating with you.

FULL DISCLOSURE: I am a PC gamer and am playing The Witcher 3 on a PC. I love the PC, but I have a long history with PlayStation and still love the brand, though I recognize the superiority of the PC as a platform and therefore choose it.

 

As the 1.07 patch is released for The Witcher 3, I cannot help but ask myself a few questions about the differences between the Xbox One and PlayStation 4. Regardless of how you feel, your fanboy allegiances, or the hype you've heard, the Xbox and the PlayStation are, this time around, largely the same box. They differ, however, in terms of power: the Xbox One has only 768 stream processors while the PS4 has 1152, and the Xbox One has 8GB of DDR3 memory while the PS4 has 8GB of GDDR5 memory.

 

Given these two facts, I simply couldn't understand why the PS4 can't perform the way one would think it could. Then I began thinking that it's the same reason Planetside 2 doesn't run well on the PS4: the low clock speed of its processor, at only 1.6GHz. Both The Witcher and Planetside 2 demand a lot of CPU power because of their large worlds and the many AI agents and other players running around.
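For scale, the gap described above can be sanity-checked with quick arithmetic. The bandwidth figures below are the commonly cited theoretical peaks for each console's main memory, not measurements from this thread:

```python
# Publicly reported figures (theoretical peaks, assumed for illustration)
ps4_sp, xb1_sp = 1152, 768      # GCN stream processors
ps4_bw, xb1_bw = 176.0, 68.3    # GB/s: PS4 GDDR5 vs Xbox One DDR3 main memory

sp_advantage = ps4_sp / xb1_sp - 1   # fraction more shaders on PS4
bw_advantage = ps4_bw / xb1_bw - 1   # fraction more main-memory bandwidth

print(f"PS4 shader advantage:    {sp_advantage:.0%}")   # 50%
print(f"PS4 bandwidth advantage: {bw_advantage:.0%}")
```

On those paper numbers the PS4 has 50% more shaders, which is exactly the ratio argued over later in the thread.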

 

 


The Xbox One can connect with a Windows 10 PC?

Computing enthusiast.
"I used to be able to input a cheat code; now I've got to input a credit card" - Total Biscuit
 


Okay, you do understand that the PS4 is KNOWN to be more powerful, but in terms of magnitude it's maybe 5% better. With such a small margin, especially considering the disadvantage Sony faces on the software side of things, overall actual performance is fairly even.

 

Regardless, what is your actual statement?

Please spend as much time writing your question as you want me to spend responding to it. Take some time, and explain your issue, please!

Spoiler

If you need to learn how to install Windows, check here:  http://linustechtips.com/main/topic/324871-guide-how-to-install-windows-the-right-way/

Event Viewer 101: https://youtu.be/GiF9N3fJbnE

 


The CPU architecture in both consoles is just kind of poor in the first place. Both consoles run the CPU at around 1.6GHz.

 

Edit: the Xbox One's CPU runs 0.15GHz higher than the PS4's.

Edited by Godlygamer23

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Erm, they both have DDR3 memory, lol. GDDR5 is graphics memory, not random-access memory.

 

 

i7-6700K  Cooling: Deepcool Captain 240EX White GPU: GTX 1080 Ti EVGA FTW3 Mobo: ASRock Z170 Extreme4 Case: Phanteks P400S TG Special Black/White PSU: EVGA 850W GQ RAM: 64GB (3200MHz 4x16 Corsair Vengeance RGB) Storage: 1TB Seagate Barracuda, 240GB SanDisk SSD Plus, 480GB OCZ Trion 150, 1TB Crucial NVMe
(Rest of Specs on Profile)


Erm, they both have DDR3 memory, lol. GDDR5 is graphics memory, not random-access memory.

The PS4 uses GDDR5 RAM as its main memory which is shared between the CPU and GPU portions of the APU. The Xbox One uses DDR3 as shared memory between the CPU and GPU portions of the APU.



FULL DISCLOSURE: I am a PC gamer and am playing The Witcher 3 on a PC. I love the PC, but I have a long history with PlayStation and still love the brand, though I recognize the superiority of the PC as a platform and therefore choose it.

 

As the 1.07 patch is released for The Witcher 3, I cannot help but ask myself a few questions about the differences between the Xbox One and PlayStation 4. Regardless of how you feel, your fanboy allegiances, or the hype you've heard, the Xbox and the PlayStation are, this time around, largely the same box. They differ, however, in terms of power: the Xbox One has only 768 stream processors while the PS4 has 1152, and the Xbox One has 8GB of DDR3 memory while the PS4 has 8GB of GDDR5 memory.

 

Given these two facts, I simply couldn't understand why the PS4 can't perform the way one would think it could. Then I began thinking that it's the same reason Planetside 2 doesn't run well on the PS4: the low clock speed of its processor, at only 1.6GHz. Both The Witcher and Planetside 2 demand a lot of CPU power because of their large worlds and the many AI agents and other players running around.

 

 

Honestly, I haven't looked into it at all, but most likely they just spent more time optimizing on the Xbone. With consoles it all comes down to optimization for the hardware, as it's only one set of hardware.

 

#xbone4life

 

...forgot I'm on LTT. #intel+nvidia+android4lyfe


Erm, they both have DDR3 memory, lol. GDDR5 is graphics memory, not random-access memory.

The PS4 has 8GB of GDDR5 as system RAM as well as VRAM, whereas the Xbox One has 8GB of DDR3. Check Wikipedia or their respective sites.


Okay, you do understand that the PS4 is KNOWN to be more powerful, but in terms of magnitude it's maybe 5% better. With such a small margin, especially considering the disadvantage Sony faces on the software side of things, overall actual performance is fairly even.

 

Regardless, what is your actual statement?

I'm not sure where to begin. I'm not sure where the "5%" number comes from; the PS4 has 50% more stream processors at about the same clock speed as the Xbox One. I'm not going to argue software, because that is unknowable unless either of us is a developer working with both platforms, which we aren't.

Not to mention the fact that the PS4 uses GDDR5, which undeniably helps graphics speed, as evidenced by the same jump in PC graphics cards.


I'm not sure where to begin. I'm not sure where the "5%" number comes from; the PS4 has 50% more stream processors at about the same clock speed as the Xbox One. I'm not going to argue software, because that is unknowable unless either of us is a developer working with both platforms, which we aren't.

Not to mention the fact that the PS4 uses GDDR5, which undeniably helps graphics speed, as evidenced by the same jump in PC graphics cards.

First off, the Xbox One has the higher CPU clock frequency: it has eight Jaguar cores clocked at 1.75GHz. The Xbox One also has an ESRAM buffer, which offers MUCH higher bandwidth than even GDDR5. So while the PS4 might have more stream processors, it doesn't mean jack diddle if you STILL can't feed those graphics cores. The PS4's GPU portion outpaces its CPU portion by a fairly large margin.

 

With all that said, the Xbox One has a theoretical 1.31 TFLOPS and the PS4 about 1.84 TFLOPS. In an absolute number crunch you're going to notice a 25% difference, but in reality, due to Microsoft's advantage in software (DX11.2), it understands the hardware MUCH better and is able to squeeze more out of it.
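The TFLOPS gap can be framed in two directions, which is where both "25%" and larger figures come from in this thread; using the 1.31/1.84 TFLOPS numbers quoted above:

```python
xb1_tflops, ps4_tflops = 1.31, 1.84  # theoretical peaks quoted in the thread

ps4_over_xb1 = ps4_tflops / xb1_tflops - 1   # how much faster the PS4 is on paper
xb1_under_ps4 = 1 - xb1_tflops / ps4_tflops  # how much slower the Xbox One is

print(f"PS4 is {ps4_over_xb1:.0%} faster on paper")       # ~40%
print(f"Xbox One is {xb1_under_ps4:.0%} slower on paper")  # ~29%
```

The same pair of numbers supports a "~40% faster" claim and a "~29% slower" claim, so which percentage you quote depends on the direction of comparison.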

 

I do not disagree that the PS4 is more powerful; it is. What I am saying is that it doesn't just run away with the trophy because of the numbers. "Look guys, this one goes up to 11!"

 

What I am asking, though, is: what is this thread about? You started out talking about The Witcher 3, and now we're talking about the theoretical performance of the machines.

 

Are you saying that you feel like the consoles underperform?


 


First off, the Xbox One has the higher CPU clock frequency: it has eight Jaguar cores clocked at 1.75GHz. The Xbox One also has an ESRAM buffer, which offers MUCH higher bandwidth than even GDDR5. So while the PS4 might have more stream processors, it doesn't mean jack diddle if you STILL can't feed those graphics cores. The PS4's GPU portion outpaces its CPU portion by a fairly large margin.

With all that said, the Xbox One has a theoretical 1.31 TFLOPS and the PS4 about 1.84 TFLOPS. In an absolute number crunch you're going to notice a 25% difference, but in reality, due to Microsoft's advantage in software (DX11.2), it understands the hardware MUCH better and is able to squeeze more out of it.

I do not disagree that the PS4 is more powerful; it is. What I am saying is that it doesn't just run away with the trophy because of the numbers. "Look guys, this one goes up to 11!"

What I am asking, though, is: what is this thread about? You started out talking about The Witcher 3, and now we're talking about the theoretical performance of the machines.

Are you saying that you feel like the consoles underperform?

 

I'm sorry, but I cannot let this one go: the same argument was made by Microsoft about the Xbox One's ESRAM. Its size is only 32MB, and though yes, it is faster than GDDR5, it's only faster by some 30GB/s while being vastly smaller. Many developers have stated that they prefer the GDDR5 memory interface because it is nearly as fast and there is much more of it. Your point that Microsoft somehow knows the software better and is therefore able to harness it "MUCH" better is nonsense, a made-up fact: neither of us is a developer, and no one in the development community has commented either way on that topic.
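The "30 some GB/s" claim lines up with the commonly cited theoretical peaks (assumed here: 176 GB/s for the PS4's GDDR5, ~204 GB/s for the Xbox One's ESRAM with simultaneous read/write):

```python
gddr5_bw = 176       # GB/s, PS4 unified GDDR5 (commonly cited peak)
esram_bw = 204       # GB/s, Xbox One ESRAM read+write peak (commonly cited)
esram_size_mb = 32   # ESRAM capacity
gddr5_size_gb = 8    # PS4 unified memory capacity

print(f"ESRAM is {esram_bw - gddr5_bw} GB/s faster at peak")           # 28 GB/s
print(f"but holds {gddr5_size_gb * 1024 // esram_size_mb}x less data")  # 256x
```

A ~16% bandwidth edge over a pool 256 times larger is the trade-off the developers quoted in the video are weighing.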

 

Console world or not, when you have 50% more stream processors (and correspondingly more texture mapping units), your graphics processor, and therefore your real-world graphics performance, is going to be much better, especially when paired with GDDR5 instead of DDR3, which gives much higher bandwidth and speed. All of this is evidenced by the fact that games consistently come to the PS4 at a native 1080p resolution and run at higher frame rates.

 

But my point in all of this is that the consoles are vastly underpowered this generation. Not necessarily in their graphics cores, but their CPUs are a disaster: 1.6 vs 1.75GHz, when we have phone processors moving faster than that now. Yes, phones only boost to those speeds for short periods, but my point remains that for some reason Sony and Microsoft didn't think CPUs were going to be that important. In fact, when SOE talked about porting Planetside 2 to the PS4, they said quite bluntly that the issue wasn't the GPU but the CPU, because of the large, open worlds this generation. Given their GPUs and large pools of memory, it's sad to see their CPUs holding them so far back. It almost makes you want to laugh or cringe when they speak of VR: no matter how well you program for either console, you aren't going to produce the resolution needed for a smooth VR experience with 768 or 1152 SPs, especially with the DDR3 memory feeding graphics on the Xbox One. It seems that this console generation, unless clocked significantly higher by some future firmware update, is just a stopgap, released either in the hope that streaming would become more prevalent, or so that a higher-spec console could follow once the hardware and software industries settle on 4K as a viable selling tool for new consoles and games, much like the PS3 and Xbox 360 were for 1080p displays.

 

Graphics Processors: https://youtu.be/0ruo84asvQo?t=5m13s

ESRAM Point: https://youtu.be/0ruo84asvQo?t=9m32s

Planetside 2 Point: http://gearnuke.com/planetside-2-dev-ps4-real-bottlenecks-cpu-side-aiming-60-fps-ps4/


I'm sorry, but I cannot let this one go: the same argument was made by Microsoft about the Xbox One's ESRAM. Its size is only 32MB, and though yes, it is faster than GDDR5, it's only faster by some 30GB/s while being vastly smaller. Many developers have stated that they prefer the GDDR5 memory interface because it is nearly as fast and there is much more of it. Your point that Microsoft somehow knows the software better and is therefore able to harness it "MUCH" better is nonsense, a made-up fact: neither of us is a developer, and no one in the development community has commented either way on that topic.

Console world or not, when you have 50% more stream processors (and correspondingly more texture mapping units), your graphics processor, and therefore your real-world graphics performance, is going to be much better, especially when paired with GDDR5 instead of DDR3, which gives much higher bandwidth and speed. All of this is evidenced by the fact that games consistently come to the PS4 at a native 1080p resolution and run at higher frame rates.

But my point in all of this is that the consoles are vastly underpowered this generation. Not necessarily in their graphics cores, but their CPUs are a disaster: 1.6 vs 1.75GHz, when we have phone processors moving faster than that now. Yes, phones only boost to those speeds for short periods, but my point remains that for some reason Sony and Microsoft didn't think CPUs were going to be that important. In fact, when SOE talked about porting Planetside 2 to the PS4, they said quite bluntly that the issue wasn't the GPU but the CPU, because of the large, open worlds this generation. Given their GPUs and large pools of memory, it's sad to see their CPUs holding them so far back. It almost makes you want to laugh or cringe when they speak of VR: no matter how well you program for either console, you aren't going to produce the resolution needed for a smooth VR experience with 768 or 1152 SPs, especially with the DDR3 memory feeding graphics on the Xbox One. It seems that this console generation, unless clocked significantly higher by some future firmware update, is just a stopgap, released either in the hope that streaming would become more prevalent, or so that a higher-spec console could follow once the hardware and software industries settle on 4K as a viable selling tool for new consoles and games, much like the PS3 and Xbox 360 were for 1080p displays.

Graphics Processors: https://youtu.be/0ruo84asvQo?t=5m13s

ESRAM Point: https://youtu.be/0ruo84asvQo?t=9m32s

Planetside 2 Point: http://gearnuke.com/planetside-2-dev-ps4-real-bottlenecks-cpu-side-aiming-60-fps-ps4/

 

Phones with super-fast processors are just a fad, surely? I truly don't know why some Android phones in particular have octa-core processors; what kind of back-breaking software needs all of that power?

 

The iPhone, in comparison, has a dual-core, low-clocked CPU and doesn't need anything faster. The two consoles don't need hugely powerful CPUs; the graphics are meant to be good, not glorious. Remember, the console has to be quieter than super-duper CPUs will allow. I believe there is only one fan in the Xbox One; not sure about the PS4. I love my Xbox One and 360. I use them to play with friends, as none of them have really fast PCs like mine (comparatively), and will continue to do so.

 

And I think people are kidding themselves if they think games will ever run at 4K on either console; they can play 4K Blu-rays and that's about it.

CPU: AMD 7800X3D Motherboard: NZXT B650E RAM: 32GB 5600 30-CL Corsair Vengeance DDR5 GPU: MSI Gaming X Trio RTX 2070 PSU: Corsair RM850i Monitor: Samsung 27" 4K thing Cooling:Noctua Chromax Black NH-D15: Case: NZXT H510 Black

 

 

 

 


I believe there is only 1 fan in the Xbox One, not sure about the PS4.

The PS4 has a single blower-style fan that cools both the heatsink and the PSU.

Intel i5-3570K / Gigabyte GTX 1080 / Asus PA248Q / Sony MDR-7506 / MSI Z77A-G45 / NH-D14 / Samsung 840 EVO 256GB + Seagate Barracuda 3TB / 16GB HyperX Blue 1600MHz / 750W PSU / Corsair Carbide 500R

 


It almost makes you want to either laugh or cringe when they speak of VR, no matter how well you program for either console, you aren't going to produce the resolution needed for a smooth VR experience with 768 or 1152 SPs or especially with the DDR3 memory for graphics in the Xbox One.

Then you are missing the point: it's not about playing The Witcher in VR, and it's not about 3440x2560 like the Oculus Rift.

The kind of games that will be on Project Morpheus will be designed around the spec, at either 60 or 120fps. And it's not like they are just making it up; they have several real-time VR demos out now, and there are videos of them.


 


Then you are missing the point: it's not about playing The Witcher in VR, and it's not about 3440x2560 like the Oculus Rift.

The kind of games that will be on Project Morpheus will be designed around the spec, at either 60 or 120fps. And it's not like they are just making it up; they have several real-time VR demos out now, and there are videos of them.

 

Forgive me for the confusion, but I was not talking about The Witcher 3 in VR. I meant that running any game at 120FPS, which is what the Oculus Rift's creators recommend for a smooth VR experience (to avoid detachment and motion sickness), while combining high levels of texture detail or even multiplayer combat, will be impossible with the technical specifications I outlined above. On the PC side they are having a hard time doing that on GTX 980 Tis and Fury Xs, with eight-core Intel CPUs clocked much higher than the PS4's 1.6GHz Jaguar and with many more stream processors than 1152, often at twice the frequency.

 

The Kinect on the original Xbox 360 had games as well, and they were almost universally terrible, not because developers wanted to commit business suicide or hated the platform, but because the hardware couldn't handle that level of computation (motion tracking is VERY CPU intensive) alongside the main game. Or take Move on the PS3: I love that console, but I knew, and so did everyone else, that if the PS3 couldn't push 1080p at playable frame rates WITHOUT Move, it wasn't going to push 1080p with it.

 

To your point about the Rift working at 3440x2560, I'm not sure where you got that number and seriously doubt it's a real resolution. According to their own site, the Oculus Rift is going to have a display resolution of 2160x1200. For that it still recommends a GTX 970 or higher along with a top-tier quad-core i5 clocked far above 1.6GHz or 1.75GHz. Both of those systems are far more powerful than any PS4 or Xbox One, WITH OR WITHOUT hardware optimization on the part of developers for the specific consoles.
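For scale, the Rift's 2160x1200 panel versus a 1080p display works out to exactly 25% more pixels:

```python
rift = 2160 * 1200      # Oculus Rift CV1 panel, both eyes combined
full_hd = 1920 * 1080   # a 1080p panel (Project Morpheus-class)

print(f"Rift: {rift:,} px, 1080p: {full_hd:,} px")
print(f"Rift renders {rift / full_hd - 1:.0%} more pixels")  # 25%
```

That per-frame pixel cost compounds with the 90-120Hz refresh targets, which is why the recommended PC specs sit so far above the console hardware.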

Link to comment
Share on other sites

Link to post
Share on other sites

 

To your point about the Rift working at 3440x2560, I'm not sure where you got that number and seriously doubt that's its a real resolution. According to their own site the Occulus Rift is going to have a display with a resolution at 2160x1200.

Completely correct; it was just an asspull. I couldn't remember the real resolution offhand, I just knew it was one I considered odd at the time. I thought it was higher than it actually is, but my point was that it's significantly higher than PM's 1080p: less so than I thought, but still 25% more pixels.

which is what the Occulus Rift creators recommend for a smooth VR experience (to avoid detachment and motion sickness),

 

Actually, Carmack at Oculus said 90Hz minimum, though I think he said 120 would be recommended. Sony says 120 to the user: PM games will be allowed to develop at 60 or 120Hz, and titles will be motion-doubled to 120 when they aren't hitting it. This is not ideal, but it's preferable to direct screen refreshes at 60Hz or falling below the 120Hz target.

I was not talking about The Witcher 3 with VR, I mean to say that to run any game at 120FPS,

 

I was just using The Witcher 3 as an example. OR will have people modding PC games to fit the Rift, as well as new games targeting it specifically; PM will only have titles released specifically for its platform. The distinction is that it won't be trying to fit current games to a higher standard, instead building new titles around a 1080p, 60 or 120Hz frame.

meanwhile combining high levels of textures or even multiplayer combat will be impossible with the technical specifications that I outlined above.

"High levels of textures", while a relative term, would probably be out the window in most people's opinion here even at 30 FPS on console hardware, so I don't see the reason to make the distinction. I disagree with online multiplayer being impossible; sure, it's tougher, but Call of Duty: AW, for example, runs 60 FPS multiplayer without any out-of-the-ordinary hitches.

with eight core Intel CPUs clocked much higher than a PS4's Jaguar 1.6GHz processor

 

This is exactly where consoles benefit from their single-spec platform: developers can load in the statistics tools, see the problem areas and where certain threads are going over budget, and those optimizations will apply directly to every single console running a PM game. A 16ms frame is tough on that hardware and an 8ms frame is obviously even tougher, but it's far from impossible. Even at its slow clock speed, the PS4 packs quite a lot of CPU power into a small package: nothing in comparison to an eight-core Intel at 4.4GHz or even a Bulldozer CPU, but to imply that 120Hz is impossible on a PS4 would imply that 120Hz was impossible on hardware slower than it, even on the PC front.
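The 16ms and 8ms frame budgets mentioned here come straight from the target refresh rates:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render a single frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 90, 120):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
```

Hitting 120Hz means every frame, including the worst-case ones, has to finish inside roughly 8.33ms, which is why minimum frame times matter more than averages for VR.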

 

To that end, do you think people weren't playing titles like CS or CS:S at 120Hz on high-end hardware circa 2004? I can tell you that no hardware from 2004 can hold a candle to a PS4 today.

The Kinect on the Original Xbox 360 has games as well and they were almost all universally terrible not because developers wanted to commit business suicide or they hated the platform, but because the hardware couldn't take that level of computation (motion tracking is VERY CPU intensive) along with handling the main game.

 

We don't really know much about the original Kinect's implementation on the X360 (neither of us is a software developer; I'm not), but from basic knowledge it seems like the 360 just didn't have access to the full precision the Kinect sensor was capable of. This is based on the fact that Kinect was very powerful as a non-gaming tool on the PC side. You are likely correct that the console's CPU power could play a part in this, but it just seemed like no game could use it very precisely, especially fine movement around the body. Dancing games played okay, though.

Or let's say 'Move' on PS3, I love that console but I knew, and so did everyone else, that if the PS3 couldn't push 1080p at playable frame rates WITHOUT the 'Move' that it wasn't going to be able to push 1080p with the move.

 

Very little of Move was done console-side, compared to the rather powerful Kinect. Move was akin to the Wii's attempt at it (better, I might add), but it was ineffective in the market. Games with Move support didn't take performance drops when using those features, as Move used the Sixaxis motion sensors already native to PS3 controllers plus a simple webcam and a program tracking where a light ball lands on the sensor: nothing massively performance-killing. The harder part is getting inputs to track into the game correctly and intuitively; there were very good uses in titles like Heavy Rain, versus less effective ones like Resistance 3 or Killzone 3. I don't think your point about 1080p in this paragraph is very relevant: it's a PS3, 1080p was basically out of the question by Move's release, and 720p was the standard, which I don't think is complaint-worthy in this regard.

 

I also wanna slide in this shill:

The PS3 actually did have several 1080p60 games, one of them Riiiiidge Racer 7.

For this it is still recommending a GTX 970 or higher along with a top tier i5 Quad Core CPU clocked a lot higher than 1.6Ghz or 1.75Ghz, both of those systems will be far more powerful than any PS4 or Xbox One, WITH OR WITHOUT hardware optimization on the part of the developers for the specific consoles.

 

There is no arguing this point: a high-end PC is much more powerful than a PS4; in fact, those two parts alone will ring you up for more than the price of the consoles themselves. Neither of us would say they are comparable, and that's why it's important for PM to have games designed around its particular standards. The Order: 1886, Ryse, Infamous Second Son: games like these will never be on PM at 1080p @ 120Hz, but the odd dozen 60FPS PS4 games show that it's a reachable target. From that perspective it's simply a job of cutting down CPU and GPU bottlenecks until your minimum frame times are 8.33ms. For this particular example I'll consider everything but native resolution a possible bottleneck, so it may take a drop in LOD, shadow res, poly levels,

 

My end of the story is: if you believe 1080p120 is possible on a 7850, then it's possible on a PS4 given correct CPU implementation.

 

I'll end this with a random bench I googled after writing this whole spiel:

[image: tabelle2_05.jpg] For the sake of relevance, the 7970M is the closest product AMD releases to the public to the PS4's GPU, paired there with the 3610QM, a 45W CPU, whereas the PS4's CPU is likely a high-20s to low-30s watt TDP. And I wouldn't call CS:GO a poor-looking game, to be completely honest.


This comment would really be a waste if you don't reply, so pick a couple of things if you don't want to reply with a whole essay.


 


Completely correct, it was just an asspull, i couldn't remember the real resolution off hand, i just knew it was one i considered odd at the time, i thought it was higher than it actually is, but my point there was its signifigantly higher than the PMs 1080p, less so than i thought though but its still 25% higher resolution.

Actually Carmack at Rift said 90hz minimum, though i think he said 120 would be reccomended, and sony says 120 to the user, where PM games will be allowed to developed at 60 or 120hz and 60 and 120hz titles will be motion doubled to 120 in the case they aren't hitting 120. This is an inideal situation, but its preferable to direct screen refreshes if they are 60hz or fall below the 120hz target

I was just using Witcher 3 as an example, where OR will have people modding PC games to fit the rift, as well as new games that will target it specifically, PM will only have titles specific to it released on its platform, the destinction is that it won't be trying to fit current games to a higher standard, instead building new titles around a 1080p 60 or 120hz frame.

High levels of textures, while being a relative term, would probably be out the window in most people here's opinion even in 30 FPS on console hardware, i don't see the reason to make the distinction. I disagree with online multiplayer being impossible, sure its tougher, but for example, Call of Duty AW runs 60 FPS multiplayer without any out of the ordinary hitches, 

This is exactly where consoles benefit from their single spec platform, developers can load in the statistics tools and see the problem areas and where certain threads are going overbudget and those optimizations will be available and directly apply to every single console running a game with PM. a 16ms frame is tough on that hardware and an 8ms frame is obviously even tougher, but its far from impossible, even at their slow clockspeed the PS4 packs quite a lot of CPU power into a small package, nothing in comparison to an 8 core intel at 4.4 or even a bulldozer CPU, but to imply that its impossible to hit 120hz on a PS4 would imply that 120hz was impossible on hardware slower than it even on the PC front.

 

To that end do you think people weren't playing titles like CS or CS:S at 120hz on high end hardware circa 2004? I can tell you that no hardware from 2004 can hold a torch to a PS4 today

We don't really know much about the original kinect's implementation on X360 unless one of us is a software developer, i am not, but from basic knowledge it seems like the 360 just didn't have access to the full precision the Kinect Sensor was capable of, This based on basic knowledge that kinect was very powerful as a non gaming tool on the PC side, you are likely correct that the Console's CPU power could have a place in this issue, but i just seemed like no game could use it very precisely, especially with fine movement around the body, Dancing games played okay though

Very little of Move was done console-side, compared to the rather demanding Kinect. Move was akin to the Wii's attempt at motion control (better, I might add), but it was ineffective in the market. Games with Move support didn't take performance drops when using those features, as Move used the Sixaxis motion sensors already native to PS3 controllers plus a simple webcam and a program to track where the light ball lands on the sensor, which is nothing massively performance-killing. The harder part is getting inputs to track into the game correctly and intuitively; there were very good uses in titles like Heavy Rain versus less effective ones like Resistance 3 or Killzone 3. I don't think your point about 1080p in this paragraph is very relevant: it's a PS3, 1080p was basically out of the question by Move's release, and 720p was the standard, which I don't think is complaint-worthy in this regard.
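For what it's worth, the camera-side half of that tracking is conceptually cheap. This toy sketch (my own illustration, not Sony's actual algorithm) finds the centroid of the glowing ball in a grayscale frame by averaging the coordinates of the brightest pixels:

```python
def track_light_ball(frame, threshold=200):
    """Return the (x, y) centroid of pixels at or above `threshold`,
    or None if no bright region is found. `frame` is a list of rows
    of 0-255 grayscale values."""
    pts = [(x, y)
           for y, row in enumerate(frame)
           for x, v in enumerate(row)
           if v >= threshold]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# Fake 8x8 frame with a bright 2x2 "ball" at columns 5-6, rows 2-3.
frame = [[0] * 8 for _ in range(8)]
for y in (2, 3):
    for x in (5, 6):
        frame[y][x] = 255
print(track_light_ball(frame))  # (5.5, 2.5)
```

The real system fuses this 2D position with the Sixaxis inertial data; the point is only that the camera side is a threshold-and-average, not heavy vision processing like Kinect's skeletal tracking.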

 

I also wanna slide in this shill

The PS3 actually did have several 1080 60 games, one of them Riiiiidge Racer 7

There is no arguing this point: a high-end PC is much more powerful than a PS4. In fact, those two parts alone will ring you up for more than the price of the consoles themselves. Neither of us would say they are comparable, and that's why it's important for Project Morpheus to have games designed around its particular standards. Games like The Order: 1886, Ryse, or Infamous: Second Son will never run on Morpheus at 1080p 120Hz, but the odd dozen 60FPS PS4 games show that it's a reachable target. From that perspective it's simply a job of cutting down CPU and GPU bottlenecks until your minimums are 8.33ms. For this particular example I'll consider everything but native resolution a possible bottleneck, so if it takes a drop in LOD, shadow resolution, or poly counts, so be it.

 

My end of the story is: if you believe 1080p 120Hz is possible on a 7850, then it's possible on a PS4 given the correct CPU implementation.

 

I'll end this with a random benchmark I googled after writing this whole spiel:

[image: tabelle2_05.jpg, CS:GO 1080p benchmark] For the sake of relevance, the 7970M is the closest product AMD sells to the public to the PS4's GPU, paired here with the i7-3610QM, a 45W CPU, where the PS4's CPU is likely high-20s to low-30s TDP. And I wouldn't call CS:GO a poor-looking game, to be completely honest.

 

This comment would really be a waste if you don't reply, so pick a couple of things if you don't want to reply with a whole essay.

 

You largely clarified my points one by one rather than rebutting them, ranging from confirming that they recommended 120Hz but said 90Hz would be the bare minimum, to saying that I made a "distinction" with my high-resolution texture comment (of course it's a notable distinction, because it impacts performance, which is exactly why I mentioned it).

 

Regarding your statement construing my claim about the PS4 being incapable of 120FPS output, I think you misunderstood me. I was stating that even with top-end PC hardware we are still largely unable to attain 120FPS on the Oculus Rift at decent graphical settings, which is the point I was making about the PS4. The PS4 has a much less powerful CPU and GPU, and pushing 120FPS in a decent-looking game will be near impossible, especially at the 1080p resolution and 120FPS minimum it must sustain to provide an enjoyable experience. No one is arguing that the PS4 cannot hit such a frame rate in a simple game like Ridge Racer or even Wipeout HD, but the graphical compromises needed to reach those frame rates will be impossible to avoid on a 1.6GHz CPU.

 

Regarding your comment on the Kinect and Move, you are quite self-contradictory. First you say it is impossible to know how the Kinect impacted the Xbox 360's performance; I did and will argue that its games were crappy because they could be nothing else, given the hardware limitations of a console that was five years old at the time. Then you go on to say, as if you were a developer, that much of the Move's processing wasn't done console-side, which is false. The Move required the PlayStation Eye camera to track the user's motion alongside the motion sensors, meaning the console did have to do processing for the motion controller.

 

Which brings me to my last point, about your graph, which is an unfair and false comparison. If you were to take an FX-8320, clock it down to 1.6GHz, cut off one of the cores for the OS (which the PS4's system does, as detailed by Sony themselves), and combine it with a 7970M cut down to the PS4's ROP count and clocked down to the roughly 800MHz the PS4's GPU runs at, you wouldn't get anywhere near the numbers in those benchmarks. Everyone knows the FX 8-core series didn't compare well to its Intel counterparts at the higher resolutions now required for VR gaming, and it certainly won't on console with the much lower clock speeds. Specifically, a CPU clocked down to 1.6GHz doesn't have anywhere near the power of an i7-3610QM, which boosts up to 3.3GHz and has 4 cores and 8 threads (and Intel's cores are much faster than AMD's, especially at the higher resolutions needed for VR gaming).

 

To close: yes, I believe it is "possible" to reach 120FPS with a game that looks subpar, like Counter-Strike: Global Offensive. But if a game from 2004 that didn't even look great back then, running at 153FPS (with no mention of minimums, I might add, which are far more important) on hardware that isn't even comparable, is the best you can do, then that's the best you can do. However, this whole argument is predicated on the PS4 and Xbox One being powerful enough for 1080p 60FPS, which both obviously aren't in first-person shooters with the visual fidelity required for an enjoyable or even worthwhile experience (Battlefield 4 runs at 900p at 60FPS on PS4). The Xbox One with its 768 stream processors won't do it, and neither will the PS4's 1152. It's ridiculous to argue otherwise, because with clock speeds like 1.75 or 1.6GHz you're simply not going to get the performance you need in a modern game at 120FPS, especially with online gameplay, regardless of what Call of Duty does. To not recognize that it's not going to happen this console generation is to err on the side of the nonsensical.
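The stream-processor counts being traded back and forth reduce to peak throughput with simple arithmetic (shaders x clock x 2 ops per FMA; the 800MHz PS4 and 853MHz Xbox One GPU clocks are the published figures):

```python
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS: shaders x clock x 2 (one FMA = 2 ops)."""
    return shaders * clock_ghz * 2.0

ps4 = peak_gflops(1152, 0.800)  # ~1843 GFLOPS, the widely quoted 1.84 TFLOPS
xb1 = peak_gflops(768, 0.853)   # ~1310 GFLOPS, i.e. 1.31 TFLOPS
print(f"PS4: {ps4:.0f} GFLOPS, Xbox One: {xb1:.0f} GFLOPS ({ps4 / xb1:.2f}x)")
```

Note that peak FLOPS says nothing about the CPU-side 1.6GHz argument above; it only quantifies the GPU gap between the two boxes.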

Link to comment
Share on other sites

Link to post
Share on other sites

Okay, you do know there are real-time 3D games that run VR on Samsung phones today, right? Like, today? I mean, we don't expect perfection or anything close, but they are real and out there now. If you accept that that exists, it should be no surprise that it's very possible for a PS4 to create a VR experience.

You largely clarified my points one by one rather than rebutting them, ranging from confirming that they recommended 120Hz but said 90Hz would be the bare minimum, to saying that I made a "distinction" with my high-resolution texture comment (of course it's a notable distinction, because it impacts performance, which is exactly why I mentioned it).

 

If high-res textures are already not being put into console games, then why expect, ask for, or want them in a Morpheus game at 60 or 120FPS? I think you're wrong in implying that ultra graphics are needed for a VR experience; even a "simply better than PS3" visual experience at 1080p 120Hz is still enough to blow people out of their shoes.

Regarding your statement construing my claim about the PS4 being incapable of 120FPS output, I think you misunderstood me. I was stating that even with top-end PC hardware we are still largely unable to attain 120FPS on the Oculus Rift at decent graphical settings, which is the point I was making about the PS4. The PS4 has a much less powerful CPU and GPU, and pushing 120FPS in a decent-looking game will be near impossible, especially at the 1080p resolution and 120FPS minimum,

 

"Decent graphical settings" is a very relative term here. What you see as the minimum graphics settings for VR are probably blown far out of scale compared to what they actually are, because Morpheus isn't about "amazing looking" games; the medium is the amazing part. PS VR isn't looking to be comparable to PC VR, because that's simply not possible, but a 120Hz 1080p VR platform on PS4 is merely difficult, far from impossible.

Regarding your comment on the Kinect and Move, you are quite self-contradictory. First you say it is impossible to know how the Kinect impacted the Xbox 360's performance; I did and will argue that its games were crappy because they could be nothing else, given the hardware limitations of a console that was five years old at the time. Then you go on to say, as if you were a developer, that much of the Move's processing wasn't done console-side, which is false. The Move required the PlayStation Eye camera to track the user's motion alongside the motion sensors, meaning the console did have to do processing for the motion controller.

 

My mistake in talking about Move is that I own one, so I thought I'd talk about it. My main point was that the Sixaxis sensor provides the values of the controller's acceleration and position. This is why it was better than Kinect: it didn't rely solely on the camera and software to provide input, it had direct input from the controller, whereas Kinect needed software to try to guess what your hands were doing based on the camera sensor. I don't think your point about PS Move being computationally expensive holds any water when it didn't cause a performance deficit in the games that used it. Heavy Rain is an un-vsynced game that falls below its target 30FPS, and it doesn't fall harder when you are using the Move controller versus the standard one. The camera only measured where a light source was: a simple tracking program to aid the sensors on board the controller.

Which brings me to my last point, about your graph, which is an unfair and false comparison. If you were to take an FX-8320, clock it down to 1.6GHz, cut off one of the cores for the OS (which the PS4's system does, as detailed by Sony themselves), and combine it with a 7970M cut down to the PS4's ROP count and clocked down to the roughly 800MHz the PS4's GPU runs at, you wouldn't get anywhere near the numbers in those benchmarks. Everyone knows the FX 8-core series didn't compare well to its Intel counterparts at the higher resolutions now required for VR gaming, and it certainly won't on console with the much lower clock speeds. Specifically, a CPU clocked down to 1.6GHz doesn't have anywhere near the power of an i7-3610QM, which boosts up to 3.3GHz and has 4 cores and 8 threads (and Intel's cores are much faster than AMD's, especially at the higher resolutions needed for VR gaming).

 

This is a GPU-side comparison. The point is that even at the PC's maximum settings, a 1280-core, 800MHz GPU can get 150 FPS in a benchmark, which shows the basic experience is possible. I think everything you say about CPUs in this paragraph is nonsense and doesn't belong at all; an 8320 at 1.6GHz has simply nothing to do with the PS4's CPU and is no closer to it than a 3610QM, as they are completely different chips. Saying this benchmark should be run on an 8320 at 1.6GHz goes completely against the point of the argument, since CS:GO isn't made to run on that hardware and would obviously run poorly. But if it were developed to run at 120 on a PS4, its CPU threading would be different.

 

Why don't we take a look at some PC vs PS4 performance for a second to make this point clearer.

 

[image: CPU_03.png, Metro Redux CPU benchmark]

This game runs at 1080p 60 without any drops on PS4, yet if we go by this benchmark it should average closer to the low 30s. But it doesn't. Why do you think that is?

 

[image: CPU_03.png, Project Cars CPU benchmark]

A tougher call, but I'd also say Project Cars performs above what CPU benchmarks on PC suggest the 1.6GHz 8-core should deliver. This was benched in the daytime, so videos of the game running in the low 30s in the rain are not relevant to the above numbers.

To close: yes, I believe it is "possible" to reach 120FPS with a game that looks subpar, like Counter-Strike: Global Offensive. But if a game from 2004 that didn't even look great back then, running at 153FPS (with no mention of minimums, I might add, which are far more important) on hardware that isn't even comparable, is the best you can do, then that's the best you can do.

 

Okay, so try to stay on track: CS:GO was released in 2012 and uses the latest version of Source. It may look subpar from your perspective, but it's a very popular game, and even at that graphics level, if Project Morpheus can deliver it, it will be a great experience. I agree it's unfortunate they didn't include minimums, but I was looking for a 7870 benchmark, and a 7970M benchmark is better because it's closer to the PS4 chip. Another thing you miss is that that benchmark was conducted with 4x MSAA, with plenty of performance headroom; I don't think MSAA is relevant at that pixel density, but maybe you disagree.

 

However, this whole argument is predicated on the PS4 and Xbox One being powerful enough for 1080p 60FPS, which both obviously aren't in first-person shooters with the visual fidelity required for an enjoyable or even worthwhile experience (Battlefield 4 runs at 900p at 60FPS on PS4).

 

This is entirely false and provably so. Of the two games I listed in this post, Metro Redux runs at 1920x1080 at 60 FPS start to finish with not a single dropped frame in Digital Foundry's testing. Project Cars, on the other hand, drops quite significantly in rain, but it still targets 1080p 60 and hits it during the day. Going by one game shows how misinformed you are about the resolution and framerate situation on the PS4. DICE has chosen to use a 900p resolution in its shooters on PS4; this didn't change with Hardline, so it seems it's here to stay, or at least we'll have to see with Battlefront. But there are plenty of 1080p 60Hz games released since launch that you seem blissfully unaware of in your argument.

 

The Xbox One with its 768 stream processors won't do it, and neither will the PS4's 1152. It's ridiculous to argue otherwise, because with clock speeds like 1.75 or 1.6GHz you're simply not going to get the performance you need in a modern game at 120FPS, especially with online gameplay, regardless of what Call of Duty does. To not recognize that it's not going to happen this console generation is to err on the side of the nonsensical.

This is entirely a software issue, and your failure to see how an 1152-shader, or even 768-shader, GPU can deliver games at 1080p 120, or even 60, which you mistakenly believe they cannot do, just drives home the point that you have no grounds to start your arguments when you don't know the pieces of hardware you are talking about.

 

No notable VR-exclusive games have been released to date for Morpheus, the Rift, or any other major VR platform, and right now PC games are still built around a 30FPS frame. The Witcher 3, for example, like countless other games, is developed for 30 on console even if unlocked on PC, and that's the design environment. With VR, in the future we may see a major swing toward games being built in a 120Hz dev environment. No one is saying they will look as good or be as complex, but they don't have to: VR is an incredible experience in itself, so if even subpar graphics are delivered consistently and effectively, it will be a great thing to see.

 

I'll take the liberty of linking the currently available real-time VR demos for Morpheus, since you probably ignored my comment saying you should check them out. It's pretty cool.

https://www.youtube.com/watch?v=_WzpLtCw9r0

https://www.youtube.com/watch?v=TGLs9hqogRE

This one isn't a great video, but it's the best I could find:

https://www.youtube.com/watch?v=wIl2-5f8NTo

This one's some weebshit from the guys who make the excellent Tekken games. The last one is the most impressive, IMO:

https://www.youtube.com/watch?v=tMI2Swxc1EM

 

 


Intel i5-3570K/ Gigabyte GTX 1080/ Asus PA248Q/ Sony MDR-7506/MSI Z77A-G45/ NHD-14/Samsung 840 EVO 256GB+ Seagate Barracuda 3TB/ 16GB HyperX Blue 1600MHZ/  750w PSU/ Corsiar Carbide 500R

 


[removed]

I think we're going to have to agree to disagree on this one: for one, I feel this has become personal; two, it wasn't an argument to begin with. If you cannot see that real-world VR games aren't going to be able to run on consoles such as the PS4, then I'm really not sure what to say to make the conversation better. And third, I really don't care about this argument enough to rebut what you have written. Well written, however; I think the future will tell who is right. It was nice debating with you.

Edited by Godlygamer23
Removed quote. Original post still intact.

Lol.

Both consoles use the same shitty Kabini CPU, and it bottlenecks the hell out of games. If the game doesn't use too much CPU power, then the PS4 pulls ahead with its superior GPU.

[ Cruel Angel ]:     Exterior  -   BENQ XL2420T   |   SteelSeries MLG Sensei   |   Corsair K70 RED   |   Corsair 900D  |                                                                                                    CPU:    -   4.7Ghz @ 1.425v             |

                             Interior    -   i7 4770k   |    Maximus VI Formula    |   Corsair Vengeance Pro 16GB    |   ASUS GTX 980 Strix SLIx2  |  840 Pro 512Gb    |    WD Black 2TB  |           RAM:   -   2400Mhz OC @ 1.650v    |

                             Cooling   -   XSPC 120mm x7 Total Radiator Space   |   XSPC RayStorm    |    PrimoChill Tubing/Res  |                                                                                             GPU:   -   1000Mhz @ 1.158            |


  • 5 months later...


Once again I was fucking right: https://www.gtplanet.net/playstation-4-needs-external-processing-unit-for-playstation-vr/


This was announced quite some time ago: there would be an external support box. It's not another GPU or an additional CPU, but you were right insofar as the PS4 won't do it all on its own. It physically can't, since it only has one HDMI port and needs to send the image to two screens. From what we know now, all the box does is the timewarp processing and splitting the HDMI signal. It won't be doing parallel processing over USB 3.0, as that just wouldn't work.

Intel i5-3570K/ Gigabyte GTX 1080/ Asus PA248Q/ Sony MDR-7506/MSI Z77A-G45/ NHD-14/Samsung 840 EVO 256GB+ Seagate Barracuda 3TB/ 16GB HyperX Blue 1600MHZ/  750w PSU/ Corsiar Carbide 500R

 

