
Xbox One vs. PS4 Resolution: Microsoft Challenges You To Spot The Differences on TV Smaller Than 60"

ACatWithThumbs

Well, it's not entirely wrong. At a certain distance you can't really tell 720p and 1080p apart (quality wise).

Yes, but you are talking about a distance of several feet. For a modest 32" 1080p LCD TV, you are supposed to sit about 4 to 6.7 feet from the screen. For 720p, the minimum distance at which you can no longer tell the difference from 1080p is over 10 feet. Don't be confused by the line that says "The full benefits of 720p become available"; that is strictly the distance for the full benefit of 720p. If you look at the graph that gives the distance at which there is no visible difference, you will see that at 32" that distance is greater than 10 feet.
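For anyone who wants to sanity-check that kind of chart, here is a rough back-of-the-envelope sketch (mine, not from the chart) that only assumes the usual one-arc-minute figure for 20/20 vision. The distances come out shorter than the numbers above because published charts use their own thresholds, but the trend is the same: the smaller the screen, the closer you have to sit before 1080p buys you anything over 720p.

```python
import math

# Assumption: the eye resolves detail down to ~1 arc minute (the common 20/20 figure).
ARC_MINUTE_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Distance beyond which individual pixels of this resolution can no longer be resolved."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)     # screen height from the diagonal
    pixel_in = height_in / vertical_pixels             # pixel pitch
    return (pixel_in / math.tan(ARC_MINUTE_RAD)) / 12  # inches -> feet

for size in (32, 50, 60):
    print(f'{size}": 1080p matters within ~{max_useful_distance_ft(size, 1080):.1f} ft, '
          f'720p within ~{max_useful_distance_ft(size, 720):.1f} ft')
```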

But even then, those are just averages. Some people can spot the difference more easily and will continue to spot it from a longer distance. Also, the basic quality of some TVs is crap, so even if the source is 1080p, it does not mean the TV does a good job of displaying it. Then add in the fact that upscaled content looks like crap all on its own. The newer Harry Potter movies are a good example of that; I don't have a link, but there is a website that has this sort of information for DVDs and Blu-rays, and the Harry Potter movies always get a horrible rating. The source on the disc is 1080p, but it's an upscaled 1080p. Again, it's already upscaled, so your Blu-ray player is not doing the work, but it is extremely noticeable, even from a long distance.

The same applies to upscaled games. Upscaling already fucks up the image, and it will never look as good as a native 1080p source.

i7 2600K @ 4.7GHz/ASUS P8Z68-V Pro/Corsair Vengeance LP 2x4GB @ 1600MHz/EVGA GTX 670 FTW SIG 2/Cooler Master HAF-X

 

http://www.speedtest.net/my-result/3591491194



 

 

720p vs 1080p on my Pioneer Kuro is very noticeable in games, less noticeable on Blu-rays. The TV is some 15 feet away due to the nature of the room. So most of the time it all looks the same; but that is source-dependent. Good sources can look indistinguishable, crappy sources are blatantly obvious.

 

720p vs 1080p on our projector? Hello big differences. That shit you can spot so easily. But I'd wager not many people have a 130" screen to test it on. 

 

 

So is MS actually wrong? Not entirely. It's half-truths. But they aren't talking out of their ass on this one.


I'm not bothered by the challenge itself; it's the premise of the challenge that's really odd. Even if the user can't see the difference between the screenshots, it still implies that the Xbox One is less powerful than the PS4.

 

That said, even though the Xbox is less powerful, I think this has gone to Sony fanboys' heads a bit too much.

Tea, Metal, and poorly written code.


I've heard some utter rubbish before, but this takes the cake. Does the guy realise just how severe his example is? He says "Personally I struggle to see the difference". If, just playing in a normal-sized living room, he seriously isn't able to distinguish between 720p/1080p and 30/60 FPS on a 60-inch TV, then his eyesight is probably bad enough to warrant a f*cking guide dog.

 

What an idiot.


Uh, yeah. We lied about DirectX 12 increasing resolution through third-party, paid-off clowns who have since backtracked, and that 10 percent more on the GPU still isn't going to get us to 1080p unless we turn off AA and drop FPS to 30 on a "parity clause" game. The cloud might help the CPU one day outside a closed network, so no help coming there either. So now we're just going back to "resolution AND FPS don't matter", since we sacrificed FPS on Forza to hit 1080p. So I guess FPS matters less than resolution? I mean, they had to make a choice. So both don't matter, but one matters less. Seems legit.

 

Hey MS, I have the perfect guy for the "challenge". Use this "PC Gamer". He is the one who claimed to hate Steam and asked Phil Spencer not to cancel Games for Windows Live. Slick thought this might be a plant on the WAN Show. Why? Cus Slick isn't a moron.

 

 


 


 

 

MS. BS at its finest.

 

http://www.gamespot.com/articles/microsoft-we-need-to-do-more-for-pc-gaming/1100-6420024/


CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


I am always in favor of blind testing: do it in a controlled environment with unbiased scientists overseeing the test and I'll accept the results regardless. Maybe MS, Sony, Ubisoft, etc. should get in contact with a third-party tester like the Royal Society, CSIRO, Stanford, etc.

 

The problem I have with a lot of these challenges and rebuttals is that the test conditions are nearly always compromised due to failure to remove influencing factors.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  



 

Blind testing? You mean like the "PC Gamer" I listed, who talked with Phil Spencer on Twitter and was just an MS advertiser? We don't need a blind test for this, and the results would be laughable.

 

We have eyes. We know this article/statement by MS is BS. 

 

This is like asking what looks clearer: Blu-ray or upscaled DVD. What looks better: LaserDisc or VHS. There is no rebuttal except the claim that if you sit farther away than anyone actually sits from their TV, you won't notice it as much. And that is just resolution; it has nothing to do with FPS.

 

This isn't like a Pepsi vs. Coke test. There should be no test. There should be no discussion. One is simply better at the distances most people actually sit from their TV/monitor. This would be like me saying a PC with a 7790/7850 in it is the same as a PC with a 7770 in it. It isn't. GPU matters. A PC exactly the same as mine but with a GTX 780 in it would be better than mine. It would give better visuals/FPS in games. This is not debatable.

 

MS is stating that opinion can determine that black is white. 

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


Blind testing? You mean like the "PC Gamer" I listed, who talked with Phil Spencer on Twitter and was just an MS advertiser? We don't need a blind test for this, and the results would be laughable.

 

snip

check the bit in bold:

 

I am always in favor of blind testing: do it in a controlled environment with unbiased scientists overseeing the test and I'll accept the results regardless. Maybe MS, Sony, Ubisoft, etc. should get in contact with a third-party tester like the Royal Society, CSIRO, Stanford, etc.

 

The problem I have with a lot of these challenges and rebuttals is that the test conditions are nearly always compromised due to failure to remove influencing factors.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


But Americans love big TVs to watch their games, and Microsoft is an icon of American culture, so you lose here, Microsoft.

 

I have a 50" TV and I've played my PC in my living room; I can definitely tell a difference between 720p and 1080p. 1080p is so much sharper and clearer and has less aliasing. 720p looks blurry and the colors just don't pop as much.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


I can tell on my 24" monitor from the farthest point in my room. 

Good game MS.

It's not that I care all that much that the PS4 seems to be more powerful, but it's 2014. If you can't make a console that runs all of its games in at least 1080p, then you have failed. Even the Wii U is doing better than you.

I know it's partly the devs' fault, but come on.

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer preditor XB241H (1080p, 144Hz Gsync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear defiant (solderless swappable switches), g600, moutned mic and other stuff. 

Laptop docking area- 2 1440p korean monitors mounted, one AHVA matte, one samsung PLS gloss (very annoying, yes). Trashy Razer blackwidow chroma...I mean like the J key doesn't click anymore. I got a model M i use on it to, but its time for a new keyboard. Some edgy Utechsmart mouse similar to g600. Hooked to laptop dock for both of my dell precision laptops. (not only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch


I can tell on my 24" monitor from the farthest point in my room. 

Good game MS.

It's not that I care way too much that the ps4 is more powerful it seems, but it's 2014. If you can't make console that runs all of its games in at least  1080p tan you have failed. Even the wii U is doing better than you. 

I know it's partly the devs fault, but come on. 

 

 

I think people tend to forget that consoles aren't PCs: they are less than half the size of the average enthusiast's PC and probably perform significantly better than similarly priced prebuilt PCs. They will be limited, but they serve a function, and to claim they suck at that function is unfair. Yes, they pale in comparison to a PC, but no one really buys a console hoping to get PC performance. 60 FPS and high resolution are just not possible most of the time, and if they were to shovel in hardware that could do it, they would push the cost beyond what the market is willing to pay and/or cause heat/stability issues (e.g. the Xbox 360 red ring).

 

Marketers will say what marketers will say, and devs will shit on their PC ports in order to appease the almighty revenue stream. That's a fact. But if we want devs/manufacturers to take us seriously and listen, then we need to stop expecting the impossible from consoles.

 

EDIT: this is not a go at you, although it might read like that. I'm just having a general rant because I think some people have lost sight of what a console's actual restrictions are, both physically and from the market's position.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


check the bit in bold:

 

And I am saying there is no reason for a test. Like I said, we have eyes. 1080p > 720p. 60 FPS is stupidly better gameplay than 30 FPS.

 

MS "If that leads to the perception that one machine is more powerful than another". One machine IS more powerful for games than the other. Again this is not debatable. MS wants to debate something that can't be debated. Find someone with the exact same 4770k build on this forum as me at the same clock that has a better GPU and guess what? That machine is more powerful. I can't debate that. MS wants to. 

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.



 

I don't believe a bunch of forum users who believe everything YouTube tells them any more than I believe a marketer from MS. Proper peer-reviewed science is the shit.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  



 

Thank you. 

 

People, and you can guess who, act like consoles are the devil. Actually, worse than the devil. Consoles have a place. Consoles ALWAYS have a place. They always have, though they might not always. With how PCs are going, how sizes are getting smaller, and how PCs are trying to adopt more console-like interfaces (heh, who'd have thunk it), maybe the entire need for a console will one day go away; but we aren't there yet. We have to live in the reality of today.

 

I game on a 50" TV or a 130" TV. On the 50", here or there. Depends on the content, and even then. I have one of the best TVs ever made, a TV that still spanks around new units, and I couldn't tell you most times if its 720 or 1080. Honestly. 

 

On my projector? Yes, I can tell. The screen is over double the size; the pixels are larger. You can more easily notice what is what. But I don't game on that, I watch movies. So... moot point in my experience.

 

My 27" PC monitor. Now we are in the area where I can tell. I sit 12 freaking inches away. 720p to 1080p is a large difference since I'm so close. 

 

Now, to make it more complicated: if I play Xbox or PS on this 27" monitor, all of a sudden 720p vs 1080p is way more noticeable. Games that are not 720p or 1080p and are being upscaled are noticeable. Games that run natively at either resolution are noticeable. Because I SIT SO FREAKING CLOSE. Any change in quality is more easily perceived.

 

And there it is. It's a perception issue. At 10-12 feet on a large screen? You tend not to notice the little details as much. You really don't. It's easier for things to blend in, so you are less likely to pick them up. Not saying you can't, just that it's much less likely. Larger screens? Now you have larger pixels, so the fidelity comes into question and you start noticing it. Smaller screens at a close distance? Same thing. You perceive it much better because you can notice it.

 

People need to stop treating it like some mystery or rocket science, or jumping down MS's throat when they say stuff that actually has some validity.


IIRC, you can't tell the difference at 10' on a screen smaller than 40". At least, that used to be the rule when I still worked at Samsung; however, that is circa 2008.

 

Yeah, somehow that made the jump to 60 inches per MS. Might as well inflate the number by 50%.

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.



 

The human eye has a limit beyond which finer detail cannot be perceived; this is measured in arc minutes. http://en.wikipedia.org/wiki/Minute_of_arc

 

I believe it is around 600 PPI at 10-12 inches for 20/20 vision. Thus you just need to scale that ratio to work out the finest detail the eye can resolve at a given distance.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


My head is so dizzy from ALL DAT SPIN!!!

 

Someone please tell these Xbone execs to shut their mouths lol


The human eye has a limit beyond which finer detail cannot be perceived; this is measured in arc minutes. http://en.wikipedia.org/wiki/Minute_of_arc

 

I believe it is around 600 PPI at 10-12 inches for 20/20 vision. Thus you just need to scale that ratio to work out the finest detail the eye can resolve at a given distance.

 

60ppi at 100 inches. 

 

100 inches is 2.54 meters. 2.54m is 8.3 feet. 

 

I also don't know how to math, so I have no idea if that's valid.

 

But there it is. 
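If anyone wants to check that ratio without taking my word for it, here is the same scaling written out, taking the 600 PPI at 10 inches figure from the quote at face value (plenty of people put the 20/20 limit closer to ~300 PPI at that distance, so treat the absolute numbers loosely; the point is just that resolvable PPI falls off linearly with distance):

```python
# Assumption from the quoted post: ~600 PPI resolvable at a 10 inch viewing distance.
REFERENCE_PPI, REFERENCE_DISTANCE_IN = 600, 10

def resolvable_ppi(distance_in):
    # Resolvable pixel density scales inversely with viewing distance.
    return REFERENCE_PPI * REFERENCE_DISTANCE_IN / distance_in

print(resolvable_ppi(100))   # -> 60.0 PPI at 100 inches
print(100 * 2.54 / 100)      # 100 inches = 2.54 metres
print(2.54 / 0.3048)         # ... which is about 8.3 feet
```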

 

 

 

 

The original source is also a marketer. You guys are getting worked up over what a marketer is saying. Since when do they EVER know what's going on? They market. They spew crap. And guess what? You're giving it all the free attention it doesn't deserve.



Both the PS4 and Xbone are perfectly capable of 1080p 60 FPS gaming, but game devs choose to focus first on small effects with a large performance impact, instead of making the game perform right and then seeing what effects they can add. The PS3 and 360 were actually both somewhat capable of it. I have experience developing games for both PC and the last-gen consoles; I know what I'm talking about. Small form factor has nothing to do with it, by the way, nothing at all.

It's all in the marketing. They want the previews and the "graphics hype" to be the selling point, so they push for that instead of making it work right. Nintendo seems to be making the opposite push recently. I get it, Mario is not exactly a demanding game, but with the PS4 and Xbone you have more than enough power to gradually make games look better than last gen while maintaining a good performance standard.

I know that game devs will get better at squeezing every bit of power from these systems in the next few years, but instead of using that to make a game run at 1080p/60 FPS, I'm afraid they will continue to stay at 720p and 30 FPS just like last gen.

It's lazy game dev. In fact, last gen I think the only game I knew of that had a dynamic frame buffer was WipEout HD. -.- So it's not like the laziness started this gen. Tricks like that help you squeeze everything you can out of the hardware you have to work with, but instead they are trying to develop for consoles like they are PCs, simply capping the frame rate and resolution instead of designing the game from the ground up to run right.
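For anyone who hasn't run into the term: a dynamic frame buffer just means the engine drops its internal render resolution when the GPU blows its frame-time budget and raises it again when there's headroom, then upscales to the output. A minimal sketch of the idea (the numbers, thresholds, and function names here are mine for illustration, not from any actual engine):

```python
# Toy dynamic resolution controller: hold 60 FPS, trade resolution only when needed.
TARGET_FRAME_MS = 1000.0 / 60.0    # frame-time budget for 60 FPS
MIN_SCALE, MAX_SCALE = 0.6, 1.0    # roughly 720p-ish up to native 1080p

def update_render_scale(scale, gpu_frame_ms):
    """Nudge the resolution scale based on the last frame's GPU time."""
    if gpu_frame_ms > TARGET_FRAME_MS * 0.95:    # over budget: drop resolution
        scale -= 0.05
    elif gpu_frame_ms < TARGET_FRAME_MS * 0.80:  # comfortable headroom: raise it
        scale += 0.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

def resolution_for(scale, native_w=1920, native_h=1080):
    # Keep pixel counts even so the upscaler has an easy time.
    return (int(native_w * scale) // 2 * 2, int(native_h * scale) // 2 * 2)

scale = 1.0
for gpu_ms in (14.0, 18.5, 19.0, 16.0, 13.0, 12.5):   # a GPU spike and recovery
    scale = update_render_scale(scale, gpu_ms)
    print(gpu_ms, resolution_for(scale))
```

The point is that the game holds its frame rate and only gives up resolution in the frames where it actually has to, instead of being capped at 720p/30 across the board.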

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer preditor XB241H (1080p, 144Hz Gsync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear defiant (solderless swappable switches), g600, moutned mic and other stuff. 

Laptop docking area- 2 1440p korean monitors mounted, one AHVA matte, one samsung PLS gloss (very annoying, yes). Trashy Razer blackwidow chroma...I mean like the J key doesn't click anymore. I got a model M i use on it to, but its time for a new keyboard. Some edgy Utechsmart mouse similar to g600. Hooked to laptop dock for both of my dell precision laptops. (not only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch



 

Interesting, because I have a slightly better GPU (1.4 TFLOPS) and an i5-3550, and on many occasions I drop under 60 FPS when playing at 1080p. In fact, much of my gaming happens at 40-50 FPS depending on how far I push things like AA. So I can't help but wonder how they propose to make the console stable at 60 with similar graphics settings.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Xbox One vs. PS4 Resolution: Microsoft Challenges You To Spot The Differences on TV Smaller Than 60"

 

 

Please, I can spot the difference between crisp and blurry on a 24" monitor running a resolution of 1920x1200, let alone on a massive display. The size of the display doesn't even matter as much as the resolution used...

 

Is Microsoft trying to troll, or is someone over at HQ really this out of touch?

 

 

