Xbox Series X

The1Dickens
4 minutes ago, dalekphalm said:

Textures are not render resolution. Those are different things. Hell, you can use 4K textures on a game rendered at 1080p if you wanted.

 

Nobody is saying the graphics on a game running on a high end PC are the same as the graphics running on a Pro Console - but those are not resolutions.

 

4K is 4K is 4K. Either the game renders output at that resolution or it doesn't. What textures are used are irrelevant to the discussion, because textures are not render resolution. Textures are one of many graphical fidelity settings.

 

Aside from that, many PC games of old ran perfectly well while being rendered at native 4K output but did not have "4K textures" - those games were still "4K".

Let me repeat what I have said more than enough times now. I have never said they do not run at 4k but rather that running a game at 4k on a console is not the same as running a game at 4k on pc. They are simply not the same and to say that it doesn't matter is a joke. 


6 minutes ago, dalekphalm said:

Textures are not render resolution. Those are different things. Hell, you can use 4K textures on a game rendered at 1080p if you wanted.

^^^ Exactly. AFAIK the "Very High" textures in Rise of The Tomb Raider are 4K textures. I've played that game at 1080p with Very High textures; it'll take 8.5GB VRAM (on a 16GB VRAM card, only 7.5GB on an 8GB VRAM card), and at 4K it's only 1GB more, for 9.5GB. So there's little difference in textures between the resolutions (pretty sure if I ran it at 4K on a card with 8GB VRAM it'd keep the usage at or below that, same as with 1080p). Star Citizen is also (AFAIK) one of the earlier games to start using full 4K textures (they may be larger tho), and I've run that with max quality textures at 1080p as well. 

Hell, even this rando on Steam in 2016 knew that:

[attached screenshot: a Steam forum post from 2016 making the same point]

 

So basically this:

6 minutes ago, dalekphalm said:

4K is 4K is 4K. Either the game renders output at that resolution or it doesn't.

There are some console titles that render at 1440p-1800p then upscale; others run at straight, full 4K resolution. Any time you play a game rendering at 4K (3840x2160 most of the time, though I believe 4096x2160 counts as 4K too), that's 4K gaming. Doesn't matter if it's on a console or a PC, or you run an emulator on a Raspberry Pi and somehow get it to run the OG Zelda games at 3840x2160 - if the game renders at that resolution, and you play it, you're 4K gaming baby.
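To put some rough numbers behind the VRAM figures above: the extra cost of rendering at 4K comes from the screen-sized render targets, while the texture pool stays the same size whatever resolution you render at. A minimal back-of-the-envelope sketch - the 48 bytes per pixel is just an assumed stand-in for a typical deferred pipeline, not a figure from the game:

```python
# Rough sketch, not from any specific engine: estimate how much extra VRAM the
# screen-resolution buffers alone cost when going from 1080p to 4K.
BYTES_PER_PIXEL = 48  # assumed total across G-buffer, HDR color, depth, post buffers

def render_target_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Approximate VRAM used by screen-sized buffers, in MB."""
    return width * height * bytes_per_pixel / 1024**2

print(f"1080p buffers: ~{render_target_mb(1920, 1080):.0f} MB")
print(f"4K buffers:    ~{render_target_mb(3840, 2160):.0f} MB")
print(f"extra at 4K:   ~{render_target_mb(3840, 2160) - render_target_mb(1920, 1080):.0f} MB")
```

The exact delta depends on how many full-resolution buffers the engine keeps around (and whether its streaming pool scales with resolution), but the point stands either way: the textures themselves take the same VRAM at 1080p as at 4K.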


On 12/16/2019 at 11:51 AM, JZStudios said:

No, it's literally irrelevant. The console shows true 4k. Just because unsurprisingly the PC might do it slightly better doesn't make it false. Does it also not do 1080 because the PC version can have higher textures and framerate?

That's a stupid argument that has no basis in what actually qualifies for what's being advertised. And if you watched the video I linked, the Xbox 4k is pretty much identical to the PC 4k, minus framerate. The draw distance is the same, the textures are the same, the model quality is the same... PC has a higher framerate, maybe better shadows, and anisotropic filtering which I don't understand why consoles still don't have.

I'd rather have 1440p 60 fps, or heck, give me 1080p 120 fps. For most games I prioritize how playing the game feels over how it looks. If it's a turn-based game, maybe bring out the 4k, but for games like RDR2 I'd rather have a better playing experience.


55 minutes ago, Brooksie359 said:

Let me repeat what I have said more than enough times now. I have never said they do not run at 4k but rather that running a game at 4k on a console is not the same as running a game at 4k on pc. They are simply not the same and to say that it doesn't matter is a joke. 

The original fucking conversation you replied to was about RESOLUTION and nothing more. So saying that 4K is not the same (which was your exact argument) is saying it's not the same. Give up dude. You keep moving the goalposts and trying to change your argument. You've failed. Move on and stop making a fool of yourself.


9 hours ago, leadeater said:

There is still a bit of a mix here: most games have now started to separate display resolution and render resolution and not tie any settings at all to either, but I know there are games that still do, especially older ones. I forget the name, but there was a semi-recent, larger-profile game that did it, which I only know about from reviewers having a bitch about it. I'd consider it the exception to the norm though.

 

I think part of the problem is that for the longest time you couldn't pick render resolutions, outside of editing ini files that happen to have that setting, so there is a strong association between display/window resolution and render/picture quality in the PC gaming community. This is also encountered in other areas like video: bit rate matters more, but people just want to talk resolution and don't understand that 4K YouTube is actually low quality.

 

Games that only have basic graphics settings or lack explicit settings like texture quality do exist and are a bit of a problem; how often people encounter them is probably tied to the types of games they prefer to play. I have a lot of RTS and TBS games that only have very basic graphics settings.

I have very little idea as to what you just said, or what relevance it has.

These things have always had an internal and output resolution, regardless of how it's reported. Playing a 4k video on a 1080 screen doesn't make it a 1080 video. The source is still 4k. Similarly, playing a 1080 video on a 4k screen doesn't make it 4k, it's still 1080.

 

And regardless of display/render resolution you can interchange and swap every other graphical setting and it's still 4k. Assuming of course you're not messing with an internal resolution scaler that's become somewhat common.
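For what it's worth, the resolution-scaler case is easy to reason about too: the internal render resolution is just the output resolution times the scale factor, and neither number has anything to do with texture quality or any other setting. A minimal sketch, with the settings object and field names made up purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class GraphicsSettings:
    # Hypothetical settings object for illustration only - not any real engine's API.
    output_width: int = 3840
    output_height: int = 2160
    render_scale: float = 1.0      # internal resolution scaler
    texture_quality: str = "high"  # completely independent of the numbers above

    def internal_resolution(self):
        """Resolution the game actually renders at before scaling to the output."""
        return (round(self.output_width * self.render_scale),
                round(self.output_height * self.render_scale))

s = GraphicsSettings(render_scale=0.75, texture_quality="low")
print(s.internal_resolution())  # (2880, 1620), upscaled to 3840x2160 for display
# With render_scale=1.0 this is native 4K, regardless of what texture_quality says.
```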


10 hours ago, TechyBen said:

Fear at 4k is fear at 4k. IIRC HL2 can be played at 4k. That's a "4k game" if on PC or on console and running* at 4k.

Snake at 4k is "snake at 4k". No marketing fakery there if running native at 4k with 4k render (either pixel art for that, or vector/3d rendered polys).

Do you understand the concept of context? I'm not the one arguing that playing an old "ugly" game at 4k isn't 4k. The other guy is arguing that console 4k isn't the same because they have, in some instances, lower graphics settings.

Thus, FEAR is an old game, with low quality assets and basic direct lighting. Does playing that game in 4k mean it's not comparable to RDR2 at 4k, despite the fact that 4k is a resolution, not a graphics setting or texture quality?

3 hours ago, Brooksie359 said:

There is such a thing as 4k textures, so you are absolutely wrong. 4k on pc and 4k on console are different and there is nothing you can say that will change that. 

When in the fuck did I say there weren't?

Also, fun fact: most "4k ready" textures are not 4k, they're 2k or lower. Basically, smaller things like water bottles or soda cans have lower res textures because they have significantly less screen space. You also try to prioritize smaller repeating textures without obvious distinguishing marks. That doesn't mean that there aren't textures in 4k; for example, a car might have the entire exterior mapped to a 4k texture. Unless of course you're Hideo Kojima and you need that Monster Energy money, so you slap an 8k texture on a soda can.

 

But anyways, yeah, 4k on console is exactly identical to 4k on PC. 4k and graphics settings are not tied together and this argument is fucking stupid. You can play RDR2 on PC, at 4k, with pure dogshit graphics settings. It's 100% possible, because 4k is not a graphical setting.

This is so fucking stupid. 4k is the amount of pixels being rendered and sent off for display. Fucking nowhere in there does texture resolution or model quality or draw distance make any amount of fucking difference to the total amount of pixels rendered.
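To make the earlier point about sizing textures to screen space a bit more concrete: the usual rule of thumb is roughly one texel per screen pixel at the closest distance an object is normally seen, which is why a soda can never needs a 4K map. A toy sketch, with all the numbers invented for illustration:

```python
import math

def suggested_texture_size(object_screen_px, max_size=4096):
    """Smallest power-of-two texture giving ~1 texel per screen pixel when the
    object covers `object_screen_px` pixels across at its closest normal view."""
    size = 2 ** math.ceil(math.log2(max(object_screen_px, 1)))
    return min(size, max_size)

# A soda can might only ever cover ~150 px even on a 4K screen; a car body can fill it.
print(suggested_texture_size(150))   # 256  - no point shipping a 4K can texture
print(suggested_texture_size(3000))  # 4096 - a hero asset can justify a 4K map
```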

3 hours ago, TechyBen said:

No. "4K textures" are marketing speak, and not real. They are "high res textures", and renamed to sell more games.

Well yes, but the idea is that... I forget the technical term, it's not Voxel, it's like "screen space pixel size." Anyways, the idea is that at 4k you don't see the individual texture pixels like you do in old games (Like FEAR or HL2) with high res output.

Might be texels - texture pixel size in the final rendered product. There's a good possibility that "4k ready" textures are actually too high res to get any visual benefit out of 1080p or possibly even 1440p. And they're massive, like the 50GB texture files for FFXV. I mentioned a little earlier that they're mostly around 2k; a very large portion of non-"4k ready" textures are 1024 or 512.
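One way to sanity-check the "little benefit at 1080p" idea is to compare how many texels an object has against how many screen pixels it actually covers; once that ratio is well above one, the extra texture resolution mostly gets filtered away. A rough sketch, with the screen-coverage numbers invented for illustration:

```python
def texels_per_pixel(texture_size, object_screen_px):
    """How many texels map onto each screen pixel across the object's width."""
    return texture_size / object_screen_px

# Say a character fills ~600 px of a 1080p frame and ~1200 px of a 4K frame.
for res, coverage in [("1080p", 600), ("4K", 1200)]:
    print(res, f"{texels_per_pixel(4096, coverage):.1f} texels/pixel")
# 1080p: ~6.8 texels/pixel - heavily oversampled, a 2k map would look about the same
# 4K:    ~3.4 texels/pixel - still oversampled, but closer to paying off up close
```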

2 hours ago, Brooksie359 said:

Let me repeat what I have said more than enough times now. I have never said they do not run at 4k but rather that running a game at 4k on a console is not the same as running a game at 4k on pc. They are simply not the same and to say that it doesn't matter is a joke. 

Literally identical.

 

2 hours ago, Zando Bob said:

^^^ Exactly. AFAIK the "Very High" textures in Rise of The Tomb Raider are 4K textures. I've played that game at 1080p with Very High textures; it'll take 8.5GB VRAM (on a 16GB VRAM card, only 7.5GB on an 8GB VRAM card), and at 4K it's only 1GB more, for 9.5GB. So there's little difference in textures between the resolutions (pretty sure if I ran it at 4K on a card with 8GB VRAM it'd keep the usage at or below that, same as with 1080p). Star Citizen is also (AFAIK) one of the earlier games to start using full 4K textures (they may be larger tho), and I've run that with max quality textures at 1080p as well. 

Well, yeah, but you probably aren't actually getting much visual difference at 1080 unless you're just really running into objects or staring at walls. Or using binoculars on something 20 feet away. You could probably pretty safely drop the texture res and get an FPS boost with little to no discernible difference. The issue with high res textures is you can't just make everything literally 4k, because that just wastes assets and 4k is a pretty damn massive image. People that don't print images don't really understand how massive these actually are.

For reference, my buddy has a rather humble 75" 4k TV in his living room (he couldn't physically fit a larger one) and I still have to sit within 2-3 feet to see the pixels. Map that 75" worth of pixels onto a single object on that screen, and there's no way I'm seeing the pixels - it's just way overkill. A single 4k texture would be like a high res billboard of the side of an entire building, or a complete vehicle or something.
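That 75" observation lines up with a bit of geometry: pixel pitch follows from the diagonal and the resolution, and the distance at which pixels stop being resolvable follows from the usual one-arcminute (20/20 vision) rule of thumb. A quick sketch of that arithmetic:

```python
import math

def pixel_pitch_mm(diagonal_in, horiz_px=3840, vert_px=2160):
    """Physical size of one pixel on a 16:9 panel of the given diagonal, in mm."""
    return diagonal_in * 25.4 / math.hypot(horiz_px, vert_px)

def max_resolvable_distance_m(pitch_mm, acuity_arcmin=1.0):
    """Farthest distance at which one pixel still subtends ~1 arcminute."""
    return (pitch_mm / 1000) / math.tan(math.radians(acuity_arcmin / 60))

pitch = pixel_pitch_mm(75)  # ~0.43 mm per pixel on a 75" 4K panel
print(f"{pitch:.2f} mm pixel pitch")
print(f"~{max_resolvable_distance_m(pitch):.1f} m before individual pixels blur together")
# Roughly 1.5 m (~5 ft) - past that, single pixels are effectively invisible, which
# fits having to sit a couple of feet away to pick them out.
```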

2 hours ago, spartaman64 said:

I'd rather have 1440p 60 fps, or heck, give me 1080p 120 fps. For most games I prioritize how playing the game feels over how it looks. If it's a turn-based game, maybe bring out the 4k, but for games like RDR2 I'd rather have a better playing experience.

Agreed, some games have performance or quality modes that let you switch, but some are just stuck at 4k30 or something.


2 hours ago, Zando Bob said:

^^^ Exactly. AFAIK the "Very High" textures in Rise of The Tomb Raider are 4K textures. I've played that game at 1080p with Very High textures; it'll take 8.5GB VRAM (on a 16GB VRAM card, only 7.5GB on an 8GB VRAM card), and at 4K it's only 1GB more, for 9.5GB. So there's little difference in textures between the resolutions (pretty sure if I ran it at 4K on a card with 8GB VRAM it'd keep the usage at or below that, same as with 1080p). Star Citizen is also (AFAIK) one of the earlier games to start using full 4K textures (they may be larger tho), and I've run that with max quality textures at 1080p as well. 

Hell, even this rando on Steam in 2016 knew that:

[attached screenshot: a Steam forum post from 2016 making the same point]

 

So basically this:

There are some console titles that render at 1440p-1800p then upscale; others run at straight, full 4K resolution. Any time you play a game rendering at 4K (3840x2160 most of the time, though I believe 4096x2160 counts as 4K too), that's 4K gaming. Doesn't matter if it's on a console or a PC, or you run an emulator on a Raspberry Pi and somehow get it to run the OG Zelda games at 3840x2160 - if the game renders at that resolution, and you play it, you're 4K gaming baby.

Exactly - either it renders at 4K or it doesn't. Any rendered resolution can be scaled up or down to the display resolution - and yes, both consoles do that frequently to save on the performance budget. But if a game renders at 4K then it renders at 4K. All other considerations are separate metrics.


11 minutes ago, JZStudios said:

Well, yeah, but you probably aren't actually getting much visual difference at 1080 unless you're just really running into objects or staring at walls. Or using binoculars on something 20 feet away. You could probably pretty safely drop the texture res and get an FPS boost with little to no discernible difference. The issue with high res textures is you can't just make everything literally 4k, because that just wastes assets and 4k is a pretty damn massive image. People that don't print images don't really understand how massive these actually are.

Aight, so first off, this was an example to back up dalekphalm on textures having nothing to do with render resolution - for example, running 4K textures with a render resolution of 1080p. 

In Tomb Raider changing the texture size changes the textures on literally everything, I did notice differences (I forget the exacts, I did my playthrough and testing of Rise of The Tomb Raider back when I had my 1080 and Vega FE) and so I preferred the max textures. But then I'm also the guy who (in that type of game) slows down and examines minute details and also loves the work put into the textures on especially clothing and guns, which I want as good-looking as possible. Also it still ran at a playable framerate at 1080p with everything on max on both my 1080 and Vega FE, and with High textures (Very High ran out of VRAM and crashed) on my 980 Ti, so I didn't need an fps boost when gaming. 

In something like Destiny 2 I can basically run all low and not notice much of a difference, but for screenshots I like to run it at 4K, 200% render scale, max AA, downscaled back to my native 1080p (actually not really downscaling, the game is running and rendering at 4K, but both the built in screenshot utility and Shadowplay/Relive capture it at 1080p since that's the resolution of my monitor, IDK why but it's annoying), since the textures look much, much better that way. It's also the reason I want a 4K monitor, so I can take screenshots at full res, without needing to use DSR to set my monitor resolution itself to 4K and then taking the pics (since I can't switch back to 1080p very easily in-game, whereas I can with the in game settings). 

So yee I know all the stuff about those textures not being "worth it" in general, but they are for me, and I have my own reasons for running them in day-to-day gaming, aside from the reasons I had for the examples I provided, which was specifically testing for VRAM usage at different resolutions and comparing it between 8GB and 16GB VRAM cards. 
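What that 200% render scale workflow amounts to is plain supersampling: render more pixels than the display has, then average them down to the output grid, which is where the cleaner edges and extra texture detail come from. A minimal sketch of a 2x2 box downscale from a 4K frame to 1080p (NumPy used just for brevity; the random array stands in for a rendered frame):

```python
import numpy as np

def downscale_2x(frame):
    """Average each 2x2 block of pixels - a box-filter downscale, e.g. 4K to 1080p."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)  # stand-in for a rendered 4K frame
frame_1080 = downscale_2x(frame_4k)
print(frame_1080.shape)                   # (1080, 1920, 3)
# Every output pixel blends four rendered samples, so edges and fine texture detail
# come out cleaner than a native 1080p render of the same scene.
```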

 


3 minutes ago, Zando Bob said:

Aight, so first off, this was an example to back up dalekphalm on textures having nothing to do with render resolution - for example, running 4K textures with a render resolution of 1080p. 

In Tomb Raider changing the texture size changes the textures on literally everything, I did notice differences (I forget the exacts, I did my playthrough and testing of Rise of The Tomb Raider back when I had my 1080 and Vega FE) and so I preferred the max textures. But then I'm also the guy who (in that type of game) slows down and examines minute details and also loves the work put into the textures on especially clothing and guns, which I want as good-looking as possible. Also it still ran at a playable framerate at 1080p with everything on max on both my 1080 and Vega FE, and with High textures (Very High ran out of VRAM and crashed) on my 980 Ti, so I didn't need an fps boost when gaming. 

In something like Destiny 2 I can basically run all low and not notice much of a difference, but for screenshots I like to run it at 4K, 200% render scale, max AA, downscaled back to my native 1080p (actually not really downscaling, the game is running and rendering at 4K, but both the built in screenshot utility and Shadowplay/Relive capture it at 1080p since that's the resolution of my monitor, IDK why but it's annoying), since the textures look much, much better that way. It's also the reason I want a 4K monitor, so I can take screenshots at full res, without needing to use DSR to set my monitor resolution itself to 4K and then taking the pics (since I can't switch back to 1080p very easily in-game, whereas I can with the in game settings). 

So yee I know all the stuff about those textures not being "worth it" in general, but they are for me, and I have my own reasons for running them in day-to-day gaming, aside from the reasons I had for the examples I provided, which was specifically testing for VRAM usage at different resolutions and comparing it between 8GB and 16GB VRAM cards. 

 

You might be able to go into your Nvidia control panel and "OC" your monitor to 1440 or something.

Makes me wonder what res textures Tomb Raider actually uses. But if you're rendering internally at 4k and scaling it down to 1080, it'll make things look sharper because it'll have better sub-pixel blending and less noticeable visual aliasing.


Just now, JZStudios said:

Makes me wonder what res textures Tomb Raider actually uses. But if you're rendering internally at 4k and scaling it down to 1080, it'll make things look sharper because it'll have better sub-pixel blending and less noticeable visual aliasing.

Yep. It looks much nicer, like I already said, so that's why I do it for screenshots. 

1 minute ago, JZStudios said:

You might be able to go into your Nvidia control panel and "OC" your monitor to 1440 or something.

.... Yes. No. That's DSR (same as in-game, you can use it on the whole monitor itself), and I specifically said I didn't want to do that because I can't switch back easily while in-game. I forget the name in AMD's control panel, but it does the same thing, and I don't do it for the same reason, I can't switch back easily while in-game. 

3 minutes ago, JZStudios said:

Makes me wonder what res textures Tomb Raider actually uses.

From what I've seen it's 4K, but as you and others have stated, that doesn't mean they're all actually 4K - and the game doesn't really give a specific resolution for any of them. 


1 hour ago, JZStudios said:

These things have always had an internal and output resolution, regardless of how it's reported. Playing a 4k video on a 1080 screen doesn't make it a 1080 video.

Not all games let you pick a display/window resolution that's different from the render resolution; that ability is quite new in fact. A lot of games pick texture resolution based on display resolution. I have games with only one graphics setting drop-down, low/medium/high. That's it, and on top of that the resolution does affect the settings. Not all games give you all the options or control over them; these may not be your bigger titles, but if we go purely by the number of games that exist, big titles are actually not the majority of the market.
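For anyone who hasn't run into one of these games, the coupling being described looks roughly like the sketch below: a single quality tier derived from the display resolution, with no separate texture or LoD options. The names and thresholds are entirely hypothetical, just to illustrate the pattern:

```python
# Hypothetical example of settings tied to display resolution - not any specific game.
def derive_settings(display_width, display_height):
    """Pick texture/LoD quality from the display resolution alone, with no separate knobs."""
    pixels = display_width * display_height
    if pixels >= 3840 * 2160:
        return {"texture_quality": "high", "lod_distance": 1.0}
    elif pixels >= 2560 * 1440:
        return {"texture_quality": "medium", "lod_distance": 0.75}
    else:
        return {"texture_quality": "low", "lod_distance": 0.5}

print(derive_settings(1920, 1080))  # a lower display res silently drops texture quality
print(derive_settings(3840, 2160))  # the only way to get "high" textures is to run at 4K
```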


3 hours ago, Derangel said:

The original fucking conversation you replied to was about RESOLUTION and nothing more. So saying that 4K is not the same (which was your exact argument) is saying it's not the same. Give up dude. You keep moving the goalposts and trying to change your argument. You've failed. Move on and stop making a fool of yourself.

How about you read what I said instead of saying I moved some imaginary goalpost. I never once changed my argument; you just keep insisting I said something I absolutely didn't, so stop putting words in my mouth. I said running 4k on console and running 4k on pc are not the same because the visuals on pc are better. 


31 minutes ago, Brooksie359 said:

How about you read what I said instead of saying I moved some imaginary goalpost. I never once changed my argument; you just keep insisting I said something I absolutely didn't, so stop putting words in my mouth. I said running 4k on console and running 4k on pc are not the same because the visuals on pc are better. 

Stop talking about 4K. What you really mean is:

"Running console-level fidelity on a console and running ultra/max-settings fidelity on a high-end PC are not the same, because the visuals on the PC are better."

ftfy

 


32 minutes ago, Brooksie359 said:

How about you read what I said instead of saying I moved some imaginary goalpost. I never once changed my argument; you just keep insisting I said something I absolutely didn't, so stop putting words in my mouth. I said running 4k on console and running 4k on pc are not the same because the visuals on pc are better. 

Or worse. If you build a rig for 4K at a $349 price point (what the 1TB is going for right now, and that's with Jedi Fallen Order included), including the mouse and keyboard - the console includes a controller after all, which you use for the OS and games - then it'll likely look way worse at 4K. 
 

The PC doing 4K with better graphics is at a much higher price point lol. 


2 hours ago, dalekphalm said:

Stop talking about 4K. What you really mean is:

ftfy

 

I mean both. Running a game on pc is not the same as running a game on console because of the difference in fidelity, but to say one has nothing to do with the other is just simply wrong. When you run something on a console at 4k there is a level of fidelity attached to that, so they are not separate matters. 


1 minute ago, Brooksie359 said:

I mean both. Running a game on pc is not the same as running a game on console because of the difference in fidelity, but to say one has nothing to do with the other is just simply wrong. When you run something on a console at 4k there is a level of fidelity attached to that, so they are not separate matters. 

You are correct, but only kind of - in the sense that on most consoles, the graphical settings tend not to have any adjustments at all, or only rudimentary ones tied to resolution.

But that does not change the point - 4K just happens to be tied to lower graphical fidelity on a console, compared to 4K on a very high end PC running Ultra/Max graphical options.

That still has nothing to do with 4K, beyond the fact that the console dev pre-set a graphical target - whereas on PC, you've generally got a wide variety of graphical options.

In fact, a console running a game at 4K often looks and runs identically to a mid-range PC running low or medium graphical options.

So does that mean that PC isn't running 4K? No - it would be rendering near enough identical 4K to the console.


1 minute ago, dalekphalm said:

You are correct, but only kind of - in the sense that on most consoles, the graphical settings tend not to have any adjustments at all, or only rudimentary ones tied to resolution.

But that does not change the point - 4K just happens to be tied to lower graphical fidelity on a console, compared to 4K on a very high end PC running Ultra/Max graphical options.

That still has nothing to do with 4K, beyond the fact that the console dev pre-set a graphical target - whereas on PC, you've generally got a wide variety of graphical options.

In fact, a console running a game at 4K often looks and runs identically to a mid-range PC running low or medium graphical options.

So does that mean that PC isn't running 4K? No - it would be rendering near enough identical 4K to the console.

I doubt that would be the case for Red Dead Redemption 2, the game that I said was a bad example. 


4 hours ago, Zando Bob said:

.... Yes. No. That's DSR (same as in-game, you can use it on the whole monitor itself), and I specifically said I didn't want to do that because I can't switch back easily while in-game. I forget the name in AMD's control panel, but it does the same thing, and I don't do it for the same reason, I can't switch back easily while in-game. 

Okay... sure I guess, but that would work with every game, and if your FPS is still high I don't know what the problem would be. I did it when playing Firewatch at 1440 on my 1080. Looked nice, but dropped my FPS too much.

 

4 hours ago, leadeater said:

Not all games let you pick a display/window resolution that's different from the render resolution; that ability is quite new in fact. A lot of games pick texture resolution based on display resolution. I have games with only one graphics setting drop-down, low/medium/high. That's it, and on top of that the resolution does affect the settings. Not all games give you all the options or control over them; these may not be your bigger titles, but if we go purely by the number of games that exist, big titles are actually not the majority of the market.

I... never said that was common ... and I've NEVER seen a game pick graphics options based on resolution, from 1998 to today. Even games with a simple low/medium/high shouldn't be tied to resolution, and I've never seen a game only have low/medium/high without a resolution option, unless it just matched whatever your Windows registers it as.

 

31 minutes ago, Brooksie359 said:

I doubt that would be the case for Red Dead Redemption 2, the game that I said was a bad example. 

One of the most graphically intensive games in recent years looks virtually identical on PC and Xbox.

Bad example.

 

...okay.


11 minutes ago, JZStudios said:

Okay... sure I guess, but that would work with every game, and if your FPS is still high I don't know what the problem would be. I did it when playing Firewatch at 1440 on my 1080. Looked nice, but dropped my FPS too much.

 

I... never said that was common ... and I've NEVER seen a game pick graphics options based on resolution, from 1998 to today. Even games with a simple low/medium/high shouldn't be tied to resolution, and I've never seen a game only have low/medium/high without a resolution option, unless it just matched whatever your Windows registers it as.

 

One of the most graphically intensive games in recent years looks virtually identical on PC and Xbox.

Bad example.

 

...okay.

They don't look identical lol. And don't show me a 4k video on YouTube saying otherwise because that is not the same as what you would see running it on a pc. I have the game and it does not look like the console version at 4k. 


46 minutes ago, JZStudios said:

and I've NEVER seen a game pick graphics options based on resolution

There's a lot of them, usually ports or low budget games or ones where graphic settings legit don't matter like RTS/TBS games where the asset quality is crap regardless (got heaps of these).

 

46 minutes ago, JZStudios said:

Even games with a simple low/medium/high shouldn't be tied to resolution

That doesn't mean this doesn't exist; should and is are not the same thing. One could spend the effort to give you those options, or they could choose not to because the game isn't COD 234 Re-Remaster 2040. And lately we have started getting extra things that get in the way, like a maximum VRAM setting, so things scale down whichever way the game feels like to meet that setting, and you have no good idea what your texture resolution or LoD is at any given point in time because it changes.

 

I wasn't saying your original statement was wrong in the grander sense; it's just that there are games that do have limited options and scale settings based on resolution. Plus there's the point about display resolution and render resolution being tied together for so long that to most people they're effectively the same thing, which they aren't. 

 

46 minutes ago, JZStudios said:

unless it just matched whatever your Windows registers it as

That's exactly how it's done, and borderless window only matches the display resolution. Pain in the ass.


2 hours ago, leadeater said:

There's a lot of them, usually ports or low budget games or ones where graphic settings legit don't matter like RTS/TBS games where the asset quality is crap regardless (got heaps of these).

So shitty games are the standard we're going for here?

2 hours ago, leadeater said:

That doesn't mean this doesn't exist; should and is are not the same thing. One could spend the effort to give you those options, or they could choose not to because the game isn't COD 234 Re-Remaster 2040. And lately we have started getting extra things that get in the way, like a maximum VRAM setting, so things scale down whichever way the game feels like to meet that setting, and you have no good idea what your texture resolution or LoD is at any given point in time because it changes.

Again, shitty games from presumably indie devs (again, I have games dating back to 1994 that don't have this issue) are not what we're discussing. I only have a handful of strategy games, but even Fantasy General (1996) and Civ 3 (2004) don't have that issue. At absolute worst, something running in DosBox like Tyrian 200 and Fantasy General just has a set resolution that gets scaled to your monitor. Either way, I've never seen either of these things you're describing.

Hell, Splinter Cell (2002) actually works perfectly fine natively at triple screen 5760x1080. And that's an OG Xbox port.

3 hours ago, leadeater said:

That's exactly how it's done, and borderless window only matches the display resolution. Pain in the ass.

...sure. Why is it a pain in the ass to just run the game at native resolution?

 

3 hours ago, Brooksie359 said:

They don't look identical lol. And don't show me a 4k video on YouTube saying otherwise because that is not the same as what you would see running it on a pc. I have the game and it does not look like the console version at 4k. 

They literally zoom in on details, and outside of anisotropic filtering it's virtually identical.


2 hours ago, JZStudios said:

shitty games from assumingly indie devs

Tell that to Square Enix console ports.

 

2 hours ago, JZStudios said:

...sure. Why is it a pain in the ass to just run the game at native resolution?

Because even if a Crossfire profile does not exist, you cannot force Crossfire on in borderless window, only in exclusive fullscreen. Yea, I do care.

 

2 hours ago, JZStudios said:

So shitty games are the standard we're going for here?

No, you said such a thing does not exist; yes it does. No need to blow it out to all games - like, damn it, in my first comment I explicitly said it's the exception to the rule. Did I need to bold that? Honestly I don't think you need to reply; it's not adding anything more to this. You seem to think I'm saying more than I actually am. It's my problem that my game library is full of games that have terrible graphics options; that's how I know it's a thing.


14 hours ago, JZStudios said:

So shitty games are the standard we're going for here?

Again, shitty games from presumably indie devs (again, I have games dating back to 1994 that don't have this issue) are not what we're discussing. I only have a handful of strategy games, but even Fantasy General (1996) and Civ 3 (2004) don't have that issue. At absolute worst, something running in DosBox like Tyrian 200 and Fantasy General just has a set resolution that gets scaled to your monitor. Either way, I've never seen either of these things you're describing.

Hell, Splinter Cell (2002) actually works perfectly fine natively at triple screen 5760x1080. And that's an OG Xbox port.

...sure. Why is it a pain in the ass to just run the game at native resolution?

 

They literally zoom in on details, and outside of anisotropic filtering it's virtually identical.

I have the game and a 4k monitor; it is not the same at all. Footage on YouTube is going to look different than it does in person because detail is lost in the process of recording and uploading to YouTube, so it's a bad way to compare. 


On 12/12/2019 at 7:19 PM, Caroline said:

paid online and you can't "verify" games without being online

 

consoles are a joke

If you're on PC, NO ONE is buying discs for that. You HAVE to be online to get a game for PC. To verify, you have to be online on PC too. YES, you don't need to pay for online services, but you can pay for certain gaming services if you want. On PC you do get a choice for that.


To me, it looks like we are going for a standing, desktop-style console. It's turning into more of a PC. This machine is like a home theater PC. Unless I was going cheap, I would use an Xbox for the home theater experience. 

