
Tim Sweeney explains his comments with respect to IO performance on PS5

hishnash
16 minutes ago, Sauron said:

Says who...? Based on what information detailing the settings RDR2 runs at on a PS4...? Based on what benchmarks measured on a PS4...? Not to mention that GPU performance can vary over time depending on GPU generation etc.; flops are not a particularly good indicator of absolute performance.

 

Also, here we're talking about hardware that, on paper, is faster, not equally fast. Hey, maybe it's true - but this is something that has been blatantly lied about in the past and Mr. Sweeney has a vested interest in hyping the console.

The PS3 had radically different hardware from PCs; it wasn't directly comparable. Current consoles just have slightly modified AMD chips.

If you say so. I feel like I've made my case just fine.

Based on the fact that any given game today CANNOT run on a 1.84 TFLOP card on PC at the same resolution as a base PS4 with similar settings and FPS, ignoring the fact that you made up your claims previously. You are citing yourself as the source of a quote that never existed.


I got an NVMe PCIe 3 SSD because I thought games might come along that required it. I hope it doesn't turn out that even that isn't fast enough within its lifetime.



6 minutes ago, 3rrant said:

Based on the fact that any given game today CANNOT run on a 1.84 TFLOP card on PC at the same resolution as a base PS4 with similar settings and FPS, ignoring the fact that you made up your claims previously. You are citing yourself as the source of a quote that never existed.

Uhm, again, how do you know what settings that game runs at on a PS4? You're dodging the question...



9 minutes ago, Commodus said:

What I suspect: gaming on PCs will have a partial handicap for a while. 

I think the largest issue will be latency; raw throughput you can get around with loading screens if need be.

But having a very low latency connection means you can load data on demand to the GPU without frame stuttering.

 

The solution might be for PC games to require much larger amounts of VRAM, so that you have more time to pre-load game data into VRAM before it's needed.
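As a back-of-the-envelope sketch of how much headroom that implies (all numbers here are my own illustrative assumptions, not measurements from any game or API):

```cpp
// Rough sketch of the "extra VRAM to hide latency" idea above.
// Every figure is an illustrative assumption, not a measured value.
#include <cstdio>

int main() {
    const double stream_rate_mb_per_s = 2000.0; // assumed rate at which a scene consumes new asset data
    const double worst_case_latency_s = 0.100;  // assumed SSD + driver + copy round trip on a PC
    const double safety_factor        = 2.0;    // margin for jitter and mispredicted camera movement

    // Anything consumed while a request is still in flight must already be resident,
    // so the extra VRAM needed is roughly consumption rate x latency (plus margin).
    const double extra_vram_mb = stream_rate_mb_per_s * worst_case_latency_s * safety_factor;
    std::printf("Extra VRAM headroom to pre-load: ~%.0f MB\n", extra_vram_mb);
    return 0;
}
```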


3 minutes ago, Sauron said:

Uhm, again, how do you know what settings that game runs at on a PS4? You're dodging the question...

Are you blind? Start a game on a console, open the same game on PC. Set the resolution equal, set up the settings on PC until they look as close as possible (actually, THE SAME, because the underlying textures and assets are the same). Use a 1.84 TFLOP card on that PC and enjoy your benchmark. This of course requires functioning eyes. Where did you take that claim from? Dodging the question? Oof.


9 minutes ago, Mihle said:

I got an NVMe PCIe 3 SSD because I thought games might come along that required it. I hope it doesn't turn out that even that isn't fast enough within its lifetime.

The solution in the PS5 involves a lot of custom hardware that does not exist on PC, so until that kind of hardware is implemented (if ever), anything above a normal SATA SSD won't really make a difference for gaming on PC.


1 minute ago, hishnash said:

I think the largest issue will be latency; raw throughput you can get around with loading screens if need be.

But having a very low latency connection means you can load data on demand to the GPU without frame stuttering.

 

The solution might be for PC games to require much larger amounts of VRAM, so that you have more time to pre-load game data into VRAM before it's needed.

That's likely the case.  PS5 devs could design many games on the assumption that there will never need to be a loading screen even with intense levels of detail; PC devs may have to fence things off at times or, like you said, dump more things in VRAM (or general RAM, or...) to keep things humming.


2 minutes ago, 3rrant said:

Are you blind? Start a game on a console, open the same game on PC. Set the resolution equal, set up the settings on PC until they look as close as possible (actually, THE SAME, because the underlying textures and assets are the same).

Aside from the fact that I strongly doubt you went and did that (and even if you did, that would be anecdotal evidence), you can't know if the assets are exactly the same, nor can you tell with the naked eye exactly what framerate the game is running at any given time.

 

There are methods to make a game look good and perform well on less capable hardware, don't get me wrong - that doesn't mean that hardware is doing the exact same thing faster.



2 minutes ago, Commodus said:

That's likely the case.  PS5 devs could design many games on the assumption that there will never need to be a loading screen even with intense levels of detail; PC devs may have to fence things off at times or, like you said, dump more things in VRAM (or general RAM, or...) to keep things humming.

Yeah, the scary part of that is how much extra VRAM/system RAM you will need.

 

It is clear that part of what `nanite` does is load only the needed parts of the mesh into the GPU (there is no way a PS5 GPU could take a single one of those full meshes in one go). By having very low latency, it is possible that the GPU is fetching the needed parts of the mesh as your camera moves.

 

You can't even think of doing this from system memory on a PC (due to the latency of the driver round trip, `GPU -> kernel -> user-space driver -> kernel -> copy memory -> GPU`), so in these cases much more of the model will need to be loaded into the GPU. (This is why we normally have multiple resolutions of a model that we can swap out; if the higher-res version isn't there just yet, that's OK, we continue to display the lower-res model.)

It's possible that the `max` res that gets loaded on PCs will be lower, since you just can't stream the data in and (unless you have a workstation GPU) you don't have the VRAM needed to load it all; and even then you don't have the VRAM needed to load all 500 high-res objects in a scene.
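To make that concrete, here is a rough sketch of that kind of camera-driven streaming loop (the types, names and thresholds are hypothetical, not Nanite's actual implementation): request finer data for a chunk as the camera approaches, and keep rendering whatever coarser version is already resident until the copy lands.

```cpp
// Minimal sketch of camera-driven mesh streaming (hypothetical names/thresholds,
// not Nanite's real implementation): ask for finer data as the camera approaches,
// keep rendering the coarser resident LOD until the async copy finishes.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct MeshChunk {
    Vec3 center;           // bounding-sphere centre of this piece of the mesh
    int  residentLod  = 3; // 3 = coarsest LOD, always kept in VRAM
    int  requestedLod = 3;
};

static float dist(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

static int lodForDistance(float d) {       // 0 = finest ... 3 = coarsest
    if (d < 10.f)  return 0;
    if (d < 40.f)  return 1;
    if (d < 160.f) return 2;
    return 3;
}

// Stand-in for the real SSD -> VRAM copy; here it just marks the LOD resident.
static void asyncLoadChunkLod(MeshChunk& chunk, int lod) { chunk.residentLod = lod; }

static void updateStreaming(std::vector<MeshChunk>& chunks, const Vec3& camera) {
    for (auto& c : chunks) {
        const int wanted = lodForDistance(dist(c.center, camera));
        if (wanted < c.requestedLod) {     // camera moved closer, need finer data
            c.requestedLod = wanted;
            asyncLoadChunkLod(c, wanted);  // low latency makes "just in time" requests viable;
                                           // on PC you would request further ahead of the camera
        }
        // Render using c.residentLod here; it never exceeds what is actually in VRAM.
    }
}

int main() {
    std::vector<MeshChunk> chunks = { { {0.f, 0.f, 5.f} }, { {0.f, 0.f, 500.f} } };
    updateStreaming(chunks, {0.f, 0.f, 0.f}); // near chunk requests LOD 0, far one stays at LOD 3
}
```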
 


Just now, Sauron said:

Aside from the fact that I strongly doubt you went and did that (and even if you did, that would be anecdotal evidence), you can't know if the assets are exactly the same, nor can you tell with the naked eye exactly what framerate the game is running at any given time.

 

There are methods to make a game look good and perform well on less capable hardware, don't get me wrong - that doesn't mean that hardware is doing the exact same thing faster.

Assets are the same. Identical to the core. This is a given in any type of video game development environment for a multiplatform project. Also, there are capture cards available for PC and consoles alike to test frame rate and consistency; that's how Digital Foundry makes its videos. So yes, you can test it.


4 minutes ago, 3rrant said:

Assets are the same. Identical to the core. This is a given in any type of video game development environment for a multiplatform project. Also, there are capture cards available for PC and consoles alike to test frame rate and consistency; that's how Digital Foundry makes its videos. So yes, you can test it.

You can test the framerate with equipment, sure, but you can't know what the settings are through that equipment. If you can, please stop wasting both our time and provide said evidence.

 

As for assets being the same, that's not a given at all. There have been plenty of instances of a game's assets being modified to accommodate different platforms. Call of Duty games on the Wii sure as hell weren't using the same assets as the PC or PS3 versions. Besides, the same assets can be downgraded if need be; for instance, you can lower the resolution of some textures with virtually no added development time.



5 hours ago, FloRolf said:

Wouldn't that completely defeat the purpose of all that low-latency, high-bandwidth storage stuff? Most people have 50/10 connections or even worse.

 

I agree though; Sweeney and Sony can tell me what they want, but unless I see real-world results it doesn't matter to me. In all honesty, I don't expect much.

I don't have a good enough internet connection to get the full download performance of my SSD, but it does help with loading games and large videos.



4 hours ago, 3rrant said:

Based on the fact that any given game today CANNOT run on a 1.84 TFLOP card on PC at the same resolution as a base PS4 with similar settings and FPS, ignoring the fact that you made up your claims previously. You are citing yourself as the source of a quote that never existed.

I don't know what card you're referencing (and your claim right here is false), but games tend to run better on a PS4 than on a similarly priced PC because of optimization; those games were meant to run on PS4, not PC.



3 hours ago, 3rrant said:

Assets are the same. Identical to the core. This is a given in any type of video game development environment for a multiplatform project.

See, here's the problem: you can't really change your settings on a console, so how do you know what you're running at on the console?



5 hours ago, hishnash said:

Need large OS kernel changes first. 

It is the sort of thing that Apple could do easily but MS would struggle with massively. Possibly we could see stuff like the Radeon SSG, where SSDs are attached directly to the GPU.

Microsoft has said that it will be bringing some parts of DirectStorage to PCs, although this won't result in the same IO performance improvements we're seeing here.



Just now, VegetableStu said:

 

You know, I'm kind of wondering if they could stream the model files over the internet, just like how YouTube buffers its videos ahead of playtime o_o

So the local storage would be caching the BSPs of the areas the player could end up in, instead of having the entire game's models ready on hand.

No, they will be using local storage. When people talk about streaming here, they are talking about streaming into the GPU's VRAM, since the entire model data (even for a single model) might be more than the total VRAM on the system. The idea is that you selectively load the parts you need for the current distance/camera angle, then as the camera or model moves you stream that data from the SSD to the GPU on demand.


5 hours ago, Sauron said:

Also, here we're talking about hardware that, on paper, is faster, not equally fast. Hey, maybe it's true - but this is something that has been blatantly lied about in the past and Mr. Sweeney has a vested interest in hyping the console.

And what exactly is that?



2 minutes ago, VegetableStu said:

Yeah, but having a billion triangles in a particular level (and let's face it, if the game only has one thematic level or beat, surely the studio would be in a position to optimise the models a bit) would make the game comparable to the collective filesize of Train Simulator with all its DLC ._.

 

All in all, I feel like Nanite would benefit studios working on films more than those working on games, but for it to work in a game in every sense, I kinda wonder if 2TB (over NVMe) would be enough.

They still had a lot of `duplicated` meshes; I reckon there is a LOT of compression going on. Raw meshes don't need to take up that much space (you can compress mesh data a LOT). If it is true that the PS5 has a dedicated fixed-function hardware decompression solution, then it is very possible that this will be used a lot.

That said, I do expect them to support playing before the game is fully downloaded (like you can with StarCraft etc.).
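To illustrate why raw mesh data compresses so well (this is a generic example on my part, not what UE5 or the PS5 decompression hardware actually does), positions are often quantised from FP32 to 16-bit offsets inside a chunk's bounding box before any general-purpose compressor even runs:

```cpp
// Illustrative only: quantise FP32 vertex positions to 16-bit offsets within the
// chunk's bounding box, halving position storage before generic compression runs.
// (Not a claim about what UE5 or the PS5 decompression hardware actually does.)
#include <cstdint>
#include <cstdio>

struct QuantisedPos { uint16_t x, y, z; };   // 6 bytes vs 12 bytes for three floats

static uint16_t q(float v, float lo, float hi) {
    const float t = (v - lo) / (hi - lo);                // 0..1 within the bounding box
    return static_cast<uint16_t>(t * 65535.0f + 0.5f);   // spread over the 16-bit range
}

static QuantisedPos quantise(float x, float y, float z,
                             const float minB[3], const float maxB[3]) {
    return { q(x, minB[0], maxB[0]), q(y, minB[1], maxB[1]), q(z, minB[2], maxB[2]) };
}

int main() {
    const float minB[3] = {0.f, 0.f, 0.f}, maxB[3] = {10.f, 10.f, 10.f};
    const QuantisedPos p = quantise(1.25f, 7.5f, 3.f, minB, maxB);
    std::printf("%u %u %u\n", (unsigned)p.x, (unsigned)p.y, (unsigned)p.z);
    return 0;
}
```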


12 minutes ago, VegetableStu said:

You're thinking of that scene with a lot of statues; that's definitely something that can be instanced (reminds me of Unlimited Detail's early demos with the RuneScape BGM). I'm referring to the one where they used an IRL scan of a cave or valley of some sort (or that final flythrough scene, there have to be more than 20 unique models there), but admittedly I'm not aware of the amount of decimation and compression, or of the rough polycount-to-filesize density estimates in photogrammetry work, just from not playing with those models and source images firsthand enough ,_,

So if you have a face per pixel (as they were quoting), then that is roughly 3x INT32 + 3x FP32 per face; on a 4K screen you have 8,294,400 pixels, which would imply roughly 200MB for a raw, uncompressed contiguous mesh. A mesh like this should be very compressible, so I would expect to see that reduced to 50MB easily. A high-resolution mesh like this does not use up much more space than a high-resolution texture plus a medium-res texture.

 

Yes, the mesh on disk will be higher resolution than this, maybe 10 times higher, but that is still just 500MB worst case with compression.
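For anyone who wants to check that arithmetic (using the one-face-per-pixel and 24-bytes-per-face assumptions from the post above):

```cpp
// Re-doing the estimate above: one face per pixel at 4K, 3x INT32 + 3x FP32 per face.
#include <cstdio>

int main() {
    const long long faces        = 3840LL * 2160LL;  // ~8,294,400 visible faces
    const long long bytesPerFace = 3 * 4 + 3 * 4;    // 3x INT32 + 3x FP32 = 24 bytes
    const double    rawMB        = faces * bytesPerFace / 1e6;
    std::printf("Raw visible mesh data: ~%.0f MB\n", rawMB);  // ~199 MB, i.e. the ~200MB quoted
    return 0;
}
```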
 


6 hours ago, 3rrant said:

The solution in the PS5 involves a lot of custom hardware that does not exist on PC, so until that kind of hardware is implemented (if ever), anything above a normal SATA SSD won't really make a difference for gaming on PC.

I mean, the Xbox Series X also has an NVMe drive, but as far as I know it isn't as custom as the one in the PS5. If games were made for that, a PC NVMe drive could handle it fine while SATA might not.



7 hours ago, Sauron said:

You can test the framerate with equipment, sure, but you can't know what the settings are through that equipment. If you can, please stop wasting both our time and provide said evidence.

 

As for assets being the same, that's not a given at all. There have been plenty of instances of a game's assets being modified to accommodate different platforms. Call of Duty games on the Wii sure as hell weren't using the same assets as the PC or PS3 versions. Besides, the same assets can be downgraded if need be; for instance, you can lower the resolution of some textures with virtually no added development time.

Truthfully, comparing games on consoles vs. PCs is sometimes comparing apples to oranges (in that you can't always compare them fairly).

 

Having predictable hardware can make a huge difference in terms of performance and what you can do in a game. For example, designing a game to run on a "weaker" system with known constraints, rather than writing it to run on 70% of people's systems, can result in a better-looking game on the "weaker" hardware.

 

Not saying that the PS4 was powerful or anything, just that sometimes you can't run benchmarks on the "same" hardware and expect to be able to properly compare results. The PS3 is a prime example: the early games on it were worse than the PC versions, but as developers began utilizing the Cell better (and managing memory better), the games kept pace with PCs despite the PC being "better" hardware.

 

I think the predictability of hardware does play a larger role, though, and if Sony has made progress in tweaking the hardware to better benefit streaming data between components, I could see this actually making a larger impact (in the sense that some features might be added to PS5-exclusive games that wouldn't be feasible in PC games due to how the render process would have to occur).



4 minutes ago, wanderingfool2 said:

The predictability of hardware 

The predictability is massive. Even automated targeting that is not hand-crafted, like the compiler knowing how much L1 and L2 cache is available, can provide significant optimisations so that when the code runs it has fewer cache misses.

 

When you scale this up to the GPU side, these wins get even larger: if you know the exact GPU arch you will be running on, you can pre-process your game data. Most GPUs have a `preferred` memory alignment when it comes to reading and writing data.
 

If memory access is not aligned, the GPU will need to add a load of additional bit shifts (and masks) to align the data before reading and writing (massively hampering performance).

 

You can handle this at runtime, but it might require you to re-shape game data before copying it to the GPU (costing CPU cycles but, most importantly, latency).

 

A good example of this is in Apple's Metal API: you can ask at runtime what alignment you need to use for a texture (https://developer.apple.com/documentation/metal/mtldevice/2866126-minimumlineartexturealignment), but if you know that you are only ever going to run on one family of GPUs, you can do all of this work in advance and have your textures pre-shaped and ready to go. In fact, for iOS you can pre-prepare multiple versions of your assets, and when users download your app they get the correct version for their hardware; this can save a lot of CPU time and thus battery (very important on a mobile device).
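A tiny sketch of what that "pre-shaping" amounts to (the numbers are illustrative; in practice the alignment value would come from the API call linked above, or be baked in offline when the GPU family is known in advance):

```cpp
// Pad each texture row up to the GPU's required alignment ahead of time, so the
// asset on disk can be copied straight to the GPU with no per-row re-packing.
// The 256-byte alignment below is an assumed value, not a figure from any spec.
#include <cstdint>
#include <cstdio>

// Round 'value' up to the next multiple of 'alignment' (alignment must be a power of two).
constexpr uint32_t alignUp(uint32_t value, uint32_t alignment) {
    return (value + alignment - 1) & ~(alignment - 1);
}

int main() {
    const uint32_t width         = 1000;  // texels per row
    const uint32_t bytesPerTexel = 4;     // e.g. RGBA8
    const uint32_t alignment     = 256;   // assumed required row alignment for this GPU family

    const uint32_t tightPitch  = width * bytesPerTexel;           // 4000 bytes
    const uint32_t paddedPitch = alignUp(tightPitch, alignment);  // 4096 bytes

    std::printf("tight %u bytes/row -> padded %u bytes/row\n", tightPitch, paddedPitch);
    return 0;
}
```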

 

 

 


I don't really understand all the technical stuff going on behind this, but isn't it good for consoles?

I mean, unless they release a PS5+, the PS5 will be stuck with its hardware for the entire life of the console.

 

At first, because of the dedicated hardware and all the stuff you guys are talking about, yeah, a PS5 might run better than a PC (of the same price, or even a high-end one).

But to compensate for its lack of optimization, the PC will keep getting more and more raw power year after year.

 

I guess this is why you can still see very nice games running on a PS4, and those games are even nicer on PC, IMO, if you take RDR2 for instance.



59 minutes ago, wanderingfool2 said:

Truthfully, comparing games on consoles vs. PCs is sometimes comparing apples to oranges (in that you can't always compare them fairly).

 

Having predictable hardware can make a huge difference in terms of performance and what you can do in a game. For example, designing a game to run on a "weaker" system with known constraints, rather than writing it to run on 70% of people's systems, can result in a better-looking game on the "weaker" hardware.

 

Not saying that the PS4 was powerful or anything, just that sometimes you can't run benchmarks on the "same" hardware and expect to be able to properly compare results. The PS3 is a prime example: the early games on it were worse than the PC versions, but as developers began utilizing the Cell better (and managing memory better), the games kept pace with PCs despite the PC being "better" hardware.

 

I think the predictability of hardware does play a larger role, though, and if Sony has made progress in tweaking the hardware to better benefit streaming data between components, I could see this actually making a larger impact (in the sense that some features might be added to PS5-exclusive games that wouldn't be feasible in PC games due to how the render process would have to occur).

That's kind of my point: a game running smoothly isn't necessarily an indication of the hardware performing beyond expectations.



13 minutes ago, Sauron said:

That's kind of my point: a game running smoothly isn't necessarily an indication of the hardware performing beyond expectations.

Yep, consoles just make you work for it. On PC you can expect new GPU archs every two years or so, and optimization is simply less effective there due to variation in hardware, whereas consoles can last 5+ years and the hardware never changes. If you want to achieve more, you have only one place to look: software optimization.

