UE 5 demo featuring limitless detail: Gaze upon Unreal Engine's true form!

Delicieuxz
40 minutes ago, Commodus said:

Oh, I don't doubt that resolutions and frame rates can be higher. But you're not really missing out on the core experience, and early on (I figure roughly the first year or two) PC gamers sometimes have to look on with a bit of envy.  And until late in the console cycle, getting those higher resolutions and frame rates can be prohibitively expensive for some.  Yeah, the rumored 3080 Ti might well outrun the PS5's GPU in most respects, but it'll likely cost twice as much as the entire PS5.

TL;DR: Amazing graphics at 30 fps doesn't mean next-gen consoles will beat high-end PCs.

 

A PC doesn't really need a 3080 Ti to beat a PS5.

You can already beat it with existing hardware. Put the SSD thing aside, because it doesn't mean a lot: PCs already have 5 GB/s SSDs, but in practice they mostly just cut loading times, since a PC has more RAM and VRAM, while the PS5 uses its SSD to compensate for its 16 GB of combined RAM/VRAM. PC has always been able to reach console quality and almost always run better, unless it's a low-end PC. If you want to be on the level of consoles, you can push everything to high/ultra settings and run at 30 fps on a medium/low-budget PC.

My old rig, which retired last summer at almost 7 years old, had a 3rd-gen i3, 8 GB of RAM and a 750 Ti (before it I had some Nvidia GPU I borrowed from a friend, forgot the model name), later upgraded to a 1050 Ti because the 750 Ti slipped and got damaged while I was cleaning the dust off it. That rig handled most games at 1080p high settings between 40-70 fps, which was already better than the current-gen consoles, from 2013 until the end of summer 2019. Even then I could have kept running the same rig on high/medium settings and dropped to 30 fps.

But I upgraded to an i5-8600, 16 GB of RAM, a 1 TB NVMe, a 1 TB HDD with room for more storage, and a 2070 Super, which now runs all games at maxed-out settings at 120+ fps @ 1080p, something consoles could never do, even next gen. Next-gen consoles will most probably run on high settings the same as a PC, but hit 120+ fps too? I doubt it. They may have a 120 fps mode that disables some graphics settings to reach it, or that only works in a select number of games. Then again, I hope I'm wrong, because each gen I get an Xbox for the living room; otherwise I'll start using Steam Link to play on the TV, I guess.
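To put rough numbers on the SSD point (the PS5 figure is its published raw spec; the PC drive and memory amounts are just assumed typical values, not anyone's actual machine):

```python
# Back-of-envelope: how fast can storage refill memory with assets?
# Figures are rough public specs / typical parts, not measurements.
ps5_ssd_gbps = 5.5          # PS5 raw SSD throughput, GB/s (published spec)
ps5_unified_ram_gb = 16     # shared RAM/VRAM pool

pc_ram_gb = 16              # an assumed typical gaming PC
pc_vram_gb = 8
pc_nvme_gbps = 3.5          # a common PCIe 3.0 NVMe drive

# Time to completely refill each memory pool from disk:
ps5_refill_s = ps5_unified_ram_gb / ps5_ssd_gbps
pc_refill_s = (pc_ram_gb + pc_vram_gb) / pc_nvme_gbps

print(f"PS5: {ps5_refill_s:.1f} s to stream 16 GB")
print(f"PC:  {pc_refill_s:.1f} s to stream 24 GB")
```

So the PS5 can turn over its whole pool a bit faster, but the PC is streaming into a larger pool to begin with, which is the point about RAM/VRAM compensating.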

 

For me, frames are more important; they make a huge difference. One of the easiest ways to notice it is to set up a racing game on a console and a PC, like Forza Horizon 4: on the Xbox One it ran at 30 fps and driving at high speed felt sluggish, while on PC it ran at 140+ fps with everything maxed out, the graphics looked much better than on console, and driving at high speed was smooth AF, which made the experience better.
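The frame-time arithmetic shows why 30 fps feels sluggish, a quick sketch:

```python
# Each frame stays on screen for 1000/fps milliseconds; at 30 fps
# every frame lingers more than four times as long as at 140 fps.
def frame_time_ms(fps):
    """Milliseconds each frame is displayed."""
    return 1000.0 / fps

for fps in (30, 60, 140):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

That gap in frame time is also the gap in input latency you feel when steering at high speed.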

 

It's all marketing: every gen of consoles supposedly "OUTPERFORMS THE HIGHEST-END PC!!!"

It runs decently at first, then one or two years later, if you compare games across all platforms, you'll see the consoles start to lag further behind PCs (talking from my own experience).

 

 

  


1 hour ago, Goliath_1911 said:

But I upgraded to an i5-8600, 16 GB of RAM, a 1 TB NVMe, a 1 TB HDD with room for more storage, and a 2070 Super, which now runs all games at maxed-out settings at 120+ fps @ 1080p, something consoles could never do, even next gen.

The consoles could do it. The PS5's GPU is likely, at worst, around the performance of the 5700 XT, the XSX's is faster, and this time there's no underpowered CPU like in previous console generations. So hardware-wise the consoles could reach the performance of your 2070S. The issue is that console players often don't get the option to choose high FPS over high resolution: with the popularity of 4K TVs, the target is likely going to be something closer to 4K60 (1440p60 or higher). That might change as TVs capable of 120 Hz become more common, more often giving the choice between high resolution and high frame rate.
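For a sense of that trade-off, here's the raw pixel throughput each target demands (pure arithmetic; shading cost doesn't scale perfectly with pixel count, so treat it as a rough comparison):

```python
# Pixels per second the GPU must produce for each resolution/refresh target.
targets = {
    "1080p120": (1920, 1080, 120),
    "1440p60":  (2560, 1440, 60),
    "4K60":     (3840, 2160, 60),
}

throughput = {name: w * h * fps for name, (w, h, fps) in targets.items()}
for name, px in sorted(throughput.items(), key=lambda kv: kv[1]):
    print(f"{name}: {px / 1e6:.0f} Mpix/s")
```

1440p60 and 1080p120 cost roughly the same; 4K60 costs about double, which is why the 4K target eats the headroom that could have gone to frame rate.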


On 5/14/2020 at 5:41 AM, StDragon said:

Wow, that's a sick real-time graphics demo like I've never seen before. Looking back, it's better-looking than many CGI movie animations of the past, and those took render farms weeks to months of computation (ray traced, I know, raw cycles required, so it's not the same method for comparison). So while this isn't pure ray tracing, it's good enough!

 

Well that settles it. It couldn't be more clear. We need a PS5 PCIe card for the GPU. 🤣

 

 

Pixar actually didn't fully ray trace a movie until Monsters University: https://www.neogaf.com/threads/pixar-uses-real-raytracing-for-first-time-in-monsters-university.569821/

Up until then, they used a variety of shortcuts to reduce render times.

Resident Mozilla Shill.   Typed on my Ortholinear JJ40 custom keyboard
               __     I am the ASCIIDino.
              / _)
     _.----._/ /      If you can see me you 
    /         /       must put me in your 
 __/ (  | (  |        signature for 24 hours.
/__.-'|_|--|_|        

2 hours ago, FezBoy said:

Pixar actually didn't fully ray trace a movie until Monsters University: https://www.neogaf.com/threads/pixar-uses-real-raytracing-for-first-time-in-monsters-university.569821/

Up until then they used a variety of shortcuts to reduce render times.

🤯 I had no idea. I assumed it was all ray traced, going back even to the special reflective 3D effects of the 80s, which were ray traced. It is in fact a very old method, with the downside of requiring lots of general-purpose CPU cycles (software only). But it's an extremely elegant method.
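The core primitive really is that old and that simple; here's a minimal ray-sphere intersection sketch (the function name and scene values are mine, purely for illustration). The cost comes from running millions of these tests per frame, once per pixel per bounce:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None on a miss.
    'direction' must be a unit vector."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c          # quadratic discriminant
    if disc < 0:
        return None               # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# One ray looking down +z at a unit sphere 5 units away:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # -> 4.0
```

Elegant, but every reflection or shadow means firing yet more of these rays, which is exactly the "raw cycles" problem.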

 

So I clicked on that link, which led to jrMan.

 

Quote

jrMan is an open source implementation of the REYES (Render Everything You Ever Saw) algorithm used by Pixar's PhotoRealistic Renderman to render images.

 

Although PhotoRealistic Renderman is used to generate most CGI (Computer Generated Images) in motion pictures, textbooks on computer graphics do not explain how its REYES algorithm works (actually many books do not even mention it!).

 

Learned something new. Thanks for sharing.
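For reference, the REYES trick the quote describes is dicing every surface into micropolygons smaller than a pixel and shading those directly, instead of firing rays. A toy sketch of the dicing arithmetic (the function and numbers are made up for illustration):

```python
import math

# Toy REYES-style dicing: split a patch until each micropolygon
# covers at most 'shading_rate' pixels, then shade each one once.
def dice(patch_size_px, shading_rate=1.0):
    """Micropolygons needed for a square patch of the given pixel size."""
    n = math.ceil(patch_size_px / shading_rate)
    return n * n

# A 32x32-pixel patch at a one-pixel shading rate:
print(dice(32))  # -> 1024 micropolygons, no rays involved
```

The work scales with what's on screen rather than with ray bounces, which is how the shortcut cut render times for so long.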


Noice, they probably optimized the shit out of the hardware, but that's what you get with consoles. I'm still impressed by how well Final Fantasy 7 Remake, which uses UE4, runs on the base PS4.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

