
How likely is it that the newest AAA games could be optimized to the point where they run almost double the FPS at the same settings?

podkall


Note: Users receive notifications after Mentions & Quotes. 

Feel free to ask any questions regarding my comments/build lists. I know a lot about PCs but not everything.

PC:

Ryzen 5 5600 | 16GB DDR4 3200MHz | B450 | GTX 1080 Ti

PCs I used before:

Pentium G4500 | 4GB/8GB DDR4 2133MHz | H110 | GTX 1050

Ryzen 3 1200 3.5GHz / OC: 4GHz | 8GB DDR4 2133MHz / 16GB 3200MHz | B450 | GTX 1050

Ryzen 3 1200 3.5GHz | 16GB 3200MHz | B450 | GTX 1080 Ti


As a programmer who has also participated in nearly a dozen game jams, I think that if a company devoted full employee time to nothing but optimizing those games for a month or two, you probably could get them to run at double their current FPS.

 

The problem is that the ROI on that sort of thing is minimal. They make money when they sell the game, or on DLC, skins, legalized-gambling loot boxes, etc. The economic incentive to do this isn't there.


We really need different words. 

These engines ARE optimized. They are doing FAR more under the hood with fewer resources than ever before. It just so happens we are using that ability to do... MORE.

We had this same conversation when DX11 came out with tessellation. The methods we had for drawing models at that much detail before DX11 were far more expensive than tessellation, regardless of how expensive tessellation was.

Also, I know your meme says "high," but if a game can run at 4K 60 FPS on the flagship GPU at ultra at launch, the devs fucked up and left fidelity on the table.


I'm glad enough if they can consistently release a bug-free game at launch.

There is approximately 99% chance I edited my post

Refresh before you reply

__________________________________________

ENGLISH IS NOT MY NATIVE LANGUAGE, NOT EVEN 2ND LANGUAGE. PLEASE FORGIVE ME FOR ANY CONFUSION AND/OR MISUNDERSTANDING THAT MAY HAPPEN BECAUSE OF IT.


1 minute ago, starsmine said:

We really need different words. 

These engines ARE optimized. They are doing FAR more under the hood with fewer resources than ever before. It just so happens we are using that ability to do... MORE.

We had this same conversation when DX11 came out with tessellation. The methods we had for drawing models at that much detail before DX11 were far more expensive than tessellation, regardless of how expensive tessellation was.

Engine optimization and game optimization are two separate issues.

 

During practice for a game jam, I accidentally started nesting game objects onto a prefab instead of instantiating a new object like I had intended. The game, which was very simple, started taking longer and longer to launch, and eventually slowed to a crawl. The Unity engine was not at fault; I, the game developer, was.
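The failure mode described above can be sketched in plain Python (hypothetical code, not actual Unity C# — `prefab`, `spawn_buggy`, and `spawn_fixed` are illustrative names): mutating the shared template instead of a fresh copy means the template grows with every spawn.

```python
import copy

# A "prefab": a shared template that spawned objects should be copied from.
prefab = {"name": "enemy", "children": []}

def spawn_buggy(template):
    # BUG: attaches the new child to the shared template itself,
    # so the template silently grows with every spawn.
    template["children"].append({"name": "spawned", "children": []})
    return template

def spawn_fixed(template):
    # Correct: deep-copy the template and modify only the copy.
    instance = copy.deepcopy(template)
    instance["children"].append({"name": "spawned", "children": []})
    return instance

for _ in range(1000):
    spawn_buggy(prefab)
print(len(prefab["children"]))  # 1000 -- the "prefab" has bloated

clean = {"name": "enemy", "children": []}
for _ in range(1000):
    spawn_fixed(clean)
print(len(clean["children"]))  # 0 -- template left untouched
```

Each buggy spawn permanently fattens the template, so every later spawn (and every save/load of it) gets slower — exactly the creeping slowdown described.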

 

That was a particularly dumb mistake, which is why it comes to mind so easily, but smaller mistakes or sub-optimal methods of coding happen all the time. The bigger the game, the more likely such mistakes are to creep in, as you have more cooks in the kitchen and it's harder to spot errors.
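As a toy illustration of how one of those small sub-optimal choices adds up (a hypothetical Python example, not from any shipped game): checking membership in a list is O(n) per lookup, while a set is O(1), and code that runs every frame magnifies the difference.

```python
import timeit

# 10,000 entity IDs, stored two ways.
ids_list = list(range(10_000))
ids_set = set(ids_list)

# Simulate a per-frame "is this entity active?" check, 1,000 times.
slow = timeit.timeit(lambda: 9_999 in ids_list, number=1_000)
fast = timeit.timeit(lambda: 9_999 in ids_set, number=1_000)
print(f"list lookup is ~{slow / fast:.0f}x slower than set lookup")
```

Either version is "correct," which is why this kind of thing slips through review — it only shows up in the frame time.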

 

It's not just "games are doing more." It's that game studios aren't prioritizing code cleanup and optimization, in favor of putting in more features — a balancing act that has to be struck.

 

There's a guy who recently went back and cleaned up the code of the original Super Mario 64 game, and got it to run at 60fps on N64 hardware, just by optimizing the game using modern coding techniques and the benefit of hindsight. There is no reason that games released today couldn't have the same thing done to them.
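A generic example of that kind of hindsight optimization (hypothetical Python, not actual SM64 code): precompute a sine lookup table once instead of calling the math library per object per frame — a classic trick on hardware with slow floating-point units.

```python
import math

# Build the table once at startup.
TABLE_SIZE = 4096
SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(angle):
    # Index into the precomputed table; accurate enough for animation
    # and far cheaper than a full sin() evaluation on weak hardware.
    i = int(angle / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
    return SIN_TABLE[i]

print(fast_sin(math.pi / 2))  # close to 1.0
```

The trade is a few kilobytes of memory for thousands of avoided function calls per frame — exactly the kind of win hindsight (and profiling) makes obvious.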


10 minutes ago, starsmine said:

if a game can run at 4K 60 FPS on the flagship GPU at ultra at launch, the devs fucked up and left fidelity on the table.

So much this. It used to be that almost no 3D game could run at top settings at launch; the point was that you could run it looking even better in a year or two when new GPUs came out. That's the point of having settings.

 

The fact that Ultra these days is often only aiming at current top-end hardware kinda sucks.

 

Is the MEME supposed to be sarcastic?  Games have never been optimised to run at 4K 120fps.

 

3 minutes ago, YoungBlade said:

There's a guy who recently went back and cleaned up the code of the original Super Mario 64 game, and got it to run at 60fps on N64 hardware, just by optimizing the game using modern coding techniques and the benefit of hindsight. There is no reason that games released today couldn't have the same thing done to them.

Technically sure, you are correct, but you already pointed out the reason why it can't happen - it would cost a fortune.

Router: Intel N100 (pfSense) WiFi 6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi 5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi 6e/7


7 minutes ago, starsmine said:

We really need different words. 

These engines ARE optimized. They are doing FAR more under the hood with fewer resources than ever before. It just so happens we are using that ability to do... MORE.

We had this same conversation when DX11 came out with tessellation. The methods we had for drawing models at that much detail before DX11 were far more expensive than tessellation, regardless of how expensive tessellation was.

Also, I know your meme says "high," but if a game can run at 4K 60 FPS on the flagship GPU at ultra at launch, the devs fucked up and left fidelity on the table.

That's a garbage take. I have consistently seen games come out with amazing graphics that run at high FPS, and I have seen games come out with horrible FPS where the graphics still sucked. Also, fidelity is already so good in a lot of games that you would be better off making them easier to run on a larger range of rigs.


-Topic locked-

 

The body of this OP is best suited to a Status Update.

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866MHz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


This topic is now closed to further replies.
