Does anyone else think.....

emosun

...that this is Nvidia trying as hard as they can to sell high refresh rate gaming as something we need, due to a lack of innovation/interest in the market in general?

 

Does High FPS make you a better gamer? Ft. Shroud - FINAL ANSWER

https://www.youtube.com/watch?v=OX31kZbAXsA

 

My two cents

In terms of realism, such as 3D objects, effects, and textures... not a lot has happened in several years. Be it down to a lack of innovation, or game developers just hitting their time/budget limit for how real a game can look and still be profitable, I cannot help but feel that pushing a refresh rate of 240Hz is just a way to create a new benchmark to hit, despite 99% of people not needing it. More or less an excuse to create demand for something we didn't know we wanted until they told us we wanted it.

Basically, I wish they'd focus on realism over refresh rate. But I think Moore's law is forcing them down this refresh rate path, to more or less hide the fact that they're short on ideas.


High FPS does make things seem a lot smoother and nicer to look at, and the realism in games has also increased, especially faces, which look a lot more realistic. You might think things aren't getting nicer because you're playing on lower settings. Ray tracing has increased realism a lot too.


I don't think Nvidia's the one still trying to sell high refresh rate gaming, seeing how poor frame rates get with RTX and GameWorks features all cranked up to the max.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread: 168 Multi-thread: 833

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


The real advantage to pushing refresh rate over realism, for game developers, is that they have to focus a lot less on looks if the game is 'competitive'. What I mean is that they can simply say "you will want a higher refresh rate, thus lower the settings anyway". That way looks are less important and the devs won't have to work on them as much; they just have to focus on making a game that can be played at high refresh rates.

If the dev is also making the game for console, that makes a ton of sense. That way they can simply push the same game to all platforms; the PC just has the ability to display more frames.

 

But I really don't feel this is the case, since the best example of that fps > settings thing is CS:GO, which is not on consoles. The other example would be Fortnite, which looks unrealistic for two reasons, IMO:

1. art style

2. so 'everyone' can play it.

 

Personally, I do think graphics have improved quite a bit in the last couple of years. If you compare Skyrim to The Witcher 3 you can already see a large difference. Both were regarded as very pretty games at their time of release.

Although W3 is from 2015 and (IMO) no game has surpassed it in looks...

So maybe there is a point to being mad there.

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/


4 minutes ago, minibois said:

Although W3 is from 2015 and (IMO) no game has surpassed it in looks...

So maybe there is a point to being mad there.

That, and Skyrim is from 8 years ago, so it really wouldn't fall into 'last couple of years' territory.


Just now, emosun said:

That, and Skyrim is from 8 years ago, so it really wouldn't fall into 'last couple of years' territory.

What I meant is that there has been improvement in graphics expectations in the last decade.

Skyrim used to be the standard for graphics (until we saw how shit the textures looked, oops), until games like Metro, The Witcher 3, and that sort of thing came on the scene.

 

Was just trying to give a counterpoint: not everyone is focusing solely on refresh rate.

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards workhttps://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/


8 minutes ago, Jurrunio said:

poor frame rates get with RTX

I'm actually quite amazed someone can consider 60fps poor when Nvidia's and ATI's flagships were barely pushing 30fps when Crysis released.

 


14 minutes ago, Darpyface said:

the realism in games has also increased, especially faces, which look a lot more realistic

 

[image attachment]


3 minutes ago, emosun said:

I'm actually quite amazed someone can consider 60fps poor when Nvidia's and ATI's flagships were barely pushing 30fps when Crysis released.

 

Back then, 1024x768 was good; now it's not even acceptable on all but the cheapest of laptops.
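
To put rough numbers on that resolution point, here's a minimal sketch of raw pixel throughput (the configurations below are illustrative assumptions, not benchmarks):

# Rough pixel-throughput comparison; configs are illustrative
# assumptions, not measured benchmarks.
configs = {
    "Crysis era (1024x768 @ 30fps)": (1024, 768, 30),
    "Modern baseline (1920x1080 @ 60fps)": (1920, 1080, 60),
    "High refresh (2560x1440 @ 144fps)": (2560, 1440, 144),
}

for name, (w, h, fps) in configs.items():
    mpix = w * h * fps / 1e6  # megapixels shaded per second
    print(f"{name}: ~{mpix:.0f} Mpix/s")

Under those assumptions, a 1440p/144Hz target pushes roughly 20x the pixels per second of Crysis-era 1024x768 at 30fps, which is one way to read "not even acceptable" today.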



1 minute ago, Jurrunio said:

Back then, 1024x768 was good; now it's not even acceptable on all but the cheapest of laptops.

Yeah, so I don't consider the fps reduction of RTX an issue. Let me know when the cards can't run the game at 30fps in low res; maybe there's an issue worth discussing then. lol


I don't see how this is Nvidia pushing anything. I'd imagine it was Linus that went to them to do the video, or to work on the collab, not the other way around.

 

Higher refresh rates make for a smoother experience, thus adding to realism. Though I'd say adaptive sync plays a huge role in realism as well. If anything, the video shows that 60 to 144 makes a difference; going from 144 to 240, not as much.
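
One way to see why 144 to 240 matters less than 60 to 144: the absolute frame-time saving shrinks at every step. A minimal sketch, pure arithmetic on the refresh rates themselves:

# Frame time at each refresh rate and the absolute saving per step.
rates = [60, 144, 240]

for prev, cur in zip(rates, rates[1:]):
    t_prev = 1000 / prev  # ms per frame
    t_cur = 1000 / cur
    print(f"{prev}Hz -> {cur}Hz: {t_prev:.1f}ms -> {t_cur:.1f}ms per frame "
          f"(saves {t_prev - t_cur:.1f}ms)")

60 to 144 shaves about 9.7ms per frame; 144 to 240 only about 2.8ms.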

 

Also, why are you blaming Nvidia for what you see as a lack of graphics improvement? They don't make game engines. They assist in developing others, yes, but they don't make them. You're pointing the finger at the wrong group.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


Marketing 101 =  when lots of people believe something, tell them your product does that thing better.

 

Whether there is any benefit or not is moot. 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


44 minutes ago, dizmo said:

Higher refresh rates make for a smoother experience, thus adding to realism.

I don't consider 240Hz to be realistic to how I see.

47 minutes ago, dizmo said:

why are you blaming Nvidia for what you see as a lack of graphics improvement?

At no point did I.

24 minutes ago, mr moose said:

Marketing 101 =  when lots of people believe something, tell them your product does that thing better.

 

Whether there is any benefit or not is moot. 

Exactly.


1 minute ago, emosun said:

At no point did I.

Haha, really? Because the whole tone of your post kind of had that effect. Unless you were referencing someone completely different when you closed by saying "I wish they'd focus on realism over refresh rate." Which would be odd, considering Nvidia was the subject of the refresh rate bit.

 


2 minutes ago, emosun said:

I don't consider 240Hz to be realistic to how I see.

 

Given that the majority of the detail we think we see is created by the visual cortex and not actually seen by the eye, one can argue that any improvement from FPS above 100 is either due to something other than vision (e.g. total system lag*) or simply placebo.

 

 

*The brain is much better at detecting the time interval between a signal being sent to the hand and the expected feedback appearing on the screen than at judging the speed at which several frames occur using vision alone.
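
To make that concrete, here's a hedged sketch of an input-to-photon latency budget; every component figure is an assumed illustration, not a measurement of any real system:

# Illustrative total-system-lag budget. All component values are
# assumptions for the sketch, not measurements.
def total_lag_ms(refresh_hz, input_ms=5.0, game_ms=20.0, display_ms=5.0):
    frame_ms = 1000 / refresh_hz  # one refresh interval
    return input_ms + game_ms + frame_ms + display_ms

for hz in (60, 144, 240):
    print(f"{hz}Hz: ~{total_lag_ms(hz):.1f}ms input-to-photon")

Under those assumptions the refresh-rate slice is a minority of the total lag, which fits the idea that gains above ~100fps are felt through timing feedback rather than seen.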


Given that devs care more about multiplayer so they can monetize the absolute fuck out of it, they have to make their games relatively accessible to bring in the biggest potential pool of players. A large majority of developers don't care about making good games; they pick what is "safe" and what they know will bring in large swaths of cash. Look at Call of Duty 4billion, Assassin's Creed Garden Shed. There absolutely is no more innovation in gaming. As you've said, a super high end GPU is only needed for frame rates, not actual graphical processing power.

 

Indie devs (and dare I say mobile game devs) make more interesting games than the big AAA studios. I can't think of a single game that I'm hyped for that's coming up for release. Sure, there are a few standout games that pop up every so often, but those are few and FAR between.

🌲🌲🌲

 

 

 

◒ ◒ 


On 11/26/2019 at 1:53 AM, dizmo said:

Haha, really? Because the whole tone of your post kind of had that effect. Unless you were referencing someone completely different when you closed by saying "I wish they'd focus on realism over refresh rate." Which would be odd, considering Nvidia was the subject of the refresh rate bit.

 

Look, I'm not going to be able to fix the way you read things; that's going to be a you task.


On 11/26/2019 at 3:41 AM, Arika S said:

Given that devs care more about multiplayer so they can monetize the absolute fuck out of it, they have to make their games relatively accessible to bring in the biggest potential pool of players. A large majority of developers don't care about making good games; they pick what is "safe" and what they know will bring in large swaths of cash. Look at Call of Duty 4billion, Assassin's Creed Garden Shed. There absolutely is no more innovation in gaming. As you've said, a super high end GPU is only needed for frame rates, not actual graphical processing power.

 

Indie devs (and dare I say mobile game devs) make more interesting games than the big AAA studios. I can't think of a single game that I'm hyped for that's coming up for release. Sure, there are a few standout games that pop up every so often, but those are few and FAR between.

Yeah, the only game I actually got excited about when I saw it was a tech demo someone made of Star Wars podracer, but modernized. Other than that it's been such a long time...


Graphics are starting to hit two complex issues when it comes to graphical fidelity. The first is the uncanny valley: the more realistic you make characters look, the more you notice their non-human responses and expressions. The other is that increasing graphics requires exponentially more powerful hardware. I cannot remember the term for it, but the more polygons you use to create more realistic scenes, the harder they are to run. For every generational jump we need new hardware to run and render it, and we are hitting a very big wall where consumer products might have issues doing this in real time at video game settings.

We could look at monetization, mobile games, or trying to sell high-FPS competitive gaming and all that, but I wanted to hear views on why we no longer get the graphical difference we saw from the PS1 days to the PS3 and are instead going towards higher refresh rates. We mostly get the same visuals, with slight increases in texture quality.
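
As a toy illustration of that exponential-cost point (this is an assumed model, not the equation the poster is thinking of): if halving a mesh's geometric error roughly quadruples its triangle count, the cost explodes while the visible gain shrinks.

# Toy model (assumption): halving geometric error ~quadruples triangles.
triangles = 10_000  # starting triangle budget
error = 1.0         # relative geometric error, arbitrary units

for _ in range(5):
    print(f"error {error:.4f} -> {triangles:,} triangles")
    error /= 2       # each step looks only somewhat better...
    triangles *= 4   # ...but costs four times the geometry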


1 hour ago, emosun said:

look I'm not going to be able to fix the way you read things , that's going to be a you task.

I read it the way it's written, but nice try at ducking the fact that it's wrong.


1 hour ago, GodSeph said:

-Snip-

I think this is why developers have stopped pursuing higher-quality details like larger polygon counts and have focused more on the lighting itself. The higher polygon counts are still there, but to me the most striking change between the graphics design of roughly 2005-2011 and 2011 onward is the switch to physically based shading.

 

Plus, as you say, there's an exponential effort needed to render those polygons and details, which translates to finer details. But those finer details are harder to make out unless we have larger resolutions and/or are staring right in front of the object in question.
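
A quick sketch of that resolution point, using standard pinhole-projection math (the scene numbers are made up for illustration): fine geometric detail stops being visible once it falls below a pixel or two on screen.

import math

# Approximate on-screen size of a small detail; small-angle
# pinhole-projection estimate, scene numbers are illustrative.
def pixels_covered(detail_m, distance_m, vfov_deg=60.0, screen_h_px=1080):
    angle = 2 * math.atan(detail_m / (2 * distance_m))  # radians subtended
    px_per_rad = screen_h_px / math.radians(vfov_deg)
    return angle * px_per_rad

for dist in (0.5, 2.0, 5.0):
    px = pixels_covered(0.01, dist)  # a 1cm detail
    print(f"1cm detail at {dist}m: ~{px:.1f}px at 1080p")

At a couple of meters a 1cm detail spans only a few 1080p pixels, so polygons below that scale need a higher resolution, or a closer camera, to matter.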


16 minutes ago, dizmo said:

I read it the way it's written, but nice try at ducking the fact that it's wrong.

 

On 11/26/2019 at 1:50 AM, emosun said:

At no point did I.

I don't even know who you're talking to at this point.


4 minutes ago, Mira Yurizaki said:

I think this is why developers have stopped pursuing higher-quality details like larger polygon counts and have focused more on the lighting itself. The higher polygon counts are still there, but to me the most striking change between the graphics design of roughly 2005-2011 and 2011 onward is the switch to physically based shading.

 

Plus, as you say, there's an exponential effort needed to render those polygons and details, which translates to finer details. But those finer details are harder to make out unless we have larger resolutions and/or are staring right in front of the object in question.

My sentiments exactly. If we ignore any arguments about mobile games, lazy devs, etc., it would make sense that developers have hit a sort of wall with how "good" graphics can get. There is an equation, which I wish I could find again, that explains the difficulty of rendering and how exponentially more processing power would be needed to get the same generational jump from PS4 to PS**** (whatever they call it) that we had from PS2 to PS4.

