
DriveClub Is 30FPS For The Same Reason Other Games Are, Director Says [Old Thread]

They should have put a performance slider like this

 

720p @ 60fps

 

900p @ 40fps

 

1080p @ 30fps

 

1440p @ 15fps

 

2160p @ 5fps

Let me correct that slider for you :P 


I don't think it's just a matter of higher numbers being better. It's probably that people get used to it, and it's hard going back.

 

I played on a PS3 for the last gen and never really complained about fps (besides Blighttown in Dark Souls... Jesus). It seems I was used to 30fps. Just last week I installed some mods on Skyrim and started playing. It felt so laggy I thought, "Wtf, I must be at 1-5, maybe 20fps"... Nope. A good solid 30-35fps, and I just couldn't bear it. I'm too used to 60fps now; there's no turning back. And I think that's the problem for PC people who experience it regularly: they're used to it, so anything at 30fps feels far worse to them than to someone who always plays at that rate.

 

I've been PC gaming for a while, so I get that. 

 

But it's never really killed the enjoyment of a game for me. A shit game is a shit game, regardless of what xxxxp@xxfps it runs at.

 

Then again, I'm some sort of godless heathen who has most of the consoles, a PC, and a Mac, and somehow lives in peace using everything hand in hand. Too many times on these forums it just seems like one-sided crusades, with people either too young, too stubborn, or both, thinking one side absolutely rules supreme over the other, not realizing that once you leave the internet and get into the real world (ignoring fanboys from all sides), people are going to pick whatever makes them happy. I don't judge people on what makes them happy. PC, console, both; whatever.


BF4 runs decently (almost zero lag, but occasional jaggies) at 60fps, albeit at 720p, on Xbox One.

 

Check this out: a video by Linus that talks about what jaggies are.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Okay, let's face it: 90% of games on consoles will be 30 FPS. I'm kinda sick of seeing people bitch about it.


So now it's confirmed, the best platform for racing games is currently the Wii U with Mario Kart 8 running at 1920x1080 at 60 FPS. The gameplay is also far superior to other racing games.

 

Actually, Mario Kart 8 runs at a native resolution of 1280x720, and at a consistent 59fps, which according to Digital Foundry causes a 'subtle but continuous stuttering'.

 

On topic, I don't really care about the framerate. I find 30fps to be absolutely acceptable so long as it doesn't drop below it too often. I consider 60fps to be the cherry on top for the games I play rather than the be-all-end-all requirement others consider it to be. :)

"Be excellent to each other" - Bill and Ted
Community Standards | Guides & Tutorials | Members of Staff


I've been PC gaming for a while, so I get that. 

 

But it's never really killed the enjoyment of a game for me. A shit game is a shit game, regardless of what xxxxp@xxfps it runs at.

 

Then again, I'm some sort of godless heathen who has most of the consoles, a PC, and a Mac, and somehow lives in peace using everything hand in hand. Too many times on these forums it just seems like one-sided crusades, with people either too young, too stubborn, or both, thinking one side absolutely rules supreme over the other, not realizing that once you leave the internet and get into the real world (ignoring fanboys from all sides), people are going to pick whatever makes them happy. I don't judge people on what makes them happy. PC, console, both; whatever.

You're absolutely right about the quality of a game. Resolution and fps won't bring fun gameplay or replayability. Fun is fun, regardless of what it runs at. 

 

Personally, I like to tune down my settings to get a more fluid experience, and the PC offers me that. I just wish consoles would sometimes do the same, or at least offer the option to.

 

Thinking about this got me wondering why we can't just ask for a target fps and have the PC adjust settings on the fly. Like, if frames drop a little, AA switches from 16x to 2x automatically. I know GeForce Experience does part of this, but you still have to tinker with it... and it's not on the fly. You need to quit and restart every time :/
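The idea of naming a target fps and letting the machine adjust settings on the fly can be sketched roughly like this. Everything here is hypothetical: `MSAA_LEVELS`, the thresholds, and the controller itself are stand-ins for whatever hooks a real engine or driver would expose, not any actual API.

```python
# Hypothetical sketch of on-the-fly quality scaling toward a target
# framerate. The AA ladder and the 5%/15% bands are assumptions.
TARGET_FPS = 60
MSAA_LEVELS = [16, 8, 4, 2, 0]  # from most to least expensive AA

def next_msaa_index(current_fps: float, index: int) -> int:
    """Step to a cheaper AA level when below target, restore quality
    when there is headroom. The asymmetric bands add hysteresis so
    the setting doesn't flap back and forth every frame."""
    if current_fps < TARGET_FPS * 0.95 and index < len(MSAA_LEVELS) - 1:
        return index + 1   # drop quality, e.g. 16x -> 8x
    if current_fps > TARGET_FPS * 1.15 and index > 0:
        return index - 1   # regain quality
    return index
```

At 50fps with 16x MSAA (index 0) this steps down to 8x; at a steady 60fps it leaves the setting alone, which is the "on the fly" behaviour the post is wishing for.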


You're absolutely right about the quality of a game. Resolution and fps won't bring fun gameplay or replayability. Fun is fun, regardless of what it runs at. 

 

Personally, I like to tune down my settings to get a more fluid experience, and the PC offers me that. I just wish consoles would sometimes do the same, or at least offer the option to.

 

Thinking about this got me wondering why we can't just ask for a target fps and have the PC adjust settings on the fly. Like, if frames drop a little, AA switches from 16x to 2x automatically. I know GeForce Experience does part of this, but you still have to tinker with it...

 

Same here. With some games I have little problem dialing down settings to get something more fluid. Other times I'm okay with giving up FPS for an absurdly good-looking game, and with my hardware I can sometimes get both at once. The choice is nice to have, and it lets me play however I feel like.


Gotta love AMD now, right? Look what they did with their sorry-ass hardware; ruined gaming as we know it.

Anyway, consoles are from another gaming world; it's sad they put PC and consoles in the same boat (I know this is a PS4 exclusive).

I really don't care, since only the stupid buy consoles nowadays; my problem is when the PC versions are locked to 30fps, graphics gimped, and the controls feel sluggish and unrefined, aka console ports.

I'm sorry, but that was the most idiotic reply I have read in my life...


I'm tired of playing at less than 60.

 

Don't play on a console. Problem solved.


Don't play on a console. Problem solved.

Money is tight. But this will change soon. My signature is giving hints.

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


This is odd... I remember that on my old Sandy Bridge i3 I could play Crysis at 720p and get 30 fps max; going down to 600p it ran much better, on occasion even 45 fps, and the input lag was a LOT better.

And now, 3 years later, they make next-gen consoles that can't run a racing game at 60 fps. I call that bullshit!!

 

Here is why: Crysis 3 ran on the PS3 and Xbox 360 at 720p (upscaled); the real resolution was 480p, AND IT RAN AT FKING 30 FPS, 30!! So the same games at those settings should run faster on the newer hardware. Crysis 3's performance requirements are far above those of racing games on consoles.

Both of the "new" consoles have GPU power somewhere between an HD 7700 and a 7850 (the 7850 is equivalent to a 6950, which was a high-end card). All of those cards could play Crysis 3 at 20-25 FPS average on maxed-out settings with no AA; on medium rendering quality and settings with no AA, the FPS went over 50 FPS AT 1080P, not 720 or 900, but 1080p.

 

My friend still has an HD 5850 (an even older high-end card); with a 1080p monitor and medium settings, it runs modern games like a charm at over 30 FPS.

 

This whole 30 FPS thing is just a fast cash-in for the developers and publishers. The new consoles are nothing but PCs with "console" written on them, shipped with a BS OS.

Today's consoles have reasonable CPU cores and capable GPUs, not to mention a custom API that is more efficient than DX11. A 30 fps limit even on modern consoles is a lie; they can do 60, but it would take more time. Not to mention that 1080p games on a console are upscaled, so there's even more overhead, and those GPUs should be able to handle it. The 60 FPS and 1080p will come later; they'll claim they found a way to do it, but in reality they'll just ship the game at 900p upscaled, equivalent to mid PC settings. 30 FPS is just a fast cash-in, and we all suffer.

 

The problem is not people buying consoles. The real problem is that the people who buy them as primary gaming machines have no clue about the hardware they're buying, and they don't even care. Until they start caring, things will not really change...

You forgot about the shitty AMD CPU.

It has 8 cores, but a frequency of 1.6 GHz? Really?

The FX 8350 has 8 cores, and at 4.72 GHz it's slower than a stock i5 4670K. Now take that and downclock it to 1.6 GHz; it will be much slower than any Core i3, and that shitty CPU bottlenecks these GPUs (the Xbox has roughly an HD 7790, the PS4 roughly an HD 7850-7870). The whole bottleneck is the CPU! That shitty CPU costs $110 with the GPU built in. If they had paid $250 for an Intel CPU and an NVIDIA GPU, the consoles would be much faster and could be real next-gens! At $550, everybody would buy it.
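The clock-speed comparison above can be put into back-of-the-envelope arithmetic. This is purely illustrative: the per-clock (IPC) factors below are invented numbers, not measurements, and real performance depends on far more than cores multiplied by clock.

```python
# Back-of-the-envelope only: clock speed alone doesn't determine
# performance. The IPC (instructions-per-clock) figures here are made
# up for illustration; real chips differ per workload.
def naive_throughput(cores: int, ghz: float, ipc: float) -> float:
    """Crude aggregate estimate: cores * clock * IPC."""
    return cores * ghz * ipc

# Many slow cores vs few fast cores (hypothetical IPC values):
console_style = naive_throughput(8, 1.6, 1.0)   # 12.8 "units" in total
desktop_i3    = naive_throughput(2, 3.4, 2.0)   # 13.6 "units" in total

# The single-threaded gap is what a game's main thread actually feels:
per_core_console = naive_throughput(1, 1.6, 1.0)  # 1.6
per_core_desktop = naive_throughput(1, 3.4, 2.0)  # 6.8
```

Even with generous assumptions, the single-core numbers show why "8 cores at 1.6 GHz" can feel slower than a dual-core desktop chip unless the workload spreads across all cores.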

Computer users fall into two groups:
those that do backups
those that have never had a hard drive fail.


You forgot about the shitty AMD CPU.

It has 8 cores, but a frequency of 1.6 GHz? Really?

The FX 8350 has 8 cores, and at 4.72 GHz it's slower than a stock i5 4670K. Now take that and downclock it to 1.6 GHz; it will be much slower than any Core i3, and that shitty CPU bottlenecks these GPUs (the Xbox has roughly an HD 7790, the PS4 roughly an HD 7850-7870). The whole bottleneck is the CPU! That shitty CPU costs $110 with the GPU built in. If they had paid $250 for an Intel CPU and an NVIDIA GPU, the consoles would be much faster and could be real next-gens! At $550, everybody would buy it.

1st: 1.6 GHz is the base clock of the chips inside the PS4 and Xbone; they have a boost clock, with a max reported of 2.7 GHz.

 

2nd: The CPU itself is not the same one as the 8350; it's two APUs put together.

 

3rd: The CPU is not the problem in the consoles, since the custom API is made to help with that. It is similar to the Mantle API, so the CPU does not need to be strong, since it can leverage multiple cores. They even tested this with the 8350 downclocked to 2 GHz, and the FPS in Battlefield 4 did not suffer much compared to DX11, and DX11 is not a very efficient API.

 

4th: The Intel vs AMD argument is not really needed here since, like I said, this is not a DX11 API, and Mantle makes AMD's multiple CPU cores work like a charm, and so does the API in the consoles. One more thing: the 8350 can hold its own vs an i5, and the "lack of performance" is only visible in lightly-threaded applications. In video editing an i5 is not that great, so as an all-around CPU the 8350 is the better choice, not to mention cheaper. Plus, you are comparing a new Intel chip vs a 2-year-old one from AMD, and the gap is not nearly as big as people believe...

 

5th: Nvidia and Intel would be a bad choice at this price point, since lower-end cards from Nvidia can't really match AMD in performance and price. AMD was the obvious choice in this scenario. As for the CPUs: seriously, dude? You would put a quad-core Intel Haswell or Ivy Bridge in that little box, where it would cook? I have a Haswell and I know what I'm talking about. Not to mention that APUs have graphics on board that work together with the GPU; you can't do that with Intel and Nvidia. So once again, AMD is the better choice...

 

6th: Do the math; you will see I am correct...

System

CPU: i7 4770k | Motherboard: Asus Maximus VI Hero | RAM: HyperX KHX318C9SRK4/32 - 32GB DDR3-1866 CL9 | GPU: Gainward Geforce GTX 670 Phantom | Case: Cooler Master HAF XB | Storage: 1 TB WD Blue | PSU: Cooler Master V-650s | Display(s): Dell U2312HM, LG194WT, LG E1941

Cooling: Noctua NH-D15 | Keyboard: Logitech G710+ | Mouse: Logitech G502 Proteus Spectrum | Sound: Focusrite 2i4 - USB DAC | OS: Windows 7 (still holding on XD)

 
 

1st: 1.6 GHz is the base clock of the chips inside the PS4 and Xbone; they have a boost clock, with a max reported of 2.7 GHz.

2nd: The CPU itself is not the same one as the 8350; it's two APUs put together.

3rd: The CPU is not the problem in the consoles, since the custom API is made to help with that. It is similar to the Mantle API, so the CPU does not need to be strong, since it can leverage multiple cores. They even tested this with the 8350 downclocked to 2 GHz, and the FPS in Battlefield 4 did not suffer much compared to DX11, and DX11 is not a very efficient API.

4th: The Intel vs AMD argument is not really needed here since, like I said, this is not a DX11 API, and Mantle makes AMD's multiple CPU cores work like a charm, and so does the API in the consoles. One more thing: the 8350 can hold its own vs an i5, and the "lack of performance" is only visible in lightly-threaded applications. In video editing an i5 is not that great, so as an all-around CPU the 8350 is the better choice, not to mention cheaper. Plus, you are comparing a new Intel chip vs a 2-year-old one from AMD, and the gap is not nearly as big as people believe...

5th: Nvidia and Intel would be a bad choice at this price point, since lower-end cards from Nvidia can't really match AMD in performance and price. AMD was the obvious choice in this scenario. As for the CPUs: seriously, dude? You would put a quad-core Intel Haswell or Ivy Bridge in that little box, where it would cook? I have a Haswell and I know what I'm talking about. Not to mention that APUs have graphics on board that work together with the GPU; you can't do that with Intel and Nvidia. So once again, AMD is the better choice...

6th: Do the math; you will see I am correct...

You are wrong: CONSOLE CPUs do NOT have boost!!! http://www.vg247.com/2013/09/04/xbox-one-cpu-boosted-console-now-in-full-production/ The Xbone runs at 1.75 GHz and the PS4 at up to 2 GHz; that's it, they just raised the clocks. On the Xbone they also raised the GPU clock, which was 800 MHz and is now 853 MHz.

If the CPU is not the bottleneck, then explain this: if the Xbone has a 7790-equivalent GPU and the PS4 a 7870 equivalent, why the fuck are next-gen games locked at 30 FPS, and some of them at 720p, 900p, and so on?

An HD 7870 by itself can play 8-9 out of 10 games at console-level graphics at 60 FPS, 1080p.

You said there is a special API and all that optimization for console games, better than on PC, and games are still locked at 30 FPS at shitty resolutions.

Developers only have access to 6 of the 8 cores on the PS4, and I am sure a 6-core 2 GHz CPU like that is much slower than a Core i3 :D

I told you, it's all because of a CPU bottleneck!!!



Well, looks like I'm not buying this game. Racing games are not meant to be 30 fps. THEY ARE FAST!

/dumbass devs

Fix your shit. If you can't do 60fps, then lower the detail! Optimize it better.

Besides the many other reasons I haven't gotten Forza 5, 30FPS is probably in the top 5. Unacceptable. The whole game is, really. Forza 4 ran at 60fps, and it had well-balanced, enjoyable graphics on a last-gen console, even if it was at 720p. Why the downgrade??

Call me crazy, but I'd rather have 720p and 60fps than 1080p and 30fps in most games. Not all, but most.

If you're at 30 fps because you pushed the res higher than a lot of other games, you fucked up.

muh specs 

Gaming and HTPC (reparations)- ASUS 1080, MSI X99A SLI Plus, 5820k- 4.5GHz @ 1.25v, asetek based 360mm AIO, RM 1000x, 16GB memory, 750D with front USB 2.0 replaced with 3.0  ports, 2 250GB 850 EVOs in Raid 0 (why not, only has games on it), some hard drives

Screens- Acer preditor XB241H (1080p, 144Hz Gsync), LG 1080p ultrawide, (all mounted) directly wired to TV in other room

Stuff- k70 with reds, steel series rival, g13, full desk covering mouse mat

All parts black

Workstation(desk)- 3770k, 970 reference, 16GB of some crucial memory, a motherboard of some kind I don't remember, Micomsoft SC-512N1-L/DVI, CM Storm Trooper (It's got a handle, can you handle that?), 240mm Asetek based AIO, Crucial M550 256GB (upgrade soon), some hard drives, disc drives, and hot swap bays

Screens- 3  ASUS VN248H-P IPS 1080p screens mounted on a stand, some old tv on the wall above it. 

Stuff- Epicgear defiant (solderless swappable switches), g600, moutned mic and other stuff. 

Laptop docking area- 2 1440p korean monitors mounted, one AHVA matte, one samsung PLS gloss (very annoying, yes). Trashy Razer blackwidow chroma...I mean like the J key doesn't click anymore. I got a model M i use on it to, but its time for a new keyboard. Some edgy Utechsmart mouse similar to g600. Hooked to laptop dock for both of my dell precision laptops. (not only docking area)

Shelf- i7-2600 non-k (has vt-d), 380t, some ASUS sandy itx board, intel quad nic. Currently hosts shared files, setting up as pfsense box in VM. Also acts as spare gaming PC with a 580 or whatever someone brings. Hooked into laptop dock area via usb switch


1st: 1.6 GHz is the base clock of the chips inside the PS4 and Xbone; they have a boost clock, with a max reported of 2.7 GHz.

I would like a source on that. We know that it can downclock itself, but I have not heard anything about turbo.

 

2nd: The CPU itself is not the same one as the 8350; it's two APUs put together.

Yes, and that's bad. Another thing that's bad is that they're crappy Jaguar cores; it's far worse than the 8350 in more ways than just frequency. Each MHz is also weaker.

 

3rd: The CPU is not the problem in the consoles, since the custom API is made to help with that. It is similar to the Mantle API, so the CPU does not need to be strong, since it can leverage multiple cores. They even tested this with the 8350 downclocked to 2 GHz, and the FPS in Battlefield 4 did not suffer much compared to DX11, and DX11 is not a very efficient API.

Depends on the game and settings used. Like I said before, an 8350 at 2GHz will actually be far more powerful than the 1.75GHz CPU in the Xbone. I wouldn't be surprised if the FX at 2GHz is twice as powerful.

It's not just about FPS, either. If they had used more powerful CPUs, Microsoft might have been able to maintain backwards compatibility through emulation. You can't emulate 3 powerful cores on 8 crappy cores, though.

 

4th: The Intel vs AMD argument is not really needed here since, like I said, this is not a DX11 API, and Mantle makes AMD's multiple CPU cores work like a charm, and so does the API in the consoles. One more thing: the 8350 can hold its own vs an i5, and the "lack of performance" is only visible in lightly-threaded applications. In video editing an i5 is not that great, so as an all-around CPU the 8350 is the better choice, not to mention cheaper. Plus, you are comparing a new Intel chip vs a 2-year-old one from AMD, and the gap is not nearly as big as people believe...

[Citation needed] on "Mantle makes AMD multicore CPUs work like a charm".

I would also argue that the i5 is a far better "all-around/all-purpose CPU", since the vast majority of applications can't take advantage of more than 2 cores, let alone more than 4. There is a difference between having multiple threads and actually scaling properly to multiple cores. Most programs have a ton of threads, but because of synchronization issues you can't run all the threads in parallel. It is very, very hard to make proper use of additional cores, which is why so few programs actually do.
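The scaling point above has a classic formalization in Amdahl's law: if only a fraction p of the work can run in parallel, n cores give a speedup of 1/((1-p) + p/n), with a hard ceiling of 1/(1-p). A quick sketch (the 70%-parallel figure is just an example, not a measurement of any real game):

```python
# Amdahl's law: speedup from n cores when only fraction p of the work
# parallelizes. No matter how many cores you add, the ceiling is 1/(1-p).
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# With a hypothetical 70%-parallel workload, 8 slow cores top out
# around 2.6x a single core, and even infinite cores could never
# pass about 3.3x. That's why "8 cores" alone isn't a win.
```

This is why throwing more weak cores at a game engine with a big serial portion buys so little compared to a few faster cores.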

 

You can't hide behind "but the 8350 is 2 years old!", by the way, since that is what AMD currently has to offer. You can't just say "the gap is not nearly as big as people believe, because the AMD CPU is old and therefore you can't count the newer Intel stuff!" You compare what is currently available.

 

5th: Nvidia and Intel would be a bad choice at this price point, since lower-end cards from Nvidia can't really match AMD in performance and price. AMD was the obvious choice in this scenario. As for the CPUs: seriously, dude? You would put a quad-core Intel Haswell or Ivy Bridge in that little box, where it would cook? I have a Haswell and I know what I'm talking about. Not to mention that APUs have graphics on board that work together with the GPU; you can't do that with Intel and Nvidia. So once again, AMD is the better choice...

The Xbone is slightly less powerful than a 7770, correct? That card costs about 100 dollars.

At that price you can get a 740/750. The 740 will perform about the same or slightly worse, and the 750 will perform quite a lot better (for like 10 dollars more).

I don't think the whole "AMD is better for budget" argument holds up that well anymore. It was true back in the 400 series days (until the 460 was released) and it was true back when you could unlock cores in the Athlon CPUs, but today each manufacturer has pretty similar price:performance at pretty much all price brackets.

 

You're right that it would cook inside the consoles, though (unless they added better cooling). The CPU part in the consoles only uses about 30 watts at load, which is far less than a full-blown desktop i5. It would be possible to bring an i5 down to that level, and you would probably get better performance as well, but it would be very cost-inefficient.

The alternative from Intel would be some Atom-based chip, but then you would still lack having the CPU and GPU on the same die. You could do like the previous generation and not put them on the same die, though.

 

6th: Do the math; you will see I am correct...



Gotta love AMD now, right? Look what they did with their sorry-ass hardware; ruined gaming as we know it.

Anyway, consoles are from another gaming world; it's sad they put PC and consoles in the same boat (I know this is a PS4 exclusive).

I really don't care, since only the stupid buy consoles nowadays; my problem is when the PC versions are locked to 30fps, graphics gimped, and the controls feel sluggish and unrefined, aka console ports.

lmao, it's funny that you think this is AMD's fault. Please go back to your cave, Nvidia troll.

Finally my Santa hat doesn't look out of place


Let's make all games run at 29.97 fps and call it a day.

lololol 

YESSSSSS

That time when analog frame-dropping became a digital standard.

I'm so glad YouTube will soon have 60fps options. People, particularly non-PC gamers, will finally see the difference between a 60 FPS and a 30 fps experience of the same game, and might demand better.



lololol 

YESSSSSS

That time when analog frame-dropping became a digital standard.

I'm so glad YouTube will soon have 60fps options. People, particularly non-PC gamers, will finally see the difference between a 60 FPS and a 30 fps experience of the same game, and might demand better.

It's not a bad idea. It's slightly less demanding and for some reason looks smoother than 30 fps. :P

.


It's not a bad idea. It's slightly less demanding and for some reason looks smoother than 30 fps. :P


It's an age-old NTSC standard. The reason content at that rate looks better is that it's still usually produced that way.

Another example: 24 fps is usually actually 23.976.

I don't know exactly why; something about broadcast, blah blah, cameras, something something.

Someone in the know would have to explain it.
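For the curious: the odd rates (29.97, 23.976) come from the NTSC color standard, where the nominal rate is multiplied by 1000/1001 so the color subcarrier fits the old black-and-white signal timing. A tiny sketch of the arithmetic:

```python
from fractions import Fraction

# NTSC-derived rates: the nominal frame rate times 1000/1001,
# a legacy of squeezing color into the black-and-white TV signal.
def ntsc_rate(nominal: int) -> Fraction:
    return Fraction(nominal * 1000, 1001)

print(float(ntsc_rate(30)))  # 29.97002997...
print(float(ntsc_rate(24)))  # 23.976023976...
```

So "29.97" and "23.976" aren't arbitrary; they're exact ratios, 30000/1001 and 24000/1001.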



Gotta love AMD now, right? Look what they did with their sorry-ass hardware; ruined gaming as we know it.

Anyway, consoles are from another gaming world; it's sad they put PC and consoles in the same boat (I know this is a PS4 exclusive).

I really don't care, since only the stupid buy consoles nowadays; my problem is when the PC versions are locked to 30fps, graphics gimped, and the controls feel sluggish and unrefined, aka console ports.

Thanks for letting me know I needed to block another fucking retard.

 

Yeah, the "cinematic" BS is really scraping the bottom of the barrel xD. Someone who says that isn't a developer I'd give my money to. Spreading misinformation for money is a moronic thing to do.

Games should have a cinematic experience to better increase the integration aspect of gaming.

 

Now, are you going to make yourself a liar, or are you going to pay me?


It's an age-old NTSC standard. The reason content at that rate looks better is that it's still usually produced that way.

Another example: 24 fps is usually actually 23.976.

I don't know exactly why; something about broadcast, blah blah, cameras, something something.

Someone in the know would have to explain it.

I wasn't being serious. Given their excuses they should just use 29.97 because why the hell not.

.


I wasn't being serious. Given their excuses they should just use 29.97 because why the hell not.

OH OH OH OH 

Laugh redacted. I see your point. 



This topic is now closed to further replies.

