Will This Rig Need To Be Upgraded to Achieve Cyberpunk 1080p 60fps Ultra

Ashino

Hi, I'm a huge fan of CDPR and their games, so like everyone else I'm extremely hyped for Cyberpunk, but I'm worried that my PC just won't make the cut and won't deliver a consistent 60 fps at 1080p ultra settings (obviously without ray tracing). My CPU isn't bad, but it isn't the best either, and my GPU, even though it was a great card four years ago, is beginning to struggle even at 1080p with demanding games like Red Dead Redemption 2.

Upgrading won't be easy: first I have to somehow find a 3080 that's actually in stock somewhere, which is difficult, plus I wanted to wait and see whether Nvidia decides to release a 3080 Ti, and I would really hate to buy a 3080 now only for Nvidia to release the 3080 Ti a few months later.

So what do you guys think I should do? Do I hunt down and snag a 3080 whenever I see one available? Should I wait for the 3080 Ti? Or do you think this rig can achieve 1080p 60 fps at ultra when the game comes out?

Also, I heard that the gameplay demo CDPR ran in 2018 was actually running on an i7 8700K and a GTX 1080 Ti, but the fact that they capped the framerate to 30 fps is a red flag. It suggests that this kind of rig can't maintain a consistent 60 fps, so they capped it to 30.

Anyway, what do you guys think? Here are my specs:

i7 8700K

EVGA GTX 1080 Ti SC2

16GB DDR4 Corsair Vengeance RAM

Aorus Gaming 7 Z370 motherboard

Corsair TX850W PSU

I'll be honest, you probably won't be able to see a difference between High and Ultra anyway, so try the game on your current system first and decide for yourself afterwards whether or not to upgrade. Besides, a gameplay demo from two years ago can't tell us anything about how the game performs today.

 

And by that point, hopefully you won't have to worry about GPU availability either.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad

2 minutes ago, Ashino said:

Anyway, what do you guys think? Here are my specs:

i7 8700K

EVGA GTX 1080 Ti SC2

The game is not out yet, but if we can assume the 'Recommended specs' are for 1080p 60 fps Ultra (which I would expect), this build should run the game at 1080p 60 fps easily.

https://support.cdprojektred.com/en/cyberpunk/pc/sp-technical/issue/1556/cyberpunk-2077-system-requirements

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/

No one knows how "hardcore" the Ultra settings will be.

Games are NEVER optimized for Ultra, and the recommended settings aren't usually for the Ultra preset either.

 

But I dare say: at least at High settings, you will easily get a locked 60 fps at 1080p. Ultra, maybe; maybe not. But the visual difference between High and Ultra is usually tiny (1-2% at most), especially at only 1080p.

 

Your GPU is still perfectly fine and would probably even hit 60 fps at 1440p (again: High, not always Ultra).

And your CPU is also perfectly fine. I'm still rocking an i7 6700K with only 4 cores, and no problems yet.

 

 

My advice: get the game, use Ultra, and see how many fps you get. More than 60? Great. Less than 60? Switch individual settings down to High and see how the fps changes. Maybe you can find your compromise.

 

In The Witcher 3, most settings barely impacted fps when comparing High to Ultra.

However, "Object Distance" alone reduced fps by 20% just from going from High to Ultra. So i left that on High, since i didn't saw an imediate difference.

The recommended specs are confirmed to be for 1080p High, with no specified framerate; that's listed as an i7 4770 and a GTX 1060. An 8700K plus a 1080 Ti should be more than enough for high-refresh-rate gaming at 1080p-1440p for at least another few years, assuming you don't care about RTX. It's a beast of a card, and 1080p is limited in how hard it can push GPUs these days.


my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling

OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions

Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

Basically everything was bought used off of Reddit or here; the only new component was the case. Absolutely nutty deals on some of these parts, I'll have to tally it all up once it's "done" :D 

On 9/25/2020 at 10:52 AM, Darkseth said:

No one knows how "hardcore" the Ultra settings will be.

Games are NEVER optimized for Ultra, and the recommended settings aren't usually for the Ultra preset either.

But I dare say: at least at High settings, you will easily get a locked 60 fps at 1080p. Ultra, maybe; maybe not. But the visual difference between High and Ultra is usually tiny (1-2% at most), especially at only 1080p.

Your GPU is still perfectly fine and would probably even hit 60 fps at 1440p (again: High, not always Ultra).

And your CPU is also perfectly fine. I'm still rocking an i7 6700K with only 4 cores, and no problems yet.

My advice: get the game, use Ultra, and see how many fps you get. More than 60? Great. Less than 60? Switch individual settings down to High and see how the fps changes. Maybe you can find your compromise.

In The Witcher 3, most settings barely impacted fps when comparing High to Ultra.

However, "Object Distance" alone reduced fps by 20% just going from High to Ultra, so I left that on High, since I didn't see an immediate difference.

Sounds like a good plan to me. I'm also hoping that Nvidia is planning to release the 3080 Ti soonish, though, because I'm going to upgrade either way; it's just a question of when. Do you guys think 3080 Tis will get released at all, or are the 3080s/3090s all we're getting?

I think the 3090 is what the 2080 Ti + Titan were last generation, just combined into one product. Maybe a 3090 Ti? Depends on whether that's already the biggest chip or not.

But honestly, a 3080 would be the max I would ever buy. The power consumption is really trashy this time: 320 W TDP for a 3080? That's like +50% compared to the 2080. My 1080 runs at around 140 W with an undervolt, at roughly a 5% performance loss (instead of 220 W).
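
For reference, the rough math behind those numbers (assuming the 2080's reference TDP of roughly 215 W; the 1080 figures are the ones quoted above):

# Back-of-the-envelope power math for the figures above (Python).
tdp_2080 = 215   # W, approximate RTX 2080 reference TDP
tdp_3080 = 320   # W, RTX 3080 TDP
print(f"3080 vs 2080 TDP: +{(tdp_3080 / tdp_2080 - 1) * 100:.0f}%")    # ~ +49%

# Undervolted GTX 1080: ~140 W instead of ~220 W, at ~5% performance loss
stock_w, uv_w, perf_kept = 220, 140, 0.95
efficiency_gain = perf_kept / (uv_w / stock_w)
print(f"Performance per watt after undervolt: ~{efficiency_gain:.2f}x stock")  # ~1.49x

In other words, the undervolt trades about 5% performance for roughly 1.5x the performance per watt.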

On 9/25/2020 at 11:39 AM, minibois said:

The game is not out yet, but if we can assume the 'Recommended specs' are for 1080p 60 fps Ultra (which I would expect), this build should run the game at 1080p 60 fps easily.

https://support.cdprojektred.com/en/cyberpunk/pc/sp-technical/issue/1556/cyberpunk-2077-system-requirements

I absolutely DOUBT that the recommended specs are for 1080p 60 fps Ultra. For sure it's for 1080p, but I would guess with lower settings (not Low per se, but maybe a Medium/High mix).

But this PC seems very capable of doing 1080p at High settings (Ultra is stupid and pointless), possibly well above 60 fps.

Ultra is stupid. ALWAYS.

9 hours ago, Darkseth said:

I think the 3090 is what the 2080 Ti + Titan were last generation, just combined into one product. Maybe a 3090 Ti? Depends on whether that's already the biggest chip or not.

But honestly, a 3080 would be the max I would ever buy. The power consumption is really trashy this time: 320 W TDP for a 3080? That's like +50% compared to the 2080. My 1080 runs at around 140 W with an undervolt, at roughly a 5% performance loss (instead of 220 W).

Some 3080s right now also have a pretty bad issue because some AIBs chose cheaper capacitors for filtering high frequencies on the voltage rail to save money, which led to crashes at boost clocks the power delivery can't handle. Turns out the stock shortage was a blessing in disguise: even if 3080s were readily available for purchase right now, I don't feel it's a good idea to buy one just to essentially have fun beta testing their hardware. The 3090 is also ridiculously more expensive than the 2080 Ti was at MSRP, which really sucks, but yeah, I think you're right that the 3090 is just a different name for the 3080 Ti and we won't actually get a 3080 Ti. Can't imagine how much a 3090 Ti would cost.

If you want "ultra" with no dips, you're gonna need a 3090 for most games that aren't indies, I'd wager.

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 
