
Will it be worth upgrading from a 2080 Ti to a 3080?

Mark Kaine
5 minutes ago, For Science! said:

60 fps on average

I know this is just an example, but in my case that's a big nope: 60 fps minimum and maximum. I can't stand fluctuating framerates, my monitor is 60 Hz, and the games I'm playing are often designed around that framerate anyway (RE2, MHW, etc.).

5 minutes ago, For Science! said:

you will die if you cannot play games.

That's a bit hyperbolic!

6 minutes ago, For Science! said:

since you play on a 1440p

No, 1080p/60 as mentioned above... no reason to go higher when my card can barely keep up at 60 fps on "high-ish" settings...

 

6 minutes ago, For Science! said:

But you see, this is all completely dependent on the arbitrary requirements of the OP, and so there is absolutely no point in asking the community whether something is "worth it" or not.

Yeah, I get where you're coming from, but I think it's a misunderstanding. I already said this to someone today, and I see it a lot (maybe there's a reason, but it escapes me tbh).

 

If someone asks "is *this* thing worth it...?", they don't necessarily want to hear an opinion; they want raw facts: "yes, this thing is x times more powerful."

 

Because it's often very hard to tell: is this *new thing* a scam, or does it barely perform better than what I already have? Etc...

 

Thing is, yes, with the 30xx series this is a bit dumb, because no one really knows yet, though I think many people on this forum have the experience to at least estimate the performance, and that's, in most cases, all the people asking want to know...

 

It's basically either "yes, it will be a huge upgrade" or "no, you already have a NASA computer!" (simplified). 😅

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


19 minutes ago, Mark Kaine said:

I know this is just an example, but in my case that's a big nope: 60 fps minimum and maximum. I can't stand fluctuating framerates, my monitor is 60 Hz, and the games I'm playing are often designed around that framerate anyway (RE2, MHW, etc.).

That's a bit hyperbolic!

No, 1080p/60 as mentioned above... no reason to go higher when my card can barely keep up at 60 fps on "high-ish" settings...

 

Yeah, I get where you're coming from, but I think it's a misunderstanding. I already said this to someone today, and I see it a lot (maybe there's a reason, but it escapes me tbh).

 

If someone asks "is *this* thing worth it...?", they don't necessarily want to hear an opinion; they want raw facts: "yes, this thing is x times more powerful."

 

Because it's often very hard to tell: is this *new thing* a scam, or does it barely perform better than what I already have? Etc...

 

Thing is, yes, with the 30xx series this is a bit dumb, because no one really knows yet, though I think many people on this forum have the experience to at least estimate the performance, and that's, in most cases, all the people asking want to know...

 

It's basically either "yes, it will be a huge upgrade" or "no, you already have a NASA computer!" (simplified). 😅

The card in question is the OP's 2080 Ti, with an upgrade to a 3080 under consideration; he uses a 1440p 144 Hz monitor and generally prefers maximum settings (the "cinematic look", as often misused by the OP). That is why the criterion in my post is valid for this particular thread. Honestly though, it doesn't matter: you can swap all the metrics above for your own and ask the same question.

 

I disagree that people asking about "worth" are asking for raw performance numbers.

 

If somebody asked: "Is it worth spending $700 to upgrade my 2080 Ti to a 3080?" (as in this thread):

Is the answer "yes, it will be 20% more FPS than a 2080 Ti" or "no, it will be only 20% more FPS than a 2080 Ti"? Both are valid answers, and it totally depends on your perception of whether 20% more FPS is "worth it" and how much $700 is "worth" to you. Is it worth going from 40 FPS to 48 FPS in MSFS 2020? (Maybe, if you think every bit closer to 60 FPS is helpful.) Or is it worth going from 180 FPS to 220 FPS in CS:GO? (Maybe, if you value input latency, but probably not.) All of these are open questions with no objective answer.

 

Even a supposedly clear-cut case of "Is it worth selling my 2080 Ti and buying a second-hand 1060?" --> "No, you will have a big drop in performance"

could be countered with "Yes, with the surplus money you make and the lower power consumption, your kids can go to college". This assumes you don't value FPS/max settings/high refresh/high resolution; you can still totally game on a 1060, and so the money you save would totally be "worth it" for some people.

 

 


3 minutes ago, For Science! said:

If somebody asked: "Is it worth spending $700 to upgrade my 2080 Ti to a 3080?" (as in this thread):

Is the answer "yes, it will be 20% more FPS than a 2080 Ti" or "no, it will be only 20% more FPS than a 2080 Ti"? Both are valid answers, and it totally depends on your perception of whether 20% more FPS is "worth it" and how much $700 is "worth" to you.

OK, I get where the "problem" is: they aren't asking the right questions...

 

I still think the majority who ask this want to know about the expected performance gains; I think they know that, depending on perspective, a 20% boost can be either "worth it" or not...

 

So what I'm saying is that in most cases this should be ignored and answered with facts where possible: "it's 20% faster; whether that's worth it is up to you..."

But what I usually see is "are you OK with the performance you currently get?"

 

That isn't really helpful; answering a question with a question... I think it's likely that if someone asks whether an upgrade is "worth it", they are already dissatisfied with their current performance - because why even ask the question otherwise?

 

This mostly stems from a lack of knowledge, hence they come to a forum like this, where one could expect people to actually know this kind of stuff...

 

BUT yes, I understand technically they're asking the "wrong question"... ¯\_(ツ)_/¯


2 minutes ago, Mark Kaine said:

OK, I get where the "problem" is: they aren't asking the right questions...

 

I still think the majority who ask this want to know about the expected performance gains; I think they know that, depending on perspective, a 20% boost can be either "worth it" or not...

 

So what I'm saying is that in most cases this should be ignored and answered with facts where possible: "it's 20% faster; whether that's worth it is up to you..."

But what I usually see is "are you OK with the performance you currently get?"

 

That isn't really helpful; answering a question with a question... I think it's likely that if someone asks whether an upgrade is "worth it", they are already dissatisfied with their current performance - because why even ask the question otherwise?

 

This mostly stems from a lack of knowledge, hence they come to a forum like this, where one could expect people to actually know this kind of stuff...

 

BUT yes, I understand technically they're asking the "wrong question"... ¯\_(ツ)_/¯

Honestly, I think people who ask this are just flexing that they "can" upgrade and just want a platform on which to demonstrate their position.

 

And so I do actually think that "are you OK with the performance you currently get?" is a good question, since these people "want" to upgrade primarily because there is something newer than what they own and it's hurting their ego. Please note that this usually applies exclusively to people who ask this question and own the top-end current hardware (that's about to become "previous generation").

 

The very thought of owning "previous generation" hardware is likely what is propelling these people to ask whether it's worth upgrading their "last gen" to "current gen".

 

As I said above, you have to be under a very strict set of conditions to "not be OK" with the current performance of a 2080 Ti, and so if you answer the "are you OK with the performance you currently get?" question with "yes... but", you immediately know that the question stems from a desperate "want" to upgrade rather than a "need".


The 3090 is NOT the Titan card or the TITAN equivalent! The 3090 is not the full die. The full die has slightly more CUDA cores and 48 GB of VRAM; it has leaked. Because of the insane amount of VRAM, that one will be a Quadro card, but since it is the same exact die, just the FULL die, it is the real Titan.

 

The 3090 is not the Titan, but the new enthusiast card. They gave it a new name this time so that they could charge more. Could you imagine if they called it a 3080 Ti and charged $1,500? Of course not, so they had to give it a different name to lie to us. It's not a Titan; if it were a Titan, they would have just called it that.


2 minutes ago, For Science! said:

Honestly, I think people who ask this are just flexing that they "can" upgrade and just want a platform on which to demonstrate their position.

 

And so I do actually think that "are you OK with the performance you currently get?" is a good question, since these people "want" to upgrade primarily because there is something newer than what they own and it's hurting their ego. Please note that this usually applies exclusively to people who ask this question and own the top-end current hardware (that's about to become "previous generation").

 

The very thought of owning "previous generation" hardware is likely what is propelling these people to ask whether it's worth upgrading their "last gen" to "current gen".

 

As I said above, you have to be under a very strict set of conditions to "not be OK" with the current performance of a 2080 Ti, and so if you answer the "are you OK with the performance you currently get?" question with "yes... but", you immediately know that the question stems from a desperate "want" to upgrade rather than a "need".

Very true.

I am happy with the performance of my 2080 Ti, but the 30 series has HDMI 2.1, and that changes everything, since with it I can do 4K 120 Hz on my OLED TVs.

If the 30 series did not have HDMI 2.1, I would have skipped it and waited to see if AMD had the feature.

Now that I know the Nvidia cards have the feature, I will be spending about the same on a 3090 as I did on the 2080 Tis, since 4K 120 Hz will be hard to drive.

If for some reason I were forced to spend the money I put away for a 3090 on something else, it would not be a big deal to me, since I can do 4K 60 and 1440p 120 now.

RIG#1 CPU: AMD R7 5800X3D | Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA RTX 3090 Ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA RTX 3090 Ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


6 minutes ago, jones177 said:

Very true.

I am happy with the performance of my 2080 Ti, but the 30 series has HDMI 2.1, and that changes everything, since with it I can do 4K 120 Hz on my OLED TVs.

If the 30 series did not have HDMI 2.1, I would have skipped it and waited to see if AMD had the feature.

Now that I know the Nvidia cards have the feature, I will be spending about the same on a 3090 as I did on the 2080 Tis, since 4K 120 Hz will be hard to drive.

If for some reason I were forced to spend the money I put away for a 3090 on something else, it would not be a big deal to me, since I can do 4K 60 and 1440p 120 now.

For me, VRAM capacity is everything for my scientific calculations; that's why the 1080 Ti was a godsend for me with its 11 GB of VRAM at the time. A 2080 Ti would have been a side-grade, and now the 3080 would be a downgrade. Furthermore, for my calculations I need at least two GPUs, so at the moment the choice is between staying on my dual 1080 Tis or spending a boatload of money on 2x 3090s.

 

Getting dual 3090s would indeed be a substantial upgrade, but given that my jobs are typically overnight jobs, even if the 3090 were 200% faster than my current cards, my jobs would still be overnight jobs, and so it may not be "worth it" in that regard (jobs per day is unchanged). Of course, if I could continuously schedule jobs, that would change, but as manual intervention is required after each calculation, that isn't possible anyway.


1 hour ago, For Science! said:

For me, VRAM capacity is everything for my scientific calculations; that's why the 1080 Ti was a godsend for me with its 11 GB of VRAM at the time. A 2080 Ti would have been a side-grade, and now the 3080 would be a downgrade. Furthermore, for my calculations I need at least two GPUs, so at the moment the choice is between staying on my dual 1080 Tis or spending a boatload of money on 2x 3090s.

Why 2x? A single 3090 will have substantially better performance, and even more memory, than dual 1080 Tis...

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


1 minute ago, BTGbullseye said:

Why 2x? A single 3090 will have substantially better performance, and even more memory, than dual 1080 Tis...

Because two half-maps need to be calculated independently and simultaneously. You might be able to fit both half-maps into the 24 GB of memory, but then there are memory overheads from two (actually three, but one is a master process) MPI processes on the same card, and so you will suffer for it. There are also fairly heavy I/O loads, although PCIe 4.0 may be an advantage in this regard.

 

For similar reasons, the Titan RTX was also not a suitable replacement for 2x 1080 Ti (or 2080 Ti), and therefore I do not think the 3090 will be able to replace two cards just yet.
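
Schematically, the layout is something like this minimal mpi4py sketch of the one-GPU-per-worker pattern (not the actual software I use; the file name and the stand-in "calculation" are made up for illustration):

import os
from mpi4py import MPI  # launch with: mpiexec -n 3 python halfmaps.py

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # Master process: coordinates the two workers, needs no GPU of its own.
    print("half-maps done:", [comm.recv(source=r) for r in (1, 2)])
else:
    # Workers 1 and 2: pin each to its own card before any CUDA context is created,
    # so the two half-maps run independently and simultaneously.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(rank - 1)
    comm.send(f"half-map {rank} on GPU {rank - 1}", dest=0)  # stand-in for the real job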


42 minutes ago, For Science! said:

For me, VRAM capacity is everything for my scientific calculations; that's why the 1080 Ti was a godsend for me with its 11 GB of VRAM at the time. A 2080 Ti would have been a side-grade, and now the 3080 would be a downgrade. Furthermore, for my calculations I need at least two GPUs, so at the moment the choice is between staying on my dual 1080 Tis or spending a boatload of money on 2x 3090s.

 

Getting dual 3090s would indeed be a substantial upgrade, but given that my jobs are typically overnight jobs, even if the 3090 were 200% faster than my current cards, my jobs would still be overnight jobs, and so it may not be "worth it" in that regard (jobs per day is unchanged). Of course, if I could continuously schedule jobs, that would change, but as manual intervention is required after each calculation, that isn't possible anyway.

I was a freelance 3D artist, and my computer/CPU purchases were based on overnight jobs. If I was animating, the number of computers I used was based on getting the job done by morning. That was so I could do any 2D work on the frames, redo any frames that needed it, and set up the renders for overnight again. Even in later years, when I only did design work, I would use one computer for 3D and one for 2D, and both rendered overnight.

So I understand the overnight thing.

 

Mine never went away. I still had to rush to get the 3D stages ready so the scenes could be rendered overnight, right up to the last day I worked.


13 minutes ago, For Science! said:

Because two half-maps need to be calculated independently and simultaneously. You might be able to fit both half-maps into the 24 GB of memory, but then there are memory overheads from two (actually three, but one is a master process) MPI processes on the same card, and so you will suffer for it. There are also fairly heavy I/O loads, although PCIe 4.0 may be an advantage in this regard.

 

For similar reasons, the Titan RTX was also not a suitable replacement for 2x 1080 Ti (or 2080 Ti), and therefore I do not think the 3090 will be able to replace two cards just yet.

Sounds like a rather unique workload.

 

You might be surprised by how much the upgrade from GDDR5X to GDDR6X will affect things. It should be nearly double the data rate, with lower overhead as well. Combined with PCIe 4.0, it's going to at least match the dual 1080 Tis.


2 minutes ago, BTGbullseye said:

Sounds like a rather unique workload.

 

You might be surprised by how much the upgrade from GDDR5X to GDDR6X will affect things. It should be nearly double the data rate, with lower overhead as well. Combined with PCIe 4.0, it's going to at least match the dual 1080 Tis.

Sure, I might be surprised, but the 3090 would need a crazy amount of compute to be a justifiable upgrade rather than a side-grade. Let's say the baseline is a job that gets fired at 5 PM and finishes at 5:00 AM (i.e. a 12-hour calculation) on a dual-1080 Ti machine.

 

I can go ahead and buy 2x 2080 Tis on the second-hand market to side-grade my 2x 1080 Tis. This will not change the types of jobs I can run (as they are both 11 GB cards), and the jobs will finish slightly faster, but nonetheless overnight (e.g. a job may finish at 3:30 AM instead of 5:00 AM, but I am asleep either way, so it doesn't matter). Let's say I can do this particular side-grade for $1,000 total (since 2080 Ti prices have bombed).

 

Alternatively, I could buy 1x 3090 and hope that the MPI overhead doesn't mean my jobs end up constantly crashing, and then maybe the card is even faster than 2x 2080 Tis and my jobs finish at 3:00 AM instead of 3:30 AM or 5:00 AM. So in the best-case scenario, it's still a $1,500 side-grade, with a small risk that it's a complete downgrade if my jobs cannot fit into memory due to software limitations (MPI).

 

Finally, if I got 2x 3090, there would definitely be enough VRAM to cover my needs. The only question is whether 2x 3090 can reduce the calculation time from 12 hours to... I don't know, let's say 6 hours (i.e. 2x speed). At 11 PM there is still a slim chance I might be in the office, and so would be able to evaluate the results and fire a second job. But you see, for a meaningful upgrade, the compute really needs to be 5x-10x faster for you to be able to "feel it in the same day".

 

So in essence, if a 3090 can provide more than double the compute performance of a 1080 Ti (or a 2080 Ti), then there may be a slim chance that the $3,000 investment is "worth it". If a 3090 can provide quadruple the compute performance of a 1080 Ti (or 2080 Ti), maybe even a single 3090 could replace two 1080 Tis. In all other scenarios, it's a bit of a risky potential side-grade.
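
To put toy numbers on that "feel it in the same day" threshold, here's a quick Python sketch using the hypothetical 12-hour baseline and speedup factors from above (illustrative figures, not benchmarks):

from datetime import datetime, timedelta

fire_time = datetime(2020, 9, 1, 17, 0)     # job fired at 5:00 PM
leave_office = datetime(2020, 9, 1, 23, 0)  # ~11 PM, last chance to fire a second job
baseline_hours = 12                         # 5 PM -> 5 AM on the dual-1080 Ti machine

for speedup in (1.0, 1.3, 2.0, 5.0, 10.0):  # hypothetical compute gains
    finish = fire_time + timedelta(hours=baseline_hours / speedup)
    verdict = "second job same day" if finish <= leave_office else "still an overnight job"
    print(f"{speedup:4.1f}x -> done {finish:%a %H:%M} ({verdict})")

Only at roughly 2x does the finish time even touch the end of the workday; at 5x-10x it lands comfortably within it.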

 

 

 


IF you can (sell your 2080 Ti for 500+ AND not have to post it yet AND have option to cancel sale) AND (IF can get a 3080 on launch day AND get fast delivery before having to post 2080) AND (can afford it) THEN go for it.

 

Sloppy BASIC, but I'm too drunk for ORs, and ELSEs didn't come in until later.
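
Sobered-up Python version of the same logic, if anyone wants it (the function and argument names are made up for illustration):

def should_upgrade(sale_price, can_delay_posting, can_cancel_sale,
                   got_3080_on_launch, fast_delivery, can_afford):
    # Sell the 2080 Ti high, but only if the 3080 arrives before the old card must ship.
    can_sell_safely = sale_price >= 500 and can_delay_posting and can_cancel_sale
    can_get_new_card = got_3080_on_launch and fast_delivery
    return can_sell_safely and can_get_new_card and can_afford

# e.g. should_upgrade(550, True, True, True, True, True) -> True: go for it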


5 hours ago, Den-Fi said:


 

Hol' up, lemme check.

When you're done checking, can I get some lotto numbers?


  • 2 weeks later...

I haven't been able to get my hands on a 3080 yet, but now I'm wondering: is the difference between the 3080 and the 2080 Ti worth paying to upgrade?


I'm currently running a 1440p/144 Hz monitor, and I really prefer a more cinematic experience, with graphics at maximum and relatively high FPS too.

 

On my 2080 Ti I seem to have trouble hitting 144 FPS in a lot of the more demanding games, like AC: Odyssey, RE:3, and RDR2, with max settings enabled.

 

Does anyone know where I can find benchmark comparisons @ 1440p? Or has anyone seen benchmark comparisons for major games and knows approximately the FPS difference?

 

Thanks!

CPU: i7 8700K (5.1 GHz OC). AIO: EVGA CLC 280 280mm. GPU: EVGA XC2 Ultra 2080 Ti. PSU: Corsair RM850x 850W 80+ Gold Fully Modular. MB: MSI MEG Z390 ACE. RAM: 32GB Corsair Dominator Platinum RGB (3600 MHz OC). STORAGE: 1TB Samsung 970 Evo Plus M.2 NVMe, 2TB Samsung 860 EVO, 1TB Samsung 860 Evo, 1TB Samsung 860 QVO, 2TB Firecuda 7200rpm SSHD, 1TB WD Blue. CASE: NZXT H510 Elite. FANS: Corsair LL120 RGB 120mm x4. MONITOR: MSI Optix MAG271CQR 2560x1440 144hz. Headset: Steelseries Arctis 5 Gaming Headset. Keyboard: Razer Cynosa Chroma. Mouse: Razer Basilisk Ultimate (Wireless). Webcam: Logitech C922x Pro Stream Webcam.


Every single YT review of the 3080 I watched showed this exact comparison... not sure where you've been looking...

Community Standards | Fan Control Software

Please make sure to Quote me or @ me to see your reply!

Just because I am a Moderator does not mean I am always right. Please fact check me and verify my answer. 

 

"Black Out"

Ryzen 9 5900x | Full Custom Water Loop | Asus Crosshair VIII Hero (Wi-Fi) | RTX 3090 Founders | Ballistix 32gb 16-18-18-36 3600mhz 

1tb Samsung 970 Evo | 2x 2tb Crucial MX500 SSD | Fractal Design Meshify S2 | Corsair HX1200 PSU

 

Dedicated Streaming Rig

 Ryzen 7 3700x | Asus B450-F Strix | 16gb Gskill Flare X 3200mhz | Corsair RM550x PSU | Asus Strix GTX1070 | 250gb 860 Evo m.2

Phanteks P300A |  Elgato HD60 Pro | Avermedia Live Gamer Duo | Avermedia 4k GC573 Capture Card

 


4 minutes ago, GamerBlake said:

On my 2080 Ti I seem to have trouble hitting 144 FPS in a lot of the more demanding games, like AC: Odyssey, RE:3, and RDR2, with max settings enabled.

You sure it's GPU-bound? Because ~150 fps could also be CPU-bound territory.
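
If you'd rather log it than eyeball an overlay, a quick NVML sketch (via the nvidia-ml-py package, assuming an NVIDIA card) can poll utilization while the game runs; a GPU pegged near 99% with the CPU well below that points at a GPU limit:

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
for _ in range(30):  # sample once a second for half a minute while playing
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
    print(f"GPU {util.gpu:3d}%  memory controller {util.memory:3d}%")
    time.sleep(1)
pynvml.nvmlShutdown()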

-sigh- feeling like I'm being too negative lately


1 minute ago, Shimejii said:

I'm gonna be honest: is it worth upgrading? No, not really. It's 10-35% faster, depending on the games and such. There are quite a lot of benchmark comparisons; not sure how you haven't seen them.

Well, I guess what I meant is reliable benchmarks.

 

I've seen benchmarks where they were compared, but they didn't do reliable tests, because they'd use different CPUs or RAM or even SSDs.


With the second-hand prices of 2080 Tis now that people have noticed that stock is low as expected, you could upgrade for almost free if you sell now and wait a bit with a placeholder card.


 

 

my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

basically everything was bought used off of reddit or here, only new component was the case. absolutely nutty deals for some of these parts, ill have to tally it all up once it's "done" :D 


Just now, Moonzy said:

You sure it's GPU-bound? Because ~150 fps could also be CPU-bound territory.

That's a good point. I think in AC: Odyssey that's the case, but when I run RE:3 my CPU is only at ~80% whereas my GPU is locked at 99-100%.


2 minutes ago, Shimejii said:

I'm gonna be honest: is it worth upgrading? No, not really. It's 10-35% faster, depending on the games and such. There are quite a lot of benchmark comparisons; not sure how you haven't seen them.

Where did you see these tests?

 

Here's some good info on video, benchmarks included.

 


Just now, GamerBlake said:

Well, I guess what I meant is reliable benchmarks.

 

I've seen benchmarks where they were compared, but they didn't do reliable tests, because they'd use different CPUs or RAM or even SSDs.

All the main tech YouTubers like GN, Hardware Unboxed and such did full testing, and they are reliable :P But you know, that's up to you. You'll be fine either way.


1 minute ago, VeganJoy said:

With the second-hand prices of 2080 Tis now that people have noticed that stock is low as expected, you could upgrade for almost free if you sell now and wait a bit with a placeholder card.

Oh, I didn't buy mine second-hand.

 

I paid $1,299 (+ tax) on Newegg for a new EVGA 2080 Ti XC2 Ultra.


Just now, Shimejii said:

All the main tech YouTubers like GN, Hardware Unboxed and such did full testing, and they are reliable :P But you know, that's up to you. You'll be fine either way.

Ah, OK, thanks. I'm pretty new when it comes to learning which YouTubers and websites are reliable. I've mostly watched only LTT and some of Paul's Hardware.

