
2070 Super or 2080 Super

FutaGrlAlicePA

I’m trying to think of a good computer I can put together. But I wanna maximize my dollars on it. I’m looking at the AMD X570 or the new AMD TRX40 for the main board of this build. As for graphics, I’m not sure what’s going to suit me best. But I do know I need a lot of cores, hence the AMD chipsets.

 

I plan to do some game streaming as well as multimedia creation. I wanna run a single card, but I will be running a lot of windows at the same time - the games, the streaming software, researching on the Internet - all for DJing in this online world.

 

I do enjoy playing high-end games; my favorite currently is Final Fantasy XV. And I want to use my television at 4K resolution as my monitor and speakers. Like I said, this is more regular gaming, not FPS or anything like that. And due to my back injury, it’s gonna be a lot easier on me to lie in bed with a keyboard and mouse next to me.

 

So essentially I’m trying to figure out if a 2070 Super would be enough for me, or if I should pony up and get a 2080 Super.

 

Thanks in advance.

With Great Power Comes Great Responsibility


I would suggest for 4K gaming you're going to want to consider a 2080 Super. I quite literally just upgraded (well... replaced would be a better word) my aging 4790K/GTX 1080 setup for the following bundle of joy:

 

Gigabyte GeForce RTX 2080 Super Gaming OC 8G

AMD Ryzen 7 3700X

ASUS TUF Gaming X570-Plus ATX Motherboard

 

Thank you, Black Friday :P Anyway, what I would say is that while the upgrade was necessary for me (more cores, moving off a Z97 relic to DDR4 RAM, etc., as I'll be working on video content shortly), I wouldn't say I'm noticing a staggering difference in 1080p gameplay - which is to be expected, of course.  Playing 4K stuff, however - yes, it's not bad at all, assuming you're playing a newer game such as RDR2 and you're happy to tone down some of the visual settings.

 

I decided against a 2070 Super mainly because I wanted a card that would not only easily cope with 1440p gaming, but could touch 4K if I felt like sitting on the sofa playing on the HDR TV, and I would say it more or less copes well enough.  For proper 4K gaming, you're going to want to sell the kids and get a 2080 Ti.

 

As for the actual AMD side of things, I couldn't be more pleased. The 4790K has been a faithful servant for many years, and still lives on as a media server now, but it was definitely starting to struggle when I wanted it to do ALL THE THINGS!  Could I have gone for Intel? Yeah... probably.  But the lack of hyperthreading on comparative CPUs just guided me towards AMD, and I've not had any buyer's remorse since.

 

So in summary, AMD rocks, 2080 Supers are Very Good, but they're not going to rock your world - unless you're upgrading from a 970 or something...  But for 4K, More is... More, and while a 2070 Super COULD do it, you might as well use the extra pennies (or nickels from your side of the world) and go for the better card. Just remember to grab a good 650W+ PSU to go with it...
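If you want to sanity check that PSU number, here's a rough back-of-the-envelope sketch in Python - the wattage figures are ballpark assumptions (rated board power and TDP), not measurements:

# Rough PSU sizing; component figures are ballpark assumptions.
parts = {
    "RTX 2080 Super": 250,         # W, Nvidia's rated board power
    "Ryzen 7 3700X": 65,           # W TDP (boost spikes run higher)
    "board, RAM, SSDs, fans": 75,  # generous catch-all
}
load = sum(parts.values())
target = load / 0.65               # aim to sit near 65% PSU load
print(f"estimated draw ~{load} W -> look for a ~{target:.0f} W unit")

That lands right around the 600 W mark, hence the 650W+ advice.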

 


I've done maybe a week's worth of comparing side by side, spec by spec, and in the end I've found the extra $250 isn't justified for an extra 7-9 frames across many games. Very few games saw the 2080 Super pull meaningfully ahead. UNLESS you're rocking that 4K build - then you might as well drop another $250 and get a Ti. But 70s vs 80s? Meh, naw, not justified.
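To put that in value terms, here's a quick fps-per-dollar sketch - the prices and 4K averages below are illustrative assumptions, not benchmark results:

# fps gained per extra dollar over a 2070 Super.
# Prices (USD) and average 4K fps are illustrative assumptions only.
cards = {
    "2070 Super": (500, 52),
    "2080 Super": (750, 60),
    "2080 Ti": (1100, 72),
}
base_price, base_fps = cards["2070 Super"]
for name, (price, fps) in cards.items():
    if name == "2070 Super":
        continue
    gain, cost = fps - base_fps, price - base_price
    print(f"{name}: +{gain} fps for +${cost} ({gain / cost:.3f} fps/$)")

Run it and the step up buys only a few hundredths of a frame per dollar, which is the whole "not justified" point.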


It's a lot more useful for 144Hz play - the extra grunt does make a difference there. However, as we know, 4K gaming at anything over 60Hz isn't a thing yet... so if the extra money is available, a Ti is the way forward.


So the Ti is recommended over a Super, it seems. Wasn't sure if it would be too overkill.


If you're gaming, TR is massive overkill, totally useless actually (even the 3950X doesn't perform better than the 3900X in games, adding even more cores won't help at all). Much better to put that money into a 2080 Ti, it's currently the best 4K card out there. 2080 Super will do well too, but the 2080 Ti is a good chunk better. 

 

If you're doing lots of multitasking and content creation, the 3900X is a very very good CPU. 3950X is an option if you need a ton of threads, but probably still overkill (depends on what you're doing and what your software uses to render). As I said at the beginning, TR would be useless for what we know about your workload so far. 

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


6 minutes ago, Zando Bob said:

If you're gaming, TR is massive overkill, totally useless actually (even the 3950X doesn't perform better than the 3900X in games, adding even more cores won't help at all). Much better to put that money into a 2080 Ti, it's currently the best 4K card out there. 2080 Super will do well too, but the 2080 Ti is a good chunk better. 

 

If you're doing lots of multitasking and content creation, the 3900X is a very very good CPU. 3950X is an option if you need a ton of threads, but probably still overkill (depends on what you're doing and what your software uses to render). As I said at the beginning, TR would be useless for what we know about your workload so far. 

I think I will echo this advice. TR is pretty specialized for people who are using their PCs professionally.  Since you're not opting for a professional monitor and are debating between a 2070 Super and a 2080 Super as a graphics solution, I am going to guess you're not going to use your PC professionally, i.e. as your primary job.  That being the case, you would be much better off going with a 3950X or even a 3900X and investing in a 2080 Ti for 4K gaming.  The biggest reason is that TR isn't all that great of a gaming CPU and the 3900X/3950X can easily match TR in games, so you're paying an awfully huge premium for a TR that isn't going to gain you much in gaming.   As far as productivity, all you're losing is a bit of time going with the 3950X or 3900X, and you have to ask yourself if saving 10 minutes here or there is actually worth paying twice as much for the CPU.


I do a ton of multitasking because I DJ in an online world called Second Life. That eats up cores like crazy and has killed an i7 on a P55 before - one reason I'm looking at a 3950X. I'm only using a 2017 Sony BRAVIA 55” 4K HDR TV (KD-55X720E) that I got a deal on for $250 two days ago, with the static cling protection still on it. Still looking for all the specs, but doubtful it's HDMI 2.0. I do know HDMI 3 is also the ARC port - not sure what that means, but betting it's best for a 4K 30 FPS PC hookup. So with what I'm hearing, a 2080 Super or Ti are my best GPU options. If Intel is a good option, I'm looking at a Core i9-9900K on a Gigabyte Z390 DESIGNARE for I/O like Thunderbolt. I'm also trying to do a silver or white themed build and have yet to find a white-themed X570, so I was going to sacrifice a white motherboard for cores - but maybe not, if you all think the i9 will be okay.

 

Thanks in Advance:

Alice


22 hours ago, Zando Bob said:

If you're gaming, TR is massive overkill, totally useless actually (even the 3950X doesn't perform better than the 3900X in games, adding even more cores won't help at all). Much better to put that money into a 2080 Ti, it's currently the best 4K card out there. 2080 Super will do well too, but the 2080 Ti is a good chunk better. 

 

If you're doing lots of multitasking and content creation, the 3900X is a very very good CPU. 3950X is an option if you need a ton of threads, but probably still overkill (depends on what you're doing and what your software uses to render). As I said at the beginning, TR would be useless for what we know about your workload so far. 

Imo more cores is always good 'cause it's kinda future-proofing things. Multithreading is the future (hopefully eventually, after the PS3 introduced it over a decade ago) for PC gaming as well.

 

And as for 2080/ti,  does it really matter?

 

Next year around this time it won't be the best card anymore, and most likely it'll even be "outdated" due to not supporting new features and generally being "slow".

 

Imo right now is the worst time to buy any high-end GPU when we know next-gen 7nm cards are on the horizon...

 

Unless someone really has a lot of cash to throw around then why the hell not.  Always the best! 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


56 minutes ago, Mark Kaine said:

Imo more cores is always good 'cause it's kinda future-proofing things. Multithreading is the future (hopefully eventually, after the PS3 introduced it over a decade ago) for PC gaming as well.

 

And as for 2080/ti,  does it really matter?

 

Next year around this time it won't be the best card anymore, and most likely it'll even be "outdated" due to not supporting new features and generally being "slow".

 

Imo right now is the worst time to buy any high-end GPU when we know next-gen 7nm cards are on the horizon...

 

Unless someone really has a lot of cash to throw around then why the hell not.  Always the best! 

I think you're right about PC gaming taking advantage of multicore/multithread performance as we move forward, but I am just not all that sure how many cores will actually be needed.  You have to consider most gamers are still running 4-6 core processors, so game developers will likely develop to this, not 12+ cores, so I am not sure how much future-proofing you're getting out of TR.  Also consider that the 3900X walks all over the last-generation TR 2950X at nearly half the original price, so your argument against the 2080 Ti applies to the TR processor as well.

 

As far as the GPU, you're absolutely right about it being a bad time to invest in a new GPU.  We will likely see the next-gen, high-end cards from both AMD and Nvidia within the next 6-8 months.  Provided the pricing doesn't skyrocket with the next gen, it kind of doesn't make sense to spend $1200 on a GPU that will reach its end of life in 6 months.  Still, I think it makes more sense to spend an extra $500 on the GPU than it does to spend $700-$800 on the processor just to get TR, but that is just me.  I will also go further and say if you just have to buy a video card right now, then from a value perspective, the 5700 XT makes the most sense.

 

So from my perspective, if overall value is the focus, based on what you want to accomplish and your other comments, I'd go with a 3900X or 3950X and a 5700 XT, with the intention to upgrade to either AMD's or Nvidia's top GPU in 6-8 months when they are released.  If you're intent on TR, the 5700 XT still might make sense in the context of the new GPUs coming out in 6-8 months, especially since I think the 5700 XT will retain quite a bit of its value even sold used on eBay.

 

 


10 minutes ago, Midnitewolf said:

I think you're right about PC gaming taking advantage of multicore/multithread performance as we move forward, but I am just not all that sure how many cores will actually be needed.  You have to consider most gamers are still running 4-6 core processors, so game developers will likely develop to this, not 12+ cores, so I am not sure how much future-proofing you're getting out of TR.  Also consider that the 3900X walks all over the last-generation TR 2950X at nearly half the original price, so your argument against the 2080 Ti applies to the TR processor as well.

 

As far as the GPU, you're absolutely right about it being a bad time to invest in a new GPU.  We will likely see the next-gen, high-end cards from both AMD and Nvidia within the next 6-8 months.  Provided the pricing doesn't skyrocket with the next gen, it kind of doesn't make sense to spend $1200 on a GPU that will reach its end of life in 6 months.  Still, I think it makes more sense to spend an extra $500 on the GPU than it does to spend $700-$800 on the processor just to get TR, but that is just me.  I will also go further and say if you just have to buy a video card right now, then from a value perspective, the 5700 XT makes the most sense.

 

So from my perspective, if overall value is the focus, based on what you want to accomplish and your other comments, I'd go with a 3900X or 3950X and a 5700 XT, with the intention to upgrade to either AMD's or Nvidia's top GPU in 6-8 months when they are released.  If you're intent on TR, the 5700 XT still might make sense in the context of the new GPUs coming out in 6-8 months, especially since I think the 5700 XT will retain quite a bit of its value even sold used on eBay.

 

 

Yeah, I agree TR might be overkill - I was just speaking more generally that more cores is always a good thing imo; of course one has to keep costs in mind too...

 

And yeah, so games use 4-6 cores now; that will obviously only go up, so I think like 8-12 cores is indeed future-proofing things. CPUs also traditionally don't age as fast as GPUs usually do.

 

I'm not sure how feasible this is currently, but I think going forward game engines shouldn't be limited by how many cores they use - they should just use all available cores, maybe in addition to dedicating some to certain tasks like AI, or even physics...

 

 

I think I read that's already a thing, but not sure that's correct - I think it should certainly be possible!?


59 minutes ago, Mark Kaine said:

Imo more cores is always good 'cause it's kinda future-proofing things. Multithreading is the future (hopefully eventually, after the PS3 introduced it over a decade ago) for PC gaming as well.

Future proofing isn't a thing, really. CPUs on average just last a long time, so long as you get at least 6c/12t you'll be fine for a good long while (ones from 2013 are still competitive with current CPUs). Can't see game devs building games for threadripper given that a single CPU costs more than the average gaming PC, meaning they'd have an extremely, extremely small market. There's no really good reason to get a 3950X over a 3900X for gaming, and only get the 3900X over the 3700X because it's a slightly better bin, oooor if you want the absolute best gaming CPU then you get a 9900KS. OP said they're gaming and doing content creation, so the 3900X would be the best fit. Doesn't help that the 3950X is hard to find and quite expensive, nearing HEDT pricing but without the PCIe lanes, quad channel RAM, and other features HEDT brings. Plus you can just drop a used 3950X in down the line if somehow games magically require a 16 core to run (again, noooot really something that will happen). 

 

1 hour ago, Mark Kaine said:

And as for 2080/ti,  does it really matter?

Yes. No. Depends on budget and what games you play. If you're good with tweaking settings, a 2080 Super is fine. If you want to run higher settings, or get higher fps at the same settings, then the 2080 Ti is the best bet. If you want the card that gives you the best chance of hitting 60fps at 4K without compromising settings much, then the 2080 Ti is the best card. It is the best card because it literally is, lol. Only card higher is the Titan RTX, which is twice the price for 3% or so more performance (aka not worth it at all).

If you don't wanna shell out for the 2080 Ti, you take a settings hit, and 4K runs fine on a 1080 Ti, RVII, 2080, or 2080 Super. 2070 Super as well if you're willing to drop settings a bit further. All depends on personal preference for how you want your games to run. 

For content creation, if you do anything on the GPU, the 2080 Ti will be the fastest there too, while still giving you very good performance in games (the RVII can beat it in a few specific workloads, but is a good bit worse in games). 

 

1 hour ago, Mark Kaine said:

Next year around this time it won't be the best card anymore, and most likely it'll even be "outdated" due to not supporting new features and generally being "slow".

No. The 1080 Ti was the best card in 2017 (2 years ago) and it still hurts Turing now: faster than the 2070 Super, about on par with a 2080 or 2080 Super. The Radeon VII was AMD's fastest card and still beats the current 5700 XT. The 980 Ti was Nvidia's fastest card before Pascal; it still competes with a 1070 or 1660 Ti/2060, and it's 4 years old now - apparently good overclockers can push one past a 1080 (which is almost 2070-level performance). The 780 Ti was the fastest before that; it can still compete with a 1060/1650 Super or so, and it's 6 years old now.

 

Flagship cards take a very, very long time to become generally "slow". And everything back to the 780 Ti supports DX12, meaning you can play all the modern titles. If lacking RTX and NVENC (although Pascal had that, and I think Kepler may have an alternative) makes them outdated, then the 5700 XT that released this year from AMD must be outdated too?

Tech doesn't advance at the rate it used to - you're not seeing exponential gains anymore, and flagship cards have held up better and better the past few years.

 

1 hour ago, Mark Kaine said:

Imo right now is the worst time to buy any high-end GPU when we know next-gen 7nm cards are on the horizon...

Worst time to buy is 2 months or so before a release. There's always something coming 6-8 months out, sometimes up to 12 or 18 if it's on a slower cycle. But there is always something "on the horizon", if you're not reasonably close to a release then just buy the best you can afford now. Again, it's not like it used to be where new releases basically made the old stuff obsolete. 

 

1 hour ago, Mark Kaine said:

Unless someone really has a lot of cash to throw around then why the hell not.  Always the best! 

So you'd advise someone to spend hundreds more on a 3950X, or even a TR chip, that they do not need, but call the best GPU for their workload basically a "I have money so hah" purchase? One thing has no practical impact on their workload (from what we know of it), the other is a massive help given they are trying to game at 4K. 

 

16 minutes ago, Midnitewolf said:

I will also go further and say if you just have to buy a video card right now, then from a value perspective, the 5700 XT makes the most sense.

No. If you only, only game, then yes. But it lacks RTX (both for gaming and tensor cores for some compute tasks), NVENC (useful for recording/streaming with minimal performance impact, something a content creator would very likely do), and CUDA acceleration (something that's used as a standard across many pro/prosumer apps). 

The 5700 XT is only ever a better value if you look purely at gaming fps vs cost, disregarding any features the competition offers. There is one thing the 5700 XT can do better, and that's RIS if you want to upscale: it offers higher visual quality and lower performance impact than DLSS in most games. Also, if you use OpenGL in apps instead of CUDA, AMD cards typically do very, very well.
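On the NVENC point: if you ever want to test the hardware encoder outside OBS, ffmpeg exposes it directly. A minimal sketch, assuming an ffmpeg build with NVENC support and some input.mp4 to hand:

import subprocess

# Re-encode a clip on the GPU via NVENC; the CPU stays mostly idle,
# which is why it's so handy for streaming/recording while gaming.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "h264_nvenc",   # Nvidia hardware H.264 encoder
    "-b:v", "6M",           # target bitrate
    "output.mp4",
], check=True)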


@Zando Bob  see my post above yours - I wasn't specifically talking about TR, my apologies if it looked that way, just speaking more generally.

8 cores should be the minimum; the more the better, basically.

 

Personally, I'm not gonna fall for the "X cores is good enough" meme anymore - I did once and that's enough - and I'm kinda sad I only got a 6-core currently (didn't have more money though, unless I'd gone 2700/X)... It's surely good enough right now, but in 1-2 years that might look different...

 

 

Oh, and I really disagree with you about the GPUs, because you kinda missed the point: yes, it's true recent flagship GPUs held their value/performance for a long time, because they were all based on the same architecture, but next year (I expect delays tho lol) will be a different architecture and a 7nm process. I expect things to change drastically.

 

 

Edit: We're nearing that "final frontier" of 5, 4, 2... (?) nm, but we're not there just yet. I'm also pretty sure NVIDIA is holding back as hard as they can - so that is certainly an unknown factor, but I do expect drastic changes with the next generation of GPUs nonetheless (it's also a new generation of consoles, which traditionally has a huge influence on tech overall, AMD being the unknown factor here).

 


3 minutes ago, Mark Kaine said:

Oh, and I really disagree with you there, because you kinda missed the point: yes, it's true recent flagship GPUs held their value/performance for a long time, but next year (I expect delays tho lol) will be a different architecture and a 7nm process. I expect things to change drastically.

In 6-8 months? Too long to be relevant at all to someone building a computer in the near future. If it's longer than a 2 month wait and you have the money in hand, you're usually best to build now unless there is a guaranteed massive performance boost. And again, even then it still doesn't render old tech irrelevant, nothing in the past few years has done that in the CPU space. My 5820K and 5960X can still put in work, and they're from 5 years ago. Newer stuff offers higher core counts, but even then it can't really render older stuff comparatively unusable. New TR stuff is a fucking monster, but a 2950X/2990WX or 10980XE/9980XE/7980XE (basically the same CPUs lmao) can still put in hella work. They're not rendered unusable because something faster exists, again, there haven't been increases like that in quite a while. 

 

6 minutes ago, Mark Kaine said:

8 cores should be the minimum; the more the better, basically.

If it offers no performance improvement and it's not a flex rig, then uhhhh no it's not better at all. I'd like a 6950X, doesn't mean it's an actual upgrade over my 5960X, it'd actually be slower for what I do (unless I got a golden chip somehow) because they clock lower. With the 3950X vs the 3900X, they're at very similar clocks, and the 3950X adds nothing over the 3900X unless you're doing stuff that actually uses the extra 4c/8t, and then it still comes at a $250 price premium. 

6 cores is pretty much the standard, 8 cores is for a pretty beefy rig, and for the best gaming/content creation PC you go for the 9900KS or 3900X. Going more expensive isn't needed unless you're doing very heavily multithreaded stuff, you just have money to throw away, or you only edit/render all the time. And in what we know of the OP's use case, that money would be better thrown at a higher GPU, since that will actually make for a noticeably better experience. 

And again, waiting 6-8 months for a GPU refresh is dumb. 


16 minutes ago, Zando Bob said:

In 6-8 months? Too long to be relevant at all to someone building a computer in the near future. If it's longer than a 2 month wait and you have the money in hand, you're usually best to build now unless there is a guaranteed massive performance boost

Well, I think your argument is reasonable, but the bolded part is kinda my point? It's not 100% guaranteed (what is), but you also don't need to be Nostradamus to think the next gen of cards will blow the current gen out of the water.

 

Unlike previous gens, which were basically only incremental updates for a long time.

 

I said this before, but to me a 2080 (insert moniker here) is just a souped up 1080ti. 

 

I do expect things to change eventually.  And that's next year.  :o

 

 

16 minutes ago, Zando Bob said:

6 cores is pretty much the standard, 8 cores is for a pretty beefy rig, and for the best gaming/content creation PC you go for the 9900KS or 3900X. Going more expensive isn't needed unless you're doing very heavily multithreaded stuff,

I only game* and I expect games to finally take advantage of multiple cores more often, hence I think 6 cores isn't future-proof at all.

 

Whether someone can afford this or not is a different question.

 

But it would be dumb to go for only a 6-core if someone could afford more.

 

 

If you don't have money, future-proofing is never a thing; if you do, it's something to take into consideration.

 

 

*fyi gaming includes recording - I record everything; it's fun to watch back and it's a very good way to improve how I play too.

 

(literally 80% of my storage is just gameplay videos lol)


1 minute ago, Mark Kaine said:

I only game and I expect games to finally take advantage of multiple cores more often, hence I think 6 cores isn't future-proof at all.

Play Destiny 2. It uses up to 20 threads. Guess what? It doesn't use them to 100% capacity - a high-clocked 6-core beats a lower-clocked 8-core every single time, in every single situation (in my case, a 5GHz 8600K vs a 2700X on a CH7). If you spread out the load over all the threads, the game's individual tasks can't each use an entire core or thread. I doubt they'll scale up much past 6-8 cores, especially given that that's what the majority of people are buying and running, and game devs want their games to run on people's PCs. If they are hitting 100% usage on an 8-core and slapping frame drops, the gamers complain until they fix the game. Hell, Anthem was a terrible launch, hitting 100% usage even on a 7980XE, but that wasn't the game using a CPU properly, that was the game being shit. They've since fixed that.
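If you want to check this yourself, sample per-core load while the game runs - a minimal sketch, assuming the third-party psutil package (pip install psutil):

import psutil

# Print per-core utilization once a second while a game is running,
# plus a count of cores doing meaningful work.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > 50)
    print(f"cores over 50% load: {busy}/{len(per_core)}  {per_core}")

A game can touch 20 threads and still leave most of them half idle, which is exactly what this shows.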

 

6 minutes ago, Mark Kaine said:

If you don't have money, future-proofing is never a thing; if you do, it's something to take into consideration.

Not really, how far in the future is future proof? My X99 stuff still kicks, is it future proof? Cause in 5-10 years it likely won't, is it therefore not future proof? What's the definition of future proof, how long does something need to remain competitive in order to be considered that? Is it different for GPUs, since they tend to slow down faster than CPUs? Or has there never been a future proof GPU ever since they all slow down? But how much does something have to slow down to be considered not future proof? 15% loss? 30%? 50%? 

It's a stupid term and I hate it, just please don't use it. 

 

8 minutes ago, Mark Kaine said:

But it would be dumb to go for only a 6-core if someone could afford more.

 

Yes, thus why I said:

24 minutes ago, Zando Bob said:

6 cores is pretty much the standard, 8 cores is for a pretty beefy rig, and for the best gaming/content creation PC you go for the 9900KS or 3900X.

If you can afford higher, you go higher. If you're the OP:

On 12/7/2019 at 5:11 PM, FutaGrlAlicePA said:

I’m trying to think of a good computer I can put together. But I wanna maximize my dollars on it.

Then you do not go up just because you can - price/performance comes into play. With a large budget and a 4K gaming/content creation workload, the 3900X with either a 2080 Super or 2080 Ti is the best combo. Possibly a 3950X or RVII swapped in; it depends on what specific software they use and what they do as far as content creation goes. For general use, the Nvidia GPUs are better and there's no need to step up to a 3950X.


3 minutes ago, Zando Bob said:

It's a stupid term and I hate it, just please don't use it. 

It's because you have a weird definition of that term it seems. 

 

You seem always to be in the here and now,  like things wouldn't ever change lol.  

 

Future proofing to me means "more than 2 years at least".

 

Here's a little story about that:

 

So I got a 2200G *last year*  because "no one needs more than 4 cores anyway"...

 

Then I got a 3600 this year because "it's good enough"...

 

And how long is *that* going to last?   I'd say 2 years maximum...  

 

If people had told me to save up for at least an 8-core instead, that would have been much closer to "good enough" imo - just like "4 cores is more than enough" was already wrong and outdated last year, seemingly because people tend not to look into the future and think that things are stagnant forever.

 

So in conclusion, one should always future-proof their hardware as well as possible and not be content with what is currently "good enough".

 

 

 

Fun read for you (perhaps) 

 

https://www.pcgamer.com/capcom-explain-why-the-monster-hunter-world-port-is-cpu-heavy/

 

excerpt :

Quote

While the MT Framework engine has been around for ages, it does a good job in distributing CPU cycles and load-balancing tasks across all available cores and threads. The engine itself is optimized for x86 CPU instruction set, is highly scalable, and loosely speaking, is platform agnostic regardless of PC or console platform so as long as it conforms to the x86 instruction set."

 

 

 

It's not exactly rocket surgery, and at this point, when a game doesn't utilize multithreading well, I'm not interested tbh. Stagnation is boring as hell. And yes, I don't play Destiny, but that's another good glimpse into the future as well, and kudos to the devs for trying. :)
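For what it's worth, the load-balancing idea that quote describes is easy to picture with a toy worker pool - a generic Python sketch of the pattern, not anything from MT Framework itself:

from multiprocessing import Pool, cpu_count

def job(n):
    # Stand-in for one engine task (an AI tick, a physics step, ...).
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [200_000 + i for i in range(64)]
    with Pool(cpu_count()) as pool:     # one worker per available core
        results = pool.map(job, tasks)  # the pool balances tasks across workers
    print(f"ran {len(results)} tasks across {cpu_count()} cores")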

 

 

 

 

And as for the OP, a TRX40 board obviously won't be necessary, but saying a 6-core would be good enough would also be a disservice.

 

So if it boils down to "don't get a TR board, get an X570 instead", that's surely sensible; it doesn't mean they need to get a measly 6-core that will most likely need replacing next year.

And as for 2070 or 2080 - it will be outdated soon enough, so you might as well just go for what's more fun, ehh.


2 minutes ago, Mark Kaine said:

It's because you have a weird definition of that term it seems. 

 

You seem always to be in the here and now,  like things wouldn't ever change lol.  

 

Future proofing to me means "more than 2 years at least".

 

Here's a little story about that:

 

So I got a 2200G *last year*  because "no one needs more than 4 cores anyway"...

 

Then I got a 3600 this year because "it's good enough"...

 

And how long is *that* going to last?   I'd say 2 years maximum...  

 

If people had told me to save up for at least an 8-core instead, that would have been much closer to "good enough" imo - just like "4 cores is more than enough" was already wrong and outdated last year, seemingly because people tend not to look into the future and think that things are stagnant forever.

 

So in conclusion, one should always future-proof their hardware as well as possible and not be content with what is currently "good enough".

That's not future proofing - my point is that that term is dumb and not really a real thing, because there's no real definition. Lasting 2 years isn't future proof, that's normal for hardware from the last 8 years for CPUs, 6 or so for GPUs. It's just buying the best you can afford right now, which is what I usually advise anyway (unless there's actually an imminent release for some piece of hardware).

I.... I don't understand how you thought 4 cores was enough last year? 2018? Ryzen has been out since 2017 or so, and games like Battlefield 1 were bullying quad cores before that; it's been well established that 6 cores is the sweet spot for gaming, 8 if you're tacking on a lot of streaming or multitasking, and more if you have the money/use case for it.

 

5 minutes ago, Mark Kaine said:

Fun read for you (perhaps) 

 

https://www.pcgamer.com/capcom-explain-why-the-monster-hunter-world-port-is-cpu-heavy/

 

excerpt :

 

It's not exactly rocket surgery, and at this point, when a game doesn't utilize multithreading well, I'm not interested tbh. Stagnation is boring as hell. And yes, I don't play Destiny, but that's another good glimpse into the future as well, and kudos to the devs for trying. :)

 

Does performance scale properly? Like actually improves noticeably with more cores? Or is it still few fast cores > many slower ones?
 

7 minutes ago, Mark Kaine said:

So if it boils down to "don't get a TR board, get an X570 instead", that's surely sensible; it doesn't mean they need to get a measly 6-core that will most likely need replacing next year.

Yep, and that's also why I advised a 3900X, which is a 12c/24t chip - literally twice the "measly" 6-core (my "measly" 6-core 5820K still does everything I ask of it just fine though, and it's a CPU from 2014).

 

8 minutes ago, Mark Kaine said:

And as for 2070 or 2080 - it will be outdated soon enough, so you might as well just go for what's more fun, ehh.

OP is gaming at 4K, so the 2080/2080 Ti is going to be more fun for sure. 


1 hour ago, Zando Bob said:

Does performance scale properly? Like actually improves noticeably with more cores? Or is it still few fast cores > many slower ones?

I think so,  huge jump over the 2200G at least (at similar frequency actually) and it's also using all 6c/12t *very evenly*  which to me seems at least equally important because what's the point of "multi threading" when it still only uses a fraction of cores properly. 

 

MT Framework is and always has been an amazing engine for sure - I also like that it's largely platform agnostic.

 

1 hour ago, Zando Bob said:

Lasting 2 years isn't future proof,

Well, I said more than 2 years at least,  more is better of course. But I agree the term is surely a bit wishy washy... 

 

1 hour ago, Zando Bob said:

that's also why I advised a 3900X,

 

1 hour ago, Zando Bob said:

OP is gaming at 4K, so the 2080/2080 Ti is going to be more fun for sure. 

Yeah, I agree for once... I just always advise keeping in mind what happens next year, so as not to have any bad surprises.

 

I mean either of those cards surely will still be good for several years; they just won't be top-notch and will likely be missing some features.


7 minutes ago, Mark Kaine said:

I think so,  huge jump over the 2200G at least (at similar frequency actually) and it's also using all 6c/12t *very evenly*  which to me seems at least equally important because what's the point of "multi threading" when it still only uses a fraction of cores properly. 

The 2200G is about even with Haswell i5s from around 2013/2014 - Zen+ has a similar IPC to Haswell, so at similar speeds it performs about the same (it's the reason my 5960X beats the 2700X I used to have, purely because it can clock higher and the IPC is similar). The R5 3600 is about equal with a stock i7 8700K, which is a beastly gaming CPU both out of the box and overclocked. Of course there's a massive difference: not only is the game able to use the cores it wants, but it's not fighting Windows and any other background processes for them. I meant more: does it scale all that well past 6-8 cores? As stated before, Destiny 2 can use up to 20 threads, but performance doesn't actually scale past 6-8 or so. A 16c CPU isn't twice as fast as an 8c, and the 8c can likely clock a lot higher, so it'll beat the 16c by a noticeable margin.
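That diminishing-returns behaviour is basically Amdahl's law - a tiny sketch, where the parallel fraction is just an assumed number for illustration:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n), with p the fraction
# of frame time that actually scales with cores (0.6 is an assumption).
def speedup(cores, p=0.6):
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (6, 8, 12, 16):
    print(f"{cores:2d} cores -> {speedup(cores):.2f}x over one core")

With only ~60% of the frame parallelizable, 16 cores gets you ~2.3x against ~2.1x for 8 - which is why clocks end up mattering more.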
 

11 minutes ago, Mark Kaine said:

Well, I said more than 2 years at least,  more is better of course. But I agree the term is surely a bit wishy washy... 

That's why I hate it lol, it doesn't really mean anything. 
 

11 minutes ago, Mark Kaine said:

I mean either of those cards surely will still be good for several years; they just won't be top-notch and will likely be missing some features.

True, but it'll be better than a 6-8 month wait. If the OP really wants the best bang for the buck and doesn't need CUDA, RTX, or NVENC, theeeen the Radeon VII can actually be an option too. They're around 1080 Ti/2080-level performance, and Newegg has them for $550 now, a good bit less than 2080s go for (when I bought mine it was $699, same as the 2080). The VRAM bandwidth helps at 4K, and they will push demanding titles at that res if you drop settings. I got Assassin's Creed Odyssey and Destiny 2 running fine at that res back when I hooked up my 4K TV to test. I also compared it to my 1080 Ti (which I've since traded away because I didn't like having such a nice GPU just sitting around with no one to use it); there wasn't any noticeable difference in the gameplay experience.

Nvidia GPUs just tend to win out due to the extra features (Radeon currently has RIS which is better than DLSS, and better OpenGL performance but that's only useful in a few specific workloads). If fps/$ is your prime concern though, that's where AMD's GPUs come into their own. 


3 hours ago, Zando Bob said:

Future proofing isn't a thing, really. CPUs on average just last a long time, so long as you get at least 6c/12t you'll be fine for a good long while (ones from 2013 are still competitive with current CPUs). Can't see game devs building games for threadripper given that a single CPU costs more than the average gaming PC, meaning they'd have an extremely, extremely small market. There's no really good reason to get a 3950X over a 3900X for gaming, and only get the 3900X over the 3700X because it's a slightly better bin, oooor if you want the absolute best gaming CPU then you get a 9900KS. OP said they're gaming and doing content creation, so the 3900X would be the best fit. Doesn't help that the 3950X is hard to find and quite expensive, nearing HEDT pricing but without the PCIe lanes, quad channel RAM, and other features HEDT brings. Plus you can just drop a used 3950X in down the line if somehow games magically require a 16 core to run (again, noooot really something that will happen). 

 

Yes. No. Depends on budget and what games you play. If you're good with tweaking settings, a 2080 Super is fine. If you want to run higher settings, or get higher fps at the same settings, then the 2080 Ti is the best bet. If you want the card that gives you the best chance of hitting 60fps at 4K without compromising settings much, then the 2080 Ti is the best card. It is the best card because it literally is, lol. Only card higher is the Titan RTX, which is twice the price for 3% or so more performance (aka not worth it at all).

If you don't wanna shell out for the 2080 Ti, you take a settings hit, and 4K runs fine on a 1080 Ti, RVII, 2080, or 2080 Super. 2070 Super as well if you're willing to drop settings a bit further. All depends on personal preference for how you want your games to run. 

For content creation, if you do anything on the GPU, the 2080 Ti will be the fastest there too, while still giving you very good performance in games (the RVII can beat it in a few specific workloads, but is a good bit worse in games). 

 

No. The 1080 Ti was the best card in 2017 (2 years ago) and it still hurts Turing now: faster than the 2070 Super, about on par with a 2080 or 2080 Super. The Radeon VII was AMD's fastest card and still beats the current 5700 XT. The 980 Ti was Nvidia's fastest card before Pascal; it still competes with a 1070 or 1660 Ti/2060, and it's 4 years old now - apparently good overclockers can push one past a 1080 (which is almost 2070-level performance). The 780 Ti was the fastest before that; it can still compete with a 1060/1650 Super or so, and it's 6 years old now.

 

Flagship cards take a very, very long time to become generally "slow". And everything back to the 780 Ti supports DX12, meaning you can play all the modern titles. If lacking RTX and NVENC (although Pascal had that, and I think Kepler may have an alternative) makes them outdated, then the 5700 XT that released this year from AMD must be outdated too?

Tech doesn't advance at the rate it used to - you're not seeing exponential gains anymore, and flagship cards have held up better and better the past few years.

 

Worst time to buy is 2 months or so before a release. There's always something coming 6-8 months out, sometimes up to 12 or 18 if it's on a slower cycle. But there is always something "on the horizon", if you're not reasonably close to a release then just buy the best you can afford now. Again, it's not like it used to be where new releases basically made the old stuff obsolete. 

 

So you'd advise someone to spend hundreds more on a 3950X, or even a TR chip, that they do not need, but call the best GPU for their workload basically a "I have money so hah" purchase? One thing has no practical impact on their workload (from what we know of it), the other is a massive help given they are trying to game at 4K. 

 

No. If you only, only game, then yes. But it lacks RTX (both for gaming and tensor cores for some compute tasks), NVENC (useful for recording/streaming with minimal performance impact, something a content creator would very likely do), and CUDA acceleration (something that's used as a standard across many pro/prosumer apps). 

The 5700 XT is only ever a better value if you look purely at gaming fps vs cost, disregarding any features the competition offers. There is one thing the 5700 XT can do better, and that's RIS if you want to upscale: it offers higher visual quality and lower performance impact than DLSS in most games. Also, if you use OpenGL in apps instead of CUDA, AMD cards typically do very, very well.

Just saying, the VII doesn't beat the 5700 XT.

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 auros pro 

psu: cooler master mwe 650w

case: masterbox mbx520

fans:Noctua industrial 3000rpm x6

 

 


4 minutes ago, scuff gang said:

Just saying, the VII doesn't beat the 5700 XT.

Source? From what I'm seeing the gap isn't as big as I thought it was, but it is there, and it is measurably faster than the 5700 XT. Most RVIIs can do 1950-2000MHz as well (often at voltages lower than stock, making for stabler, higher clocks due to better thermals), meaning they'll be faster than the stock benches (I've gotten mine up to 2050MHz so far without powerplay tables or any other advanced stuff, but mine is on a waterblock). It also has more VRAM for any VRAM-hungry titles. At the price premium ($150 given current pricing) it may not be worth it though, that much is true.


3 minutes ago, Zando Bob said:

Source? From what I'm seeing the gap isn't as big as I thought it was, but it is there, and it is measurably faster than the 5700 XT. Most RVIIs can do 1950-2000MHz as well (often at voltages lower than stock, making for stabler, higher clocks due to better thermals), meaning they'll be faster than the stock benches (I've gotten mine up to 2050MHz so far without powerplay tables or any other advanced stuff, but mine is on a waterblock). It also has more VRAM for any VRAM-hungry titles. At the price premium ($150 given current pricing) it may not be worth it though, that much is true.

Yeah, you get better n-body calculation and like a little more fps (~7%), but if you're looking for details in your graphics (lighting, shading, reflections), the RX 5700 XT is the better way to go.

So I guess it goes either way - I wouldn't want to pay 2x as much for a little more VRAM and n-body calc.


1 minute ago, scuff gang said:

Yeah, you get better n-body calculation and like a little more fps (~7%), but if you're looking for details in your graphics (lighting, shading, reflections), the RX 5700 XT is the better way to go.

Going off of UserBenchmark? From what I've seen they're pretty horrible on the CPU side (they literally changed their entire CPU scoring system to rank Intel higher than they should be), and eeeeh on the GPU side. Good for ballpark numbers but not specifics. You're better off looking at actual benchmarks from good sites and youtubers.
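As an aside, decent reviewers aggregate with a geometric mean across games, so one outlier title can't swing the average - a quick sketch with made-up fps numbers:

from statistics import geometric_mean  # Python 3.8+

# Per-game fps lists are made up purely for illustration.
fps_5700xt = [88, 64, 71, 103]
fps_rvii = [92, 70, 69, 99]

print("5700 XT geomean:", round(geometric_mean(fps_5700xt), 1))
print("RVII geomean:", round(geometric_mean(fps_rvii), 1))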


1 hour ago, Zando Bob said:

Going off of UserBenchmark? From what I've seen they're pretty horrible on the CPU side (they literally changed their entire CPU scoring system to rank Intel higher than they should be), and eeeeh on the GPU side. Good for ballpark numbers but not specifics. You're better off looking at actual benchmarks from good sites and youtubers.

Haven't looked at their CPU side, but I find the GPU side has a lot of things that other sites don't. Would be nice to see how they're conducting their tests; I'll look at other sites next time.

