Planning the new Ryzen 7000 upgrade!

An important note: this build is all very tricky because some prices, specs, etc., are not yet known for these items, since they haven't fully released. I would love to use PCPartPicker, but most of this isn't on there yet. The goal of this post is just to figure out whether everything sounds like it's going to work well and to get as many questions answered now, rather than trying to plan at the last minute (or waiting for each component to come out) and finding everything out of stock by the time I figure the whole build out. My biggest concern is the struggle to stay on top of all the news coming out about these items and what they need. Also, I type fast and detailed, so this will be long af. Details keep the expensive mistakes away!

 

Budget: Overall, not wanting to pass $4k USD. The goal is to get the parts to go fast, the parts needed to make those parts work well, and nothing more. Not spending $$$ on RGB or other non-necessities. I don't overclock.

 

Country: USA

 

Use case: Blender! Cycles animations (OptiX; GPU-bound), physics simulations (often smoke sims; CPU-bound), and I am having a lot of fun with large-scale environments, often involving volumetrics (clouds are fun) and things like large oceans and complex lighting (GPU-bound). I do game occasionally, but it's probably 1/10th of the time I spend in Blender, and simply put, I don't have any concerns about this system being able to run really any game.

 

Monitors: Puke warning: currently running two 1080p 60Hz monitors from work as well as a probably 8-year-old LG 34-inch, 2560x1080, 60Hz monitor. I plan to get the Samsung QD-OLED G8QNB when it comes out, I believe around the end of the year, assuming the news stays good. I know it might suffer burn-in, and it will likely run around 1,300$ (part of the overall budget), but I want to see the stuff I create in the best way possible.

 

Some history, for anyone who cares, explaining why I am upgrading...

I built my second-to-last PC in 2014, right when DDR4 came out: an i7-5820K, 16GB of DDR4, an R9 295X2 (later swapped for a 1070), and a 1200W EVGA PSU. I held onto that thing until January of 2022 because I am one of those "oh, that just got announced and will be even faster than the thing that just came out? I guess I should wait a little longer..." type folks. Well, after getting deeply back into Blender in late 2021, I quickly ran into issues. Smoke simulations on a 2014 CPU, and rendering said volumetric animations with a 1070 in the Cycles render engine, quickly proved painful, if not simply impossible.

 

By January I got fed up. I knew Ryzen 7000 was coming, but there were deals to be had on the current processors, and they were actually in stock, which was something I had very little confidence about regarding Ryzen 7000 and DDR5 given the horrible shortages plaguing the PC world for about two years at that point. I also had a co-worker wanting to get rid of a nice 2080 for free, so it seemed like the time to move. So, I finally got over my "just keep waiting" problem and got a 5950X, 32 GB of much faster DDR4, and that 2080. I also got new storage all around, mainly NVMe. I ran that 2080 and 1070 together, as Blender can take full advantage of both, and simply put, I was blown away by the immediate ROI. I should not have waited that long! I literally had no good reason to do so. I just happened to be at a Best Buy a few months later as they pulled a single 3080 Ti FTW3 off the truck, which I snatched, and was immediately happy with as it chunked through frames about 50% faster than my dual-GPU combo could. I was sad to find, though, that my 2080 wouldn't fit at the same time due to a required PCIe-to-NVMe adapter, since the motherboard only has two onboard M.2 slots.

 

While my current 5950X, 32GB of RAM, and 3080 Ti are obviously fast, I guess I caught the bug and now I want to go even faster. The 7950X looks like it will be able to take a massive chunk of time out of my simulations, which is awesome. And the new motherboards seem to allow me to return to my dual-GPU setup; more on that later. Not to mention that I now have a nearly 8-year-old PSU that was decently tortured by that 295X2 for a while, a monitor that needs replacement, an AIO that's still functioning but getting up there in years, and a case with dying fans that requires me to leave the glass off the side to keep the current components breathing.

 

For those who may be upset about my 5950X or other parts going to waste: worry not! Everything I am recycling I have spare parts for (storage and GPU), such that my existing system can let me game while the other renders, or serve as a second over-the-network render station. If neither of those works out, I am sure one of my artist friends will happily continue to give it life.

 

 

The new build:

 

CPU - Ryzen 9 7950X - 700$

No concerns honestly.

 

Motherboard - Asus ROG STRIX X670E-E GAMING WIFI - 700$ estimate

Concerns: Guessing around 700$ (hopefully an overestimate) based on the only reports I could find.

Edit: I was going to say Asus's website (Edgeup.asus.com) is not yet live; however, the Wayback Machine has captures of the articles. I don't know if it's just down or what. In any case, these links will show you the articles.

While I see few specifics for my board, these articles seem to suggest ALL the new motherboards have at least three or four onboard M.2 slots, so my problem of needing a PCIe-to-M.2 adapter, which blocks me from putting in a second 3-slot graphics card, is fixed!

I also get a decent view of the power connectors and don't see any concerns.

 

https://web.archive.org/web/20220831164041/https://edgeup.asus.com/2022/three-new-x670e-motherboards-break-cover-from-rog-rog-strix-and-tuf-gaming/

https://web.archive.org/web/20220826183949/https://edgeup.asus.com/2022/new-x670e-motherboards-arrive-from-rog-proart-and-asus-prime/


PSU - EVGA SuperNOVA 1600 P+, 80+ Platinum, 1600W - currently 200$ on Amazon at 60% off??

I still trust EVGA as a brand for this job. Probably overkill, but keep in mind three things:

1. I like longevity and will hold onto this one for another 8 years, so I won't have to worry about how hungry future parts are as we seem to move into an ever more power-hungry world.

2. A Ryzen 9 7950X + a 3080 Ti + a 2080 will be power hungry.

3. If the 40-series power consumption rumors are anywhere near true and I indeed wind up with a 4080/4090 running alongside my 3080, that too will be very hungry.

And keep in mind I will sometimes floor all my GPUs and my CPU at the same time in Blender.
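For a ballpark sanity check, here's a minimal sketch of that worst-case math. Every wattage is my assumption pulled from public board-power specs (the 7950X's PPT, the cards' TGPs), not a measurement:

```python
# Rough worst-case sustained-draw estimate; every wattage here is an
# assumption taken from public board-power specs, not a measurement.
parts = {
    "Ryzen 9 7950X (PPT)": 230,       # AMD's package power limit
    "RTX 3080 Ti (TGP)": 350,
    "RTX 2080 (TGP)": 215,
    "Board, RAM, drives, fans": 100,  # generous catch-all guess
}

total = sum(parts.values())
print(f"Estimated sustained draw: ~{total} W")                      # ~895 W
print(f"With ~50% headroom for transients: ~{int(total * 1.5)} W")  # ~1342 W
```

On those assumptions, 1600W stops looking so crazy once GPU transient spikes are factored in.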

 

That said, I don't get EVGA's PSU naming and pricing. Why is this 60% off? It has been for months, it seems, according to PCPartPicker. Is this an old product/model, or is there something becoming outdated about it? Or is this just a markup-then-sale-to-look-better situation? Also, the whole P, P+, P2, P3 thing: I tried to look into this and it seemed to correlate to product age and/or size, where smaller PSUs of equal wattage obviously cost more. I am just trying to make sure this is indeed a good product with a long, compatible lifespan ahead of it. I am not trying to skimp on the angry box that can kill everything in my computer if it wants to, but I also don't want overkill to carry really unneeded cost. If I should really get a 1200W P69+ model for 300$, then so be it.

 

 

RAM -  G.Skill Trident Z5 RGB Series 32GB (2 x 16GB) 288-Pin SDRAM DDR5 6400 (PC5-51200) - 309$

No idea what RAM speeds are good for Ryzen 7000 yet. I surprisingly still don't bump up against my RAM limit, so I will be doing another 32 GB. Doing 2x16 to leave room for an eventual 64GB though. I would love to not pay for RGB, but for some reason non-RGB costs more at the moment?? That is a trend in other parts too, for reasons I don't understand.

 

GPU - For now, recycling my 3080 Ti

But!!! I want to be able to put my 2080 in alongside my 3080. I currently can't do that because the Asus X570 Gaming II has two onboard M.2 slots, which required me to get a PCIe-to-M.2 adapter. That blocks off the third slot my 3-slot 2080 would need. As mentioned under the motherboard, it seems the X670 series from Asus has 3-4 M.2 slots on each board, so I should be all set to dual-GPU again for Blender, as that third slot won't be blocked off.

 

I also want to know that, if I want, I will have the space/compatibility to upgrade to a 4080 or 4090 down the road (GPU RAM is an issue) and run it alongside my 3080. I will assume a 4080 will also block three slots, so if I can do a 2080, I should be able to do a 4080.

 

Storage - recycling it all.

8 TB 3.5-inch drive
4 TB 3.5-inch drive

128 GB SATA SSD - old; I just throw a few small games on it.
1 TB Samsung NVMe drive for the OS
2x 1TB Samsung NVMe drives for art (software RAID; backups go to the 8 TB)

Again, I just need a motherboard with at least three onboard M.2 slots, which seems to be covered.

 

Monitor - Samsung QD-OLED G8QNB - 1,300$ estimate
As mentioned, this isn't out yet, nor is the price 100% known, but estimates can be made and I have every intention of getting it. While burn-in is concerning, I think this will be a stunning monitor, and I honestly can't wait to see what my art looks like on it.

 

The cost so far: ~3,200.
So I have a good bit of room for the case and such, but again, I don't want to spend money I don't have to. I just want to make sure the parts can perform. Any unnecessary money I spend just takes away my ability to get a 40-series card.

 

 

Here is where I really can't decide.

 

The NZXT build... because I have had NZXT cases since 2012. I am used to having NZXT CAM installed, and being able to control all of this through it would be nice.
NZXT H7 Flow - 130$
NZXT Kraken Z73 (the LCD display edition, non-RGB fans) - 254$
NZXT F120 RGB Triple Pack - 90$ - either this or three Noctua fans. As much as I roll my eyes at RGB, three Noctua fans would cost about the same... so having some fun RGB for the same price that I can control via the same software seems like an okay trade. I know the NZXT fans are not as great.

Cost: 474

Concerns: my current NZXT case is a closed box that chokes my components. While I know the Flow model is better, I worry about the GPUs. I really don't want to wind up pulling off the side panel again to keep the air-cooled GPUs fed. The GPUs are going to be stacked very close; it would be great if they had better airflow, given the bottom card will be pressed up against the PSU shroud and the top card pressed against the second GPU. I would do a top-mounted rad so the front can pull in just fresh air to feed the GPUs, and I think it would probably be okay? That said, I worry about the GPUs' exhaust going up into the CPU rad and causing issues, or most of the front air getting needlessly blown out the top rad immediately and the GPUs thus toasting in their own circulation.

 

I do love how easy it is to put their rads into their cases though...

 

 

And because of that airflow concern... the maximum-air build.
Fractal Design Torrent RGB Black E-ATX - 258$ (non-RGB costs more at the moment for some reason; great flow for GPUs; no extra fans needed)
Noctua NH-D15 - 109$

Cost: 367

So my concerns here are interesting. The case RGB I think I can just hook up to a motherboard header and control through the Asus software. That's kind of a positive, because it means I don't need two pieces of software to control the case versus the board. The Noctua though... I have gone with AIOs for so long it's weird to go back. I could put the same Kraken in this case by mounting it to the front, and I don't think that would horribly impact the GPUs since they would still have the bottom intakes, but that would send the cost up to about 500$, making it the most expensive option. Then I would need more software, I would have to remove the two large front fans and pretty much waste them, and I know the nice fans this case comes with are a large chunk of the cost here.

 

So all in all, at worst I would be around 3,700. That does not leave any room for my 40-series hopes, sadly, but I am sure my 2080 and 3080 will still be a good duo.

 

 

 

Let me know what everyone thinks! Stock availability for any of these components is a big question for me, and could ultimately kill the build. So far the CPU/MB/RAM stock numbers (what I worry most about) are sounding promising.

 


There are apparently problems with slow initial boot on some X670 motherboards; it is not known yet if that is a general issue or not. While the 2080 does have an SLI connector, the 3080 does not. I don't know how you're running both at the same time on the same app. It would require multiple desktops at least; you could assign one to a VM, I suppose. If you water-cooled the CPU, why not also the GPUs though? It takes care of your airflow problem. This does not seem like the cheapest way to do this.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


AMD is really pushing Intel; apparently Raptor Lake CPUs will be announced this month and ship next.

 

Have you considered an Asus ProArt Creator motherboard?

 

@Bombastinator's suggestion to use AIO GPUs going forward would resolve the cooling concerns. It does complicate case selection a bit.

 

You may want to start with 64GB of memory known to be compatible with the motherboard. DDR5 is relatively new and there are bound to be teething problems, as there are with Alder Lake. I would not count on mixing memory kits for a while.

 

80+ ratings certify electrical efficiency. Not quality.

 


39 minutes ago, Bombastinator said:

There are apparently problems with slow initial boot on some X670 motherboards... While the 2080 does have an SLI connector, the 3080 does not. I don't know how you're running both at the same time on the same app... If you water-cooled the CPU, why not also the GPUs though?

20 minutes ago, brob said:

Have you considered an Asus ProArt Creator motherboard? ... You may want to start with 64GB of memory known to be compatible with the motherboard.

 

Interesting on the slow boot thing... although I imagine that will be resolved via some combination of updates. I fully expect early-adopter woes, but I will push through.

 

This SLI concern is going to take some explaining though...

 

TLDR: Blender does not need/want SLI; it can take full advantage of both cards' entire power with the click of a box. It's the only app I know of that can do something like this, and it doesn't come with the crazy loss of performance per dollar that SLI has. It just eats the power of both cards as much as they will allow.

 

You do need something like SLI to use multiple cards in games, and I believe they must always be the exact same card, to my limited knowledge. I believe the reason is that you need to make two cards look like one to the game's engine. I have no interest in using the two-card setup for games though. For games, the machine will just default to the first GPU, which I will make sure is the more powerful one. This all worked rather flawlessly when I had the 2080 + 1070; when in game, only the 2080 was doing anything.

 

You don't need, or to my knowledge even want, SLI for Blender; hence I was able to run a 2080 and a 1070 (two very different cards) at the same time in Blender. Blender renders a frame in "tiles." Basically, it divvies up the frame into multiple small chunks. So it might divide a 4K picture into four sections that it will then do one at a time. In older versions of Blender the tiles were much smaller; a 4K picture might consist of several hundred. So if you have ever seen a Blender render benchmark, you are probably used to seeing the little snake of tiles that roams around the screen filling in the blanks until the image is complete - those are the tiles.

 

For a CPU render it might hand one tile to each core. In the case of a GPU, the GPU as a whole typically does one tile at a time but is much, much, MUCH faster. If you have more than one GPU, it uses one GPU for one tile and the other GPU for another tile at the same time. Blender can see and utilize them as two separate devices. All you have to do to make this happen is plug in two cards, go into the preferences, and make sure both are checked for the render.

You can even have it rendering with two GPUs and all your CPU cores at the same time if you want, although this is a bad idea. Unless you have a really low-end GPU and a really high-end CPU, odds are enabling the CPU as well will just slow down the overall time, as the GPU is likely much faster at this task than the CPU. The problem is that the GPU will complete the entire frame, but the CPU will still be assigned and working on several tiles. For the CPU to complete those will take 10x as long as it would have taken the GPU; meanwhile the GPU just sits idle waiting for tiles to be assigned, which won't happen until Blender starts the next frame, which doesn't happen until the current frame is done. The same problem happens if you have one really fast and one really slow GPU, so don't mash a 3080 and a 670 together; the 3080 will clean up the whole frame while the 670 tries desperately to get through the one or two tiles it got assigned.
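For anyone curious what "checking the boxes" looks like scripted, here's a minimal sketch using Blender's Python API, run from Blender's Python console (it assumes an OptiX-capable driver; everything it does is also reachable through the Preferences UI):

```python
# Minimal sketch: enable every GPU (and disable the CPU) for Cycles rendering.
# This mirrors the checkboxes under Preferences > System > Cycles Render Devices.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'OPTIX'  # use 'CUDA' instead for older cards like a 1070
prefs.get_devices()                  # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type != 'CPU')    # GPUs on, CPU off, per the reasoning above
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

bpy.context.scene.cycles.device = 'GPU'  # render the active scene on the GPU(s)
```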

 

As far as water-cooling the GPU(s): if I water-cooled my CPU, it would just be a simple AIO. I have no experience with installing custom waterblocks onto GPUs, building loops, etc. Nor do I think that would be the best idea given these cards may be moving between systems and are not exactly the same cards or dimensions. Your comment seems to suggest that this would be cheaper, which I have a hard time believing. It would mean getting the blocks, tubing, a pump, fittings, radiators, etc., in addition to a case that would likely need to be a more expensive flavor to house all that.

 

For what it's worth, and while I genuinely can't explain why, rendering in Blender is not as "heat producing" as just playing a high-end game. I think this is because Blender really hammers only the "copy" and memory components of the card (according to Task Manager) while leaving everything else alone, whereas games seem to hammer the card more evenly across the 3D and encode parts as well. My best guess is that, for some reason, 3D rendering only uses part of the GPU where gaming uses more. Blender still maxes those components out, so it is still GPU-bound, but it doesn't use "all" of the GPU and thus doesn't produce as much heat. Again, I don't fully understand it, but I know this room stays cooler during Blender renders than during a basic AAA title, and the GPU fans hardly spin up as much.

 

Along the same lines, my 2080 had no issues staying cool in game despite being mashed up against a 1070. The fact that the 1070 was doing nothing certainly helps.

 

I should have also mentioned: Windows 11.
I have no issues with upgrading if anyone thinks that would be beneficial. I know it's largely beneficial to Intel's big/little scheme, which I don't think is any concern here, but I am not opposed to the jump; building a new PC is the time to do it, IMO. I work with Win11 as part of my job, so I am fairly used to it at this point, and I have access to several systems running it that will let me double-check that all my art software doesn't puke on it for some weird reason.


36 minutes ago, brob said:

Have you considered an Asus ProArt Creator motherboard? ... You may want to start with 64GB of memory known to be compatible with the motherboard.

 

Forgot to reply to the ProArt board / RAM

 

I remember looking at the ProArt board variant when I got my X570, but all I really recall is that it cost a lot more and came with features I didn't need. The info Asus has out on the new boards is still relatively thin, but I have a feeling it will be much the same case this time around. I will certainly keep an eye on its price and features though.

 

As far as the RAM, I have yet to ever hit my 32GB limit, so it's hard to justify an extra 300$ right out of the gate. Likely by the time I need 64GB, I will either buy another 2x16 kit that matches my current one, or hopefully the price will have come down and I can buy a full 64GB of faster RAM at a much cheaper price per GB.


19 minutes ago, BluKobold said:

As far as the RAM, I have yet to ever hit my 32GB limit, so it's hard to justify an extra 300$ right out of the gate...

 

Until Asus posts the motherboard specs we won't know for sure, but on the memory QVL pages of its DDR5 Alder Lake motherboards there is a specific caution against using multiple kits, even of the same part number.

80+ ratings certify electrical efficiency. Not quality.

 


1 hour ago, BluKobold said:

As far as the RAM, I have yet to ever hit my 32GB limit, so it's hard to justify an extra 300$ right out of the gate... hopefully the price will have come down and I can buy a full 64GB of faster RAM at a much cheaper price per GB.

If all you do is game on the thing, you probably never will either. The problem is RAM only comes in specific and fairly large chunks; that's why 16GB is suggested instead of 8GB. You don't need 16GB to game and run whatever you run in the background without hitting swap, but more than 8 is common. I've never heard of games needing more than 16 yet. The only time I suggest more than 16 is if the user wants to run VMs or some art programs.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


47 minutes ago, Bombastinator said:

If all you do is game on the thing, you probably never will either... The only time I suggest more than 16 is if the user wants to run VMs or some art programs.

 

"If all you do is game on the thing you probably never will either."

 

I think there is some confusion here. Respectfully, I realize that I did go on the internet to ask for help, and I appreciate that you are trying to assist me in that request. I also realize that people over-prescribing parts, such as wanting a 3090 Ti to play Roblox on a 1080p monitor, is probably a common problem on forums like this. But I have made it pretty clear that's not my case. As I said in detail in my original post under the header "Use case:", not to mention the long explanation I happily provided to your comments earlier, my use case for this machine is a lot of Blender. As I mentioned, I hardly game at all; I just don't find it a satisfying use of my time.

 

I would be a little shocked to find out that folks on this forum don't know what Blender is, since it's mentioned or used in almost every LTT benchmark-related video, along with plenty of other channels like Gamers Nexus. What you just said, however, gives me the impression you are not familiar with Blender, as it is easily one of the most applicable "art programs" for which you would want more than 16 GB of RAM.

 

https://www.blender.org/download/requirements/

32 GB is what they themselves recommend. 

 

To give you some background for future reference: it's 3D animation software, and in my case I work on the heavier ray-traced side and also involve physics simulations frequently. Just loading up and initiating a render of a single frame of my last animation project requires Blender to eat over 16 GB of RAM. If you run out of physical RAM with Blender, regardless of your swap/page file, it's gonna crash and burn. That's part of why my old build, which only had 16GB, became "impossible" to use. The amount of RAM needed is only going to go up as my processor gets faster and allows me to build even more elaborate physics simulations, or as my scenes, which now involve thunderstorms and oceans, become even bigger and more complex. Breaking 32GB is merely an eventuality, but one I think is far enough away that I will save money by waiting, as I mentioned. That will allow the cost per GB of DDR5 to come down, the bugs to get worked out, and the speed for the same price to go up.
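Since the failure mode is a hard crash when physical RAM runs out, I keep an eye on headroom during heavy sims. Here's a minimal sketch of that kind of watcher, assuming the third-party psutil package (the 4 GiB threshold is an arbitrary choice of mine):

```python
# Minimal RAM-headroom watcher: warns before a sim/render gets close to
# exhausting physical memory. Requires the third-party psutil package
# (pip install psutil); the 4 GiB threshold is an arbitrary assumption.
import time
import psutil

THRESHOLD_GB = 4  # warn when less than this much physical RAM remains available

while True:
    mem = psutil.virtual_memory()
    available_gb = mem.available / 2**30
    if available_gb < THRESHOLD_GB:
        print(f"Warning: only {available_gb:.1f} GiB available ({mem.percent}% in use)")
    time.sleep(5)  # poll every few seconds; cheap enough to leave running
```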

 

Realistically though, what I am building isn't all that special. It can just be treated as a very powerful PC build, which I would think happens around here often. I just happen to have a use case that will really justify and make use of the full thing, which is perhaps something that doesn't happen as often around here.

 

Hopefully this clears things up.

 

 

And I found this for anyone curious about Blender and multiple cards... It's a thing.

 


19 minutes ago, BluKobold said:

I think there is some confusion here... my use case for this machine is a lot of Blender... it is easily one of the most applicable "art programs" for which you would want more than 16 GB of RAM... Breaking 32GB is merely an eventuality, but one I think is far enough away that I will save money by waiting.

 

So art programs. Blender would be one; the Adobe suite could be another, depending on what you do in it. Video editing falls under that as well. Given the incredible length (it's actually one of the longest posts I've ever seen, including ones with pictures in them) I didn't read all of it. I did read some though, and that bit was near the top. Maybe I just didn't read far enough. Given that you like to do large landscapes, I'm very surprised that you don't max 32GB. I might actually consider a Xeon or a Threadripper for that one, just for the additional memory. If you don't, you don't though. Your work is your work. The standard way to check that is by seeing how often you go to swap.
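For what it's worth, one way to see "how often you go to swap" programmatically; a minimal sketch, again assuming the third-party psutil package:

```python
# Minimal sketch: report swap usage and cumulative swap traffic since boot.
# Requires the third-party psutil package. Note: psutil documents the
# sin/sout counters as always 0 on Windows, so there you'd watch commit
# charge in Task Manager instead.
import psutil

swap = psutil.swap_memory()
print(f"Swap in use: {swap.used / 2**30:.1f} GiB ({swap.percent}%)")
print(f"Swapped in since boot:  {swap.sin / 2**30:.1f} GiB")
print(f"Swapped out since boot: {swap.sout / 2**30:.1f} GiB")
```

Steadily growing sin/sout numbers during a normal work session are the usual sign that more physical RAM would actually get used.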


Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

