Everything posted by just_bob

  1. I am aware that the Flare X is the superior kit compared to the four-year-old Ripjaws 4 - I'm just having trouble imagining that the performance gap comes anywhere close to justifying paying 170% of the Ripjaws price, so I'm wondering whether, and what, I'm missing here. It's not my call; the boss wants to stick to these kits, and to be fair I endorse his approach as well. Also, we have another rig that can take advantage of the current Flare X kit, so we're in the market for a new one.
  2. At work we've had a 2950X running on an X399 Aorus Xtreme with that sweet 2933MHz CL14 64GB Flare X kit. The boss decided it's time to upgrade to a 128GB kit. At first the 128GB Flare X kit (1700€) seemed like the obvious choice, but he stumbled upon a good deal on a 128GB Ripjaws 4 kit, 2800MHz CL15-15-15-35, for just 999€. Both kits are on the mobo's QVL. Sure, the Flare X is better and there's that AMD-optimized sticker, but this kind of price difference raises some suspicions, to say the least. Are we missing something obvious? (See the quick per-GB math at the bottom of this page.)
  3. I'm putting together a 2950X on an MSI MEG rig as well, and I'm just as perplexed about the RAM choice as you are. I'm looking for a 64GB kit across 4 DIMMs for starters and hoping to run it at 2933MHz or better. Just like you, I've been looking at G.Skill's Trident Z series, but I couldn't find them on the MEG's QVL, so I decided to play it safe and forget about them. Corsair looks promising until you actually try to find those kits for sale. What confuses me even more is that there are 8x16GB kits listed in that QVL, but the last column implies they are compatible in the 2-DIMM and 4-DIMM configurations, yet not with all 8 DIMMs populated. Please let me know if you find what you're looking for; it would mean a lot.
  4. While I don't intend to overclock, I wouldn't want to close off my options. At some point in the distant future there's a chance this rig gets a 2990WX and becomes a pure render slave as well. Looking at the QVL (https://www.msi.com/Motherboard/support/MEG-X399-CREATION#support-mem-16), I'm having trouble determining which of those sticks are actually B-die. Under "Chipset" some just say Samsung, while others have a code appended afterwards. I was hoping to find a 4x16GB B-die kit from G.Skill there, but it's nowhere to be seen. Thanks for clearing that up. Thanks for the tip. I don't see the point of an AIO as long as air does just fine. The PCPartPicker list I mocked up (https://pcpartpicker.com/list/TrC4P3) reports well under 1000W. Or am I not looking at the max stock draw? It's not my intention to OC, but I'd like to keep the PSU around 50% load, where efficiency peaks.
  5. 1. Budget & Location EU, Croatia, ~2400 euros 2. Aim Houdini simulations and rendering I'll be transfering my GTX 1070, HDDs and SSD from my current build. So I'm buying CPU, RAM, mobo, cooler, PSU and a new case. I don't intend on overclocking the cpu. CPU: 2950X RAM: need 64GB for now, but will upgrade to 128GB at some point in future. Is B-die RAM still crucial for ryzen? mobo: I like MEG creation due to it's features, but damn that thing is hideous cooler: I would love to get one of those Wraith rippers, but can't seem to find them anywhere. I guess a Noctua would do. Are the black fans available yet? Also are Noctua's heatsink covers available for TR4 socket coolers? PSU: If my math is right - I'm looking at 600W consumption after taking future 128GB RAM, GPU, SSDs and HDDs into consideration, so I'm looking for a good 1200W PSU case: Airflow is priority, but looks matter. I kinda like H500P mesh, but can it fit wraith ripper if I opt for one? Thanks
  6. Contrary to popular belief, 3D software does gulp RAM. 16GB is going to be enough if your goal is to learn and/or finish some smaller scenes. I don't do Blender, but a buddy of mine does, and he is struggling to squeeze out an architectural render of a house with 16GB of RAM. While Blender's Cycles does allow GPU compute, the GPU also needs enough VRAM for the task, and gaming cards don't really provide more than 8GB. It also takes quite a bit of practice to reach the point where your scenes are memory-efficient, no matter which 3D software you use. I own a GTX 1070 with 8GB of VRAM and I'm not even thinking about using the GPU for my renders unless it's some really minor stuff I need ASAP and know 8GB is enough for. If you are learning, 16GB should be enough to build up a portfolio that gets you into the industry, where your company will provide you with a decent PC. However, if you intend to make a living off of the 3D work you do on that PC, I would recommend at least 32GB of RAM. From my point of view, the whole Intel vs AMD question right now is about RAM. Ryzen might seem a bit cheaper, but Ryzen performance is heavily dependent on RAM speed. The more DIMM slots you populate, the lower the supported RAM speed, and you'll need to buy some of the most expensive RAM to get it back to adequate levels. That being said, the more RAM you intend to shove into your PC, the more you should lean towards Intel. In the end you might find yourself in a position where AMD is the more expensive option of the two. Edit: on the core-count side of things, CPU rendering is completely multithreaded and you can't go wrong with more cores there, but that's rarely the case for simulations and all the operations that take place before the rendering itself. I would personally recommend a 6 or 8 core CPU; anything above that is pretty much for render slaves.
  7. Just wanted to say Noctua showed off their black fans at this year's Computex. Also, you might want to consider waiting until September/October for the 9th gen CPUs to roll out. Even if you don't end up wanting one of them, I would expect 8th gen to get considerably cheaper.
  8. Sure, but the whole point of getting CL14 3200MHz is to guarantee you get that B-die RAM which Ryzen craves for overclocking. But as Sernefarian pointed out in his link, there are some slower B-die sticks as well.
  9. I haven't really tried those CL14 kits myself, but I do intend to get one in the near future (despite the fact I'll be going Intel). Any chance you have a link with more info on this?
  10. 16GB of RAM is going to serve you fine. 32GB is where you'd want to be for 3D software if you intend to add particles/grooming/grass/RBD destruction/etc. to your scenes. Rendering is completely multithreaded (but then again, Blender's Cycles can utilize GPU rendering, and you're going to find it a lot faster on your 1070 than on any CPU), but the same cannot be said for other parts of the process. Considering how sensitive Ryzen builds are to RAM speed, the more DIMM slots you intend to populate, the more attractive Intel becomes, since their CPUs don't really care which RAM you're getting. The whole Intel vs AMD question is about how much RAM you need. I would advise getting CL14 3200MHz RAM if you want to go Ryzen.
  11. Don't underestimate Houdini's hunger for memory. 32GB will definitely serve you for a while; you might even do a few jobs with that much. But I'm sure you're aware you're going bleeding edge with your PC, while most Houdini users will say 32GB of RAM is the minimum and 64GB the recommended spec these days. Don't take my word for it - check out the SideFX forums. At some point I was stuck with 16GB of RAM for a while, and I didn't mind the 60+ minute render times on my FX-8350, but I did mind the 4-hour simulation times as soon as it had to use the SSD for swap. Yes, that's 6 frames per day, and we haven't even started rendering yet (see the throughput math at the bottom of this page). Don't take these numbers too seriously, it was a crappy PC; I'm just trying to make a point. There are a lot more variables in this equation, but in my experience simulations eat up a lot more RAM than rendering does, yet are faster to compute (provided they have enough memory). My point is: 64GB is a higher priority than the best CPU on the market. For my next build I'm doing an i7 / R7 with 64GB of RAM, because I just don't need a TR but do need that RAM. But if you are sure you'll have the cash for that RAM when the time comes, TR makes perfect sense. TR is going to chew through render times and is a perfect fit for a render slave. When it comes to all the work that comes before rendering, Houdini's multithreading is on a node-by-node basis; some nodes are completely single-threaded and will remain so, though more nodes are getting multithreaded with each new update. Some guys over at the SideFX forums did some simulation benchmarks on TR with varying results, which might be interesting to look at. Sure, it's going to be a lot better than an i7 / R7, but all of that comes with diminishing returns, so don't expect linear improvements. Every DIMM slot you populate (and you're going to need them) drops the maximum supported RAM speed, so you really don't want to skimp on RAM speed, especially since the Zen architecture seems so dependent on it. Also, while we're all going to say it's fine to shove in another separate kit of RAM as long as it's the same model, no one can 100% guarantee it. It's a tiny risk, but it's there. tldr: If you are sure you'll be able to afford a second RAM kit to populate those empty DIMMs when the time comes, then sure, 32GB for starters makes perfect sense. My research so far points to 3200MHz CL14 RAM sticks (google "Zen Samsung B-die" if you want to know more) as the way to go, especially if going the AMD route.
  12. I would agree the 1950X is the better investment. The GPU is the least important component when it comes to Houdini, unless you can afford a Quadro. While some nodes do allow OpenCL utilization, giving the GPU a chance to help out with computing, it requires a lot of VRAM, and gaming cards don't really provide much in that area. So the GPU is mostly for viewport rendering, and considering you're getting a TR, there is no point in using the GPU for final renders. I use a 1070 (8GB VRAM) and I don't intend to change it for a while. I would reconsider your RAM choice, since real-time playback often means shoving several gigabytes of data into RAM for each frame. I am also still trying to figure out how much RAM speed matters for CPU performance on the Zen architecture.
  13. I'm not going the TR route because of cash, of course. If you can afford it, sure, go ahead. Do bear in mind that not all work in Houdini is multithreaded; single-core performance is still important. There is a limit to how many threads sims can use, and for-loops are a whole other topic (don't ask me for the numbers, I'm no expert), which makes a TR build more of a render slave. But tbh, considering I did most of my Houdini work on an old FX-8350 so far, I'm sure an 8700K or a 2700X is more than sufficient as long as I give it enough memory.
  14. Hello, I do some heavy Houdini sims and am looking to get at least 64GB of memory in my upcoming workstation. I did quite a bit of research on my own, but I would like some feedback on whether to go AMD or Intel. My research so far suggests that Ryzen can't really handle that much memory well, since the Infinity Fabric is dependent on RAM speed, and shoving in a 4x16GB kit drops the supported speed all the way down to 1866MHz (a rough bandwidth comparison is sketched at the bottom of this page). While it is possible to OC the memory to somewhere near 3200MHz, doing so pretty much requires that Samsung B-die which goes for 940€ right now (https://www.alternate.de/G-Skill/DIMM-64-GB-DDR4-3200-Quad-Kit-Arbeitsspeicher/html/product/1351449?). If I got it correctly, Intel does not suffer from the same affliction, and I would be able to get some cheaper (non-B-die) RAM and OC it to 3200MHz without any issues. Taking into consideration Buildzoid's recommendation to get the Crosshair VII Hero if you want to unlock the 2700X's true potential, it makes me wonder if Ryzen is the best bang for the buck in my scenario. The 2700X pretty much requires the best motherboard and the best RAM to actually hit the numbers shown in most benchmarks. While I would prefer having 64GB of RAM either way, going 8700K also opens the door to a third option: 32GB of RAM plus a 280GB Optane drive as swap (which is the cheapest option of the three). Unfortunately, I haven't really found many benchmarks showing Optane getting hammered for more than just a few hours, or how exactly it compares to RAM speeds. I will probably wait for September to see if Z390 Coffee Lake happens by then and how it affects the whole situation, but I doubt I will be able to wait for 7nm Ryzen. Please correct me if I got something wrong here, and share any advice. Thanks
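
A minimal sketch of the per-gigabyte math behind the two 128GB kits quoted in post 2 (1700€ Flare X vs 999€ Ripjaws 4). The prices are the ones from the post, not current market figures, and the snippet is purely illustrative:

```python
# Price-per-GB and premium for the two 128GB kits quoted in post 2.
# Prices are the ones from the post (1700 EUR vs 999 EUR), not live data.
flare_x_eur, ripjaws_eur, capacity_gb = 1700, 999, 128

print(f"Flare X:   {flare_x_eur / capacity_gb:.2f} EUR/GB")   # ~13.28 EUR/GB
print(f"Ripjaws 4: {ripjaws_eur / capacity_gb:.2f} EUR/GB")   # ~7.80 EUR/GB
print(f"Flare X is {flare_x_eur / ripjaws_eur:.0%} of the Ripjaws 4 price, "
      f"i.e. a {(flare_x_eur - ripjaws_eur) / ripjaws_eur:.0%} premium")
```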
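
For the PSU question in posts 4 and 5: a minimal headroom sketch, assuming the ~600W estimated draw from post 5 and the common rule of thumb that a PSU runs most efficiently around 50% load. The draw figure is the poster's estimate, not a measurement:

```python
# PSU headroom sketch: size the PSU so the estimated draw sits near 50% load.
estimated_draw_w = 600   # rough total from post 5 (CPU, GPU, 128GB RAM, drives)
target_load = 0.5        # aim for ~50% of rated capacity, near peak efficiency

recommended_capacity_w = estimated_draw_w / target_load
print(f"Recommended PSU capacity: ~{recommended_capacity_w:.0f}W")  # ~1200W
```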
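
The 6-frames-per-day figure in post 11 is just the per-frame simulation time divided into a day; a tiny sketch using the rough times mentioned there (4 hours of simulation, 60+ minutes of rendering per frame), purely to show where the number comes from:

```python
# Frames-per-day throughput from the rough per-frame times in post 11.
sim_hours_per_frame = 4.0      # simulation once it spilled to SSD swap
render_hours_per_frame = 1.0   # "60+ min render times" on the FX-8350

print(f"Sim only:     {24 / sim_hours_per_frame:.1f} frames/day")  # 6.0
print(f"Sim + render: "
      f"{24 / (sim_hours_per_frame + render_hours_per_frame):.1f} frames/day")  # 4.8
```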
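
On the DIMM-population issue in post 14: a back-of-the-envelope comparison of theoretical peak DDR4 bandwidth at the 1866MHz fallback versus 2933MHz and 3200MHz, assuming the standard 8 bytes per transfer on a 64-bit channel and a dual-channel (AM4) platform. Real-world throughput will be lower, and this says nothing about the Infinity Fabric latency side of the story:

```python
# Theoretical peak DDR4 bandwidth: transfer rate (MT/s) x 8 bytes per 64-bit
# channel x number of channels. Dual channel assumed, as on AM4.
def peak_bandwidth_gb_s(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000  # decimal GB/s

for speed in (1866, 2933, 3200):
    print(f"DDR4-{speed}, dual channel: {peak_bandwidth_gb_s(speed):.1f} GB/s")
# DDR4-1866: 29.9 GB/s, DDR4-2933: 46.9 GB/s, DDR4-3200: 51.2 GB/s
```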