ModifiedPhotoGraphics

Member
  • Posts

    21
  • Joined

  • Last visited

ModifiedPhotoGraphics's Achievements

  1. Good to know. I'll play with the settings to see what is stable, benchmarks well, and also performs well in some application testing. I'm trying to minimize lag when editing, because waiting on a system to process, save, or otherwise "work" is terribly frustrating for me.
  2. Adobe can still leverage the additional cores, which is where most of my time will go. Being an excellent gaming rig is a side benefit. Plus, money wasn't really as much of a factor. (I couldn't justify a HEDT build because I'm not doing a significant amount of video or 3D rendering.) It has also been 10 years since I built a new system, so getting the latest processor buys me an extra 3 months before it is obsolete. Haha!
  3. I'm down to waiting on the last component for my new build (the 9900K CPU), but I do intend to do some moderate OC'ing. Based on research I did in advance, the G.Skill memory I purchased "should" be Samsung B-die, but these are 3866 modules with 18-18-18-38 default timings. My understanding is that lower timings are better than higher clock speeds. This is going into an ASRock Z390 Taichi Ultimate board. Thoughts on sticking with the factory timings and clock speed, aiming for lower timings or higher clocks, or perhaps even both? I went with a 2 x 16GB kit to stick with two modules but still get 32GB, since I do a good deal of work with very large image files and multiple photo editing applications open at any given time.
  4. The new chips have built-in wireless AC, so you're less likely to have to choose between a board with an add-on chip and one without. So, in theory, all Z390 boards should feature Wi-Fi by default. I've also heard they are building somewhat more robust power delivery into some of the OC boards to handle the 8-core processors with overclocking. (This has yet to be seen, since nobody has torn down and inspected the VRMs of these boards yet.) For what it's worth, Newegg pulled the pricing and killed the links to additional photos, etc., so as suspected, it was activated a bit early.
  5. Posted... accidentally. They don't release for another ~5 days. (I suspect this was an error on the part of whoever added them and forgot to mark them as hidden, sort of like the pricing for the 9700K and 9900K on multiple sites that was later pulled.) Still good info if anyone is wondering what they will cost in comparison to the current Z370 boards.
  6. Not sure I've seen this leaked anywhere else yet... I'm assuming prices are in Canadian dollars, since this is the Newegg.ca site and not the American site (which does not show results yet).
  7. Does anyone happen to have the BenQ SW320 or SW271 who can chime in on the image quality? I also have to wonder about the controls in the base, because it looks like it wouldn't work with a non-standard bracket unless those controls can be disconnected and relocated.
  8. Sorry the title is a bit vague... I mostly do photo editing, where color accuracy and good shadow detail / blacks are important to me. The new system I am building for editing will also get used for gaming, and I don't have the money or space for a second monitor just for gaming. Ideally I want a minimum of 27" 16:9 / 16:10, but I would entertain anything up to a widescreen as large as 35", give or take. (One large monitor will fit; two large monitors, not so much.) Preferably 4K native with Adobe RGB coverage (95% or better) or switchable sRGB / Adobe RGB modes. A plus would be a removable blackout hood. (My current LaCie 324i has one; it's necessary due to where my desk sits in relation to windows that need to remain mostly uncovered.) Right now I'm leaning towards the BenQ SW320, which hits a lot of those needs for photo editing, but with a 60 Hz, 5 ms GTG IPS panel, will it be problematic for gaming? And if so, any suggestions for an alternative that might fit my needs?
  9. They did make very good products. It would have been nice to see what they could have created if they had stuck around.
  10. The fins are very thin aluminum (almost always), so they bend very easily. Something you might try is taking an old credit card and cutting it a little narrower than the width of a fin; that makes a nice tool to flex any bent fins back into shape. As someone else mentioned, a Sharpie will hide any bare metal. But in reality, hide it behind a fan or the grill of the case and you'll never even notice. Performance wouldn't be harmed unless every fin was smashed flat, blocking airflow entirely.
  11. Two GPUs is where it's going to get costly. You may actually need two radiators to effectively dissipate that much heat, and it seems that the GPU water blocks are the most costly part aside from decent radiators. Even almost 20 years ago, water-cooling a full system was a $500+ proposition, and it certainly hasn't come down much since.
  12. First off, this thing belongs in a museum, because it is probably as old as some of the posters here. This was the Danger Den CPU block. I actually still have the matching GPU block, and I even had the matching chipset block too, but I have no clue where that one is; its mount broke, so I put the fan and heatsink back on early on. Needless to say, considering how old it is, it has held up extremely well. You can tell from the shiny square on the surface where the copper met the CPU, and how much overkill the size of the cooling plate was on this thing (at least in one direction). As you can tell, this thing is solid... I bet JayzTwoCents didn't even have one of these. (Or he did, and this will bring back memories of early water cooling way back when.) For what it's worth, this was the "beast" it was cooling. I wish I could remember the OC numbers I got it to, but I'm sure it was nothing impressive.
  13. It makes me cringe... at least when I sit down and realize I have to process a few thousand images. Once upon a time it was actually pretty nice: dual Xeon 2.8 GHz quad-cores. It started with 2 GB of RAM, but I've since bumped it to 16 GB to make using it this long bearable. It does actually still work halfway decently, just painfully slow for photo editing and just short of unusable for video editing (even at standard HD; I can't even begin to edit 4K footage on it). I've pondered Threadripper, but the additional cost is still a bit much for me, even for a first-gen version. I've also found that Adobe software specifically has diminishing returns beyond 8 cores, so anything that beefy would be overkill outside of the video software that can actually utilize the cores better. (But video editing is only secondary to photos.)
  14. Yeah, sort of where I'm at. The only potential setback would be if Intel announces it in October as expected, but doesn't actually ship until 2019, as some rumors have suggested. (Then again, test CPUs are already in the wild, and some international computer-parts sites have already accidentally listed them.)
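On the memory-timings question above (post 3): one quick way to compare "lower timings" against "higher clocks" is first-word latency, i.e. CAS cycles divided by the actual memory clock (half the DDR transfer rate). This is a rough sketch only; the comparison kit speeds are illustrative examples, not recommendations for this specific build.

```python
def first_word_latency_ns(cas_latency: int, transfer_rate_mts: int) -> float:
    """First-word latency in nanoseconds.

    The memory clock (MHz) is half the DDR transfer rate (MT/s), so
    latency = cas_latency / (transfer_rate / 2) * 1000
            = cas_latency * 2000 / transfer_rate.
    """
    return cas_latency * 2000 / transfer_rate_mts

# First entry matches the kit described in post 3; the others are
# hypothetical comparison points a tighter-timing profile might reach.
kits = [
    ("DDR4-3866 CL18 (rated XMP)", 18, 3866),
    ("DDR4-3600 CL16", 16, 3600),
    ("DDR4-3200 CL14", 14, 3200),
]
for name, cl, rate in kits:
    print(f"{name}: {first_word_latency_ns(cl, rate):.2f} ns")
```

By this rough measure, DDR4-3200 CL14 (8.75 ns) actually edges out DDR4-3866 CL18 (about 9.31 ns) on first-word latency, though bandwidth-bound workloads can still favor the higher transfer rate; only benchmarking the real applications settles it.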
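The diminishing-returns observation about Adobe software and core counts (post 13) is essentially Amdahl's law: if a fixed fraction of the work is serial, extra cores help less and less. A minimal sketch; the 70% parallel fraction below is an assumed figure for illustration, not a measured number for any Adobe product.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when only `parallel_fraction` of the work scales with cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assumed 70% parallel workload, purely for illustration.
for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} cores -> {amdahl_speedup(0.7, n):.2f}x")
```

With these assumed numbers, going from 8 to 32 cores only moves the speedup from roughly 2.6x to 3.1x, which is why a HEDT part can be overkill for a mostly-photo workload.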