About zua93117


  1. Hey all, I'm looking at two graphics cards I can get for $229.99 USD + tax, about $250.69 total:

     EVGA GeForce GTX 1660 SUPER SC ULTRA: no returns, Newegg deal
     MSI GeForce GTX 1660 SUPER GAMING X: 15-day return, local shop (not ideal, since I can't return it after Boxing Day if a better deal shows up, but at least it's some protection against buyer's remorse)

     This is the lowest price PCPartPicker and the camel price tracker have shown for these two specific models to date. There are too many models for me to check each one. Is this a good deal for a 1660 Super in itself, and compared to a 1650 Super?

     A tag-on question: the 1660 Super shows about +77% "Multi-Render" on gpu.benchmark.com. I don't know anything about the workings of GPUs, so the definition of multi-render didn't help me much. I'm mainly using the card for graphics work and animation, and I doubt I'll have high polygon counts (at least not for the foreseeable future). Does a higher MRender score really benefit applications such as Blender?

     Also, the EVGA SC isn't exactly on the GPU cooler tier list. Is it about the same as the SC2 or SC Black?
  2. Thanks! The more I think about it, as much as I want the latest flashy things, the best approach is probably putting the money I'd spend on one more M.2 slot toward a 1660 Super. The prices of GPUs make my jaw drop.
  3. Hey! I could do that. Wow, I'm so out of the loop; I've only just caught up on current tech these past few days. The thing is that the chipset lanes on the Gaming Plus Max are Gen 2, so the slot would be really constrained, and the NVMe drive would also be competing with everything else on the chipset link for bandwidth. Still worth considering, though.
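As a rough sanity check on the Gen 2 concern above, here's the arithmetic in a short Python sketch. The per-lane figures are the standard approximate usable PCIe bandwidths after encoding overhead; treating the extra NVMe drive as sitting on a chipset Gen2 x4 connection versus a CPU-attached Gen3 x4 M.2 slot is an assumption for illustration, not a statement about this specific board:

```python
def pcie_bandwidth_gb_s(gen, lanes):
    """Approximate usable PCIe bandwidth after encoding overhead."""
    # PCIe 2.0: 5 GT/s with 8b/10b encoding   -> ~0.5 GB/s per lane
    # PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
    per_lane_gb_s = {2: 0.5, 3: 0.985}
    return per_lane_gb_s[gen] * lanes

# Hypothetical chipset-attached Gen2 x4 link vs a CPU-attached Gen3 x4 M.2 slot
print(f"Gen2 x4: ~{pcie_bandwidth_gb_s(2, 4):.1f} GB/s")
print(f"Gen3 x4: ~{pcie_bandwidth_gb_s(3, 4):.2f} GB/s")
```

So a Gen3-capable NVMe drive hanging off Gen2 lanes would be capped around 2 GB/s before it even starts sharing the link with other chipset devices.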
  4. Yeah, I'm pretty upset at MSI for that. Why?! And thank you, I appreciate it! I've looked on PCPartPicker, and boards that support two M.2 slots with a decent VRM and heatsink for a 3700X are pricey. The cheapest decent option seems to be the Asus Prime X470-Pro, but that board cuts down the PCIe slots. The next step up would be the Gigabyte X570 Aorus Elite, which is overkill for my CPU. Granted, I could get an open-box one for about $162 USD, which is a steal in that regard, but it's still nearly twice the price of the Gaming Plus Max. So I think I'll stick with what I have; the price hike just isn't worth it. I still need something like a 1660 Super, though, since my GTX 660 2GB just won't cut it, IMO.
  5. Yes, the Mortar Max. Oh, I dream of getting that board; it's the best B450 out there, IMO, but North America got shortchanged. It's not sold here, and I have a feeling international shipping would be expensive, plus I'd be charged import fees, which would take me up to solid entry-level X570 prices.
  6. Ah yes, scratch disks! Can't believe that slipped my mind; that makes so much more sense. I'm such a noob. 500GB drives do seem to be the sweet spot.
  7. Hello, I'm planning my setup for animation and graphic arts. I have a Ryzen 3700X, a B450 Gaming Plus Max, and 2x8GB of 3200MHz CL16 RAM on hand. NVMe drives got my attention, since the Crucial, Team MP, ADATA, etc. models are supposedly as durable as the Evo and Pro drives at a fraction of the price.

     I came across a video or article (which I can't find now) that described an ideal setup as two NVMe drives plus an HDD for backup mass storage. The main point was to put the OS on one NVMe and big files on the other, so the OS and the applications that deal with the big files don't end up clogging the dedicated NVMe lanes. I can't remember whether graphics software was supposed to go on the OS drive or the big-files drive.

     I'm fuzzy on the technical underpinnings of that advice. In my mind, the OS and any open applications live in RAM, so data is mainly moving between RAM, the CPU, and the dedicated GPU lanes.

     Question 1: How does having the OS + (maybe software) + big files on the same NVMe clog up the NVMe lanes? The concept sounds good to me, but I don't understand how it works.

     Question 2: Say the OS is on drive A and the files plus application are on drive B. When the application is running and doing stuff, will most of the traffic be between drive B and the CPU? Wouldn't the application be making system calls that involve the OS on drive A? Maybe all of this matters more when multiple applications are open and the OS also happens to be doing other things in the background? I can't even imagine the 3700X running multiple graphics applications simultaneously, though; that feels more like 3950X or Threadripper territory.

     Also, I read somewhere else that the OS and applications usually work with small files, so the benefit of NVMe over a SATA SSD is almost invisible there. If that's true, would the most cost-effective setup be one NVMe for big files and one SATA SSD for the OS + software?

     The Gaming Plus Max only has one M.2 slot, so two NVMe drives means looking for another board with two M.2 slots (hopefully one that doesn't cut down the PCIe slots to make room for the additional M.2 slot, and that has a decent VRM and heatsink). That means a more expensive motherboard plus NVMe costs on top of the GPU cost (big jaw drop, yikes). So I'm leaning toward not having two NVMe drives, but I'm asking just to make sure the gains really aren't worth it. Thanks for the help in advance!
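A back-of-the-envelope way to frame Question 1 above is to compare the link's capacity against what concurrent activity actually moves. The workload numbers below are purely illustrative assumptions, not measurements, just to show the shape of the arithmetic:

```python
# Purely illustrative numbers (assumptions, not measurements): how much of a
# single PCIe 3.0 x4 NVMe link some concurrent activity might occupy.
link_gb_s = 3.9          # approximate usable PCIe 3.0 x4 bandwidth
os_background = 0.05     # OS/background I/O, mostly small random reads/writes
app_loading = 0.5        # an application reading project assets in bursts
scratch_write = 1.2      # a hypothetical scratch-disk stream from the app

total = os_background + app_loading + scratch_write
print(f"~{total:.2f} of ~{link_gb_s} GB/s used ({total / link_gb_s:.0%})")
```

Under these made-up numbers, the shared link is nowhere near saturated even with everything on one drive, which fits the "OS and applications mostly touch small files" point; contention would mainly show up during sustained parallel streams of large transfers.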
  8. Hi, while trying to decide between 3200 and 3600 RAM, I realized that 3600 is not listed among the supported speeds on the MSI Gaming Plus Max's official spec sheet. Does that mean I can't run 3600 on the Gaming Plus Max? Higher speeds such as 4000 are on the list, so I'm a bit confused, given my lack of hardware knowledge.
  9. Good news: I haven't opened anything yet, so I can return it.
  10. Thanks! Oh, I just remembered: the Gaming Plus Max doesn't even list 3600 as one of the supported speeds in the MSI spec sheet. Does that mean 3600 RAM just won't work? The Tomahawk Max has 3600 listed, but I decided against it to save a few dollars.
  11. Hey all, I'm planning to run a Ryzen 3600X + MSI Gaming Plus Max for mostly graphics work and maybe some basic 3D, with little to no gaming. I got a Ripjaws V 3200MHz CL16-18-18-38-2N 2x8GB kit for about $80 CAD + tax. Newegg is having a sale on Ripjaws V 3600MHz CL16-19-19-39 for $106 CAD + tax, shipping included. 3600 CL16 is nice, but the timings are loose, and overclocking on my CPU + motherboard is limited in the first place. Given the loose timings and cheap price, Ripjaws V likely uses Hynix dies, so I'd expect OC headroom to be really limited. I could dial down the speed on the 3600 kit and tighten the timings to see if it runs faster, but I don't know whether the difference would be worth the extra $20 to $25. Also, what exactly is the recommended speed for Zen 2? Zen 2 natively supports 3200, and the benchmarks I've seen show no big difference between basic kits, but some videos recommend 3600 as the guaranteed best cost-to-performance ratio with no tinkering. If I can use 3600, is it worth the extra money?
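For the 3200 CL16 vs 3600 CL16 comparison above, the usual back-of-the-envelope number is first-word latency: CAS latency times the cycle time of the I/O clock (which is half the MT/s rating, since DDR transfers twice per clock). A quick sketch:

```python
def first_word_latency_ns(data_rate_mt_s, cas_latency):
    # DDR transfers twice per clock, so the I/O clock is half the MT/s rating
    clock_mhz = data_rate_mt_s / 2
    cycle_ns = 1000 / clock_mhz
    return cas_latency * cycle_ns

for rate, cl in [(3200, 16), (3600, 16)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

By this crude measure, 3600 CL16 (~8.9 ns) edges out 3200 CL16 (10.0 ns) on top of the raw bandwidth gain, though it says nothing about the looser secondary timings or about real-world Zen 2 gains.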
  12. Hey all, a Ryzen 5 3600X is on the way to me. I hope to overclock the 3600X and the RAM a bit if possible, but I doubt I'll get much, since I chose the MSI B450 Gaming Plus as my motherboard and Ripjaws V 3200MHz CL16 2x8GB for budget reasons. So I wonder if I'll even need an aftermarket cooler. I read that AMD kind of cheaped out on the 3600X's cooler by not including a copper plate. I currently have a ChiOne e1a-120; honestly, I don't know if that's worse than the stock cooler (I've never measured). Right now I can get a Cooler Master MasterLiquid ML240L (tier C) for $69.99 CAD, a Noctua NH-D15 for $99.95 CAD, or an EVGA CLC 240 for $99.99 CAD. I'd lean toward the Noctua because it has fewer points of failure, and I'd be more at peace knowing my computer won't be destroyed by a leak. But I'm wondering whether the Noctua or EVGA is even worth it given my specs. Thank you all in advance.
  13. Hi comp gurus, I've been reading up on multiple ICC profiles / LUTs for years now, and the information is sparse and contradictory, varying from "Windows can only use one profile at a time" to "Windows fully supports multiple profiles," and from "entry-level cards only have a single LUT" to "cards nowadays should support multiple LUTs." I know Windows 7's color management settings let users assign different ICC profiles to different displays, so I'm pretty sure Windows supports multiple profiles; I'm not sure where "Windows can only use one ICC profile at a time" came from. https://support.microsoft.com/en-us/help/4462979/windows-about-color-management

     For entry-level cards it makes sense that they'd have only one LUT shared across multiple heads, but I don't know what the situation is like for graphics cards nowadays. Since multiple LUTs aren't really a standard, no one knows which cards support them, as there's no official info. (See update below.)

     I mistakenly got an m-ATX motherboard instead of an ATX one and didn't realize it until I got home (big derp). That said, the m-ATX board would totally suffice if I don't need two graphics cards. Right now I'm running three monitors (one pen display and two monitors) with two graphics cards on an ATX board. I need the pen display and one monitor color calibrated, since the pen display's actual color gamut and accuracy are inferior to one of my monitors. I don't need to calibrate the third monitor. So I'm debating whether to shell out the extra cash for an ATX board. The two big questions: (1) Do entry-level graphics cards nowadays support at least dual LUTs? (See update below.) (2) If I do get an ATX board and a better graphics card than my current one, will running a stronger and a weaker card simultaneously actually hurt application performance? Thank you in advance!

     *Update* For anyone with the same question: after searching AMD's general discussion forum, I found confirmation (from 2012) that the AMD Radeon line has 1 LUT and the AMD FirePro line has 2 LUTs: https://community.amd.com/message/1285510?q=LUT The trend of Radeons having 1 LUT was confirmed again in 2017: https://community.amd.com/message/2787408?commentID=2787408#comment-2787084 So I'm assuming that AMD cards to date have followed this pattern and that upcoming cards will likely follow suit. I'll update again for Nvidia cards if I find the answer.
  14. Ah, I totally didn't know a new entry-level Radeon card was coming out. I think I'll wait, then. Either the new card will be priced very nicely or the old cards will get cheaper, so waiting is a win either way.