Everything posted by zua93117

  1. Hey all, I'm trying to decide between the ViewSonic VP2468, BenQ BL2420PT, and LG 29WK600 for 8-bit HD video editing, since I don't have enough experience or knowledge to know which would be better. I'm mainly looking at the ViewSonic and the BenQ (both about $300 CAD) because they have 100% sRGB coverage. I threw in the LG just because it's cheaper at $260 CAD with 99% sRGB, and it's a 29" 2560 x 1080 panel. I know there are much better monitors out there with true 10-bit + hardware LUT and all, but my setup is 8-bit through and through.

     The good thing about the ViewSonic is its 14-bit hardware LUT, but my strikes against it are:
     - disappointing contrast at 761:1 according to PCMonitorInfo's tests
     - 6-bit + FRC panel
     - built-in power supply, so I'm basically boned if it fails

     The BenQ has a true 8-bit panel and WQHD resolution, with a contrast ratio of 953:1 (according to a random site that seemingly tests monitor contrast ratios). I tried but failed to find any more measurement reports for these two 24" monitors.

     My main question is whether true 8-bit makes a big difference over 6-bit + FRC, and whether a 14-bit hardware LUT makes a big difference (maybe even a bigger one than true 8-bit). If the hardware LUT and true 8-bit don't really matter much, then the LG 29WK600 or a budget Dell UltraSharp may be worth considering at a cheaper price point. Videos are probably going to look off on other people's screens anyway, but I'm just trying to pick the best monitor within budget. I have an i1DisplayPro 3 colorimeter and a spare 24" display for a dual-monitor setup, so I don't really need a 29" monitor per se. Any recommendation is greatly appreciated! P.S. Buying in the US is perfectly fine too!
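To put rough numbers on the bit-depth part of the question, here's a quick back-of-the-envelope sketch (just panel arithmetic, not measurements of these specific monitors):

```python
# Rough tonal-resolution arithmetic per color channel (illustrative only,
# not measurements of any specific monitor).
def levels(bits):
    return 2 ** bits

print(f"8-bit panel:  {levels(8)} steps per channel")                   # 256
print(f"6-bit panel:  {levels(6)} steps per channel (+FRC dithering)")  # 64
print(f"14-bit LUT:   {levels(14)} internal correction entries")        # 16384

# A 14-bit hardware LUT doesn't add visible output steps; it lets the
# calibration remap tones with less rounding error before quantizing back
# down to the panel's native bit depth.
```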
  2. Thanks for the clarification! I'm just afraid that the 1650 Super won't hold up to HD video editing and that the 1660 Super will just fit the bill or barely skim by. I was only able to find YT videos testing 4K video editing in Resolve with different GPUs; 4K's all the hype now, I guess. The one useful piece of info I found on Resolve's forum was that the GTX 1070 was no slouch for HD, and that the 1660 Ti was just below the 1070. If I remember correctly, the 1660 Super is equal to or a tad below the 1660 Ti, but with faster VRAM, which makes me think the 1660 Super might barely fit the bill. Yikes! I was going by the PSU tier list 4.0. @_@ I think I got the V-Gold 2018 version, which is tier A-... Things change fast, I guess? I got an EVGA G3 last time and it suddenly got listed as tier D+, potentially dangerous, but in rather unrealistic situations >.>
  3. Thanks! That's a good point. I see. Yeah, an Nvidia GPU isn't bad. The 1660 Super is supposedly a bit faster than the RX 590, which is an OC'ed RX 580 if I'm right. Might be worth looking into the used market on that one.
  4. Hello all, I'm trying to build a PC that will do 1080p video editing in DaVinci Resolve. Not many effects, very basic compositing with images / text, very light color grading (or possibly none). I'm located in Canada, the budget is tight, and the local used market for GPUs isn't astounding, but I'm trying to cut cost wherever possible nevertheless. My biggest problem is choosing the GPU, as Resolve seems highly dependent on the GPU. I also need some help choosing the CPU, although I think the difference between the two candidates is pretty minimal.

     CPU - I'm deciding between the 2700X and the 3600. They trade blows, and I just can't decide! The 3600 beats the 2700X wherever compositing in the Fusion tab is involved, but Resolve seems to benefit from more cores. Shouldn't make too much of a difference. *Edit: I've been kind of convinced to go with the 3600, since smooth editing feels more important than fast rendering where the PC can be left alone.

     GPU - GTX 1660 Super 6GB or RX 580 8GB or even RX 570 8GB. I was looking at the GTX 1650 4GB, but that seems like a bad idea. It'd be nice to keep the GPU cost around $320 CAD ($240 USD) or under if possible.

     === Components I've settled on ===
     Motherboard - MSI B450 Gaming Plus / Tomahawk / A-Pro (and MAX versions), whichever one justifies itself.
     RAM - 2x8GB 3200MHz CL16; I don't really want to spend extra for 3600 CL18 or 3600 CL16 when the money could go into the GPU or CPU.
     PSU - Cooler Master V550 80+ Gold

     Any help is appreciated!
  5. Thanks. I bought an EVGA SC Ultra, but couldn't find it on the list. I'm assuming it's the same as the EVGA SC2 on that list? I was wondering if I could save a bit by getting something cheaper. I'm assuming the Zotac Twin is in the same tier as the Asus Dual, since I can't find any of the cards I listed above on that list :( If the PCB and cooler didn't change too much from the GTX 10xx series, then the list may give a good idea of where the cards stand.
  6. Sorry for tagging onto this thread, but I have the same dilemma. The ASUS TUF Gaming, Asus Dual Evo, Gigabyte OC, and Zotac Twin versions of the 1660 Super are all the same price at my location. Does the Zotac Twin still beat the other models?
  7. @minibois Because, if a video card only supports one LUT for the entire card (regardless of how many heads it has), the card can only effectively handle one color-calibrated display. Here's a quick blurb about LUTs from X-Rite, a color calibration company. On that note, no matter how many ICC profiles Windows seems to allow users to attach to one graphics card, the card can only support as many color-calibrated displays as the number of LUTs it actually has.

     But I just found out that none of this matters anymore... I just got off live chat with Nvidia. Apparently NONE of the GeForce cards support LUTs. So I can't color calibrate even ONE monitor. This hurts my soul... I'm hurting real bad. That RTX 2070 Super (the main card I intend on using for graphic arts) wasn't cheap. Lots of FFFFFF going on in my heart (apologies for the Twitch chat language).

     That is SO MUCH WORSE than AMD if we're talking about LUT support, because the Radeon line supports one and their FirePro line (which competes against Quadro) supports two --- source from an AMD forum. ^That thread is quite old, but the same answer is given in another thread about how many LUTs the RX 480 supports.
  8. Thanks for the clarification! Ah yes, I have a Ryzen 3700X. Thank you for being so much help ^^
  9. Thank you! I'll be good as long as slot 1 gets full speed. The card that'll go in slot 4 could be either a GTX 660 or a 640; it will only be used to connect another display and do nothing else, so I'll probably go with the 640 to decrease power draw. I'm really just using it for the extra LUT anyway. I don't have anything in slots 2, 3, or 5, so it'll run at 2.0 x4 speed. Here's the link to the motherboard specs. Thanks again!
  10. Hey all, I have a Ryzen 3rd Gen CPU with an MSI B450 Gaming Plus Max that can support two graphics cards. I would like to run two cards only because I want to color calibrate two monitors: as far as I know, AMD's Radeon series has one LUT and FirePro has two, and I suspect the same of Nvidia. The two cards will not be linked. One very good GPU will do all the work (configured via Nvidia's control panel) while the other, old and "weak" one will just be there for the second display. My motherboard's multi-GPU specs are shown in the picture. Does this mean that my main card in slot 1 will drop down to 3.0 x8 or 3.0 x4 as soon as I put a card in slot 4, or will the card in slot 1 maintain 3.0 x16 speed?
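For a rough sense of what those lane configurations mean in bandwidth terms (approximate per-direction figures after encoding overhead, not benchmarks of this particular board):

```python
# Approximate usable PCIe bandwidth per direction, per lane, after encoding
# overhead (PCIe 2.0 uses 8b/10b, PCIe 3.0 uses 128b/130b).
GB_PER_S_PER_LANE = {"2.0": 0.5, "3.0": 0.985}

def slot_bandwidth(gen, lanes):
    return GB_PER_S_PER_LANE[gen] * lanes

print(f"PCIe 3.0 x16 (primary slot): ~{slot_bandwidth('3.0', 16):.1f} GB/s")
print(f"PCIe 3.0 x8:                 ~{slot_bandwidth('3.0', 8):.1f} GB/s")
print(f"PCIe 2.0 x4 (chipset slot):  ~{slot_bandwidth('2.0', 4):.1f} GB/s")
```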
  11. Hey all, I'm looking at these two graphics cards that I can get for $229.99 USD + tax --> total of ~$250.69 tax included:

     EVGA GeForce GTX 1660 SUPER SC ULTRA --> no returns, Newegg deal
     MSI GeForce GTX 1660 SUPER GAMING X --> 15-day return, local shop (not ideal, can't return after Boxing Day if a better deal shows up, but at least it's something -- buyer's remorse)

     This is the lowest price that PCPartPicker + the camel site have shown for these two specific models to date. There are too many models... I didn't look at each one. Is this a good deal for a 1660 Super in and of itself, and compared to a 1650 Super?

     Another tag-on question: the 1660 Super has ~+77% Multi-Render on gpu.benchmark.com. I don't know anything about the inner workings of GPUs, so the definition of multi-render didn't help me too much. I'm mainly using the cards for graphics work and animation - I doubt I'll have high polygon counts (at least not for the foreseeable future). Does having a higher MRender score really benefit applications such as Blender? Also, the EVGA SC isn't exactly on the GPU cooler tier list... is it kind of the same as the SC2 or SC Black?
  12. Thanks! The more I think about it, as much as I want the latest flashy things, the best approach is probably throwing the money I'd spend on one more M.2 slot towards a 1660 Super. The prices of GPUs make my jaw drop.
  13. Hey ya! Could do that? Wow, I'm so out of the loop lol. I've just caught up with current tech these past few days. The thing is that the chipset lanes on the Gaming Plus Max are Gen 2, so it'd be really gimped, and the NVMe would also be fighting with other whatnots for data transfer. But it's still something worth considering.
  14. Ya, I'm pretty upset at MSI for that... like whhhhy?! And thank you! Appreciate it! I've looked on PCPartPicker. The boards that support 2x M.2 slots with a decent VRM + heatsink for use with a 3700X are pricey. I think the cheapest decent option is the Asus Prime Pro X470, but that board gimps the PCIe slots lol. The next level-up option would be the Gigabyte X570 Aorus Elite, which is overkill for my CPU. Granted, I could get an open-box one for ~$162 USD, which is a steal in that regard, but it's still pretty much twice the price of the Gaming Plus Max. So I think I'll stick with what I have; the price hike just isn't worth it. I still have to get something like a 1660 Super, since my GTX 660 2GB just won't cut it imo.
  15. Yes, the Mortar Max... oh, I dream of getting that board. It's the best B450 out there imo, but North America got gimped: it's not on sale here, and I have a feeling international shipping is going to be expensive. And I'd be charged import fees, which would take me up to solid entry-level X570 prices.
  16. Ah yes, SCRATCH DISKS! Can't believe that slipped my mind lol. That makes so much more sense! I'm such a noob. 500GB drives do seem to be the sweet spot.
  17. Hello, I'm planning my setup for animation / graphic arts purposes. I have a Ryzen 3700X, B450 Gaming Plus Max, and 2x8GB 3200MHz CL16 RAM on hand. NVMe got my attention since the Crucial, Team MP, ADATA, etc. NVMe drives are supposedly as durable as the Evo Pro and yet a fraction of the price. I came upon a video / article (which I can't find now) that mentioned something about an ideal setup having two NVMe drives plus an HDD for backup mass storage. The main point is to have the OS on one NVMe and big files on another, so the OS and applications (which deal with the big files) don't end up clogging the dedicated NVMe lanes. But I can't remember where graphics-related software should reside (on the OS drive or the big-files drive), and I'm fuzzy about the technical underpinnings of the above statement. In my mind, the OS takes up RAM as well as any open applications, so the data is mainly going between RAM, the CPU, and mainly the dedicated GPU lanes.

     Question 1 --> So how does having the OS + (maybe software) + big files on the same NVMe clog up the NVMe lanes? The concept sounds good to me, but I don't understand how it works.

     Question 2 --> Say the OS is on storage A and files + applications are on storage B. When an application is running and doing stuff, will most of the communication be between storage B and the CPU? Wouldn't applications be making system calls which involve the OS that's on storage A? Maybe all of this matters more when there are multiple applications open and the OS also happens to be doing other stuff in the background? I can't even imagine the 3700X running multiple graphics-related applications simultaneously though... I have a feeling that's more like 3950X or Threadripper territory.

     Also, I read somewhere else that the OS and applications usually work with small files, so the benefit of NVMe is almost unseen compared to a SATA SSD. If this is true, then the most cost-effective setup would be 1x NVMe for big files and 1x SSD for OS + software? The Gaming Plus Max only has one NVMe slot, so that means having to look for another board with 2x M.2 slots (and hopefully one that doesn't gimp the PCIe slots to compensate for the additional M.2 slot, plus a decent VRM and heatsink). That dictates a more expensive motherboard + NVMe costs on top of the GPU cost *big jaw drop, yikes.* So I'm leaning towards not having two NVMe drives... but am asking just to make sure that the boost from having two isn't worth it. Thanks for the help in advance!
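For context on the NVMe-vs-SATA part, here's a rough interface-ceiling comparison (ballpark sequential figures only, and the 50 GB project size is just a made-up example):

```python
# Ballpark sequential-throughput ceilings per interface (GB/s); real drives
# and workloads (especially small random I/O) land well below these.
interfaces = {
    "HDD (7200 rpm)":   0.18,
    "SATA III SSD":     0.55,
    "NVMe PCIe 3.0 x4": 3.5,
}

project_gb = 50  # hypothetical scratch/cache read for one editing session
for name, gb_per_s in interfaces.items():
    print(f"{name:18} ~{project_gb / gb_per_s:6.1f} s to stream {project_gb} GB")
```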
  18. Hi, while trying to decide between 3200 and 3600 RAM, I realized that 3600 is not listed as one of the speeds on the MSI Gaming Plus Max's official spec sheet. Does that mean I can't use 3600 on the Gaming Plus Max? But higher speeds such as 4000 are on the list, so I'm a bit confused, due to my lack of knowledge in hardware.
  19. Good news is that I haven't opened anything just yet. So I can return it.
  20. Thanks! Oh, I just remembered... the Gaming Plus Max doesn't even list 3600 as one of the supported speeds (from the MSI Gaming Plus Max specs). Does that mean 3600 RAM just won't work? The Tomahawk Max has 3600 listed, but I decided against it to save a few dollars.
  21. Hey all, I'm planning to run a Ryzen 3600X + MSI Gaming Plus Max for mostly graphics work and maybe some basic 3D. Little to no gaming. I got a Ripjaws V 3200MHz CL16-18-18-38-2N 2x8GB kit for ~$80 CAD + tax. Newegg is having a sale on Ripjaws V 3600MHz CL16-19-19-39 for $106 CAD + tax (shipping included). 3600 CL16 is nice, but the timings are loose. Overclocking on my CPU + mobo is limited in the first place, and considering the loose timings and cheap price, the Ripjaws V kit probably uses Hynix dies, so I'd think OC headroom is really limited. I could dial down the speed on the 3600 kit and tighten the timings to see if it runs faster, but I don't know whether the difference would be worth that extra $20~$25. Also, what exactly is the recommended speed for Zen 2? Zen 2 natively supports 3200, and the benchmarks I've seen show no big difference between basic kits, but some videos recommend 3600 as the guaranteed best cost-to-performance ratio with no tinkering. If I can use 3600, is it worth the extra money?
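For what it's worth, the speed-vs-timings trade-off is easy to put rough numbers on: first-word latency is the CAS cycle count divided by the memory clock (the transfer rate is double the clock, hence the factor of 2000). A quick sketch comparing the kits mentioned above:

```python
# First-word CAS latency in nanoseconds: 2000 * CL / transfer rate (MT/s).
def cas_latency_ns(transfer_rate_mts, cl):
    return 2000 * cl / transfer_rate_mts

for name, rate, cl in [("3200 CL16", 3200, 16),
                       ("3600 CL16", 3600, 16),
                       ("3600 CL18", 3600, 18)]:
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
# 3200 CL16 -> 10.00 ns, 3600 CL16 -> 8.89 ns, 3600 CL18 -> 10.00 ns
```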
  22. Hey all, a Ryzen 5 3600X is on the way to me. I hope to overclock the 3600X and the RAM a bit if possible, but I doubt I'll be able to get much, since I chose the MSI B450 Gaming Plus as my motherboard and Ripjaws V 3200MHz CL16 2x8GB for budget reasons. Thus I wonder if I'll even need an aftermarket cooler. I read that AMD kind of cheaped out on the 3600X's cooler by leaving out the copper plating. I currently have a ChiOne e1a-120. Honestly, I don't know if that's worse than using the stock cooler (I have never measured). Right now I can get a Cooler Master MasterLiquid ML240L (tier C) for $69.99 CAD, a Noctua NH-D15 for $99.95 CAD, or an EVGA CLC 240 for $99.99 CAD. The reason I'd choose the Noctua is fewer points of failure, and I'm more at peace knowing my computer won't be destroyed by leaks. But I'm wondering if going for the Noctua or the EVGA is even worth it given my computer's specs. Thank you all in advance!
  23. Hi comp gurus, I've been reading up on the topic of multiple ICC profiles / LUTs for years now, and the information is sparse and wild, varying from Windows only being able to use one profile at a time, to Windows being fully capable of multiple profiles; from entry-level cards only having a single LUT, to cards nowadays all supposedly supporting multiple LUTs. I know Windows 7's color management settings allow users to assign different ICC profiles to different displays, so I'm pretty sure Windows supports multiple profiles, and I'm not sure where "Windows can only use one ICC profile at a time" came from. https://support.microsoft.com/en-us/help/4462979/windows-about-color-management

     For entry-level cards, it makes sense that they would only have one LUT shared across multiple heads, but I don't know what the general situation for graphics cards is like nowadays. Since multiple LUTs aren't really a standard, no one knows which cards support them, since there's no official info. (See update below.)

     I mistakenly got an m-ATX motherboard instead of an ATX one, and didn't realize it until I got home (big derp). That said, the m-ATX motherboard would totally suffice if I don't need two graphics cards. Right now, I'm running three monitors (one pen display and two monitors) with two graphics cards on an ATX board. I need to have the pen display and one monitor color calibrated, since the pen display's actual color gamut and accuracy is inferior to one of my monitors. I don't need to calibrate the third monitor. I'm debating whether I should dish out the extra cash for an ATX board. The two big questions are:

     1. Do entry-level graphics cards nowadays support at least dual LUTs? (See update below.)
     2. If I do get an ATX board and a better graphics card than my current one, will running a stronger and a weaker card simultaneously actually hurt application performance?

     Thank you in advance!

     *Update* For anyone who has the same question: after searching AMD's general discussion forum a bit, I found confirmation (from 2012) that
     AMD Radeon line ===> 1 LUT
     AMD FirePro line ===> 2 LUTs
     https://community.amd.com/message/1285510?q=LUT
     Then, the trend of Radeons having 1 LUT is confirmed again in 2017: https://community.amd.com/message/2787408?commentID=2787408#comment-2787084
     Thus, I'm assuming that AMD cards to date have followed the above trend, and that upcoming cards will likely follow suit. Will update again for Nvidia cards if I find the answer.
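As a side note on the mechanics: on Windows, per-display calibration is ultimately pushed through a GDI gamma-ramp call, one ramp per display device. Here's a minimal Windows-only sketch using ctypes that just loads an identity ramp on the primary display; a real loader (e.g. DisplayCAL) enumerates each monitor and loads the ramp stored in that monitor's ICC profile instead.

```python
# Minimal sketch (Windows only): push an identity gamma ramp to the primary
# display via the GDI call ICC loaders typically use. Not a full calibration
# loader; per-monitor DCs and error handling are omitted.
import ctypes

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32

# A gamma ramp is three channels (R, G, B) of 256 16-bit entries each.
Ramp = ctypes.c_ushort * (256 * 3)
ramp = Ramp()
for i in range(256):
    value = i * 257  # identity mapping, 0..255 spread over 0..65535
    ramp[i] = ramp[256 + i] = ramp[512 + i] = value

hdc = user32.GetDC(None)  # DC covering the primary display
ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
user32.ReleaseDC(None, hdc)
print("ramp applied" if ok else "driver refused the ramp")
```

Whether the card then applies that ramp with one shared hardware LUT or a separate LUT per head is exactly the vendor question above.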
  24. Ah, I totally didn't know that a new entry-level Radeon card is about to come out. I think I'll wait then. Either the new card is priced very nicely or the old cards will become cheaper, so it's good to wait.