Everything posted by phongle123

  1. I don't suppose someone who has bought a 2080 or a 2080 Ti and doesn't want the PSU deal would do this for me? I can pay ahead of time with a PayPal invoice.
  2. The difference is 0.4 GHz base clock and 0.2 GHz turbo boost; everything else appears to be the same. Mind you, I have not taken into account any new features this CPU has over the previous generation (for example, 7th-gen CPUs gained dedicated hardware encoding for 4K). Still, since they are priced similarly new, the 9980XE is the better choice IMO (the 7980XE is $1,903 on Amazon; the 9980XE has an MSRP of $1,979). If we were talking about a $76 difference on a $300 CPU, that would be a much different story. Of course, you may be able to find clearance or even used 7980XEs for cheaper since it is no longer new. Then again, the 9980XE's soldered IHS means you don't need to de-lid, so that is one thing you don't have to worry about, unless you already have an LGA 2066 de-lidding tool. Final note: the +0.4 GHz base clock (up to 3.0 GHz) is pretty important if you don't plan on overclocking and will load all 18 cores simultaneously, though this is temperature-dependent as well.
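A quick sketch of the price and clock arithmetic above (figures in USD taken from the post; street prices move around, so treat these as snapshot assumptions):

```python
# Price and clock deltas quoted in the post (assumed street/MSRP figures).
price_7980xe = 1903          # Amazon price quoted above
price_9980xe = 1979          # MSRP

price_delta = price_9980xe - price_7980xe        # $76
relative = price_delta / price_9980xe * 100      # ~3.8% of the sticker price

base_clock_gain = 3.0 - 2.6                      # +0.4 GHz base (2.6 -> 3.0)

print(price_delta, round(relative, 1), round(base_clock_gain, 1))
```

At under 4% of the sticker price, the delta is noise at this tier; on a $300 CPU the same $76 would be a 25% premium, which is the point the post is making.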
  3. Hello, I have 2x8GB of SODIMM left over from a laptop replacement and want to use it in a desktop, because why not? All I can find are what are labeled as "tester" SODIMM-to-DIMM adapters. Are these usable for daily use, or are they only meant for short bursts to test the memory and not usable as a permanent solution?
  4. I want to chime in on this. Overclocking on Ryzen isn't as fruitful as on Intel. Therefore, if you can spare the cost difference between a 2700 and a 2700X, it's better to get the 2700X than to get the 2700 and overclock it.
  5. I'd turn down anti-aliasing and other settings first. I don't think you should go out and buy a new GPU just because you can't reach 60 FPS on Ultra. Like any sane person, turn down the settings first.
  6. Arctic Silver 5 is fine. The one caveat is that it is electrically conductive, so you should be careful applying it; once applied correctly, though, it has no way of getting onto the PCB to fry anything. I use AS5 on all my GPUs and CPUs. It is also one of the best thermal pastes on the market, since it contains actual silver. It is incredibly thick, however, so I warm it in a bag of hot water before applying it to thin it down. That also means you should apply it in smaller amounts than you would another thermal paste, as it expands when heated and thinned. If you don't want to pre-heat it like I do, it will just take longer before you see ideal temperatures: with this TIM, the temperatures you see in monitoring programs keep dropping over time as the thick paste thins down with heat.
  7. Exactly, a rectangular metal frame with four nail holes at the top that I can nail into the underside of the tabletop. Something that can hold my PC, or even something with two or three separate compartments I can store stuff in. I would want two of them: one on the left side to hold the PC and one on the right side for general storage. IKEA is pretty far from me; if you happen to know what they are called, that would be appreciated, because searching for things like "table drawer" or "under-table storage" isn't turning up what I want. My standing desk is rated for 200 lbs, which is more than enough for even the heaviest components: a 50 lb computer and three 17.6 lb 27" monitors still leave plenty of weight to spare.
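The load claim above checks out; here's the arithmetic spelled out (weights are the figures quoted in the post):

```python
# Desk load check using the figures quoted above (all weights in lbs).
desk_rating = 200
pc = 50
monitors = 3 * 17.6          # three 27" monitors at 17.6 lb each

total = pc + monitors        # 102.8 lb sitting on the desk
headroom = desk_rating - total

print(round(total, 1), round(headroom, 1))   # roughly half the rating spare
```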
  8. Hello, I'm wondering if anyone knows a good product that can be nailed under a standing desk for storage. Since a standing desk has no drawers to store stuff in, desk space becomes cramped.
  9. Freesync is implemented in software/firmware; G-Sync is a physical module that Nvidia charges through the roof for. Spec for spec, a monitor with G-Sync can therefore be $200-300 more than its Freesync variant. Additionally, Nvidia GPUs use G-Sync and AMD GPUs use Freesync, so don't bother pairing an Nvidia GPU with a Freesync monitor: Freesync requires an AMD APU/GPU that supports it, and it isn't stable otherwise.
  10. @Wolfgang393 A Blue Yeti is a terrible investment if you don't want a huge mic in front of your face all the time. At least for me, when I watch a video with a Blue Yeti in front of the camera, it covers up a large chunk of the frame and is immediately unappealing. If you don't want the mic in front of your face, you will have to turn the gain up, and in my experience anything above the lowest possible gain hears everything. Even if you were on the 100th floor of a five-star hotel with noise-canceling walls, you would still hear the person in the basement taking a piss. That's a huge exaggeration, obviously, but with doors closed, a mic on the second floor can still pick up people doing general tasks on the first floor. I have to route my audio through Adobe Audition to filter background noise in real time, which of course uses a lot of CPU power, about 30% on a 5930K. That lets me place my mic on top of my monitor with the gain up, capturing no background noise and very little keyboard noise. So if you are comfortable with a headset, just stick with it. Here's my take on headsets: the mic obviously isn't as good as a Blue Yeti, so it won't capture background noise as much, and since it sits right in front of your mouth, that pretty much nullifies its weakness. And if you would just turn the gain all the way down on a Blue Yeti and stick it in front of your mouth anyway, there's no reason to get one over a headset with a mic.
  11. EVGA actually never even replied to my customer service e-mail; I just wanted to post an update on this. The Hybrid screw holes are straight through, so you can use any screw of the same thread and length.
  12. Turns out it's a Ryzen problem: x1 and x4 cards that don't work in the x4 slots will work in an x16 slot. It's incredibly inconvenient, since I have no x16 slots left. https://www.reddit.com/r/ElgatoGaming/comments/5zpqw1/elgato_hd60_pro_not_working_on_ryzen/?st=jmygf2op&sh=3084996e
  13. While you are in no way obligated to listen to me, and this is strictly IMO: your power supply is $42 and is more of an entry-level PSU. If you absolutely have to build now, that's a different case. Otherwise, I highly recommend the EVGA G3 550W or the Seasonic Focus Plus 550W. Both are Tier 2 PSUs according to the PSU Tier List over in the PSU thread. They both usually go on sale for about $70-75 with a $20 rebate, making them $50-55; I've seen it multiple times and have bought them multiple times just because it's such a good price. Both are extremely short ATX PSUs, which means you will have extra space to work with. Though it's about $8-13 more ($13 if $55 after rebate), I'd rather get something high quality that you never have to worry about several years from now. So if you're not building now, you could wait it out and see if either PSU gets the $20 rebate sometime in the near future. Here are the links: EVGA G3 550W and Seasonic Focus+ 550W. Edit: I was looking at Mello's PSU; yours is $70 ATM and you don't need 650W. 550W is enough for most single-GPU builds; 650W does nothing for a single GPU.
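The rebate math above works out as follows (USD figures from the post; actual sale prices and rebates vary over time):

```python
# Effective PSU prices after the mail-in rebate mentioned above.
entry_psu = 42                     # the $42 entry-level unit
sale_prices = [70, 75]             # typical sale prices for the G3 / Focus+
rebate = 20

effective = [p - rebate for p in sale_prices]    # $50 or $55 after rebate
premium = [e - entry_psu for e in effective]     # $8-13 over the entry unit

print(effective, premium)
```

The point is that the premium for a Tier 2 unit shrinks to roughly the cost of lunch once the rebate lands.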
  14. And if you're not listening to his advice, here are the facts. This is the Armor, not the Gaming X: same cooler design, but the Armor is white and the Gaming X is red. This is based on the 1080 Ti but should be relevant across all models. "Armor CLC" means it was modded to use liquid cooling. As you can see, the Armor ranks last, and not by 1-2°C but by a 16.1°C increase over the next card up. That's HUGE!
  15. This is exactly why I bought another X370 board. Both X370 boards do not work; the Z270 board did. The CMOS has been reset like 10 times.
  16. All dedicated GPUs work, but other (non-GPU) PCIe cards do not.
  17. I'm coming from a Ryzen X370 board; I have tried another Ryzen board and it does not work either. I have tried both PCIe cards on a Z270 board and they work. Is there a setting turned off in the BIOS or something? 1) TP-Link Archer T9E Wi-Fi card: it does nothing. Once the driver is installed, the system freezes. On restart (with the driver installed and the Wi-Fi card plugged in), the system freezes immediately when loading into the desktop; I have to remove the Wi-Fi card for it to stop freezing, and then remove the driver afterwards. 2) Elgato HD60 Pro: nothing happens. After the driver install, the Elgato Game Capture device does not appear under video capture devices in Device Manager, and the Game Capture HD software says "No capture devices found." Would really appreciate the help.
  18. Once 10nm is in volume production next year, Intel may try to fast-track process-node shrinks for IPC increases instead of staying too long on one architecture this time. AMD, IMO, can still easily throw in more cores; however, I don't think Intel is in the same position to add more cores without constantly needing a new socket.
  19. @W-L @Lord Xeb EVGA has discontinued this product. There is no longer even a link for it; any pre-existing hyperlinks just redirect to EVGA's general store. Should I still get in contact with them anyway?
  20. Hello, since resellers have raised the price of the EVGA 1080 Ti SC Hybrid Kit to $160+ for new ones, I had to search for an alternative (even the EVGA 1080 Ti FTW3 Hybrid Kit still sells for $70 from resellers). I have found a used SC Hybrid Kit, but no screws are included for whatever reason. So the point of this thread is to ask if there is a place, or even a local big-box store, where I can find these screws. Additionally, I am coming from a PNY 1080 Ti Blower, so I could also transfer its screws; however, I don't know exactly which screws the PNY 1080 Ti Blower has that would transfer over. Below is an image of all the screws included in the EVGA 1080 Ti SC Hybrid Kit. If someone could help me out I would really appreciate it. Here is the link: https://www.evga.com/support/manuals/files/400-HY-5598-B1.pdf
  21. Yes, I scoured the forums and found someone with 4x 1080 Tis who did a comparison on a tutorial scene the software provides to everyone. Render times for 1 to 4 GPUs (1080 Tis): 1m58s, 1m, 42s, 33s. Improvements from GPU to GPU: 1 > 2 = +97% (-58s; 58s saved total); 2 > 3 = +43% (-18s; 1m16s saved total); 3 > 4 = +27% (-9s; 1m25s saved total). Or, cumulatively (which I don't think people use to compare improvements): 1 > 2 = +97%; 1 > 3 = +181%; 1 > 4 = +258% (3.58x a single GPU). The first metric makes each added GPU look worse and worse, since the 3rd GPU is compared against 2 GPUs and the 4th against 3. The second metric makes more GPUs look better, almost doubling each time, but against the ideal assumption of 1 = +0%, 2 = +100%, 3 = +200%, 4 = +300%, four GPUs top out at +258%, so scaling falls short of +100% per extra GPU. I'm debating whether that +43% is worth it on top of 2x 1080 Ti. Again, this is just a small tutorial scene provided by the software developers.
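The two scaling metrics can be reproduced from the four render times; here's a minimal sketch (times in seconds, from the tutorial-scene numbers quoted in the post):

```python
# Render times for 1-4 GPUs from the tutorial scene: 1m58s, 1m, 42s, 33s.
times = {1: 118, 2: 60, 3: 42, 4: 33}

def marginal_speedup(n):
    """Percent improvement of n GPUs over n-1 GPUs."""
    return round((times[n - 1] / times[n] - 1) * 100)

def cumulative_speedup(n):
    """Percent improvement of n GPUs over a single GPU."""
    return round((times[1] / times[n] - 1) * 100)

print([marginal_speedup(n) for n in (2, 3, 4)])    # [97, 43, 27]
print([cumulative_speedup(n) for n in (2, 3, 4)])  # [97, 181, 258]
```

The marginal metric shrinks because each new GPU is measured against an ever-faster baseline; the cumulative metric is the honest one for "is N GPUs worth it", and at 3.58x for 4 GPUs it lands just short of the ideal 4x.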
  22. I do not plan on upgrading, so I am simply using what I have, which is the reason for this thread: hopefully someone with knowledge in this area can give me the correct answer.
  23. So that means I'm already not getting my full 24 lanes to start with, since the motherboard's onboard peripherals are using some of my PCIe lanes. Do you happen to know, or know how I can find out, which onboard peripherals are using PCIe lanes so I can turn them off and reclaim the full 24? Meaning for 2), it will already be at x8, x8, x4, x# (motherboard peripherals). Thus, if I add an NVMe drive, it will drop to x8, x4 (GPU), x4 (GPU), x4 (NVMe), x# (motherboard peripherals)?
  24. Hello, I own an ASRock X370 Taichi motherboard and plan on getting 3x 1080 Ti (not for gaming). Slot specs: 2x PCIe 3.0 x16, 1x PCIe 2.0 x16, 2x PCIe 2.0 x1. Ryzen 7 on X370 has 24 PCIe lanes, which I assume means I can run all three 1080 Tis at x8. 1) Do any of the motherboard's built-in peripherals (Wi-Fi card, sound card, etc.) use any of my PCIe lanes? 2) If I run all three cards at x8 and plug in, for example, an x4 NVMe SSD, will one of my GPUs drop down to x4, so it runs x8, x8, x4 (GPU) and x4 (NVMe)? 3) Is there any way to control how many lanes each slot uses? I saw this image in regards to 2).
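As a toy model of question 2), here's a hypothetical sketch of splitting a 24-lane CPU budget. The greedy allocation is an illustration only, not how the X370 Taichi actually negotiates links (real boards hardwire the slot bifurcation, typically x8/x8 from the CPU with the rest routed through the chipset):

```python
# Hypothetical 24-lane budget split (illustration only; real X370
# boards fix the CPU slots at x8/x8 and route the rest via chipset).
CPU_LANES = 24

def allocate(requests):
    """Greedily grant requested lane widths until the budget runs out."""
    remaining = CPU_LANES
    plan = {}
    for name, want in requests:
        granted = min(want, remaining)
        plan[name] = granted
        remaining -= granted
    return plan, remaining

plan, left = allocate([("GPU1", 8), ("GPU2", 8), ("GPU3", 8), ("NVMe", 4)])
print(plan, left)   # three x8 GPUs exhaust the budget; nothing left for NVMe
```

Under this toy model, three GPUs at x8 already consume all 24 lanes, which is why adding an NVMe drive forces something else to drop down.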
  25. Does this mean that if they were NOT in SLI, it would take longer for the scene to prepare, since the textures would have to be loaded into the VRAM of both GPUs? Whereas in SLI, since they work as one, it would take half the time because the textures are only pushed to one GPU to prepare the scene?