D4n

Member
  • Posts: 48
  • Joined
  • Last visited
Reputation Activity

  1. Like
    D4n reacted to nicklmg in Make ANY PC Into a Hackintosh!   
    How-to on Passthrough Post: https://geni.us/u0rk
     
    Buy AMD CPUs on Amazon: https://geni.us/HcaLFd7
    Buy Nvidia video cards on Amazon: https://geni.us/wjj7Vzx
  2. Informative
    D4n reacted to cuongvfc in P106 NOW SUPPORT DirectX (Not Official)   
    I can't really give you a link though; it's on an app called 闲鱼 (Xianyu), where people sell their second-hand stuff in China. I just found this when I was looking for a P106.
    If you don't know Chinese, there's no point going through a wall of Chinese text. He did mention that there'll be some performance loss and something about frame drops when gaming, so it's still not that easy.
    Never mind, he said the bandwidth is only 4 GB/s.
  3. Funny
    D4n reacted to ncrmnt in P106 NOW SUPPORT DirectX (Not Official)   
    Link, please. Either they are gurus or scammers. Something tells me it's more likely the latter.
  4. Informative
    D4n reacted to cuongvfc in P106 NOW SUPPORT DirectX (Not Official)   
    I found a shop claiming they can add an HDMI or VGA port and change the BIOS so you can use a P106/P104 as a normal 1060 without going through all of the iGPU passthrough stuff, for a price of ~$20.
  5. Informative
    D4n reacted to Seol Chaos in P106 NOW SUPPORT DirectX (Not Official)   
    These crypto-specific cards used to sell for 20 US dollars, and now sell for 70... what a price, LOL~
  6. Funny
    D4n reacted to Seol Chaos in P106 NOW SUPPORT DirectX (Not Official)   
    In fact, in China, these graphics cards have already increased in price, and in Chinese geek circles nobody wants more people to know about this.
     
  7. Agree
    D4n reacted to boysenbeary in Apple REFUSED to Fix our iMac Pro   
    What a ridiculous system.
     
    "Hi we built a computer"
     
    "Great! Can I have a replacement part?"
     
    "No they dont exist"
     
    "How do you build a computer with parts that dont exist?"
  8. Funny
    D4n reacted to hunter2 in Apple REFUSED to Fix our iMac Pro   
    Take it to the Apple HQ.
  9. Like
    D4n reacted to Cvanh in Apple REFUSED to Fix our iMac Pro   
    They will reach out.
    Perhaps this could be a PR disaster.
  10. Informative
    D4n reacted to BLAfH in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    Simplified to a point that I hope is juuuust before it gets "wrong", but given the starting point...:
     
    All of them matter. Every one of them can kill something, but the main suspect here is, IMHO, overall wattage.
    A VRM is, as its name implies, a module, not a single part. There are controller chips, MOSFETs, coils, capacitors, the lot.
    Each of them can be killed by one thing or another, and all of them will be killed by heat, which ultimately burns down (pun intended) to watts.
    It's next to impossible to kill a VRM with its output voltage, as you can't set it to one that harms it. You may set it to something that kills a CPU, but you can't set it to a voltage that harms the VRM in and of itself - regarding only the voltage.
     
    Amps - well, there is an upper limit on the amps some of the components can handle. Although this is - in most cases, like MOSFETs - not only amps, but in large part amps at a specified voltage -> therefore watts.
     
    Some very rare effects ignored, it boils down to efficiency and temperature. VRMs take a "high" voltage (12V) and convert it down to a lower one, in your case... 1.3 to 1.4V for the CPU. They can't do that with 100% efficiency (well, physics, and unfortunately there are no room-temperature superconductors in sight), so there's always something lost. And any loss in an electrical system is... heat.
     
    Let's assume your VRM is - overall - 90% efficient (a simplified view of the VRM as a whole, with no regard for varying efficiency at different load points and voltages, or in different parts of the VRM, including the disadvantage of low-volt/high-amp operation regarding on-state resistance in FETs, cross-heating of components, etc.; a wildly guessed number, just for a nice calculation, results rounded). Cooling on the VRM is good enough to keep it in spec for a max of 15W of heat generated at the VRM.
     
    Your CPU is pulling 1V@100A. 100W. 10% is "lost", so you're pulling 111W on the PSU side. 11W will be converted to heat. All is nice and dandy.
    Another CPU is pulling 2V@50A (you won't find one in real life, but... for the calcs, meh...). As above... 100W, 111W on the PSU side, 11W in heat. All is nice and...
     
    You see? Neither 100A nor 2V alone have killed anything.
    Well... Put in a CPU at 2V@100A. 200W. 222W on the PSU side. 22W in heat. You can cool off only 15.
    That's about 50% thermal overload; you fry some part of your VRM. Or, more likely... you degrade it.
    Capacitors, for example, have a lifetime of... let's say 5000h @ 85°C max. Overheat them... and they may only last 50h... at 170°C.
     
    It won't be "bang, dead". It will be more like... runs a day... bluescreen. Runs 4 hours... bluescreen. Runs 15 min... dead.
     
    And it also shows how you can improve the design of the VRM. You can have more "phases", sharing the load and heat over more components, so each one won't get as hot. Cons? Cost and space. 6 phases cost 50% more than 4. And they need room on the board, and they have to be in a certain spot on the board, limiting space and cooling overall again...
    You can improve cooling (to a point... e.g. the MOSFETs in your VRM are tiny pieces of silicon, and there's a limit on how much heat you can remove from such a small area). Again: cost, space.
    You could make a more efficient VRM - but today's VRMs are already very close to what's doable.
     
    Also, as a side note on cooling VRMs: if you're thinking about an AIO for the CPU... those tend to be worse for VRM temps, as they don't generate airflow over the VRM area of the board the way an air cooler does. Cooler CPU, higher OC. Dead VRMs.
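    The three worked cases in the post can be sketched in a few lines of Python. The 90% efficiency and the 15W cooling budget are the post's own illustrative assumptions, not specs of any real board:

```python
# Heat dissipated at the VRM for a given CPU voltage/current draw,
# assuming a flat converter efficiency (a simplification, as the post notes).

def vrm_heat(vcore_volts, amps, efficiency=0.90):
    """Return (cpu_watts, input_watts, heat_watts) for the VRM stage."""
    cpu_watts = vcore_volts * amps        # power delivered to the CPU
    input_watts = cpu_watts / efficiency  # power drawn from the 12V rail
    heat_watts = input_watts - cpu_watts  # the conversion loss ends up as heat
    return cpu_watts, input_watts, heat_watts

COOLING_BUDGET_W = 15  # assumed max heat the VRM cooling can shed

for volts, amps in [(1.0, 100), (2.0, 50), (2.0, 100)]:
    cpu_w, in_w, heat_w = vrm_heat(volts, amps)
    status = "OK" if heat_w <= COOLING_BUDGET_W else "thermal overload"
    print(f"{volts}V @ {amps}A -> {cpu_w:.0f}W CPU, "
          f"{in_w:.0f}W input, {heat_w:.0f}W heat ({status})")
```

    As in the post, neither 100A nor 2V alone blows the budget; only the 2V@100A case pushes the ~22W of waste heat past the assumed 15W of cooling.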
      
     
  11. Informative
    D4n reacted to seon123 in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    They're not designed for overclocking. X370 supports overclocking; that's it. Whether or not a board is designed for overclocking depends on the board itself, not the chipset.
  12. Funny
  13. Informative
    D4n reacted to miagisan in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    If you read the thread, you will see someone posted a link to a Reddit post on motherboard quality that someone RESEARCHED to help buyers.
  14. Informative
    D4n reacted to Bravo1cc in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    OP, this is a really useful article if you need it.
     
    https://www.gamersnexus.net/guides/1229-anatomy-of-a-motherboard-what-is-a-vrm-mosfet?showall=1
  15. Like
    D4n reacted to Princess Luna in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    Makes no sense to buy a new motherboard for a couple hundred MHz; just wait for the X570 chipset ones and Ryzen 2 at this point if OP is THAT bothered by his performance.
  16. Like
    D4n reacted to bowrilla in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    Your ignorance amazes me. People actually explained to you how overclocking works: that you're not guaranteed anything beyond stock, that your board is weak and probably not suited for overclocking, that you can't just put in some numbers and expect overclocking to just work, that manual voltage control is needed for proper results, and that it's ultimately a game of (silicon) luck - and you're still complaining about things that are basically just the way things work and are fine for basically everyone else who did a bit of research and successfully maxed out their overclocking potential. But sure, bad bad bad tech companies.
  17. Informative
    D4n reacted to miagisan in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    I wouldn't consider yourself a "seasoned" overclocker if a lot of this is strange to you. Every chip is different (silicon lottery), and motherboard VRMs play a big part in overclocking besides heat and voltage. For example, my ASUS R9 290X DCII OC should overclock very well according to a ton of websites, but if I push anything more than a 40 MHz overclock, it just locks up no matter how much voltage I throw at it. I lost hardcore in the silicon lottery.
     
    Secondly, there is a virtual wall on the original Ryzen chips. 4.0 GHz or higher is not easily attained.
  18. Agree
    D4n reacted to pstarlord in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    I'd say the lesson to be learned here is: learn first, then buy. Sounds to me like you were just ignorant of a few important variables involved and how they would affect your outcome. It's no big deal; you can use this as a chance to learn.
  19. Funny
    D4n reacted to SenKa in AMD B350 chipset not supporting decent overclocking, contrary to what users expect?   
    PEBKAC: Problem Exists Between Keyboard And Chair.
  20. Agree
    D4n reacted to GMAX BT in P106 NOW SUPPORT DirectX (Not Official)   
    Bruh, it won't ship to my country (Bangladesh), and I won't be able to pay more than 75 USD. Thanks.
  21. Like
    D4n reacted to WDK in P106 NOW SUPPORT DirectX (Not Official)   
    The modified driver is clean, and here's how you can check it yourself:
    Download both the modified and original versions. The modified driver is based on driver 416.34; you can download the original one from the official NVIDIA website.
    The driver installer executable can be extracted using 7-Zip. This gives you a folder with the installer files and setup.exe. Extract both the original and modified driver, each to a separate folder.
    Use a comparison tool to check for differences between these two folders. I used Beyond Compare, which has a free trial. This gives you a list of files that differ between the original and modified driver.
    You can now see that the only file that was modified is nv_dispi.inf, which is simply a list of devices. All other files are identical.
     
  22. Like
    D4n reacted to ZephCloud in P106 NOW SUPPORT DirectX (Not Official)   
    You've made it into the video... gratz!!
  23. Like
    D4n reacted to SkySway in P106 NOW SUPPORT DirectX (Not Official)   
    I know my English writing could use some improvement, but I'm confident you guys can understand what I'm saying.
    Here is that "cracked" driver:
    https://drive.google.com/drive/folders/1TN3MJlirDWNYvFanq3RI-EWZ9nDzGtFw?usp=sharing
  24. Like
    D4n reacted to SkySway in P106 NOW SUPPORT DirectX (Not Official)   
    Yo guys,
    Someone from China (yeah, that's my country) has used something like deceptive software to get around NVIDIA's DirectX block on the P106, which, as I'm sure you guys know, is specifically designed for mining.
    I've attached his tutorial file if anyone is interested; since it's in Chinese, I did my best to translate it.
    It seems that after cracking its DX block, this thing's performance can be nearly identical to a GTX 1060 6GB.
    I'm still uploading that cracked driver to my Google Drive; once it's done I'll post the link here.
     
    BTW, how do I let Linus know about this? Maybe LTT could do a video on it.
    P106.docx
  25. Like
    D4n reacted to Omxp in PC shutdown suddenly!   
    Solved it... it was a driver issue when loading Windows at startup.
     
    Thanks!