
Bishop Crane

Member
  • Posts

    88
  • Joined

  • Last visited

Awards

This user doesn't have any awards

System

  • CPU
    2x E5-2690
  • Motherboard
    X9DRi-LN4F+
  • RAM
    16x 4GB DDR3
  • GPU
    1050ti
  • Case
    Corsair Air 540
  • Storage
2x 512GB M500, 2TB P3600, 8TB WD Black, 2x 2TB WD Black
  • PSU
    Corsair Bronze 750W
  • Display(s)
    Iiyama E2607WS, Dell 2007WS
  • Cooling
    2x H115i Pro
  • Sound
    Steinberg UR22
  • Operating System
    Windows 10 Pro


  1. If you're genuinely interested in how it actually works, there are loads of articles that can explain it better than my short forum post, but here goes anyway! Your CPU has 20 PCIe lanes, which need to be divided up between the resources directly attached to the PCIe bus. In your case you have two PCIe x16 slots, each of which wants 16 lanes. That obviously isn't going to work, so the lanes are shared evenly between the two slots and each gets 8. The remaining hardware (NVMe and the x4 slot) shares the remaining 4 lanes. Now, each lane has a lot of bandwidth, so in reality running two storage drives is not a problem: each PCIe 3.0 lane gives nearly 1GB/s, so at x4 you're getting just under 4GB/s, which no NVMe drive is going to saturate. In short, don't worry about it; even if you put it in the other x16 slot and reduce both to x8, the RTX 2070 could only just about saturate that at a major push.
  2. If they instructed you to take the CPU out for them to look at, then in the modern world it really is on them; if you hadn't removed it, it wouldn't have caused damage, blah, blah and all that.
  3. That is the motherboard of a TeamPoS 3000 XL. It's a checkout/till/PoS machine, whatever you want to call it. http://pdfstream.manualsonline.com/3/39ec0ba8-a85a-4a3f-9c27-05ed38015da7.pdf (page 7-6 for a picture of how it looks in the machine) If you can find someone who needs one (unlikely, as they're ancient) they might pay for it, but the technology is fairly niche and 13 years old now.
  4. Always go for dual channel; it is noticeably faster. Your motherboard manual will tell you which slots to put the memory in when you have only 2 of the 4 populated. For upgrading later on, if it's exactly the same make and model of memory, it will be perfectly fine, yes. It's only when you have what appears to be the same memory but is slightly different (speed, or CAS latency) that it's an issue.
  5. Yep, that's not a problem, it will likely format as FAT32 but that's normally fine.
  6. The other thing you may need to do is get specific drivers for it. Quadros are not generic like most graphics cards, so a Dell Quadro 2000 can sometimes need Dell drivers.
  7. It can happen if the dryer coils (motor or heater) aren't well shielded. Assuming your PC was off at the time it should be fine, as it should have earthed the short. If you're booting OK, it looks like you got away with it. For the future, hold the hair dryer a good 4 - 6 inches away from anything to prevent that happening.
  8. Usually FAT, but some more modern motherboards will accept NTFS. If it doesn't say otherwise in the motherboard manual, I would go with FAT to be safe.
  9. It should be totally fine. For the most part there isn't anything complicated going on here; it's just an extension cable. The quality of the cable limits the overall maximum distance you can use, but your extra 200mm should be totally fine even if the limit were half as much.
  10. Good luck finding a motherboard to run it. The highest TDP I know of on the 2011 socket is the 2687W at 160W, and most motherboards can't even run that! Beyond that, it's finding a motherboard that recognises it! My suspicion, however, is that it's a fake heat spreader, as it has no FPO on it. Go buy one, let us know!
  11. Sad thing is, I've seen this and fixed something like this before. If you yank your graphics card out hard enough without opening the retainer catch on the slot, I would say it's easily done, but only if you're the Hulk. There are 3 ways to fix it. 1. Make sure none of the pins are touching. Bend them, cover them individually in tape or something, whatever is easiest for you. 2. Cut them off. Just be careful not to damage surrounding components, and make sure that the stumps aren't touching. 3. De-solder them. The hardest and most time consuming, but the best possible outcome. I went for option 2 because, well, it's the easiest. Make sure you catch all of the metal offcuts so they don't damage anything. Just pray that shorting the board hasn't damaged anything further.
  12. Fairly certain it's in the BIOS, under Advanced; I think it's Onboard Devices. There should be the option to set the bandwidth on the PCIe x16_2 slot to x4 mode; this will then disable both the USB3_E65 and the PCIe x1_1 slot.
  13. What CPU do you have, and which slots are the devices in? It changes slightly how things work. The key thing is that the PCIe x16 slot at the edge of the motherboard only runs at x4 maximum, plus it shares bandwidth with the M.2. Also, the PCIe x16_2 (the one next to the Thunderbolt header) shares bandwidth with one of the x1 slots as well as the USB3_E65 connector. The key thing from the info is that PCIe x16_2 by default runs at x1 for resource optimization. Hope that helps!
  14. Basically this. But in more detail: in the server environment you only have a set space to work in (19 inches to fit a standard rack). So to fit 8 CPUs on a single motherboard, the CPUs and RAM are put onto daughter boards; the "motherboard" just has chipsets and IO, and the CPUs/memory run from these expansion boards. If you want a good example to look at, the Sun X4600 has some good pictures showing how it all ties together. The one you picked out is a slight deviation from the norm in that it runs the CPU from a PCI slot; I believe that's just for power, which limits the maximum size of CPU you can use. The real thing to be aware of is that the QPI link (the black block on the back) would need a special slot on the motherboard to connect to. The QPI is how the CPUs intercommunicate, which is why you can't run a CPU from a PCI slot alone.
  15. Do you have any examples? Basically, it's a small form factor machine and it has 2 PCIe x16 slots. It's very small; one slot is occupied by a dual 10Gb NIC and the other has the USB 3 adapter in it. It does have onboard USB 3, but those ports all share a single controller, which gets overwhelmed quickly. The cost of replacing the machines with something larger to accommodate more cards is a lot, so a few hundred extra $ is worth it in comparison.
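The lane-bandwidth arithmetic from point 1 can be sketched quickly. This is just the raw line-rate math for PCIe 3.0 (8 GT/s per lane with 128b/130b encoding), ignoring protocol overhead, so the real-world numbers come out slightly lower:

```python
# Back-of-envelope PCIe 3.0 bandwidth, as described in point 1 above.
# PCIe 3.0 signals at 8 GT/s per lane and uses 128b/130b encoding,
# so usable throughput per lane is 8e9 * (128/130) / 8 bytes per second.

TRANSFER_RATE = 8e9        # PCIe 3.0: 8 GT/s per lane
ENCODING = 128 / 130       # 128b/130b line-code efficiency

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe 3.0 link."""
    return TRANSFER_RATE * ENCODING / 8 * lanes / 1e9

for lanes in (1, 4, 8, 16):
    print(f"x{lanes}: {pcie3_bandwidth_gbps(lanes):.2f} GB/s")
# x1 comes out just under 1 GB/s and x4 just under 4 GB/s,
# matching the figures in point 1.
```

So an NVMe drive on x4 gets roughly 3.9 GB/s, and a graphics card dropped to x8 still has close to 7.9 GB/s to work with.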