
Allegrif

Member
  • Posts: 18
  • Joined

  • Last visited

Everything posted by Allegrif

  1. Thanks for your input, on both my threads. Not necessarily looking for gains, as my budget for this build is much lower than my last, which I no longer own. Just looking for the best bang for buck. It's primarily the quad-channel memory I'd be interested in on X99, but I feel like I probably have an unjustified mental draw to it because it's what I had before, and going dual channel feels like a downgrade when it probably isn't in reality. I'm just excited to finally be upgrading and want to get it right. I've been stuck in the poor house with a Kaby Lake Pentium chip for FAR too long now since life took a nosedive a couple of years back.
  2. Came across a 5960X with an ASUS X99-S motherboard bundled together, overclocked to 4.2 GHz (used, obviously), for £350 all in, which seems like a good price. Would it be worth going down this route and locking myself into a dead platform in exchange for quad-channel memory, more PCIe lanes etc., or would you recommend just going Zen 2? It would cost a bit more and I'd lose the HEDT features, but the platform is very much alive and carries PCIe 4.0 support. Plus it would be new hardware and not 4 years old. Funnily enough, my old rig before I had to sell it (bad times) was a 5960X, and I could only get it to 4.0 GHz regardless of voltage.
  3. Assuming the game has very good SLI support? Guessing the 2080 Ti wins out, but I'm more curious about any numbers/percentages. The info is probably out there, but I'm curious to know the personal experience of anyone who swapped between the two systems. I'm sure someone did. My old rig was 980 Ti SLI before life happened and I had to sell it. Currently on the right track again and about to get a new rig. Not necessarily sticking with Nvidia, but knowing the above would be a good gauge for the new Navi cards too.
  4. Thanks for the advice, I appreciate it. Unfortunately I only seem to have that tickbox for refresh rate, not resolution, and even that option is greyed out. Wouldn't you know it - after the millionth time of unplugging and replugging, I've finally got my beautiful faux 1080p display back!
  5. Believe me, I've been trying all afternoon.
  6. I'm a pauper currently using a crappy 1360x768 32" TV as a temporary monitor. I only play Football Manager on it, which is extremely text/data heavy, and the higher your resolution, the more data you see on each screen without having to dig into menus and submenus. I've been running at this native resolution for ages and just put up with it. However, something happened a couple of days ago when I accidentally yanked out the VGA cable (yes, VGA!): when I put it back in, it did what it always does and set the resolution to 800x600... because of course it did. When I went into display options, to my exhilarated surprise, it gave me the option of setting 1920x1080 as the resolution! It's never done that before. I selected it to see what would happen, and it did exactly what I wanted it to do. It downsized everything to the scale it would be if the TV truly were a 1080p display. It obviously isn't, and a lot of sharpness was lost as a result, but I was perfectly okay with that. It was comparable to watching a YouTube video of someone on Football Manager with a 1080p display on my 768p TV. It wasn't as sharp as 1080p would be, but everything was legible and loads more data was displayed. I wanted to keep it like this and even went so far as to leave the PC on 24/7 for a couple of days, but sadly last night we had a 5-minute power cut, and when I booted back up it will only let me go to 1360x768 again. I've tried setting custom resolutions in the Intel HD Graphics Control Panel but with no joy. Any ideas how to get this back until I can get my hands on a better monitor? (One scripted approach is sketched below.)
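A minimal sketch of re-applying the mode programmatically, assuming pywin32 is installed (pip install pywin32). Note the caveat: ChangeDisplaySettings can only apply modes the driver already enumerates, so if the Intel driver stopped exposing 1080p after the power cut, this will refuse it just like the control panel does.

```python
# Sketch: try to re-apply 1920x1080 via the Win32 display API (pywin32).
import win32api
import win32con

# Start from the current mode so refresh rate, colour depth etc. carry over.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = 1920
devmode.PelsHeight = 1080
devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

result = win32api.ChangeDisplaySettings(devmode, 0)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print("Switched to 1920x1080 (downscaled by the GPU).")
else:
    print(f"Driver refused the mode (code {result}).")
```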
  7. I think it's a legitimate question with a potentially interesting answer if the consensus is that the performance isn't all that different given the difference in price. And it's a build I would consider looking at.
  8. General PC Enthusiast-ness; I like things that are powerful for no real applicable reason. I suppose let's go with overall multicore performance.
  9. How would 2x Xeon E5-2667 v2 (8-core Ivy Bridge, up to 4.0 GHz) on a dual-socket board hold up against the 1950X? Obviously the 1950X would win, but by how much? I'm just curious, as I could get a dual-socket board for £300 and put one of these in for £250, adding the second later as an upgrade. It would also allow for silly amounts of cheaper RAM. (Rough ballpark maths below.)
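A naive ballpark, purely cores times clock, under assumed all-core speeds (exact boost behaviour varies by board). It deliberately ignores IPC, which is where the 1950X actually pulls ahead of Ivy Bridge:

```python
# Naive throughput model: cores x assumed all-core clock. This ignores IPC,
# memory bandwidth and the dual-socket NUMA penalty entirely; Zen's per-clock
# advantage over Ivy Bridge is the main reason the 1950X wins in practice.
xeon_cores, xeon_clock = 2 * 8, 3.6   # 2x E5-2667 v2, assumed ~3.6 GHz all-core
tr_cores, tr_clock = 16, 3.7          # 1950X, assumed ~3.7 GHz all-core

print(f"Xeons: {xeon_cores * xeon_clock:.1f} aggregate GHz")   # ~57.6
print(f"1950X: {tr_cores * tr_clock:.1f} aggregate GHz")       # ~59.2
# Near parity on raw clocks, so expect the real multicore gap to track
# Zen-vs-Ivy-Bridge IPC rather than core count or frequency.
```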
  10. Basically I want to put a Samsung 960 Pro in an ASRock EP2C602-4L/D16 via a PCIe adapter card. Would this work, or do some motherboards (thinking because it's a server board) just flat out not support it? The PCIe slots on the board are mostly Gen 3 x16, so bandwidth isn't a problem. I just don't know if the board would 'support' NVMe... Also, if it did work, could it boot from it? SSD noob here.
  11. I got this for a steal on Amazon to temporarily replace my Intel HD graphics until I can get a better card later. I literally just wanted it to play FIFA, as even my crappy Celeron and 4GB of RAM seem able to play it going by YouTube; it's just the Intel graphics that struggle. When I installed the card and connected the video output, it booted up fine, but the image was stretched, presumably because of a lack of drivers. I then installed the drivers and it reverted to the correct resolution. However, there were random black squares/artifacts all over the screen which weren't present before installing the driver. After an hour or so of troubleshooting, the issue just got worse. I can now no longer get to the desktop using the discrete GPU - it gets to the login screen and then within a few seconds it black screens. The issue persists regardless of whether I use HDMI or VGA, although the image looks slightly different with each type of output. Windows 10 is my primary OS, but I have a dual boot of Linux and the issue was also present there. I'd send it back, but I have a feeling it's not the card. The Amazon listing specified that it had been tested as working and comes from a reputable seller. Could it be my PSU? I have a craptastic, off-brand, made-in-China 450W PSU. I'm wondering if it might not be delivering the full 450W of rated power? Could that cause the issues above? I have a big pay day coming up soon and I'm planning to dump the entire rig and start again with much higher-end parts, but until then I'm scratching my head trying to get it to work... System: Celeron G3900, H110M-K, 4GB DDR4, 3x 7200RPM HDD
  12. When are we expecting it? Seeing differing dates across the internet ranging from June (well gone) to August to October...
  13. It seems fairly agreed upon from what I've read that quad channel won't do an awful lot for gamers, but is the leap from a single DIMM to a dual-channel kit significant? I'm currently running a single 4GB stick in a new rig I have a set upgrade path for, and wondered how much extra performance I can expect from upgrading to dual channel (other than the obvious benefit of higher capacity). Quick bandwidth maths below.
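A quick back-of-envelope on peak bandwidth, assuming a common DDR4-2400 stick (each DDR4 channel is 64 bits wide, i.e. 8 bytes per transfer); substitute your actual memory speed:

```python
# Peak theoretical bandwidth = transfer rate x 8 bytes per 64-bit channel.
# DDR4-2400 is an assumed example speed.
transfers_per_second = 2400e6
bytes_per_transfer = 8                                 # 64-bit channel width
single = transfers_per_second * bytes_per_transfer / 1e9
print(f"Single channel: {single:.1f} GB/s")            # 19.2 GB/s
print(f"Dual channel:   {2 * single:.1f} GB/s")        # 38.4 GB/s
# Paper bandwidth doubles; real gaming gains are typically single-digit
# percentages, though iGPUs and memory-bound workloads benefit far more.
```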
  14. He's played one game since about 2007. I don't think he'll venture too far away from it lol. He really doesn't do change. I've tried to get him to try Cities: Skylines on my PC, which is basically SimCity 4 but more modern and much better, but he doesn't even want to entertain that idea.
  15. That was my go-to, but I reckon the 7350K would give him an entire extra GHz once overclocked. That's a lot of extra power in a single-threaded app. It's quite a bit more expensive, though, too. Decisions, decisions...
  16. So my autistic brother runs a Celeron G3900. It's terrible. It makes him want to spray lemon juice in his neighbours' eyes when doing... anything. He mainly uses this PC for standard stuff like web browsing etc., but he also plays the ancient title SimCity 4, with hundreds upon hundreds of third-party mods and add-ons. Being so old, SC4 is purely single-threaded, but we find it's actually surprisingly CPU-intensive. It completely maxes out one of his 2.8 GHz cores and causes a lot of lag with occasional crashing. I know everyone laughs at the 7350K for being mainly useless, due to the true quad-core i5s being barely more expensive and far superior for most gaming applications, but he would have literally no use for those extra 2 physical cores. They won't make YouTube faster and they won't make SC4 faster. Would going for the 7350K and overclocking the balls out of it (4.9-5.0 GHz?) be my best option to upgrade him? (Rough speedup maths below.)
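For a rough idea of the headroom, assuming the overclock actually lands at 5.0 GHz (not guaranteed on every chip): the G3900 (Skylake) and 7350K (Kaby Lake) have essentially identical IPC, so a purely single-threaded game should scale close to the clock ratio:

```python
# Single-threaded speedup estimate: same IPC (Skylake vs Kaby Lake), so the
# gain is roughly the clock ratio. The 5.0 GHz overclock is an assumption.
g3900_ghz = 2.8
oc_7350k_ghz = 5.0
print(f"Estimated speedup: ~{oc_7350k_ghz / g3900_ghz:.2f}x")  # ~1.79x
```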
  17. Thanks guys, so the CPUs having 16 lanes isn't actually as horrible as it's made out to be, as most people don't realise the chipset provides a bunch of other lanes anyway for the M.2s etc.? If I'm understanding this right, the main concern with only having 16 CPU lanes is less expandability in terms of traditional physical PCIe slots, for 3x GPU setups or 2x GPU plus other expansion cards etc.? And as far as the M.2 slots go, it depends on the board as to whether a 44-lane CPU would allow the slots to use the CPU lanes or still restrict them to the chipset lanes? And everything I do know I've basically just learned from watching Linus and a couple of others like Gamers Nexus on YouTube.
  18. First off, this is my first post on LTT as a relatively recent PC convert, so uh... hi. I need something clearing up in my head, not because I'm thinking of buying (I'm fancying Threadripper) but just because I can't get my head around it, and the numerous other sources I've read seem to give completely conflicting information. X299 motherboards are ideally built with 44 PCIe lanes in mind, so I know putting a 16-lane CPU in there is going to disable a lot of features. My confusion comes from the additional chipset lanes. If a board has 3 M.2 slots, and is configured to give all 16 lanes of a 7640X/7740X to the top expansion slot (GPU, for example), does this render the M.2 slots completely useless, or are they still connected through the chipset? If they are connected through the chipset, would putting 3 M.2 SSDs in RAID 0 be bottlenecked by said chipset? (Some bottleneck maths below.) Further to this, if you went and put a 44-lane CPU in later, would this allow the M.2 slots to connect straight to the CPU rather than the chipset, or are these slots always going to use the chipset regardless of CPU lanes available? So much confusion on this platform as a newbie, it's unreal.
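On the RAID 0 question specifically, a rough sketch of the ceiling, assuming the X299 chipset's DMI 3.0 uplink (electrically a PCIe 3.0 x4 link with 128b/130b encoding):

```python
# DMI 3.0 is effectively PCIe 3.0 x4, so all chipset-attached devices
# (M.2 slots included) share roughly one x4 link's worth of bandwidth.
lanes = 4
gt_per_s = 8.0                 # PCIe 3.0: 8 GT/s per lane
encoding = 128 / 130           # 128b/130b line encoding overhead
ceiling_gb_s = lanes * gt_per_s * encoding / 8
print(f"DMI 3.0 ceiling: ~{ceiling_gb_s:.2f} GB/s")   # ~3.94 GB/s
# Three NVMe drives that each read ~3 GB/s alone would still top out around
# ~3.9 GB/s combined through the chipset, versus ~9 GB/s on CPU lanes.
```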