amelius

Member
  • Posts: 48
  • Joined
  • Last visited
amelius's Achievements

  1. Just to clarify, other X399 platforms were confirmed to work by several users, including other Asus boards; the issue is exclusive to the flagship X399 board, the ROG Zenith Extreme. Additionally, the board is fine with each GPU when the other is disabled and only has this issue when two GPUs are used at once, which leads me to think it's not a hardware but a software issue: the physical hardware in each slot works with these GPUs, it's just the POST check that fails for some reason.
  2. I didn't say it's cost effective... I said that the only case in which it's cost effective is when there's no other option for improvement... It has low marginal improvement per dollar, but if you've run out of other upgrades to make, it's cost effective because there are no other options. But yes, agreed, it's not a common issue, and it's definitely not for everyone. My rig: ROG Zenith Extreme, Threadripper 2990WX, 128GB Trident Z DDR4-2933, 2x 2080 Ti, 1x Titan V, 3x 960 Evo, all water cooled in a custom loop with the CPU (monoblock) and all 3 GPUs in it, 3 radiators, and quick disconnects everywhere, plus an Acer Predator 1440p 165Hz IPS G-Sync display. So yeah, in my case, moving from 2x 1080 Ti to 2x 2080 Ti was the only reasonable upgrade left. Now, if only my damn GPUs worked with my motherboard instead of throwing some bizarre error, I'd be very happy.
  3. I've never had screen tearing issues, nor stuttering, nor games failing to launch. There have been a few times where performance in games wasn't great, but supposedly the new NVLink SLI bridge helps with that a lot and most games scale at least 50%, which is good enough for me (there's a quick example of what that scaling means for frame rates after this list). Is it cost effective? Only if you can't upgrade any further. Given that my system's specs are TR 2990WX, Titan V, 2x 2080 Ti, I really don't have anywhere else to make performance gains.
  4. Agreed, especially in a custom water cooling loop like mine, with UV reactive fluid.
  5. Mostly, I'm concerned with Asus's apparent lack of interest in compatibility or in helping, just brushing the issue off and saying we should contact support to check whether our parts are compatible.
  6. Well, in this case, a fair number of users on different forums had the exact same reproducible issue, and all of them had sufficient PSUs. It also occurs with Titan Vs in combination with RTX cards. Also, several tests by techtubers have shown that SLI scaling with the new NVLink bridge is substantially better than with the old SLI bridges, resulting in 80% or higher scaling in quite a few titles, which seems worth it. Additionally, I'm guessing that people who hit this issue with RTX + Titan V configurations (like myself) are using these cards as budget machine learning cards, since they have tensor cores; that's a good reason to have a multi-GPU setup (there's a quick way to verify both cards are usable for that sketched after this list). In this case, however, even having the GPUs plugged in with no bridge results in the motherboard failing to POST with the error "0E Load VGA Bios", while each GPU works fine on its own when the others are disabled.
  7. Asus seems to think that supporting new hardware isn't important. Numerous others and I have encountered an issue where two RTX 2080 Tis or RTX 2080s fail to POST when both are enabled. Several forum threads have documented the same issue over the last several weeks, with no response from Asus other than "clear your CMOS" and "check your psu", which had no effect for anyone. https://forums.geforce.com/default/topic/1074074/asus-zeneth-extreme-x399-rtx2080-sli-not-working/?offset=10 https://rog.asus.com/forum/showthread.php?96162-Zenith-Extreme-bug-report-form https://rog.asus.com/forum/showthread.php?105105-Code-0E-Load-VGA-Bios One of their support agents went as far as to say it isn't supported because it's not on https://dlcdnets.asus.com/pub/ASUS/mb/socketTR4/ROG_ZENITH_EXTREME/RZE_Devices.pdf?_ga=2.162430392.527285745.1539026443-891392033.1492966352 saying: "There is no listed stability with two 2080ti cards" "You can visit the support page to confirm when the BIOS is released." "I regret that you spent that amount of money [name]. Please await our confirmation to ensure that this works. You could've contacted us before hand to check but I sincerely regret the experience." Other users have had similar experiences, with Asus support entirely uninterested in helping and no resolution whatsoever. It seems the ROG Zenith Extreme (and specifically that board) is incompatible with Turing or Volta multi-GPU configurations, and Asus is uninterested in resolving this. People building or upgrading HEDT systems should be careful and avoid this platform, both because of the incompatibility and because of the horrible customer support.
  8. Well, the first two going into my workstation I can write off on taxes as a business expense, since I use that machine for work... I wouldn't be able to write off a second set, since it'd only be in the personal rig... Trying to avoid that.
  9. I have all of the above, and I actually have a background in EE; I was just hoping a product or combination of products existed to help. But as you pointed out before, such a product almost certainly doesn't exist because there's no demand for it... there are very few scenarios, even in enterprise, where this would ever be desired. I don't think it's necessarily infeasible so much as it has very little, if any, demand.
  10. Actually, yeah. Ideally, I'd like there to be some sort of easy way to switch them without plugging/unplugging cables, but ultimately, the worst-case scenario is literally just switching riser cables.
  11. a) See my previous comment; the stuff in the video doesn't really address what I want... b) That's exactly what I'm trying to do; I want to move from my 1440p 165Hz display to a 4K 120Hz display.
  12. This isn't quite what I'm looking for; it looks like this is more about bifurcation, which I don't want. I want a dumb switch that moves the electrical connection from one GPU to a different GPU: basically, a PCIe riser version of a single-pole double-throw switch for PCIe x16 risers.
  13. Again, I want to use the 2080 Tis together with NVLink when I'm gaming, and move them back over to the workstation when I'm not... If it were a 1080 Ti, sure, I'd go for it, but I want to use the 2080 Tis to their maximum potential in both kinds of workloads... I don't understand why people refuse to actually help me solve the problem I came here with... I guarantee that the simplest solution, which is just having two daisy-chained PCIe risers that I manually switch between, would cost less than getting another two 2080 Tis... I'm just trying to do better than the simplest solution.
  14. I'm asking for help figuring out a custom solution... Do you realize you're saying "buy a crappy machine for gaming when you have all this incredible hardware to use"? My current setup performs *well*, but it doesn't perform perfectly... It's bottlenecked, but it'd still beat the hell out of a 2200G + $200 GPU... I have a 1440p 165Hz G-Sync display... The goal here isn't "I need a machine to game, I can't game", it's "I have hardware that I can't take full advantage of in a specific case, and I want to find a way to use it to its full potential". I'm 100% fine with a crazy solution... And I do in fact already use hardware virtualization, but that's not the solution here, because it doesn't let you have the hardware in different machines without huge performance impacts. I virtualize and can assign any or all of my GPUs, CPU cores, and RAM to specific virtual machines on the workstation, and currently I game in one of the VMs (before you ask, no, that's not the source of my performance hit; that's maybe a 1-2% difference; there's a small sketch for checking passthrough-capable devices after this list). I'm asking a very simple set of questions: 1) Can I join two PSUs together so that they safely power both computers while playing nicely together? I know such options exist for powering a single computer with two PSUs, but I'm wondering if I can do that with two different PSUs, so that they share a ground and I can run GPUs off them and plug them into different machines without worrying. 2) Are there *physical* switches that exist, or could be made, that would let me electrically switch from one PCIe 3.0 x16 riser cable to a second cable with the push of a button, so that I could "move" the GPU from one machine to another, not virtually but physically?
  15. Well, I already bought 2x 2080 Ti for the workstation, which currently has two 1080 Tis and a Titan V in it, and I don't want to buy another pair... I want to be able to use them fully as I need them in different cases... So far, nobody has actually tried to answer my question; everyone has just been suggesting I do something else. Believe me, if something else were a good solution, I'd have taken it; I've already considered those options and ruled them out for a reason... I'm asking how to solve this specific issue. I have a fully working workstation already; I want to add a second, more gaming-oriented machine that won't be CPU bottlenecked and can fully utilize those GPUs for gaming when I do game, but when I'm working, I want those GPUs back in the workstation.
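
To make the scaling numbers in items 3 and 6 concrete, here is a tiny illustrative sketch of what 50% vs. 80% SLI/NVLink scaling means for frame rates; the single-card baseline FPS is a made-up example number, not a benchmark result.

    # Illustrative arithmetic only: what "50%" vs "80%" multi-GPU scaling means for
    # frame rates. The single-card baseline is a hypothetical example, not a benchmark.
    def dual_gpu_fps(single_card_fps: float, scaling: float) -> float:
        """Effective FPS with two cards, where scaling=1.0 would be a perfect 2x."""
        return single_card_fps * (1.0 + scaling)

    baseline = 100.0  # hypothetical single-card FPS in some title
    for scaling in (0.5, 0.8):
        print(f"{scaling:.0%} scaling: {dual_gpu_fps(baseline, scaling):.0f} FPS")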
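
For the machine learning use case in item 6, here is a minimal sketch, assuming a PyTorch build with CUDA support is installed, that checks which NVIDIA GPUs are visible and whether each has tensor cores (compute capability 7.0+, i.e. Volta or Turing). Device names and ordering will vary per system.

    # Check which NVIDIA GPUs are visible and whether each has tensor cores.
    # Assumes a PyTorch build with CUDA support is installed.
    import torch

    if not torch.cuda.is_available():
        print("No CUDA devices visible - check that the system POSTed with the GPUs enabled.")
    else:
        for i in range(torch.cuda.device_count()):
            name = torch.cuda.get_device_name(i)
            major, minor = torch.cuda.get_device_capability(i)
            tensor_cores = "yes" if major >= 7 else "no"  # Volta (7.0) and Turing (7.5) qualify
            print(f"GPU {i}: {name}, compute capability {major}.{minor}, tensor cores: {tensor_cores}")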
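
For the GPU passthrough setup mentioned in item 14, here is a minimal, read-only sketch assuming a Linux host with the IOMMU enabled (intel_iommu=on or amd_iommu=on); it lists IOMMU groups from sysfs so you can see which PCIe devices, such as each GPU, can be assigned to a VM independently.

    # List IOMMU groups and the PCI devices in each, to see which devices can be
    # passed through to a VM independently. Read-only; assumes a Linux host with
    # the IOMMU enabled in BIOS and on the kernel command line.
    from pathlib import Path

    groups_root = Path("/sys/kernel/iommu_groups")
    if not groups_root.exists():
        print("No IOMMU groups found - IOMMU may be disabled in BIOS or on the kernel command line.")
    else:
        for group in sorted(groups_root.iterdir(), key=lambda p: int(p.name)):
            print(f"IOMMU group {group.name}:")
            for dev in sorted((group / "devices").iterdir()):
                pci_class = (dev / "class").read_text().strip()  # e.g. 0x030000 = VGA controller
                print(f"  {dev.name}  class {pci_class}")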