TheGermanEngie

Member · 280 posts
Everything posted by TheGermanEngie

  1. Hi all - With Nvidia's recent announcement of driver 465, beta testing of GeForce card passthrough into a VM is now possible. I want to set up a W10 VM on an Ubuntu host OS. I have three GPUs in my system and want to make sure everything will work. Specs are as follows:
     Ryzen 1700, 32GB RAM
     ASRock X370 Taichi, latest UEFI BIOS
     GPUs: 1x Quadro K600 (in the lowest PCI-e x16 slot, running at x4, for display), 1x Titan X Maxwell (PCI-e x16 @ x8), 1x Tesla M40 24GB (PCI-e x16 @ x8)
     My plan is to use the K600 for display, the Tesla M40 for Linux, and the Titan X Maxwell for Windows 10. Where should I start? What VM program should I use? I am all ears. (See the IOMMU-group sketch at the end of this list.)
  2. Nothing worked. I'm completely dumbfounded... I guess I'll have to use Ubuntu, since that's the only OS that seemed to work. Edit: this guy on L1T seems to have had the same problem. Looks like a memory allocation issue. https://forum.level1techs.com/t/pc-will-not-boot-with-tesla-m40/160416/17
  3. I just tried that. It came back up but with the same exclamation point as I had before. Should I try again with Above 4G MMIO enabled?
  4. Enabling SR-IOV made no difference. I also tried enabling "Above 4G MMIO" with the 40 bit address. Still no difference.
  5. I will try to find whatever relevant network settings that post is referencing, but I'm highly skeptical because they're running a modded BIOS. I'm completely stock.
  6. I have an update to my findings: disabling CSM on its own doesn't affect my network connection, but enabling "Above 4G Decoding" afterwards does. The M40 is UEFI ready, as is my Titan X.
  7. I have an AMD ASRock X370 Taichi, UEFI Rev. 6.20.
  8. Hi all - on the latest build of Windows 10 (20H2), when I disable CSM in UEFI the network adapter fails: it won't recognize any drivers, there's no sign of a connection, and the NIC just blinks yellow. I need CSM to be disabled because my Tesla accelerator needs "Above 4G Decoding" enabled in order to POST, and enabling Above 4G Decoding requires that CSM be disabled. I'm in a big catch-22. I'm using a Titan X Maxwell as a display GPU and a Tesla M40 24GB in the second PCI-e x16 slot. Interestingly, I had no issues disabling CSM in the past, but back then I used a small low-profile Quadro for display instead of the Titan X, because the Tesla has no display outputs. Help and suggestions are very much appreciated. If need be I can move this to the Windows thread. Edit 1: Disabling CSM on its own does not cause the network failure; enabling "Above 4G Decoding" is what does.
  9. Interesting. That means I should probably put my Titan in the bottom slot and leave the open-air Tesla in the top slot. The Tesla only peaks at around 50°C, but there's still a decent amount of heat coming off it. Not to mention the Titan already runs at 70°C at close to max fan speed.
  10. I have the Phanteks Enthoo Pro M. ATX Mid tower.
  11. I have the ASRock X370 Taichi. Full ATX.
  12. Hi all - I'm mostly running a lot of machine-learning-intensive programs on my desktop. For this reason I modded a server GPU (Tesla M40 24GB) to fit an Arctic Accelero cooler so it can have active cooling. Problem is, it takes up three PCIe slots, and in my ATX mid tower that's a lot of GPU real estate. I want to put my old Titan X Maxwell in alongside the M40 so I can run two models at the same time, but my case doesn't have the headroom to give the Titan ample breathing space. Are there any full-size ATX cases out there that are relatively inexpensive and can hold a 3-slot + 2-slot card config? (Edit: I still have the original cooling block for the Tesla, so if I should buy a used server rack for 2-slot cards, that works too.) Thanks.
  13. Hello all - My saga of the modded Tesla M40 continues. The first problem I've continuously had: GPU-Z shows all 24GB of VRAM in use, even at idle. Running workloads doesn't seem to matter, and the card processes workloads just fine without any OOM errors, but GPU-Z, HWiNFO, and MSI Afterburner all say the memory is fully used. I've tried driver reinstalls and Windows updates and can't get anything to work. Maybe it has to do with me using a non-UEFI GPU (Radeon 5450) for display? Or that it's installed on W10 Pro? The Tesla is in the first x16 slot and has full PCI-e bandwidth. Also, in Task Manager neither the Tesla nor the Radeon 5450 powering the display shows up. (See the memory-query sketch at the end of this list for a way to cross-check what the driver itself reports.)
      Tesla drivers: 452.39 for Windows 10 Pro, CUDA 11.0, 9/22/2020, top x16 PCIe 3.0 slot
      Radeon drivers: 15.201.1151.1108, 11/4/2015, bottom x16 PCIe 2.0 slot
      Windows 10 in UEFI mode, GPT disk, CSM disabled.
  15. Thread solved... I had to remove the washers from the cooler in order to make sufficient contact, even though the manual requires using them. The card only goes up to 60°C now without the washers, and contact-wise the mount feels more solid. Weird.
  16. The card has absolutely no physical damage. In fact, it and the Titan X are about 98% identical in board layout, differing only by a few resistors near the chokes. So if this cooler can mount on a Titan X, it should mount just fine on this Tesla. All the VRAM and chokes are covered with the included thermal pads.
  17. I just remounted it with a fresh application of MX-5. It still shoots up to 90°C and throttles itself to a 300 MHz clock. The fans are plugged into the 12V adapter. It also weirdly reports that the VRAM is full even when idling. I'm using it purely for compute; I have an AMD 5450 plugged into the PCIe 2.0 slot for display because the Tesla has no video outs. Might it be driver related?
  18. Hi all - I recently bought an old Tesla M40 with 24GB of VRAM to replace my Titan X Maxwell for deep learning. As it is a passively cooled server card, I bought an aftermarket Arctic Accelero IV cooler and installed it so the card wouldn't overheat. However, even though the Tesla is basically just a Titan X with more VRAM (same GM200 die), it doesn't have a fan connector on the PCB, because it is meant to be cooled by external server fans. Thankfully Arctic thought of this and included a Molex fan adapter with the cooler, but it runs the fans at a static speed and the card still overheats. I still have my Titan X fully assembled. Would it be possible to de-solder the fan connector from the Titan and re-solder it onto the Tesla so the GPU has control over the fan speeds? Or will the VBIOS not do anything because the card isn't meant to have a fan in the first place? Suggestions are welcome.
  19. The 1660 Ti is running at full clocks. The reason I chose the 2060 is its support for FP16 computation on tensor cores, so I could get roughly double the performance for the same memory usage.
  20. Hi all - I've been thinking of selling my HP Omen 15, which I got refurbished for $1,200. It's decently specced, with a Ryzen 4800H, 16GB RAM, a 1660 Ti, and a 1TB SSD, but for what I do with computing I've been thinking of splitting that cost between a powerful thin-and-light and reusing my old desktop. I train lots of AI models. The 1660 Ti can handle it, but training takes a lot of time and the laptop is fairly loud under heavy use. I was thinking of buying an RTX 2060 on eBay once the RTX 3000 series drops and fitting it into my old rig, which I can put in the basement and remote into via TeamViewer or SSH so it can do its own thing without me hearing it. That would be maybe $250-300. I do the majority of my day-to-day tasks on the cloud and web (Spotify, Wix, general browsing), so I'm looking at the Lenovo IdeaPad series with Ryzen for around $600. That would save me about $300 compared to having one laptop that does it all, while gaining more usability. Any laptop/hybrid users have suggestions? I do like having a machine that can do anything, but for the majority of my computing I'm fairly mobile and don't want to be confined to one task at a time.
  21. Hi all - I'm looking to upgrade my desktop from a 1660 Ti to a 2060 sometime soon to help speed up my deep-learning face-recognition CNN. I've heard the tensor cores can essentially fit twice as much in the same memory because they can execute FP16 computations instead of the normal FP32, and image processing is okay with that because it's effectively just further image compression. I want to splurge on a 2060 now (preferably used), but I get the feeling I should wait until Ampere/the 3000 series launches (rumored for September) so current RTX prices drop and I could snag a 2060/2060S for under $300. For my use case, should I buy one now or wait? (See the mixed-precision sketch at the end of this list.)
  22. Hi all - I've been embarking on a programming journey and have gone down the rabbit hole of how deepfakes, neural networks, and artificial intelligence work. I frequently come across programs (is that what they're called?) named TensorFlow, Jupyter, and Anaconda (or conda) in the realm of Python programming. I'm very new to this whole field; if someone could explain what these all are and how they fit together, that would be wonderful. I get the feeling Python and programming are in my future. I love how easy Python is to get started with compared to other languages, and the possibilities seem endless. (See the small end-to-end example at the end of this list.)
  23. It's a 144 Hz IPS panel covering 95% of the sRGB color space. Pretty decent.
  24. Hi all - I've made a couple of threads in the past about buying a Ryzen 4000 gaming laptop because of the value, but wait times are to the point where it will take weeks to order and weeks to ship. So I'm looking at local deals in my area for around $1,000. I came across this deal at my local Micro Center for a Maingear Vector laptop. It has good reviews compared to the Lenovo Legion Y570 and other laptops in its class, and it's currently open box for around $1k. Is this a deal?
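
Following up on post 1 (the passthrough plan): a minimal sketch, assuming a Linux host with IOMMU enabled in firmware and on the kernel command line, of listing IOMMU groups before deciding which GPU goes to the guest. The sysfs path is standard; the grouping itself depends on the board and BIOS.

```python
#!/usr/bin/env python3
"""List IOMMU groups and the PCI devices inside each one.

A GPU intended for passthrough should ideally sit in its own group
(or share a group only with devices that are also passed through).
"""
from pathlib import Path

IOMMU_ROOT = Path("/sys/kernel/iommu_groups")

def list_iommu_groups() -> None:
    if not IOMMU_ROOT.exists():
        print("No IOMMU groups found - is IOMMU enabled in the BIOS and kernel?")
        return
    for group in sorted(IOMMU_ROOT.iterdir(), key=lambda p: int(p.name)):
        devices = sorted(dev.name for dev in (group / "devices").iterdir())
        print(f"Group {group.name}: {', '.join(devices)}")

if __name__ == "__main__":
    list_iommu_groups()
```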
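
Following up on posts 13 and 17 (all 24GB of VRAM reported as used at idle): a small cross-check sketch, assuming the nvidia-ml-py package (imported as pynvml) is installed, that asks the NVIDIA driver directly how much memory is allocated. If NVML reports the memory as mostly free while GPU-Z, HWiNFO, and Afterburner show it full, the problem is in the monitoring path rather than a real allocation.

```python
# pip install nvidia-ml-py  (provides the pynvml module)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; bump the index on multi-GPU boxes

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(name)
print(f"total: {mem.total / 1024**2:.0f} MiB")
print(f"used:  {mem.used / 1024**2:.0f} MiB")
print(f"free:  {mem.free / 1024**2:.0f} MiB")

pynvml.nvmlShutdown()
```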
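
Following up on posts 19 and 21 (FP16 on tensor cores): a minimal sketch of what the claim looks like in practice with Keras mixed precision, assuming TensorFlow 2.4+ and a tensor-core GPU; the toy model here is a placeholder, not the face-recognition CNN from the post. Compute runs in float16 while the weights stay in float32 for numerical stability, which is where the memory and throughput savings come from.

```python
import tensorflow as tf

# Compute in float16 on tensor cores, keep weights in float32 for numerical stability.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(96, 96, 3)),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    # Keep the final classification layer in float32 so the softmax stays stable.
    tf.keras.layers.Dense(10, activation="softmax", dtype="float32"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

print(model.layers[0].compute_dtype)   # float16 - activations use half precision
print(model.layers[0].variable_dtype)  # float32 - weights keep full precision
```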
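
Following up on post 22 (TensorFlow, Jupyter, Anaconda): roughly, conda (Anaconda's package manager) creates an isolated Python environment, TensorFlow is the library used to build and train neural networks, and Jupyter is the notebook interface that code is usually run in, cell by cell. A tiny end-to-end sketch, assuming an environment with TensorFlow installed (for example `conda create -n ml python=3.10` followed by `pip install tensorflow notebook`):

```python
import tensorflow as tf

# TensorFlow ships a small demo dataset: 28x28 images of handwritten digits.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixel values to 0..1

# A minimal neural network: flatten the image, one hidden layer, ten output classes.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=1)   # train for one pass over the data
model.evaluate(x_test, y_test)          # measure accuracy on images it has never seen
```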