Search the Community
Showing results for tags 'dual gpus'.
-
Budget (including currency): $1500
Country: Qatar
Games, programs or workloads that it will be used for: Heavy Blender, Apex, Halo, Minecraft, Fortnite
Other details (existing parts lists, whether any peripherals are needed, what you're upgrading from, when you're going to buy, what resolution and refresh rate you want to play at, etc): I am buying parts for a PC and I am a 3D modeler. I am currently working on a crappy Lenovo ThinkPad and want dual 3060s in the future, but only one GPU for now. However, I don't know which motherboard to get that will support dual GPUs. I have already selected these parts: Ryzen 5 7600, MSI Ventus GeForce 3060 12G, Corsair Vengeance DDR5-5200 2 x 16GB, NZXT H5 Flow RGB, MSI Spatium 1TB, Noctua NH-U9S Black, EVGA SuperNOVA 850 G6 80+ Gold. I plan to upgrade storage in the future, and I want to be able to upgrade the GPUs over time. I need some serious help with this.
- 3 replies
- Tagged with: motherboard, sli (and 2 more)
-
Budget (including currency): 1500€ (willing to go a little, max. 100€-150€, beyond that budget for more performance)
Country: Austria
Games, programs or workloads that it will be used for: Programming/game dev, web surfing, 1440p high-settings gaming (Minecraft, GTA V, simulators like Euro Truck Simulator 2), VR (Beat Saber, Boneworks, ...). I really like high-definition gaming.
Existing PC: Hi, I want to upgrade my current PC setup and would like your help with it, but I have some "special" requirements. I switched from Windows 10 to Linux (Linux Mint with Cinnamon) as my primary OS a year ago and am currently dual-booting with Windows 10. I would like to change this setup: a new AMD card for Linux, and my old NVIDIA card for a Windows VM (VFIO-style passthrough + evdev for peripherals + PulseAudio for sound). This means I need:
- An AMD graphics card, so I can finally use Wayland
- A high-wattage power supply with room for a GPU upgrade for the VM down the line (1000W to 1200W?)
- A motherboard whose PCIe slots can handle two graphics cards (preferably at least Gen 4 x8 on both slots, if something like this exists)
- [OPTIONAL] A USB PCIe card with a separate USB controller for the VM
RGB, WiFi and a tempered-glass window in the case are not needed. I would still like to reuse some of my old parts: peripherals, monitors, storage, fans, the GPU for my Windows VM, maybe the case (I would prefer a case with a built-in GPU support bracket), and possibly the CPU cooler (not sure about that, or whether I would need a new mounting kit). I took a stab at configuring an AMD build myself with my local prices sourced from https://geizhals.at/ (mylemon as the vendor most of the time). Feel free to throw this build overboard and start fresh. I am very grateful for any contributions.
-
Hi,
CPU: Intel Core i9-10900K
Motherboard: GIGABYTE Z490 AORUS ULTRA
Memory: G.Skill Trident Z Neo 32 GB (2 x 16 GB) DDR4-3600 CL16
Storage: Samsung 860 1TB 2.5"
Storage: Seagate BarraCuda Compute 2TB 3.5" 7200RPM
Case: Cooler Master MasterCase H500
Power Supply: Cooler Master RM1000 1000W
GPU: ZOTAC 3080 + ZOTAC 3090
OS: Ubuntu 20.04 LTS
Could somebody please suggest a way to stress test both GPUs simultaneously? Thanks in advance.
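Since both cards are NVIDIA and the OS is Ubuntu, one common approach is to run a CUDA stress workload that hits every detected GPU at once (for example the community gpu-burn tool) while logging temperatures and utilization with nvidia-smi. Below is a hedged monitoring sketch, not an official tool: the helper names are mine, and it assumes `nvidia-smi` is installed and on PATH.

```python
# Sketch: poll per-GPU temperature and utilization while a stress workload
# (e.g. gpu-burn running on all CUDA devices) is active in another terminal.
import csv
import subprocess
import time

QUERY = "index,name,temperature.gpu,utilization.gpu"

def parse_smi_csv(text):
    """Parse `nvidia-smi --query-gpu=... --format=csv,noheader` output
    into a list of dicts, one per GPU."""
    rows = []
    for fields in csv.reader(text.strip().splitlines()):
        idx, name, temp, util = [f.strip() for f in fields]
        rows.append({"index": int(idx), "name": name,
                     "temp_c": int(temp),
                     "util_pct": int(util.rstrip(" %"))})
    return rows

def poll(interval=2.0, samples=10):
    """Print a reading for every GPU, `samples` times."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True).stdout
        for gpu in parse_smi_csv(out):
            print(gpu)
        time.sleep(interval)
```

If either card throttles or errors out under sustained load while the other stays healthy, that usually narrows the problem down to one card or its slot/power cabling.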
-
Hello, I recently purchased a 1060 for mining in my gaming PC so that it could run side by side with my 2060.
SPECS
MB - MSI X570-A Pro
CPU - Ryzen 7 3700X
GPU - EVGA 2060 KO
STORAGE - 1 M.2 and 1 HDD
PSU - Corsair CX650M
Each card works normally when plugged into the first x16 slot, which runs at PCIe 4.0 x16. The second x16-length slot runs at PCIe 4.0 x4. When both cards are installed at once, or when either card is installed by itself in that second x16 slot, there is no video signal. See attached for my motherboard manual screenshot. Would really appreciate the help :)
- 5 replies
- Tagged with: help, troubleshoot (and 1 more)
-
Hey all! I'm planning on putting an 8600GT in a computer alongside an RX 570. The reason for this is to use its native VGA output with my CRT monitor (which matches the sleeper case I have for it), since the CRT throws fits when using DP-to-VGA and HDMI-to-VGA adapters. From what I know about laptop graphics, you can switch a program from onboard graphics to the dedicated GPU; but what happens when you have two discrete graphics cards? Is the age difference between the two cards too large to use any form of pass-through? If it matters, the board I'm using has an X370 chipset, with one of the two x16 PCIe slots running at x4 (which would preferably hold the 8600GT), and it will be paired with a Ryzen 2600.
- Tagged with: passthrough, vga (and 4 more)
-
I recently built a PC with a Ryzen 5 3600, MSI X470 Gaming Plus Max and RX 580 8GB. I wanted to use a fourth monitor, but my graphics card only has 3 ports. Then I thought: why shouldn't I use my old, perfectly working GTX 480 just for output? I'm asking how to set up the two cards, because there isn't much information about this. I also have a stuttering problem with my RX 580: when I open a video or anything moving on the other monitor, the game starts stuttering, and whenever my wallpaper changes the game stutters for a second. I wonder if these problems will be fixed if I use the GTX 480 for the other monitors.
- 6 replies
- Tagged with: dual gpus, amd and nvidia (and 1 more)
-
Will I encounter any issues using two GPUs at the same time? I have my better card in my x16 slot and a low-end card in my full-size x8 slot so I can have a second HDMI connection. Will the weaker card hold back my stronger one? The GPUs:
(x16) EVGA RTX 2060 XC Gaming 12GB
(x8) BTO GT 1030 2GB
Side note: I have no idea about anything regarding that BTO card. I've never even heard of the company and can't find my exact card, but it seems to work fine.
-
Hi all, this might be a dumb question or one whose answer is right in front of me. I want to use my old video card to drive a smaller display that would sit on the end table next to my couch, with my newer, nicer card driving my games on the big TV across the room. I would have strategy guides/forums/etc. open on that smaller display so I could look things up for the game. So: the TV on my newer card, the smaller display on my older card. I don't like the idea of extending the desktop between these two, although to be fair I've never done dual GPU/dual monitor, so I don't know if that works quite the same way. Ideally I'd just have a hotkey on my mouse that switches my mouse and keyboard to the other display. Do they make a switch-type peripheral for this? That would feel most seamless to me. Of course, the other idea is just plugging my Chromebook into the smaller display, but I'd like to use the one keyboard and mouse. Running both displays off the newer card would, I think, put too much load on it: it's a big 4K TV plus a smaller 22-inch high-ish refresh rate monitor. Is extending displays in a dual GPU/dual monitor situation seamless and low workload?
-
Budget (including currency): 6000 (EUR)
Country: Romania
Games, programs or workloads that it will be used for: 3ds Max (V-Ray)
Other details (existing parts lists, whether any peripherals are needed, what you're upgrading from, when you're going to buy, what resolution and refresh rate you want to play at, etc): My company tasked me with building a workstation for 3D modelling and rendering of assets using the integrated V-Ray renderer inside 3ds Max. This build will have 2 GPUs, since the current single-GPU build is not fast enough for our needs. Even though I know the NVIDIA RTX 4000 series is coming soon, I unfortunately can't wait for it: the company needs this workstation now so that deadlines can be met. This workstation will use the GPUs for rendering, hence CPU rendering performance is not critical; the main concern when choosing a CPU is enabling the best connectivity and maximum PCIe speed and bandwidth. Since this is going to be very expensive, I need it to be as futureproof as possible, which means I aim for a DDR5- and PCIe 5.0-compatible build. For this I was hoping I could get away with an Intel i9-12900K, since workstation CPUs are vastly more expensive than regular consumer processors. Here are some of my questions:
1) Does the CPU come into play in V-Ray GPU rendering?
2) Can 3ds Max or V-Ray properly take advantage of PCIe 5.0, or PCIe 4.0 for that matter? Does it improve render speed when using multiple GPUs? Or is PCIe 3.0 already fast enough for V-Ray, with newer PCIe standards only improving disk read/write speeds?
3) Considering that the i9-12900KS has only 20 PCIe lanes, which at best work in a 1x16+4 or 2x8+4 PCIe 5.0 configuration, would it be better to use an i9-10980XE with 48 PCIe lanes that supports 2x16 but only on PCIe 3.0, or a Ryzen 5950X, which has 20 PCIe lanes (1x16+4 or 2x8+4) on PCIe 4.0?
Apart from 2x16 PCIe 5.0, which is only available on Threadripper, Epyc or Xeon, which configuration would provide maximum performance for GPU rendering in V-Ray? I know this would be more suitable for the official Chaos V-Ray forum, but a product license is required to post there, and right now I do not have access to one.
-
**I NEED ADVICE ON WHAT PARTS TO GET. I MAY OR MAY NOT GET IT PREBUILT (PROBABLY NOT)!** I have a $5,000 budget to get a pre-built but customized PC, so I would like specific recommendations: motherboard, 32GB RAM, SSD, cooling unit, power supply, monitor, and an SLI setup if it makes a difference. I'm trying to build the best gaming computer I can with this budget, and I want to run 144Hz 1440p or 4K 120Hz (I think), with the ability to run HTC Vive VR as well. This would be extremely helpful, thank you! PS: I am not building it myself because this is my second PC; the first one I built had a freezing problem I could not figure out for the life of me since the first boot-up (I used it for 4 years and got angry). I also want to configure it at cyberpowerpc.com, unless anyone knows of a better site!
- 9 replies
- Tagged with: dual gpus, gtx 1080 ti (and 4 more)
-
Can the HP 8200 Elite SFF motherboard use 2 GPUs simultaneously? I am using the PCIe x16 slot and would like to put another GPU in the PCIe x1 slot. Is that possible or not? This isn't for gaming, just more monitors. P.S. I am already using all the ports of the GPU in the x16 slot, I am not buying a new GPU with more outputs, and I would prefer not to use USB-to-display adapters. I have already read the manual and I am confused. Please help, S.O.S.
- 5 replies
- Tagged with: motherboard, dual gpus (and 1 more)
-
Hello everyone. I'm designing an extreme enthusiast gaming PC and need some questions answered. My computer will be a fusion of a gaming rig and a workstation, meaning it will be blisteringly fast while packing multiple hardware parts like a workstation (heavy gaming rig/light workstation hybrid). Here are my questions:
1. What is the best CPU for running dual GPUs at x8 or x16 speeds in SLI plus a dedicated sound card? I'm not picky between Intel and AMD. Hardware example: dual GTX 1080 Ti in SLI plus a Creative Labs Sound Blaster ZxR audio card. The sound card will be used for both gaming and music playback through a big sound system (the motherboard audio chipset just won't do).
2. I want a stand-alone SSD as a boot drive for the fastest OS startup. What should I use: an M.2 2280 drive, a standard SATA SSD, or a PCIe NVMe add-in card?
-
OK, so the usual start-off with the build specs:
- Rosewill Cullinan case with 4 OE blue 120mm fans
- Gigabyte Aorus GA-AX370-Gaming K5 motherboard, BIOS F50a
- AMD Ryzen 7 1700X, all-core 3.825GHz, liquid cooled with a Fractal Design Celsius S24 AIO
- G.Skill TridentZ DDR4-3200, 4 x 8GB sticks (2 RGB, 2 yellow/black), 32GB total running at 2800 (I need 32GB to run a 256GB cache drive on an M.2 SSD with PrimoCache, 7.4GB of overhead memory, to accelerate a 4TB HDD)
- ROG Strix RX 580 8GB, OC'd to 1435 core / 2100 memory, primary adapter (monitor)
- XFX GTR RX 580 8GB, OC'd to 1435 core / 2100 memory, secondary (Oculus Rift CV1)
- No-name 1000W PSU, I don't remember the brand
This system started out as a budget build and evolved to this over a couple of years. I got the XFX card from eBay for $150 shipped in November 2018, and recently got the Strix from a friend for $110, so all in all they were cheap enough. The Strix will eventually get liquid cooled (NZXT G12 bracket, $25 shipped from eBay, and Fractal Design Celsius S36, $46 shipped from eBay); I have all the parts now and will get to that soon. I got an Oculus Rift CV1 off Facebook Marketplace during the holidays for myself and the kids to enjoy. Overall, the performance of the Strix card alone was adequate and enjoyable for most games we have so far, and with settings tweaks I could play even a demanding title like Zero Caliber at decent framerates. Since I already have both cards and a PSU big enough to run them, I felt it was worth exploring some dual-card setup ideas just to see what I could get working, for the heck of it. The Oculus Rift software crashes if you run CrossFire with the VR headset on the main card and CrossFire enabled: the headset can't wake up, and some kind of conflict with the motion tracking causes the cameras to not work correctly. I'm on the current optional Radeon software, so that's up to date, and I've seen this same issue in forums going back a while, so I know it's a long-standing issue.
So I figured, why not split the display loads across the two cards without using CrossFire, since the monitor and the headset are essentially two different displays? Removing the monitor's load from the GPU handling the VR headset should free up a lot of resources, letting that GPU focus on the Oculus Rift and maybe stretch my system's performance a bit further overall. Well, it works for the most part. In Zero Caliber, instead of only reaching about 1/4 up the resolution slider while still passing the internal benchmark as acceptable, I can now go up to 3/4 and still pass. Beat Saber looked good even on one card, but frames could drop a little in more active songs and sections; with both cards it is very smooth and clean-looking, even after enabling a few extra high settings. So far the only game giving me trouble is Arizona Sunshine: it looks amazing and runs even smoother, but it randomly crashes, with quick artifacting on screen, maybe 15 minutes or more into gameplay. I'm not sure if my OC settings are too aggressive; the power limit was left at stock to keep thermals in check, and MSI Afterburner uses a much more aggressive fan curve (100% fan at 60C), so the cards pretty much don't go above 60C. Upon the crash to desktop, the Radeon software does not respond and a reboot is required to restore normal operation. I'm not sure if using two different graphics cards in my arrangement is something new to the Radeon software (LiquidVR?) or officially supported by the VR software, since it's not the traditional CrossFire setup, where the second card does not output and is only used as an additional compute unit whose renderings the primary card outputs. I do have Above 4G Decoding enabled in the BIOS, and also SVM for AMD's virtualization tech, as I use it for NOX from time to time.
Sure, the ideal plan is to eventually get something more powerful, but for now this is what I have to work with. I would like to hear anyone else's experiences with tweaking this kind of setup, and any ideas on it.
-
I have a decent gaming rig with a 1660 Ti in it, as well as a GTX 1050 Ti (which I use mostly for folding). I've been using RTX Voice on my PC for a bit, but I didn't like that it used GPU resources while I was playing games, so I figured, "hey, why not make RTX Voice use the 1050 Ti instead of the 1660?" After some messing around, I made a batch file that lets you launch any program on your secondary, weaker GPU and save your precious resources. I also set up an example use case for it with RTX Voice, also on the GitHub page. All you have to do is choose your primary GPU (the one you DON'T want the program to run on) in the initial setup of my small, open-source batch script, and then you can manually launch anything on your secondary GPU. The script also lets you set up shortcuts or make simple batch files (like the example included on the GitHub page) to launch anything on your secondary GPU. Get it here: https://github.com/ITCMD/RunAsGpu. It's all open source except for the Microsoft executable devcon.exe.
Two small things to note:
1. Launch any programs this way before you start a game, or it will crash your game. This is because the script temporarily disables your primary graphics card so that whatever you launch will use the secondary one, then re-enables the primary.
2. The script uses an executable, devcon.exe, which is an official Microsoft tool, to disable and enable your primary card. Because it's not my creation I couldn't post its source code, but it is the official one. If you have Visual Studio you may already have the file, and the script will copy it from your install. If you don't, you can extract devcon.exe from some official Microsoft archives (the Superuser link on how to do this is on the GitHub page) and put it in the same folder as the batch file, or use the one I already extracted, which is posted on the GitHub page. The program is still a bit rough around the edges, but it works, and I'll keep working on it. It should support AMD too, although not for RTX Voice, of course.
Please post any issues on GitHub! Thanks, all!
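The disable-launch-re-enable flow described above can be sketched in a few lines. This is a hedged illustration of the technique, not the actual RunAsGpu code: the hardware ID below is a placeholder (devcon can list real ones), devcon.exe must be on PATH, and the shell must be elevated.

```python
# Illustrative sketch of the RunAsGpu technique (not its actual code).
# The hardware ID is hypothetical; on a real system you would look yours
# up first (devcon can enumerate display devices).
import subprocess
import time

PRIMARY_GPU_ID = r"PCI\VEN_10DE&DEV_2184"  # placeholder ID for the primary card

def launch_on_secondary(cmd, delay=5.0,
                        run=subprocess.run, popen=subprocess.Popen):
    """Disable the primary GPU, start `cmd` so it can only see the
    secondary GPU, then re-enable the primary. `run` and `popen` are
    injectable so the control flow can be tested without real hardware."""
    run(["devcon", "disable", PRIMARY_GPU_ID], check=True)
    try:
        proc = popen(cmd)    # app binds to the only visible (secondary) GPU
        time.sleep(delay)    # give it time to initialize on that device
    finally:
        # always restore the primary card, even if the launch failed
        run(["devcon", "enable", PRIMARY_GPU_ID], check=True)
    return proc
```

The `try/finally` mirrors the important property of the batch file: the primary card gets re-enabled no matter what, so a failed launch never leaves the system stuck on the weaker GPU.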
-
I was wondering if they are still worth trying to get a hold of. Also, how hard would it be to get a matching pair for "quad" sli? Or would it be two way sli because there's two cards? I'm not actually sure how the computer would recognize them.
-
Hey everyone, I'm considering moving to an SLI setup (not Crossfire at all) for 4K but I'm not sure what GPU would be good for SLI. I'm currently running a 1060 3gb. Thx everyone.
-
Basically, do dual GPUs really have 2x the VRAM of their single card counterparts?
- 9 replies
- Tagged with: radeon duo pro, dual gpus (and 2 more)
-
Hello, this is my first time posting to the forums, so hi everybody. Now, on to the questions and clarification. A few months back I decided to buy a GTX 980. A few months afterwards, I read online about how some people run two cards in their system: one for gaming, the other for PhysX and/or GPU rendering. So this is where I have a few questions. Learning that Sony Vegas 12 and 13 only support the Kepler GPUs, I decided to switch to Adobe Premiere Pro... my wallet is killing me at the moment. Anyway, I went into the program directory, opened cuda_supported_cards.txt and added "GeForce GTX 760". I changed my video rendering and playback setting to the GPU acceleration option. I then rendered the video and, compared to software-only rendering, found that the times were the same. So, a few questions:
- Can I render with my GTX 760 in Adobe Premiere Pro?
- Is it possible to update my GTX 760 driver? Whenever I tried to, my GTX 980 got in the way.
- My GTX 980 is my default card; how do I check/confirm the GTX 760 is working?
Thanks for all future answers, it's greatly appreciated!!
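For the older Premiere versions that still used the cuda_supported_cards.txt whitelist (later releases dropped the file entirely), the edit described above amounts to appending the card's exact name on its own line. A small sketch of that edit, with the path and card string as assumptions you supply:

```python
# Hedged sketch: add a card to Premiere's legacy cuda_supported_cards.txt
# whitelist if it is not already listed. The file's location varies by
# install and Premiere version, so the caller passes the path in.
from pathlib import Path

def enable_cuda_card(txt_path, card_name="GeForce GTX 760"):
    """Append `card_name` to the whitelist file unless it is already
    present. Returns the resulting list of card names."""
    p = Path(txt_path)
    cards = p.read_text().splitlines() if p.exists() else []
    if card_name not in cards:
        cards.append(card_name)
        p.write_text("\n".join(cards) + "\n")
    return cards
```

The name must match what the driver reports for the card; a misspelled entry is silently ignored, which looks exactly like the "render times unchanged" symptom described in the post.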