
lululombard

Member
  • Posts: 22
  • Joined
  • Last visited

Awards

This user doesn't have any awards

Recent Profile Visitors

670 profile views
  1. Alright, I think I'll try with the RX 580 I use on my Mac and check if it works. I didn't even think about that before, haha.
  2. I have a 3-4K USD budget for the upgrade, so yes, no problem. Do you know if Nvidia Broadcast works even if I also have an AMD card as the primary GPU? Thanks!
  3. Hello everyone! Currently I have a 9900K/2080 Ti system, and I'm planning to upgrade to a 5950X/6900 XT system for three reasons:
     - the 6900 XT seems to outperform the 3090 in most games I play
     - I really like overclocking, and Big Navi seems to be golden for overclocking
     - I play at 4K 120 Hz (LG OLED B9) using a DP 1.4 to HDMI 2.1 adapter that has some issues, and native HDMI 2.1 would probably be much better
     But I have two concerns:
     - I sometimes use programs that require CUDA acceleration (Topaz Labs products, TensorFlow, PyTorch)
     - I really like Nvidia Broadcast/RTX Voice, but it requires an Nvidia GPU
     So I was wondering: if I have both cards in the system at the same time (with a 1300+ W PSU), would I be able to play on the AMD card and use my "old" 2080 Ti for Nvidia Broadcast and CUDA tasks? (A short CUDA device-selection sketch follows this list.) I've done some research, but I found nothing on Navi/Big Navi paired with Turing/Ampere, so if someone has some insight that would be very helpful. Thanks!
     PS: What I could also do is wait for a 3090, sell my 2080 Ti and pay the difference between the 6900 XT and the 3090, but I'm not sure it would perform better, especially when both are overclocked.
  4. I use it only for professional work. I'm a DevOps engineer, and macOS is the only way to get a proper UNIX-based graphical OS that doesn't break because you look at it wrong. There's also the fact that I can run Adobe software while having my terminal and the wonderful Homebrew package manager. For me, macOS is a beast at productivity but very bad as soon as you want games. The only things it lacks are support for Nvidia GPUs and something like Valve's Proton.
  5. Thanks, but selling my MSI laptop will barely pay for the RTX 2080 Ti and nothing else, so the CPU/memory would still be over budget.
     Thanks a lot! Especially since I have an i7-7700HQ and not an i7-7700K (it's a laptop!), I guess it would be much better. As for thermals, my MSI laptop currently runs at 90-95 °C in games, so I don't think there would be much difference, and I have a cooling pad as well. I think I'll buy an eGPU enclosure on Amazon and try it with a friend's RTX 2080 Ti; if the performance is OK, I'll sell my MSI laptop and buy the RTX, otherwise I'll just return the enclosure to Amazon. One last thing: I also have an Oculus Rift, that should work with an eGPU, right?
  6. Hello! First post on the forum, but here's my story. I currently have:
     - an MSI GS63VR laptop with an i7-7700HQ and a GTX 1070 Max-Q
     - a 2018 15" MacBook Pro with an i7-8750H
     - a 1080p 144 Hz screen
     - a 4K 60 Hz screen
     The issue is that since I got the 4K display I want to play some games on it, and the GTX 1070 Max-Q struggles in most titles. Since my MacBook Pro has a better CPU than my gaming laptop, I thought about using it with an eGPU to replace my current laptop. As I would sell the MSI laptop to pay for the GPU and the enclosure, it would not cost me much: an RTX 2080 Ti + enclosure would be around 1400€, and my laptop can probably be sold for 1200-1400€. But the question is: what performance can I expect from an RTX 2080 Ti over Thunderbolt 3 (which runs PCIe 3.0, but on only 4 lanes; see the link-width check after this list) on Windows on a 2018 MacBook Pro? Would it be better than my MSI laptop with its GTX 1070 Max-Q? Thanks!
  7. Yes. Windows will free RAM if needed; if it isn't needed, it keeps everything in memory so previously accessed data can be read back quickly (see the memory-reading sketch after this list).
  8. What's the motherboard, the CPU and the RAM? I'm interested to see if there's a transfer-rate drop with something like a very low-end Celeron.
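On the dual-GPU question in post 3: CUDA-based tools such as PyTorch only enumerate Nvidia GPUs, so the 2080 Ti should stay visible to them even if a Radeon card drives the display. Below is a minimal sketch, assuming PyTorch with CUDA support is installed on the Windows side; it only checks CUDA visibility and says nothing about whether Nvidia Broadcast itself accepts a secondary Nvidia GPU.

    import torch

    # The Radeon card is not a CUDA device, so only the Nvidia GPU(s) show up here.
    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            print(f"cuda:{i} -> {torch.cuda.get_device_name(i)}")
        device = torch.device("cuda:0")  # the 2080 Ti in this hypothetical setup
    else:
        device = torch.device("cpu")     # fall back if no Nvidia driver/GPU is found

    # Quick sanity check that kernels actually run on the selected device.
    x = torch.randn(1024, 1024, device=device)
    print((x @ x).sum().item())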
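On the Thunderbolt 3 eGPU question in post 6: one way to confirm that the card really negotiates PCIe 3.0 x4 is to query NVML from Windows. A rough sketch, assuming the nvidia-ml-py (pynvml) package and the Nvidia driver are installed; it only reports the link, it doesn't benchmark how much performance the narrower link costs.

    import pynvml

    pynvml.nvmlInit()
    try:
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):  # older pynvml versions return bytes
                name = name.decode()
            gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
            cur = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
            mx = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)
            # Over Thunderbolt 3 you would expect something like Gen3 x4 here.
            print(f"{name}: PCIe Gen{gen}, x{cur} (card supports up to x{mx})")
    finally:
        pynvml.nvmlShutdown()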
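On the RAM question in post 7: the "used" figure most task managers show includes cache that Windows hands back on demand, and "available" is the number that actually matters. A small sketch, assuming the psutil package is installed, that prints both:

    import psutil

    vm = psutil.virtual_memory()
    # "available" already accounts for cache/standby memory that can be freed instantly,
    # so a high "used" figure on its own is usually not a problem.
    print(f"total:     {vm.total / 2**30:5.1f} GiB")
    print(f"available: {vm.available / 2**30:5.1f} GiB")
    print(f"used:      {vm.used / 2**30:5.1f} GiB ({vm.percent}%)")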