About harshitaneja


  1. harshitaneja

    Recommendation for multi-GPU setup

    CUDA is sadly required, as TensorFlow and certain other libraries only support it. That is unfortunate because, as a Linux user, I really don't want Nvidia and their abysmal driver support. I have edited the question to include the CUDA requirement.
  2. harshitaneja

    Recommendation for multi-GPU setup

    Maybe I should have mentioned it in the answer: the 2950X is for the extra cores, as some of my work involves statistical libraries that are currently CPU-only. The extra PCIe lanes are the cherry on top.
  3. harshitaneja

    Recommendation for multi-GPU setup

    Thanks for the response. The Titan V is an older card with similar, if not worse, performance than the 2080 Ti in certain FP16 workloads, and even in FP32 the difference isn't large enough to warrant paying around 2.5x for it. I think you are confusing it with the Titan RTX. The Titan RTX does have 24GB of memory compared to the 11GB of a single 2080 Ti, but the total memory of the three planned GPUs, 11+8+8 GB, should be enough to compensate (I know comparing a single card's memory with the sum across multiple cards is not a straightforward comparison) without much of an increase in price. Given that my work is mostly parallelizable, I can run it across all three cards at the same time, which should give higher performance; if not, it still offers the benefit of running different models on each of the three GPUs. I need the 16 cores of the 2950X (well, I wanted the 32-core 2990WX but couldn't justify the cost) because I run a lot of virtual machines, and some of my statistical work uses libraries that don't yet have GPU support, so I need the cores for that. So PCIe lanes are not an issue. Thanks again for your input.
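To make the "run different models on each of the three GPUs" idea concrete, here is a minimal sketch (plain Python, hypothetical model sizes) of greedily placing independent models on whichever card has the most free VRAM; the 11/8/8 GB figures are the 2080 Ti / 2080 sizes from the post, and `assign_models` is just an illustrative helper, not from any library:

```python
# Greedy placement of independent models onto GPUs by free VRAM.
# GPU memory sizes (GB) match the planned 2080 Ti + 2x 2080 build;
# the per-model memory needs below are made-up examples.
def assign_models(gpu_mem_gb, model_mem_gb):
    free = list(gpu_mem_gb)          # remaining VRAM per GPU
    placement = {}                   # model index -> GPU index
    # Place the biggest models first so they land on the roomiest card.
    for m in sorted(range(len(model_mem_gb)),
                    key=lambda i: -model_mem_gb[i]):
        g = max(range(len(free)), key=lambda j: free[j])
        if model_mem_gb[m] > free[g]:
            raise ValueError(f"model {m} does not fit on any GPU")
        free[g] -= model_mem_gb[m]
        placement[m] = g
    return placement

gpus = [11, 8, 8]                    # 2080 Ti, 2080, 2080
models = [9, 5, 4, 3]                # hypothetical per-model VRAM needs (GB)
print(assign_models(gpus, models))   # model index -> GPU index
```

Each model could then be pinned to its assigned card (e.g. via `CUDA_VISIBLE_DEVICES`) and run as a separate process.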
  4. harshitaneja

    Recommendation for multi-GPU setup

    Building a workstation with a Threadripper 2950X (need the cores) and three GPUs: 1x 2080 Ti and 2x 2080. Mostly for data science work. NVLink is not needed, as work will be parallelized across multiple GPUs. I need help with GPU recommendations (CUDA support required) and a motherboard. I have to live in the same room as this machine, which will run 24x7, so it needs to be really quiet. Water or air cooling? I live in India, where it is tough to find people to help with water cooling (at least that I know of), so I would be doing it myself. Is the iChill Black 2080 Ti a good option? Any suggestions, recommendations, or pointers to places I could ask or read? Thanks a lot for the help.
  5. Alright. Q3 is too late, so I will get it now. I have already delayed a lot. Thanks a lot again.
  6. Huh. I thought with all the Ryzen 3 rumours it would be around the corner.
  7. Ya, thanks. What do you think: should I wait for Threadripper 3 or just go ahead?
  8. I use TensorFlow a lot, and most of the other libraries I leverage use CUDA. Not having access to CUDA would be a real handicap for my work, and ROCm support for TensorFlow is shaky at best. Otherwise I would have preferred Radeon, especially because I run Linux and Nvidia driver support is not really great for consumer GPUs on Linux, something AMD is quite good at these days.
  9. Android provides a good UI toolkit, whereas creating a good UI on Linux is not fun and Electron is really heavy. So I am thinking of getting an Android tablet, flashing AOSP, and building the system on it. Any recommendations for an Android tablet with Android 7+ AOSP builds and some community support? The Nexus line back in the day would have been perfect for this; I don't know of any such tablet these days. Alternatively, I could get a Raspberry Pi-like SBC (something more powerful) and hook up a touchscreen and a USB modem. Does anyone know of good 720p+ touch displays for this use case? Or should I give up, just buy an Android Auto head unit, and drive on with my life? Thanks a lot for the help.
  10. Alright. 32GB modules are hard to get, though; I will try to find them. Datasets generally lie in the 50-80GB region, but not so rarely I find myself working with sets of 200-500GB+. I have worked with these even on my old laptop with 8GB of RAM, but it gets really tedious, so having more RAM than necessary is the luxury I want. I can afford at most three 2080s (I guess I could go further, but only if it really makes sense), but would that be better than the other configurations, namely 2x 2080 Ti, or 1x 2080 Ti with 2x 2070, or 1x 2080 Ti with 3x 2060? Thanks a lot for your help.
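For what it's worth, the aggregate VRAM of the configurations being weighed can be tallied quickly; the per-card sizes below are the standard Turing specs, and the sum says nothing about bandwidth or how well memory actually pools across cards:

```python
# Total VRAM (GB) per planned GPU configuration.
# Per-card sizes are the standard specs: 2080 Ti: 11, 2080: 8, 2070: 8, 2060: 6.
VRAM = {"2080Ti": 11, "2080": 8, "2070": 8, "2060": 6}

configs = {
    "3x 2080":             ["2080"] * 3,
    "2x 2080Ti":           ["2080Ti"] * 2,
    "1x 2080Ti + 2x 2070": ["2080Ti"] + ["2070"] * 2,
    "1x 2080Ti + 3x 2060": ["2080Ti"] + ["2060"] * 3,
}

for name, cards in configs.items():
    total = sum(VRAM[c] for c in cards)
    print(f"{name}: {total} GB total across {len(cards)} GPUs")
```

So the 2x 2080 Ti option actually has the least total VRAM of the four, while the cheaper 2060-heavy build has the most (at the cost of per-card speed).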
  11. Yes, 8x16GB is what I am looking at. The memory bandwidth issue is a concern, but since almost all of the code I would run is my own, I should be able to avoid it to some extent. Not being able to keep the machine outdoors is saddening; I really need to worry about keeping the noise under control then. Thanks a lot.
  12. I have read this article many times. Although it discusses multi-GPU builds, I was unsure of the specifics, hence my question.
  13. Datasets tend to be large enough that they saturate 64GB of RAM occasionally. Currently I have to engineer ways to load data asynchronously, which is fine for production models, but while experimenting it takes substantial time and effort that I would rather spend on analysing data. 64GB should work, but my major worry regarding a later upgrade was the impact of the number of populated memory channels on RAM efficiency on Threadripper; I assumed I need to use all 8 slots for maximum efficiency. Budget is around $7-8k, but I am in India, so parts tend to be a bit more costly here.
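The asynchronous loading mentioned above can be sketched with nothing more than a background thread and a bounded queue; `load_chunk` is a stand-in for whatever actually reads a slice of the dataset off disk:

```python
import queue
import threading

def prefetch(chunk_ids, load_chunk, depth=2):
    """Yield loaded chunks while a background thread loads ahead.

    `depth` bounds how many loaded-but-unprocessed chunks sit in RAM
    at once, which matters when a single chunk is tens of GB.
    """
    q = queue.Queue(maxsize=depth)
    _done = object()                      # sentinel marking end of stream

    def worker():
        for cid in chunk_ids:
            q.put(load_chunk(cid))        # blocks while the queue is full
        q.put(_done)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = q.get()
        if item is _done:
            break
        yield item

# Example with a trivial stand-in loader:
chunks = list(prefetch(range(5), lambda i: i * i))
print(chunks)
```

The main loop processes one chunk while the worker is already reading the next, so disk I/O overlaps with computation instead of serialising with it.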
  14. I need to build a PC for my work and experimentation. Cloud is great for production, or when I know what I am doing, but a lot of the time I don't, and experimenting gets really costly in the cloud. I also wish to learn distributed computing and warehousing. This is my first time building a PC (at least a non-trivial build). The build I have in mind:
    - Threadripper 2950X (want more cores to spin up lots of VMs to experiment; a lot of my CPU-intensive work is parallelizable (is that a word?))
    - 128GB RAM (do higher clock rates help? Is 3000MHz fine?)
    - Motherboard (X399, but which model? Which company do you suggest?)
    - 2TB Samsung 970 Evo
    - 6TB HDD (WD Black, I think. Does anyone know of a cheap SSD around this storage size?)
    - GPUs (this is where I am really confused; I want high (relatively) compute power but also want to experiment with distributing computation across multiple GPUs):
      a) 1x 2080 Ti with 3x 2060 (I would really prefer a 4-GPU build, but I am not sure how feasible this would be)
      b) 2x 2080 Ti (my only concern here is that two GPUs might be too few)
      c) 1x 2080 Ti with 2x 2080 (this pushes my budget a little; I can go for it if it really helps)
      d) 1x 2080 Ti with 2x 2070
    What kind of cooling would you recommend? The machine would be kept at my parents' place, as I am quite nomadic, and I would like to keep the noise to a minimum. Would it be okay if kept on the balcony, given that a weather shield could be arranged? I am really out of my depth here; I am more of a maths/stats kind of person. Any help would be really appreciated.