
Bartholomew

Member
  • Posts

    642
  • Joined

  • Last visited

Reputation Activity

  1. Funny
    Bartholomew reacted to seanondemand in Need to clean my monitor.   
    As long as you’re not using one of these, you’ll be fine. Bonus points for using just the tiniest amount of dish soap. 
     

  2. Agree
    Bartholomew reacted to Latvian Video in extra fan being goofy   
    My guess is that the fan is creating turbulence and the airflow gets worse or something.
  3. Like
    Bartholomew got a reaction from Godlygamer23 in Can thermal paste in cpu pins or Mobo makes PC act wierd even if it's not conductive?   
    Yes it can.
     
    Since it's non-conductive, it can prevent proper contact if it gets in between the CPU and the socket for one or more pins, for example.
    Too much paste "on top" of the IHS, or even splattered around the PCB, is OK. But not on sockets (CPU, RAM DIMMs) or connectors; there it can prevent contact altogether, or form a resistance or even some slight capacitance, etc. That can do bad/weird things to signals.
     
     
  4. Informative
    Bartholomew got a reaction from Fracteller in Can thermal paste in cpu pins or Mobo makes PC act wierd even if it's not conductive?   
    Yes it can.
     
    Since it's non-conductive, it can prevent proper contact if it gets in between the CPU and the socket for one or more pins, for example.
    Too much paste "on top" of the IHS, or even splattered around the PCB, is OK. But not on sockets (CPU, RAM DIMMs) or connectors; there it can prevent contact altogether, or form a resistance or even some slight capacitance, etc. That can do bad/weird things to signals.
     
     
  5. Informative
    Bartholomew got a reaction from veevee in how to remove graphics card?   
    I understand, and you're not alone; sometimes this happens and it's not always as easy as the above posters make it seem (even though it's rare, I've only had it happen twice in 30 years).
     
    Note that the clip rotates up when inserting, so rotating it down (to the right) is needed to push it open (just pushing straight down gives little force to the opening mechanism).
     
    You said you used something plastic; do try with your fingers, it gives much more feel.
    And I understand you don't want to break the clip, but a little firm force (rotating outward and then pushing down) can be needed (depending on the mechanism, it not only removes the lock part from the card but also lifts the card up on that side by 1 or 2 mm).
     
    Sometimes it can help to reduce pressure on the clip by getting the PCIe backside up a little (even just a nail between the case and the back cover can be enough to slightly tilt the card so that the bracket side is up the tiniest bit and the clip side down, reducing tension on the lock).
     
    And again, you are not alone; last time I had this it took me about an hour of sweating (an Asus Strix RTX was stuck in a Z170-K board, and I sure didn't want to break that GPU either lol).
  6. Funny
    Bartholomew reacted to freeagent in Guessing time: Would a "Be Quiet! Dark Rock Slim" be sufficent to cool the upcoming AMD Ryzen 5800X3D?   
    Not sure if a Dark Cock Pro will fit into that space.
  7. Agree
    Bartholomew got a reaction from Mattias Edeslatt in Hot air is coming out from exhaust while doing normal tasks   
    This sounds like it could be a malicious process that uses 100% of the GPU and attempts to go unnoticed by hiding/stopping itself when Task Manager is opened.
     
    I've been on Linux for a while now, so my knowledge of alternative ways (beyond Task Manager) of inspecting processes and their resource use on Windows is rusty; hopefully other kind souls can advise on that (and perhaps a good malware scanner).
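    If it helps, here's a rough sketch of one alternative (assuming the nvidia-ml-py / pynvml package is installed; not something I've verified on Windows myself) that asks the driver directly which processes are using the GPU, independent of Task Manager:
```python
# Rough sketch: list processes that are actually using the NVIDIA GPU.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

util = pynvml.nvmlDeviceGetUtilizationRates(handle)
print(f"GPU utilization: {util.gpu}%")

# Compute processes currently holding GPU memory
for proc in pynvml.nvmlDeviceGetComputeRunningProcesses(handle):
    name = pynvml.nvmlSystemGetProcessName(proc.pid)
    mem_mib = (proc.usedGpuMemory or 0) / 1024 ** 2
    print(f"pid {proc.pid}: {name} ({mem_mib:.0f} MiB)")

pynvml.nvmlShutdown()
```
    Running nvidia-smi from a command prompt gives similar information without any Python.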
  8. Agree
    Bartholomew got a reaction from IkeaGnome in Hot air is coming out from exhaust while doing normal tasks   
    This sounds like it could be a malicious process that uses 100% of the GPU and attempts to go unnoticed by hiding/stopping itself when Task Manager is opened.
     
    I've been on Linux for a while now, so my knowledge of alternative ways (beyond Task Manager) of inspecting processes and their resource use on Windows is rusty; hopefully other kind souls can advise on that (and perhaps a good malware scanner).
  9. Agree
    Bartholomew reacted to HanZie82 in No display, red LED. Help!   
    Make sure both ends of the power cables are connected properly.
    I had something similar when changing stuff out (different hardware though). The PSU-side cable came loose and the system would not boot. Put the cable back in and all was well.
    Hope you have the same.

    I personally do not know what that red light means. It does look like it has text next to it, read that!
  10. Informative
    Bartholomew got a reaction from HanZie82 in No display, red LED. Help!   
    The red light means no or insufficient power to the GPU, so doing as the above poster said should resolve the issue.
    Edit: crossed with the "already fixed" above.
     
    Congrats to OP 🙂
     
  11. Like
    Bartholomew got a reaction from igormp in Advise and Evaluation   
    Yup, performance tanks on Windows. It's mostly the drivers to blame; they are optimized to the max for *nix since that's what datacenters and researchers usually run. They essentially just "make them work on Windows", but that has the lowest priority and they stay unoptimized. When new cards come out, the drivers are usually buggy as hell in the first few versions on Windows, while on *nix they are mostly "right the first time". Once they are "done" for *nix they release, and basically go "ok, now we have time to check whether the Windows one didn't just compile but actually works too."
     
    Ouch lol, and I thought visual GANs were heavy stuff, pretty much "max workload" for our poor hardware lol.
     
    Yeah, the TF ecosystem is hard to beat for some stuff; I'm pretty plain/raw in what I need, and since it's pure local research not embedded into anything yet, I don't need to deploy to anything; I'm in "works for me" heaven lol. Bringing the gained knowledge to life in actual applications is up to others.
     
     
  12. Agree
    Bartholomew reacted to WereCat in Advise and Evaluation   
    If you're heavily invested into deep learning then you should probably ignore the gaming 3000 series and focus more on something like RTX A5000 or A6000 (don't confuse it with the old Quadro cards that are on Turing architecture). 
    These cards can still game if you want to, but are better suited for those kinds of workloads. 
  13. Like
    Bartholomew got a reaction from igormp in Advise and Evaluation   
    "I am not at liberty to say specifically"  (under NDA).
     
    Btw, interestingly enough, in some cases having multiple slightly less powerful cards can be beneficial; for example, two 3080s will outperform one 3090 by a lot. In periods of normal pricing that can be worthwhile (of course you'd lose the ability to run >12GB networks, like 1024-res Projected GAN (paper from last month) for example, which can gobble up to 19GB).
     
    A few TB of NVMe is nice, but mostly if you're working with larger sets, to manage them and their metadata (to keep sets together with the meta I use JSON files alongside the originals, containing the inference data from the various nets; so with a 250k source set and 3 inference runs (which are then used as input to train the next net in the chain) it accumulates to 1 million+ files).
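    Roughly what I mean by those sidecar files, as a simplified sketch (the paths and field names here are made up for illustration, not my actual schema):
```python
# Simplified sketch of the sidecar-JSON idea: one .json next to each source
# image, accumulating results from the different nets that ran inference on it.
# Field names and paths are illustrative only.
import json
from pathlib import Path

def append_inference(image_path, net_name, result):
    """Store a net's inference output next to the original image."""
    sidecar = Path(image_path).with_suffix(".json")
    meta = json.loads(sidecar.read_text()) if sidecar.exists() else {"inference": {}}
    meta["inference"][net_name] = result
    sidecar.write_text(json.dumps(meta, indent=2))

# e.g. append_inference("dataset/img_000123.png", "upscaler_v2", {"score": 0.97})
```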
     
    However, for the output sets generated from training runs (of which there are a lot, for comparison) and the training-cycle pickle saves every 20k images, SATA is more than enough. Whether NVMe is beneficial or not will depend on the case and workflows used.
     
    For training it won't matter at all, just spawn enough data-loading workers, so with 24 threads even an HDD could keep up (I think, but wouldn't try lol). Just make sure to go TLC or better and try to stick to at least 1TB, preferably 2TB, drives as their TBW ratings are usually a lot better.
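    To illustrate the load-workers point, a minimal PyTorch-style sketch (the dataset path and the numbers are just examples, not a recommendation for your exact setup):
```python
# Minimal sketch: enough parallel loader workers keep the GPU fed even from
# slower storage. Assumes PyTorch + torchvision; paths and numbers are examples.
import torch
from torchvision import datasets, transforms

dataset = datasets.ImageFolder(
    "data/train",
    transform=transforms.Compose([transforms.Resize(256), transforms.ToTensor()]),
)

loader = torch.utils.data.DataLoader(
    dataset,
    batch_size=64,
    shuffle=True,
    num_workers=12,           # e.g. half of 24 threads just for decode/augment
    pin_memory=True,          # faster host-to-GPU copies
    prefetch_factor=4,        # each worker keeps a few batches ready
    persistent_workers=True,  # don't respawn workers every epoch
)

for images, labels in loader:
    images = images.cuda(non_blocking=True)
    # ... training step goes here ...
    break
```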
     
    Most of all, it's an open door but still: for machine learning, go Linux and save yourself a ton of headaches by avoiding Windows (less ML-optimized drivers and it gobbles too much VRAM).
     
  14. Like
    Bartholomew reacted to igormp in Advise and Evaluation   
    Why not go for 2x32GB (64 total) instead of 32GB? Also, high-speed memory isn't really that important, especially since it seems you'll be using Python most of the time with TF or PyTorch; those aren't really that sensitive to RAM speed, and anything faster than 3200MHz won't net you any noticeable speedups.
    Other than that, your build LGTM.
     
    A 3090 would be cheaper and faster than an RTX A5000, especially since it has more unlocked SMs and faster VRAM.
    Although an A6000 has double the VRAM and more SMs, a 3090 can beat it by having its SMs clocked higher with faster VRAM, and for the price of a single A6000 you could buy 2x 3090s with some spare change.
     
    If you're talking about Quadro optimizations and whatnot, that doesn't apply to ML; it's usually only important for CAD stuff. GeForce and Quadro/Tesla GPUs perform the same here.
     
    Huh, I always thought GAN-like stuff would be on par with SOTA NLP stuff. I'm always hitting swap out of my 64GB when playing with transformers 🙃
     
    To complement that:

    https://www.pugetsystems.com/labs/hpc/Quad-RTX3090-GPU-Wattage-Limited-MaxQ-TensorFlow-Performance-1974/
     
     
  15. Informative
    Bartholomew got a reaction from igormp in Advise and Evaluation   
    Hi,
     
    Looks pretty good; I have a similar configuration, as you can see on my profile (3900X 12c/24t, 32GB, 980 Pro, 3090).
    And I'm in the same boat as far as training GANs and training vision CNNs goes.
     
    Cooling:
    I personally opted for air cooling, for two main reasons:
     
    1. Reliability
    2. During multi-day/week training things get hot (especially NVMe drives near and/or under the GPU), so the inside of the case can use all the "whoosh" I can get; the CPU location is a nice center point, rather than relying only on the fans at the outer edges of the case.
     
    Doesn't look as nice though, good air coolers are large blobs... but safety/reliability when running high-current stuff for days/weeks was paramount to me.
     
    Memory:
    I hesitated between 32 and 64GB of memory and opted for 32; this has worked out well, I'm regularly above 16 but never over 24-28 (that is mainly when running inference with trained networks over 250k+ image sets, using anywhere from 4 to 10 parallel processes).
     
    Cpu:
    12c/24t, same story, I found it to be a good sweet spot. More than enough threads to have input-stream workers and some additional processing while still being able to use the system concurrently for daily stuff while training. When parallel processing (either preprocessing of learning sets or running inference on large image sets) it allows enough processes to utilize all 24GB of VRAM on the 3090, with a few threads to spare so the machine stays snappy.
     
    Gpu:
    3090, you'll love the 24GB; it allows training high-res GAN architectures AND trying out the trained snapshots at the same time, no problem. When running trained networks, you can apply parallel processing on the set because most nets fit in VRAM multiple times over. I also looked at the ML cards like the A5000, as WereCat says, but the 3090 was significantly cheaper here when I bought; those are definitely good choices as well. Tip: limiting to 250W saves approx 30% in noise/heat but costs just a few % in compute performance. Don't wear out your fans/caps by blasting power like you're trying to get those 2 extra fps. When doing long training runs, hours accumulate a lot quicker on the card than with office or gaming use.
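    For reference, a tiny sketch of how that power limit can be applied (needs admin/root rights; 250 is just the value I use, check your card's supported range first):
```python
# Tiny sketch: cap the GPU's power limit to ~250 W via nvidia-smi.
# Needs admin/root rights; the setting does not survive a reboot.
import subprocess

subprocess.run(["nvidia-smi", "-i", "0", "-pl", "250"], check=True)
```
    Or just run "nvidia-smi -i 0 -pl 250" directly from an admin terminal.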
     
    Storage:
    This is where I fell short initially; I didn't consider the size of both my datasets so much as the processing speed of the 3090, and I tend to snapshot pickles and progress previews a lot. Running a few experiments a week can accumulate data quickly depending on the research you do. I ended up adding 2TB more (currently about 5.5TB of SSD storage, 3TB being NVMe; not sure if it's in my profile yet but I added a Crucial MX500).
     
    Case: just anything with easily accessible filters; 24h/day training makes them collect dust quickly.
     
    Hope this helps a bit 🙂
     
  16. Informative
    Bartholomew got a reaction from ahmad13610 in Advise and Evaluation   
    Hi,
     
    Looks pretty good; I have a similar configuration, as you can see on my profile (3900X 12c/24t, 32GB, 980 Pro, 3090).
    And I'm in the same boat as far as training GANs and training vision CNNs goes.
     
    Cooling:
    I personally opted for air cooling, for two main reasons:
     
    1. Reliability
    2. During multi-day/week training things get hot (especially NVMe drives near and/or under the GPU), so the inside of the case can use all the "whoosh" I can get; the CPU location is a nice center point, rather than relying only on the fans at the outer edges of the case.
     
    Doesn't look as nice though, good air coolers are large blobs... but safety/reliability when running high-current stuff for days/weeks was paramount to me.
     
    Memory:
    I hesitated between 32 and 64GB of memory and opted for 32; this has worked out well, I'm regularly above 16 but never over 24-28 (that is mainly when running inference with trained networks over 250k+ image sets, using anywhere from 4 to 10 parallel processes).
     
    Cpu:
    12c/24t, same story, I found it to be a good sweet spot. More than enough threads to have input-stream workers and some additional processing while still being able to use the system concurrently for daily stuff while training. When parallel processing (either preprocessing of learning sets or running inference on large image sets) it allows enough processes to utilize all 24GB of VRAM on the 3090, with a few threads to spare so the machine stays snappy.
     
    Gpu:
    3090, you'll love the 24GB; it allows training high-res GAN architectures AND trying out the trained snapshots at the same time, no problem. When running trained networks, you can apply parallel processing on the set because most nets fit in VRAM multiple times over. I also looked at the ML cards like the A5000, as WereCat says, but the 3090 was significantly cheaper here when I bought; those are definitely good choices as well. Tip: limiting to 250W saves approx 30% in noise/heat but costs just a few % in compute performance. Don't wear out your fans/caps by blasting power like you're trying to get those 2 extra fps. When doing long training runs, hours accumulate a lot quicker on the card than with office or gaming use.
     
    Storage:
    This is where I fell short initially; I didn't consider the size of both my datasets so much as the processing speed of the 3090, and I tend to snapshot pickles and progress previews a lot. Running a few experiments a week can accumulate data quickly depending on the research you do. I ended up adding 2TB more (currently about 5.5TB of SSD storage, 3TB being NVMe; not sure if it's in my profile yet but I added a Crucial MX500).
     
    Case: just anything with easily accessible filters; 24h/day training makes them collect dust quickly.
     
    Hope this helps a bit 🙂
     
  17. Like
    Bartholomew reacted to MardyMarvin in MSI Suprim 3080 - 3rd fan started to spin full speed   
    Hello Bartholomew,
     
     
    I just wanted to follow up and say your advice was spot on. I looked at the card and there were two headers for the fans. I then noticed that on one of the cables the blue wire had come loose from the connector. So I put that back into the connector and now the fans are operating as intended. This is great, as I don't have to send it back now, and all these years of Linus and others saying to get an iFixit kit paid off with the nice tweezers in there.
     
    So again, thanks for the advice, Bartholomew.
  18. Informative
    Bartholomew got a reaction from MardyMarvin in MSI Suprim 3080 - 3rd fan started to spin full speed   
    Usually 3-fan cards have the fans split into 2 groups: the first two over the VRM/GPU/RAM, and the second "group" aka the 3rd fan over the "extra" fin section of the heatsink, which has few hot components beneath it. (Although this will vary between cards.)
     
    I'm crossing my fingers and hoping you can resolve it with a nudge on a connector or just by wiggling the wires a bit in case one has a small break inside. One can always hope 🙂
     
  19. Informative
    Bartholomew got a reaction from JanKaare in Elp!!! Brand new comp, i12900k. 100C shutdown!   
    🙂
     
    (not trying to be a smartass, just hoping to move this forward so help can be provided as fast as possible; I'm not familiar with that socket type or bracket, so hoping you can provide tips)
  20. Like
    Bartholomew reacted to Pixelfie in Elp!!! Brand new comp, i12900k. 100C shutdown!   
    Read over that, thanks for letting me know 😉
     
    To OP, try remounting the cooler and make sure it's mounted tightly. 
  21. Agree
    Bartholomew reacted to MardyMarvin in MSI Suprim 3080 - 3rd fan started to spin full speed   
    Oh darn, the last thing I want to do is send it away. I will take it out and check that the cables are all seated on the card, as I don't think the fan cables are under any screwable bit.
     
    Thanks Bartholomew, I did not even think the fans would have anything like that in them, but then again how else would they read the speed, DOH. Every day is a school day for learning.
  22. Agree
    Bartholomew reacted to dilpickle in GPU Scratched, no output   
    Moral of the story for me: This is why I've never taken apart a video card. I'll live with a few degrees of heat to not risk losing the whole thing. Especially when my current card has a street value of $3000.
  23. Like
    Bartholomew got a reaction from Derigueur in Current best budget 1TB NVMe SSD? [CANADA]   
    I respectfully disagree. For SATA drives that's correct, but for NVMe it isn't.
     
    I regularly work with large datasets (250k separate files) for an AI project, and when moving those, or running scripts on them etc., the Kingston A2000 is noticeably slower than my Samsung Pro (granted that's PCIe 3 vs PCIe 4, I believe).
     
    Also, when working with, for example, DaVinci Resolve on larger 4K video projects, the difference is highly noticeable in responsiveness when jumping the cursor around (when working in full res for a good preview, without a low-res cache setup).
     
    That said, even though it's "very noticeable", it's like "fast/good" vs "super smooth/excellent", so it isn't a huge deal; but to say it's only noticeable by watching synthetic benchmark numbers doesn't seem right.
     
    As for "as os drive and/or gaming"; yeah zero, zip, nada noticable differnce there indeed for average joe; not worth the premium price.
     
    Without: That would be the crap tier he mentioned 🙂
    With: much better
     
    The price differences are usually small, so get one with DRAM if you can.
     
     
  24. Like
    Bartholomew reacted to SorryBella in LGBT community   
    I gotta be 100, I can't believe how supportive or indifferent people are to transgender people here. I genuinely feel like I'm treated like a girl, and I really love it. Thank you.
  25. Like
    Bartholomew got a reaction from IronStaple in Could I get a once-over?   
    If the Kingston A2000 was meant, that does have DRAM (1GB DDR3).
     
    SSDs optionally use DRAM as a read/write cache; not all SSDs have it. Those that don't perform worse under specific loads.
     
     
     