Everything posted by Niels_at_home

  1. Just to update this: tested on a 3080, once adapted from HDMI to DP, the monitor sadly did not show up as GSYNC compatible anymore.
  2. I ordered this StarTech Active HDMI to DP adapter simply to try it out: https://www.startech.com/en-nl/display-video-adapters/128-hdmi-displayport
     I'm currently using a 2070 Super. It does allow 1440p 144hz, and I can also enable Nvidia Surround for my triple screen gaming setup. However, the HDMI to DP adapted monitor doesn't show up as Gsync compatible in the Nvidia Control Panel, so my entire surround gaming setup will not have functional Gsync. Now I have read that on newer HDMI 2.1 cards (30 and 40 series Nvidia) adaptive sync might be supported over HDMI... But will that work with the 2016 'original' Gsync module equipped monitors I have? Surprisingly hard to find this info. Also surprisingly easy to spot non-working Gsync, geez that is a stuttery mess!
  3. Not quite system advice, not quite monitors, not quite graphics cards.. so I'll post it here. I'm considering going crazy, it is long overdue, who needs sanity? My setup would have:
     - 3x 144hz 1440p DP monitors (Dell S2716DG) for a surround gaming setup
     - 1x 49" ultrawide 'work' monitor (maybe the Odyssey OLED)
     - 1x DisplayPort VR headset (Pimax 8KX)
     Ideally I don't go plugging things in and out every time. I will never use them all at once, so if there is still a 4 simultaneous device limit on GPUs, that is fine. There are 4000 series Nvidia cards with 5 outputs, but I've only seen ones with 3x DP and 2x HDMI. The 49" ultrawide seems to support HDMI 2.1, so I can use HDMI for that at high refresh rates (I would think!). The 3x DP monitors are picky about their connection: using GSYNC and Nvidia Surround, you kinda want the make, model, and connection to be the same on all of them, and HDMI 1.4 probably doesn't even do 144hz 1440p (rough bandwidth check below). The VR headset is probably very picky and wants the purest DP signal available. So that means 4x DP devices and 3x DP ports... My best bet seems to be to try and convert HDMI 2.1 to DP 1.4 for one of the 3 GSYNC monitors, but comments on these adapters aren't always positive for high refresh rate monitors, and I would need GSYNC to remain functional. So after all this, actually quite unnecessary information that looks surprisingly similar to a dude flexing... my question is: has anyone successfully adapted HDMI to DP and kept at least 1440p 144hz GSYNC functionality? If so, what adapter / cable did you use? Thanks, Niels
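     A rough back-of-the-envelope check on that HDMI 1.4 claim. The blanking totals are my own approximate (CVT-RB style) assumptions, and the link figures assume the usual effective video bandwidth after 8b/10b encoding, so treat it as a sketch rather than exact monitor timings:

     ```python
     # Rough bandwidth check for 2560x1440 @ 144 Hz, 8-bit RGB.
     # Blanking totals below are assumptions (roughly CVT-RB style), not real monitor timings.
     h_total = 2560 + 160            # assumed horizontal blanking
     v_total = 1440 + 85             # assumed vertical blanking
     refresh_hz = 144
     bits_per_pixel = 24

     pixel_clock_hz = h_total * v_total * refresh_hz
     needed_gbps = pixel_clock_hz * bits_per_pixel / 1e9

     # Approximate effective video bandwidth after 8b/10b encoding:
     links_gbps = {"HDMI 1.4": 8.16, "DP 1.2 (HBR2)": 17.28}

     print(f"needed: {needed_gbps:.1f} Gbit/s")      # ~14.3 Gbit/s
     for name, capacity in links_gbps.items():
         verdict = "fits" if needed_gbps <= capacity else "not enough"
         print(f"{name}: {capacity} Gbit/s -> {verdict}")
     ```

     So roughly 14 Gbit/s is needed, which is comfortably within DP 1.2 but well beyond HDMI 1.4.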
  4. Firstly, it is great to see a potential SpeedFan replacement! Is there a known list of issues, i.e. something to give me an idea whether I am screwing things up or there might actually be a bug? My ASRock B550 Phantom Gaming ITX doesn't seem to allow CPU fan control. I've made sure I know which fan it is and made a curve, and the fan control correctly says what % fan speed it should be, but the fan speed isn't actually following the curve. Again, the correct fan is selected. SpeedFan does allow control of this fan. I have no other fan control or monitoring software running..
  5. I think I agree with the screws; an easy 'upgrade' down the line. I have the odd combination of interests of silent computing and compact cases, so you get a weird compromise that isn't really super small. My solution was to get a 49" ultrawide, and next to that, the case looks tiny.
  6. Heyho guys, I made a neat (if I may say so) small ITX case for use in my office. Bear in mind I designed the metal parts, I didn't make them! Professional people with professional laser cutters and bending machines did that for me. Here is some info, a video, and a few pics in case you'd like to have a look! Video:
     Parts (bought and in use already for about half a year):
     - ASRock B450 Fatal1ty (or however you spell that..)
     - Ryzen 5 3600
     - Noctua NH-C14S
     - 32GB Corsair DDR4 3200
     - MSI Ventus GTX 1660 Super
     - Corsair SF600
     - 1TB Samsung NVMe
     Case specs:
     - 11.9 liters
     - 32x22x17cm
     - up to 135mm CPU coolers
     - up to 210mm GPU length, dual slot
     - 140mm bottom 'GPU fan replacement'
     - space for two 2.5" drives
     Material and design:
     - stainless steel 1mm thick, complex (for me anyway!) sheet metal work
     - relatively accessible without the front/top cover, 'easy' to work on (for an ITX case, that is)
     - compact but enough space to fit silent components
     How does it behave?
     - I run the CPU stock, and with my normal workloads the fan never ramps beyond 350RPM.
     - My job seems to be staring at CAD software and spreadsheets half the time while getting paid for it..
     - I've lowered the TDP to 56% on the graphics card as its heatsink is pretty small. This way it stays passive (<65C) during all my normal workloads and only spins up after about 5 minutes of gaming. Then the RPM is around 600. The performance loss is small compared to the increased efficiency, so that is worth it for me!
     Any regrets?
     - I'd find a way to not recess the power button, which I did so it was all installed and accessible without the front/top panel.
     - With this construction, 1mm steel is 'good' but 1.5mm would be multiple times stiffer (2x thicker = 8x stiffer if I'm not mistaken, see the numbers below). That would be my next choice. Or be more clever with the bending of edges so you increase the effective stiffness without going thicker. It's very stiff fully assembled, but when 'open' it flexes a little bit.
     - I didn't look at the CPU fan direction. It blows out of the side of the case. I think CPU temps would be lower if I let it draw air IN from the side, as now it might get some warmer air that has been through the GPU. But with fan speeds never exceeding 350RPM with my workloads, it's going to stay like this..
     - Cage nuts are fine for one-offs like this; I know there are weld nuts or pressed nuts that might work better and mount everything flush. Cage nuts give you some wiggle room though, which is neat.
     Overall very happy, it's a pretty professional looking case that I'm sure will be on my office desk for years to come!
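     For what it's worth, the cube-law guess checks out. Bending stiffness of a flat sheet goes with the cube of its thickness (a textbook plate bending result, ignoring whatever extra stiffness the folded edges add):

     ```latex
     D = \frac{E\,t^{3}}{12\,(1-\nu^{2})} \;\propto\; t^{3},
     \qquad
     \left(\tfrac{1.5}{1.0}\right)^{3} \approx 3.4,
     \qquad
     \left(\tfrac{2.0}{1.0}\right)^{3} = 8
     ```

     So, assuming the same material and panel geometry, 1.5mm sheet should be roughly 3.4x as stiff as 1mm, and 2mm the full 8x.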
  7. FWIW, I went with the EVGA RTX 2070 Super FTW3 as it seems to have the right features plus even a fan connector that I can use. Geez these 'mid range' cards are expensive compared to when I got my mid range GTX970 5 years ago!
  8. @dizmo I guess with the 3000 series coming at some point? Yeah, probably.. But I have my summer holiday starting next week and I'd like a GPU boost. Plus, with my i5 4690K at 4.7GHz all-core I'm not really at the cutting edge of the CPU landscape anymore either..
  9. Thanks @TofuHaroto, and that's not some odd non-accelerated graphics output but actually powered by the GPU?
  10. Thanks, I know, 4 at a time. I'm only using 3 at a time, which works fine on my Gigabyte card when I have 5 connected, as long as no more than 4 are active. The main thing is whether the USB-C connector can be used with a USB-C to HDMI adapter to drive a screen..
  11. Hey all, time for an upgrade, most probably to a 2070 of some sort, probably a Super. My GTX970 is tired of life... I use 5 screens, but only have 3 active at a time. Using the 'Win+P' shortcut I can switch between my triple screen surround gaming setup and my dual screen 'work' setup. My main question is whether this is possible with cards that have 3x DP, 1x HDMI, 1x USB-C, which seems to be a common combination of outputs. It seems only Gigabyte clearly documents which outputs you can populate so that you can have 4 screens active at a time. Or does that mean that with all other cards, you can have 4 outputs active at a time no matter where you plug them in? A lot of cards have a USB-C output. There would be more cards to choose from if I can hook up one of my 'work' monitors (1920x1200 60hz) via a USB-C to HDMI adapter. But does that just work as a regular display output? Some cards I found:
     - EVGA RTX 2070 Super FTW3 - probably good if USB-C can be used - non-stacked fans, seems to have a thick and wide heatsink - 300mm long card (and heatsink)
     - Asus ROG Strix GeForce RTX 2070 Super 8GB - only 2x DP, problematic when I upgrade to 3x DP 144hz gaming screens for my surround setup - thick and wide heatsink - card length 300mm (heatsink too)
     - Asus Dual GeForce RTX 2070 Advanced edition - 3x DP - thick heatsink but shorter (268mm)
     - Gigabyte Aorus GeForce RTX 2070 Super 8G - stacked fans, i.e. probably a less thick heatsink - plenty of outputs (but of course only 4 at a time)
     What would you do?
  12. Yes, I don't doubt Steve's data is correct, but 35 dBA is probably just still 'loud' for a small group of users.
  13. To 99% of people this won't apply. I am really intolerant of system idle noise. I'm also happy to admit this is borderline delusional and obsessive. I just don't want to be able to hear anything at all until I'm rendering or gaming. My system (video link below the post) now has soft mounted 120 and 140mm Noctua fans currently running at speeds below 360RPM. No mechanical hard drives and no coil whine... until I'm gaming.
     When Gamers Nexus did their comprehensive new cooler tests, I was surprised to see that even noise normalized, some AIOs came out on top. My gut feeling told me the pump, which always has to run, would have to be more audible than my slow fans. So I got an NZXT X62 and hooked it up just to hear the pump. And indeed, it is completely and totally unacceptable for my, admittedly, delusionally low noise requirements. There is a slight 'sluurrp' noise from the water; perhaps that goes away if it is just some air trapped inside. But the pump itself, even at the lowest speed the CAM software allows, is just completely audible. Worse, it has a little bit of vibration that is extremely amplified when hard mounting the block/pump to a system.
     This is exactly what I expected, and again, for normal people who tolerate slightly audible things, this is not really an issue. It is not loud, it is just not completely silent. So, if you are really insane about ultra low, effectively inaudible computing, well below the 35dB normalized benchmarks by Steve from Gamers Nexus, then you get to a point where the pump just becomes the weak link in watercooling systems, it seems. I had a new case design planned for a 280mm AIO but I'm not going to go that route now.
     Disclaimers: 1) Of course, N=1, but NZXT is a good company and you wouldn't expect them to have shipped me a terrible pump. I believe it is working just fine, just not up to the standard I hold silent computing to. 2) In a custom loop you could probably soft mount the pump if it is not integrated into your CPU block, which should help considerably, but even when I soft mount the NZXT, there is still a very slight whirring that would be ever so slightly audible.
  14. To those interested.. I 'fixed it' by using Edge instead of Chrome or Firefox. Somehow GPU usage is a bit lower and it just runs silky smooth with zero dropped frames there. Now it is weird: somehow Edge does different hardware detection or something. On my desktop GTX 970 it doesn't even give me a 4K option on videos, probably because the hardware decoding is missing from this older card, yet in Firefox / Chrome it is available.. So something seems to be handled differently in Edge. And to my surprise, you can even get an ad blocker for Edge?! So MS has the go-to browser now?! I need to sit down..
  15. Hey Mark, did you also have the desktop resolution at 4K? I see a performance difference between watching 4K on a 1080p desktop and on a 4K desktop.
  16. I've had DP cable issues in the past with 144hz 1440p monitors, but that never resulted in dropped frames. It would only allow 60hz or 1080p as it sensed a lack of bandwidth. I use the supplied DP cable with the LG 43" monitor and it allows 4k60 all day. Dropped frames because of the cable would only seemingly be possible if it lost sync or connection briefly which I really don't think is the case. If that happens you'd see a black screen briefly or spikes in cpu / gpu use I would say..
  17. Hi there! I'm having some unexpected issues with my new HTPC build.
     System specs:
     - Ryzen 3200G, stock clocks, seems to boost fine to near 4GHz depending on load
     - ASRock B450 Fatal1ty Gaming-ITX/ac
     - 2x8GB Corsair Vengeance LPX 3200 running at 3200
     - Noctua NH-L12S cooler, temps are low
     - Kingston A2000 M.2 250GB
     - Corsair SF450 Platinum PSU
     - Win10 64, up to date
     - AMD drivers, the latest ones, the ones from the ASRock site, and also whatever Windows installed by itself
     - 43" LG UD79 monitor / TV
     - running 4k60hz over DisplayPort
     Issues: I get lots of (10+%) dropped frames on Youtube with 4k60 content, and even a few lost frames with 1080p60 content, when I run the desktop at the actual 4K resolution. When I change the desktop resolution to 1080p, all 4K and 1080p content plays smooth. I realize the 4K content is only rendering at 1080p in this case. I get this with Firefox and Chrome. My internet connection is fine (100mbit fibre) and it seems to have plenty of buffer.
     With 4k60 content, Task Manager shows peaks of 85% GPU utilization when hardware acceleration is enabled in the browser, never above this, and CPU utilization is low. Without hardware acceleration, CPU load is pretty much 100% and the experience is even worse. With 1080p60 content, the GPU load is much lower, perhaps 60%, and I still get a few dropped frames and clear stutters as a result. Task Manager doesn't seem to report many differences between running the desktop res at 4K or 1080p, yet at 4K it isn't smooth and at 1080p it is..
     Did I expect too much from the 3200G? Anything I should check? It seems it just can't draw the screen fast enough at 4K to keep up, or something odd. I am late to the 4K party but my plan was to enter it more smoothly..
  18. Hi! I've got a strange possible issue. My PC runs fine, and has done so for 4 years, mainly running Windows 7. I tried Windows 10 late 2015 and it started to run disk checking on startup for some reason. Running sfc /scannow in a command prompt found issues that it couldn't repair. I had more issues with Win 10 so I went back to Windows 7. A couple of weeks ago I installed Windows 10 again, on a fresh Samsung SSD. All seemed fine, until yesterday! It started disk checking on boot again! sfc /scannow found issues, but this time it could repair them. It seems fine now. With Windows 7 I never had any issues and that has been running for over 4 years. My total Win10 install time is 4 weeks and I had multiple issues. That almost points to it being a Windows 10 thing. Like I said, the PC runs fine, Win7 ran fine for years, and Win10 has done this startup disk checking now on two different SATA SSDs and ports. It almost sounds like Win10 doesn't shut down properly or something, occasionally corrupting files. It all seems fine though, I see no misbehaving.. But I am worried this will occur again and I'll end up having to reinstall Windows every few months because it corrupts itself. Very odd! Any ideas, anyone?
  19. Hi! Apologies if this is clearly available somewhere, I just couldn't find it! I am happy with my ModMic, the sound is pretty good. But I often just want to wear the microphone when I'm recording some voice overs for example. Having a hot and sweaty pair of cans on my face is annoying in these cases! Has anyone found something or made something for this? Thanks! Niels
  20. Looks are mega subjective of course. I always like minimalistic things that do the job efficiently and quietly. And of course I didn't design and make a case that wouldn't be pretty in my eyes! That would be a bad way to spend 500 euro. :-) Small and quiet were the objectives, and I like how it turned out, but of course half the fun is making it, and a sense of pride makes my views not so objective.. :-)
  21. Well, any overclocking guide you read points to one thing: luck! To some extent you need to be lucky with the chip. More speed needs more voltage, and more voltage means much more heat. The more luck you have with your chip, the more speed it will do with relatively little extra voltage. I'm at about 1.3 V to get a video encoding stress test to be stable.. You may get the same, better, or worse results.. I can't tell!
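     The 'more voltage means much more heat' part follows from the usual CMOS dynamic power rule of thumb. This is a textbook approximation with an assumed 1.2 V baseline, not something measured on this particular chip:

     ```latex
     P_{\text{dyn}} \approx C\,V^{2}\,f
     \quad\Rightarrow\quad
     \frac{P(1.3\,\text{V})}{P(1.2\,\text{V})} \approx \left(\frac{1.3}{1.2}\right)^{2} \approx 1.17
     \quad \text{at the same clock}
     ```

     So even before the higher clock itself is counted, the voltage bump alone is roughly 17% more power for the cooler to deal with.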
  22. Thanks! It's been going strong for a few months now and I even use its power now and then playing some Far Cry. I don't really see it as it's underneath my desk, but hey, I know it's there being all pretty..
  23. Hey Cano, thanks! It's hard to measure without really proper equipment, and then you'd have to measure a few baseline 'normal systems'... I'm super picky about noise, and when gaming it does become audible, and under high CPU load as well. But for a stable 4.7GHz you can't expect it to be inaudible. During normal time wasting usage (Youtube, surfing.. or even a bit of Office or CAD), when I come back upstairs after dinner, I have to look to see if the PC is on, because it's not audible. Here is a size comparison I made between a Corsair Carbide 300R and a Coolermaster N200. I did some numbers, and if I were to make a batch I might be able to sell it for about €300 / $350 ex shipping / VAT as a build-it-yourself kit. I think the lack of snazzy colours and RGB LEDs means I didn't get the mega response here (or elsewhere) that I hoped for though, so it's a bit unlikely I'll make a batch.. but at that price (which does include a profit for me as well) it's not too bad, I'd think.
  24. Good point, I didn't know that even there you get inaccurate specs sometimes! Swndlr, when using 3 screens for gaming rigs, the bezel thickness is annoying; thinner is better. But the bezel thickness specs given by manufacturers are almost always 'optimistic'.. The spreadsheet quickly tells you how thick the bezels actually are, not what the marketing department tries to sell you. When using 1 screen it really doesn't matter that much, of course.
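     For anyone wondering how such a spreadsheet gets the 'actual' bezel, here is a minimal sketch of the idea. This is my own guess at the method, with a made-up example monitor: subtract the active area width, derived from the panel diagonal and aspect ratio, from the manufacturer's quoted overall width.

     ```python
     import math

     def bezel_per_side_mm(diagonal_inch, overall_width_mm, aspect=(16, 9)):
         """Estimate the side bezel from the overall monitor width and the panel diagonal.

         Assumes the quoted diagonal is the active area and the panel sits centred in the frame.
         """
         w, h = aspect
         active_width_mm = diagonal_inch * 25.4 * w / math.hypot(w, h)
         return (overall_width_mm - active_width_mm) / 2

     # Hypothetical 27" 16:9 monitor that is 611 mm wide including the frame:
     print(f"{bezel_per_side_mm(27, 611):.1f} mm of bezel per side")   # ~6.6 mm
     ```

     The same subtraction works for height if you also know the overall height and whether there is a chin below the panel.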