
Daharen

Member
  • Posts

    160
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Male
  • Location
    Seattle, WA
  • Interests
    Geek, PC Builder/Enthusiast, Law, Psychology, D&D, Gamer, Technophile, Audiophile
  • Occupation
    Law Student

System

  • CPU
8700K @ 5.2 GHz, AVX offset 1, normal CPU core voltage with -0.095 V offset, avg. core 1.255 V stable
  • Motherboard
    Z390 Gigabyte AORUS Master V1.0
  • RAM
G.Skill Ripjaws 16x2 GB @ 3700 MHz 19-19-19-39
  • GPU
EVGA 2080 Ti FTW3 Ultra Hydro Copper @ 2130 MHz core & 8000 MHz memory
  • Case
Phanteks Enthoo Primo
  • Storage
Intel 905p 1 TB PCIe NVMe SSD
  • PSU
EVGA SuperNOVA Titanium 1600 W
  • Display(s)
LG 4K 55" OLED & CUK 1440p 27" @ 144 Hz
  • Cooling
Custom water loop, 1 x 480x60 mm & 2 x 240x40 mm rads, 9 x Noctua NF-A12x25 fans
  • Keyboard
    Logitech G613 Pro
  • Mouse
    Logitech G Pro Wireless Gaming
  • Sound
    Sonos Soundbar, Subwoofer, 2 x Play:3
  • Operating System
    Windows 10 Professional

Recent Profile Visitors

The recent visitors block is disabled and is not being shown to other users.

  1. You're going to need to get in touch with a builder or machining shop and have something custom done, frankly, or do it yourself if you have the equipment. Granted, it's something you could jury-rig with a plastic case, a friend who can purchase the refrigerant (they'd likely need to be an HVAC technician), the right tools, and a bit of plumbing know-how, but if you want it actually designed to hold computer components, mount a motherboard, and not have to drill and seal the holes yourself, it's going to be a custom job. The product you're showing is for server and industrial applications, not consumer, and not even prosumer yet. No one sells it to random people who want it, and the pricing is aimed at corporations that can afford an extra zero or two at the end of the price tag. You might be able to get it secondhand from a third-party merchant that specializes in server part resale, which is how most consumers with cash to throw around get their hands on server-grade tech for custom builds, but it's simply not designed for your usage, and you can expect zero support, since these corporations usually operate on contracts to have specialized techs do the maintenance for these sorts of things, and they simply won't work with a consumer (there is no customer call line except to arrange for a person to come out and manually troubleshoot the problem, for what amounts to a multi-thousand-dollar company expense). If it's in the server space, you can wait a few years for it to "possibly" come to the consumer space... But not likely. 
Puget Systems did an immersion cooling setup with mineral oil that they sold to consumers for a short time, but that has been discontinued, and quite frankly, phase-change cooling designs have regulatory prohibitions in place for consumers because of the dangers of leaking refrigerant into the environment (it's considered an extremely damaging environmental contaminant that can lead to fines in the tens of thousands of dollars for corporations, and requires extensive hazmat cleanup if you fuck up). Legally, if you did manage to get the coolant for this, you would probably be violating the law wherever you live, along with whoever provided it to you. The only exceptions tend to be validated products where liability goes back to the manufacturer (like refrigerators). So yeah, TL;DR... You can do a custom job yourself if you want to drill the holes and have an HVAC contact who will help you on the down-low... You can do a custom job through a machining shop or service if you can CAD it, but you still need the HVAC contact to fill it... You can spend obscene amounts of money to get it done directly, as seen in the video, through a professional service provider, paying what a corporation would pay for installation and maintenance, putting you in the minimum five-digit price range (but probably six digits, frankly)... Or you can just not... Good luck.
  2. So, I have just received the N-Core Direct Die water block I ordered back in 2018 for my then-8700K rig... I still want to use the bloody thing, but of course I didn't expect it to arrive four years after I ordered it (a year, maybe; I knew it was a Kickstarter, but it was getting support from LTT, Gamers Nexus, and der8auer, so I figured it was legit). I was one of the first supporters, so I know for a fact not many people have received them, and they just started arriving... The thing includes instructions for installation on CPUs up to the 10900K, with spacers and such included, and I'm wondering if the materials in the package were just printed before they validated 11900K installation spacing. If so, it might work with an 11900K if I wanted to make use of this thing for a modern refresh of my build. That said, I wanted to see first if anyone who has delidded and direct-die cooled both a 10900K and an 11900K knows whether they are significantly different in terms of their dimensions, or if I should just go with the 10900K (which I understand isn't significantly different from the 11900K when overclocking anyway).
  3. So, I'm looking for pitches and ideas on materials, processes, machining techniques, and things to be wary of when considering an extreme modification of a CPU or GPU to allow direct liquid-to-silicon cooling (no heat spreader or block, although I've considered trying to find a way to attach fins directly to the silicon, but I can't imagine a way to do so with an interface material that would both firmly attach them and provide adequate thermal conductivity to make it worth doing in the first place). Obviously, this is ridiculous, impractical, stupid, likely to destroy the component, and the type of thing no sane person would do to their computer. I'm not sane; I'm a ridiculous enthusiast who doesn't care if this has no "practical" benefit, I just want to see if it can be done. I have a bunch of friends who are part of the builder community and have various equipment in their homes: lathes, CNC machines, laser cutters, 3D printers, et cetera (none have them all in one place, bit of an expensive hobby, but I have access to most anything I would need). I'm just looking for advice on the mechanisms to make it work. The first trial is going to be on an old silicon-lottery 8700K I'm planning on replacing soon when I upgrade my rig. If it goes well, I might consider doing these extreme modifications on current gen and sharing the results. Another consideration was using actual refrigerant for evaporative cooling directly on the die, but one crazy adventure at a time for now...
  4. Lol, on the one hand, I "think" you are correct. On the other hand, I knew people who saw the trend of GPUs and CPUs getting more efficient and figured they'd be solid with an 800 W PSU for the foreseeable future, and then discovered that as competition rose, some GPUs and CPUs started cranking power limits hard again (with some "spiking" above 1,000 W on high-end rigs despite averaging 800-850 W under load). While this is exclusively the enthusiast space, that's the range I like building in... Sooo... Admittedly, 1600 W is WAY above anything I've seen other than multi-GPU rigs, and I could probably have settled at 1200 W and never hit my ceiling even with the most ludicrous current setups, say an OC'd water-cooled 3090 with a custom PCB and an OC'd 5950X (I probably wouldn't bother OCing an AMD CPU, but I'd have to at least "try" and see if I could get a consistent benefit out of it, or whether I'd just won the silicon lottery). However, with that sort of spec, and the type of overclocking I enjoy, I might still have been pushing it a bit close (again considering spikes, which I know most PSUs can handle above their declared max, but some will literally shut the system down if they see spikes above their rated power, even if they could theoretically handle them unsustained). So... I am wary of your statement being made with confidence. It's not that I think you're wrong... This uptick in power consumption could very well be short-term, and we could return to ever-decreasing power ratings for the foreseeable future until my PSU is totally stupid levels of overkill. However, ya never know, and in the enthusiast space, better safe than sorry seems the better solution... 
I was sincerely considering a 2,000 W power supply that was on the market when I bought my 1,600 W unit, but it would have required me to rewire my outlet and breaker, and it was only Platinum rated. I like the fact that I can run very, very close to the exact power draw of the system while still having a ton of overhead because of the PSU's efficiency (at the low ranges I know it runs closer to "Gold," and its Titanium rating accounts for its entire range, so I know I'm sacrificing some efficiency, but still). Nonetheless, I appreciate your reply, and it probably applies to 99% of consumers (and "PROBABLY" even me).
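To put numbers on the spike concern above, here's a minimal sketch of the kind of headroom math I'm describing. The wattage figures and the 25% transient multiplier are illustrative assumptions, not measurements of any real hardware:

```python
# Illustrative PSU headroom check. All wattage figures and the spike
# multiplier are assumptions for a hypothetical high-end rig.

def psu_headroom(psu_rated_w, component_draws_w, spike_multiplier=1.25):
    """Return (sustained draw, worst-case transient spike, headroom at spike)."""
    sustained = sum(component_draws_w)
    spike = sustained * spike_multiplier  # short excursions above the average
    return sustained, spike, psu_rated_w - spike

# Hypothetical OC'd GPU, OC'd CPU, and rest-of-system draws in watts:
draws = [450, 250, 150]
sustained, spike, headroom = psu_headroom(1600, draws)
print(sustained, spike, headroom)  # 850 1062.5 537.5
```

The point being: a rig that "averages" 850 W can transiently demand over 1,000 W, and whether that trips shutdown depends on the PSU's transient handling, not just its label.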
  5. So, as I make upgrades to my rig into the future, I've been "trying" to future-proof elements of my build. To that end I bought a 1600 W Titanium power supply, a ton of Noctua fans, a bunch of high-quality radiators, and recently upgraded to the Obsidian 1000D alongside dual Laing D5 PWM pumps (800-4700 RPM) for redundancy. Obviously the rest of the hardware is for the most part going to see upgrades, from better SSDs (looking at the Gen 4 Optane) to GPUs, CPUs, mobos, and upcoming memory upgrades, so none of that really matters. What I'm struggling with is trying to make a "modular," upgradeable system for my water cooling loop... I want to be able to clean it out fast, easily, and regularly, but also be able to quickly switch out components for rebuilds if I want to do so. I was thinking of trying a mixed soft/hard-line tubing setup: hardline for the majority, but with softline tubing and quick disconnects between all the cooled hardware, and two empty reservoirs, one high, one low, that I could use to empty the loop of old fluid, simultaneously exchange it with cleaning fluid, and then drain the whole thing again and replace it with new fluid with hardly any disassembly or pain. I figured I could re-invent the wheel, since I have a solid idea of what I'm looking to do... BUT... I also figured I should come on here and check whether anyone else has done this sort of setup and has some pitfalls for me to look out for, advice, or suggestions on the overall objective here.
  6. So I know this is a bit far-fetched, but as I sit here playing video games with a second monitor and chatting with a friend in another window, and repeatedly screw up my alt-tabs, and find myself typing in the wrong window or sending commands to the wrong application when I hit a macro and forget to alt-tab back, it occurs to me that using eye tracking to automatically control which application receives keyboard commands would be a lifesaver for me... I have no idea if anyone is working on a user application like this. I know that in VR specifically it's available for custom-programmed navigation of in-game menus, and for foveated rendering and the like, but this seems like a much more broadly applicable use case that could integrate with smart glasses if they ever made a revival. Just a thought. Hopefully someone's already thinking of this... If not, I really have no clue where you would even send suggestions like this so someone might give it some thought.
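For what it's worth, the core logic of such a tool wouldn't be complicated. Below is a minimal sketch of a dwell filter, the piece that decides when a gaze has rested on a window long enough to justify stealing keyboard focus. The tracker input and the actual OS focus call (e.g. `win32gui.SetForegroundWindow` via pywin32 on Windows) are deliberately left as hypothetical hooks, since they vary by SDK and platform:

```python
# Sketch of a gaze-driven focus switcher's dwell filter. Pure logic only;
# feeding it gaze samples and actually refocusing the window are left to
# hypothetical tracker/OS hooks not shown here.

DWELL_SAMPLES = 30  # e.g. ~0.25 s at a 120 Hz tracker: require a steady gaze

class DwellFilter:
    """Report a new focus target only after the gaze has rested on it."""

    def __init__(self, dwell=DWELL_SAMPLES):
        self.dwell = dwell
        self.candidate = None  # window currently being looked at
        self.count = 0         # consecutive samples on the candidate
        self.current = None    # window that already has focus

    def update(self, target):
        """Feed one gaze sample (the window under the gaze point).

        Returns the window to refocus, or None if nothing should change."""
        if target == self.current:          # still on the focused window
            self.candidate, self.count = None, 0
            return None
        if target == self.candidate:
            self.count += 1
        else:                               # gaze jumped to a new window
            self.candidate, self.count = target, 1
        if self.count >= self.dwell:        # dwell satisfied: switch focus
            self.current = target
            self.candidate, self.count = None, 0
            return target
        return None
```

The dwell threshold is the important design choice: without it, every glance at the chat window would yank focus away mid-keystroke.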
  7. As a side note, when it comes to loading games, you often deal with issues where the problem is more complicated than people realize. IOPS is not as intuitive as people think. I think most benchmarks use 8 KB files as the IOPS measurement, but it matters because latency becomes the bottleneck with more variable read and write data. If you talk to a programmer who actually has to deal with load-time optimization, they can go on for hours about the headaches this causes, and it is in fact the reason you see SSDs that are literally, in some cases, 100x faster than HDDs from years ago, while load times are maybe improved 20-30%, which is not even close to proper scaling. Most people think IOPS is what determines this, and while they're right (sequential only matters for large, organized, unprocessed data), IOPS vary with the data size. At really small request sizes, you become more affected by the latency of the drive than by the throughput it can handle in general. More importantly, at REALLY low depths, talking the movement of data sections in bits or bytes rather than kilobits or kilobytes, the IOPS numbers listed in marketing materials become meaningless: your processor and memory become the keys. In fact, you benefit more from REALLY tight memory timings than REALLY high memory speeds, and you need a processor with outstanding IPC and an extremely high single-thread frequency. Even then, you'll still never be able to know whether the SSD is being utilized properly or not, as it's seldom even possible to eliminate all the bottlenecks to measure the performance capacity when requests are that small and latency, not throughput, is the primary factor. This relates, because if you check, you'll see HDDs have a tendency to offload 'slightly' more data to RAM than SSDs. 
In ALMOST all cases this has practically zero effect, and the difference is nearly completely negligible; we're talking never more than a few extra hundred megabytes of RAM usage... But a couple hundred megabytes of RAM usage, when ALL of that data is nothing but small requests, can make a big difference, because the HDD isn't having to serve that data at all, and now what you're really doing is comparing the speed of a partial RAM drive (inadvertently) to an SSD. The RAM drive will always win that battle. It's possible, but very, very unlikely, that this is what is happening. Again, it would be poor programming if that were the case. Programmers should always push all data subject to frequent small requests onto RAM; that's what it's for. But when a program is poorly optimized, that doesn't happen, and programs can discriminate based on the type of drive, because in general it can be more efficient not to hog RAM when you can quickly pull the data off a faster SSD, but only so long as the data being pulled is in larger request sizes that don't drive the RAM and CPU overhead up.
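A toy model makes the scaling argument above concrete: a drive that is ~100x faster at small random I/O does not make loads ~100x faster once per-request latency and CPU-side work enter the picture. All the latency, bandwidth, and CPU-work numbers here are illustrative assumptions, not benchmarks of any real drive or game:

```python
# Toy model of latency-bound small-request loading. Numbers are assumed.

def io_time(n_requests, request_bytes, latency_s, bandwidth_bps):
    """Queue-depth-1 I/O time: every request pays full latency + transfer."""
    return n_requests * (latency_s + request_bytes / bandwidth_bps)

def load_time(cpu_work_s, io_s):
    """Total load = fixed CPU-side work (decompression, asset setup) + I/O."""
    return cpu_work_s + io_s

KB = 1024
# 5,000 random 8 KB reads during a load screen:
hdd_io = io_time(5_000, 8 * KB, 10e-3, 150e6)    # ~10 ms seek, 150 MB/s
ssd_io = io_time(5_000, 8 * KB, 0.1e-3, 3000e6)  # ~0.1 ms, 3 GB/s

# Raw I/O ratio is huge, but add 40 s of CPU-bound work and the
# overall load-time improvement collapses:
print(hdd_io / ssd_io)                                # ~98x faster I/O
print(load_time(40, hdd_io) / load_time(40, ssd_io))  # ~2.2x overall
```

The exact numbers don't matter; the shape does. The smaller the requests, the more total time is dominated by per-request latency and non-drive work, and the less the drive's headline throughput buys you.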
  8. While I can't give you a straight answer, and probably no one can, it's almost certainly software-side. The only way to know for sure would be to test another computer with a different SSD and all other variables removed. That said, particularly with online titles in the early days of SSDs, some developers realized it was possible to abuse loading speeds to get an unfair advantage over other players with an SSD, and they forced artificial load times, i.e. timers, instead of loading at the maximum possible speed. This would be done purely by identifying the type of drive, and so wouldn't affect HDDs. If this is the case, and the timers were set extremely aggressively, it might result in the SSDs loading 'slower.' If you suspect this is the case, simply check the usage on the SSD during the load screens. Set the polling to maximum, and record the data so you can go back over it in detail. You should see your SSD barely being used at all if it's just a timer artificially limiting your loading; if it's actually some sort of hardware-based problem, then you will see the SSD maxing out during these periods. In the latter case you're dealing with very poor programming and optimization, so still software, but of a less intentional and malicious sort, and a more incompetent and negligent one. Unfortunately, if I'm right, there's not much you can do about it. I suppose you could run HDDs in a very big RAID 0 array to get close to SSD speeds and still have the hardware register as an HDD, but that's assuming the developers weren't clever enough to check for RAID usage. No matter what, if they are trying to limit your ability to load too quickly, odds are they will succeed, and ironically you'll get worse load times the faster your drive is (or at least artificially set and determined load times that 'could' be worse). No real way for me to verify. 
You could write the developers and ask, though; they will usually concede these sorts of things 'vaguely' so long as you don't ask for details.
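Once you've logged the SSD's utilization during a load screen, the classification step is simple. Here's a sketch with assumed thresholds; the samples would come from something like Resource Monitor logs or a polling tool, and the 70% / 50% cutoffs are arbitrary starting points, not validated values:

```python
# Sketch: given utilization samples (%) polled during a load screen,
# guess whether the load is drive-bound or an artificial timer.
# Thresholds are assumptions, not validated values.

def classify_load_screen(util_samples, busy_threshold=70.0, busy_fraction=0.5):
    """'drive-bound' if the SSD was busy for most of the load, else 'timer-like'."""
    busy = sum(1 for u in util_samples if u >= busy_threshold)
    return "drive-bound" if busy / len(util_samples) >= busy_fraction else "timer-like"

print(classify_load_screen([95, 88, 91, 80, 76]))  # drive-bound
print(classify_load_screen([12, 3, 0, 5, 2, 96]))  # timer-like
```

A timer-gated load typically shows a short burst of real I/O followed by a long idle tail, which is exactly what the second sample pattern represents.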
  9. Just PC, no peripherals, and purely for gaming, you're looking at the equivalent of just under $1,300.00 USD. So, assuming you can get parts for US prices in Australia (I know that's tricky sometimes):
- 5700 XT for raw performance, or RTX 2060 Super if you really want to play with ray tracing ($390/$400)
- AMD R7 3700X (just keep the stock cooler; overclocking doesn't help Ryzen 2 enough to warrant better cooling) ($299)
- Gigabyte AORUS X570 Elite, ASUS X570 Plus, or ASRock X570 Steel mobo if you have a brand preference; I think the VRM is marginally better on the Gigabyte, but they are all outstanding and the difference is negligible ($200/$190/$180)
- 32 GB brand-name RAM @ 3200 MHz (preferably CAS 16); I recommend the G.SKILL Aegis 16x2 kit ($135-$140)
- Intel 660p 1 TB M.2 NVMe ($125)
- WD Blue 2 TB HDD ($55)
- EVGA 650 GQ 80+ Gold ($120)
- Fractal Design Define R6 ($150)
- Windows 10 Home OEM (from G2A or your preferred OEM seller) ($20)
Now, obviously the prices I listed here total closer to $1,500.00, but that's because this is the quality of parts that would be complementary, and if you shop around a little you can get these parts (which roughly have MSRPs at around what I quoted) for this price OR BETTER. You should be able to squeeze down to $1,200.00 keeping these parts. That said, I also assumed you didn't want to compromise quality across the build, so I picked parts of roughly equivalent quality. You can make a lot of cuts to make this system more affordable: you could get the Intel 660p 512 GB instead; totally drop the WD Blue drive if you're willing to uninstall and reinstall only the current game you're playing; you don't need an 80+ Gold power supply, any 80+ unit will do, and you can shave $60.00 there; and you can get a super crappy case for a fraction of the cost, which makes assembly a nightmare but will barely affect performance. 
You can drop your RAM down to 16 GB and be good for most games for about half the cost, and could use an X470 mobo on the lower end and 'probably' be fine. With those savings you could maybe bump up to an RTX 2070 Super; I wouldn't change the CPU, though, no point. And if you're thrifty, you could do that while keeping it still under $1,200 (or $2,000 AUD). I generally recommend keeping component quality across the entire build similar, but it's hardly necessary, and most of your performance comes from the GPU first and the CPU second; so long as the rest of the components can at least survive, and you have 'adequate' memory (although Ryzen does realistically need 3200 MHz to be at its best), you're good. Hard drives have practically no effect on actual gameplay quality, just load times and the overall 'speediness' of the entire computer. If you want using your computer to be a pleasure, stick with an SSD... But if you REALLY only care about squeezing as much raw gaming performance out of the computer as possible, even if it means having a sluggish computer in general and long load times while gaming, then you can stick with just an HDD and have more money to put into the rest of your system (and also not have to deal with worrying about running out of room on your hard drive and uninstalling and reinstalling frequently, as with the earlier suggested 512 GB Intel 660p compromise). I think in general you got few responses because you didn't specify what you want enough. For more and quicker responses, include: budget; maximizing for performance or a balanced build; must include an SSD or not; and what gaming peripherals you already have, or which ones you need to get as part of the budget. Obviously this entire suggestion is out the window if you have significant cost overhead for peripherals (monitor, keyboard, mouse, chair, desk, speakers/headphones, router, whatever). A useful tool for figuring it out yourself: https://www.logicalincrements.com/
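As a sanity check on the totals quoted above, taking the higher figure wherever a range was given, the MSRPs land at $1,509, which is where the "closer to $1,500" comes from:

```python
# Sum of the parts list above, using the upper price of each quoted range.

parts = {
    "GPU (5700 XT / RTX 2060 Super)": 400,
    "CPU (R7 3700X)": 299,
    "Motherboard (X570)": 200,
    "RAM (32 GB 3200 MHz)": 140,
    "SSD (Intel 660p 1 TB)": 125,
    "HDD (WD Blue 2 TB)": 55,
    "PSU (EVGA 650 GQ)": 120,
    "Case (Define R6)": 150,
    "Windows 10 Home OEM": 20,
}
total = sum(parts.values())
print(total)  # 1509
```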
  10. So, my gaming rig is in my signature. In addition to that, I have a maxed-out 2018 Dell XPS 13 I want to use to handle the stream, a Logitech C922 webcam, and an HTC Vive Pro Eye with the wireless adapter. The only two things I'm thinking I need to purchase are the Elgato collapsible green screen and an AVerMedia AVerCapture HD. I'm not going for insane FPS or crazy high resolution; I just want to be able to stream a high-quality, encoded, downscaled 1080p 30 fps mixed-reality broadcast of my gaming session with ZERO impact on my gaming rig. Although I seldom have serious drops, any effort at using OBS natively with games like Skyrim VR (heavily modded) results in frequent and obvious spikes in frame times that are incredibly disorienting in game, so I've resolved that mirroring is NOT for me, and I do need an external streaming PC with a capture card... Thing is, I absolutely don't understand how any of this stuff works. I'm an avid tinkerer, and enough of an enthusiast to figure things out, assuming I have all the right equipment. But right now I'm not in a financial position to start buying stuff only to find out I need to buy a dozen other things to make it work as I run into problems on the fly... Normally that would be fine, but not right now... So I'm coming to the community to see whether I will have all the equipment I need to make this work, or if there's something I'm missing. If you want to provide helpful information on how to actually do the setup, I'd love to listen, but so long as this is the gear I need, that's really all I need to know; I'm happy to flop around and struggle to get it working so long as I have what I need. As an afterthought, I am also using the Vive Wireless Adapter... Since so many of these capture cards are 'pass-through,' I'm not sure if this will work at all... But if it's still possible without any additional overhead on my gaming PC, then cool, that's all I need to know... 
If not, anyone who has any information on how to help with this would be of great help...
  11. Alright, I'll go do some research. It would mean selling my current drive and using M.2 drives, and I haven't yet switched to a new motherboard or processor, but given I'm looking at the new Threadripper line, cost is not really a 'huge' concern (there are server-grade components that are well out of my range of interest, and Quadros and the like, but nothing in the extreme enthusiast market is really off limits). I appreciate the response, and at first glance the numbers in your other post look good. It might be worth just using the integrated M.2 slots, as it appears a few of the motherboards have 4-5 M.2 slots, and Threadripper has more than enough lanes to accommodate them all. I just need to make sure it can handle putting them all in RAID without compromising their performance significantly. I haven't seen anyone really try it; I might need to look up some Intel HEDT RAID reviews from closer to their launch and see if anyone tried it when those platforms were still new.
  12. To clarify, I want to use an adapter so that I can run them in RAID while only using up a single PCIe slot on the motherboard and making full use of the x16 slot, rather than being forced to use multiple x16 slots in x4 mode with a single M.2 PCIe drive each. Hence a PCIe x16 adapter that fits four M.2 NVMe Optane drives, automatically puts them into RAID, and runs them over a single x16 slot in x16 mode.
  13. So, I am looking to maximize both my low-latency read/write times and my sequential speeds. While PCIe 4.0 drives are easily best in class when it comes to sequential write speeds, they still don't hold a candle to my Optane 905p in high-queue-depth and small-file write times. I am wondering: is there a PCIe adapter for M.2 slots that would work on PCIe 4.0 if I moved over to Ryzen with Intel's Optane drives, and that has a good enough onboard controller not to compromise the low latency of the drives, which is their key selling point? Essentially, can I get the best of both worlds (for an obviously ridiculous premium, as each of the four Optane M.2 drives is about $500.00 and the total capacity will only be just over 1 TB)?
  14. Okay, well, I really was hoping to solve this problem through pass-through, but while I found solutions, they appear to exist only in the industrial/prosumer space and involve sophisticated EDID emulators that can merge the EDID audio capabilities of one output with the EDID video capabilities of another, allowing pass-through without video loss. Since these solutions start in the $2,000.00 range, even though they are functionally just emulating things that should in theory be doable via software, they are less viable than what most people suggest: just getting a sound card and a receiver to allow for surround. I find this annoying, but that's my problem, and I concede the issue... Now my problem is finding an actual sound card and receiver that has any chance of successfully working with my configuration... I already know for a fact that one of the cables I need doesn't exist yet and won't likely exist until later this year or Q1 2020 (a DisplayPort 1.4 to HDMI 2.1 adapter). However, I want the rest of my system to be fully compatible for when that cable does become available. So here's what I'm looking for:
- The sound card must be able to output high-quality sound covering all the formats a Sonos sound system is compatible with: https://musicpartners.sonos.com/node/464, https://support.sonos.com/s/article/79?language=en_US.
- The receiver must be able to process all compatible sound options, have HDMI 2.1 input and output so it can eventually handle 4K/120 Hz video pass-through along with audio decoding, and must also be able to fully utilize the 5.1 surround functions of the Sonos sound system, not just 2.1 sound, as that completely defeats the purpose at this point for me: https://www.lg.com/us/tvs/lg-OLED55C9PUA-oled-4k-tv
Based on my research so far, this is not an easier set of requirements to fulfill than what I was looking for before anyway. It appears that no matter what avenue I go down, this entire setup is doomed to fail, but here's hoping I just don't know what the hell I'm talking about. Any help would be appreciated.