Everything posted by Kalm_Traveler

  1. You said 'good if you play benchmarks'. Not sure what you meant by that - the performance delta is enough to be obvious in real-world use.
  2. Better in that it boosts higher, which seems to result in generally higher gaming framerates. It's not just reference vs custom - the Titan has more of everything, so it's interesting that a 2080 Ti is able to clock a little higher and that the higher clocks trump the lack of everything else.
  3. I'm not any sort of reviewer - I bought mine from a computer shop in California in February. My guess is that they're having poor yields getting the full-fat die to hit the advertised speeds at the advertised voltages on all cores. Mine is doing just fine though: 4.9 GHz all-core, and I have it right on the edge of not overheating in any workload I throw at it. On the flip side, as others have mentioned, it is not really much different from a 9980XE, which you should be able to find. The main reason I bought this is that I already had an X299 rig and didn't want or need a new platform, but had the upgrade itch from my delidded 7960X.
  4. Hey guys, this has been happening to me for several hardware generations, but I finally decided to ask if anyone else has seen this and knows how to fix it. On both of my old X99 boards (X99 TUF and Rampage V Edition 10) as well as both X299 boards (Rampage VI Extreme and VI Extreme Omega), randomly on bootup the G.Skill RGB RAM will just continue rainbow puking unless I power off, then power on again.

     Also, along with that RAM unsyncing, occasionally on bootup everything will lose my settings and default to red. Once in Windows, Aura confirms that everything is back to default (red) and I can change it back to what I had before, but I'm not really seeing why either of these things happens. When it boots up red, it stays red on subsequent reboots until I change it, so it definitely seems like it's just randomly losing my settings. What do you guys think?

     This seems to only affect the HEDT boards - my current gaming rig is a 9900KS on the Maximus XI Extreme and it never has any color/sync issues.
  5. rampage vi extreme

     hmm?
  6. That doesn't mean that temperature reductions can't be had by delidding and replacing the solder with liquid metal, or just going direct-die cooling. "Worth it" is subjective - if a person values a 4c temp reduction, it would be 'worth it' to them. I've delidded a 9900k and a 9900ks for direct die, which gives quite a good temp drop, and I would like to delid my 10980xe too if not for the fear of shearing off the SMDs near the IHS contact points - the work of having to resolder them has made delidding that soldered CPU "not worth it" to me so far. Might change my mind in the future. *EDIT* The 9900k doesn't have any SMDs near the IHS contact points, so no risk there, though I think some have suggested you might crack the silicon die itself when breaking the solder with a delid tool. I've only delidded one 9900k and one 9900ks, so not a lot of experience delidding soldered CPUs, but neither appeared to be at much risk of damaging the silicon.
  7. Yeah, this was the first time I'd come across it. I tried 3 mice, all less than 2 years old, and they all did the same thing - they'd flicker on and off. All 3 work fine through the old Asus monitor, and none of them work through this brand-new LG. I asked around a few places and someone mentioned that some USB hubs (in general, not just the ones built into monitors) don't handle "polling" devices well - which I guess a mouse is.
  8. Should work in any USB-C or Thunderbolt 3 port. We use those types of multi-port adapters on Macs at work (Thunderbolt 3 Macs) all the time. It isn't a Thunderbolt dongle, just normal USB-C, so it should work in any USB-C port (laptop, desktop, phone, etc.).
  9. You may want to test things - the USB hub built into my older Asus PG348Q monitor works fine for keyboard and mouse, but the one built into this new LG monitor doesn't seem to work correctly with mice. The keyboard worked fine in either port, but the mouse would never connect, so I ended up just saying screw it and plugged them both directly into motherboard USB ports.
  10. I think you're confused because those types of PCI-E riser cards (mainly used for mining) use physical USB 3.x connectors and a USB cable to join their two pieces. Nothing on them is electrically USB at all - the connector and cable are just a convenient way to carry a 1x PCI-E signal between the small piece that goes on the motherboard and the larger piece that the graphics card plugs into.

     As for an external graphics card over USB 3.1: it is physically possible, but the results would be much worse than Thunderbolt 2 and Thunderbolt 3 external docks, both because of the limited bandwidth and because of how USB transfers work. Someone correct me if I'm recalling this wrong, but devices on a USB bus all have inherent latency because the host controller polls them in turn. A Thunderbolt-connected device essentially gets direct PCI-E lanes (at least to the TB controller), so it can initiate communication with the CPU at any time - kind of like how PS/2 keyboards/mice can interrupt the CPU to process their input at any time, but a USB keyboard/mouse can't.
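     To put rough numbers on the bandwidth half of that (these are published line rates with encoding overhead factored in - ballpark figures, not measured throughput):

```python
# Ballpark usable bandwidth of each link type (line rate x encoding efficiency).
# Real-world throughput is lower still due to protocol overhead.

GBIT_TO_GBYTE = 1 / 8  # gigabits -> gigabytes

links = {
    "USB 3.1 Gen 2 (10 Gbps, 128b/132b encoding)": 10 * (128 / 132),
    "Thunderbolt 3 PCIe tunnel (~22 Gbps usable for PCIe data)": 22,
    "PCIe 3.0 x16 (8 GT/s per lane, 128b/130b encoding)": 16 * 8 * (128 / 130),
}

for name, gbps in links.items():
    print(f"{name}: ~{gbps * GBIT_TO_GBYTE:.2f} GB/s")
```

     So even before the polling/latency issue, USB 3.1 would give a GPU less than half the bandwidth of a TB3 enclosure, and a small fraction of a real x16 slot.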
  11. Let us know if you find anything out about this. I have both versions of the Xbox One Elite controller and was initially sad that the original requires an Xbox One wireless adapter on PC (I never saw anything about getting a normal Bluetooth dongle to work with it). Granted, I don't use Bluetooth on the new version now since I just got used to using the original wired, but I'm still curious.
  12. Ahh, good idea and thank you - that should work for sure.

     Howdy - yes, indeed they do sorta track sleep via movement. I have been wearing a Samsung Galaxy Fit 2 smartwatch since they launched (last October, I think), which also tracks it via heart rate, and for just over two weeks I've used an oximeter (the thing they put on your finger to test how much oxygen is in your blood), which tracks oxygen, heart rate and movement. I'm definitely moving a lot, and I get random weird heart rate spikes from 40-50 suddenly up to 100-120, but I want a camera to see what the heck I'm doing during all this time.
  13. Kind of a weird question, I know - but I'm looking for a camera that I can set up to watch/listen to me sleeping, because I have been diagnosed with severe insomnia and sleep apnea but my health insurance says they won't pay for a sleep study because, in their words, "your records show that you do not have insomnia or daytime sleepiness, so this is not medically necessary". Anybody know of a good cheap camera with decent-enough IR night vision etc. that I could have streaming to my PC in another room (and software to do the recording)? I guess a baby monitor camera would work, but I have no experience with anything like this so I'm not sure what to look for exactly. Thanks in advance!
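     For the recording-software half: if whatever camera you end up with exposes an RTSP stream (most cheap Wi-Fi/IP cameras do), the capture side can be scripted. A minimal sketch using OpenCV - the stream URL, output filename, and frame rate below are hypothetical placeholders, so check the camera's docs for the real values:

```python
# Minimal sketch: record an IP camera's RTSP stream to disk with OpenCV.
# The URL, filename, and frame rate are placeholders, not real values.
import cv2

RTSP_URL = "rtsp://192.168.1.50:554/stream1"  # hypothetical camera address

cap = cv2.VideoCapture(RTSP_URL)
if not cap.isOpened():
    raise RuntimeError("could not open the camera stream")

width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter("sleep_recording.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         15.0, (width, height))

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # stream dropped or camera went offline
        writer.write(frame)
finally:
    cap.release()
    writer.release()
```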
  14. Thank you! So a 2070 Super will be able to play current games at 4K resolution with max graphics settings and maintain 60+ fps? I don't think the 3700X stock cooler would fit in the Node 202 case, so I'll be keeping the Scythe Big Shuriken 2 unless it's also too tall on top of AM4. You may be on to something about just upgrading the graphics card for now, though. I know there are some games already that want 8 physical cores (RDR2), so that's why I was thinking the 3700X might be a good choice - double the cores and threads, plus higher boost clocks and better IPC in general. I don't have any consoles, so that's not a worry, but I just want to make sure that the TV setup can game as well as everything else in the man cave for when I have a few friends over.
  15. I posted this on r/htpc but I want you guys' input too, if that's all right:

     Hi folks - I currently have a decent HTPC hooked up, but after upgrading to a 4K HDR TV I'm noticing that it is struggling to keep up with modern PC games, and I kind of want to pull the trigger on new guts for it. I'd like to keep it in the existing Node 202 and retain the SFX 600w PSU if it's enough to power the new parts. Speaking of, current parts are as follows:

     - Asus ROG Strix Z270i Gaming motherboard
     - Intel Core i7 7700 (non-K, delidded + liquid metal)
     - Scythe Big Shuriken 2 rev. B CPU cooler with thin Noctua 120mm fan
     - 16gb (8gb x 2) 2666mhz CL15 Corsair Vengeance LPX DDR4
     - MSI GeForce GTX 1070 Gaming graphics card
     - Samsung 850 Pro 512gb SSD

     Use is maybe 50/40/10 couch gaming / emulators / media watching, so obviously a better graphics card will help with the 4K resolution. I figure these parts are all fairly old now though, and I would like to upgrade the full platform so it stays adequate for a good 4-5 years before I get the bug to upgrade it again. Parts I'm thinking to plunk down for:

     - Asus ROG Strix X570-i Gaming motherboard
     - AMD Ryzen 7 3700X CPU (because it is much beefier than the 7700 and still 65w TDP)
     - 16gb (8gb x 2) 3600mhz CL16 DDR4
     - best air-cooled 2080 Ti that will fit the Node 202
     - decent 2TB NVMe SSD - maybe Samsung 970 Evo Plus, but maybe something PCI-E gen 4 because why not

     So my questions for you guys:

     - Does this parts list seem like a good, logical upgrade overall that should last me at least 4 years of 60+ fps gaming?
     - Any reason other than price to drop to a 2080 or 2080 Super instead of the 2080 Ti?
     - Will the existing Corsair 600w SFX PSU be enough for the new parts? (see the rough estimate below)
     - Will the Scythe Big Shuriken 2 rev. B cooler still fit, both on the AM4 motherboard (I'm pretty sure it included AM4 brackets) and within the Node 202 case? I don't know if there's a mounting height difference, but as-is I had to remove the internal filter to fit it on top of the 7700.

     Please let me know what you guys think, and thanks in advance!
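     On the PSU question, here's a quick back-of-envelope sum using published TDP/TGP figures, padded upward a bit (these are assumptions, not measurements - transient spikes will exceed them):

```python
# Rough worst-case draw estimate for the proposed Node 202 build.
# TDP/TGP figures are padded guesses, not measurements.

parts_watts = {
    "Ryzen 7 3700X (65 W TDP, padded for boost)": 90,
    "RTX 2080 Ti (250 W TGP, padded for factory OC)": 280,
    "Motherboard, RAM, NVMe SSD, fans, USB": 60,
}

total = sum(parts_watts.values())
psu_watts = 600
print(f"Estimated peak draw: ~{total} W on a {psu_watts} W PSU "
      f"({total / psu_watts:.0%} load)")
```

     On paper that leaves the 600w SFX unit with comfortable headroom, but millisecond-scale transient spikes on big Turing cards can trip protection on smaller SFX units, so it's worth a stress test before trusting it.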
  16. Anecdotally I've seen more Micron-VRAM Turing cards have memory issues, and historically Samsung memory chips seem to be better for overclocking, but I wouldn't make the blanket statement that Micron is bad. They are one of the largest memory manufacturers in the world because generally their chips are good - I think they just had a big batch of worse-than-expected chips that ended up in the first few rounds of RTX graphics cards. That being said, I am paranoid and have only bought RTX cards with Samsung memory as a result of those early issues.
  17. We need to make a movie about you - a rural programmer in the mid-2010s getting by on a 20-year-old rig and a 6 GB monthly data cap... getting chills just thinking about it. I've lived in Utah my entire life and always thought we had it bad out here due to low population density, but I remember growing up we had 256kbps DSL in 1997, and when I first moved out with friends in 2003 we had cable that was, I think, 50mbps.
  18. I love the progress pics and how cleanly you were able to do this... but after reading the OP I have just one question... How in the world were you running anything on 512mb of RAM in 2010, before that upgrade?? I had 2gb of RAM in my final Pentium 4 rig, and 4gb in my last Windows XP rig with the Core 2 Duo E8500 in, I think, 2008?
  19. I would check with HWiNFO to see what those temp sensors actually are - "Temp #11" doesn't give you any clue as to what it is, though I'd be a little concerned myself about anything sitting at 83 c.
  20. Where are the sexy finished pics? You've had all week, buddy - don't keep us hanging
  21. I have to agree with you on the Kailh Box Whites being more comfortable than MX Blues - and I also prefer the tactile clicky feel of blues. My first mech board had browns because they're sort of a middle ground between tactile and smooth, but when I bought another board with blues I was hooked. Now I have 5 boards with Cherry MX Blues, 1 with browns, 1 with Razer Orange (like Cherry browns), 1 with Razer Purple laser switches, and one with Kailh Box Whites. Nowadays, Kailh Box Whites and Razer's purple laser switches are kind of tied for me as the most comfortable, but they don't feel the same. Cherry MX Blues are in a de facto 3rd place because they're harder to press and not as smooth, but I'd still take them over linear switches any day. I can't type on smooth linear switches at all, and since I type more than 'just' game on all my PCs, I need the clicky-clacky tactile switches.
  22. You'll need to do some testing to know for sure what that chip can do, but adding more radiators and fans doesn't linearly increase max OC. A main point of diminishing returns on x299 chips is reaching the point where core heat simply can't be removed from the silicon quickly enough to keep temps within a safe limit. This is exactly the wall I've hit with my 10980xe: it can easily do 5ghz all-core on about 1.3v, but even at 1.23v it simply heats up too quickly and the two hottest cores hit 90c almost instantly. No amount of radiator in a case will stop that with ambient cooling.
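     The voltage math backs this up - dynamic CPU power scales roughly with V² at a fixed clock, so that drop from 1.30v to 1.23v only shaves about 10% of the heat. A quick sketch (the 500 W baseline is an assumed package power for a 10980xe at 5ghz all-core under heavy load, for illustration only):

```python
# Dynamic CPU power scales roughly as V^2 * f. At a fixed 5 GHz all-core
# clock, compare estimated package power at two voltages. The 500 W
# baseline is an assumed figure for illustration, not a measurement.

base_v, base_w = 1.30, 500

for v in (1.30, 1.23):
    est = base_w * (v / base_v) ** 2
    print(f"{v:.2f} V @ 5 GHz: ~{est:.0f} W ({est / base_w:.0%} of baseline)")
```

     Saving ~50 W doesn't help much when the bottleneck is how fast heat gets out of the die itself rather than total radiator capacity.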
  23. Oh man... it took me 2 months just to find one in stock somewhere. Central Computers in California listed 4 of them on Wednesday last week; I found out about it Wednesday night and ordered one, and they were sold out the next morning. If an exchange were possible I might try that, but these things are like unicorns.
  24. Oh, that's a risky question - sure, it 'can' be delidded, but as you said it's very risky. I didn't mind on the 7960x since it was not soldered, and I'm also not worried about LGA1151 chips like the 8086k/9900k/9900ks since their CPU die is much smaller and there are no SMDs near the IHS mounting, so the only real chance of damage is if the solder rips a chunk of the die off somehow. With the 99x0 and 109x0 chips being soldered, having a larger die, and having SMDs extremely close to where the IHS touches the PCB, I'm leery - although the end-game version of this rig would be direct die, so I'll have to think about that.

     Yeah, I saw about a 20-25c drop delidding the 7960x, so I know it helps a ton on those kinds of chips. That one was easier because it was stability limited, not thermals. Its sweet spot for ambient cooling was 4.7ghz all-core on 1.245v, and at that no core ever got above 76c; to do 4.8ghz stable required over 1.300v and would thermal throttle. This 10980xe though... it's just all thermals. Given how easily it does 5.0 with relatively lowish voltage, I bet with just outdoor winter ambient I could get it doing 5.4-5.5 no problem (last winter I got the 7960x passing 3dmark benches at 5.2, but that required 1.425v, and in -12.2c ambient it was still hitting 80c under load).

     Thanks for the input - I will try to check the paste spread tomorrow and report back. I don't mind lapping the IHS if it will help, but I just want to make sure I'm gathering as much info as possible before making that irreversible change.
  25. 5ghz on this one is stable with ~1.325v, but I run into thermal limitations. No idea what is 'universally accepted as doable', as hardly anyone has these, and the few reviews I've seen were using worse cooling and way too much voltage. I haven't removed the block to check the spread yet, but I've never had a problem with a thick line down the middle and 4 blobs on the corners for LGA2066. Could be the first time, but I'm skeptical. A crappy chip would be fine as an explanation, but I'm not sure that would explain the temp delta unless, as the other guy mentioned, there's a solder thickness variation under the IHS. If I have some time tomorrow I may pull the block off to confirm the TIM spread - if that looks fine, I may need to lap the IHS.