
relaxation

Member
  • Posts

    15
  • Joined

  • Last visited


relaxation's Achievements

  1. Drove an hour, then waited three more at Microcenter to perform the firmware update (during 4090 launch day). The 12500 was in fact dead; I sent it back with a copy of the Microcenter receipt and the refund appears to be going ahead, but I had picked up a nicely priced 12400 (secondary market again) and it's been up and running for 24h. After it was confirmed running I had a chance to swap the ILM. I had reseated the heatsink so many times that I was aware of the uneven contact in the middle of the CPU; after I had affixed the unit with new paste I backed off the screws to pull up the heatsink, and I thought suction or dried paste was holding it, but after twisting and lifting the heatsink it was such a clean spread I was impressed. A few quirks with the motherboard's default options: secure boot was enabled, and almost all options to wake from sleep were disabled anyway. Thanks guys, and good luck to any of you people from the future reading this.
  2. I could return the CPU to the seller within 30 days if it was defective. Microcenter wants $30 to update the firmware with an appointment; I checked repair shops more local to home, and they wanted $199 to perform the firmware upgrade.
  3. Hello, I've picked up an MSI PRO B660-A DDR4 motherboard, a G.Skill 32GB kit of 3200 RAM, and a Scythe Fuma 2 rev. B off of Amazon, but the second-hand Intel 12500 that was advertised as working was delivered today (it looks immaculate on the underside), and so far all I have achieved after power is applied is the following: the fans spin up and the EZ Debug LED flashes red, changes to orange for a split second, then goes back to red and stays solid; or, more uncommonly, it flashes red, orange, red, then reboots and stays solid red. I've allowed it to sit that way for 5 minutes before cutting power. A keyboard and a monitor were connected (DP cable to the iGPU), and I've also had a GTX960 attached in other attempts today (I'm using said keyboard, monitor, and GPU to write this post), but no logo was ever displayed. I've swapped PSUs, tried different brands of RAM one stick at a time, tried different mounting pressures for the heatsink, reseated the CPU, made sure the instructions from MSI were followed correctly (the CPU PCB is keyed to only install one way; no triangles on the socket), cleared the BIOS, reseated all the cables multiple times (EPS, 24-pin, CPU_FAN), and reseated the RAM into different slots as well. I've read the README, the MSI post, and checked the socket for bent pins, but it appears fine to me. I'm thinking I have to run to a store and swap CPUs to rule out any compatibility issues with firmware (which I cannot seem to flash without a working CPU). Is there something I missed?
  4. Tom's Hardware listed what the prices of the case might be: over 900 American dollars. I'd be interested in another piece of content that uses this case, plus mixing a whole bunch of PSUs and GPUs together to suss out coil whine under various GPU loads like 60fps, 120fps, and uncapped fps. Would this case be good for a DAW? I'm unsure how well grounded the mobo is, floating there. Wasn't there going to be a content piece on DAWs? In my own system, playing something with little GPU utilization like Rocket League at high frame rates, my speakers' external amp picks up a hum that turns into a buzz at high frame rates and sounds like knocking when I use an FPS limiter at 4fps. Ask someone like James, who has a living room setup connected to an amp and subs, whether <various GPUs under various fps loads> cause audio issues; lots of people on the NVIDIA side of the fence have been complaining since Fermi. (For reference, I have a Fanmate 2 on an NF-A14 ULN spinning under 600rpm, because the ULN's stock 800rpm isn't inaudible enough, on top of semi-passive audio equipment so its fans never have to turn on.)
  5. Isn't it weird that Alex measured around 30 dB(A) in-office with beanbags, but the hemi-anechoic chamber has a noise floor of 28.5 dB? Was that Z-weighted (flat) and not A-weighted (est. human perception)? Was the measurement performed by them with their equipment, or perhaps yours, with a reliable range above 30 dB(A)?
  6. Interesting stuff. I looked at my own Kill-A-Watt when I got the GTX960 years ago, and one of the features I was interested in, hardware-accelerated video encoding, drew about 45W when engaged (this could include drive load). I'd like to halve my idle draw to match your results, but it may not be possible since you don't have a platter drive hooked up and run only one case fan. Hmmm, good to know what a baseline could look like for an ATX platform! I had benchmarked a game at 1080p, 1440p (downscaled), and 2160p (downscaled): it ranged from 120~171W, and peaked at 216W when recording/encoding video while playing.
  7. It seems like a mixed bag (spreadsheet, imgur.com) and YMMV depending on loads that aren't listed; the timings adjustment seems to be the thing to look at.
  8. Come back to this topic next GPU generation(?) to see how DLSS & RIS panned out for performance gains, plus a subjective quality comparison.
  9. Advertised as 2.3GB/s for the 500GB model; I went by the average speed on ssd.benchmark. I'm curious what the bottleneck is for most folks using it. It was their choice to have the "fastest" option for $37 more; caution was given that potential clients may not have these fast ports at the moment. Unlikely; if they did want encryption, I'm uncertain whether Samsung addressed their issues at the hardware level, and I could suggest VeraCrypt for software encryption as a free alternative.
  10. After checking various sources and watching der8auer OC a 9900K, I've switched to the NH-D15S, inside a Define R6. USB 3.1 Gen 2 should be ~1GB/s; sometime later this year or next they could replace the enclosure and add a card for ~1.87GB/s over USB 3.2 with that drive. I decided on the EX920 for OS/apps; it would function much quicker than the 860 Evo in backup situations for $4 extra. Aorus Elite vs. Pro: neither has a 3.1 Gen 2 front port to pair with the R6, so I went with the Ultra bundle at Microcenter for $5 extra over a cable. Around $2,280 for the machine & monitor currently, so I think I'm done with part selection; now to make sure cables are included to hook up the monitor, and the burner! I could grab the MPY-7501-ACAAG MWE Gold 750 for $39 less than the Seasonic unit... after the 29th once they're in stock on Amazon... if they don't correct the price.
  11. As a freelancer, I don't know if they'll keep the clients' work after it's delivered, or how long projects may take that would require 'regular backups'. I guess that's what the external storage is for: backup and delivery. I just finished watching a Hardware Unboxed video on "LTT's i9-9900K numbers are wrong?". I've updated the motherboard; the Mugen 5 seems to land between the Dark Rock Pro 4 and the NH-D15 on an overclocked 6700K (TweakTown), and an overclocked 7700K (YouTube) should be analogous to the 9900K, where the NH-D15 seemed fine. Hmm, only the Define R6 has 5.25" drive bays. Thanks for all the input so far! P.S. I didn't explicitly say it, but there are (2) 970 Pros: one for the secondary drive (project files), another for external storage.
  12. Last I checked the Z390 UD didn't have any 3.1 Gen 2 ports, and MSI listed the i9-9900K as compatible with the Z390-A PRO. Forgive me, I'm a 65W silent-computing person who has never had to worry about overclocking and VRMs. $4 more for an OS/apps drive which, from what I've heard, gives a negligible performance increase in that task. However, I will admit it may do the external storage job well: according to UserBenchmark, the EX920 would use up 901MB/s of the available 1024MB/s of bandwidth over USB 3.1 Gen 2, whereas over a 1870MB/s link the limit would sit with the link rather than the 970 Pro's full speed. USB 3.2 with 2GB/s of bandwidth is supposed to come out some time this year, however; I'll keep a lookout for a PCIe card for that.
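The bandwidth reasoning in that post boils down to taking the slower of drive and link. A minimal sketch of that comparison; the 970 Pro read figure is my own rough assumption for illustration, not a number from the post:

```python
# Back-of-the-envelope check of which side bottlenecks an external NVMe
# enclosure: the drive or the USB link. Effective throughput of a
# sequential transfer is capped by the slower of the two.

def bottleneck(drive_mbps: float, link_mbps: float) -> float:
    """Return the effective sequential throughput in MB/s."""
    return min(drive_mbps, link_mbps)

USB31_G2 = 1024    # ~usable MB/s on a 10 Gbps link, figure from the post
USB32 = 1870       # ~usable MB/s figure quoted for USB 3.2

EX920 = 901        # EX920 sequential MB/s quoted in the post
SP970_PRO = 2700   # approximate 970 Pro sequential read (assumption)

print(bottleneck(EX920, USB31_G2))      # 901  (drive-limited)
print(bottleneck(SP970_PRO, USB31_G2))  # 1024 (link-limited)
print(bottleneck(SP970_PRO, USB32))     # 1870 (still link-limited)
```

So under these assumptions the EX920 nearly saturates a Gen 2 link, while the 970 Pro is held back by the link even on USB 3.2.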
  13. I'm being asked to put together a workstation for an acquaintance (USA, MN); their budget is likely at or below 2,200USD for a monitor and machine total. Between two monitors, the LG 27UK650-W and the Dell Ultrasharp U2718Q, they've chosen the LG: 2,200 - 449, leaving 1,751USD for the machine. So far I've come up with this PCPartPicker part list:
     CPU: Intel - Core i9-9900K 3.6 GHz 8-Core Processor ($474.99 @ Walmart)
     CPU Cooler: Scythe - Mugen 5 Rev. B 51.17 CFM CPU Cooler ($47.89 @ OutletPC)
     Motherboard: MSI - Z390-A PRO ATX LGA1151 Motherboard ($109.99 @ Amazon)
     Memory: G.Skill - Aegis 16 GB (1 x 16 GB) DDR4-2133 Memory ($53.99 @ Newegg)
     Storage: Samsung - 860 Evo 500 GB 2.5" Solid State Drive ($69.99 @ Amazon)
     Storage: Samsung - 970 Pro 512 GB M.2-2280 Solid State Drive ($159.89 @ OutletPC)
     Video Card: EVGA - GeForce RTX 2070 8 GB Black Video Card ($449.99 @ B&H)
     Case: Corsair - 200R ATX Mid Tower Case ($65.99 @ Amazon)
     Power Supply: SeaSonic - FOCUS Plus Gold 550 W 80+ Gold Certified Fully Modular ATX Power Supply ($74.99 @ Amazon)
     Operating System: Microsoft - Windows 10 Home OEM 64-bit ($106.99 @ Other World Computing)
     Total: $1614.70
     They've requested the i9-9900K, 16GB of RAM, and the RTX 2070. Following Puget Systems' recommendations for Premiere, I grabbed an SSD (OS & apps) and an NVMe drive (project files), and thought a fast external drive would be good, so I decided on a 970 Pro and a USB 3.1 Gen 2 enclosure. After some digging I found out that getting 3.1 Gen 2 output has a bunch of strings attached: a more expensive case and/or motherboard for front-panel access, and the $20 USB 3.1 Gen 2 mod costs twice as much as a cable routed from the back I/O, which is what I've decided to go with. It took a while to track down why cables only come in short lengths: 10Gbps signaling can't go very far. I decided on the 970 Pro for its price per terabyte written, which is on par with HP's EX920. Since I don't have any idea about their Premiere workflow, besides "16GB is fine for what I do", I have concerns:
     1) The SSD/NVMe look like they're behind the chipset on this board; is that a problem?
     2) Would 1 stick of 16GB be fine for now, in case they change their mind and want 32GB with dual-channel support?
     3) Is RAM clock speed important?
     4) Would pro & RAID1 support be important to you for project files, vs. a backup of project files to the OS drive?
     5) I haven't really followed solid-state tech since 2014, so have I glossed over a feature that may not show up in benchmarks?
     6) I haven't had an Intel system since a 366MHz Celeron (the "never obsolete" eMachine, lol), and I'm unsure if a modest CPU overclock is in the cards with the PSU I selected; I'm also unsure how much an RTX 2070 pulls in that kind of workflow.
     They'll buy the parts soon™, so time-sensitive deals aren't in the cards. I had advised waiting for Puget Systems' review of the 3900X, but their response was essentially "too new, sold out". I should probably ask if they're okay with the whole 'the cable routed from the back is your fastest one' approach... P.S. I have that case as a requirement; they requested a DVD burner.
  14. I see input lag measured around the internet:
     WydD (medium.com): wired PS4 controller vs. wireless, measured via USB Host Shield (a 1ms device). Wireless was on average 7.3 to 1.8 ms quicker depending on the dongle, coming in at a 3ms average with a good one. Plus other arcade sticks.
     Rocket Science (youtube.com): photosensitive diode vs. a 1K FPS camera setup; the camera was 1ms quicker. Razer Naga vs. Arduino: the Arduino was 4.4ms quicker; despite both being 1000Hz (1ms) devices, the Arduino had 10.8ms of total response time.
     TFTCentral (tftcentral.co.uk): the quickest monitor measured with their own photosensitive diode & oscilloscope system, the Asus ROG Swift PG279Q, had 3.25ms of total display lag. The PG35VQ@200Hz has a measured 2~3ms g2g at certain shades.
     LTT (youtube.com): XL2730Z@200Hz, an average of 21.5 frames of input lag captured at 960FPS, or 21.5/960*1000 = 22.4ms total.
     So in this recent video of yours, you've got the Acer FX250Q (or A?) monitor, which I couldn't find 3rd-party measurements for; anyway, you've got sub-14ms response times now. So I've seen a fast-clicking Arduino with a 10.8ms total response time and your quickest mouse response of 13ms total response time, both on 240Hz monitors.
     TL;DR version: I'd say after a certain point the game engine (the larger part?) and display must be the bottleneck, if a USB Host Shield, which intercepts signals from devices like you described at the beginning of your video (check the Medium link for more info), can measure a button response time averaging 3ms on a PS4 pad over a good Bluetooth dongle. Would that be fair to say?
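For anyone double-checking the LTT number above, the conversion from lag counted in high-speed-camera frames to milliseconds is simple; a minimal sketch:

```python
# Converts lag counted in frames of a high-speed capture into
# milliseconds: frames / capture_fps gives seconds, * 1000 gives ms.

def frames_to_ms(frames: float, capture_fps: float) -> float:
    return frames / capture_fps * 1000.0

# XL2730Z figure from the post: 21.5 frames captured at 960 FPS
print(round(frames_to_ms(21.5, 960), 1))  # 22.4
```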
  15. Dxtory software, before Shadowplay/VCE/GVR were a thing. Sorry for necroing, but I found this feedback (from 2015) helpful! Dxtory's video codec [without compression / YUV420] was more reliable than the Lagarith codec for my long plays (1h 30m) and can achieve 60FPS if I want it to (on an FX6300). However, after discovering UtVideo recently, my first few tests showed that it performs and looks indistinguishable from the Dxtory codec at only 56% of the size! That is crucial to me: what was originally a 149GB recording would only be 83GB, if the codec proves it's just as good at transitions as it is on static-ish backgrounds [pinball tables that don't change perspective often]. Shadowplay's 50Mb/s preset would dial in at around 34GB, which is even more incredible to me, but I don't currently have a card that records to H.264 for a quality comparison, and I haven't found the settings or the right x264 encoder to get something encoded on the fly via CPU. I think my last long video [720p], which was a little over an hour at 120-130GB, compressed via x264 down to 1.7GB; its bitrate maxes out at 30Mb/s during transitions and sits at 1Mb/s to 3Mb/s during gameplay... could video-card H.264 be variable too? I'm stuck between a new HDD or video-card H.264 for future recordings, as I am low on available space.
     Edit: installed Fraps and OBS, got some benchmarking done for frametime/FPS results here.
     Order = no recording, dxtory, lagarith, obs, UtVideo601, UtVideo709. Y=baseline | R=29.97 | G=30 | B=60. Top = forced flip queue | bottom = no tweaks.
     The game I play wants you to record at 60fps or use OBS; this particular game slows down if it's rendered slower than 60fps, and with a majority of frame drops you could see a ~7-8ms stutter with the other methods. Going to use 50Mb/s via OBS in the future; saved me some money.
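The file-size figures in that last post follow from constant-bitrate arithmetic (size = bitrate x duration). A minimal sketch that reproduces the ~34GB Shadowplay estimate, assuming the 1h30m long-play length mentioned in the post:

```python
# Estimates recording size for a constant-bitrate capture:
# bits = bitrate * seconds, then convert bits -> gigabytes (1e9 bytes).

def size_gb(bitrate_mbps: float, seconds: float) -> float:
    return bitrate_mbps * 1_000_000 * seconds / 8 / 1e9

# Shadowplay's 50 Mb/s preset over a 1h30m session
print(size_gb(50, 90 * 60))  # 33.75
```

The variable-bitrate x264 file described above (1Mb/s to 3Mb/s for most of the runtime) lands far below this constant-bitrate ceiling, which is why it compressed to under 2GB.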