flipped_bit

Member
  • Posts

    284
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Not Telling
  • Location
    Canada

System

  • CPU
    Xeon E3-1241v3 @3.9GHz, 1.07V
  • Motherboard
    Asus Z97-E/USB 3.1
  • RAM
    G.Skill Ripjaws X 8GB (2x4GB) DDR3-1600 @ 1600MHz
  • GPU
    MSI RX 480 Gaming X 4GB @1350MHz core, 2150MHz memory, 1.09 vcore
  • Case
    Phanteks Enthoo Pro M
  • Storage
    Crucial MX100 256GB SSD
  • PSU
    EVGA 750W G2 80+ Gold Modular
  • Cooling
CM Hyper 212+ w/ Noctua F12 fan
  • Operating System
    Windows 10 Retail

Recent Profile Visitors

923 profile views
  1. I'm close to buying the Asus VG27AQL1A. The only thing holding me back is that I can't see an easy way to switch input sources - it looks like it requires ~8 clicks through the OSD. Contrast that with my current old LG 1080p IPS, which needs one joystick press followed by a single nudge forward. Some days I switch multiple times between my work laptop and my personal desktop - making ~8 clicks through the OSD each time would drive me crazy. It's 2021 - surely ASUS has a shortcut option for switching inputs on their mid-range and higher-end monitors? I've looked at the user manual for the VG27AQL1A, and there are two shortcut buttons on the back of the monitor, but it looks like you can't assign input switching as a function to either of them. I would really hate for such a small thing (though a large one from a day-to-day usability perspective) to keep me from buying this monitor. Am I missing something here? I would be using this monitor for productivity (programming, spreadsheets, word processing, web surfing) ~80% of the time, and gaming/entertainment ~20% of the time. If there's no input-switching shortcut option, I may just go with the 27GL83A-B instead. It's an older model with slightly worse contrast and I hate the gamery red accents, but input switching should be easy and it's not Nano IPS (which is a plus for me - I'm trying to stay away from Nano IPS due to the reported eye-strain issues). There's also the MAG27QRF (non-QD), which may be the best choice in the 144Hz 1440p IPS category, but I've read the panel lottery is especially awful on these. I've also considered the M27Q, which with its built-in KVM would be perfect for me in that regard, but the BGR text-clarity issue is a non-starter. Can anyone please point me to a way to change inputs easily on the VG27AQL1A? (A software workaround I'm considering is sketched after this list.)
  2. Great response Stahlmann, thank you. Agreed. Unfortunately it seems like there are very few such monitors on the market. The Dell S2721DGF seems to be one, but the reports of eye strain have steered me clear until now. It is on sale though...
  3. I have been shopping for a 1440p 144Hz monitor for PC-only use for a long time now, and have been reading a lot of rtings.com reviews (in addition to watching videos from LTT, Hardware Unboxed, etc.). In an rtings review, they test response time at both the max refresh rate and at 60Hz. Invariably, the response time is worse at 60Hz. The question I can't find a clear answer to - even on the rtings site - is under what conditions a monitor will run at 60Hz, and therefore have that worse response time. I understand hooking up a 60Hz source like a console is one situation. I will be using this monitor with only a PC source, so I want to know whether there are any circumstances where my PC would change from a max (144Hz+) signal to a 60Hz signal. Here are a few hypotheticals - curious as to which response time would apply to each. For each one, the refresh rate is set to max in the OS and in-game FPS is at 60:
     - VRR OFF.
     - VRR ON.
     - Game engine limits max FPS to 60, VRR OFF.
     - Game enforces vertical sync and a max FPS of 60, VRR OFF.
     The reason this is important is because a) I want a good response-time experience across a broad FPS range, and b) very few monitors have great response time at 60Hz, so if I only have to care about the response at high refresh rates (as this is for PC use only), that greatly widens the range of monitors I can consider. I'd really appreciate some input on this - thank you in advance. (My current understanding is sketched in code after this list.)
  4. You are right - the jump in single-core performance from the 5600G to the 5600X is only about 10%, so pretty negligible. Thank you for clarifying that. With a rumoured 3D V-Cache refresh coming to AM4 soon, there may be a further upgrade option for even more single-core performance down the road - though it may only be available for higher core-count chips (perhaps not for the 6-cores - who knows). Yes, it's highly unlikely Intel would put out a broken platform, but being an early adopter always carries some risk. Between the 5700G and the 12600K I would personally go 12600K, but I may be biased since I'm strongly considering an upgrade to a 12700K myself. That said, I have had many Intel- and AMD-based systems over the years - I usually go with the best mix of bang for the buck and platform features.
  5. I'm toying with the idea, yes - especially if I can get a dGPU at a more reasonable price as part of a bundle deal somewhere. My old Xeon/RX 480 combo is still working fine for 1080p, but I've been eyeing a move to 1440p for a while now. My use case is productivity during the day - and for my workloads more threads matter more than raw speed - and occasional gaming on the weekends (mostly 4X like Civ, where single-core speed comes in very handy). I will be looking at a VFIO setup in my new machine, but I don't think that's relevant here. With the dGPU situation being what it is, I am only considering CPUs with integrated graphics for the time being, so although Zen 3 is extremely impressive, I won't be going in that direction.
  6. Like I mentioned, I'm looking at a 2x16GB (32GB) pair. The cheapest CL14 kit is roughly double the price of the cheapest CL16 kit.
  7. If you are gaming on the machine, the 5700G (or 5600G) is the way to go. It has a much beefier iGPU, though it's still very weak compared to even an older-generation entry-level dGPU. It may be years before sanity returns to the dGPU market. If you just need something to run your monitor(s) until you get a dGPU (no heavy gaming - just watching videos, productivity, etc.), I'd strongly consider the 12600K. It blows the 5700G away in both single-core and multi-core performance. It is a new platform though, so there is always some risk with that. And it will be more expensive - probably $200 or so more after you factor in a pricier motherboard and a good cooler. It will also use more power at full load, though it seems to use comparable power at idle and in typical workloads, and you can always lower the power limits to make it more efficient. Either is a good choice. I just built a family member a machine with a 5600G a couple of months ago and it's a great little chip. With the 5600G/5700G, you're buying into an established platform with room to upgrade down the road (5600X for more single-threaded power, 5800X or 5900X for more cores). With the 12600K you're getting more performance today, but on a new, unestablished platform.
  8. I've been on a review-watching binge of Alder Lake, and it looks like pretty much everyone tested the DDR4 boards with DDR4-3200 CL14. The early conclusion seems to be that there isn't much difference in performance between DDR4-3200 CL14 RAM and early DDR5 sticks, except in certain multi-core workloads where DDR5 pulls ahead. So most reviewers are suggesting you stick with DDR4 for now due to lower cost. However, when I look for 3200 CL14 RAM, it's dramatically more expensive than CL16. The cheapest 2x16GB pair of CL16 RAM I can find is ~$140 Canadian, whereas the cheapest CL14 kit is $270 - which is getting closer to DDR5 territory. So if you need 3200 CL14 for DDR4 to stay competitive, maybe DDR5 is actually worth the extra money? What kind of drop-off are we looking at with CL16 vs CL14? That's a sizeable 12.5% drop in latency on paper (see the latency math sketched after this list), but it would need to be tested to determine the actual impact on performance. Also, I have seen one reviewer (can't remember who) say Alder Lake will actually stay in Gear 1 with higher-clocked RAM - so maybe a cheaper, faster DDR4 CL16 kit ($180-200 Canadian) is the way to go. Has anyone done a detailed test of memory performance yet with different frequency and timing combinations?
  9. Ahh, I see! I appreciate your and the other responders' answers - I will go ahead with this build, confident I don't need the 4.0 speeds at all. I just wanted to make sure the top M.2 slot would work with the 5600G.
  10. One reason I am concerned about this is that I specced out an Intel build as well, with a 10600K on a Gigabyte B560M Aorus Pro AX, and the spec page for that motherboard says the CPU-attached M.2 socket is only compatible with 11th-gen processors, not 10th-gen. Maybe they meant that only the 4.0 speed requires an 11th-gen chip, and it would work at 3.0 speed otherwise? Anyway, it put a doubt in my mind even though I'm going with an AMD build (since I can't find the 11400 anywhere). Here is the spec page, BTW: https://www.gigabyte.com/Motherboard/B560M-AORUS-PRO-AX-rev-1x/sp#sp
  11. Putting together a new build. I know the 5600G only supports PCIe 3.0, but the top M.2 slot on my motherboard (Asus B550-F Wi-Fi) is PCIe 4.0. Will my PCIe 3.0 NVMe SSD work in this slot at 3.0 speeds, or will the CPU not work with it at all? I'd hate to have to use the 2nd (chipset) M.2 slot. I've read PCIe 4.0 is backwards compatible, so I hope it will be fine. Thanks. (A quick way to verify the negotiated link speed is sketched after this list.)
  12. Just necro'ing this thread with a solution in case anyone finds it via a search engine. The card's performance got significantly worse after my original post, and since the card's warranty is up and it's next to impossible to get a next-gen card now anyway, I decided to disassemble it to see what was going on. Turns out there was next to no thermal paste left on about 3/4 of the GPU surface area. Given I use this PC for about 2 hours a day on average, this was extremely disappointing/surprising. I guess I won't be buying an MSI card again if they skimp on the paste and/or do a poor job of applying it. Anyway, I put a generous helping of MX-4 on there, and now she's humming like new. Load temps are back in the low 60s, idle temps at ~20, and high 60s with a heavy overclock. Very pleased with this result. I would guess the reason my card never reported above 60 while stuttering is that one or more parts of the die were well in excess of 60 due to the lack of thermal paste, and the temperature sensor didn't capture this. So if your GPU is stuttering and you're getting low temperature readings, it could be the thermal paste!
  13. I'm looking to upgrade my desktop PC. For years I've used a separate laptop for work and my desktop for personal use, which is primarily gaming. Now my work laptop is getting on in years (8+), and I'm thinking that rather than replace the laptop as well, it would be nice to use the same PC for both work and play. That would save me from using a KVM, and allow me to justify a more powerful desktop (say a 5900X and 32GB RAM rather than a 5600X and 16GB RAM). I work from home running my own business, BTW, so no corporate overlords in the equation. My main concern about this is security. For example, I know a lot of games use anti-cheat software that scans all of system memory. I read about one recently that apparently even installs into kernel space and starts with the OS. I am concerned about the privacy and data-security implications of this. I don't know enough about how Windows handles memory and permissions to know if I'm making a mountain out of a molehill here or if this is a legitimate concern. If I create one Win 10 account for work and one for play, will Windows prevent processes spawned by the play account (such as games and anti-cheat software) from scanning memory currently allocated to the work account (as well as stored files in the work account)? I will also have BitLocker turned on, but I don't think that matters in the context of my question/concerns. I've also thought about dual-booting two instances of Windows, but restarting multiple times a week is a pain in the butt I'd prefer to avoid. Besides, I'm not even sure that would be much more secure than the single-boot option. If anyone can shed any light on this or point me to resources on how Windows manages memory security, that would be most appreciated - I'd like a better grasp on this before deciding which parts to buy. (A small experiment illustrating the user-mode boundary is sketched after this list.)
  14. The CPU very much matters for gaming in certain titles and/or if you are aiming for high FPS. A weaker CPU will bottleneck a strong GPU at higher FPS. If you are aiming for 1080p at sub-100 FPS, then yes, you'll probably be happy with a 3600 for a long time. If you're aiming for 1080p 144Hz or 1440p 144Hz, I think there will be times you'll wish you had a top-of-the-line 8-core CPU. Man, I wish AMD had released an 8-core 65W 5700X. That's the part I was planning my build around, and I don't want to wait until next year to build.
  15. I'm obviously pleased with the improved technology/performance, but that pricing is very disappointing. I was planning a 5700X build this fall, assuming it would be in the same price ballpark as the 3700X. That was a bit more than I wanted to spend, but I had decided to stretch for it. Now we find the 5700X doesn't exist (yet), and when it does arrive it will probably be another $50-$75 more than the 3700X. I guess I shouldn't be surprised. AMD now appears to hold the performance crown in all respects, so it's not surprising they'd start charging more. Still, as a long-time AMD supporter (in 20+ years I've only ever owned one non-AMD CPU, and I have never purchased an NVIDIA GPU), I'm disappointed.
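
Re: post 1 above - the software workaround I mentioned: many monitors let the PC switch inputs over DDC/CI, bypassing the OSD entirely. Here's a minimal sketch using the third-party Python package monitorcontrol (pip install monitorcontrol). Both the library's input-source support and the monitor's behaviour are assumptions - I haven't verified that the VG27AQL1A honours MCCS VCP code 0x60 (input select):

```python
# Hypothetical sketch: switch monitor inputs over DDC/CI instead of the OSD.
# Assumes the third-party "monitorcontrol" package and that the monitor
# honours MCCS VCP code 0x60 (input select) - unverified for the VG27AQL1A.
import sys
from monitorcontrol import get_monitors, InputSource

def switch_all(target: InputSource) -> None:
    for monitor in get_monitors():
        with monitor:  # opens the DDC/CI handle for this display
            monitor.set_input_source(target)

if __name__ == "__main__":
    # e.g. "python switch_input.py DP1" on the desktop, "... HDMI1" for the laptop
    switch_all(InputSource[sys.argv[1].upper()])
```

Bound to a hotkey on each machine, that would beat even the LG's one-press-one-nudge routine.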
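Re: post 3 above - my current understanding of which refresh rate the panel actually runs at (and therefore which rtings response-time measurement applies), sketched as code. The assumptions: with VRR off the panel always scans out at the OS-set refresh rate regardless of game FPS; with VRR on the refresh tracks FPS within the VRR window; the 48-144Hz window is a placeholder and LFC behaviour below the window is ignored:

```python
# Sketch of my understanding - not authoritative. With VRR off, scan-out is
# fixed at the OS-set refresh rate; with VRR on, it tracks FPS within the
# VRR window (48-144 Hz assumed here; LFC below the window is ignored).
def panel_refresh_hz(os_refresh: int, game_fps: int, vrr_on: bool,
                     vrr_min: int = 48, vrr_max: int = 144) -> int:
    if not vrr_on:
        return os_refresh                        # game FPS is irrelevant
    return max(vrr_min, min(game_fps, vrr_max))  # refresh follows FPS

for vrr in (False, True):
    hz = panel_refresh_hz(os_refresh=144, game_fps=60, vrr_on=vrr)
    print(f"VRR {'ON ' if vrr else 'OFF'}: 60 FPS on a 144Hz signal -> {hz}Hz")
# VRR OFF -> 144Hz (the max-refresh response-time numbers would apply)
# VRR ON  -> 60Hz  (the 60Hz response-time numbers would apply)
```

If that holds, then of my four hypotheticals only the VRR ON case pulls the panel down to 60Hz; the three VRR OFF cases stay at 144Hz no matter what the game does.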
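Re: post 8 above - the arithmetic behind the "12.5% drop in latency" claim. First-word CAS latency in nanoseconds is CL times the clock period, and since DDR transfers twice per I/O clock, the period is 2000/(MT/s) ns. The extra kits below are illustrative examples, not quotes:

```python
# First-word CAS latency: CL cycles * clock period.
# DDR transfers twice per I/O clock, so period_ns = 2000 / (MT/s).
def cas_ns(cl: int, mt_per_s: int) -> float:
    return cl * 2000 / mt_per_s

for label, cl, rate in [("DDR4-3200 CL14", 14, 3200),
                        ("DDR4-3200 CL16", 16, 3200),
                        ("DDR4-3600 CL18", 18, 3600),   # illustrative kit
                        ("DDR5-4800 CL40", 40, 4800)]:  # illustrative kit
    print(f"{label}: {cas_ns(cl, rate):.2f} ns")
# DDR4-3200 CL14 ->  8.75 ns; DDR4-3200 CL16 -> 10.00 ns (the 12.5% gap)
# DDR4-3600 CL18 -> 10.00 ns; DDR5-4800 CL40 -> 16.67 ns
```

Note DDR4-3600 CL18 lands at the same true latency as 3200 CL16 while adding bandwidth - which is why the "stays in Gear 1 at higher clocks" claim could make a cheap, faster CL16 kit the sweet spot.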
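Re: post 11 above - once the build is up, the negotiated link can be checked directly rather than guessed at. A sketch for a Linux host (relevant if my VFIO plan happens); the nvme0 path is an assumption, so adjust for your controller:

```python
# Sketch: read the negotiated PCIe link for an NVMe drive from Linux sysfs.
# The nvme0 path is an assumption - substitute your controller's name.
from pathlib import Path

dev = Path("/sys/class/nvme/nvme0/device")
for attr in ("current_link_speed", "max_link_speed", "current_link_width"):
    print(attr, "=", (dev / attr).read_text().strip())
# A PCIe 3.0 drive in a 4.0 slot should show a current speed of 8.0 GT/s
# (PCIe 3.0) against a 16 GT/s slot maximum - i.e. negotiated down, working.
```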
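Re: post 13 above - a way to see the user-mode boundary concretely. Any user-mode cross-process read has to go through OpenProcess, and the kernel refuses the handle when the caller lacks access to the target process (e.g. a standard user poking at another account's process). This Python/ctypes sketch uses only documented Win32 calls; note it says nothing about kernel-mode anti-cheat drivers, which operate below these checks entirely:

```python
# Sketch: user-mode reads of another process's memory must go through
# OpenProcess. Run as a standard user against a process owned by a different
# account and OpenProcess should fail with error 5 (ERROR_ACCESS_DENIED).
# Kernel-mode code (e.g. anti-cheat drivers) is not subject to this check.
import ctypes
from typing import Optional

PROCESS_VM_READ = 0x0010
kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.OpenProcess.restype = ctypes.c_void_p
kernel32.ReadProcessMemory.argtypes = [
    ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p,
    ctypes.c_size_t, ctypes.POINTER(ctypes.c_size_t)]

def try_read(pid: int, address: int, size: int = 16) -> Optional[bytes]:
    handle = kernel32.OpenProcess(PROCESS_VM_READ, False, pid)
    if not handle:
        print(f"OpenProcess({pid}) failed: error {ctypes.get_last_error()}")
        return None
    try:
        buf = (ctypes.c_char * size)()
        n_read = ctypes.c_size_t()
        ok = kernel32.ReadProcessMemory(handle, ctypes.c_void_p(address),
                                        buf, size, ctypes.byref(n_read))
        return buf.raw[:n_read.value] if ok else None
    finally:
        kernel32.CloseHandle(handle)
```

So the two-account setup does help against ordinary user-mode scanning - but a kernel-mode anti-cheat driver sits below that boundary, which is exactly why it worries me.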