jayark1

Member
  • Posts: 32
  • Joined
  • Last visited


jayark1's Achievements

  1. I wouldn't go OLED for desktop/PC gaming for those reasons alone. Burn-in is hit or miss, and it's certainly NOT guaranteed you won't experience it. There's even "burn-in prevention" technology they like to advertise, but it's still a known problem for people. Also, burn-in isn't covered under warranty, which tells you how confident they are in their burn-in prevention technology. Until they cover it under warranty, I don't believe any marketing pitch about burn-in prevention.
  2. I felt there was some edge bleeding, but personally that didn't bother me. I'm typically not overly bothered by those types of discrepancies. Either that, or it was minimal enough to be negligible. However, some people are more sensitive to it than others.
  3. The LG UltraGear 34GN850-B is a fine UW. LG also sells the 34GK950F-B, and I'm personally unsure of the differences between the two. I feel the 34GN850-B is probably the best in its class; however, I returned it. Not because it isn't a good monitor, but because of the drawbacks of ultrawide monitors in general.
  4. To better answer that, you have to ask yourself what size monitor you want. If you're staying at 27 inches, the difference between 1440p and 4K won't be drastic visually in fast-moving games, since the screen is smaller and the pixel density is already high (there's a quick pixel-density sketch after this list). The big difference would be the extra workload your computer has to take on to run a game at 4K as opposed to 1440p. If 27 inches is what you prefer, then 1440p, in my opinion, would be better for performance and will still look visually impressive. To me, at 27 inches, it's not worth the amount of power and processing your PC will have to expend for what little difference you'll see. Also, as games become more demanding, you'll have to slide those graphics settings down to keep up. With 1440p, you'll keep those High settings longer and still keep performance. Now... you want to stay at 1440p and really begin to tap into 3080 performance? Wait until 1440p/240Hz monitors become more mainstream soon.
  5. I feel that monitor will be dated really soon since it doesn't have HDMI 2.1. I feel that in Q1 of next year you're going to start seeing mainstream 4K monitors that properly support HDMI 2.1 for the same money, with better features and options.
  6. I returned mine recently. I couldn't handle the curve and how it seemed to warp the picture. Also, not all games support UW, especially some competitive multiplayer games, which intentionally don't support it. I feel as though more and more online multiplayer games are nerfing 21:9+ aspect ratios and will continue to do so moving forward. I play Dead by Daylight, and it doesn't even support letterboxing well; you literally have to change the resolution in Windows to get the game to work as intended. Even when things were supported correctly, I didn't like having to pan across the screen from left to right, even on the desktop. I'm used to looking at the right-hand corner quite a bit in Windows, and it just didn't sit well with me on UW. I suppose I'm used to multiple monitors, where I can stay laser-focused on one thing, as opposed to panning across one screen for information. Also, with gaming, you know that pretty much everything is going to work on a 16:9 screen. I'd rather just use a 16:9 main screen, where I know everything is going to work as intended and will be supported.
  7. Sounds good.... so should I be more concerned about a CPU's "Turbo Boost" clock speed than its base clock speed for gaming?
  8. Would there be a substantial difference to justify selling a 3900 (non-X) for a 3700X if I'm primarily gaming at 1440p (combined with a 3080)?
  9. Question: AMD Ryzen 9 3900 - 3.1GHz base [4.3GHz Turbo], 65W; AMD Ryzen 9 3900X - 3.8GHz base [4.6GHz Turbo], 105W. The price difference between the 3900 and the 3900X is ~$150.00. Is that jump worth it? Will there be a substantial, noticeable difference in gaming between these two CPUs when combined with an RTX 3080 playing games at 1440p or 4K? (Rough clock-delta numbers are sketched after this list.)
  10. I was just looking this up myself, as I'm interested in 240Hz 1440p for the 3000 Series. There are *some* reviews on the web, but not very many. Also, this is a TN panel, which is older tech; I'm interested in an IPS panel with these specs along with faster response times. Then there's this: the Acer XB273U GX at 270Hz. It isn't out yet, and some articles earlier this year suggested September 2020 for the U.S.; however, it's September and Acer hasn't announced anything publicly about it (that I know of). Another site suggested "Late 2020 or Early 2021". The Samsung Odyssey G7 is about the only widely available 1440p/240Hz monitor on the market that I know of; however, I've read overwhelmingly negative reviews about it due to "flickering" issues....
  11. Based on my research, I haven't seen anywhere that there will be new, native boards for Zen 3 "then". Some people have speculated that X570/B550 will be the boards for Zen 3 for the next few months to a year after Zen 3 launches, although they won't support anything past Zen 3. Again, speculation... but unfortunately I haven't found any news, or even upcoming announcements, of an X670 or equivalent.
  12. Need some advice. I plan to get a 3080 and am waiting on the new Ryzen 4000 Series. I've read that the Asus ROG Strix B550 and ROG Strix X570 are supposed to support the Ryzen 4000 Series. Is this true? Which of the two is better, and what are the differences? Can anyone recommend anything better? I plan to use this with a 3080 GPU.
  13. Thank you guys for the insight. Based on your advice that at 27 inches 4K is negligible compared to 1440p at the same size, I feel UW (21:9) at 3440x1440/144Hz using the new Ampere is going to be the better experience. I would hate to invest in a 4K 27-inch 144Hz monitor, not really see the difference, and have it tax my GPU harder... that seems a waste of resources, especially since games will inevitably become more taxing... and lowering the resolution later could cause potential ghosting/blurring. I understand some games don't support 21:9 natively, but it seems that most do. Most Microsoft games, as well as the new Cyberpunk, are going to support that format. If they don't, people tell me you can play at 1440p in 16:9 (letterboxed) and not have any issues... no different from viewing it on a normal-sized screen, just with some black margins on the left and right. At this point, the only other option that would possibly be better would be a 16:9 1440p 240Hz monitor for the 3080 at 27-30 inches.
  14. I will be building from scratch, as I've been away from PC gaming since 2006. I'm waiting on the Ampere cards and particularly eye-balling the 3080. I want an experience that sets a standard away from consoles, which is why I've considered UW, as consoles don't support that (as of now, anyway). BUT a 3080 is supposed to play 4K at 120fps; however, if a 27-inch monitor is negligible at 4K, I'd rather have the better experience with UW. I suppose the main question is this: Will 27-inch 4K/144Hz be a better experience, or will 34-inch UW 3440x1440 (21:9) at 144Hz be a better experience, considering the GPU can easily handle both?
  15. Is 27 inches negligible for 4K gaming, compared to an UltraWide 3440 x 1440 at 34 inches (considering both have the same refresh rate)? My two considerations: 1. UltraWide: LG 34GN850 (3440x1440 - 144Hz) 2. Standard: LG 27GN950 (4K - 144Hz) If 27 inches really is negligible for 4K, then wouldn't the UW at 34 inches be the better experience? The GPU would be a 3080. (There's a quick pixel-count comparison sketched after this list.)
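
A minimal Python sketch of the pixel-density math behind item 4, assuming standard panel specs (2560x1440 and 3840x2160 at a 27-inch diagonal). The numbers are illustrative arithmetic, not benchmarks.

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the panel's diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'27" {name}: {ppi(w, h, 27):.0f} PPI, {w * h / 1e6:.1f} million pixels per frame')

# Prints roughly: 27" 1440p = 109 PPI and ~3.7 MP; 27" 4K = 163 PPI and ~8.3 MP.
# So at 27 inches, 4K pushes about 2.25x the pixels per frame for a density
# gain that's hard to notice in fast-moving games at desk distance.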
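
A quick sketch of the clock-speed deltas for the 3900 vs 3900X question in item 9, using only the specs and ~$150 price gap quoted there. Which delta actually shows up in games is an assumption-laden question, since 1440p/4K with a 3080 is largely GPU-bound.

# Listed specs from item 9; the ~$150 price gap is the figure quoted in that post.
cpus = {
    "3900":  {"base_ghz": 3.1, "boost_ghz": 4.3, "tdp_w": 65},
    "3900X": {"base_ghz": 3.8, "boost_ghz": 4.6, "tdp_w": 105},
}

boost_gain = cpus["3900X"]["boost_ghz"] / cpus["3900"]["boost_ghz"] - 1
base_gain = cpus["3900X"]["base_ghz"] / cpus["3900"]["base_ghz"] - 1
print(f"Boost clock gain: {boost_gain:.1%}")  # ~7%  - closest to what lightly threaded games see
print(f"Base clock gain:  {base_gain:.1%}")   # ~23% - matters more for sustained all-core loads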
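
And the pixel-count comparison behind the "tax my GPU harder" point in items 13-15, again just arithmetic on the listed resolutions; actual frame rates depend on the game and settings.

resolutions = {
    '27" 4K (16:9)': (3840, 2160),
    '34" UW (21:9)': (3440, 1440),
    '27" 1440p (16:9)': (2560, 1440),
}
four_k_pixels = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP per frame ({px / four_k_pixels:.0%} of the 4K load)")

# Roughly: 4K = 8.3 MP, 3440x1440 = 5.0 MP (60%), 2560x1440 = 3.7 MP (44%).
# At the same refresh rate, the 34" ultrawide asks the 3080 for about 60% of
# the pixel work of 27" 4K, which is why it tends to hold high settings longer.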