
thyrel

Member
  • Posts

    14
  • Joined

  • Last visited

Awards

This user doesn't have any awards

thyrel's Achievements

  1. I have an MSI Trident X equipped with a 9700K clocked at 4.9 GHz, a 2080 Ti, 32 GB of DDR4 RAM at 2666 MHz in dual channel, and a Samsung 970 Evo Plus NVMe SSD. The PC runs great, but I am noticing FPS drops after I play for a while, especially in open-world games whenever I move around the map at high speed (Cyberpunk 2077 is a good example, but also Watch Dogs: Legion). My CPU utilization is very high in those scenarios while the GPU sits around 70-80%.

     I tracked my CPU status thinking it might be related to overheating and throttling, but although temps are not ideal (up to 85 °C, usually hovering around 78 °C), there is apparently no throttling: the clock remains locked at 4.9 GHz throughout long game sessions (checked with HWiNFO). GPU temps are around 69 °C at sustained full load, so no problem there. It seems I am somehow CPU-limited in those games, even at 1440p, yet it seems odd that a 9700K at 4.9 GHz cannot handle them properly. One strange thing I have noticed in HWiNFO is that the "effective clock" while gaming is about 500 MHz lower on average than the nominal clock speed; I don't know if this is something to consider or not (a small logging sketch for tracking this over a session is included after this list).

     Then I thought it might be a RAM issue, since I know RAM can also induce a CPU bottleneck. My RAM is a low-profile Samsung model running at 2666 MHz with not-too-great timings (CL19). Can this be a RAM issue? Do I need faster RAM to remove this bottleneck that seems to get worse the longer I play, or is it just a plain CPU bottleneck? Unfortunately, I cannot overclock my current RAM since it has no XMP profile, and I am afraid that if I overclock it manually the system won't boot and I'll need to reset the CMOS, which is inconvenient on a Trident X since you have to remove the heatsink to short the jumpers. Thanks in advance for your help.
  2. Yeah, I can confirm that RDR2 looks absolutely stunning on the C9, even at 1440p (I have a 2080 Ti, but 4K ultra is just way too heavy even for this GPU). I am seriously considering selling my beloved Samsung CHG70 and getting one of these newer TVs.
  3. Should my next gaming monitor be a newer 4K HDR TV? Recently, I had to move back to my parents' house for a while due to the whole COVID-19 situation, and I got the chance to hook up my gaming rig to their new LG C9 OLED TV and boy... so far SO good. At first I was concerned about input lag, but in gaming mode it is pretty good and I didn't notice any latency issues whatsoever. It also supports a 120 Hz refresh rate (at up to 1440p, until a DP 1.4 to HDMI 2.1 adapter comes out), and again, I couldn't really tell the difference from my 144 Hz monitor in this regard; everything feels super smooth (a quick frame-time calculation after this list suggests why).

     Moreover, it is an OLED panel, so you get those perfect blacks, top-notch color accuracy, and an outstanding HDR experience, a far cry from what you can get on any gaming monitor right now. The C9 in particular is a pretty expensive option, but I know there are also cheaper models with most of these features, features that an equally priced monitor just doesn't have, especially when it comes to picture quality.

     So my question is: does it make sense to buy one of these TVs instead of a gaming monitor? I know that hardcore competitive gamers will always go for the nearly nonexistent input lag and extreme refresh rates of some monitors, but what about everyone else? What do you think? Of course, one really big problem with OLED TVs could be burn-in... that's what is really keeping me from buying one.
  4. I would suggest the OP even if you do care about the camera. As pointed out in many reviews, the camera is totally fine and very close to the top ones. Moreover, once a Gcam port comes out, you will be able to take Pixel-like quality shots. As an example, my OP6 with Gcam still takes far better pictures than my friends get on their Note 10 or iPhone XS. As for video recording, the iPhone is still on top.
  5. Moreover, once a Gcam port comes out, OP 8 users will be able to take shots with Pixel-like quality, which in my opinion is far superior to what you can get on an iPhone or a Samsung.
  6. I see, but the noise in the iPhone's shot is still terrible, and I think that does matter when comparing the pictures.
  7. Thanks! I agree with you on this. I just can't ignore all that noise in the iPhone's shot, especially considering that a night mode is supposed to keep it to a minimum.
  8. Can you elaborate on your answer a bit more? I said that colors look better on the iPhone but the noise is terrible. You said nothing apart from "are you serious? OP is trash".
  9. Yes, I am. The OP shot does look better to me; I think the iPhone shot is super noisy and grainy. Colors are better on the iPhone in this case, but overall it looks worse to me.
  10. So I've seen the OP 8 and 8 Pro review and I just can't understand how Linus considers this: https://imgur.com/C4JOMtm better than this: https://imgur.com/0rIsSaR Granted, both of them are far from perfect, but the iPhone's shot, while brighter, is noisy garbage and looks significantly worse overall. I know it is a small detail, but I would still like to know what you guys think about it. P.S. Please, no OP vs. iPhone war here, just a discussion about picture quality.
  11. Thanks a lot for the very detailed explanation!
  12. I have noticed that whenever I have my USB-C to Ethernet adapter plugged in, my laptop's battery drains quite a bit faster than when I am on Wi-Fi. Is that normal? I thought a wired Ethernet connection consumed less power than a Wi-Fi chip... (a quick way to measure the difference is sketched after this list). Thanks in advance! :)
  13. $699 before taxes means €760-800 in Europe for a card that performs on par with an RTX 2080, which costs more or less the same and offers ray tracing support. I'm quite disappointed.
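
On the "effective clock" question in post 1: this is a minimal sketch, assuming Python 3 with the third-party psutil package installed, for logging average core clock and CPU load over a long session so that any gradual drop can be lined up with when the FPS dips start. The 5-second interval, sample count, and cpu_log.csv file name are arbitrary choices, and on Windows psutil's frequency reading is coarser than HWiNFO's effective-clock column, so treat it as a trend indicator rather than a precise measurement.

    import time
    import psutil  # third-party: pip install psutil

    SAMPLES = 720  # roughly one hour at a 5-second sampling interval

    with open("cpu_log.csv", "w") as log:
        log.write("elapsed_s,avg_mhz,cpu_percent\n")
        start = time.time()
        for _ in range(SAMPLES):
            load = psutil.cpu_percent(interval=5)  # average load over the last 5 s
            freqs = psutil.cpu_freq(percpu=True)   # per-core on Linux, a single entry elsewhere
            avg_mhz = sum(f.current for f in freqs) / len(freqs)
            log.write(f"{time.time() - start:.0f},{avg_mhz:.0f},{load:.1f}\n")
            log.flush()  # keep the file current if the session is cut short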
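
On the 120 Hz vs. 144 Hz point in post 3, a quick frame-time calculation backs up the impression: frame time is 1000 ms divided by refresh rate, so 1000 / 120 ≈ 8.3 ms per frame versus 1000 / 144 ≈ 6.9 ms, a gap of roughly 1.4 ms per frame that is plausibly below what most people can perceive outside of fast competitive play.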
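
For the battery question in post 12, one way to put a number on the difference is to sample the battery's reported power draw with the adapter plugged in and again on Wi-Fi, then compare the averages. This is a rough sketch, assuming a Linux laptop whose battery appears as BAT0 under sysfs with a power_now node (reported in microwatts); some batteries expose current_now and voltage_now instead, and on Windows a monitoring tool such as HWiNFO can show the equivalent charge/discharge rate.

    import time

    def read_power_watts(bat="BAT0"):
        # power_now is in microwatts on batteries that expose it
        with open(f"/sys/class/power_supply/{bat}/power_now") as f:
            return int(f.read()) / 1_000_000

    # take 30 samples over about a minute; run once per network setup
    samples = []
    for _ in range(30):
        samples.append(read_power_watts())
        time.sleep(2)
    print(f"average draw: {sum(samples) / len(samples):.2f} W")

A somewhat higher draw on the adapter is plausible in itself: a USB Ethernet adapter keeps the USB controller and its own PHY powered the whole time, whereas an idle Wi-Fi chip can drop into low-power states.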