Posts posted by Mholes

  1. 50 minutes ago, Galion said:

    Which cpu should I get? 

    I plan to use it for gaming, browsing and game emulators.

    Games I'll be playing :

    Battlefield 3, 4 and 5, Assassin's Creed Origins, Smite, Overwatch, World of Tanks, Rocket League and F1

    Emulate: PS2, NES, SNES, Neo Geo, Sega/Sega Mega Drive

    My options :

    i5 8400 - $148.69

    i5 8600 - $179.99

    R7 1700 - $159.99

    R5 2600 - $164.99

    I'll be pairing it with a Gigabyte Aorus RTX 2070 8GB

    Personally I'd choose the R5 2600. While it might lose to the i5 8600 in games (not by much), it provides a better upgrade path.

  2. Personally I would get the 1070 Ti; it has great price/performance, and since RTX ray tracing is kinda meh atm, I don't see the RTX cards as an option.

    Compared to the Vegas: the 1070 Ti uses less power and still beats both of them, though the Vega 64 will win in some scenarios/games.


    Also, with 8GB of VRAM the 1070 Ti should hold up at 1440p. Some games might even run at ultra settings.


    TL;DR: the 1070 Ti, because it has the best price/performance out of these cards, Tensor cores aside.

  3. 3 minutes ago, LukeSavenije said:

    1) no, you'll need something external for that

    2) yes

    3) sometimes

    4) probably


    Just now, cashan said:

    Phones have tiny screens though, so it wouldn't make that much of a difference if the 4K monitor is somewhat large. I get that there are more factors at play, like colour and stuff. Just reflecting.


    Yeah, phones have much higher PPI, so it really depends what you're looking for in the photos/videos. I still agree, though.
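    The PPI point is just geometry: pixels per inch is the diagonal resolution divided by the diagonal size in inches. A quick sketch; the phone and monitor sizes/resolutions below are made-up examples for illustration, not devices from this thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical examples: a 6.1" 1080x2340 phone vs a 27" 4K monitor.
phone = ppi(1080, 2340, 6.1)     # ~422 PPI
monitor = ppi(3840, 2160, 27.0)  # ~163 PPI
print(f"phone: {phone:.0f} PPI, 4K monitor: {monitor:.0f} PPI")
```

    Even at "only" 1080p-class resolution, the much smaller diagonal gives the phone far more pixels per inch than a large 4K monitor.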

  4. Personally I prefer 1440p over 4K at the moment, mainly because you can't easily reach high fps in games at 4K. With a 2080 Ti you can get close, but two NVLinked 2080 Tis are a must if you really want to reach those sweet 120fps marks. Personally I'd buy either a 1440p 144Hz monitor or an ultrawide 1440p monitor at 120Hz. That's just my opinion though; you should wait for other answers too.

  5. 8 minutes ago, JorksX said:


    I just fixed it on my own. I've had a GTX 960 for a few years at least, and when I got it I remember someone said that it supports only 2 monitors.

    Now I looked in the Windows screen resolution settings and saw "one output available", even though I already had two monitors connected. I connected the 3rd one and all displays are working from the same GPU :)

    Any idea how much performance I lose by doing this?

    I mean I have 40" tv, some old 60hz benq monitor and this 144hz samsung connected to gtx960.

    Do I lose like 20fps ingame or not at all?


    The display itself has a 60Hz mode, a 144Hz mode and a dark mode. In the display settings there are only contrast, sharpness and brightness controls, nothing else.

    I didn't find anything in the manual either.

    The performance impact is maybe 1-3fps in games at most, unless you're doing something taxing on one display while gaming on the other.

  6. 11 minutes ago, JorksX said:

    Not sure if this is a GPU problem, but basically I just got a Samsung C24FG73FQU display and connected it via DisplayPort to my GTX 960; my other BenQ monitor is connected to the same GPU via a DVI cable.

    Before, I had two monitors, one with a DVI cable and one with HDMI, and I had no problems.

    I removed the HDMI cable and that display and replaced them with the Samsung one.

    everything is working fine until this samsung monitor goes to sleep mode and from windows settings I cannot see the samsung monitor anymore. I have to manually turn on the samsung monitor and then windows detects it and everything is fine again.

    How can I fix this sleep mode problem?

    In the Windows power settings everything is set to "never".

    I'm using Windows 8.1.

    I tried to Google it but didn't find anything for Windows 8.

    There were a few Windows 10 posts where I had to change some registry settings, but I cannot find them on Windows 8.



    The second problem I'm having is that I cannot connect 3 monitors. The GTX 960 supports only two, but I have a GTS 450 for my 3rd monitor.

    Nvidia changed some stuff, because about 6 months ago, when I downloaded the GTX 960 drivers, my GTS 450 and its displays started working and I had 3 monitors working fine.

    But now I have to first download the GTX 960 drivers; then I get my 2 monitors working.

    Then I have to download the GTS 450 drivers, and I get my 3rd monitor working.

    So I get all my monitors working, but for example Battlefield 5 starts yelling at me that I have old GPU drivers.

    If I download the newest GTX 960 drivers and install them, I can play Battlefield 5 fine, but my 3rd monitor isn't working and Windows won't detect it. I have to download the GTS 450 drivers again to get the 3rd monitor working, but then Battlefield 5 starts saying I have old drivers.

    Well, the Samsung C24FG73 might have its own sleep settings; you should check those. No idea about the drivers, though.

  7. 45 minutes ago, DoctorMckay said:

    I just got my new 1080ti and I have some doubts I wanna clear before buying a decent monitor for it.
    Until now I had a 1080p 60Hz TN AOC screen paired with a GTX 770, V-sync on almost 24/7.
    I'm planning to buy a 1440p 120Hz+ screen.

    Question number 1:
    From what I'm seeing in many benchmarks, getting some games to 120fps+ is not possible with max settings on a 1080 Ti. Let's say I get 90 fps in game X. Would I need to activate V-Sync or G-Sync? Is there any image tearing from underfeeding frames to a screen?


    Question number 2:
    Are there any screens I really need to avoid? I mean, I'll be spending like $350+; ghosting should not be an issue at those prices, right?


    Question number 3:
    On games I know I can get 120fps+ easily, let's say Osu! or CSGO, if I'm capable of feeding almost 200fps to the screen with my GPU, could V-Sync be enough for playing at 120fps or would it absolutely trash my gaming experience?(Maybe I do not exactly understand what V-Sync is, I know what it is for, but not how it works)

    Question number 4:
    Kinda related to the previous question. G-Sync. I doubt I'll be playing Overwatch, League of Legends or CSGO. I don't have many games where I know I'll be getting tearing for overfeeding frames. Also, I dunno if V-sync can be used for 120fps gaming. Is G-sync a "must", a "buy it if you meet these requirements(list them pls)" or a "don't bother".

    Q1: You should use G-sync. G-sync makes lower frame rates look smoother on a high refresh rate display, since it changes the display's refresh rate to match your fps.


    Q2: Some people avoid VA panels; you should check reviews etc. before buying.


    Q3: If I remember correctly, you can use G-sync + V-sync together: V-sync removes screen tearing when you go past the display's refresh rate, and if you somehow drop under it, G-sync will smooth things out. Even better, you could just limit the game's max fps to your display's refresh rate.


    Q4: G-sync is a nice feature to have if you cannot reach 120fps; it will make the picture look smoother, like I said earlier. V-sync, on the other hand, will just limit the fps to your display's refresh rate, for example 120, which helps with screen tearing.


    EDIT: If you want a butter-smooth experience with an Nvidia GPU, you should buy a G-sync monitor.
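    The "just limit the max fps to your display's refresh rate" advice in Q3 is simple frame pacing: give each frame a time budget of 1/refresh-rate and sleep out whatever is left over. A minimal sketch (not a real game loop, just the pacing idea; `run_frames` and its parameters are illustrative names):

```python
import time

def run_frames(n_frames, cap_hz):
    """Render n frames, sleeping so we never exceed cap_hz frames per second.

    Returns the total wall time taken.
    """
    frame_budget = 1.0 / cap_hz  # e.g. 1/120 s = ~8.3 ms per frame at 120 fps
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            # Frame finished early: wait out the rest of the budget.
            time.sleep(frame_budget - elapsed)
    return time.perf_counter() - start

# Capped at 120 fps, 12 frames should take at least roughly 0.1 s of wall time.
total = run_frames(12, 120.0)
print(f"effective rate: {12 / total:.0f} fps")
```

    Real games and drivers do this with much higher-resolution timers (and fps limiters like the one in the Nvidia control panel do it for you), but the principle is the same.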

  8. 3 hours ago, Marhier said:

    I'll give up for now, lol.
    I know it's an entry-level TV and you get what you pay for; I'm only using it for PS4 really - I've got a PG279Q for the PC, which is damn amazing even if not HDR.

    Looking at these entry level TVs though, I'm interested to know how they can advertise them as HDR, when I think these models only achieve a peak brightness of 300-500 nits - isn't that SDR territory?

    Yeah, it's kinda shady, and yeah, it's SDR, maybe HDR-ish if the peak brightness is over 400 nits. When I tried to find a good HDR TV, I just checked whether the TV had a VESA DisplayHDR certification. That is one way to tell whether a TV really has okay-ish HDR or just a bright SDR panel.

  9. 7 minutes ago, Marhier said:

    I never did get that call back, lol!

    Hmmm, I don't really know what to say :D Weird that they say it's HDMI 2.0a?

    Well, if the LG looked worse, the Samsung is still the winner here even with HDMI 2.0a, since HLG isn't a must; it just helps with TV broadcasts etc. :D

  10. Just now, Marhier said:

    So I ended up getting the Samsung.
    I went to the shop, the LG TV was on display and it looked awful; they were playing some Harry Potter trailer and maybe it wasn't in 4K, but the motion blur coming off the movement of the characters really made the TV look bad.

    They had the 55" version of the Samsung I wanted on display and it looked incredible - obviously nothing like their OLED/QLED displays, but then again, I'm not spending £2k on a TV - I'm going entry level, lol.

    Prior to me going to the store, I rang PC World's technical team to confirm whether or not the Samsung's ports were 2.0b.
    I said to them that their documentation said it has HLG and asked how this could be if it wasn't 2.0b - they couldn't give me a straight answer and gave me a number for Samsung support.

    I rang Samsung and the 2 people I spoke to there were looking through the manual and confirmed it had HLG, but couldn't confirm for definite if it had HDMI 2.0b, lol.

    They then said as it's a 2018 TV, they're pretty sure it will have the most up to date HDMI ports.
    When I went to look at the TVs in person, the people in the shop said exactly the same thing as Samsung.


    Following on from this post though, I'm assuming it at least has HDMI 2.0a, because HDR isn't supported on 2.0, which leads me to my original question:

    "...if HDR is only supported by 2.0a and above, how does the Samsung TV do HDR if nothing you plug into it will work?"


    The table above contradicts this statement?

    Any more thoughts?


    Yes, you're right. And good that you checked the LG one. The TV MUST have HDMI 2.0b, since 2.0a doesn't support HLG.


    And yeah, since these are entry-level TVs, you really need to see them in person. Glad to help.

  11. 37 minutes ago, Marhier said:

    I'm probably going to go with the LG, as this one definitely says HDMI 2.0b... Every website I look at only shows HDMI 2.0 for the Samsung; what makes you think both should have 2.0b?

    I know with HDR on, these TVs have 40-50ms response times, but I only play single-player games on my PS4 and doubt I'd really notice.

    Because both of those panels should have HLG support according to https://www.displayspecifications.com/en/model/1ed713e1

    HDMI 2.0a doesn't support HLG but 2.0b does.

  12. 2 hours ago, Zandvliet said:

    I also have a question about this. I'm looking at the Samsung UE43NU7190 which according to https://www.displayspecifications.com/en/model/5a511342 has HDMI 2.0 not 2.0a or 2.0b. But the samsung websites say the TV has HDR 10+ and HDR. 


    What's the story there?


    EDIT: hmm just noticed that the 50UK6470PLC also has HDMI 2.0 according to https://www.displayspecifications.com/en/model/4910121f


    Maybe the information is just incorrect..?

    Yeah, all these HDMI 2.0, 2.0a and 2.0b labels are very confusing because no one lists them correctly; sometimes a port is listed as 2.0b, sometimes just as 2.0.

    But as a rule: HDR -> 2.0a, HDR + HLG -> 2.0b.

    In conclusion, both of those TVs should have HDMI 2.0b.

  13. 41 minutes ago, Marhier said:

    Thanks Mholes, that table makes sense; so 2.0a is just the addition of the wide colour gamut?

    If just using the TV for PS4 gaming and watching Netflix etc., which of the TVs do you think has the best image quality? I'm assuming a TV that can support WCG would be better?


    I personally prefer the LG. In my opinion it is the better choice.

  14. HDMI versions compared

    HDMI Version | Max Resolution | Max 4K Frame Rate | HDCP 2.2 | HDR | WCG | Hybrid Log Gamma | Dynamic Metadata
    1.4          | 4K             | 30Hz              | No       | No  | No  | No               | No
    2.0          | 4K             | 60Hz              | Yes      | No  | No  | No               | No
    2.0a         | 4K             | 60Hz              | Yes      | Yes | Yes | No               | No
    2.0b         | 4K             | 60Hz              | Yes      | Yes | Yes | Yes              | No
    2.1          | 10K            | 120Hz             | Yes      | Yes | Yes | Yes              | Yes
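    The table doubles as a quick rule of thumb (HDR -> 2.0a, HDR + HLG -> 2.0b). A minimal sketch of that lookup; the feature labels are just the table's column names, not official spec identifiers:

```python
# Feature support per HDMI version, cumulative, taken from the table above.
HDMI_FEATURES = {
    "1.4": set(),
    "2.0": {"HDCP 2.2"},
    "2.0a": {"HDCP 2.2", "HDR", "WCG"},
    "2.0b": {"HDCP 2.2", "HDR", "WCG", "HLG"},
    "2.1": {"HDCP 2.2", "HDR", "WCG", "HLG", "Dynamic Metadata"},
}

def min_hdmi_version(required):
    """Return the lowest HDMI version supporting every required feature."""
    # Dicts keep insertion order, so this walks versions oldest-first.
    for version, features in HDMI_FEATURES.items():
        if required <= features:
            return version
    return None

print(min_hdmi_version({"HDR"}))         # 2.0a
print(min_hdmi_version({"HDR", "HLG"}))  # 2.0b
```

    So when a spec sheet advertises HLG but lists the port as plain "HDMI 2.0", one of the two claims is mislabeled, which is exactly the confusion in the posts above.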