
Link8295

Member
  • Posts: 8
  • Joined
  • Last visited


Link8295's Achievements

  1. I know. I'm not trying to suggest the manufacturer is at fault. I was just wondering if it's possible for things like that to arise from a drop.
  2. Thanks for the info, guys. Yeah, I'm surprised it survived too. I guess falling on its back rather than face first made a difference? Can a fall introduce defects that weren't there before, such as backlight bleed getting worse, or anything like that?
  3. Long story short: I dropped my monitor off a table about 2 to 3 ft high. It fell on its back and the casing took some damage, but luckily the monitor seems to be working fine. Is it possible, though, that the fall damaged internal components in a way that could become a problem down the road?
  4. It's not, but I was expecting a stable 90 or 100 fps at very low settings. The constant dips cause stuttering when the monitor's refresh rate and the game's frame rate don't match (see the frame-pacing sketch after this list). Should I just lock both the game and the monitor to 60?
  5. My system is a Ryzen 7 1700 OC'd to 3.7 GHz, a GTX 1080 Ti, and 16 GB of RAM. In Assassin's Creed Origins I average 85 fps at 1080p even on very low settings. Does that add up? Secondly, I'm kind of a noob to PC gaming. I have a 144 Hz monitor; if I'm getting less than 144 fps in a game, am I better off locking the monitor to a lower refresh rate to match the fps?
  6. Thanks for clearing that up. About dithering, though: is there any way to ensure no dithering, such as getting a true 8-bit monitor? But then I hear the GPU in a device like an Xbox One or PS4 can force dithering even onto an 8-bit monitor. And even though dithering might not cause visible flicker, I hear it may still contribute to eye strain even if you can't see it? Some people on the Intel and Apple forums have been making temporal dithering out to be the culprit (see the dithering sketch after this list).

     Also, would refresh-rate flicker apply to plasma televisions the way it applies to CRTs, or is a plasma fast enough, with long enough persistence, for flicker not to be an issue? I know plasmas use PWM, but from what I understand the PWM used in plasmas shouldn't be problematic, since it's not the same kind of PWM used in LED-backlit monitors, where the LED constantly flickers on and off. Thanks for any help.
  7. Hi, wondering if anyone can help. I'm looking for a flicker-free and temporal-dither-free gaming experience.

     Firstly, I understand flicker can be caused by PWM, so removing that is the easy part. Where it gets complicated: is it true that at a 60 Hz refresh rate there is flicker regardless? Here are a couple of sites that discuss this (I'm not sure I understand them correctly; do we need 800 Hz displays?!): https://www.nature.com/articles/srep07861 http://www.conradbiologic.com/articles/SubliminalFlickerI.html

     Secondly, from what I understand, temporal dithering can introduce flickering. So if a true 8-bit panel is used, does that mean no temporal dithering? But aside from the panel itself, I hear that graphics chips can enable dithering by default: not only ATI chips and Intel integrated chips, but even Nvidia GeForce chips from the 900 series onwards. I hear the PlayStation 4 and Xbox specifically have dithering on by default. The Nintendo Switch is a question mark, since it uses an older, Maxwell-based Nvidia Tegra chip.

     My questions, then: is there any way to ensure temporal dithering is off and avoid any flicker? Is there any combination of display (OLED, LCD, plasma, or projector) and source (gaming console or PC) that would ensure a flicker-free experience? Speaking of OLED and plasma: since they are self-emissive displays without backlights, would that negate any flicker or dithering problems? I hear plasmas dither heavily and flicker using PWM, but I suspect that isn't an issue, since it probably happens at a high enough frequency. Thanks for any help.
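
On the frame-pacing question in posts 4 and 5: a minimal Python sketch (my own illustration; the present_times helper is hypothetical) of why a steady 85 fps still stutters on a fixed 144 Hz display, assuming VSync-style presentation with no VRR. Each frame is shown at the first refresh tick after it finishes rendering, so frames persist for an uneven mix of one and two refresh intervals, while an even divisor such as 72 fps paces perfectly.

    import math

    def present_times(fps: float, refresh_hz: float, n_frames: int = 12):
        """How long each frame stays on screen, measured in refresh ticks."""
        frame_time = 1.0 / fps      # seconds between finished frames
        tick = 1.0 / refresh_hz    # seconds between display refreshes
        ticks_on_screen = []
        prev_tick = 0
        for i in range(1, n_frames + 1):
            ready = i * frame_time
            # The frame is scanned out at the first refresh tick after it is ready.
            tick_index = math.ceil(ready / tick)
            ticks_on_screen.append(tick_index - prev_tick)
            prev_tick = tick_index
        return ticks_on_screen

    print("85 fps on 144 Hz:", present_times(85, 144))  # uneven 1s and 2s -> judder
    print("72 fps on 144 Hz:", present_times(72, 144))  # even 2s -> smooth
    print("60 fps on  60 Hz:", present_times(60, 60))   # even 1s -> smooth

This is why capping to an even divisor of the refresh rate (72 or 60 on a 144 Hz panel, after setting the monitor to a matching mode) feels smoother than an uncapped 85 fps, and why VRR (G-Sync/FreeSync) sidesteps the problem by refreshing whenever the frame is ready.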
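On the temporal-dithering question in posts 6 and 7: a minimal sketch of FRC-style temporal dithering (my own illustration; frc_sequence is a hypothetical helper, not any GPU vendor's actual algorithm). An 8-bit pipeline approximates a 10-bit value by alternating between the two nearest 8-bit codes across successive frames so that the time-average lands on the target. That per-frame toggle is the subtle flicker being asked about; a source outputting true 8-bit to a true 8-bit panel never needs it, because every requested code exists natively.

    def frc_sequence(target_10bit: int, n_frames: int = 8) -> list[int]:
        """8-bit codes shown over n_frames to approximate one 10-bit value."""
        base, frac4 = divmod(target_10bit, 4)  # 8-bit code plus a 2-bit fraction
        # Show base+1 on frac4 out of every 4 frames, and base on the rest.
        return [min(base + 1, 255) if (i % 4) < frac4 else base
                for i in range(n_frames)]

    seq = frc_sequence(513)            # 10-bit 513 sits between 8-bit 128 and 129
    print(seq)                         # [129, 128, 128, 128, 129, 128, 128, 128]
    print(sum(seq) / len(seq) * 4)     # time-average in 10-bit units: 513.0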