
Are 50 Hz displays still a thing?

JoostinOnline
Solved by mariushm


I remember back when analog connections were standard, 50 Hz was normal for PAL regions and 60 Hz for NTSC. I never see anyone here talking about 50 Hz monitors. Did that die with analog (or something else), or is it still a thing?

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


Monitors weren't tied to the TV refresh rates. Even in those days I think I was running the highest rate supported by both the display and GPU, typically 75 Hz, but my memory might be off on that.

 

I just looked up the model I had, the Sony Multiscan SF II 17". It supported up to 120 Hz, but I don't remember using anything that high. At a certain point it had to go into interlaced mode, so I had to operate below that. I still have it in my storeroom and occasionally think about digging it out for some retro feels.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Monitors aren't TVs to begin with.

Personal Desktop:

CPU: Intel Core i7 10700K @ 5 GHz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490 UD ATX |~| RAM: 16 GB DDR4 3333 MHz CL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot: SSD WD Green M.2 2280 240GB |~| Storage: 1x 3TB HDD 7200 rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60 Hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

I looked further in my monitor manual, dated 1995. My memory wasn't that far out; the standard modes it supported included:

640x480 @ 60 Hz

800x600 @ 85 Hz

1024x768 @ 75 Hz

1280x1024 @ 60 Hz

 

I think I was running 1024x768, as the clarity of the image degraded at 1280x1024. So even then, 60 Hz was a base standard. It looks like to run at its maximum of 120 Hz you'd have to create a custom resolution at 640x480 or lower. I don't think I ever tried.
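For what it's worth, that lines up with the usual scan-rate math: the refresh ceiling at each resolution is roughly the monitor's maximum horizontal scan frequency divided by the total line count. Here's a small sketch of that; the 65 kHz limit and the ~5% vertical blanking overhead are assumed values for a typical mid-90s 17" CRT, not figures from the Sony manual:

```python
# A CRT's maximum progressive refresh rate at a given resolution is set mostly
# by its horizontal scan frequency limit: refresh ~= h_scan_limit / total_lines.
# The 65 kHz limit and ~5% line blanking overhead are assumptions for a typical
# mid-90s 17" monitor, not values taken from the Sony Multiscan manual.

H_SCAN_LIMIT_KHZ = 65.0

def max_refresh_hz(vertical_resolution, blanking_factor=1.05):
    """Rough ceiling on progressive refresh rate for a given vertical resolution."""
    total_lines = vertical_resolution * blanking_factor
    return H_SCAN_LIMIT_KHZ * 1000.0 / total_lines

for lines in (480, 600, 768, 1024):
    print(f"{lines} visible lines: ~{max_refresh_hz(lines):.0f} Hz maximum")
```

With those assumed numbers you get roughly 130 Hz at 480 lines, 80 Hz at 768 lines and 60 Hz at 1024 lines, which is consistent with the standard mode list above and with 120 Hz only being reachable at 640x480 or below.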



28 minutes ago, JoostinOnline said:

I remember back when analog connections were standard, 50 Hz was normal for PAL regions and 60 Hz for NTSC. I never see anyone here talking about 50 Hz monitors. Did that die with analog (or something else), or is it still a thing?

Both of my monitors (75 Hz and 60 Hz) can be tuned down to 50 Hz. It's not officially a thing; it would be difficult to find a monitor that is listed as 50 Hz, but I think a lot of them can do it, and you can even make a custom resolution if it isn't supported out of the box.
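For anyone curious why that almost always works: a 50 Hz custom mode needs less bandwidth than the 60 Hz mode the monitor already runs. On Linux people typically generate such a mode with cvt and xrandr, on Windows through the GPU driver's custom resolution panel. A quick sanity check of the numbers (the blanking figures below are approximate CVT reduced-blanking values, used purely for illustration):

```python
# Why a 50 Hz custom resolution is usually trivial for a 60 Hz monitor:
# it needs a *lower* pixel clock than the 60 Hz mode the panel already accepts.
# Blanking totals below are approximate CVT reduced-blanking values, used here
# purely for illustration.

H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL = H_ACTIVE + 160   # ~160 px of horizontal blanking (reduced-blanking style)
V_TOTAL = V_ACTIVE + 31    # ~31 lines of vertical blanking (approximate)

def pixel_clock_mhz(refresh_hz):
    """Approximate pixel clock for a 1920x1080 mode at the given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (50, 60):
    print(f"1920x1080 @ {hz} Hz -> ~{pixel_clock_mhz(hz):.1f} MHz pixel clock")
```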

Quote or Tag people so they know that you've replied.


The TV broadcast standards used 50 Hz and 60 Hz in order to make manufacturing of TVs easier ... the AC mains frequency was used as a timing reference, to simplify the circuitry inside the TV and make sets cheaper.

Some of the earliest computers used TVs for display via RF outputs, moving later to composite, S-Video, then CGA, EGA, VGA...

 

By the time EGA and VGA were a thing, the price of components got low enough that it was possible to have variable refresh rates. The majority of computer CRT monitors supported a wide range of refresh rates, from something like 48 Hz (24 fps interlaced) to 160+ Hz.

The peak Hz was limited by the performance of the RAMDAC on the video card (the digital-to-analogue converter) and how fast the electronic circuitry in the monitor was ... so for example, you could have 1024x768 at 85 Hz, but you could have 320x240 at 200 Hz ... as long as the bandwidth was below what the RAMDAC on the video card could output, and what the monitor could pick up, it would work.
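To make that bandwidth limit concrete, here is a back-of-the-envelope sketch; the ~1.35x blanking overhead and the 250 MHz RAMDAC limit are assumed example figures, not numbers from this post:

```python
# Rough pixel-clock estimate for a CRT mode, and whether an example RAMDAC
# could drive it.  The ~1.35x blanking overhead and the 250 MHz limit are
# illustrative assumptions, not specs from any particular card.

BLANKING_OVERHEAD = 1.35      # extra time per line/frame for the beam to retrace
RAMDAC_LIMIT_MHZ = 250.0      # example RAMDAC rating of a late-90s video card

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock (MHz) needed for a progressive-scan CRT mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1024, 768, 85), (320, 240, 200), (1600, 1200, 100)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= RAMDAC_LIMIT_MHZ else "exceeds"
    print(f"{w}x{h} @ {hz} Hz needs ~{clk:.0f} MHz -> {verdict} a {RAMDAC_LIMIT_MHZ:.0f} MHz RAMDAC")
```

The point is simply that a low resolution at a very high refresh rate can need less bandwidth than a high resolution at a modest refresh rate, which is why odd combinations like 320x240 at 200 Hz were possible.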

 

60 Hz was kept for backwards compatibility, but higher refresh rates were implemented because it was determined that 60 Hz can cause eyestrain, headaches and other issues due to the fluctuations of light intensity on a CRT screen. 75-85 Hz was found to be the sweet spot where the majority of people no longer notice the flicker, the fluctuation in brightness that causes eyestrain or headaches.

This was not a problem with TVs, because TVs were much bigger, usually sat further from your eyes, and the resolution was much lower (720x480 or 720x576).

 

When LCD monitors (and plasma displays) came along, this refresh rate concern was no longer an issue, because once you set an LCD pixel, it doesn't have to be "refreshed" constantly to maintain its brightness, as was the case with CRT monitors. So there's no flicker, no eyestrain, no headaches, no need to update the panel at least 75-85 times a second.

If you're watching a movie, it's enough to update the LCD panel 24 or 30 times a second and it works perfectly fine - in fact, a lot of LCD monitors actually support 24 or 30 Hz refresh rates.
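As a side note on why a native 24 Hz mode is nice to have (this cadence illustration is mine, not from the post above): on a 60 Hz panel, 24 fps film frames have to be held for an uneven alternating 2/3 refreshes, while at 24 Hz every frame is held equally long.

```python
# How 24 fps film maps onto a panel refresh rate.  At 60 Hz each film frame is
# held for alternately 2 and 3 refreshes (3:2 pulldown judder); in a native
# 24 Hz mode every frame is held for exactly one refresh.

def frame_hold_counts(panel_hz, film_fps=24, seconds=1):
    """How many consecutive panel refreshes each film frame stays on screen."""
    refreshes = panel_hz * seconds
    frames = film_fps * seconds
    holds, shown = [], 0
    for f in range(1, frames + 1):
        until = int(f * refreshes / frames)   # last refresh assigned to frame f
        holds.append(until - shown)
        shown = until
    return holds

print("60 Hz panel:", frame_hold_counts(60)[:8])   # [2, 3, 2, 3, 2, 3, 2, 3]
print("24 Hz mode: ", frame_hold_counts(24)[:8])   # [1, 1, 1, 1, 1, 1, 1, 1]
```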

The industry decided to stay with 60 Hz, because that way a factory that makes LCD panels for monitors can also make LCD panels for TVs, which have to update 60 times a second, because that's what a 60 Hz TV broadcast sends: 60 fields (images) per second.

 


5 minutes ago, mariushm said:

75-85 Hz was found to be the sweet spot where the majority of people no longer notice the flicker, the fluctuation in brightness that causes eyestrain or headaches.

This was not a problem with TVs, because TVs were much bigger, usually sat further from your eyes, and the resolution was much lower (720x480 or 720x576).

I had forgotten about this until you mentioned it, but they used different phosphor characteristics in TVs than in monitors designed to operate at higher refresh rates. The phosphor keeps emitting light for some time after it is excited, and this contributes to the appearance of a constant image. With a fixed 50 or 60 Hz, this is easy to tune. On a monitor with a variable refresh rate, you don't want a long persistence, as image changes would get blurry. So it has to be tuned for the upper end of the refresh range. At the lower end, the persistence is then shorter than ideal, and flickering can become a factor.
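A toy model of that trade-off (the exponential decay and the time constants here are purely illustrative assumptions, not real phosphor specs): the longer the refresh period relative to the phosphor's decay, the deeper the brightness dip before the next refresh, and hence the more visible the flicker.

```python
# Toy model of phosphor persistence: assume brightness decays roughly
# exponentially between refreshes.  The residual brightness just before the
# next refresh indicates how much the image "dips" (i.e. how much it flickers).
# Time constants below are illustrative guesses, not measured phosphor data.

import math

def residual_brightness(refresh_hz, persistence_ms):
    """Fraction of peak brightness remaining just before the next refresh."""
    period_ms = 1000.0 / refresh_hz
    return math.exp(-period_ms / persistence_ms)

for label, persistence_ms in [("long-persistence TV phosphor", 15.0),
                              ("short-persistence monitor phosphor", 2.0)]:
    for hz in (50, 60, 85):
        r = residual_brightness(hz, persistence_ms)
        print(f"{label:35s} @ {hz:3d} Hz: {r * 100:5.1f}% brightness left")
```

In this sketch the long-persistence phosphor still holds a good fraction of its brightness between 50 Hz refreshes (little flicker, but smearing on motion), while the short-persistence one has essentially gone dark, which is why low refresh rates flicker on a monitor tuned for high refresh.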



@circeseye  All of those also support 60 Hz, for Blu-ray playback, playing videos from USB sticks or the network, connecting a PC, etc. ... the 50 Hz is just the default broadcast refresh rate in the countries where the monitor is sold.

 

