As an Amazon Associate we may earn from qualifying purchases made via links on our website.

Refresh Rate TV 50(60)Hz vs 100(120)Hz vs 200(240)Hz explained

Refresh rate is the number of times per second a TV redraws the image, measured in hertz (Hz). In TV specs, however, you will meet different pairs of values: 50Hz/60Hz and 100Hz/120Hz. In short, there is no practical difference between 50 Hz and 60 Hz, or between 100 Hz and 120 Hz; they are the same, and the two labels exist only as a convention for different regions.

50Hz vs 60Hz TV, 100Hz vs 120Hz

TV spec sheets are full of confusing, often synthetic parameters that tell you, as a user, very little. The one that really matters is the refresh rate, which determines how many frames the display can show per second.

What is Refresh Rate

Refresh rate is the frequency at which the display changes frames each second. It reflects the display's actual frame rate and depends on the quality of the panel used.
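To make the relationship concrete, here is a minimal sketch (in Python, purely for illustration) of how refresh rate translates into the time each frame stays on screen:

```python
# Each refresh cycle lasts 1/Hz seconds, so a higher refresh rate
# means each frame is held on screen for a shorter time.
for hz in (50, 60, 100, 120):
    frame_ms = 1000 / hz  # frame duration in milliseconds
    print(f"{hz} Hz -> {frame_ms:.2f} ms per frame")
```

At 60 Hz a frame is held for about 16.67 ms; at 120 Hz, only about 8.33 ms, which is why higher-refresh panels can show smoother motion.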

Before 2017, TVs supported video only up to 60Hz, because the consumer standards in force at the time, including the HDMI ports, carried no more than 60Hz video.

Difference between 50Hz vs 60Hz TV explained

However, some TVs list 50 Hz support in their specs. How does 60 Hz video playback work in that case? Very simply: there is no problem, because all such TVs are designed to display 60 Hz video as well.

Difference Between 50hz and 60hz TV: the history of different frequency standards

Historically, the AC mains frequency standard was 50 Hz in Europe and 60 Hz in the USA. The first analog CRT (kinescope) TVs used the mains frequency for picture synchronization, because it was stable and held within a strictly limited range.

Today's TVs have long since stopped using the mains frequency for picture synchronization, but the notion of 50 Hz for Europe and 60 Hz for the US is so ingrained in consumers' minds that manufacturers keep using these figures to describe a TV's frame rate.

In practice these are only labeling differences: any 50 Hz TV will also support 60 Hz. So you just have to understand that they are the same thing.

Difference between 100 Hz vs 120 Hz TV

100/120Hz is a reserve for the future: in 2017, a new standard for HDMI ports (HDMI 2.1) was adopted that supports 4K images and 120Hz video. Among other things, this makes it easier to use the TV as a monitor.

However, just as with 50Hz/60Hz, there is no difference between 100Hz and 120Hz: European models indicate 100Hz support, while a TV intended for the U.S. market is labeled 120Hz.
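One real benefit of a 100/120Hz panel is cadence: each source frame can be held for a whole number of refresh cycles only when the refresh rate is an integer multiple of the source frame rate; otherwise frames must be held unevenly (e.g. 3:2 pulldown for 24fps film on a 60Hz display), which causes judder. A small illustrative check, with the function name `even_cadence` being my own:

```python
# A display can repeat each source frame evenly only if refresh_hz is
# an integer multiple of the source frame rate.
def even_cadence(source_fps: float, refresh_hz: float) -> bool:
    ratio = refresh_hz / source_fps
    return abs(ratio - round(ratio)) < 1e-9

for fps in (24, 25, 30, 50, 60):
    print(f"{fps} fps: 60 Hz even={even_cadence(fps, 60)}, "
          f"120 Hz even={even_cadence(fps, 120)}")
```

For example, 24fps film fits unevenly into 60 Hz (60/24 = 2.5) but perfectly into 120 Hz (120/24 = 5), so a 120Hz TV can play film without pulldown judder.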

Also note that some TV models have frame-rate-increase (motion interpolation) modes, which create additional, interpolated picture frames; in practice, though, about 98% of users never turn these modes on.
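Conceptually, such modes synthesize an in-between frame from two real ones. Real TVs use motion estimation, but the simplest possible sketch is a linear blend of neighboring frames (the function and pixel lists below are purely illustrative):

```python
# Minimal sketch of frame interpolation: blend two frames linearly.
# Real TVs use motion-compensated interpolation, which is far more complex.
def blend_frames(frame_a, frame_b, t=0.5):
    """Return an intermediate frame as a weighted average of two frames,
    where each frame is a flat list of pixel intensities (0-255)."""
    return [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

mid = blend_frames([0, 100, 200], [100, 100, 0])
print(mid)  # [50, 100, 100]
```

Inserting such blended frames between the real ones is how a 60fps signal can be presented as, say, 120 "frames" per second, at the cost of possible artifacts (the "soap opera effect").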

To conclude: by buying a TV that supports 100(120) Hz, you get a faster display, which has a positive effect on motion clarity and overall picture quality.


  1. In Europe TV channels use a 50Hz frame rate (or half that, to be more accurate, with interlacing), so having a matching refresh rate on the TV improves the quality of motion. Even YouTube has many 50fps videos.

    HDMI 1.3 was released in 2006. It supports 1920 × 1080 at 120 Hz, so I see no reason why TVs couldn’t have supported more than 60 Hz refresh rates before 2017.

