Had this realization that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz, 120 Hz, 240 Hz, 360 Hz…

And in the middle, 144 Hz. Is there a reason why all the others follow this multiple-of-60 rule, and if so, why is 144 Hz the exception?

  • CubbyTustard@reddthat.com
    1 year ago

    OP, the top comment of this Reddit thread sums it up pretty nicely:

    60 because that’s the frequency of the North American power grid, which became the timing source for analog television. See this video from Technology Connections for details about how that worked. It was the standard, and standards die hard. 120 because it’s 60 doubled, the next logical step. As a bonus, it’s divisible by 24, so you can watch cinema-standard 24 fps content without some frames staying on screen longer than others (which was an issue on analog TV; see 3:2 pulldown). 144 because it’s bigger than 120 for marketing, and it’s the next number divisible by 24. 240 is 120 doubled. You can probably spot the pattern.

    There are also monitors with other refresh rates (75, 165, 175, 200…) because we no longer need strict TV or cinema compatibility, but the existing standards still rule most of the market.
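
    The divisibility argument in that quoted comment is easy to check with a bit of arithmetic. Here is a minimal Python sketch (the helper name is my own, not from the comment) that computes how many display refreshes each 24 fps film frame occupies at a given refresh rate: a whole number means every frame is shown for the same duration, while a fraction means uneven frame times, which is what 3:2 pulldown works around on 60 Hz.

    ```python
    from fractions import Fraction

    def refreshes_per_film_frame(refresh_hz: int, film_fps: int = 24) -> Fraction:
        """How many display refreshes one film frame occupies, as an exact ratio."""
        return Fraction(refresh_hz, film_fps)

    for hz in (60, 120, 144, 240):
        ratio = refreshes_per_film_frame(hz)
        even = ratio.denominator == 1  # whole number => every frame held equally long
        print(f"{hz:3d} Hz: {ratio} refreshes per frame "
              f"({'even pacing' if even else 'uneven, needs pulldown'})")
    ```

    At 60 Hz the ratio is 5/2, so frames must alternate between 3 and 2 refreshes (the 3:2 pulldown the comment mentions); at 120, 144, and 240 Hz the ratio comes out to a whole 5, 6, and 10 refreshes per frame.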