A colleague and I are having a disagreement about whether LCD panel
pixel latency (e.g. 12ms) can be expressed as a 'refresh rate' (e.g.
83Hz). His argument is as follows:
If a display has a 16ms response time, it means the screen takes 16ms
to update the appearance of each pixel. So there's a lag of 16ms
between the screen displaying one frame and being ready to display the
next.
The frequency of a screen (measured in Hertz) refers to the number of
times its image can be refreshed in a second. In the world of CRTs,
this means how many times the electron gun redraws the screen per
second. For LCDs, it's the maximum number of different images that can
be displayed on the screen per second.
There are 1000 milliseconds in a second, so a 16ms panel can refresh
the entire screen 1000/16 (around 63) times per second. The latency of
each pixel simply prevents the screen from displaying images any
faster, which gives an effective "refresh rate". In the case of a 12ms
screen, the rate is 1000/12 (around 83) times per second.
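Just to make his arithmetic concrete, here is a minimal sketch of the
conversion he is proposing (the function name is mine, and it only
restates his claim that response time is the sole limit on frame rate,
not a verified fact):

    # Sketch of my colleague's proposed conversion: treat pixel response
    # time as the only limit on how many distinct frames the panel can
    # show per second.
    def notional_refresh_rate_hz(response_time_ms: float) -> float:
        """Convert a pixel response time in ms to a 'refresh rate' in Hz."""
        return 1000.0 / response_time_ms

    print(notional_refresh_rate_hz(16))  # 62.5  -> his "around 63" Hz
    print(notional_refresh_rate_hz(12))  # 83.3  -> his "83" Hz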
Is his argument correct? Is a CRT monitor's refresh rate truly the
limit of how many frames per second it can display? And can LCD pixel
latency therefore really be converted to a notional 'refresh rate'?