Production Expert


Timecode - Part 3 - The Origins Of Broadcast Television Frame Rates

In part 2 of this series on timecode and frame rates we discussed how projected film eventually settled on a 24fps standard, and how multiple-bladed shutters were needed to overcome the flicker that occurred below Edison’s 46fps ‘persistence of vision’ threshold.

Moving on to the 1930s, at the dawn of the television age, engineers faced new technological challenges in bringing coherent images and sound to this new medium broadcast over the airwaves. Broadcasting at the film standard of 24fps was not an option, as noticeable flicker would still be present. Add to that the fact that transmission bandwidth limitations, coupled with early television technology, would not allow the same frame to be shown multiple times, and a different solution had to be found.

'Cometh The Hour, Cometh The Engineer'

To conserve bandwidth and avoid flicker, engineers at Telefunken and RCA devised a system whereby each frame of video would be interlaced.

To explain what this means, we should first quickly cover the prevalent screen technology of the time: the Cathode Ray Tube, or CRT for short. CRT screens worked by shooting a highly charged, focussed beam of electrons from the back of the television set towards the phosphor-coated glass of the screen. This energy caused the phosphor particles to emit light in the form of photons.

The electron gun would sweep from left to right across the screen, effectively drawing the picture one horizontal line at a time (the number of lines, and therefore the perceived resolution of each frame, differed between broadcast regions).

Interlacing involved splitting each frame of picture in half into two separate fields, containing its odd and even horizontal lines respectively (referred to as the upper and lower fields). Broadcasters would transmit only one field at a time, effectively halving the bandwidth required. The CRT television would then draw these fields across the screen one at a time in a comb-like fashion, with the two combined equalling one whole frame of picture.
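The split-and-recombine process described above can be sketched in a few lines of Python. This is a minimal illustration, not broadcast code: a frame is modelled simply as a list of scan lines, and the function names are my own.

```python
def split_into_fields(frame):
    """Split a frame into its upper (odd-numbered lines, counting from 1)
    and lower (even-numbered lines) fields."""
    upper = frame[0::2]  # lines 1, 3, 5, ...
    lower = frame[1::2]  # lines 2, 4, 6, ...
    return upper, lower

def weave_fields(upper, lower):
    """Recombine two fields, comb-fashion, back into one full frame."""
    frame = []
    for u, l in zip(upper, lower):
        frame.extend([u, l])
    return frame

# A tiny six-line "frame" to demonstrate the round trip.
frame = [f"line {n}" for n in range(1, 7)]
upper, lower = split_into_fields(frame)
assert weave_fields(upper, lower) == frame  # two fields rebuild the frame
```

Each field carries only half the lines, which is exactly why transmitting one field at a time halved the bandwidth required.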

An example of interlacing two fields together to create one single frame

To get around the picture distortion that would have been caused by interference from the incoming electrical current, it was decided to synchronise the refresh rate of these fields to that of the AC mains power, which in the US was 60Hz.

A 60Hz refresh rate meant that 60 fields per second were broadcast, overcoming the 46fps persistence of vision barrier and resulting in 30 full frames of picture per second.

This rate of 30fps became the standard adopted for black and white television broadcast in the US by the National Television System Committee (NTSC), and is sometimes referred to as 60i in a nod to its interlaced nature.

European broadcast engineers, meanwhile, developed their own systems that synced to the European mains rate of 50Hz, resulting in 50 interlaced fields broadcast per second and a black and white frame rate of 25fps (again, sometimes referred to as 50i).

Everything was running smoothly until colour broadcasting came along.

You can't please everybody... or can you?

There had been plans to create an entirely new colour broadcast standard using the newly available UHF spectrum of frequencies (300–3000MHz). However, in the few short years that the new standard was being considered, television ownership in the US had exploded from just one million to over ten million sets. To suddenly make all those consumers’ newly acquired technology redundant overnight would not have been a popular move.

The stage was set for the NTSC to collaborate with RCA to create a new colour television standard within the existing VHF frequency bands (30–300MHz), one that would be backwards compatible with all the existing black and white televisions in circulation.

It was found that by splitting the colour signal into luminance (black and white information) and chrominance (colour information), you could just about ‘shoehorn’ the extra colour signal (referred to as the chroma carrier) into the existing bandwidth. This was not without compromise, however, as the chroma carrier sat very close to the existing audio information in the transmission signal. To avoid potential interference between these two distinct signals, it was necessary to slow the broadcast frame rate down by 0.1% — this ensured the chroma carrier and audio signal remained out of phase with each other, maintaining their integrity.

A detailed overview of the NTSC broadcast spectrum

Slowing the frame rate down by 0.1% meant that 59.94 interlaced fields were now broadcast per second, resulting in a new standard frame rate for colour television in the US of 29.97fps.
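For the curious, the "0.1% slowdown" is more precisely an exact factor of 1000/1001 — a detail I'm adding here for illustration, since the article rounds it off. A quick check with Python's exact fractions shows how it yields the familiar 59.94 and 29.97 figures:

```python
from fractions import Fraction

# The NTSC colour slowdown as an exact ratio (approximately a 0.1% reduction).
slowdown = Fraction(1000, 1001)

field_rate = 60 * slowdown   # 60000/1001 fields per second
frame_rate = 30 * slowdown   # 30000/1001 frames per second

print(round(float(field_rate), 2))  # 59.94
print(round(float(frame_rate), 2))  # 29.97
```

Note that 29.97 is itself a rounding: the true colour frame rate is the non-terminating decimal 30000/1001, which is the root of the timecode headaches covered in Part 4.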

European broadcasters, meanwhile, were able to devise their own new colour television standards (namely PAL, developed by Telefunken in Germany, and SECAM, developed by the French) which overcame the bandwidth issues that plagued NTSC and could accommodate the chroma carrier without modification, thus maintaining 25fps as the frame rate.

Join me in Part 4, where we discuss why broadcasting at 29.97fps caused further headaches and why entirely new timecode solutions had to be invented to maintain a workable relationship between frame count and actual time.
