Analog Component Video

Almost everything you might want to know about "component video" and the three-jack cluster color coded red, green, and blue.

Factoids

Y/Pb/Pr, or what we nowadays refer to as component video or color difference video, was invented to maintain compatibility with black and white TV, simplify video electronics and reduce the overall bandwidth requirements for transmitting video compared with RGB. In practice it provides one luminance signal with full horizontal resolution and two color signals with reduced horizontal resolution.

Component video was first used in the early days of color TV, back in the 1950's, as an intermediate step in video processing.

From your DVD player or HDTV set top box to your TV, it is usually analog, thus its full name "analog component video".

Also referred to as Y, R-Y, B-Y or color difference video. Some DVD players label the green, blue, and red jacks Y, Cb, and Cr but for now treat it as Y/Pb/Pr.

Not to be confused with composite video, which is transmitted using a single cable plugged into jacks usually colored yellow and usually simply labeled "video in" or "video out".

Compared to RGB, analog component video is vaguely red, white, and blue. Converting from one to the other requires circuitry, not just a cable.

Inside the TV, component video (and also S-video, etc.) is always eventually converted to RGB.

Be sure you have the cables properly matched to the jacks; that is why they are usually color coded: green for Y, red for Pr, blue for Pb.

Component video can give better resolution of side by side color details compared with S-video because there is more bandwidth available for color information.

Component video comes in different non-interchangeable formats (scan rate formats) for regular TV or HDTV, for example:

Interlaced or 480i from a standard NTSC DVD player,

480p from a progressive scan NTSC DVD player.

1080i or 720p HDTV.

Also in PAL formats.

Sometimes the same red-green-blue jack cluster on the back of the TV accepts more than one of the scan rate formats described above and the TV automatically selects the proper one. Sometimes you must select it manually. Sometimes separate clusters are provided. If the TV handles both RGB and component video using the same jack cluster, you must always make this selection manually, otherwise you may get color errors such as a greenish picture.

High grade A/V receivers and switch boxes for component video can handle any of the above formats one at a time. Lesser grade equipment may degrade the picture quality due to insufficient video bandwidth; HDTV suffers first, progressive scan next.

In a pinch you can use an audio/video cable set for component video. Use the yellow cable for Y, the red cable for Pr, and the white cable for Pb. We cannot promise there will be no degradation, such as colors shifted as in a poorly done child's coloring book, but a cable run of less than 6 feet should not be a problem.

In a pinch you can use an unpowered audio/video switch box to run several component video sources into the same jack cluster on your TV. Again, use the yellow or video jacks on the switch box for the Y cables.

SCART refers only to the style (shape) of jacks and plugs. Although usually used with RGB, SCART jacks can be used for S-video or component video also.

Unfortunately there is a hodgepodge of standards for the exact definitions of Y, Pb, and Pr. Picture quality loss occurs if a different formula is used to recreate RGB at the receiving end compared with what was used during video source production.

We will ignore spelling differences such as Y/Pb/Pr versus Y/Pr/Pb versus Y-Pb-Pr.



Component Video Description and History

"Component video", broadly speaking, refers to video transmitted as three separate signals (subsignals if you prefer) to represent all colors. The first component video was RGB since the three signals represented pure red content, pure green content, and pure blue content respectively which represent the sensitivities of the three kinds of color receptors in the  human eye.

Today, most video experts use the term "component video" as short for "analog component video" consisting of the three signals Y (luminance), Pr or V or R-Y, and Pb or U or B-Y. For NTSC or PAL (interlaced video formats) the Y signal is the same as that used to construct composite video or that found in S-video. Component video could carry the full resolution and quality of RGB, but then it would have no reason to exist. Instead, component video was intended to carry, and in practice carries, full resolution for Y and lesser resolution for color, as will be described later on.

We could say that red, white, and blue formed the root of component video. Any three colors could be used to transmit video in full color. For compatibility with black and white TV, luminance (or "white") had to be one of the "colors". For simplicity, two chosen from RGB could be the other two, say red and blue. To optimize the construction of a video signal for transmission, and the re-derivation of RGB in the TV set, it was necessary that the color components vanish (tend to zero) as the actual "color" for a spot on the screen tended towards white or gray. Subtracting luminance (Y) from each of the color components, yielding R-Y (Pr) and B-Y (Pb) respectively, accomplishes this. Luminance continues to represent white/black content. When Pb is zero, the color ranges from a magenta-ish red to a greenish cyan, so one might say that Pr, which represents an axis 90 degrees removed from Pb, represents those colors. When Pr is zero, the color ranges from a purplish blue to a greenish yellow, so you could say that Pb represents content in that range. (R itself represents red to black content.) As the electron beam sweeps across the screen, Y, Pr, and Pb are constantly changing, each providing a single numeric value at any given point in time. The three signals are electronically mixed, or you could say their mathematical formula equivalents are combined, to yield three different numbers (or voltages) for R, G, and B, respectively, to specify the color at any given spot on the screen.

While DVD and U.S. HDTV are digital formats, (analog) component video connections from the tuner box to the TV set were the rule in 2005 and still common in 2007. For some "digital" TV sets with cathode ray tube(s), the entire video signal path inside (except the comb filter and line doubler used for "non-digital" sources) is also analog.

All TV sets eventually convert the incoming video signal(s) into RGB. Most TV sets today actually convert all incoming analog broadcasts into component video as an intermediate step.

The term Y/Cb/Cr stands for the same three video signals but in digital form. Sometimes jacks on DVD players and other equipment are labeled Y, Cb, and Cr when Y, Pb, and Pr was intended.

We will continue to use the term "analog component video" or ACV rather than "component analog video" ...

1.  Because we have used the term ACV here for quite some time,

2.  To avoid confusion with the laserdisk format term CAV for "constant angular velocity".

RGB requires full horizontal resolution for each of the color components. Because the required bandwidth could not be transmitted on the broadcast channels defined and allocated, some "compression" had to be done. Taking advantage of the human eye's lesser sensitivity to incorrect coloration of fine details, the compression took the form of reduced horizontal color resolution using two color components while luminance as the third "color component" carried the full horizontal resolution.

In an analog world, it is easy to let one signal vary from zero to perhaps two volts to represent different shades of red, a second signal represent shades of green, and a third signal represent shades of blue. Together they produce all the colors needed for a realistic picture even if the reds were limited to about 100 values or steps and the same for the green and the blue. But 100 steps for each color means one million possible combinations of all three. Simple analog electronics cannot accurately distinguish a million different voltage steps for a single video signal, thus three signals are used. (Computer video, DVD, and U.S. DTV /HDTV use up to 256 steps per color instead of 100.)

For the original NTSC video, Pb and Pr were converted into two different color components, I and Q, if the latter were not generated directly from the RGB from the camera. The same color wheel principle was used. When Q was zero, the color (and therefore what I represented) ranged from reddish orange to greenish blue. When I was zero, the color ranged from yellowish green to purple. If Pr is roughly red (it is), then I is roughly "P-orange" and Q is roughly "P-green". (Some test patterns show the inverses, greenish blue for I and magenta for Q.) These color representations were chosen to better match the human eye's color sensitivity, which is a little greater for reddish oranges and greenish blues. Accordingly the I signal was given a greater bandwidth, allowing for finer color details than the Q signal although not as much detail as the Y signal.

YIQ is still used (and required) for constructing the NTSC broadcast signal. Due to the expense of correct YIQ decoding circuitry, TV set manufacturers quickly started offering sets that did not deliver the full color resolution of the YIQ encoding. Decoding of the color signal into Y, Pb, and Pr (and also a few other component formats) became common almost as soon as color broadcasting in the formally adopted NTSC standard began in the early 1950's. Pb and Pr are easier to work with than I and Q, and produce a picture almost the same as using I and Q except that the color resolution of oranges and blues is reduced to be equal to that of other colors (a color transition taking 2% of the screen width).

It was not until the mid 1990's that DVD and digital TV made it possible to store and transmit video at reasonable expense with greater resolution, both in luminance and in color. Then analog component video connections between devices (DVD players, TV sets, etc.) started to find their way into consumer video applications. The most common standard is for Pb and Pr to each have half the bandwidth, which translates to half the horizontal resolution, of the luminance signal. For DVD this is 270 TV lines per picture height on a 4:3 screen, which is twice the horizontal color resolution achievable using S-video. Incidentally, both DVD and U.S. HDTV limit color resolution further to one color pixel per 2x2 block of luminance pixels. This means that every other scan line needs to have its color synthesized from its neighbors just before the video is converted to analog and the component video output is generated, as sketched below.
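As a rough illustration of that last point, here is a minimal sketch (in Python, not taken from any particular player's design) of synthesizing the missing chroma lines by simple averaging of the stored lines above and below; real equipment may use more elaborate interpolation.

    def upsample_chroma_lines(chroma):
        """Expand a 4:2:0 chroma plane (one stored row per 2 luma rows) so that
        every scan line has a color value.  Missing lines are synthesized by
        averaging the stored line above and below; the last line is repeated."""
        out = []
        for i, row in enumerate(chroma):
            out.append(row)                          # stored line, used as-is
            if i + 1 < len(chroma):                  # synthesized in-between line
                nxt = chroma[i + 1]
                out.append([(a + b) / 2.0 for a, b in zip(row, nxt)])
            else:
                out.append(list(row))                # bottom edge: repeat
        return out

    # Example: two stored chroma lines become four output lines.
    print(upsample_chroma_lines([[10, 20], [30, 40]]))
    # [[10, 20], [20.0, 30.0], [30, 40], [30, 40]]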


Component Video Math and Formulas


Geometrically, all the colors can be represented by a color space in the form of a wheel: red, magenta, purple, blue, cyan, green, yellow, orange, and back around to red. I and Q represent axes (lines crossing the wheel hub) 90 degrees apart. I crosses through reddish orange and greenish blue. Q crosses through yellowish green and purple. Pb as an axis crosses through a purplish blue and Pr crosses through a purplish red as a different pair of axes, also 90 degrees apart. YIQ and Y/Pb/Pr are referred to as different color spaces in the sense that positioning the color wheel with the I axis vertical puts the colors in different positions compared with positioning it with the Pr axis vertical. Pb and Pr were used instead of (pure) B and R because Pb and Pr tend to be zero when the actual color is gray or white, resulting in fewer color errors on less than perfect equipment. RGB is also referred to as a color space but is expressed on the color wheel as three half-axes starting from the center.

First the obvious formulas.

R = 1.00R + 0.00G + 0.00B
G = 0.00R + 1.00G + 0.00B
B = 0.00R + 0.00G + 1.00B

Since the picture is scanned one line at a time, R, G, and B keep changing as time passes. To have a cyan spot on the screen, R would be nothing (zero), G would be something (something positive), and B would be something (positive) at that moment in time. As the beam goes off the right side of the screen and is returned to the left, usually R and B go to zero and G goes negative while representing the synchronizing pulses.

The formula below for Y (luminance) is over half a century old and represented the most natural looking picture from a color camera as displayed on a black and white TV set. Y also carries the sync. pulses for composite video, S-video, and analog component video.

Y = 0.30R + 0.59G + 0.11B

B-Y = -0.30R - 0.59G + 0.89B
Pb = 0.56(B-Y)  = -0.17R - 0.33G + 0.50B
U = 0.49(B-Y) = -0.15R - 0.29G + 0.44B

R-Y = 0.70R - 0.59G - 0.11B
Pr = 0.71(R-Y) = 0.50R - 0.42G - 0.08B
V = 0.88(R-Y) = 0.62R - 0.52G  - 0.10B 

More on U and V later.

The factors 0.56 and 0.71 attenuate Pb and Pr so that they range from -0.50 to +0.50 volts while Y (not counting the sync. pulses) ranges from 0 to 1.00 volts.
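Putting the formulas above into code, a minimal sketch (in Python, using this page's rounded coefficients rather than any particular SMPTE document) of the RGB to Y/Pb/Pr conversion looks like this:

    def rgb_to_ypbpr(r, g, b):
        """Convert R, G, B (each 0.0 to 1.0) to Y, Pb, Pr using the rounded
        coefficients given above.  Y ranges 0 to 1; Pb and Pr range -0.5 to +0.5."""
        y  = 0.30 * r + 0.59 * g + 0.11 * b      # luminance
        pb = 0.56 * (b - y)                      # scaled B-Y
        pr = 0.71 * (r - y)                      # scaled R-Y
        return y, pb, pr

    # A pure white spot: both color difference signals vanish.
    print(rgb_to_ypbpr(1.0, 1.0, 1.0))   # (1.0, 0.0, 0.0) within rounding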

Other relationships:

I = 0.74(R-Y) - 0.27(B-Y)
   = 0.60R - 0.28G - 0.32B

Q = 0.48(R-Y) + 0.41(B-Y)
   = 0.21R - 0.52G + 0.31B

Given Y, Pb, and Pr we recover B-Y and R-Y by rescaling (by factors of 1/0.564 and 1/0.713 respectively). Then to reconstruct pure red we simply add equal parts of Y and R-Y, etc. Once we have pure R, Y, and B, we can reconstruct G without too much difficulty; there is no need to carry a G-Y signal which incidentally equals -0.30R + 0.41G - 0.11B.

We can also (we must be able to) reconstruct R, G, and B given Y, I, and Q (or, in general, given any three independent signals from among Y, R, G, B, Pb, Pr, I, Q, U, V), but the math can be more complicated.
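A minimal sketch of that reconstruction (again in Python, using the rounded 0.564 and 0.713 factors from this page):

    def ypbpr_to_rgb(y, pb, pr):
        """Recover R, G, B from Y, Pb, Pr by undoing the scaling and the
        luminance subtraction described above."""
        b_minus_y = pb / 0.564          # rescale Pb back to B-Y
        r_minus_y = pr / 0.713          # rescale Pr back to R-Y
        r = y + r_minus_y               # add luminance back to get pure red
        b = y + b_minus_y               # and pure blue
        g = (y - 0.30 * r - 0.11 * b) / 0.59   # solve Y = 0.30R + 0.59G + 0.11B for G
        return r, g, b

    # Round trip of a gray spot: Pb = Pr = 0, so R = G = B = Y.
    print(ypbpr_to_rgb(0.5, 0.0, 0.0))   # (0.5, 0.5, 0.5) within rounding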

Why "Color Difference Video"

R-Y is simply that, red content minus luminance. Same for B-Y. The intent is to have the color component (sub)signal vanish (go to zero) when there is no color at that spot on the screen at that instant in time (the spot being white or gray or black).

The "color" is white or gray when R, G, and B are equal to one another. We will leave it as an exercise for the reader to see tha R-Y = 0 and B-Y = 0 when R = G = B.


A Note About S-Video and Composite Video

We mentioned earlier that analog circuits could not easily represent all million or more shades of color as a single varying voltage. For S-video and composite video, a single signal (or subsignal if you prefer) represents all colors using a scale of 0 to 360. The scale represents degrees out of phase, or position around the color wheel described above. Analog circuits can represent color changes as phase shifts much more accurately than as voltage levels. It is still not perfect, for example a cyan spot on the screen may vary from bluish green to greenish blue 30 times a second or each time the electron beam passes.

Using this method, all of the color information can be carried (minus some horizontal resolution) using one amplitude modulated subsignal instead of two. This single color (sub)signal, called C, can be made equally well using I and Q, or using Pb and Pr. The C signal regardless of how it was made can be decomposed back into RGB equally well (subject to encoding and decoding bandwidth limitations) using YIQ or Y/Pb/Pr as an intermediary although using the latter is simpler. Composite video is simply the mixture of Y and C.
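As a rough sketch of the principle (not of any actual encoder circuit), the phase and amplitude of C at a given instant can be thought of as the angle and length of the vector formed by Pb and Pr:

    import math

    def chroma_phase_and_amplitude(pb, pr):
        """Treat (Pb, Pr) as a point on the color wheel: the angle around the
        wheel becomes the phase of the C subcarrier (the hue) and the distance
        from the hub becomes its amplitude (the saturation)."""
        phase_degrees = math.degrees(math.atan2(pr, pb)) % 360.0
        amplitude = math.hypot(pb, pr)
        return phase_degrees, amplitude

    # A gray spot (Pb = Pr = 0) has zero amplitude, i.e. no color at all.
    print(chroma_phase_and_amplitude(0.0, 0.0))
    # A strongly colored spot produces a definite phase angle and amplitude.
    print(chroma_phase_and_amplitude(-0.2, 0.3))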

The C signal is always modulated on a subcarrier, approx. 3.58 MHz for NTSC and 4.43 MHz for PAL. When C is demodulated, two color subsignals always result. All VCR's also record luminance and color as two components, Y and C, although the subcarrier for C is different from that used in broadcasting, and Y is also modulated on a subcarrier. With the current standards for subcarriers, no progressive scan or HDTV video signals can be encoded or transmitted using C or as S-video.



Y/Pb/Pr versus YUV versus Y/Cb/Cr

Y/Pb/Pr is used for analog applications while Y/Cb/Cr is used for digital applications. The three components for both forms may have the same content, Y as 0.30R + 0.59G + 0.11B, B-Y, and R-Y.

U and V are also based on B-Y and R-Y respectively. The difference compared with Pb and Pr is in the proportioning (normalizing factor) relative to Y:

U = 0.49(B-Y)
Pb = 0.56(B-Y)

V = 0.88(R-Y)
Pr = 0.71(R-Y)

PAL and SECAM use U and V.

For Y/Pb/Pr, all three components have the same voltage spread of about 714 millivolts, including the black pedestal (for NTSC broadcasts black is slightly above zero) but not counting the negative going sync pulse; counting the sync pulse the signal spans about 1.0 volt peak to peak.

For Y/Cb/Cr in an 8 bit system, Y typically has a digital value spread from 16 to 235, which is slightly less than for Cb and Cr, which each have a digital spread from 16 to 240.
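A minimal sketch (assuming the nominal 16-235 and 16-240 ranges just mentioned; an actual encoder follows the exact rounding and clipping rules of the relevant standard) of quantizing analog-style Y, Pb, Pr values into 8 bit Y/Cb/Cr:

    def ypbpr_to_8bit_ycbcr(y, pb, pr):
        """Quantize Y (0..1) and Pb, Pr (-0.5..+0.5) into 8 bit Y, Cb, Cr using
        the nominal ranges mentioned above: Y 16..235, Cb/Cr 16..240."""
        y8  = round(16 + 219 * y)            # 0.0 -> 16, 1.0 -> 235
        cb8 = round(128 + 224 * pb)          # -0.5 -> 16, 0.0 -> 128, +0.5 -> 240
        cr8 = round(128 + 224 * pr)
        return y8, cb8, cr8

    # Gray at 50% luminance lands at mid-scale chroma.
    print(ypbpr_to_8bit_ycbcr(0.5, 0.0, 0.0))   # (126, 128, 128)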

The terms Y/Pb/Pr, YUV, and Y/Cb/Cr are sometimes used interchangeably which is not quite correct.

*As stated in Wikipedia.org and Answers.com


A Hodgepodge of Standards

While the concept of color difference video Y/Pb/Pr, etc. is simple, unfortunately a profusion of standards has come about. The obvious problem is that, if the source material is produced using one standard and the TV set recreates its RGB using a different standard, the colors will not be quite correct.

We have already seen the difference between Pb and U, or between Pr and V, namely multiplying by some factor and/or adding some base value or bias. In order to prevent signal interference problems in the C signal and in composite video, B-Y and R-Y are both attenuated, and unfortunately not by the same factor. Consequently the color decoder in the TV set must multiply the color components by the inverse factors (that is, amplify them) so as to recover correct B-Y and R-Y, which in turn let us obtain pure blue and red as we discussed earlier. (In analog electronics, multiplying and scaling consist of simple amplification or attenuation.)

Another issue is the black pedestal. In NTSC broadcasts, if white corresponds to all 714 millivolts or 100% of the luminance signal, black corresponds to about 53 mv, or 7.5% (a.k.a. 7.5 IRE). For DVD, black is defined as digital value 16 (and white as 235) using the range 0 to 255. Video material below 7.5% (analog) or 16 (digital) is referred to as "blacker than black". Blacker than black should not be part of DVD subject matter but occasionally is. Not all DVD players will pass those picture details to the TV and not all TV sets will distinguish incoming content less than 7.5%.
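As a rough sketch of the level relationships (using the 714 millivolt and 7.5% figures above and ignoring many real-world details), the mapping from an 8 bit luminance code to an analog NTSC-style level could look like this:

    def digital_luma_to_millivolts(code, setup=True):
        """Map an 8 bit luminance code (16 = black, 235 = white) onto an analog
        luminance level in millivolts, optionally adding the 7.5% NTSC black
        pedestal described above."""
        fraction = (code - 16) / (235 - 16)          # 0.0 at black, 1.0 at white
        black_mv = 0.075 * 714 if setup else 0.0     # roughly 53 mV with setup
        return black_mv + fraction * (714 - black_mv)

    print(round(digital_luma_to_millivolts(16)))    # about 54 (the black pedestal)
    print(round(digital_luma_to_millivolts(235)))   # 714 (white)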

To further compound the confusion, the definition of Y is different for HDTV than it is for NTSC broadcasts:

Y = 0.299R + 0.587G + 0.114B
(SMPTE 170M for NTSC)

Y = 0.2126R + 0.7152G + 0.0722B
(SMPTE 296M for 1080i and 720p HDTV)

Reference: Joe Kane Productions, http://www.videoessentials.com/componentvideo.htm

Multiple standards exist both in the analog realm (Y/Pb/Pr) and the digital realm (Y/Cb/Cr).
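As a rough numeric illustration of what a mismatch does (a sketch using the two Y formulas above and the generic Pb/Pr scaling of 0.5 divided by one minus the blue or red coefficient, which yields this page's 0.564 and 0.713 factors for NTSC), encoding a color with the HDTV coefficients and then decoding it with the NTSC coefficients shifts the resulting RGB values:

    def encode(r, g, b, kr, kb):
        """Form Y, Pb, Pr from RGB using luminance coefficients kr, kb (kg = 1-kr-kb)."""
        y = kr * r + (1 - kr - kb) * g + kb * b
        return y, 0.5 * (b - y) / (1 - kb), 0.5 * (r - y) / (1 - kr)

    def decode(y, pb, pr, kr, kb):
        """Recover RGB from Y, Pb, Pr using luminance coefficients kr, kb."""
        r = y + pr * (1 - kr) / 0.5
        b = y + pb * (1 - kb) / 0.5
        g = (y - kr * r - kb * b) / (1 - kr - kb)
        return r, g, b

    # Encode a saturated green with the HDTV (0.2126/0.0722) coefficients...
    ypbpr = encode(0.2, 0.8, 0.2, 0.2126, 0.0722)
    # ...then decode it with the NTSC (0.299/0.114) coefficients.
    print(decode(*ypbpr, 0.299, 0.114))   # no longer (0.2, 0.8, 0.2)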

SMPTE -- Society of Motion Picture and Television Engineers



All parts (c) copyright 2001, Allan W. Jayne, Jr. unless otherwise noted or other origin stated.

If you would like to contribute an idea for our web page, please send us an e-mail. Sorry, but due to the volume of e-mail we cannot reply personally to all inquiries.