
Lighting for Digital Video

How you light your video is one of the most important aspects of a shoot. This is especially true if the end result will be compressed and encoded for playback on computers or the internet, or even recorded back onto tape for broadcast.

Let’s say you have someone wearing black jeans and standing in a shadowy area. There’s nothing wrong with that, and depending on how well the scene is lit, it could have a very nice, mood-creating look to it. If it’s not lit well enough, though, you may have a problem when you compress and encode that video. Say you shot the video with frames that are 640 pixels wide and 480 pixels high. If the target size of your final video is 320 wide by 240 high, you now have 1/4 of the pixels available to describe each frame (1/2 the width times 1/2 the height). That means the encoder has to combine four adjacent pixels of the original into the single pixel that now represents that space. Think of it as taking the 320×240 frame and stretching it to cover the original 640×480 frame: each new pixel now covers four pixels of the original. Let’s call those A, B, C and D.

A B
C D

If A and C represent the shadowy wall, and B and D represent the black jeans in the original frame, this new pixel will have to be calculated from all four of them. If the lighting of the scene didn’t differentiate well enough between the two, they’ll be read as approximately the same color. That’s a problem, because the boundaries between one object and the next become vaguer after compression. This is one reason people wonder why their video looks so bad once they’ve compressed it, when it looked so good on the videotape and they could clearly see the separation between the jeans and the wall on their television screens.
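Here’s a minimal sketch of that 2×2 averaging in Python. The RGB values are hypothetical stand-ins for a shadowy wall (A and C) and black jeans (B and D), not measurements from any real footage; the point is only to show how poorly separated values merge when they’re averaged.

    # A minimal sketch of the 2x2 averaging that happens when a 640x480 frame
    # is scaled down to 320x240. All pixel values below are hypothetical.

    def average_block(a, b, c, d):
        """Average four RGB pixels into the single pixel that replaces them."""
        return tuple(sum(channel) // 4 for channel in zip(a, b, c, d))

    # Poorly lit scene: wall and jeans are both near-black.
    wall_dark  = (28, 26, 30)   # A and C
    jeans_dark = (22, 22, 24)   # B and D
    print(average_block(wall_dark, jeans_dark, wall_dark, jeans_dark))
    # -> (25, 24, 27): the edge between wall and jeans is effectively gone.

    # Well-lit scene: the same objects, with enough separation to survive averaging.
    wall_lit  = (96, 90, 100)
    jeans_lit = (30, 30, 34)
    print(average_block(wall_lit, jeans_lit, wall_lit, jeans_lit))
    # -> (63, 60, 67): still clearly distinguishable from a pure-black neighbor.

In the dark version, the averaged pixel is nearly identical to both of its sources, so the edge disappears; in the lit version, the averaged pixel still carries usable contrast.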

You can see the effect of poor lighting if you open the color wheel in a graphics or video editing program. Next to the color wheel, there’s a slider that runs from white to black. This represents “luminance” (luma), or the amount of “whiteness” of the color you’re choosing with the wheel. Notice how you can choose purple, or blue, or brown… and the farther you slide the luma slider toward black, each of those colors starts to look like exactly the same color? That’s what happens to your video. That’s what the computer sees: you personally knew at the time of the shoot which colors you were looking at, but there wasn’t enough light for the computer to see what you saw.
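You can simulate that luma slider with Python’s standard colorsys module. The hue and saturation numbers below are hypothetical picks for “purple,” “blue,” and “brown,” chosen just to illustrate how distinct hues converge when lightness drops.

    import colorsys

    # Three clearly different hues as (hue, saturation) pairs; values are
    # hypothetical examples, not taken from any particular color picker.
    colors = {"purple": (0.78, 0.60), "blue": (0.61, 0.60), "brown": (0.08, 0.50)}

    def to_rgb255(hue, lightness, saturation):
        """Convert HLS to 8-bit RGB, like a color-picker readout."""
        r, g, b = colorsys.hls_to_rgb(hue, lightness, saturation)
        return tuple(round(c * 255) for c in (r, g, b))

    for name, (hue, sat) in colors.items():
        well_lit = to_rgb255(hue, 0.50, sat)   # slider near the middle
        underlit = to_rgb255(hue, 0.06, sat)   # slider dragged toward black
        print(f"{name:6s}  lit={well_lit}  dark={underlit}")

    # The "lit" triples are obviously different colors, while the "dark" triples
    # all collapse into nearly the same near-black values.

That collapse is exactly what the encoder has to work with when a scene is underlit: the hues you saw on set are still technically there, but the numbers describing them are too close together to survive compression.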

Bill Cammack • New York City • Freelance Video Editor • alum.mit.edu/www/billcammack