
BCLS Notes. All references & sources are from:

1. Video Basics, Herbert Zettl, Thomson Wadsworth.
2. Video Production, Vasuki Belavadi, Oxford University Press.
3. Television Production Handbook, Herbert Zettl, eighth edition.
4. ehow.com
5. British Library
6. TIAS Library
7. http://hulk03.princeton.edu:8080/

By: Dewesh Pandey
……………………………………………………………………………………
Camera Angles-

The relationship between the camera and the object being captured (i.e. the angle) gives
emotional information to an audience and guides their judgment about the character or
object in shot. The more extreme the angle (i.e. the further it is from eye level), the
more symbolic and heavily loaded the shot.

Each shot requires placing the camera in the best position for viewing a scene. This
positioning of the camera, especially the camera angle, determines both the audience's
viewpoint and the area covered in the shot. To achieve the desired angle within the frame,
the camera must be placed at the appropriate level.

Low-angle:
A low-angle shot is taken with the camera below the subject, looking up. It makes
whatever the camera is looking at seem bigger, gives a general sense of dramatic
intensity and can make a subject seem threatening.

High-Angle:
A high-angle shot is taken with the camera above the subject, looking down. Its
psychology is the reverse of the low-angle shot: it makes a subject look smaller. To show
weakness or vulnerability, a high-angle shot is appropriate.

Eye-Level-Angle:
These shots are best for close-ups of people and are meant to depict a general
scene. An eye-level shot may provide a neutral narrative position, or it may support a
high-impact, challenging situation. The camera position at eye level invites the viewer to
read the shot rather than respond to it emotionally, as would be the case with either a
high- or a low-angle shot.

The Bird's-Eye view:


This shows a scene from directly overhead, a very unnatural and
strange angle. Familiar objects viewed from this angle might seem totally unrecognizable
at first (umbrellas in a crowd, dancers' legs). This shot does, however, put the audience in
a godlike position, looking down on the action. People can be made to look insignificant,
ant-like, part of a wider scheme of things.
……………………………………………………
CAMERA MOVEMENT-
Video camera movement is used for many purposes. It may be used to make an object
appear to be bigger or smaller. It may be used to make things blurred, scary, or just
different. Camera movement techniques are often used, however, to tell a story.
Multiple points of view in a visual story can be given by rapid transitions from one camera
position to another. It is also possible to change the viewing perspective within a single
shot by moving the camera or zooming the lens. Several types of movements can be used.
The most common are zooms, pans, tilts, dollies, trucks and booms.

Pans-
A pan is a movement which scans a scene horizontally. The camera is placed on a tripod,
which operates as a stationary axis point as the camera is turned, often to follow a moving
object which is kept in the middle of the frame.

To create a smooth pan it's a good idea to practice the movement first. If you need to
move or stretch your body during the move, it helps to position yourself so that you end
up in the more comfortable position. In other words, you should become more comfortable
as the move progresses rather than less comfortable.

Tilt
A tilt is a vertical camera movement in which the camera points up or down from a
stationary location. For example, if you mount a camera on your shoulder and nod it up
and down, you are tilting the camera.

Tilting is less common than panning because that's the way humans work — we look left
and right more often than we look up and down.

Tracking
The term tracking shot is widely considered to be synonymous with dolly shot; that is, a
shot in which the camera is mounted on a cart which travels along tracks.

However there are a few variations of both definitions. Tracking is often more narrowly
defined as movement parallel to the action, or at least at a constant distance (e.g. the
camera which travels alongside the race track in track & field events). Dollying is often
defined as moving closer to or further away from the action.

Some definitions specify that tracking shots use physical tracks, others consider tracking
to include hand-held walking shots, Steadicam shots, etc.
Other terms for the tracking shot include trucking shot and crabbing shot.

Dolly Shots
Sometimes called TRUCKING or TRACKING shots. The camera is placed on a moving
vehicle and moves alongside the action, generally following a moving figure or object.
Complicated dolly shots will involve a track being laid on set for the camera to follow,
hence the name. The camera might be mounted on a car, a plane, or even a shopping
trolley (good method for independent film-makers looking to save a few dollars). A dolly
shot may be a good way of portraying movement, the journey of a character for instance,
or for moving from a long shot to a close-up, gradually focusing the audience on a
particular object or character.
The camera is mounted on a cart which travels along tracks for a very smooth movement.
Also known as a tracking shot or trucking shot.

Trucking
Trucking is basically the same as tracking or dollying. Although it means slightly
different things to different people, it generally refers to side-to-side camera movement
with respect to the action.
The term trucking is not uncommon but is less widely-used than dollying or tracking. Yet
another equivalent term is crabbing.

Camera movement equipment includes:

1. Dolly
2. Jib
3. Crane
4. Track
5. Trolley

Define White Balance


White balance means colour balance. It's a function which tells the camera what each
colour should look like, by giving it a "true white" reference. If the camera knows what
white looks like, then it will know what all other colours look like.
This function is normally done automatically by consumer-level cameras without the
operator even being aware of its existence. It actually works very well in most situations,
but there will be some conditions that the auto-white balance won't handle. In these
situations the colours will seem wrong or unnatural.
To perform a white balance, point the camera at something matt (non-reflective) white in
the same light as the subject, and frame it so that most or all of the picture is white. Set
your focus and exposure, then press the "white balance" button (or throw the switch).
There should be some indicator in the viewfinder which tells you when the white balance
has completed. If it doesn't work, try adjusting the iris, changing filters, or finding
something else white to balance on.

You should do white balances regularly, especially when lighting conditions change (e.g.
when moving between indoors and outdoors).
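
As a rough illustration of what the camera does with that "true white" reference, the
minimal sketch below scales the colour channels so that a sampled white patch comes out
neutral. It assumes NumPy and an RGB image stored as floats in 0.0-1.0; the function and
variable names are illustrative, not from any camera API.

```python
# Minimal sketch of reference-white balancing (assumes NumPy, RGB floats 0.0-1.0).
import numpy as np

def white_balance(image, white_patch):
    """Scale R, G, B so the sampled white patch becomes neutral grey."""
    patch_mean = white_patch.reshape(-1, 3).mean(axis=0)   # average colour of the patch
    gains = patch_mean[1] / patch_mean                     # bring R and B to the G level
    return np.clip(image * gains, 0.0, 1.0)

# Example: a frame with a bluish cast; pretend the sampled region is the white card.
frame = np.clip(np.random.rand(480, 640, 3) * np.array([0.8, 1.0, 1.2]), 0.0, 1.0)
patch = frame[200:280, 280:360]
balanced = white_balance(frame, patch)
```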

Aesthetics in visual composition

1. Look at the whole screen.
2. Try to understand the difference between the camera and the human eye.
3. Background: What is it? Is it effective? Does it match your subject? Does it give any
additional information about the subject?
4. Costume & make-up: these should be up to the mark.
5. Subjectivity: who speaks and who listens must be kept in focus at the time of
composition.
6. We must also pay attention to the optical centre.

Subject & Camera Relationship

Always consider the purpose of the shot before you start to set it up. After finding
your subject, think about what you want to show about it or them.

We must always be clear about the relationship between the subject and the camera,
which means being very clear about the subject itself. For example: who is our subject,
and why is it our subject? Are they in the right clothes, and are they in the right mood?
These points are essential when establishing the relationship between the two.

The relationship between the camera and the subject can be explained through the
following points:
1. Placement of the subject: we should be clear about where the subject is placed; in
particular, we should be aware of the optical centre.

2. Camera Orientation: Consider where the camera is placed in relation to the subject
and how this placement relates and contributes to the shot.

3. Eye-Gaze:
A) Direct Gaze: when a subject looks directly at the camera, their power and
confidence are emphasized.
B) Indirect Gaze: sometimes the subject deflects emphasis to other people or
objects in the scene.

4. Camera angle: This is another important point which should always be remembered.
We must be aware of all the angles, such as the high and low angles. Along with these
angles we should also keep head-room and nose-room in mind.

5. Portrait of the subject in different shots: For example, in an introductory shot, if your
subject is very poor, the portrait has to be made accordingly.

6. Shots & sequences for the subject.

7. Camera movements.

8. Camera mood: This is the shot's or scene's mood. For example, if the scene is a
fearful scene or a pleasant one, the mood should match it.

9. Camera-subject distance: This refers to the distance between the two while capturing
the picture or at the time of framing.

VIDEO CAMERA
A video camera is a camera used for electronic motion picture acquisition, initially
developed by the television industry but now common in other applications as well. A
video camera records and plays back visual images and sounds, traditionally on magnetic
tape. The video camera is the single most important piece of production equipment. Other
production equipment and techniques are greatly influenced by the camera's technical
and performance characteristics. The video camera is central to all television production.
Video cameras are used primarily in two modes.

The first, characteristic of much early television, is what might be called a live broadcast,
where the camera feeds real time images directly to a screen for immediate observation.
A few cameras still serve live television production, but most live connections are for
security, military/tactical, and industrial operations where surreptitious or remote viewing
is required.

The second is to have the images recorded to a storage device for archiving or further
processing; for many years, videotape was the primary format used for this purpose, but
optical disc media, hard disk, and flash memory are all increasingly used. Recorded video
is used in television and film production, and more often surveillance and monitoring
tasks where unattended recording of a situation is required for later analysis.

Parts of Video Camera & Functions

1. Built-in Microphone; This records the sound along with the picture when the
camera is in operation.

2. White-balance sensor window: This window senses the light in the scene so that the
camera can adjust its white balance for colour correction.

3. Lens:- The lens assembly handles all the light and the image that comes into the
camera. On some models we can add lenses to achieve different effects.

4. Viewfinder:- It allows us to see what we are recording (most viewfinders display in
black and white, however).

5. LCD viewfinder:- Most new versions of digital camcorders have the liquid crystal
display (LCD) viewfinder, a small screen that allows us to see what we are
recording in color.

6. Zoom:-The two way zoom button enables us to zoom the camera lens in and out,
that is, it allows us to go closer to the subject when we zoom in and further from
the object when we zoom out.
7. Recording Levels:- Most professional models have a level control that we can use to
modify the levels of audio we are recording.

8. Operation switch: This switch is used for the power supply to the camera.

9. Auto light button: This button is pressed to activate the auto light function.

10. Lens cap: It protects the lens from possible damage.

11. Tape eject button: To insert or take out a video cassette, press this button.

12. External microphone socket: If you want to use an external microphone, connect
it to this socket (in this case, the built-in microphone will be deactivated).

13. Start/stop Button- press this button to start and stop shooting a scene.
14. White balance button- Press this button to select manual white balance
adjustment. Press it again to reset to the automatic white balance adjustment
mode.
15. Exposure/aperture:- the exposure on the camera helps us to increase or decrease
the aperture levels so that the picture becomes brighter or darker depending on
what we desire. This increases or decreases the amount of light entering the
camera.

16. Play button:- Press this button to start the playback

17. Stop Button:-Press this button to stop the playback.

Various Camera Shots

WIDE SHOT(WS) or VERY LONG SHOT(VLS)


Used when it is important for the audience to understand or have an idea about the
setting. This shot will often establish a scene or place giving an audience context for the
following action. Any figures will be very small.

LONG SHOT (LS)


In this shot any figures will be seen from head to foot. The audience will be able to
identify more detail about the person or persons but will also be able to see where they
are or what is going on around them.
A Long Shot often follows a Wide Shot particularly if the film maker wishes to introduce
a character.

MID SHOT or MEDIUM SHOT (MS)


Now that the camera is closer to the subject the audience can easily recognize them and
identify detail about what they are doing. Within the frame a person will be shown from
roughly their waist to their head.
MEDIUM CLOSE UP (MCU)
This is very similar to the previous shot but the camera is a little closer to the subject and
within the frame a person will be shown from their chest to their head. The difference is
only slight but can create subtle effects.

CLOSEUP (CU)
This allows the camera to focus on the detail of the subject as there is little emphasis on
background or setting. The frame contains the person’s head and shoulders allowing the
audience to recognise a subject’s thoughts and feelings.

BIG CLOSE UP (BCU)


This shot shows just the face. It is an effective shot to use when signifying emotions or
focusing on expressions.

EXTREME CLOSEUP. (ECU)


In this shot the camera seems to be extremely close to the subject. The whole frame is
filled with a part of the subject. A person’s face would be shown from just below the
mouth to just above the eyebrows or even closer… It can have the effect of making a
person seem powerful or threatening but can also be used as a reaction shot to focus on
an emotional response.

Two Shot:-
There are a few variations on this one, but the basic idea is to have a comfortable shot of
two people. Often used in interviews, or when two presenters are hosting a show.
Two-shots are good for establishing a relationship between subjects. If you see two sports
presenters standing side by side facing the camera, you get the idea that these people are
going to be the show's co-hosts. As they have equal prominence in the frame, the
implication is that they will provide equal input. Of course this doesn't always apply, for
example, there are many instances in which it's obvious one of the people is a presenter
and the other is a guest. In any case, the two-shot is a natural way to introduce two
people.

A two-shot could also involve movement or action. It is a good way to follow the
interaction between two people without getting distracted by their surroundings.

Over the Shoulder shot:-


This shot is framed from behind a person who is looking at the subject. The person facing
the subject should usually occupy about 1/3 of the frame.
This shot helps to establish the position of each person, and get the feel of looking at one
person from the other's point of view. It's common to cut between these shots during a
conversation, alternating the view between the different speakers.

Point of View shot:-


This shot shows a view from the subject's perspective. It is usually edited in such a way
that it is obvious whose POV it is.
Cutaway (CA)
A cutaway is a shot that's usually of something other than the current action. It could be a
different subject, a CU of a different part of the subject (e.g. a CU of the subject's hands),
or just about anything else. The CA is used as a "buffer" between shots (to help the
editing process), or to add interest/information.

Define depth of field:-


The range in which all objects in front of the camera lens appear to be in focus is called
the depth of field.
The area between the nearest object in focus and the furthest object in focus.
It is dependent on:
1. The focal length of the lens (wide-angle lenses provide greater depth of field).

2. The aperture (smaller openings provide greater depth of field).

3. Camera position (the greater the distance between the object and the camera, the
greater the depth of field).

The DOF is also determined by the camera-to-subject distance.

Aperture, or f/stop, is the most important factor in controlling the depth of field.
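
To make the three factors concrete, here is a small sketch using the standard photographic
depth-of-field approximations (hyperfocal distance H = f²/(N·c) + f, with the near and far
limits derived from H and the subject distance). The 0.03 mm circle of confusion is an
assumed full-frame value, used only for illustration.

```python
# Rough depth-of-field calculator using the standard photographic approximations.
def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
    s = subject_m * 1000.0                                 # subject distance in mm
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm     # hyperfocal distance in mm
    near = H * s / (H + (s - focal_mm))
    far = H * s / (H - (s - focal_mm)) if s < H else float("inf")
    return near / 1000.0, far / 1000.0                     # limits in metres

# A wider lens, a smaller aperture and a greater distance all deepen the zone of focus:
print(depth_of_field(50, 2.8, 3))    # ≈ (2.7 m, 3.3 m)   shallow
print(depth_of_field(24, 5.6, 3))    # ≈ (1.6 m, 21.7 m)  much deeper
```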

Aperture:-
The aperture, or f/stop as it is commonly called, is used to regulate the diameter of the
lens opening. That controls the luminance on the film plane. Besides controlling the
luminance on the film plane, the f/stop also controls image sharpness by partially
correcting various lens aberrations.
The most commonly used aperture control device is the iris diaphragm. An iris
diaphragm is an adjustable device that is fitted into the barrel of the lens or shutter
housing. It is called an iris diaphragm because it resembles the iris in the human eye.
An iris diaphragm consists of a series of thin, curved metal blades that overlap each other
and are fastened to a ring on the lens barrel or shutter housing.
The size of the aperture is changed by turning the aperture control ring.
The blades move in unison as the control ring is moved, forming an aperture of any
desired size.
The control ring is marked in a series of f/stops that relate to the iris opening.
The aperture controls the intensity of light that is allowed to pass to the film and the parts
of the image that will appear in sharp focus.
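
As a quick illustration of how the f/stop markings relate to the opening: the f-number is
the focal length divided by the effective aperture diameter, and each full stop changes the
light reaching the film plane by a factor of two. The snippet below is only a sketch of that
arithmetic, with an assumed 50 mm lens.

```python
# f-number = focal length / aperture diameter; each full stop halves the light.
focal_length_mm = 50.0

for f_number in (2.0, 2.8, 4.0, 5.6, 8.0):
    diameter = focal_length_mm / f_number        # effective opening in mm
    relative_light = (2.0 / f_number) ** 2       # light relative to f/2
    print(f"f/{f_number}: opening ≈ {diameter:.1f} mm, light ≈ {relative_light:.2f}× f/2")
```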

CAMERA LENSES (Types & Functions)


A camera lens is a curved piece of transparent glass that focuses an image in a camera. A
camera lens is not a single lens, but a combination of lenses that bend the light entering the
camera in such a way that it can be captured on film. Depending on the shape and size of
the lens, many different photographic effects can be achieved. In addition, combining
multiple lenses and changing the distance between those lenses can create even more
photographic effects. There are many different types of camera lenses for different
photographic purposes.
These lenses are permanently fixed to video cameras, or used interchangeably with lenses
of different focal lengths. There are several basic types:

1) Normal lens: A Normal Lens gives a viewpoint that is very close to what is seen by
the human eye. It has very little distortion.

2) Telephoto Lens: A telephoto lens, or narrow-angle lens, is designed to have a long focal
length. The subject appears much closer than normal, but you can see only a smaller part
of the scene.

3) Zoom Lens: Most video cameras come with the familiar zoom lens system, which is
a remarkably flexible production tool. At any given setting within its range the zoom lens
behaves like a prime lens of that focal length.

4) Wide-angle Lens: A wide-angle lens has a short focal length that takes in correspondingly
more of the scene. However, subjects will look much further away, and depth and
distance appear exaggerated.
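
The practical difference between these lens types comes down to focal length and the
resulting angle of view. For a simple rectilinear lens the horizontal angle of view is
2·atan(sensor width / (2 × focal length)); the sketch below assumes a 36 mm sensor width
purely for illustration.

```python
# Horizontal angle of view of a simple rectilinear lens; 36 mm width is an assumed value.
import math

def angle_of_view(focal_mm, sensor_width_mm=36.0):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for label, focal in [("wide-angle", 24), ("normal", 50), ("telephoto", 200)]:
    print(f"{label:10s} {focal:3d} mm -> {angle_of_view(focal):5.1f} degrees")
```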

FUNCTIONS:-
The function of the lens in the camera is to direct the light source to the camera sensor
and also to focus your image.

A camera lens uses refraction to focus light on to the area where the film is located inside
the camera. Refraction is caused by a change in the direction of light as it passes through
the curved glass.

Important things for Lens


1. Check the smoothness of focusing action from close-up to infinity.
2. Check focus when fully zoomed in, and when zoomed out.
3. Prefocus on a distant subject and then slowly zoom out, checking that focus does
not wander.
………………………………..
LIGHT
“Good Lighting can transform even a routine, uninteresting shot into an attractive &
appealing picture.”

Lighting is an important part of the TV Production Process.


Good Lighting is essential if the viewer is to believe the reality of the pictures being
presented.

Bad Lighting may distract the viewer.

“Effective Lighting makes a Vital contribution to a TV Production. We quickly


Recognize bad Lighting when we see it, but good Lighting is so unobtrusive and
“Natural” that we usually take it for granted.”
Lighting means to control light and shadows for three principal reasons:
1. To provide the television camera with adequate illumination so that it can see
well, that is, produce technically acceptable pictures.

2. To help the viewers recognize what things and people look like and where they
are in relation to one another and to their immediate environment.

3. To establish a general feeling and mood of the event.

4. Lighting is an area of the graphics pipeline that has seen no limits to its
complexity and continues to promote active research.

5. Many advanced lighting techniques are very complex and require massive
computational and memory resources.

6. New algorithms continue to be developed to optimize various pieces within these
different areas.

7. The complexity of lighting within the context of photoreal rendering completely
dwarfs the other areas of the graphics pipeline, to the point where it can almost be
said that rendering is 99% lighting.

8. The requirements of photoreal lighting have caused radical modifications to the
rendering process, to the point that modern high-quality rendering has very little
resemblance to the processes that we've studied so far.

9. For one thing, so far we've talked about rendering triangle-by-triangle, whereas
photoreal rendering is generally done pixel-by-pixel; we will look at these
techniques in more detail in a later lecture.

Types
• Ambient Light
• Directional Lights
• Spot Lights
• Point Lights

Ambient Light
• An ever present light
• `Floods’ the scene
• No highlights
• No Shadows
• Good for Base Lighting

Directional Light
• Light from a single direction.
• Like sunlight.
• Has shadows
• Has highlights.
• A good basic light.

Spot Light
• A theatrical spot-light.
• Has shadows
• Radiates out in a cone
• Has fall-off
• Has penumbra
• Very powerful.

Point Light
• A local light
• Radiates in all directions
• Great `filler’ light
• Has shadows
• Can really punch up a scene.
• Very dramatic
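
To connect these light types with the rendering notes above, here is a minimal per-pixel
diffuse shading sketch (not from any particular engine) that combines an ambient term, a
directional "sun" and a point light using Lambert's cosine law. All names, positions and
values are illustrative assumptions.

```python
# Minimal diffuse shading sketch: ambient + directional + point light (Lambert's law).
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(point, normal, ambient=0.1):
    n = normalize(normal)
    # Directional light: constant direction, like sunlight.
    sun_dir = normalize((0.0, 1.0, 0.5))
    diffuse_sun = max(0.0, dot(n, sun_dir))
    # Point light: direction and falloff depend on the distance to the light.
    light_pos = (2.0, 3.0, 1.0)
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist_sq = dot(to_light, to_light)
    diffuse_point = max(0.0, dot(n, normalize(to_light))) / dist_sq
    return ambient + diffuse_sun + diffuse_point

print(shade(point=(0.0, 0.0, 0.0), normal=(0.0, 1.0, 0.0)))   # ≈ 1.05
```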

3 Point Lighting
• Most Common lighting scheme
• Three lights:

Key light

The Key light establishes the dimension, form and surface detail of subject matter.
Fill light
The Fill light fills in the shadows created by the horizontal and vertical angles of the key
light. The fill light should be placed about 90-degrees away from the key light.

Back light
The function of the Back light is to separate the subject from the background by creating
a subtle rim of light around the subject.

Properties of Light summary


1) Light travels in straight lines
2) Light travels much faster than sound
3) We see things because they reflect light into our eyes
4) Shadows are formed when light is blocked by an object
Good ways to use lights
• Look to photographers for good techniques
• Think in terms of balance.
• Avoid the overly dramatic.
• Look at natural lighting.
• Avoid saturated lights and hues
• Normally only need a few lights.
• Avoid disco colors and effects 
Introduction to Audio
This beginner-level tutorial covers the basics of audio production. It is suitable for
anyone wanting to learn more about working with sound, in either amateur or
professional situations.

What is "Audio"?
Audio means "of sound" or "of the reproduction of sound". Specifically, it refers to the
range of frequencies detectable by the human ear — approximately 20Hz to 20kHz. It's
not a bad idea to memorise those numbers — 20Hz is the lowest-pitched (bassiest) sound
we can hear, 20kHz is the highest pitch we can hear.
Audio work involves the production, recording, manipulation and reproduction of sound
waves. To understand audio you must have a grasp of two things:

1. Sound Waves: What they are, how they are produced and how we hear them.
2. Sound Equipment: What the different components are, what they do, how to
choose the correct equipment and use it properly.

Analog Vs. Digital Sound


Analog sound versus digital sound compares the two ways in which sound is recorded
and stored. Actual sound waves consist of continuous variations in air pressure.
Representations of these signals can be recorded in either digital or analog formats.
An analog recording is one where the original sound signal is modulated onto another
physical medium or substrate such as the groove of a gramophone disc or the iron oxide
surface of a magnetic tape. A physical quality in the medium (e.g., the intensity of the
magnetic field or the path of a record groove) is directly related, or analogous, to the
physical properties of the original sound (e.g., the amplitude, phase, etc.)

Sound wave properties and characteristics


Sound waves are characterized by the generic properties of waves, which are frequency,
wavelength, period, amplitude, intensity, speed, and direction (sometimes speed and
direction are combined as a velocity vector, or wavelength and direction are combined as
a wave vector).
Transverse waves, also known as shear waves, have an additional property of
polarization.
Sound characteristics can depend on the type of sound waves (longitudinal versus
transverse) as well as on the physical properties of the transmission medium.
Whenever the pitch of a sound wave changes, the distance between successive wave
maxima (the wavelength) also changes, which corresponds to a change in frequency. When
the loudness of a sound wave changes, so does the degree of compression of the air through
which it travels; this degree of compression corresponds to the wave's amplitude.

Several Facts about Sound

• Sound travels at about 330 metres a second.
• A pure sound tone consists of a single frequency of vibration.
• The range of human hearing is about 20 Hertz to 20 kilohertz.
• Most sounds are a mixture of frequencies. See the page on HARMONICS.
• Microphones convert sound pressure waves into electrical signals.
• Loudspeakers convert electrical signals into sound waves.
• Loudspeakers and microphones are TRANSDUCERS.
• Frequency, wavelength and the speed of sound are interrelated, as shown in the
sketch below.
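
A minimal sketch of that relationship (wavelength = speed of sound / frequency), using the
round 330 m/s figure quoted above:

```python
# wavelength = speed of sound / frequency
speed_of_sound = 330.0   # metres per second in air (approximate figure quoted above)

for freq_hz in (20, 1000, 20000):     # limits and middle of the audible range
    wavelength = speed_of_sound / freq_hz
    print(f"{freq_hz:>6} Hz -> wavelength {wavelength:.3f} m")
```
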
Pitch
• Pitch is the perceived fundamental frequency of a sound. While the actual
fundamental frequency can be precisely determined through physical
measurement, it may differ from the perceived pitch because of overtones, or
partials, in the sound.

• The note A above middle C played on a piano is perceived to be of the same pitch
as a pure tone of 440 Hz. However, a slight change in frequency need not lead to
a perceived change in pitch. The just noticeable difference (the threshold at which
a change in pitch is perceived) is about five cents (that is, about five hundredths of
a semitone), but varies over the range of hearing and is more precise when the two
pitches are played simultaneously (see the sketch after this list).

• Perception of pitch can be likened to attempting to sing an imitation of the sound
you hear.
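
For reference, the "cents" measure used above is defined as 1200·log2(f2/f1), so a semitone
is 100 cents. The snippet below only checks what a five-cent difference means around A440;
the numbers are illustrative.

```python
# cents = 1200 * log2(f2 / f1); about five cents is the just noticeable difference.
import math

def cents(f1_hz, f2_hz):
    return 1200.0 * math.log2(f2_hz / f1_hz)

a4 = 440.0
five_cents_up = a4 * 2 ** (5 / 1200)    # ≈ 441.27 Hz
print(cents(a4, five_cents_up))         # ≈ 5.0 cents
print(cents(a4, 466.16))                # ≈ 100 cents, i.e. one semitone up (A#4)
```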

Acoustics and noise


The scientific study of the propagation, absorption, and reflection of sound waves is
called acoustics.
Noise is a term often used to refer to an unwanted sound. In science and engineering,
noise is an undesirable component that obscures a wanted signal.

Speed / velocity of sound


The speed of sound depends on the medium through which the waves are passing, and is
often quoted as a fundamental property of the material. In general, the speed of sound is
proportional to the square root of the ratio of the elastic modulus (stiffness) of the
medium to its density. Those physical properties and the speed of sound change with
ambient conditions. For example, the speed of sound in gases depends on temperature. In
air at sea level, the speed of sound is approximately 343 m/s, in fresh water 1482 m/s
(both at 20 °C, or 68 °F), and in steel about 5960 m/s.
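
As a worked example of the square-root relationship above (speed ≈ √(elastic modulus /
density)), the figures below use commonly quoted textbook values for steel; they are
illustrative rather than taken from this text.

```python
# speed of sound ≈ sqrt(elastic modulus / density); steel values are textbook figures.
import math

youngs_modulus_steel = 200e9   # pascals
density_steel = 7850.0         # kg per cubic metre

v = math.sqrt(youngs_modulus_steel / density_steel)
print(f"speed of sound in a steel rod ≈ {v:.0f} m/s")   # ≈ 5050 m/s, the same order as
                                                        # the ~5960 m/s quoted above
```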

Equipment for dealing with sound


Equipment for generating or using sound includes musical instruments, hearing aids,
sonar systems and sound reproduction and broadcasting equipment. Many of these use
electro-acoustic transducers such as microphones and loudspeakers.

Audio Equipment required in a studio

• Microphone (condenser or dynamic), not a computer microphone.
• Microphone pre-amp (do not use your DJ mixer's microphone pre-amp; these usually
give bad results).
• Patch cables.
• Computer with a sound card (one that isn't from the dinosaur age).
• Sound card for your PC (some of the high-end sound cards have a microphone pre-amp
built into the sound card).
Modern equipment
Modern hi-fi equipment usually includes digital audio signal sources such as CD players,
Digital Audio Tape (DAT) and Digital Audio Broadcasting (DAB) or HD Radio tuners,
an amplifier, and loudspeakers. Some modern hi-fi equipment can be digitally connected
using fiber optic TOSLINK cables, universal serial bus (USB) ports (including one to
play digital audio files), or WiFi support.

Equipment found in a recording studio commonly includes:


1. Mixing console
2. Multitrack recorder
3. Microphones
4. Reference monitors, which are loudspeakers with a flat frequency response.
Equipment may also include:
• Digital Audio Workstation
• Music workstation
• Outboard effects, such as compressors, reverbs, or equalizers
Recording Studio
A recording studio is a facility for sound recording. Ideally, the space is specially
designed by an acoustician to achieve the desired acoustic properties (sound diffusion,
low level of reflections, adequate reverberation time for the size of the ambient, etc.).
Different types of studios record bands and artists, voiceovers and music for television
shows, movies, cartoons, and commercials, and/or even record a full orchestra.

The typical recording studio consists of a room called the "studio", where
instrumentalists and vocalists perform, and the "control room", which
houses the equipment for recording, routing and manipulating the sound. Often, there will
be smaller rooms called "isolation booths" present to accommodate loud instruments such
as drums or electric guitar, to keep these sounds from being audible to the microphones
that are capturing the sounds from other instruments or vocalists.

Microphones
How They Work

A microphone is an example of a transducer, a device that changes information from one
form to another. Sound information exists as patterns of air pressure; the microphone
changes this information into patterns of electric current. The recording engineer is
interested in the accuracy of this transformation, a concept he thinks of as fidelity.
A variety of mechanical techniques can be used in building microphones. The two most
commonly encountered in recording studios are the magneto-dynamic and the variable
condenser designs.

THE DYNAMIC MICROPHONE


In the magneto-dynamic, commonly called dynamic, microphone, sound waves cause
movement of a thin metallic diaphragm and an attached coil of wire. A magnet produces
a magnetic field which surrounds the coil, and motion of the coil within this field causes
current to flow. The principles are the same as those that produce electricity at the utility
company, realized in a pocket-sized scale. It is important to remember that current is
produced by the motion of the diaphragm, and that the amount of current is determined
by the speed of that motion. This kind of microphone is known as velocity sensitive.

MICROPHONE PATTERNS
These are polar graphs of the output produced vs. the angle of the sound source. The
output is represented by the radius of the curve at the incident angle.

Omni
The simplest mic design will pick up all sound, regardless of its point of origin, and is
thus known as an omnidirectional microphone. Omnis are very easy to use and generally
have good to outstanding frequency response.

Bi-directional
It is not very difficult to produce a pickup pattern that accepts sound striking the front or
rear of the diaphragm, but does not respond to sound from the sides. This is the way any
diaphragm will behave if sound can strike the front and back equally. The rejection of
undesired sound is the best achievable with any design, but the fact that the mic accepts
sound from both ends makes it difficult to use in many situations. Most often it is placed
above an instrument. Frequency response is just as good as an omni, at least for sounds
that are not too close to the microphone.

Cardioid
This pattern is popular for sound reinforcement or recording concerts where audience
noise is a possible problem. The concept is great: a mic that picks up only the sounds it is
pointed at. The reality is different. The first problem is that sounds from the back are not
completely rejected, but merely reduced by about 10-30 dB. This can surprise careless users.
The second problem, and a severe one, is that the actual shape of the pickup pattern
varies with frequency. For low frequencies, this is an omnidirectional microphone. A
mic that is directional in the range of bass instruments will be fairly large and expensive.
Furthermore, the frequency response for signals arriving from the back and sides will be
uneven; this adds an undesired coloration to instruments at the edge of a large ensemble,
or to the reverberation of the concert hall.

A third effect, which may be a problem or may be a desired feature, is that the
microphone will emphasize the low frequency components of any source that is very
close to the diaphragm. This is known as the "proximity effect", and many singers and
radio announcers rely on it to add "chest" to a basically light voice. Close, in this context,
is related to the size of the microphone, so the nice large mics with even back and side
frequency response exhibit the strongest proximity effect. Most cardioid mics have a built-in
low-cut filter switch to compensate for proximity. Mis-setting that switch can cause
hilarious results. Bidirectional mics also exhibit this phenomenon.

Tighter Patterns
It is possible to exaggerate the directionality of cardioid-type microphones, if you don't
mind exaggerating some of the problems. The hypercardioid pattern is very popular, as
it gives better overall rejection and flatter frequency response at the cost of a small back
pickup lobe. This is often seen as a good compromise between the cardioid and
bidirectional patterns. A "shotgun" mic carries these techniques to extremes by mounting
the diaphragm in the middle of a pipe. The shotgun is extremely sensitive along the main
axis, but possesses pronounced extra lobes which vary drastically with frequency. In fact,
the frequency response of this mic is so bad it is usually electronically restricted to the
voice range, where it is used to record dialogue for film and video.

Stereo microphones
You don't need a special microphone to record in stereo, you just need two. A so called
stereo microphone is really two microphones in the same case. There are two kinds:
extremely expensive professional models with precision matched capsules, adjustable
capsule angles and remote switching of pickup patterns; and very cheap units (often with
the capsules oriented at 180 deg.) that can be sold for high prices because they have the
word stereo written on them.

Typical Placement
Single microphone use
Use of a single microphone is pretty straightforward. Having chosen one with appropriate
sensitivity and pattern, (and the best distortion, frequency response, and noise
characteristics you can afford), you simply mount it where the sounds are. The practical
range of distance between the instrument and the microphone is determined by the point
where the sound overloads the microphone or console at the near end, and the point
where ambient noise becomes objectionable at the far end. Between those extremes it is
largely a matter of taste and experimentation.

If you place the microphone close to the instrument, and listen to the results, you will find
the location of the mic affects the way the instrument sounds on the recording. The
timbre may be odd, or some notes may be louder than others. That is because the various
components of an instrument's sound often come from different parts of the instrument
body (the highest note of a piano is nearly five feet from the lowest), and we are used to
hearing an evenly blended tone. A close in microphone will respond to some locations on
the instrument more than others because the difference in distance from each to the mic is
proportionally large. A good rule of thumb is that the blend zone starts at a distance of
about twice the length of the instrument. If you are recording several instruments, the
distance between the players must be treated the same way.
If you place the microphone far away from the instrument, it will sound as if it is far
away from the instrument. We judge sonic distance by the ratio of the strength of the
direct sound from the instrument (which is always heard first) to the strength of the
reverberation from the walls of the room. When we are physically present at a concert,
we use many cues beside the sounds to keep our attention focused on the performance,
and we are able to ignore any distractions there may be. When we listen to a recording,
we don't have those visual clues to what is happening, and find anything extraneous that
is very audible annoying. For this reason, the best seat in the house is not a good place to
record a concert. On the other hand, we do need some reverberation to appreciate certain
features of the music. (That is why some types of music sound best in a stone church)
Close microphone placement prevents this. Some engineers prefer to use close miking
techniques to keep noise down and add artificial reverberation to the recording, others
solve the problem by mounting the mic very high, away from audience noise but where
adequate reverberation can be found.

Stereo
Stereo sound is an illusion of spaciousness produced by playing a recording back through
two speakers. The success of this illusion is referred to as the image. A good image is one
in which each instrument is a natural size, has a distinct location within the sound space,
and does not move around. The main factors that establish the image are the relative
strength of an instrument's sound in each speaker, and the timing of arrival of the sounds
at the listener's ear. In a studio recording, the stereo image is produced artificially. Each
instrument has its own microphone, and the various signals are balanced in the console as
the producer desires. In a concert recording, where the point is to document reality, and
where individual microphones would be awkward at best, it is most common to use two
mics, one for each speaker.

Spaced microphones
The simplest approach is to assume that the speakers will be eight to ten feet apart, and
place two microphones eight to ten feet apart to match. Either omnis or cardioids will
work. When played back, the results will be satisfactory with most speaker arrangements.
(I often laugh when I attend concerts and watch people using this setup fuss endlessly
with the precise placement of the mics. This technique is so forgiving that none of their
efforts will make any practical difference.)

The big disadvantage of this technique is that the mics must be rather far back from the
ensemble, at least as far as the distance from the leftmost performer to the rightmost.
Otherwise, those instruments closest to the microphones will be too prominent. There is
usually not enough room between stage and audience to achieve this with a large
ensemble, unless you can suspend the mics or have two very tall stands.

Coincident cardioids
There is another disadvantage to the spaced technique that appears if the two channels are
ever mixed together into a monophonic signal. (Or broadcast over the radio, for similar
reasons.) Because there is a large distance between the mics, it is quite possible that
sound from a particular instrument would reach each mic at slightly different times.
(Sound takes 1 millisecond to travel a foot.) This effect creates phase differences between
the two channels, which results in severe frequency response problems when the signals
are combined. You seldom actually lose notes from this interference, but the result is an
uneven, almost shimmery sound. The various coincident techniques avoid this problem
by mounting both mics in almost the same spot.
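
To see why that one-millisecond figure matters when the channels are summed, the
cancellations fall at predictable frequencies: the first notch sits at 1/(2 × delay), with
further notches at odd multiples. The sketch below only illustrates that arithmetic.

```python
# Comb-filter notches when two copies of a signal are summed with a time offset:
# the first cancellation is at 1 / (2 * delay), then at odd multiples of it.
def notch_frequencies(delay_seconds, count=4):
    first = 1.0 / (2.0 * delay_seconds)
    return [first * (2 * k + 1) for k in range(count)]

# A spaced pair one foot "off" (about 1 ms) summed to mono:
print(notch_frequencies(0.001))   # [500.0, 1500.0, 2500.0, 3500.0] Hz
```
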
This is most often done with two cardioid microphones, one pointing slightly left, one
slightly right. The microphones are often pointing toward each other, as this places the
diaphragms within a couple of inches of each other, totally eliminating phase problems.
No matter how they are mounted, the microphone that points to the left provides the left
channel. The stereo effect comes from the fact that the instruments on the right side are
on-axis for the right channel microphone and somewhat off-axis (and therefore reduced
in level) for the other one. The angle between the microphones is critical, depending on
the actual pickup pattern of the microphone. If the mics are too parallel, there will be
little stereo effect. If the angle is too wide, instruments in the middle of the stage will
sound weak, producing a hole in the middle of the image. [Incidentally, to use this
technique, you must know which way the capsule actually points. There are some very
fine German cardioid microphones in which the diaphragm is mounted so that the pickup
is from the side, even though the case is shaped just like many popular end-addressed
models. (The front of the mic in question is marked by the trademark medallion.) I have
heard the results where an engineer mounted a pair of these as if the axis were at the end:
you could hear one cello player and the tympani, but not much else.]

You may place the microphones fairly close to the instruments when you use this
technique. The problem of balance between near and far instruments is solved by aiming
the mics toward the back row of the ensemble; the front instruments are therefore off axis
and record at a lower level. You will notice that the height of the microphones becomes a
critical adjustment.

M.S.
The most elegant approach to coincident miking is the M.S. or middle-side technique.
This is usually done with a stereo microphone in which one element is omnidirectional,
and the other bidirectional. The bidirectional element is oriented with the axis running
parallel to the stage, rejecting sound from the center. The omni element, of course, picks
up everything. To understand the next part, consider what happens as an instrument is
moved on the stage. If the instrument is on the left half of the stage, a sound would first
move the diaphragm of the bidirectional mic to the right, causing a positive voltage at the
output. If the instrument is moved to center stage, the microphone will not produce any
signal at all. If the instrument is moved to the right side, the sound would first move the
diaphragm to the left, producing a negative voltage. You can then say that instruments on
one side of the stage are 180 degrees out of phase with those on the other side, and the
closer they are to the center, the weaker the signal produced.

M.S. produces a very smooth and accurate image, and is entirely mono compatible. The
only reason it is not used more extensively is the cost of the special microphone and
decoding circuit, well over $1,000.
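
The decoding referred to above is conceptually a simple sum-and-difference matrix: the
left channel is mid plus side and the right channel is mid minus side. A minimal sketch
(with made-up sample values) is below.

```python
# Mid-side decoding: L = M + S, R = M - S.
def ms_decode(mid_samples, side_samples):
    left = [m + s for m, s in zip(mid_samples, side_samples)]
    right = [m - s for m, s in zip(mid_samples, side_samples)]
    return left, right

# A source on the left of the stage gives positive side values, so it ends up louder in L:
mid = [0.50, 0.40, 0.30]
side = [0.20, 0.15, 0.10]
print(ms_decode(mid, side))   # ([0.7, 0.55, 0.4], [0.3, 0.25, 0.2])
```
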
Types of Microphones
Different Types of Microphones

The use of different types of microphones in a church setting will enhance the tone,
sound level and dynamics of the music and, importantly, the way the instruments are
made to sound. Having at least one microphone for each instrument the church uses is a
good rule of thumb. The more microphones your church has, the better off you'll be in
the future.

Wired & wireless


You can also use wireless mics. Wireless lavaliers, for example, are the best choice if
you are shooting a video and you want some long shots (where you can see the actor’s
entire body). For wireless mics you need to get a system—including the clip-on mic,
batteries, a transmitter which the actor wears, a receiver which is mounted on top of the
camera and a cable that connects the receiver to the camcorder’s mic port.

Dynamic & condenser


There are two main kinds of microphone: dynamic and condenser. Condenser mics tend
to be more sensitive to a wider range of frequencies, but they are more fragile and may
be too sensitive if recording really loud sounds. Condenser mics also need a power
supply, either a 48v phantom power supply or batteries.

Windshields/windscreens
These can be bought separately and are attached to the microphone or placed in front of
it. They help cut down on wind noise and p-popping (the distortion caused by the sudden
rush of air if you say plosive consonants like p, b and g directly into a mic).

Condenser, capacitor or electrostatic microphones


In a condenser microphone, also known as a capacitor microphone, the diaphragm acts as
one plate of a capacitor, and the vibrations produce changes in the distance between the
plates.
There are two methods of extracting an audio output from the transducer thus formed.
They are known as DC-biased and RF (or HF) condenser microphones.

Recording Techniques & Equipments


Analog Recording
Analog recording is also called reel-to-reel or open-reel tape recording: the form of
magnetic tape audio recording in which the recording medium is held on a reel, rather
than being securely contained within a cassette.
In use, the supply reel or feed reel containing the tape is mounted on a spindle; the end of
the tape is manually pulled out of the reel, threaded through mechanical guides and a tape
head assembly, and attached by friction to the hub of a second, initially empty takeup
reel. The arrangement is similar to that used for motion picture film.

Audio Meters
An “audio-first” approach is available in many different contexts and basic video
production programs already have the necessary tools to make it work. Classroom
instruction begins with recording techniques: microphone placement, avoidance of wind
noise, avoidance of mic and cable noise, awareness of noise floor of different devices,
monitoring, slating takes, and use of sound logs. Students work with omnidirectional and
directional microphones, and condenser and dynamic microphones. As we have both
analog and digital equipment available, I first introduce students to the warmth of analog
recording (traditional cassette recorders) and then the uniqueness of digital recorders
(DATs, MiniDiscs, or hard disk recorders). Each device presents its own unique
problems: dropout vs. distortion, the necessity of the limiter in digital recording, and the
differences between monitoring with a peak meter as compared to a VU meter.

Audio Control System


Audio Control Systems are designed to provide highly flexible routing, mixing,
distribution and level control of audio signals for a wide variety of applications including
divisible hotel banquet/meeting rooms, performing arts centers, houses of worship, sports
facilities, theaters, and recording/ production studios. The TOA DX-0808 is a full mixing
matrix: any input or combination of inputs can be mixed and routed to any output or
combination of outputs. A single DX-0808 provides eight inputs that are assigned to eight
input mixing buses using a 64 point switching matrix.

Audio mixer
Definition
An audio mixer is an electronic console that is used to mix different recorded tracks by
changing their volume levels, adding effects and changing the timbre of each instrument
on the tracks.
Names
Audio mixers are also called mixing consoles and soundboards.
Use
Audio mixers are most often used by recording studios but are also typically used in live
situations by live sound engineers.

Types
There are two types of mixers, digital and analog, and both are commonly used by the
same recording studio to achieve different results.

Mixing consoles are used in many applications, including recording studios, public
address systems, sound reinforcement systems, broadcasting, television, and film postproduction.
An example of a simple application would be to enable the signals that originated from
two separate microphones (each being used by vocalists singing a duet, perhaps) to be
heard through one set of speakers simultaneously. When used for live performances, the
signal produced by the mixer will usually be sent directly to an amplifier, unless that
particular mixer is "powered" or it is being connected to powered speakers.
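
Conceptually, that simple two-microphone application is just a weighted sum: each input
is scaled by its own fader gain, the results are added, and the mix is kept within the
valid sample range. The sketch below uses made-up sample values.

```python
# Minimal two-input mixer: per-channel gain, summing, and clipping to the valid range.
def mix(track_a, track_b, gain_a=0.8, gain_b=0.8):
    mixed = [a * gain_a + b * gain_b for a, b in zip(track_a, track_b)]
    return [max(-1.0, min(1.0, sample)) for sample in mixed]

# Two vocalists' microphone signals blended to one output:
vocal_1 = [0.10, 0.30, -0.20, 0.60]
vocal_2 = [0.05, -0.10, 0.40, 0.50]
print(mix(vocal_1, vocal_2))   # ≈ [0.12, 0.16, 0.16, 0.88]
```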

Broadcast Standards
Broadcasting is the sending of audio or video content to a wide audience using the
electromagnetic spectrum (radio waves), for a mass medium like TV or radio. It is a
classic example of the one-to-many model of communication.

Transmission of programs from a radio or television station to home receivers over the
spectrum is referred to as OTA (over the air) or terrestrial broadcasting and in most
countries requires a broadcasting license. Standardization is therefore required to make
the relay of messages possible; the standardization of formats across various set-ups
is essential.
The three most widely accepted broadcast standards are:

NTSC (National Television System Committee): the first color TV broadcast
system was implemented in the United States in 1953 and was based on the NTSC
standard. NTSC runs on 525 lines & 30 frames/second. It is used in the USA, Canada,
Japan and Latin America.

PAL (Phase Alternating Line): the PAL standard was introduced in the early 1960s and
implemented in most European countries except France. The PAL standard utilizes a wider
channel bandwidth than NTSC, which allows for better picture quality. Used almost
everywhere else in the world, it displays 625 lines of resolution with a frame rate of 25
frames per second.

SECAM (Séquentiel Couleur à Mémoire, or Sequential Colour with Memory): the SECAM
standard was introduced in the early 1960s and implemented in France. SECAM uses the
same bandwidth as PAL but transmits the colour information sequentially. SECAM runs on
625 lines & 25 frames/second. It is used sparingly around the world and can be found in
France, parts of Greece, Eastern Europe, Russia, Africa and a few other parts of the world.
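
For quick comparison, the figures from the three paragraphs above can be summarized as a
small lookup table (the frame rates are the nominal values quoted in this text):

```python
# Lines and nominal frame rates of the three analogue broadcast standards described above.
BROADCAST_STANDARDS = {
    "NTSC":  {"lines": 525, "fps": 30, "regions": "USA, Canada, Japan, Latin America"},
    "PAL":   {"lines": 625, "fps": 25, "regions": "most of Europe and much of the world"},
    "SECAM": {"lines": 625, "fps": 25, "regions": "France, Russia, parts of Africa"},
}

for name, spec in BROADCAST_STANDARDS.items():
    print(f"{name}: {spec['lines']} lines at {spec['fps']} fps ({spec['regions']})")
```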

The camera we use today to shoot our home videos, news or even films was not made in a day.
It took years of scientific research and development, first on the still camera, then on film, and,
with the advent of recording mediums, on video. Following are a few landmark dates in the
history of the video camera.

1888: Thomas Edison, the inventor of the light bulb, had another light bulb moment in
1888, when he filed a patent for 'kinetoscope', a device 'which does for eye what
phonograph does for the ear'.

1895: The Lumière brothers patented the cinematograph, a device for capturing,
developing and projecting moving images.

1912: Bell and Howell introduced the first all-metal camera after their wood and leather
camera was destroyed by termites.

1927: Philo Farnsworth's video camera tube converts images into electrical signals.

1983: With the advent of CCDs as image sensors, Sony introduces the first one-piece
video camcorder, Betamovie. But by this point the Betamax format is already losing the
war with VHS.

2001: Once Upon a Time in Mexico is the first mainstream movie filmed in
24-frame-per-second high-definition digital video.
2007: The RED One, the first 4K-resolution digital camera, revolutionizes
digital filmmaking.

2009: Slumdog Millionaire is the first film shot mostly in digital to win the
Academy Award for Best Cinematography

2012: Felix Baumgartner straps on five GoPro video cameras before his historic 24-mile
skydive. YouTube sets its live-stream record as more than 8 million tune in.

Professional video cameras, such as those used in television production, may be
television studio-based or mobile in the case of an electronic field production (EFP).

Camcorders combine a camera and a VCR or other recording device in one unit;
these are mobile, and were widely used for television production, home movies,
electronic news gathering (ENG) (including citizen journalism), and similar
applications.

Closed-circuit television (CCTV) generally uses pan-tilt-zoom (PTZ) cameras, for
security, surveillance, and/or monitoring purposes. Such cameras are designed to be
small, easily hidden, and able to operate unattended.

Webcams are video cameras which stream a live video feed to a computer.

Camera phones - nowadays most video cameras are incorporated into mobile
phones.

Special camera systems are used for scientific research, e.g. on board a satellite or
a space probe, in artificial intelligence and robotics research, and in medical use (for
example, the Hubble Space Telescope). Such cameras are often tuned for non-visible
radiation such as infrared (for night vision and heat sensing) or X-ray (for medical and
video astronomy use).

Parts of the Camera


All video cameras have a number of standard components (Internal & External),
including an imaging device (beam splitter and image sensor is located inside the video
camera), lens, viewfinder, microphone, camera control unit etc.

1. Camera lens: It consists of one or more pieces of glass that focus and frame an
image within the camera. The lens contains a focus ring, zoom ring and aperture control
ring that allow the camera operator to adjust the frame and light in accordance with the
subject.

2. Lens shade or hood: It protects the lens from picking up light distortions (called lens
flare) from the sun or a bright light and saves it from direct heat and dust when shooting
outdoors.

3. The power zoom switch: Located on the side of the lens, it allows the camera
operator to electronically zoom the lens. The speed of the zoom can be varied,
depending on the pressure applied to the switch.

4. Microphone: Most portable video cameras include a microphone intended for
environmental (natural) sound pickup. It may be built-in or removable. A foam sponge
cover over the microphone reduces low-pitched wind rumble. The camera may or may
not have sockets for more audio inputs.

5. Viewfinder: The viewfinder contains a small screen with a magnifying lens that
enlarges the image to be viewed by the camera operator. Depending on the camera, a
viewfinder can come in various shapes and sizes.

Viewfinder Types
The viewfinder of a camcorder can be a CRT, tube-type (like those used in most TV
sets), or a flat, LCD type (similar to those in laptop computers). CRT stands for cathode
ray tube; LCD for liquid crystal display. Unlike studio cameras that typically use at least
seven-inch displays, the viewfinders for camcorders must be much smaller. They
typically use a miniature CRT with a magnifying eyepiece, or, as shown below, a small
LCD screen.

LCD swing-out viewfinder. An increasing number of video cameras are fitted with a
foldout rectangular LCD screen (liquid crystal display), which is typically 2.5 to 3.5
inches wide and shows the shot in colour. It is lightweight and conveniently folds flat
against the camera body when out of use. However, stray light falling onto the screen
can degrade its image, making it more difficult for the camera operator to focus and to
judge the picture quality.

Image sensor
The image sensor is the camera device that converts the light arriving from the lens into electrical signals for each of the three colour channels (RGB); the camera's circuitry then recombines these into one colour signal containing both chrominance and luminance. This is the encoded colour signal, and when it is displayed on a monitor or receiver we see the camera shot in full colour.

The two principal components of the colour television signal are luminance and chrominance. Luminance refers to the black-and-white brightness information; every colour television signal contains a luminance signal as well as the colour information. Chrominance is the colour information and includes two components, hue and saturation. Hue refers to the colour itself (red, green, blue and so on), and saturation refers to the intensity of the colour.
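To make the relationship concrete, here is a small sketch (illustrative only, using the ITU-R BT.601 weights common in standard-definition video, not the wiring of any particular camera) of how a luminance signal and two colour-difference (chrominance) components can be derived from the red, green and blue channels:

# Sketch: deriving luminance (Y) and colour-difference (chrominance) signals
# from normalised R, G, B values using ITU-R BT.601 weights. Illustrative only.

def rgb_to_ycbcr(r, g, b):
    """r, g, b are in the range 0.0 to 1.0."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance (brightness)
    cb = (b - y) * 0.564                    # blue colour-difference
    cr = (r - y) * 0.713                    # red colour-difference
    return y, cb, cr

# A saturated red patch is fairly dark in luminance but strong in the red difference:
print(rgb_to_ycbcr(1.0, 0.0, 0.0))   # y is about 0.30, cb about -0.17, cr about 0.50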

Once the white light that enters the lens has been divided into the three primary colours,
each light beam must be translated into electrical signals. The principal electronic
component that converts light into electricity is called the imaging device or image
sensor*. These imaging devices are pickup tube, CCD (Charge-coupled device) or
CMOS (Complementary metal oxide semiconductor).

(The image sensor is an important component of the video camera. Its function is the conversion of light energy into electrical energy; for this reason the camera's image sensor is also called an 'optical video transducer'.)

All of the highest-quality colour cameras use three image sensors. These cameras produce the best picture because each colour is assigned to its own CCD or pickup tube, ensuring the greatest amount of control over each colour signal.

Pick up tube
In older video cameras, before the 1990s, a video camera tube or pickup tube was used instead of a charge-coupled device (CCD). Several types were in use from the 1930s to the 1980s. These tubes are a type of cathode ray tube.
The camera pickup tube in a television camera is a small vacuum tube, usually 3" or 4" long and 1/2", 2/3" or 1" in diameter. The tube has several main components. The tube itself is made of glass. Attached to the inside of the glass face of the tube is an extremely thin, transparent photosensitive coating. Next to this is a layer of photoconductive material. Immediately behind the photoconductive layer is the target, which has a slight positive electrical charge when the camera is turned on. Light from the scene being recorded is focused by the lens of the camera onto the face of the pickup tube. It passes through the glass and the photosensitive coating, onto the photoconductive layer. As light hits the photoconductive layer it causes the charge on the target to change in proportion to the relative intensity of the light.
The video signal is produced as the target is scanned by a beam of electrons. This beam of electrons is emitted by the electron gun at the back of the pickup tube. The electron beam is focused on the target plate and scans it in a series of horizontal lines. Each line of the picture is composed of about 500 dots, or bits of information. Bright spots on the photoconductive layer change the charge on the target, and when that particular spot is scanned by the electron beam a greater number of electrons pass through the target than pass through in places where the image on the photoconductive layer is darker. These changes in the charge on the target plate produce the video signal.
CCD (charge-coupled device)
A charge-coupled device (CCD) is a light-sensitive integrated circuit that stores and displays the data for an image in such a way that each pixel (picture element) in the image is converted into an electrical charge, the intensity of which is related to a colour in the colour spectrum.
A CCD normally contains hundreds of thousands or millions of image-sensing elements, called pixels (a word made up of pix, for picture, and els, for elements), that are arranged in horizontal and vertical rows. Pixels function very much like tiles that compose a complete mosaic image. A certain number of such elements is needed to produce a recognizable image.

An image is projected by a lens on the capacitor array (the photoactive region), causing
each capacitor to accumulate an electric charge proportional to the light intensity at that
location. Digital colour cameras generally use a Bayer mask over the CCD. Each
square of four pixels has one filtered red, one blue, and two green (the human eye is
more sensitive to green than either red or blue). The result of this is that luminance
information is collected at every pixel, but the colour resolution is lower than the
luminance resolution.
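The sketch below gives a rough feel for this arrangement (a simplification for illustration, not any manufacturer's actual readout): an RGGB Bayer mosaic keeps only one colour value per photosite, with half of the sites devoted to green.

import numpy as np

# Sketch of an RGGB Bayer mosaic: each photosite keeps only one of the three
# colour channels. Real single-sensor cameras follow this with a demosaicing
# (interpolation) step to rebuild full colour at every pixel.
def bayer_mosaic(rgb):
    """rgb: H x W x 3 array of floats. Returns an H x W single-channel mosaic."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red   on even rows, even columns
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green on even rows, odd columns
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green on odd rows, even columns
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue  on odd rows, odd columns
    return mosaic

frame = np.random.rand(4, 4, 3)   # a toy 4 x 4 "image"
print(bayer_mosaic(frame))        # half the samples come from the green channel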

CMOS (Complementary Metal Oxide Semiconductor)

Both CCD and CMOS image sensors capture


light using a grid of small photo sites on their
surfaces but CMOS chips traditionally have
been more susceptible to noise; however, the
technology in this area has significantly
improved.
Each one of these pixel sensors is called an
Active Pixel Sensor (APS). CMOS chips
consume as much as 100 times less power
than a CCD and cost less to produce. However,
CCDs have been used for a much longer
period of time and so are much more mature.
Historically, they tend to have much higher quality, but the CMOS has caught up.
CMOS image sensors can offer many advantages over traditional CCD sensors.

CCD vs CMOS

1. A CCD needs extra circuitry to convert its output to a digital signal, whereas a CMOS sensor can perform this conversion on the chip itself.
2. CCD sensors create high-quality, low-noise images; CMOS chips have traditionally been more susceptible to noise, although the technology has improved significantly.
3. CCDs use a process that consumes a lot of power, as much as 100 times more than an equivalent CMOS sensor; CMOS chips consume far less power and cost less to produce.
Video Resolution

Video resolution is a measure of the ability of a video camera to reproduce fine detail. The higher the resolution, the sharper the picture will look. The standard NTSC broadcast TV system can potentially produce a picture resolution equal to about 300 lines of horizontal resolution. (This is after it goes through the broadcast process; what you see in a TV control room can be much higher.) CATV, DVD and satellite transmissions as viewed on a home receiver can reach 400 lines of resolution.
Three to four hundred lines of resolution equal what viewers with 20-20 vision can see when they watch a TV screen at a normal viewing distance. "Normal" in this case translates into a viewing distance of about eight times the height of the TV picture. So, if the TV screen were 40 cm (16 inches) high, a so-called 25-inch (64-centimeter) picture, the normal viewing distance would be about 3 meters (10 feet).
Lines of resolution as measured by a test pattern, such as the one shown here, are not to be confused with the horizontal scanning lines in the broadcast TV process, typically 525 and 625, which we discussed earlier. Although most home TV sets are capable of only 300 or so lines of resolution (and that's on a good day!), TV cameras are capable of much higher resolutions, up to 1,000 lines or more.
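As a quick check of the viewing-distance rule of thumb quoted above (purely illustrative arithmetic):

# "Normal" viewing distance is taken as roughly eight times the picture height.
def normal_viewing_distance_cm(screen_height_cm):
    return 8 * screen_height_cm

print(normal_viewing_distance_cm(40) / 100, "metres")   # 3.2 m, roughly 10 feet, for a 40 cm high screen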

Camera Controls and adjustment


Television cameras are probably easier to operate than film or still cameras because
we can watch and control the camera output while recording. Most domestic
camcorders can do just about everything automatically. All you have to do is turn them
on, point, and press record. In most situations this is fine, but automatic functions have
some serious limitations.
If you want to improve your camera work, you must learn to take control of your camera.
This means using manual functions. In fact, professional cameras have very few
automatic functions, and professional camera operators would never normally use auto-
focus or auto-iris. There are few electronic controls, and the manual controls on the lens
will be familiar to anyone who has used a good still or motion picture camera. In fact, almost any camera can produce sharper, clearer pictures than can be recorded on the popular videotape formats.
Following are few of the camera controls:

Aperture/iris:
It is an adjustable opening, which controls the amount of light coming
through the lens (i.e. the "exposure"). The video camera iris works in basically the same
way as a still camera iris -- as you open the iris, more light comes in and the picture
appears brighter.

The difference is that with video cameras, the picture in the viewfinder changes
brightness as the iris is adjusted. Professional cameras have an iris ring on the lens
housing, which you turn clockwise to close and anticlockwise to open. Consumer-level
cameras usually use either a dial or a set of buttons. You will probably need to select
manual iris from the menu.

The Correct Exposure:


Before using your manual iris, you need to know what the
correct exposure looks like in your viewfinder (note: adjust the viewfinder settings to
begin with). Always set your iris so that the subject appears correctly exposed. This may
mean that other parts of the picture are too bright or too dark, but the subject is usually
more important.

Professional cameras have an additional feature called zebra stripes which can help
you to judge exposure. The stripes show the area in the frame which is over exposed.
As in still photography, the aperture scale typically starts at f/2 or f/2.4 and ranges up to f/32 or f/64, depending on the lens.
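The aperture numbers are not arbitrary: each full stop halves the light passing the iris, and the marked f-numbers advance by a factor of roughly the square root of two. A quick sketch of the series (illustrative only):

import math

# Standard full-stop series starting at f/2; each step halves the light.
stops = [2 * math.sqrt(2) ** i for i in range(8)]
print([round(s, 1) for s in stops])
# [2.0, 2.8, 4.0, 5.7, 8.0, 11.3, 16.0, 22.6]  (lenses mark the rounded values)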

Shutter speed: The term shutter comes from still photography, where it describes a
mechanical "door" between the camera lens and the film. When a photo is taken, the
door opens for an instant and the film is exposed to the incoming light. The speed at
which the shutter opens and closes can be varied — the faster the speed, the shorter
the period of time the shutter is open, and the less light falls on the film.

Shutter speed is measured in fractions of a second. A speed of 1/60 second means that
the shutter is open for one sixtieth of a second. A speed of 1/500 is faster, and 1/10000
is very fast indeed. Video camera shutters work quite differently from still film camera
shutters but the result is basically the same. The shutter "opens" and "closes" once for
each frame of video; that is, 25 times per second for PAL and 30 times per second for NTSC.

Focus:
The ability to manually focus your camera is a critical skill at any level of
video production. When the subject is zoomed in and focussed on, it is called sharp
focus. When it is not in sharp focus but is clear enough, it is called soft focus. When the focus shifts from one subject to another within the same shot, it is called shift focus or rack focus. The area of acceptable sharpness, in front of and behind the point of focus, is called depth of field. The depth of field increases as we close down the aperture and decreases when we open it up. This means that the depth of field at f/22 will be greater than that at f/2.
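As a rough numerical illustration of that last claim, the sketch below uses the standard hyperfocal-distance approximation for depth of field. The 50 mm focal length, 3 m subject distance and 0.03 mm circle of confusion are assumed example values, not figures from these notes:

# Rough depth-of-field estimate via the hyperfocal-distance approximation.
def depth_of_field_m(focal_mm, f_number, subject_m, coc_mm=0.03):
    f = focal_mm
    s = subject_m * 1000.0                                # work in millimetres
    hyperfocal = f * f / (f_number * coc_mm) + f
    near = hyperfocal * s / (hyperfocal + (s - f))
    if hyperfocal <= (s - f):
        return float("inf")                               # acceptable sharpness extends to infinity
    far = hyperfocal * s / (hyperfocal - (s - f))
    return (far - near) / 1000.0                          # back to metres

print(depth_of_field_m(50, 2, 3))    # about 0.4 m of acceptable focus at f/2
print(depth_of_field_m(50, 22, 3))   # about 11 m of acceptable focus at f/22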

Zoom
The zoom is the function which moves your point of view closer to, or further away from, the subject by narrowing or widening the angle of view. The two most common zoom mechanisms are shown below:

Manual zoom (ring)

This is a zoom ring on the lens housing which is rotated


manually, typically by the left thumb and index finger.
Advantages: Speed (you can do super-fast zooms); doesn't require power (so no drain
on your battery).
Disadvantages: More difficult to control; harder to get smooth zooms.

Servo zoom (lever)


This is a lever which sits on the lens housing. It's usually
positioned so that when you slide your right hand into the grip belt, the servo zoom will
be sitting under your first two fingers. Pressing the front part of the lever zooms in,
pressing the rear part zooms out. Cheaper cameras usually have a constant zoom
speed, whereas a good servo zoom will have variable speed -- the further you depress
the lever, the faster the zoom. The lever may have labels such as T and W (tele and
wide).
Advantages: Easy to use in most situations; nice smooth zooms.
Disadvantages: Uses battery power; may be limited to fixed speeds.
There's an important characteristic of zoom lenses that you should be aware of: The
further you zoom in, the more difficult it is to keep the picture steady. At very long
zooms, a tripod is essential.
As a rule, don't zoom unless there is a reason to. If you want to show both the whole scene and some close-up details, you don't need a zoom in the recording. Instead, shoot a wide shot, stop recording, zoom in to a close-up, then start recording again. The result is one shot which cuts cleanly and quickly to another, portraying the same information as a zoom, but more efficiently.

Digital Zoom vs Optical Zoom


There are two types of zoom on a video camera — digital zoom and optical zoom. A
camera can have either or both types. The two different types are very different and the
unwary buyer can get caught out badly by not understanding how they work.

Digital Zoom
This is often trumpeted as a big selling point by manufacturers. It's common to see a
large "150X" emblazoned on the side of a camcorder. Video stores are full of naive
customers comparing the digital zoom of different cameras.
Digital zoom works by magnifying a part of the captured image using digital
manipulation. This is the same as how a graphics program resizes an image to a larger
size. The process involves taking a certain number of pixels and creating a larger
image, but because the new image is based on the same number of pixels, the image
loses quality. At small zooms (up to 20x) the loss may not be too noticeable. At large
zooms (up to 100x or more) the quality becomes absolutely terrible.
Remember that digital zoom can be done in post-production with any half-decent editing
software, so you really gain nothing by having the camera do it.
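The arithmetic behind that advice is simple: the effective pixel count falls with the square of the digital zoom factor, because the camera is only enlarging a crop of the sensor image. A hedged illustration, assuming a 1920 x 1080 sensor output:

# Effective resolution left after cropping for a given digital zoom factor.
def effective_pixels(width, height, digital_zoom):
    return (width // digital_zoom) * (height // digital_zoom)

for zoom in (2, 10, 100):
    print(f"{zoom}x digital zoom keeps about {effective_pixels(1920, 1080, zoom)} real pixels of {1920 * 1080}")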

Optical Zoom
This is the zoom spec which matters. Optical zoom is provided by the lens (i.e. the
optics) and does not lose image quality. The zoom is provided by a telephoto lens: it works by narrowing the field of view (decreasing the angle of view) while projecting the image onto the same size of sensor.

White balance:
The colour of the light depends on the source, but our eyes adjust
accordingly and see the true colour in any light. The camera on the other hand needs to
be given the reference of white in all lights. This process is called white balance. It's a
function which tells the camera what each colour should look like, by giving it a "true
white" reference. If the camera knows what white looks like, then it will know what all
other colors look like.

This function is normally done automatically by consumer-level cameras without the


operator even being aware of its existence. It actually works very well in most
situations, but there will be some conditions that the auto-white won't like. In these
situations the colors will seem wrong or unnatural.
To perform a white balance:

a) Either use the camera's colour temperature presets, through which one can choose the colour temperature one is shooting at.
b) Or use the second, manual method: point the camera at something matt (non-reflective)
white in the same light as the subject, and frame it so that most or all of the picture is
white. Set your focus and exposure, then press the "white balance" button (or throw the
switch). There should be some indicator in the viewfinder which tells you when the white
balance has completed.
(Example frames: correct white balance; balanced for tungsten; balanced for daylight.)
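Manufacturers implement auto-white in different ways; as a minimal sketch of the general idea (the textbook "grey world" method, not the circuitry of any particular camera), the channels can be scaled so that, on average, the scene comes out neutral:

import numpy as np

# Minimal "grey world" white balance: scale R, G and B so their averages match.
def grey_world_balance(image):
    """image: H x W x 3 float array with values between 0 and 1."""
    means = image.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means                  # push each channel average
    return np.clip(image * gains, 0.0, 1.0)       # toward the overall average

# A frame with a warm (tungsten-looking) cast: too much red, too little blue.
warm_frame = np.dstack([np.full((2, 2), v) for v in (0.8, 0.6, 0.4)])
print(grey_world_balance(warm_frame))             # channels pulled back toward grey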

Camera Filter Wheels


As we’ve noted, professional video cameras
have one or two filter wheels located behind
the lens that can hold a number of filters.
Individual filters on each wheel can be rotated
into the lens light path as needed. Note in the
photo on the right that the camera has two
filter wheels.
One labeled 1 - 4, and one A - D. Each wheel
can be turned to rotate into position the
various options noted on the right of the
photo. For example 2-B would be a 1/4 ND
(neutral density) filter, along with a 3,200K
color correction filter. Filter wheels might also
contain -

• A fluorescent light filter, which can reduce the blue-green effect of fluorescent lights
• One or more special effects filters, including the previously discussed star filter
• An opaque "lens cap," which blocks all light going through the lens
Although the filters shown here are located behind the lens, it should be noted that some filters, such as polarizing filters, must be mounted in front of the camera lens to be most effective.

Aspect Ratio
An image's Aspect Ratio, or AR, represents a
comparison of its width to height. Notation for Aspect
Ratio is normally in the form of X:Y, where X
represents screen width and Y represents height.
A standard analog TV has an AR of 4:3, which means that for every 4 units of width it is 3 units high. With the recent research and development of high-definition television (HDTV), the HDTV system differs from the existing conventional television system in this respect: HDTV uses the wider 16:9 format.

So the framing of the image has to be done keeping in mind the aspect ratio of the broadcast standard or output medium.

Camera Safe Areas:

Because of overscanning and other types of image loss between the camera and the
home receiver, an area around the sides of the TV camera image is cut off before being
seen. To compensate for this, directors must assume that about ten percent of the
viewfinder picture may not be visible on the home receiver.
This area (framed by the lines in the photo) is referred to by various names including
safe area and essential area. All written material should be confined to an “even safer”
area, the safe title area (the area inside the blue frame).
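For illustration, the sketch below marks out centred safe rectangles. The ten percent loss is the figure quoted above; the tighter 80 percent box used here for the safe title area is an assumed example value, not a number taken from these notes:

# Centred "safe" rectangles inside a frame of a given size.
def centred_box(width, height, fraction):
    w, h = int(width * fraction), int(height * fraction)
    x, y = (width - w) // 2, (height - h) // 2
    return x, y, w, h                      # top-left corner plus box size

print("safe (action) area:", centred_box(720, 576, 0.90))   # e.g. a PAL-sized frame
print("safe title area:  ", centred_box(720, 576, 0.80))    # assumed tighter margin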

Types of Video Camera


Modern video cameras are available in a number of different configurations, shapes and
sizes that suit all kinds of different situations. They range from units that fit in a pocket to
cameras that are so heavy that it can take a couple of people to lift them. Broadly, television cameras can be divided into the following types.

Studio cameras:
Most television studio cameras stand on the floor and are usually on wheels; the wheeled tripod is called a dolly. Any video camera used along with other video cameras in a multiple-camera setup is controlled by a device known as a CCU (camera control unit) in the production control room (PCR). Studio cameras are bulky and have no recording compartments, as they do not need to be taken out into the field.

ENG (Electronic news gathering): ENG cameras are larger and heavier (helps
dampen small movements), and usually supported by a camera shoulder support or
shoulder stock on the camera operator's shoulder, taking the weight off the hand, which
is freed to operate the zoom lens control.
The lens is focused manually and directly, without intermediate servo controls. However, the lens zoom and focus can also be operated with remote controls in a television studio configuration, run from a camera control unit (CCU), as in the case of an outside broadcast.

EFP(Electronic Field Production):


These versatile cameras can be carried on the
shoulder, or mounted on camera pedestals and cranes, with the large, very long focal
length zoom lenses made for studio camera mounting. These cameras usually have
recording abilities, so that the footage gathered can be cut on the edit table.

Others: Remote cameras are used for scientific research and in areas where human reach is usually not possible; CCTV is used for surveillance; and lipstick cameras or pen cameras are used for conducting sting operations.

Camera Movement Equipment


Pictures that are shaky, bounce around, or lean over to one side are a pain to watch, so it is worth taking that extra care to make sure that camera shots are steady and carefully controlled. Camera placement and movements usually require the use of specific camera mounting devices in order to record a steady image. Mounting devices for video cameras range from pistol grips to cranes. Using a camera support will reduce fatigue and, above all, prevent unnecessary and distracting camera motion.

Studio Camera Mount


In the studio the entire camera assembly is
generally mounted on a pedestal or dolly
(shown here) so it can be smoothly rolled
around the studio floor. There are three
wheels in the base of the pedestal that can be
turned using the steering ring.
The camera is directly attached to a pan head
so that the pan and tilt (horizontal and vertical)
camera movements can be adjusted. There
are several types of pan heads; one type, the
cam head, is illustrated here. Controls on the
pan head allow the camera to freely move, be locked into position, or to offer a bit of
resistance to facilitate smooth pans and tilts.
The camera can be raised and lowered by unlocking the telescoping column. Although the camera may weigh more than 100 pounds, there are internal counterweights that allow an operator to do this without significant effort.

Dollies
A dolly is a camera platform or support device on wheels, which allows the camera to move smoothly. Although it is widely used in smaller studios, the rolling tripod dolly does not lend itself to subtle camerawork; camera moves tend to be approximate. Camera height is adjusted by resetting the heights of the tripod legs, so height changes while shooting are not practical unless a jib is attached to the dolly or the dolly base itself is adjustable.
This type of mount is used for remote productions and in some small studios. Unlike the elaborate studio pedestal, which can be smoothly rolled across a studio floor even while the camera is on the air, the wheels on small dollies are intended primarily to move the camera from place to place between shots.

Slider Mounts
Sliders are handy extensions which can be added either to tripods or placed directly on the floor. As the name suggests, the equipment is used to give a slight sliding effect, either right to left or front to back. Because of the short length, the effect cannot be sustained for a long interval of time.

Camera Jibs
A device that has come into wide use in the last decade
is the camera jib, which is essentially a long, highly
maneuverable boom or crane-like device with a
mounted camera at the end. You will frequently see
them in action swinging over large crowds at concerts
and events with large audiences.
A jib allows sweeping camera movements from ground
level to 9 or more meters (30 or more feet) in the air.

Camera Tracks
For elaborate productions camera tracks may be
installed that allow the camera to smoothly follow
talent or move through a scene. Although a camera
operator can ride with the camera (as shown here),
some cameras are remotely controlled.

Spider Cam
Looking like a giant spider with a TV camera in its nose, the
spider cam shown here moves along its web and can
provide aerial views of various sporting events. The whole
unit is remotely controlled by a ground observer. The video
is relayed to the production van by microwave.

Handheld Camera:
In handheld operation the camera operator holds the camera by hand, usually because the camera has to be mobile and able to change positions quickly. This
method is most commonly used by news crews, for
documentaries, at sports events, or for shooting music
videos. In all of these situations, the camera generally
needs to move around to follow the action.
The operator places his or her right hand through a support loop on the side of the lens.
This way, the operator’s fingers are free to control the zoom rocker (servo zoom) switch
while the thumb presses the record/pause switch. The camera operator’s left hand
adjusts the manual zoom ring, the focusing ring, and the lens aperture.

Robotic Camera Mounts


Camera operators are disappearing at many stations, and even on many network programs. They are being
replaced by remotely controlled camera systems such as
the one shown on the left. From the TV control room,
pan, tilt, zoom, and focus adjustments can be made.
Many of these robotic cameras can also be remotely
dollied and trucked around the studio.
Although robotic cameras are not desirable for
unpredictable or fast-moving subject matter, for programs
such as newscasts and interviews (where operating cameras can get pretty boring
anyway) they can significantly reduce production expenses.

Shoulder Mount or Brace:


One of the most common ways to support a portable
camera or camcorder is to carry it. Some cameras contain
braces that can be changed to accommodate right handed
and left handed operators. In addition well designed
cameras take camera balance into account- the camera
should rest evenly on the operators shoulder.
The shoulder mount gives the camera operator the greatest
amount of flexibility in the movement of the camera. The
camera operator can walk easily with the camera, and all
types of lateral or vertical camera movement are possible.
Steadicam:
The perfect compromise between mobility and stability appears to have been achieved by the mounting device known as the Steadicam. The Steadicam can be used with a film camera, and a number of different models are available depending on the weight of the camera; the models include an ENG/EFP version for use with professional video cameras. With a Steadicam the camera usually rides at about the operator's waist and is isolated from the movements of the operator's body.

Monopod:
A monopod is a telescoping rod that is attached to the base of the camera. The monopod is an easily carried, lightweight mounting: it consists of a collapsible metal tube of adjustable length that screws into the camera base. This extendable tube can be set to any convenient length. Braced against a knee, foot, or leg, the monopod can provide a firm support for the camera yet allow the operator to move it around rapidly for a new viewpoint. Its main disadvantage is that it is easy to accidentally lean the camera sideways and get sloping horizons.

Tripod:
A tripod offers a compact, convenient method of holding a camera steady, provided it is used properly. It has three legs of independently adjustable length that are spread apart to provide a stable base for the camera. Tripod heads come in two types: friction heads and fluid heads. Friction heads, the less expensive of the two types, give fair control over camera panning (left to right) and tilting (up and down).
Professional camera operators always use tripods with fluid heads. These are more expensive than friction heads but are designed so that it is virtually impossible to make a jerky horizontal or vertical camera movement. For very smooth, solid camera operation, the tripod-mounted fluid head is the usual choice.

Tripod spreader is used to spread the tripod legs


out to their widest stance and hold them firmly in
that position.

Tripod dolly is a wheeled base that can be attached to the tripod legs so that the tripod can be moved when required.
Camera Accessories

Lenses – Functions and Types & Controls


A lens is a curved piece of glass that causes light rays to bend, because glass is denser than air. Lenses bend light so that it can be controlled and projected, in proper focus and size, at a specific point behind the lens where a light-sensitive material can record or transmit the image. A camera lens consists of one or more pieces of glass that focus and frame an image within the camera. Simple single lenses fall into two basic categories: concave and convex.

1. Concave lenses, which are thinner at the center than at the edges, bend light rays away from the center of the lens.

2. Convex lenses are thickest at the center and bend light toward the center of the lens.

3. Compound lens: Modern film and video camera lenses are composed of more than one piece of glass and are called compound lenses. Compound lenses combine several concave and convex lenses.

The two basic features of camera lenses are their focal length and f-stop ratings.


Focal length is the distance between the centre of the lens and the focal point. The f-stop is the ratio of the focal length to the diameter of the lens opening (aperture).
Focal length determines the "taking angle" of the lens. Only lenses with variable focal length can perform the zooming function; lenses with a fixed focal length are called prime lenses.
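As a quick worked example of that ratio (the focal length and opening sizes below are made-up round numbers):

# f-stop = focal length / diameter of the lens opening.
def f_number(focal_length_mm, aperture_diameter_mm):
    return focal_length_mm / aperture_diameter_mm

print(f_number(50, 25))     # a 50 mm lens with a 25 mm opening is working at f/2
print(f_number(50, 3.125))  # stopped down to about a 3 mm opening it is at f/16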
(Figure: focal length variation and the resulting angle of view.)
Types of lenses based on focal length

Normal lens: A normal lens has an angle of view corresponding to that of the human eye, achieved at around 50-55 mm.

Wide angle lens: The angle of view is greater than that of the human eye, achieved at less than 50 mm. Wide angle lenses are good for shooting scenery and creating an illusion of space. A good wide angle zoom will cover a focal range from about 16 mm to 50 mm; below 16 mm, the image starts getting distorted.

Telephoto lens: The angle of view is smaller than that of the human eye, achieved at focal lengths greater than 55 mm. Telephoto zoom lenses are great for sports enthusiasts and nature photographers. They allow you to get close to your subject from a safe distance. A typical telephoto zoom will offer a focal length range of 75 mm to 300 mm.

Macro lenses :open up the world of close-up photography, and are a great addition to
your equipment. Focussing is critical when working in close-up photography, so mount
the camera on a tripod and switch to manual focus for the best results.
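The link between focal length and angle of view can be sketched with the usual thin-lens relation. The 36 mm frame width assumed below is the full-frame stills format; the smaller sensors in most video cameras give a narrower angle at the same focal length:

import math

# Horizontal angle of view: AOV = 2 * atan(sensor_width / (2 * focal_length)).
def angle_of_view_deg(focal_mm, sensor_width_mm=36.0):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for f in (16, 50, 300):
    print(f"{f} mm lens -> about {round(angle_of_view_deg(f))} degrees")
# 16 mm is wide (about 97 degrees), 50 mm is close to the eye's comfortable view
# (about 40 degrees), and 300 mm is a narrow telephoto view (about 7 degrees).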
Lens controls:

FOCUS: The focus control is the ring farthest from the camera body, on the front of the
lens. Distance settings are marked in meters and in feet. While a non-zoom (fixed focal
length) lens is focused simply by turning the ring until the image is sharp, the zoom lens
must be zoomed in to the smallest angle of view and the largest image size to adjust
focus.

Zoom: The center ring on most lenses is the zoom control. Most cameras use a rocker
switch beside the lens. This allows you to change the focal length of the lens through a
range from wide angle (short focal length) to telephoto (long focal length).

IRIS: The ring closest to the camera body controls the amount of light passing through
the lens to the light-sensitive surface of the pickup tube or chip. It is called the iris,
aperture, or f-stop control and is marked off in f-numbers. The lowest f-stop lets in the
most light, and the highest f-stop lets in the least.

Filters
Glass filters consist of a transparent colored gel sandwiched between two precisely
ground and sometimes coated pieces of glass. Filters can be placed in a circular holder
that screws over the end of the camera lens, (as shown here) or inserted into a filter
wheel behind the camera lens (to be discussed later). A type of filter that’s much
cheaper than a glass filter is the gel. A gel is a small, square or rectangular sheet of optical
plastic used in front of the lens in conjunction with a matte box, which will be illustrated
later. Among professional videographers, these filter types are referred to as round
filters and rectangular filters. There are many types of filters. We’ll only cover the most
commonly used types.

Ultraviolet Filters
News photographers often put an ultraviolet filter (UV filter) over the camera lens to
protect it from the adverse conditions encountered in ENG (electronic newsgathering)
work. A damaged filter is much cheaper to replace than a lens. Protection of this type is
particularly important when the camera is used in high winds where dirt or sleet can be
blown into the lens. By screening out ultraviolet light, the filter also slightly enhances
image color and contrast and reduces haze in distant scenes. In so doing, the filter also
brings the scene being photographed more in line with what the eye sees. Video
cameras also tend to be more sensitive to ultra-violet light, which can add a kind of haze
to some scenes. Since UV filters screen out ultra-violet light while not appreciably
affecting colors, many videographers keep an ultraviolet filter permanently over the lens
of their camera.

Using Filters for Major Color Shifts

Although general color correction in a video camera is done through the combination of
optical and electronic camera adjustments, it’s sometimes desirable to introduce a
strong, dominant color into a scene. For example, when a scene called for a segment
shot in a photographic darkroom, one camera operator simulated a red darkroom
safelight by placing a dark red glass filter over the camera lens. (Darkrooms haven’t
used red filter safelights to print pictures for decades, but since most audiences still
think they do, directors feel they must support the myth.) If the camera has an internal
white balance sensor, the camera must be color balanced before the filter is placed over
the lens or else the white balance system will try to cancel out the effect of the colored
filter.

Neutral Density Filters


Sometimes it’s desirable to control the amount of light passing through a lens without
stopping down the iris (moving to a higher f-stop number). Under bright sunlight
conditions you may want to keep a relatively wide f-stop and use selective focus to
reduce depth of field. Using this technique you can throw distracting objects in the
background and foreground out of focus.
Although using a higher shutter speed is normally the best solution (we’ll get to that
later), the use of a neutral density or ND filter will achieve the same result. A neutral
density filter is a gray filter that reduces light by one or more f-stops without affecting
color. Professional video cameras normally have one or more neutral density filters
included in their internal filter wheels. To select a filter you simply rotate it into position
behind the lens. The table below shows ND filter grades and the amount of light they
subtract.

0.3 ND filter - 1 f-stop


0.6 ND filter - 2 f-stops
0.9 ND filter - 3 f-stops
1.2 ND filter - 4 f-stops
Even on a bright day a 1.2 ND filter will force the camera iris open enough to allow for
creative selective focus effects.
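The grades in the table follow directly from optical density: the transmitted light is 10 to the power of minus the density, and every 0.3 of density is almost exactly one f-stop (a halving of the light). A quick check:

import math

# Relating ND optical density to transmitted light and f-stops.
for density in (0.3, 0.6, 0.9, 1.2):
    transmitted = 10 ** (-density)
    stops = -math.log2(transmitted)
    print(f"ND {density}: passes {transmitted:.0%} of the light, about {stops:.1f} stops")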

Polarizing Filters
Most people are familiar with the effect that polarized sunglasses have on reducing
reflections and cutting down glare. Unlike sunglasses, the effect of professional
polarizing filters can be continuously varied—and, as a result, go much farther in their
effect. Polarizing filters can:
• Reduce glare and reflections
• Deepen blue skies
• Penetrate haze
• Saturate (intensify) colors
Once its many applications are understood, a polarizing filter can become a
videographer's most valuable filter. Simple polarizing filters need between one and two extra f-stops of exposure. The effect of some professional polarizing filters is
adjustable. This is done by rotating the orientation of two polarizing filters used next to
each other in a single housing. When doing critical copy work, such as photographing
paintings with a shiny surface, polarizing filters can be used over the lights, as well as
the camera lens. By rotating one or more of these filters, all objectionable reflections
can be eliminated.

Contrast Control Filters


Although the latest generation of professional video cameras is capable of capturing
contrast or brightness ranges up to 700:1—far in excess of standard motion picture film
stocks— home television sets and viewing conditions limit that range to not more than about 30:1. This means that the brightest element in a scene can't be more than 30 times
brighter than the darkest element with any hope of seeing the detail in each. (Digital
receivers do considerably better, but until everyone has a digital set, we’ll need to “play
it safe.”) “Real world scenes” often contain collections of elements that exceed the 30:1
brightness range. Although we might be able to control this with lighting, etc., in the
studio, things become a bit more challenging outside. For critical exterior scenes the
professional videographer must often consider ways to reduce the brightness range.
One way is with the use of a contrast control filter. There are actually three types of
these filters—low contrast, soft contrast, and the Tiffen Ultra Contrast. The latter filter
seems to affect sharpness the least and result in the least amount of highlight or
shadow flare—assuming that’s desirable. All of these filters, together with various fog
and mist filters, are used to simulate the “film look” in video.
Filters For “the Film Look”
Compared to film, some people feel that some digital video can look overly sharp, harsh
and even brassy. At the same time, many people feel that video is a unique medium,
and it should not try to take on the characteristics of film. Even so, some Directors of
Photography feel that people are more comfortable with the softer "film look".

Day-For-Night
A common special effect—especially in the days of black and white film and television—
was the “night scene” shot in broad daylight—so-called day-for-night. In the black and
white days a deep red filter could be used over the lens to turn blue skies dark—even
black. (Recall the effect that filters have on color that we covered earlier.)
Although not quite as easy to pull off in color, the effect is now created by
underexposing the camera by at least two f-stops and either using a blue filter, or
creating a bluish effect when you white balance your camera. A contrast control filter
and careful control of lighting (including avoiding the sky in scenes) adds to the effect.
To make the “night” effect more convincing other filtering embellishments can be added
during postproduction editing. With the sensitivity of professional cameras now down to
a few foot-candles (a few dozen lux), “night-for-night” scenes are within reach.
Whatever approach you use, some experimentation will be needed using a high-quality
color monitor as a reference.

Color Conversion Filters


Color conversion filters are designed to correct the sizable shift in color temperature
between incandescent light and sunlight— a shift of about 2,000K. Although the
differences in color temperatures of different light sources will make more sense after
we examine the nature of light in a later module, since we are covering filters, we would
be remiss if we didn’t introduce them. Although minor color balancing is done
electronically in professional cameras, major shifts are best handled by colored filters.
There are two series of filters in this category, the Wratten #80 series, which are blue
and convert incandescent light to the color temperature of sunlight, and the Wratten #85 series, which are amber and convert daylight to the color temperature of tungsten light.
Since video cameras are optimized for one color temperature, these filters are used—
generally in a filter wheel, and often in conjunction with a ND filter—to make the
necessary “ballpark” adjustment. The rest is done electronically when the camera is
color balanced through the filter.

Filters for Fluorescent Light


Some lighting sources are difficult to correct. A prime example, and one that
videographers frequently run into, is fluorescent light. These lights seem to be
everywhere, and they can be a problem. Although in recent years camera
manufacturers have tried to compensate for the greenish cast that fluorescent lights can
create, when it comes to such things as getting true-to-life skin tones—and assuming
you can’t turn off the lights and set up your own incandescent lights—you might need to
experiment with a fluorescent light filter. We say “experiment” because there are dozens
of fluorescent tubes, each with different characteristics. But one characteristic they all
have is a “broken spectrum,” or gaps in the range of colors they emit. The eye can
(more or less) compensate for this when it views things first-hand, but film and video
cameras typically have problems.

Even so, there are other sources of light that are even worse—in particular the metal
halide lights often used in gymnasiums and for street lighting. Although the public has
gotten used to these lighting aberrations in news and documentary footage, when it
comes to such things as drama or commercials it’s generally a different story. As we will
see, there are some color balanced fluorescent lamps that are not a problem, because
they have been especially designed for TV and film work. But, don’t expect to find them
in schools, offices, or board rooms.

Special Effect Filters


Although there are scores of special effect filters available, we’ll just highlight four of the
most popular: the star filter, the starburst filter, the diffusion or soft focus filter, and the
fog filter.

Star Filters
You’ve undoubtedly seen scenes in which “fingers of light” projected out from the sides
of shiny objects—especially bright lights. This effect is created with a glass star filter that
has a microscopic grid of crossing parallel lines cut into its surface. Notice in the picture
on the right that the four-point star filter used slightly softens and diffuses the image.
Star filters can produce four, five, six, or eight-point stars, depending on the lines
engraved on the surface of the glass. The star effect varies with the f-stop used.

A starburst filter (on the left, below) adds color to the diverging rays. Both star filters
and starburst filters slightly reduce the overall sharpness of the image, which may or
may not be desirable.

Soft Focus and Diffusion Filters - Sometimes you may want to create a dreamy, soft
focus effect. By using a soft focus filter or a diffusion filter (on the right, above) you can
do this. These filters, which are available in various levels of intensity, were regularly
used in the early cinema to give starlets a soft, dreamy appearance (while hiding signs
of aging).
A similar effect can be achieved by shooting through fine screen wire placed close to
the lens, or by shooting through a single thickness of a nylon stocking. The f-stop used
will greatly affect the level of diffusion. In the case of soft focus filters or diffusion
materials it’s important to white balance your camera with these items in place.

Fog Filters - A certain amount of “atmosphere” can be added to dramatic locations by


suggesting a foggy morning or evening. Without having to rely on nature or artificial fog
machines, fog filters can create somewhat of the same effect.

General Considerations in Using Filters

Whenever a filter is used with a video camera the black level of the video is raised
slightly. This can create a slight graying effect. Because of this, it’s advisable to readjust
camera setup or black level (either automatically or manually) whenever a filter is used.
Unlike electronic special effects created during postproduction, the optical effects
created by filters during the taping of a scene can’t be undone. To make sure there are
no unpleasant surprises, it’s best to carefully check the results on location with the help
of a high-quality color monitor.
3. Shoot with different filters and submit the footage on a DVD.

Glossary of Terms

1. Lens: A piece of glass or other transparent substance with curved sides for
concentrating or dispersing light rays. Camera lens is a lens that focuses the image in a
camera.

2. Viewfinder: A device on a camera showing the field of view of the lens, used in
framing and focusing the picture.

3. Microphone: An instrument for converting sound waves into electrical energy


variations, which may then be amplified.

4. Aspect Ratio: An image's Aspect Ratio, or AR, represents a comparison of its


width to height.

5. Iris/Aperture: The adjustable opening—or f-stop—of a lens determines how


much light passes through the lens on its way to the film plane, or nowadays, to the
surface of the camera's imaging sensor. “Faster” lenses have wider apertures, which in
turn allow for faster shutter speeds. The wider the aperture is set, the shallower the
depth of field of the image.

6. Aspect Ratio: Aspect ratio refers to the shape, or format, of the image produced
by a camera. The formula is derived by dividing the width of the image by its height. The
aspect ratio of a 35mm image is 3:2. Most computer monitors and digicams have a 4:3
aspect ratio. Many digital cameras offer the option of switching between 4:3, 2:3 or 16:9.
7. AWB (Auto White Balance): An in-camera function that automatically adjusts
the white balance (color balance) of the scene to a neutral setting regardless of the
color characteristics of the ambient light source.

8. 720p: Shorthand term used to describe HD video recorded with 720 horizontal scan lines, displayed progressively. Measuring 1280 x 720 (921,600 pixels), 720p video
recordings broadcast at 60 frames per second match the highest temporal resolution
levels for ATSC and DVB standards.

9. 1080p: Also known as "Full-HD," 1080p is a shorthand term for video recorded at 1920 pixels of horizontal resolution and 1080 lines of vertical resolution, and is optimized
for 16:9 format playback. The “p” stands for progressive, which means all of the data is
contained in each frame as opposed to “interlaced” (i), in which the image data is split
between two frames in alternating lines of image data.

10. CCD: A semiconductor device that converts optical images into electronic
signals. CCDs contain rows and columns of ultra small, light-sensitive mechanisms
(pixels) that, when electronically charged and exposed to light, generate electronic
pulses that work in conjunction with millions of surrounding pixels to collectively produce
a photographic image. CCDs and CMOS (Complementary Metal Oxide Semiconductor)
sensors are the dominant technologies for digital imaging.

11. CMOS (Complementary Metal Oxide Semiconductor): A type of Imaging Sensor.


CMOS chips are less energy consuming than CCD-type sensors and are the dominant
imaging technology used in DSLRs. Though once considered an inferior technology
compared to CCD sensors, CMOS sensors have vastly improved and are now the
dominant sensor technology.

12. Color Temperature: A linear scale for measuring the color of ambient light with
warm (yellow) light measured in lower numbers and cool (blue) light measured in higher
numbers. Measured in terms of "degrees Kelvin," daylight (midday) is approximately 5600 degrees Kelvin, a candle flame is approximately 1,800 degrees, an incandescent lamp is
approximately 2800 degrees, a photoflood lamp is 3200 to 3400 degrees, and a midday
blue sky is approximately 10,000-degrees Kelvin.

13. Depth of Field (DOF): Literally, the measure of how much of the background
and foreground area before and beyond your subject is in focus. Depth of field is
increased by stopping the lens down to smaller apertures. Conversely, opening the lens
to a wider aperture narrows the depth of field.

14. Digital Zoom: Unlike an optical zoom, which is an optically lossless function of
the camera’s zoom lens, digital zoom takes the central portion of a digital image and
crops into it to achieve the effect of a zoom. This means that the existing data is not
enhanced or added to, merely displayed at a lower resolution, thereby giving an illusion
of an enlarged image.

15. Electronic Viewfinder (EVF): An electronic viewfinder digitally replicates the


field of view of the area captured by the camera lens. While once considered a poor
replacement for optical viewfinders, newer EVFs containing a million-plus pixels and
faster refresh times have become quite accurate, and in many cases, approach the
clarity levels of optical finders. An advantage of EVFs is their ability to display exposure
data and grids on demand.

16. Follow Focus: A follow focus is a focus-control mechanism used in filmmaking


(with film cameras) and in television production (with professional video cameras).
There are now follow-focus units that have been designed for use with HDSLR cameras
that are used to capture video footage.

17. Gain: Gain refers to the relationship between the input signal and the output
signal of any electronic system. Higher levels of gain amplify the signal, resulting in
greater levels of brightness and contrast. Lower levels of gain will darken the image,
and soften the contrast. Effectively, gain adjustment affects the sensitivity to light of the
CCD or CMOS sensor. In a digital camera, this concept is analogous to the ISO or ASA
ratings of silver-halide films.

18. Interlaced Scan: Interlaced video is a commonly used video capture technique in which the imagery consists of two fields of data, captured half a frame apart and
played back in a manner that reproduces motion in a natural, flicker-free form that takes
up less storage capacity than progressively captured video.

19. Optical Zoom: Another name for a zoom lens, which is a lens that enables you
to change the magnification ratio, i.e., focal length of the lens by either pushing, pulling
or rotating the lens barrel. Unlike variable focal length lenses, zooms are constructed to
allow a continuously variable focal length, without disturbing focus.

20. Viewfinder: System used for composing and focusing the subject being
photographed. Aside from the more traditional rangefinder and reflex viewfinders, many
compact digital cameras utilize LCD screens in place of a conventional viewfinder as a
method of reducing the size (and number of parts) of the camera. Electronic viewfinders
(EVFs) have become increasingly better in recent years and are slowly finding their way
into traditional DSLRs.

21. White Balance: The camera's ability to correct color cast or tint under different
lighting conditions including daylight, indoor, fluorescent lighting and electronic flash.
Also known as “WB,” many cameras offer an Auto WB mode that is usually—but not
always—pretty accurate.
Dewesh Pandey
Asst Prof BAJMC TIAS
GGSIPU DELHI
____________________________________________________________________

ALL STUDENTS ARE ADVISED TO CONTACT ME DURING COLLEGE TIME


FOR ANY PROBLEM RELATED TO SUBJECT AND STUDY.
