Film Sound Presentation Lecture 1
Sanju Augustine
Sound in General
Sound is a pressure wave created by a vibrating object. These vibrations set
particles in the surrounding medium (typically air) into vibrational motion, thus
transporting energy through the medium. Since the particles move parallel to the
direction of wave travel, a sound wave is referred to as a longitudinal wave.
Sound waves are classified into three types. Audible sound waves are those
that humans can hear (20 Hz to 20 kHz). Infrasonic waves are those whose
frequency is too low (below 20 Hz) for humans to hear. Ultrasonic sound waves
are those whose frequency is too high (above 20,000 Hz) for humans to hear.
Sound is transmitted through gases, plasma, and liquids as longitudinal waves,
also called compression waves; it requires a medium to propagate. Through solids,
however, it can be transmitted as both longitudinal waves and transverse waves.
Longitudinal sound waves are waves of alternating pressure deviations from
the equilibrium pressure, causing local regions of compression and rarefaction,
while transverse waves (in solids) are waves of alternating shear stress at right
angles to the direction of propagation.
Characteristics Of Sound
• Frequency of Sound
Number of oscillations per unit time.
The number of vibrations made by a particle of the medium in one second
is called the frequency of a sound wave. It is the same as the number of
waves passing a given point in one second.
The unit is hertz (Hz); 1 Hz = 1 vibration/second.
• The pitch of sounds is determined by the frequency of vibration of the sound
waves that produce them.
• Sound waves with a high frequency produce high-pitched sounds, whereas sound
waves with a low frequency produce low-pitched sounds.
• Pitch is the property of sound that distinguishes a shrill sound from a flat one.
• Pitch is unaffected by the amount of energy received by the ear per unit of time.
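The link between frequency and pitch can be sketched numerically. This is an illustrative example only; the sample rate, note frequencies, and function name are assumptions, not from the lecture:

```python
import math

def sine_wave(freq_hz, sample_rate=44100, duration_s=0.01, amplitude=1.0):
    """Generate samples of a sine tone; a higher freq_hz means more
    vibrations per second, which the ear perceives as a higher pitch."""
    n = int(sample_rate * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

# A 440 Hz tone (concert A) completes 440 cycles per second;
# an 880 Hz tone, one octave higher, completes twice as many.
low = sine_wave(440)
high = sine_wave(880)
```

Doubling the frequency halves the period of each cycle, which is exactly the "octave up" relationship mentioned in basic acoustics.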
Amplitude
The amplitude of a sound wave is the measure of the height of the wave,
i.e. the maximum displacement of the vibrating particles of the medium
from their mean position when the sound is produced. It determines the
intensity (loudness) of the sound.
The larger the amplitude, the higher the energy. In sound, amplitude refers
to the magnitude of compression and rarefaction experienced by the
medium the sound wave is travelling through. This amplitude is
perceived by our ears as loudness: high amplitude corresponds to loud
sounds.
Loudness of Sound
Loudness depends on the amplitude of the sound wave: if the amplitude of
the sound wave is large, the sound is said to be loud.
Loudness is directly proportional to the square of the amplitude of vibration,
so if the amplitude of the sound wave doubles, the loudness of the sound
is quadrupled.
It is expressed in decibels (dB).
Sounds above 80 dB become noise to human ears.
The pitch of a sound is our ear’s response to the frequency of the sound,
whereas loudness depends on the energy of the wave.
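The square-law relationship above can be checked with a short calculation. The decibel formula for an amplitude ratio (20·log10) is standard acoustics, not something stated in the lecture:

```python
import math

def intensity_ratio(amp_ratio):
    # Loudness (intensity) is proportional to the square of amplitude.
    return amp_ratio ** 2

def db_change(amp_ratio):
    # Decibel change for a given amplitude ratio: 20 * log10(ratio).
    return 20 * math.log10(amp_ratio)

# Doubling the amplitude quadruples the intensity, a gain of about 6 dB.
print(intensity_ratio(2))          # 4
print(round(db_change(2), 1))      # 6.0
```

This is why a seemingly small change in amplitude produces a large change in perceived loudness.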
Timbre
Timbre, also spelled timber, is the quality of the auditory sensation produced by the
tone of a sound wave. The timbre of a sound depends on its waveform, which varies with
the number of overtones, or harmonics, that are present, their frequencies, and their
relative intensities.
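A rough numerical sketch of this idea: two tones with the same fundamental but different overtone mixes produce different waveforms, and therefore different timbres. The harmonic weights below are made-up illustrations, not measurements of real instruments:

```python
import math

def tone(freq, harmonics, sample_rate=8000, n=80):
    """Sum a fundamental and its overtones. 'harmonics' is a list of
    (multiple, relative_intensity) pairs; the mix shapes the waveform,
    i.e. the timbre, even though the pitch stays the same."""
    out = []
    for t in range(n):
        s = sum(a * math.sin(2 * math.pi * freq * k * t / sample_rate)
                for k, a in harmonics)
        out.append(s)
    return out

# Same 200 Hz fundamental, different overtone mixes -> different waveforms.
flute_like = tone(200, [(1, 1.0), (2, 0.1)])
reed_like = tone(200, [(1, 1.0), (3, 0.5), (5, 0.3)])
```

Both lists describe a 200 Hz tone, yet their sample values differ, which is exactly the waveform difference the ear hears as timbre.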
• Shotgun microphones have a narrow pick-up pattern, making them perfect for
capturing audio from a specific direction while rejecting unwanted noise from
other directions. Understanding microphone polar patterns can greatly influence
microphone selection for different applications.
The Typical Audio Signal Flow Path Explained
• Your audio signal path will shift based on what you're trying to accomplish and your general
preferences in equipment. That being said, we'll illustrate a typical audio signal flow path
from the starting sound to the final output:
• 1. Sound Source
• An audio signal path starts with a sound source. This could be a voice, an instrument, or any
other sound to be captured by a microphone.
• 2. Microphone
• Your microphone converts the sound into a weak mic-level signal. Because mic-level signals
sit close to the noise floor, you will usually need a mic preamp next in your signal flow so that
you can track at a usable level. Other sources that may need pre-amplification
include instruments, phono cartridges on turntables, and tape decks.
• 3. Preamp
• A preamplifier's job is to take a weak audio signal and amplify it to a much higher, usable
level so that you can monitor your mix accordingly. The preamp takes in the signal from
something like an electric guitar or microphone and boosts it to line level. Some preamps,
especially mic preamps, can also be used to add color or a distinct character to the sound
before it reaches the audio interface.
4. Audio Interface
An audio interface uses an analog-to-digital converter to take the sound coming in from
the mic preamp or microphone and convert it into a digital audio signal that you can
process within your digital audio workstation. Some audio interfaces also have
phantom power built in so that you can conveniently power condenser microphones.
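As a minimal sketch of what the analog-to-digital converter does, the function below quantizes an analog value to a 16-bit integer code. The function name, clipping behavior, and rounding details are illustrative assumptions, not a description of any specific interface:

```python
def adc_sample(analog_value, bits=16):
    """Quantize an analog value in [-1.0, 1.0] to a signed integer code,
    roughly what an audio interface's A/D converter does per sample."""
    levels = 2 ** (bits - 1) - 1          # e.g. 32767 for 16-bit audio
    clipped = max(-1.0, min(1.0, analog_value))  # hard-clip out-of-range input
    return round(clipped * levels)

print(adc_sample(0.5))    # 16384, about half of full scale
```

A real converter also samples at a fixed rate (e.g. 44,100 times per second), so the digital signal is a stream of such integer codes.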
5. DAW
Your audio interface then transfers the signal to your digital audio workstation (DAW).
From here, you can start to manipulate your recordings digitally. It's worth noting that
there are additional routes in an extended flow within a DAW, such as send or aux tracks,
which may or may not be applicable to your specific session.
6. Audio Interface
The audio signal flows through a digital-to-analog converter, which sends the signal to
the applicable outputs connected to your audio interface. Audio interfaces can introduce
latency, a delay between what's being played and what you hear, due to the conversion
process. If you're experiencing latency, you might need to print tracks to reduce CPU load,
or check other areas of your signal flow to ensure it's not happening elsewhere.
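Conversion latency can be estimated from the interface's buffer size and sample rate. The sketch below is a rough approximation that ignores driver and plugin overhead, which also add delay:

```python
def round_trip_latency_ms(buffer_size, sample_rate=44100):
    """Approximate round-trip latency in milliseconds: one buffer on the
    way in (A/D) plus one buffer on the way out (D/A)."""
    one_way_ms = buffer_size / sample_rate * 1000
    return 2 * one_way_ms

# A 512-sample buffer at 44.1 kHz adds roughly 23 ms of round-trip delay;
# dropping the buffer to 128 samples cuts it to roughly 6 ms.
print(round(round_trip_latency_ms(512), 1))
print(round(round_trip_latency_ms(128), 1))
```

This is why lowering the buffer size in your interface settings reduces monitoring delay, at the cost of higher CPU load.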
7. Monitors or Headphones
Finally, audio flows out from your audio interface and into your headphones, monitors, or
other set output. This process is repeated every time you press play.
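The seven steps above can be summarized as an ordered pipeline. This is a trivial sketch with stage names taken from the text; real setups may insert or skip stages:

```python
# The typical signal path, source to output, as described above.
SIGNAL_PATH = [
    "sound source",
    "microphone",          # outputs a weak mic-level signal
    "preamp",              # boosts mic level up toward line level
    "audio interface",     # analog-to-digital conversion
    "DAW",                 # digital processing, sends/aux routing
    "audio interface",     # digital-to-analog conversion
    "monitors/headphones",
]

def describe_path(path):
    """Render the chain as a readable arrow diagram."""
    return " -> ".join(path)

print(describe_path(SIGNAL_PATH))
```

Note that the audio interface appears twice because it converts in both directions: once on the way into the DAW and once on the way back out.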
Workflow of the Sound Department in Film Production
(sync-sound and dubbed films)
Pre-production
• Designing and composing sound and music
• Analyzing recording needs (choosing microphones, hardware and software,
recording environment, RECCE, Sound Script etc.), making a schedule, and
preparing for recording
Production
• Recording, finding, and/or synthesizing sound and music
• Creating sound effects
• Synchronizing
Post-production
• Audio processing individual tracks and mixing tracks (applying EQ, dynamics
processing, special effects, stereo separation, etc.)
• Overdubbing
• Mastering (for CD and DVD music production)
• Finishing the synchronization of sound and visual elements (for production of
sound scores for film or video)
• Channeling output
Sound Design For films
• Sound design is how filmmakers flesh out the aural world of a film to
enhance the mood, atmosphere, and/or tone.
• Sound design components include sound effects or SFX , mixing, Foley ,
dialogue, and music
• Sound design for film is just as important in creating the illusion as
imagery
• What you hear in a movie is always manipulated in some way
• Sound design for film is a way to enhance the visual storytelling. It can give
life, authenticity, and dimension to an otherwise flat, moving picture
• Sound evokes more emotion than picture
• Audio adds depth to visual
• Silence speaks louder than sound
• Audio is an essential part of your video-editing process, and perfecting it
should take precedence in building your final product
Sound Designer
• The person directly in charge of crafting a film’s sound design
• The sound designer typically leads an audio team consisting of a
combination of some or all of the following sound design jobs: Foley
artists, Audio Engineers, Re-recording Mixers, Dialogue Editors,
Supervising Sound Editors, ADR teams, Music Editors and Supervisors,and
even Composers
• Diegetic sound is a noise which has a source on-screen: a noise that has
not been edited in, for example dialogue between characters or
footsteps. Another term for diegetic sound is actual sound.
Non-diegetic sound is a noise which does not have an on-screen source;
it has been added in
The term “sound designer” was first used in film in 1979, when Francis Ford
Coppola granted Walter Murch the title of Sound Designer for his work on Apocalypse
Now, marking the first use of the term as a credit in film. Until that point, the
usual credit, Supervising Sound Editor or Sound Editor, was generally regarded as a
purely technical role on a film crew
● HUMAN VOICES
The emotive speech of your on-screen subjects tells the story they’re
showcasing, while verbal narration adds context to the on-screen
interactions and tells the audience how to react.
● MUSIC
The style of music overlaid on a particular clip relates to the emotion you’re
hoping to reach.
● SOUND EFFECTS
The soundscape paints the audio picture of a particular setting: the
sounds you might hear in a busy newsroom in New York City are a lot
different from those you'd hear on a snowy mountaintop in the Alps.
The sounds in each of these environments are unique and create an
emotional response in the viewer, while also making the film feel
more realistic.
• Cinematic sound design has many uses and is
extremely important to the overall effect of a film.
• Apocalypse Now (sound design: Walter Murch)
• Elippathayam
https://www.youtube.com/watch?v=-1-BKYsxPrw
• P Devdas and music by M B Sreenivasan
• Angamaly Diaries
https://www.youtube.com/watch?v=wLDs1KlQuF8&t=10s
• Punch Drunk Love
https://www.youtube.com/watch?v=F04Gln02mRI
A Brief History of sound in movies
The first feature film originally presented as a talkie was The Jazz
Singer, directed by Alan Crosland, which premiered on October 6,
1927. It was made with Vitaphone, at the time the leading brand
of sound-on-disc technology.
The first talkie movie in India was Alam Ara (1931). The movie also
featured the first Indian film song, "De De Khuda Ke Naam Par".