Media
In general, "media" refers to various means of communication. For example, television, radio, and newspapers are different
types of media. The term can also be used as a collective noun for the press or news reporting agencies. In the computer
world, "media" is also used as a collective noun, but refers to different types of data storage options.
Computer media can be hard drives, removable drives (such as Zip disks), CD-ROM or CD-R discs, DVDs, flash memory,
USB drives, and yes, floppy disks. For example, if you want to bring your pictures from your digital camera into a photo
processing store, they might ask you what kind of media your pictures are stored on. Are they on the flash memory card inside
your camera or are they on a CD or USB drive? For this and many other reasons, it is helpful to have a basic understanding of
what the different types of media are.
Mass media
Mass media refers collectively to all media technologies, including the Internet, television, newspapers, and radio, which are
used for mass communications, and to the organizations which control these technologies. [1][2]
Mass media play a significant role in shaping public perceptions on a variety of important issues, both through the information
that is dispensed through them, and through the interpretations they place upon this information. [3] They also play a large role in
shaping modern culture, by selecting and portraying a particular set of beliefs, values, and traditions (an entire way of life), as
reality. That is, by portraying a certain interpretation of reality, they shape reality to be more in line with that interpretation. [4]
Contemporary research demonstrates an increasing level of concentration of media ownership, with many media industries
already highly concentrated and dominated by a very small number of firms.
Purposes
Advocacy, both for business and social concerns. This can include advertising, marketing, propaganda, public
relations, and political communication.
Entertainment, traditionally through performances of acting, music, and sports, along with light reading; since the late
20th century also through video and computer games.
Public service announcements.
Digital media
Digital media is a form of electronic media where data is stored in digital (as opposed to analog) form. It can refer to the
technical aspect of storage and transmission (e.g. hard disk drives or computer networking) of information or to the "end
product", such as digital video, augmented reality or digital art.
Florida's digital media industry association, Digital Media Alliance Florida, defines digital media as "the creative
convergence of digital arts, science, technology and business for human expression, communication, social interaction and
education".
Data conversion
The transformation of an analog signal into digital information via an analog-to-digital converter is called sampling.
According to information theory, sampling is a reduction of information. Most digital media are based on translating analog
data into digital data and vice versa (see digital recording, digital video, television versus digital television).
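As a concrete, simplified sketch of this conversion, the snippet below samples an idealized analog signal (a 1 kHz sine tone, given as a function of time) and quantizes each sample to 8 bits. The function name, rates, and bit depth are illustrative choices, not a real ADC interface.

```python
import math

def sample_and_quantize(signal, rate_hz, duration_s, bits=8):
    """Sample a continuous signal (a function of time, in seconds) at
    rate_hz and quantize each sample to a signed integer of `bits` bits.
    Everything between two sampling instants is discarded: this is the
    "reduction of information" that sampling performs."""
    levels = 2 ** (bits - 1) - 1          # 127 for 8-bit signed samples
    n = int(rate_hz * duration_s)
    return [round(signal(i / rate_hz) * levels) for i in range(n)]

# A 1 kHz sine tone sampled at 8 kHz for one millisecond (8 samples).
tone = lambda t: math.sin(2 * math.pi * 1000 * t)
samples = sample_and_quantize(tone, 8000, 0.001)
print(samples)   # [0, 90, 127, 90, 0, -90, -127, -90]
```

Reversing the process (digital-to-analog conversion) interpolates between these stored integers to reconstruct an approximation of the original waveform.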
Data processing
As opposed to analog data, digital data is in many cases easier to manipulate and the end result can be reproduced indefinitely
without any loss of quality. Mathematical operations can be applied to arbitrary digital information regardless of its
interpretation (you can add "2" to the data "65" and interpret the result either as the hexadecimal number "43" or the letter
"C"). Therefore, it is possible to use the same compression operation onto a text file or an image file or a sound file.
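The interpretation-independence described above can be demonstrated directly. The snippet below adds 2 to the value 65 and reads the result both as a hexadecimal numeral and as an ASCII character, then applies one compression routine (zlib, chosen here purely as an illustration) to an arbitrary byte stream.

```python
import zlib

data = 65                        # one byte of digital data
result = data + 2                # arithmetic ignores interpretation
hex_view = format(result, 'x')   # "43" as a hexadecimal numeral
char_view = chr(result)          # "C" as an ASCII character
print(hex_view, char_view)       # 43 C

# The same compression operation applies to any byte stream, whether it
# originally came from a text file, an image file, or a sound file.
text_bytes = b"hello " * 100
compressed = zlib.compress(text_bytes)
restored = zlib.decompress(compressed)
print(len(text_bytes), len(compressed))
```

Lossless reproduction is the key property: `restored` is byte-for-byte identical to the input, no matter how many times the cycle is repeated.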
Examples
The following list of digital media is based on a rather technical view of the term media. Other views might lead to
different lists.
Cellular phones
Compact discs
Digital video
Televisions
e-books
Internet
Video games
e-Commerce
Game consoles
Computers
Interactive media
Media sharing requirements
To share your media, you need the following hardware and software:
A wired or wireless private network. For details, see the section about network and firewall requirements
later in this topic.
Either a device known as a networked digital media player (sometimes called a digital media receiver) or
another computer running this version of Windows Vista. Networked digital media players are hardware
devices that connect to your wired or wireless network and allow you to browse and play content from your
Windows Media Player library—even if your computer is in another room. For a list of compatible devices,
see Windows 7 Compatibility Center.
Broadcasting (computing)
In computing, broadcasting refers to a method of transferring a message to all recipients simultaneously. Broadcasting can be
performed as a high-level operation in a program, for example a broadcast operation in the Message Passing Interface (MPI), or it may be a low-level
networking operation, for example broadcasting on Ethernet.
Overview
In computer networking, broadcasting refers to transmitting a packet that will be received by every device on the
network[1]. In practice, the scope of the broadcast is limited to a broadcast domain. Broadcasting a message is in
contrast to unicast addressing, in which a host sends datagrams to another single host identified by a unique IP
address.
Not all network technologies support broadcast addressing; for example, neither X.25 nor frame relay has
broadcast capability, nor is there any form of Internet-wide broadcast. Broadcasting is largely confined to local area
network (LAN) technologies, most notably Ethernet and token ring, where the performance impact of broadcasting
is not as large as it would be in a wide area network.
The successor to Internet Protocol version 4 (IPv4), IPv6, also does not implement the broadcast method, to prevent
disturbing all nodes in a network when only a few may be interested in a particular service. Instead, it relies on
multicast addressing, a conceptually similar one-to-many routing methodology. However, multicasting limits the
pool of receivers to those that join a specific multicast receiver group.
Both Ethernet and IPv4 use an all-ones broadcast address to indicate a broadcast packet. Token Ring uses a special
value in the IEEE 802.2 control field.
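A brief sketch of these broadcast addresses, using Python's standard `ipaddress` and `socket` modules (the subnet and port are illustrative choices):

```python
import ipaddress
import socket

# A directed broadcast address sets every host bit of the subnet to one.
net = ipaddress.ip_network("192.168.1.0/24")
print(net.broadcast_address)   # 192.168.1.255

# The limited ("all-ones") broadcast address used by IPv4 and Ethernet.
limited = ipaddress.ip_address("255.255.255.255")

# Sending a broadcast datagram requires explicitly enabling SO_BROADCAST:
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
# sock.sendto(b"hello", (str(net.broadcast_address), 9999))  # reaches every host on the LAN
sock.close()
```

The commented-out `sendto` would deliver the datagram to every device in the broadcast domain, which is exactly the behavior the DoS abuse below exploits.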
Broadcasting may be abused to perform a denial-of-service (DoS) attack. The attacker sends forged ping requests with the source IP
address of the victim computer; the victim computer is then flooded by the replies from all computers in the broadcast domain.
Open source
HTML 5 [1][2][3][4][5][6]
SwarmPlugin[7][8] - adds BitTorrent-compatible P2P transport to a browser, so you can do <video
src="p2p://torrent">.
FireCoral - browser-based peer-to-peer content distribution network that enables mutually distrustful users
to share their browser caches.[9][10][11]
ReverseHttp [12][13]
HTTP P2P (see IEEE Xplore)
JW Player + Open P2P CDN [14]
Proprietary software
IPTV
"IPTV is defined as multimedia services such as television/video/audio/text/graphics/data delivered over IP-based networks managed to provide the required level of quality of service and experience, security, interactivity and reliability."
Internet Protocol television (IPTV) is a system through which Internet television services are delivered using the
architecture and networking methods of the Internet Protocol Suite over a packet-switched network infrastructure,
e.g., the Internet and broadband Internet access networks, instead of being delivered through traditional radio
frequency broadcast, satellite signal, and cable television (CATV) formats.
IPTV is distinguished from general Internet-based or web-based multimedia services by its on-going
standardization process (e.g., European Telecommunications Standards Institute) and preferential deployment
scenarios in subscriber-based telecommunications networks with high-speed access channels into end-user premises
via set-top boxes or other customer-premises equipment.
"IPTV is defined as the secure and reliable delivery to subscribers of entertainment video and related services. These services
may include, for example, Live TV, Video On Demand (VOD) and Interactive TV (iTV). These services are delivered across
an access agnostic, packet switched network that employs the IP protocol to transport the audio, video and control signals. In
contrast to video over the public Internet, with IPTV deployments, network security and performance are tightly managed to
ensure a superior entertainment experience, resulting in a compelling business environment for content providers, advertisers
and customers alike."
One definition for consumer IPTV is for single or multiple program transport streams (MPTS) which are sourced
by the same network operator that owns or directly controls the "last mile" to the consumer's premises[citation needed].
This control over delivery enables a guaranteed quality of service (QoS), and also allows the service provider to
offer an enhanced user experience, such as a better program guide and interactive services.
In commercial environments IPTV is widely deployed for distribution of live TV, video playout channels and
Video on Demand (VOD) material across LAN or WAN IP network infrastructures, with a controlled QoS.
History
In 1994, ABC's World News Now was the first television show to be broadcast over the Internet, using the CU-
SeeMe videoconferencing software.[3]
The term IPTV first appeared in 1995 with the founding of Precept Software by Judith Estrin and Bill Carrico.
Precept designed and built an Internet video product named IP/TV. IP/TV was an MBONE compatible Windows
and Unix-based application that moved single and multi-source audio/video traffic, ranging from low to DVD
quality, using both unicast and IP multicast Real-time Transport Protocol (RTP) and Real time control protocol
(RTCP). The software was written primarily by Steve Casner, Karl Auerbach, and Cha Chee Kuan. Precept was
acquired by Cisco Systems in 1998.[4] Cisco retains the IP/TV trademark.
Internet radio company AudioNet started the first continuous live webcasts with content from WFAA-TV in
January, 1998 and KCTU-LP on January 10, 1998.[5]
Kingston Communications, a regional telecommunications operator in UK, launched KIT (Kingston Interactive
Television), an IPTV over DSL broadband interactive TV service in September 1999 after conducting various TV
and VoD trials. The operator added additional VoD service in October 2001 with Yes TV, a VoD content provider.
Kingston was one of the first companies in the world to introduce IPTV and IP VoD over ADSL. [6] In 2006 the
KIT service was shuttered, subscribers having declined from a peak of 10,000 to 4,000.[7]
In 1999, NBTel (now known as Bell Aliant) was the first to commercially deploy Internet Protocol Television over
digital subscriber line (DSL) in Canada[8][9] using the Alcatel 7350 DSLAM and middleware created by iMagic TV
(owned by NBTel's parent company Bruncor[10]). The service was marketed under the brand VibeVision in New
Brunswick, and later expanded into Nova Scotia in early 2000[11] after the formation of Aliant. iMagic TV was later
sold to Alcatel.[12]
In 2002, Sasktel was the second in Canada to commercially deploy Internet Protocol (IP) video over digital
subscriber line (DSL), using the Lucent Stinger(R) DSL platform.[13] In 2006, it was the first North American
company to offer HDTV channels over an IPTV service.[14]
In 2003, Total Access Networks Inc launched its IPTV service, comprising 100 free IPTV stations world wide. The
service has been used in over 100 countries world wide, and has channels in 26 languages.[citation needed]
In 2005, Bredbandsbolaget launched its IPTV service as the first service provider in Sweden. As of January 2009,
it is no longer the biggest supplier; TeliaSonera, which launched its service later, now has more customers.
[15]
In 2006, AT&T launched its U-Verse IPTV service in the United States, comprising a national head end and
regional video-serving offices. AT&T offered over 300 channels in 11 cities with more to be added in 2007 and
beyond. In March 2009, AT&T announced that U-verse had expanded to 100 or more High Definition channels in
every U-Verse TV market.[16] While using IP protocols, AT&T has built a private IP network exclusively for video
transport.
In 2010, CenturyLink - after acquiring Embarq (2009) and Qwest (2010), entered five U.S. markets with an IPTV
service called Prism.[17] This was after successful test marketing in Florida.
Future
In the past, this technology has been restricted by low broadband penetration and by the relatively high cost of
installing wiring capable of transporting IPTV content reliably in the customer's home. In the coming years,
however, residential IPTV is expected to grow at a brisk pace: broadband was available to more than 200 million
households worldwide in 2005, projected to grow to 400 million by 2010.[18] Many of the world's
major telecommunications providers are exploring IPTV as a new revenue opportunity from their existing markets
and as a defensive measure against encroachment from more conventional Cable Television services.
Also, there are a growing number of IPTV installations within schools, universities, corporations and local
institutions.[19]
In December 2009, the FCC began looking into using set-top boxes to make TVs with cable or similar services into
broadband video players. FCC Media Bureau Chief Bill Lake had said earlier that TV and the Internet would soon
be the same, but only 75 percent of homes had computers, while 99 percent had TV. A Nielsen survey said 99
percent of video viewing was done on TV.[20]
Architecture of IPTV
Elements
TV head-end: where live TV channels are encoded, encrypted and delivered in the form of IP multicast streams.
VOD platform: where on-demand video assets are stored and served, when a user makes a request, in the form of an IP
unicast stream.
Interactive portal: allows the user to navigate within the different IPTV services, such as the VOD catalog.
Delivery network: the packet-switched network that carries IP packets (unicast and multicast).
Home gateway: the piece of equipment at the user's home that terminates the access link from the delivery network.
Set-top box: the piece of equipment at the user's home that decodes and decrypts TV and VOD content and
displays it on the TV screen.
Depending on the network architecture of the service provider, there are two main types of video server
architectures that can be considered for IPTV deployment: centralized and distributed.
The centralized architecture model is a relatively simple and easy to manage solution. For example, as all contents
are stored in centralized servers, it does not require a comprehensive content distribution system. Centralized
architecture is generally good for a network that provides relatively small VOD service deployment, has adequate
core and edge bandwidth and has an efficient content delivery network (CDN).
Distributed architecture is just as scalable as the centralized model, however it has bandwidth usage advantages and
inherent system management features that are essential for managing a larger server network. Operators who plan
to deploy a relatively large system should therefore consider implementing a Distributed Architecture model right
from the start. Distributed architecture requires intelligent and sophisticated content distribution technologies to
augment effective delivery of multimedia contents over the service provider's network.
Protocols
IPTV covers both live TV (multicasting) and stored video (video on demand, or VoD). Playback of
IPTV requires either a personal computer or a set-top box connected to a TV. Video content is typically
compressed using either an MPEG-2 or an MPEG-4 codec and then sent in an MPEG transport stream, delivered via
IP multicast in the case of live TV or via IP unicast in the case of video on demand. IP multicast is a method in which
information can be sent to multiple computers at the same time. The H.264 (MPEG-4 AVC) codec is increasingly used to
replace the older MPEG-2 codec.
Live IPTV uses IGMP version 2 or IGMP version 3 (for IPv4) for connecting to a multicast stream (TV
channel) and for changing from one multicast stream to another (TV channel change). IGMP operates
within LANs or VLANs, so other protocols, such as Protocol Independent Multicast (PIM), are used to
route IPTV multicast streams from one LAN segment to another.
VOD uses the UDP or RTP protocols for channel streams, and control is done using RTSP
(Real Time Streaming Protocol).
NPVR (network personal video recorder), like VOD, uses UDP or RTP for IPTV streams, and the RTSP
control protocol for end-user control communications.
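The IGMP "join" that a set-top box performs on a channel change can be sketched with Python's standard `socket` module. The group address and port below are hypothetical values for one TV channel; the kernel emits the actual IGMP membership report when `IP_ADD_MEMBERSHIP` is set.

```python
import socket

GROUP = "239.1.1.1"   # hypothetical multicast group carrying one TV channel
IFACE = "0.0.0.0"     # join on the default local interface

# The ip_mreq structure handed to the kernel is simply the group address
# followed by the local interface address (8 bytes total).
mreq = socket.inet_aton(GROUP) + socket.inet_aton(IFACE)

def join_channel(sock, mreq):
    """Ask the kernel to send an IGMP membership report for the group;
    the network then starts forwarding that channel's multicast stream."""
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

def leave_channel(sock, mreq):
    """A channel change is a leave of the old group plus a join of the new."""
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP, mreq)
```

In a real deployment the socket would also be bound to the channel's UDP port so the MPEG transport stream packets can be read and handed to the decoder.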
Although IPTV and conventional satellite TV distribution have been seen as complementary technologies, they are
likely to be increasingly used together in hybrid IPTV networks that deliver the highest levels of performance and
reliability. IPTV is largely neutral to the transmission medium, and IP traffic is already routinely carried by satellite
for Internet backbone trunking and corporate VSAT networks.[44] The use of satellite to carry IP is fundamental to
overcoming the greatest shortcoming of IPTV over terrestrial cables – the speed/bandwidth of the connection.
The copper twisted pair cabling that forms the last mile of the telephone/broadband network in many countries is
not able to provide a sizeable proportion of the population with an IPTV service that matches even existing
terrestrial or satellite digital TV distribution. For a competitive multi-channel TV service, a connection speed of
20 Mbit/s is likely to be required, but unavailable to most potential customers.[45] The increasing popularity of high
definition TV (with twice the data rate of SD video) increases connection speed requirements, or limits IPTV
service quality and connection eligibility even further.
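To make the arithmetic behind these capacity limits concrete, the sketch below estimates how many simultaneous streams fit on an access line. The per-stream bitrates and the overhead allowance are illustrative assumptions, not figures from the text; only the 2:1 HD-to-SD ratio follows the passage above.

```python
# Assumed per-stream bitrates in Mbit/s; real figures vary by codec and quality.
SD_RATE = 3.5     # an assumed compressed standard-definition stream
HD_RATE = 7.0     # twice the SD data rate, per the text above

def streams_supported(line_rate_mbps, stream_rate_mbps, overhead=0.2):
    """How many simultaneous streams fit on a line, reserving a fraction
    of capacity (default 20%) for other traffic and protocol overhead."""
    usable = line_rate_mbps * (1 - overhead)
    return int(usable // stream_rate_mbps)

print(streams_supported(20, SD_RATE))  # 4 SD streams on a 20 Mbit/s line
print(streams_supported(20, HD_RATE))  # only 2 HD streams on the same line
```

The halving from four SD streams to two HD streams is exactly the "limits service quality and connection eligibility" effect the paragraph describes.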
However, satellites are capable of delivering in excess of 100 Gbit/s via multi-spot beam technologies, making
satellite a clear emerging technology for implementing IPTV networks. Satellite distribution can be included in an
IPTV network architecture in several ways. Simplest to implement is an IPTV-DTH architecture, in which hybrid
DVB/broadband set-top boxes in subscriber homes integrate satellite and IP reception to give near-infinite
bandwidth with return channel capabilities. In such a system, many live TV channels may be multicast via satellite
(IP-encapsulated or as conventional DVB digital TV) with stored video-on-demand transmission via the broadband
connection. Arqiva’s Satellite Media Solutions Division suggests “IPTV works best in a hybrid format. For
example, you would use broadband to receive some content and satellite to receive other, such as live channels”.
Interactivity
An IP-based platform also allows significant opportunities to make the TV viewing experience more interactive and
personalized. The supplier may, for example, include an interactive program guide that allows viewers to search for
content by title or actor’s name, or a picture-in-picture functionality that allows them to “channel surf” without
leaving the program they’re watching. Viewers may be able to look up a player’s stats while watching a sports
game, or control the camera angle. They also may be able to access photos or music from their PC on their
television, use a wireless phone to schedule a recording of their favorite show, or even adjust parental controls so
their child can watch a documentary for a school report, while they’re away from home.
Note that this is all possible, to some degree, with existing digital terrestrial, satellite and cable networks in tandem
with modern set-top boxes.[citation needed] For interaction to take place between the receiver and the
transmitter, a feedback channel is needed. Lacking one, terrestrial, satellite, and cable television networks do not
by themselves allow interactivity. However, interactivity over those networks can be achieved by combining TV networks with
data networks such as the Internet or a mobile communication network.
Video-on-demand
IPTV technology is bringing video on demand (VoD) to television,[57] permitting a customer to browse an
online program or film catalog, watch trailers, and then select a recording. Playout of the selected
item starts nearly instantaneously on the customer's TV or PC.
Technically, when the customer selects the movie, a point-to-point unicast connection is set up between the
customer's decoder (Set Top Box or PC) and the delivering streaming server. The signalling for the trick play
functionality (pause, slow-motion, wind/rewind etc.) is assured by RTSP (Real Time Streaming Protocol).
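The RTSP signalling for trick play can be sketched as plain text requests. The helper below composes minimal RTSP messages; the URL and session identifier are hypothetical (a real server assigns the session in its SETUP response), and the `Scale` header is RTSP's mechanism for fast-forward and rewind.

```python
def rtsp_request(method, url, cseq, session=None, scale=None):
    """Compose a minimal RTSP/1.0 request as a CRLF-delimited string."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    if session:
        lines.append(f"Session: {session}")
    if scale is not None:
        # Scale controls playback speed: 2.0 = fast-forward, -1.0 = rewind.
        lines.append(f"Scale: {scale}")
    return "\r\n".join(lines) + "\r\n\r\n"

# Hypothetical trick-play exchange for a VoD session:
pause = rtsp_request("PAUSE", "rtsp://example.net/movie", 3, session="12345")
ff = rtsp_request("PLAY", "rtsp://example.net/movie", 4, session="12345", scale=2.0)
print(pause)
```

In practice the set-top box sends these requests over a TCP connection to the streaming server, which adjusts the unicast RTP stream accordingly.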
The most common codecs used for VoD are MPEG-2, MPEG-4 and VC-1.
In an attempt to avoid content piracy, the VoD content is usually encrypted. Whilst encryption of satellite and cable
TV broadcasts is an old practice, with IPTV technology it can effectively be thought of as a form of Digital Rights
Management. A film that is chosen, for example, may be playable for 24 hours following payment, after which time
it becomes unavailable.
Broadcasting
Broadcasting is the distribution of audio and video content to a dispersed audience via radio, television, or other,
often digital transmission media. Receiving parties may include the general public or a relatively large subset
thereof.
The original term broadcast referred to the literal sowing of seeds on farms by scattering them over a wide field.[1] It
was first adopted by early radio engineers from the Midwestern United States to refer to the analogous
dissemination of radio signals. Broadcasting forms a very large segment of the mass media. Broadcasting to a very
narrow range of audience is called narrowcasting.
Historically, there have been several different types of electronic broadcasting media:
Telephone broadcasting (1881–1932): the earliest form of electronic broadcasting (not counting data
services offered by stock telegraph companies from 1867, if ticker-tapes are excluded from the definition).
Telephone broadcasting began with the advent of Théâtrophone ("Theatre Phone") systems, which were
telephone-based distribution systems allowing subscribers to listen to live opera and theatre performances
over telephone lines, created by French inventor Clément Ader in 1881. Telephone broadcasting also grew
to include telephone newspaper services for news and entertainment programming which were introduced in
the 1890s, primarily located in large European cities. These telephone-based subscription services were the
first examples of electrical/electronic broadcasting and offered a wide variety of programming.
Radio broadcasting (experimentally from 1906, commercially from 1920): radio broadcasting is an audio
(sound) broadcasting service, broadcast through the air as radio waves from a transmitter to an antenna and,
thus, to a receiving device. Stations can be linked in radio networks to broadcast common programming,
either in syndication or simulcast or both.
Television broadcasting (telecast), experimentally from 1925, commercially from the 1930s: this video-
programming medium was long-awaited by the general public and rapidly rose to compete with its older
radio-broadcasting sibling.
Cable radio (also called "cable FM", from 1928) and cable television (from 1932): both via coaxial cable,
serving principally as transmission mediums for programming produced at either radio or television stations,
with limited production of cable-dedicated programming.
Satellite television (from circa 1974) and satellite radio (from circa 1990): meant for direct-to-home
broadcast programming (as opposed to studio network uplinks and downlinks), provides a mix of traditional
radio or television broadcast programming, or both, with satellite-dedicated programming.
Webcasting of video/television (from circa 1993) and audio/radio (from circa 1994) streams: offers a mix of
traditional radio and television station broadcast programming with internet-dedicated webcast
programming.
Recorded broadcasts and live broadcasts
The first regular television broadcasts began in 1937. Broadcasts can be classified as "recorded" or "live". The
former allows correcting errors, removing superfluous or undesired material, rearranging it, applying slow
motion and repetitions, and other techniques to enhance the program. However, some live events such as sports
telecasts can include some of these aspects, such as slow-motion clips of important goals or hits, within the
live telecast.
American radio-network broadcasters habitually forbade prerecorded broadcasts in the 1930s and 1940s, requiring
radio programs played for the Eastern and Central time zones to be repeated three hours later for the Pacific time
zone. This restriction was dropped for special occasions, as in the case of the German dirigible airship Hindenburg
disaster at Lakehurst, New Jersey, in 1937. During World War II, prerecorded broadcasts from war correspondents
were allowed on U.S. radio. In addition, American radio programs were recorded for playback by Armed Forces
Radio stations around the world.
A disadvantage of recording first is that the public may know the outcome of an event from another source, which
may be a "spoiler". In addition, prerecording prevents live announcers from deviating from an officially approved
script, as occurred with propaganda broadcasts from Germany in the 1940s and with Radio Moscow in the 1980s.
Many events are advertised as being live, although they are often "recorded live" (sometimes called "live-to-tape").
This is particularly true of performances of musical artists on radio when they visit for an in-studio concert
performance. Similar situations have occurred in television ("The Cosby Show is recorded in front of a live studio
audience") and news broadcasting.
A broadcast may be distributed through several physical means. If coming directly from the studio at a single radio
or television station, it is simply sent through the air chain to the transmitter and thence from the antenna on the
tower out to the world. Programming may also come through a communications satellite, played either live or
recorded for later transmission. Networks of stations may simulcast the same programming at the same time,
originally via microwave link, now usually by satellite.
Distribution to stations or networks may also be through physical media, such as analog or digital videotape,
compact disc, DVD, and sometimes other formats. Usually these are included in another broadcast, such as when
electronic news gathering returns a story to the station for inclusion on a news programme.
The final leg of broadcast distribution is how the signal gets to the listener or viewer. It may come over the air as
with a radio station or television station to an antenna and receiver, or may come through cable television [1] or
cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring either radio
or television to the recipient, especially with multicasting allowing the signal and bandwidth to be shared.
The term "broadcast network" is often used to distinguish networks that broadcast an over-the-air television signal
that can be received using a television antenna from so-called networks that are broadcast only via cable or satellite
television. The term "broadcast television" can refer to the programming of such networks.
Social impact
The sequencing of content in a broadcast is called a schedule. As with all technological endeavours, a number of
technical terms and slang have developed. A list of these terms can be found at List of broadcasting terms.
Television and radio programs are distributed through radio broadcasting or cable, often both simultaneously. By
coding signals and having decoding equipment in homes, the latter also enables subscription-based channels and
pay-per-view services.
In his essay, John Durham Peters wrote that communication is a tool used for dissemination. Durham stated,
“Dissemination is a lens (sometimes a usefully distorting one) that helps us tackle basic issues such as interaction,
presence, and space and time…on the agenda of any future communication theory in general” (Durham, 211).
Dissemination focuses on the message being relayed from one main source to one large audience without the
exchange of dialogue in between. There is a chance that the message may be tweaked or corrupted once the main source
releases it. There is really no way to predetermine how the larger population or audience will absorb the message.
They can choose to listen, analyze, or simply ignore it. Dissemination in communication is widely used in the world
of broadcasting.
Broadcasting focuses on getting one message out and it is up to the general public to do what they wish with it.
Durham also states that broadcasting is used to address an open ended destination (Durham, 212). There are many
forms of broadcast, but they all aim to distribute a signal that will reach the target audience. Broadcasting can
arrange audiences into entire assemblies (Durham, 213).
In terms of media broadcasting, a radio show can gather a large number of followers who tune in every day
specifically to listen to that disc jockey. The disc jockey follows the script for his or her radio show and just
talks into the microphone. He or she does not expect immediate feedback from any listeners. The message is
broadcast across airwaves throughout the community, but the listeners cannot always respond immediately,
especially since many radio shows are recorded prior to the actual air time.
1worldspace
1worldspace is a satellite radio network that provides service to over 170,000 subscribers in eastern
and southern Africa, the Middle East, and much of Asia, with 96% coming from India.
The 1worldspace System
The 1worldspace system comprises three major components: the space segment, the ground segment, and the user
segment. The space segment refers to the company-owned satellites that broadcast the signals over a large
percentage of the eastern hemisphere. The ground segment refers to the operating and broadcasting centers. The
user segment refers to the user-owned devices in which the signal is received. In addition, the company plans to
implement terrestrial repeater networks in order to facilitate access to new markets in Europe and the Middle-East.
The 1worldspace system was built with companies including Alcatel Space (now Thales Alenia Space), EADS
Astrium and Arianespace (France), SED (Canada), GSI (USA), Fraunhofer Institute (Germany), ST
Microelectronics (Italy), Micronas (Germany), and others.
1worldspace operates two satellites, AfriStar and AsiaStar, making its service available in Asia, Africa, the Middle
East, and parts of Europe.
In digital television, all of these elements are combined in a single digital transmission system.
Frames
Ignoring color, all television systems work in essentially the same manner. The monochrome image seen by a
camera (now, the luminance component of a color image) is divided into horizontal scan lines, some number of
which make up a single image or frame. A monochrome image is theoretically continuous, and thus unlimited in
horizontal resolution, but to make television practical, a limit had to be placed on the bandwidth of the television
signal, which puts an ultimate limit on the horizontal resolution possible. When color was introduced, this limit of
necessity became fixed. All current analog television systems are interlaced; alternate rows of the frame are
transmitted in sequence, followed by the remaining rows in their sequence. Each half of the frame is called a field,
and the rate at which fields are transmitted is one of the fundamental parameters of a video system. It is related to
the frequency at which the electric power grid operates, to avoid flicker resulting from the beat between the
television screen deflection system and nearby mains generated magnetic fields. All digital, or "fixed pixel",
displays have progressive scanning and must deinterlace an interlaced source. Use of inexpensive deinterlacing
hardware is a typical difference between lower- vs. higher-priced flat panel displays (PDP, LCD, etc.).
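As an illustration of the interlace structure just described, here is a minimal sketch (plain Python, with scan lines represented as strings) of the two simplest deinterlacing strategies, "weave" and "bob"; real deinterlacing hardware is far more sophisticated:

```python
# A minimal sketch of two common deinterlacing strategies, using plain
# Python lists of scan lines (strings here, for illustration only).

def weave(top_field, bottom_field):
    """Interleave the two fields back into a full frame.
    Fine for static images; moving objects show 'combing'."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Line-double a single field: each line is repeated, trading
    vertical resolution for freedom from combing artifacts."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)
    return frame

top = ["T0", "T1"]      # lines 0 and 2 of the frame
bottom = ["B0", "B1"]   # lines 1 and 3 of the frame
print(weave(top, bottom))  # ['T0', 'B0', 'T1', 'B1']
print(bob(top))            # ['T0', 'T0', 'T1', 'T1']
```

The quality gap between cheap and expensive flat panels comes largely from motion-adaptive variants of these two strategies.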
All movies and other filmed material shot at 24 frames per second must be transferred to video frame rates in order
to prevent severe motion jitter effects. Typically, for 25 frame/s formats (countries with 50 Hz mains supply), the
content is sped up, while a technique known as "3:2 pulldown" is used for 30 frame/s formats (countries with 60
Hz mains supply) to match the film frames to the video frames without speeding up the playback.
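The 3:2 pulldown cadence can be sketched in a few lines: every four film frames are spread across ten video fields (three, then two, alternating), so 24 frame/s film fills exactly 60 fields/s without changing playback speed:

```python
# A sketch of the 3:2 pulldown cadence: 4 film frames (A, B, C, D)
# become 10 video fields (5 interlaced video frames), so 24 film
# frames/s map onto 30 video frames/s (60 fields/s) without speed-up.

def pulldown_32(film_frames):
    """Repeat each film frame across 3 fields, then 2, alternating."""
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 3 if i % 2 == 0 else 2
        fields.extend([frame] * copies)
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
print(fields)        # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields))   # 10 fields -> 5 video frames per 4 film frames
```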
Viewing technology
Analog television signal standards are designed to be displayed on a cathode ray tube (CRT), and so the physics of
these devices necessarily controls the format of the video signal. The image on a CRT is painted by a moving beam
of electrons which hits a phosphor coating on the front of the tube. This electron beam is steered by a magnetic field
generated by powerful electromagnets close to the source of the electron beam.
In order to reorient this magnetic steering mechanism, a certain amount of time is required due to the inductance of
the magnets; the greater the change, the greater the time it takes for the electron beam to settle in the new spot.
For this reason, it is necessary to shut off the electron beam (corresponding to a video signal of zero luminance)
during the time it takes to reorient the beam from the end of one line to the beginning of the next (horizontal
retrace) and from the bottom of the screen to the top (vertical retrace or vertical blanking interval). The horizontal
retrace is accounted for in the time allotted to each scan line, but the vertical retrace is accounted for as phantom
lines which are never displayed but which are included in the number of lines per frame defined for each video
system. Since the electron beam must be turned off in any case, the result is gaps in the television signal, which can
be used to transmit other information, such as test signals or color identification signals.
The temporal gaps translate into a comb-like frequency spectrum for the signal, where the teeth are spaced at line
frequency and concentrate most of the energy; the space between the teeth can be used to insert a color subcarrier.
Modulation
Given all of these parameters, the result is a mostly-continuous analog signal which can be modulated onto a radio-frequency
carrier and transmitted through an antenna. All analog television systems use vestigial sideband modulation, a form of
amplitude modulation in which one sideband is partially removed. This reduces the bandwidth of the transmitted signal,
enabling narrower channels to be used.
Audio
In analog television, the sound portion of a broadcast is invariably modulated separately from the video. Most
commonly, the audio and video are combined at the transmitter before being presented to the antenna, but in some
cases separate aural and visual antennas can be used. In almost all cases, standard wideband frequency modulation
is used for the standard monaural audio; the exception is systems used by France, which are AM. Stereo, or more
generally multi-channel, audio is encoded using a number of schemes which (except in the French systems) are
independent of the video system. The principal systems are NICAM, which uses a digital audio encoding; double-
FM (known under a variety of names, notably Zweikanalton, A2 Stereo, West German Stereo, German Stereo or
IGR Stereo), in which case each audio channel is separately modulated in FM and added to the broadcast signal;
and BTSC (also known as MTS), which multiplexes additional audio channels into the FM audio carrier. All three
systems are compatible with monaural FM audio, but only NICAM may be used with the French AM audio
systems. FM is used instead of AM for television sound so as to reduce noise and mixing with other signals.
A
ABC
In Australia, the Australian Broadcasting Corporation. In the UK, ABC Weekend TV, a former ITV broadcaster. In
the US, American Broadcasting Company, a television and radio network originally created out of NBC.
A/D
Analog-to-digital conversion.
Absolute Event
A scheduled event whose start time is determined with an assigned time based upon the facility master clock.
Access Time
The total time required to find, retrieve and commence using information, also known as Lead Time.
Actives
Listeners who contact the radio show regarding requests, contests or other interaction.
Aircheck
A recording of an actual broadcast, typically used for archiving or as a demonstration of a DJ's on-air work.
Analog Recording
Recording of audio using an electronic signal that varies continuously. The main drawback of analog recording is the
introduction of inherent noise to the recorded signal.
Analog Transmission
The broadcasting of a signal using an analog recording. Examples of use include radio.
Arbitron
The company that provides the Industry accepted standard for audience measurement.
Archive
* Archive Copy is a master copy intended solely for storage and not to be used in distribution.
Artifacts
Noticeable loss of video and/or audio fidelity in a broadcast or recording caused by limitations in the technology used.
Usually reflects undesirable distortion(s) of the original when digitized.
Aspect ratio
The ratio between the width and the height of the picture. In 'traditional' television sets, this is 4:3; in widescreen sets,
16:9. Sometimes printed decimally as 1.33:1 for 4:3 and 1.78:1 for 16:9.
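The decimal forms follow directly from dividing width by height:

```python
# The decimal aspect ratios quoted above, computed from the
# width:height pairs (rounded to two places as conventionally printed).
print(round(4 / 3, 2))    # 1.33 -> written 1.33:1
print(round(16 / 9, 2))   # 1.78 -> written 1.78:1
```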
Aston
An on-screen overlaid graphic, usually giving the name of the speaker, reporter or place in vision. Name derived from
Aston Broadcast Systems Ltd., an early manufacturer of this equipment.
ATSC - Advanced Television Systems Committee
A committee established by the FCC to decide the technical standards for digital broadcasting in the US.
B
Backsell
The technique where the DJ announces the song title and/or artist of the song that has just played. Also known as
"back announcing".
Backtiming
Where the DJ calculates the intro time on the song in an attempt to talk over the intro of the song and finish just prior
to the vocals commencing. Frequently referred to as 'Hitting the Post' or 'Talking Up the Song'.
Bandwidth
The available space between two given points on the electromagnetic spectrum and, inter alia, the amount of
information that can be squeezed into that space.
BBC - British Broadcasting Corporation
The main public service broadcaster in the United Kingdom, founded as the British Broadcasting Company in 1922.
Bed
A production element, usually instrumental music or sound effect played in the background of a spoken commercial,
promo or other announcement.
Bel
A measure of voltage, current or power gain. One Bel is defined as a tenfold increase in power. If an amplifier
increases a signal's power by a factor of 10, its power gain is 1 Bel or 10 decibels (dB). If power is increased by 100
times, the power gain is 2 Bels or 20 decibels. An increase of 3 dB corresponds approximately to a doubling of power.
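The arithmetic in this entry can be checked with a small helper (a sketch, not broadcast-measurement code):

```python
import math

# Gain in decibels is ten times the base-10 logarithm of the power
# ratio, so 1 Bel = 10 dB = a tenfold increase in power.

def power_gain_db(p_out, p_in):
    """Power gain expressed in decibels."""
    return 10 * math.log10(p_out / p_in)

print(power_gain_db(10, 1))           # 10.0 dB -> 1 Bel
print(power_gain_db(100, 1))          # 20.0 dB -> 2 Bels
print(round(power_gain_db(2, 1), 2))  # 3.01 dB -- the "3 dB is doubling" rule
```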
Bias
A constant amplitude high frequency signal added to the recording signal to improve the signal to noise ratio and
reduce the distortion of an analog tape recording.
Billboard
A short announcement to identify a sponsor at the beginning or end of a production element such as the news or
traffic/weather reports.
BTA
Black To Air
Book
An animation or logotype briefly shown after the end of a programme or part of a programme before the advertising.
See also "optical".
Bug
Slang term for a DOG (Digitally Originated Graphic), a permanent on-screen logo.
Bumper
A pre-recorded production element containing voice over music that acts as a transition to or from a stop set and other
content.
C
Call Letters
The official name of the radio station in the USA. Also known as a station's callsign.
Cans
Slang term for headphones.
CBS - Columbia Broadcasting System
An American television and radio network.
CCIR
In English, "International Radio Consultative Committee", the organisation responsible for assigning frequencies to
radio stations between 1927 and 1992. Now known as ITU-R.
Closed Captioning
Text version of a programme's dialogue, overlaid on the screen by an equipped television set for the hearing
impaired.
Clutter
An excessive number of non-programme elements (such as commercials) appearing one after another.
Copy
The written material used in producing a PSA, promo, or commercial that is meant to be read out by the DJ or
presenter.
Crash
When an announcement, jingle or graphic overlaps with a fixed point in the schedule (eg the news or a time signal),
usually due to poor timing.
Crossfade
The technique where a DJ, producer or engineer fades out the out going track at the same time as fading in the new
track.
Coverage
The percentage of households that can tune into a radio station within the theoretical broadcast radius.
Cueing
Whilst the previous record was playing, the DJ would attempt to find the beginning of the song on the next record. The
DJ would place the needle down in approximately the right area, then move the record back and forth on the
turntable (cueing) until the beginning of the song was found. When the previous song completed playing, the DJ would
introduce the next song and turn the record deck on; the record would quickly spin up to speed with a
characteristic distortion. This was later minimised by the use of a slipmat.
Cue Burn
Cue burn relates to the days of vinyl records (33 rpm, 45 rpm). Whilst the previous record was playing, the DJ would
attempt to find the beginning of the song on the next record, placing the needle down in approximately the right area
and moving the record back and forth on the turntable (cueing) until the beginning of the song was found.
This cueing back and forth would rub the vinyl and damage the record, creating a characteristic noise.
Cue dot
A small square inserted in the corner of the picture to inform rebroadcasters that an advertisement break is about to
happen. In the UK, this appeared exactly one minute before the break and disappeared 55 seconds later.
Cue Channel
In the early days of networks, a dedicated multi-drop phone line connected all affiliated station engineers to the
network Master Control. The system was also backed up with teletype.
Cue Track
A recorded audio track containing information about upcoming events that the operating engineer should be aware of.
It was first used by Edison on his first talking pictures, which used records for the sound playback; the information
was used to synchronize picture and sound. On early soundtrack records, a "beep tone" told the projectionist to turn
the auditorium speakers on and off so the audience would not hear the projectionist's cue information. Cue tracks were
adopted in the early days of Kinescope to cue the film chain engineer, were later used in early Ampex Quad Tape
systems, and are still used today, either as voice or digitally, for station automation systems. In the early days of
bicycled programs, a cue track along with a printed timeline informed the engineer of breaks or jam (insert) spots in
the tape, including a five-count to the break-in and break-out locations, because the program tape or film never
stopped. Often the original recording engineer would add comments of his own regarding the program, sometimes
humorous. When smaller networks assembled programs for independent stations, the cue track often carried both the
original engineer's voice and the assembling engineer's voice and humour too.
Cume
Short for cumulative audience. A measure similar to a newspaper or magazine's circulation figures.
D
DAB - Digital Audio Broadcasting
The use of digital encoding to send higher quality or a greater number of radio services to equipped receivers.
Daypart
The radio station's broadcast day is normally split up (starting at 6am) into a series of 4 hour sessions containing one
or more shows.
DBS - Direct Broadcast Satellite
Television and radio programmes distributed by satellite for reception via a dish at the receiver's property.
Dead air
The time on-air where there is no audible transmission. This silence can be down to any of the following:
* Act of God
DJ - Disc Jockey
DOG - Digitally Originated Graphic
A station logo or slogan permanently displayed on screen during a programme. Controversial due to "screenburn"
issues.
Dolby Digital
Also Dolby D. The standard for 5.1 channel (surround sound) audio. Six discrete channels are used (Left, Center,
Right, Left Rear Surround, Right Rear Surround, and Subwoofer).
Double pumping
Putting out two episodes of a show back-to-back, either to boost ratings in a given slot or to burn off episodes of a
cancelled show.
Drive time
Drive time refers to the periods when the majority of radio listeners travel to or from work, traditionally 6-10am
and 2-6pm, and is normally accompanied by the station's highest listenership. Commercials are normally more
expensive during such times.
Drop The Light
Drop the Light is a very common industry-wide term meaning 'lower the light levels'. It is often yelled during
shooting when the director wants to continue shooting the action of the scene after the light levels are lowered. It has
nothing to do with any physical dropping of a lighting fixture during the scene.
Drops
These are excerpts of TV, movies and other audio programmes that are used to accentuate programming.
Drop Song
Temporarily unselecting a playlist song to better accommodate an accurate clock hour (or, in plain English: a song
scheduled but not played for timing reasons).
DSNG - Digital Satellite News Gathering
Use of digital satellite transmission from remote locations for the purpose of live news event coverage.
Dustbin Dave
The accidental deletion of all media from all server locations within a transmission environment, resulting in a service
to go to BTA.
DVB - Digital Video Broadcasting
The MPEG-2 based standard of digital transmission and reception. Comes in variants according to the type of
broadcast, eg DVB-T for terrestrial.
E
Encryption
The scrambling of a signal so that it can be received via a decoder only by specific viewers, eg after the payment of a fee.
F
Feedback
A loud noise produced when the amplified sound from an output (loudspeaker) is picked up by an input
(microphone, phonograph) feeding that loudspeaker. This can be potentially damaging both to the speaker(s) in
question and to the hearing of the subjected listener. This phenomenon is sometimes the result of poor engineering,
but more often due to a lack of understanding (or drunkenness, or both) on the part of an announcer or performer as
he walks in front of a live PA speaker. It may also occur when an input is directly patched into an output of the same
device, usually due to operator error.
In radio broadcasting, feedback may occur when a DJ increases his or her headphone volume to a high enough level
that the microphone is able to pick up the sound coming from the headphones, usually when the DJ's head is turned to
one side or another.
Format Clock
A format clock is a diagram produced by a programme director or a producer to illustrate where each programming
element appears in a typical hour.
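A format clock can be sketched as a simple lookup from minute offsets to programming elements; all the element names and offsets below are invented for illustration:

```python
# A toy format clock: minute offsets within the hour mapped to
# programming elements (names and timings are hypothetical).

format_clock = {
    0: "legal ID + news",
    4: "music sweep",
    20: "stop set (commercials)",
    24: "music sweep",
    40: "stop set (commercials)",
    44: "music sweep",
    58: "promo + weather",
}

def element_at(minute):
    """Return the element scheduled at a given minute past the hour."""
    start = max(m for m in format_clock if m <= minute)
    return format_clock[start]

print(element_at(0))   # legal ID + news
print(element_at(22))  # stop set (commercials)
```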
Frame rate
The number of times the television picture is refreshed in a second. As a rule of thumb, this is the same as the local
Alternating Current electricity supply frequency - 60 Hz or 50 Hz.
Front sell
The technique where the DJ announces the song title and/or artist before the song is played; the opposite of a backsell.
G
Gain
The amplification applied to a signal; colloquially, volume.
GHz - Gigahertz
Thousand million cycles per second. The measurement for satellite frequencies.
H
Hammocking
Placing a new or poorly-performing programme between two established popular programmes in order to boost
viewing figures.
High definition
In modern terms, broadcasting using a line standard of greater than 1000 lines. Prior to World War II, "high definition"
was used to mean a line standard greater than 240 lines.
Hitting the post
Where a DJ continues to talk right up to the point where the vocals commence.
I
Ident
A short graphic or animated sequence identifying the station or channel, usually shown between programmes.
Image Liner
A short audio clip played frequently on a radio station between songs and ads to identify the station being aired,
eg the station's call letters or positioning statement.
Interactive television
Systems that allow viewers to interact (eg play games, shop for related items or find further information) either two-
way, via a telephone line, or one-way, via MHEG graphics.
J
Jingle
A produced programming element usually in the form of vocals to accompanying music often produced in-house to
identify the show, DJ or the station.
K
kHz - Kilohertz
Thousand cycles per second. kHz is used to measure mediumwave and often shortwave frequencies.
L
Legal ID
In the US, the station identification consisting of the station call letters followed by the community of license. Given
as close as practical to the top of the hour at a natural break in program offerings.
Letterbox
The appearance of black bars at the top and bottom of a picture when 16:9 or 14:9 widescreen material is shown on
4:3 sets. See also pillar box and postage stamp.
Liner
A piece of written text that the DJ says over the intro of a song or between spots and songs. Liners are designed to
invoke the imagination.
Line standard
The number of lines broadcast to make up a television picture. Generally, 525 in NTSC areas and 625 elsewhere.
Live
Any programming which is broadcast immediately as it is being delivered (a live report); performed (a live concert or
show); or captured (live news or sports coverage). Requires an unbroken communications chain without any
intervening recording or storage technology. Considered the most exciting form of broadcasting, delivered “as it
happens”.
Live-on-tape
A pre-recorded program produced in real time, usually with a studio audience, for later broadcast. Requires precisely
timed pauses for insertion of station breaks and commercials at time of broadcast. Typically employed for network
broadcast across multiple time zones. Also applies to live broadcasting which is simultaneously recorded for
rebroadcast at a later time or date.
Log
* A Commercial Log records which commercials were played during the day.
M
Macrovision
An analogue copy-protection system that adds pulses to the video signal to disrupt recording by domestic VCRs.
MHz
Million cycles per second. The bandwidth area for FM broadcasts and television.
Miscue
A mistake by the DJ or production engineer resulting in two audio elements being played at the same time, eg an
interview and the next song.
N
NBC - National Broadcasting Company
Network
A system which distributes programming to multiple stations simultaneously, or slightly delayed, for the purpose of
extending total broadcast coverage beyond the limits of a single radio or television signal.
NEMO
(Not Emanating Master Operations) An early term used in remote broadcast operations. It was often used when the
DJ/announcer operated his or her own mixer board feeding directly to the transmitter via the station's master control,
without a dedicated licensed broadcast engineer continuously monitoring the incoming remote signal. The on-air
talent/engineer needed an FCC third class broadcast license to operate the NEMO remote system.
Nielsen ratings
Survey of US viewers by the AC Nielsen Company to establish the audiences for individual programmes and their
demographics.
NTSC - National Television System Committee
An American committee formed to set the line standard and later the color standard for broadcasting. Gave its name to
the method of color reproduction used in the Americas (except Brazil) and in Japan.
O
Ofcom - Office of Communications
The regulator for the UK communications industries, including television and radio broadcasting.
Optical
Generically, any on-screen graphic. Specifically, a graphic inserted between a programme and an advertisement or
between individual advertisements.
OOV - Out of Vision
A stage instruction noting that a character is not seen when speaking. Also, in continuity announcing, the practice of
speaking over a caption rather than appearing on screen.
OB - Outside Broadcast
A broadcast from a remote location rather than the studio; also the vans used to uplink signals directly to satellite
from such locations.
P
PAL - Phase Alternating Line
Television broadcast system used in Europe and Australia & New Zealand, also parts of Asia, Africa and South
America.
P as B - Programme as Broadcast
A BBC term for a (supposedly contemporaneous) log of a channel's output; also a video (or film) recording of an
individual live programme.
Pay-per-view
Reception of a scrambled film or sporting event after the payment of a one-off fee for that broadcast.
Pink Event
The erroneous effect of pink and green flashing on a video signal, usually caused by a disturbance to the SDI
input/output of broadcast equipment.
PSA - Public Service Announcement
A government-produced commercial, usually shown for free, giving safety information or advice.
Pillarbox
The appearance of blank bars on either side of the picture when 4:3 material is shown on a 16:9 widescreen television
set.
Pilot
A one-off episode of a proposed series, usually in extended form, to gauge audience reaction. If successful, the rest of
the series is made and the pilot becomes the first episode.
Pips
Slang term for the time signal broadcast by some radio stations at the top of the hour.
Playlist
The official songs that a radio station will play during a given week. The playlist is not usually chosen by the DJ.
Positioning statement
A radio station's mission statement or vision statement. A one to two sentence statement that conveys what you do for
whom, to uniquely solve an urgent need. These are usually aired during Image Liners.
Postage stamp
The appearance of a black border all around the picture, usually in error, when 4:3 material is converted to 16:9 and
then back to 4:3 before broadcast.
Pot - Potentiometer
A rotary control on a mixing desk or other equipment used to adjust the level of an audio signal.
Production Element
A Production Element is a piece of audio used in the final audio mix. This may include commercials, music,
sound effects, audio effects (eg echo), station IDs, program signatures or announcements.
Producer
The person who performs or manages the day-to-day business operations of a station. Also the person responsible for
an individual program - a radio producer or a television producer.
Promo
An announcement (either recorded or live) used to promote the station's image or other event.
Q
Quadraphonic
Sound reproduction utilising four speakers. Now superseded by Dolby 5.1 Surround Sound.
R
Racks
Control panel where several television cameras are matched together by operator(s) for exposure, colour balance and
black level.
Ramp
The instrumental introduction of a song, over which a DJ may talk before the vocals commence.
S
SB - Simultaneous Broadcasting
British term for the broadcast of the same programme from multiple transmitters.
Screenburn
Where a permanent mark is burnt into the mask of the TV screen due to prolonged display. Common with sets tuned
to one channel for promotional purposes or on ordinary sets from DOGs inserted by broadcasters. Also known as
Phosphor burn-in.
Slipmat
A slipmat was a mat placed on a record deck between the deck and the record. Normally made by the DJ, it was cut
significantly oversized compared to a vinyl record. The DJ would cue the record to the beginning of a song and then,
holding onto the mat, turn the turntable on whilst the record stayed at the beginning of the song. The DJ could then
introduce the record and release the mat onto the already spinning deck, so the record quickly reached 33 or 45 rpm.
The effect was to reduce the whirl produced by turning on the turntable.
Soundbite
A small portion (usually one or two sentences) of an audio recording (often an interview) used to illustrate a news
story in the words of the interviewee (c.f. a quotation from a politician).
Sponsorship
In the United States, the practice of a company funding the making of a program in order to entertain an audience and
sell a product. In the UK, an advertisement inserted between the end-of-part caption and the breakbumper.
Spot
Spot advertising
A commercial or commercials run in the middle of or between programmes, sold separately from the programme (as
opposed to sponsors' messages).
Stop set
The place where commercials are played during a typical broadcast hour. There may be several scattered throughout a
typical 60 minute period. Stop set length can vary greatly between local stations and even network programming.
Subtitles
Text version of a programme's dialogue, overlaid on the screen either at broadcast or at reception (often via Teletext
or Closed Captioning) for the hearing impaired or for when a speaker is unclear or speaking in a foreign language.
Sweeps
A period, usually in February, May, July and November, when the AC Nielsen Company undertakes to record the
ratings of all shows in all markets with all demographics. This allows networks and local stations to spot surprise hits
and unexpected failures. It is also a time when a successful network will try pilot episodes of new shows, whilst a
failing network will often put existing successful programs in place of poorly performing shows to boost average
ratings.
T
Tape sync
An interview conducted by phone and recorded in both locations, with the two recordings to be mixed later.
Teaser
A part of a program played before the title sequence, usually featuring a cliffhanger or prefiguring the plot of the
episode to follow.
Teletext
Electronic information inserted into the unused parts of a television signal and decodable by an equipped television
set.
Television
The transmission of pictures and sound by radio frequency or cable for public reception.
Tiling
The appearance of large non-congruent blocks on a video display when a digitally generated broadcast (i.e., image)
was received by the monitor in an incomplete form. Tiling also occurs when the video signal has degraded or been
partially interrupted as it was received by the monitor.
Transponder
A physical part of a satellite that broadcasts the signal. In colloquial use, the satellite equivalent of the "channel" a
television station is broadcast on (eg "broadcasting from Transponder 2C of the satellite").
U
UHF - Ultra High Frequency
Frequencies between 300 MHz (wavelength 1 meter) and 3.0 GHz (wavelength 10 centimetres), used for television
broadcasting.
V
VBI - Vertical Blanking Interval
The blank area out of sight at the top and bottom of a television picture that allows the raster gun to reset. The space
created is often used for Teletext and other services.
VHF - Very High Frequency
Frequencies from 30 MHz (wavelength 10 m) to 300 MHz (wavelength 1 m), used for radio and television
broadcasting.
VJ - Video Jockey
W
WARC - World Administrative Radio Conference
The regular meetings of the CCIR (now ITU-R) to allocate radio frequency spectrum.
Wendy
Watermark
A common practice of displaying a company's logo during a television broadcast, typically a translucent image in the
right hand bottom corner. (See also Bug and DOG)
X
XM Satellite Radio
A subscription satellite radio service operating in the United States.
Y
Y
Luminance in many color models used for television broadcast, such as YIQ and YUV.
Z
Zoom
To go from a long shot to a close-up (or vice versa) with the camera. In the UK, the name given by Associated
TeleVision to their idents.
Webcast
A webcast is a media file distributed over the Internet using streaming media technology to distribute a single
content source to many simultaneous listeners/viewers. A webcast may either be distributed live or on demand.
Essentially, webcasting is “broadcasting” over the Internet.
The largest "webcasters" include existing radio and TV stations, who "simulcast" their output, as well as a
multitude of Internet only "stations". The term webcasting usually refers to non-interactive linear streams or events.
Rights and licensing bodies offer specific "webcasting licenses" to those wishing to carry out Internet broadcasting
using copyrighted material.
Webcasting is also used extensively in the commercial sector for investor relations presentations (such as Annual
General Meetings), in E-learning (to transmit seminars), and for related communications activities. However,
webcasting does not bear much, if any, relationship to web conferencing, which is designed for many-to-many
interaction.
The ability to webcast using cheap/accessible technology has allowed independent media to flourish. There are
many notable independent shows that broadcast regularly online. Often produced by average citizens in their homes
they cover many interests and topics. Webcasts relating to computers, technology, and news are particularly popular
and many new shows are added regularly.
Streaming media
Live streaming is a technology for providing audio and/or video files (either live or on demand) from an online environment.
Streaming media is multimedia that is constantly received by and presented to an end-user while being delivered
by a streaming provider.[note 1] The name refers to the delivery method of the medium rather than to the medium
itself. The distinction is usually applied to media that are distributed over telecommunications networks, as most
other delivery systems are either inherently streaming (e.g., radio, television) or inherently non-streaming (e.g.,
books, video cassettes, audio CDs). The verb 'to stream' is also derived from this term, meaning to deliver media in
this manner. Internet television is a commonly streamed medium.
Live streaming, more specifically, means taking the media and broadcasting it live over the Internet. The process
involves a camera for the media, an encoder to digitize the content, a media publisher where the streams are made
available to potential end-users and a content delivery network to distribute and deliver the content. The media can
then be viewed by end-users live.
Security remains one of the main challenges with this new methodology. Digital rights management (DRM)
systems are an example of a solution to keep this content secure.
A broadband speed of 2.5 Mbps or more is recommended for streaming movies, for example to an Apple TV,
Google TV or a Sony TV Blu-ray Disc Player, 10 Mbps for High Definition content.[3]
Unicast connections require multiple connections from the same streaming server even when it streams the same
content.
Streaming media storage size is calculated from the streaming bandwidth and length of the media using the
following formula (for a single user and file):
storage size (in megabytes) = length (in seconds) × bit rate (in bit/s) / (8 × 1024 × 1024)[note 2]
One hour of video encoded at 300 kbit/s (a typical broadband video as of 2005, usually encoded in a
320 × 240 pixel window size) will be: (3,600 s × 300,000 bit/s) / (8 × 1024 × 1024) ≈ 128.7 MB of storage.
If the file is stored on a server for on-demand streaming and this stream is viewed by 1,000 people at the same time
using a Unicast protocol, the requirement is: 300 kbit/s × 1,000 viewers = 300 Mbit/s of serving bandwidth.
This is equivalent to around 135 GB per hour. Of course, using a multicast protocol the server sends out only a
single stream that is common to all users. Hence, such a stream would only use 300 kbit/s of serving bandwidth.
See below for more information on these protocols.
If the show last for 3 hours, with 3000 viewers then the calculation is:
Number of MB transferred = encoder speed (in bps) × number of seconds × number of viewer /
(8*1024*1024)
Number of MB transferred = 500.000 (bps) × 3 × 3600 ( = 3 hours) × 3000 (nbr of viewers) /
(8*1024*1024) = 1931190 MB
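The arithmetic above can be checked with a short helper; this is an illustrative sketch (the function name is made up), implementing exactly the formula stated in the text.

```python
def streaming_storage_mb(length_s: float, bitrate_bps: float, viewers: int = 1) -> float:
    """Storage/transfer size in MB for a stream of the given length and bit rate."""
    return length_s * bitrate_bps * viewers / (8 * 1024 * 1024)

# One hour at 300 kbit/s for a single viewer: about 129 MB of storage.
one_hour = streaming_storage_mb(3600, 300_000)

# Three-hour show at 500 kbit/s with 3,000 unicast viewers: about 1,931,190 MB.
big_show = streaming_storage_mb(3 * 3600, 500_000, viewers=3000)
```

With a multicast protocol the `viewers` factor disappears, since the server emits only one stream regardless of audience size.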
The audio stream is compressed using an audio codec such as MP3, Vorbis or AAC.
The video stream is compressed using a video codec such as H.264 or WebM.
Encoded audio and video streams are assembled in a container bitstream such as FLV, ASF or ISMA.
The bitstream is delivered from a streaming server to a streaming client using a transport protocol, such as MMS or
RTP.
The streaming client may interact with the streaming server using a control protocol, such as MMS or RTSP.
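The encode-and-package pipeline described above (compress audio, compress video, assemble into a container) can be sketched as an ffmpeg command assembled in Python. This is a hedged illustration, not taken from any product named here: the filenames are placeholders, and only widely documented ffmpeg options are used (libx264 for H.264 video, AAC audio, the FLV container).

```python
def build_encode_command(source: str, target: str) -> list:
    # Compress video with H.264, audio with AAC, and mux both into an FLV
    # container bitstream, mirroring the steps listed in the text.
    return [
        "ffmpeg",
        "-i", source,          # input media (placeholder filename)
        "-c:v", "libx264",     # video codec: H.264
        "-c:a", "aac",         # audio codec: AAC
        "-f", "flv",           # container bitstream: FLV
        target,                # output (placeholder filename)
    ]

cmd = build_encode_command("camera.avi", "stream.flv")
```

The resulting list would be passed to `subprocess.run` on a machine with ffmpeg installed; delivery to clients over MMS or RTP is then the streaming server's job.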
Protocol issues
Designing a network protocol to support streaming media raises many issues, such as:
Datagram protocols, such as the User Datagram Protocol (UDP), send the media stream as a series of small packets.
This is simple and efficient; however, there is no mechanism within the protocol to guarantee delivery. It is up to the
receiving application to detect loss or corruption and recover data using error correction techniques. If data is lost, the
stream may suffer a dropout.
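Since UDP itself gives no delivery guarantee, the receiving application typically numbers packets and looks for gaps, much as RTP does with its sequence-number field. A minimal illustrative sketch (the function and its inputs are invented for this example):

```python
def find_lost_packets(received_seqs, first_seq, last_seq):
    """Return the sequence numbers that never arrived, as a UDP/RTP
    receiver might compute them before attempting error concealment."""
    expected = set(range(first_seq, last_seq + 1))
    return sorted(expected - set(received_seqs))

# Packets 3 and 6 were dropped somewhere on the network.
lost = find_lost_packets([1, 2, 4, 5, 7], first_seq=1, last_seq=7)
```

A real client would then conceal the loss (interpolation, forward error correction) or simply play through the dropout, since retransmission is often too slow for live media.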
The Real-time Streaming Protocol (RTSP), Real-time Transport Protocol (RTP) and the Real-time Transport Control
Protocol (RTCP) were specifically designed to stream media over networks. RTSP runs over a variety of transport
protocols, while the latter two are built on top of UDP.
Another approach that seems to incorporate both the advantages of using a standard web protocol and the ability to be
used for streaming even live content is the HTTP adaptive bitrate streaming. HTTP adaptive bitrate streaming is based
on HTTP progressive download, but contrary to the previous approach, here the files are very small, so that they can
be compared to the streaming of packets, much like the case of using RTSP and RTP. [4]
Reliable protocols, such as the Transmission Control Protocol (TCP), guarantee correct delivery of each bit in the
media stream. However, they accomplish this with a system of timeouts and retries, which makes them more complex
to implement. It also means that when there is data loss on the network, the media stream stalls while the protocol
handlers detect the loss and retransmit the missing data. Clients can minimize this effect by buffering data for display.
While delay due to buffering is acceptable in video on demand scenarios, users of interactive applications such as
video conferencing will experience a loss of fidelity if the delay that buffering contributes to exceeds 200 ms. [5]
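The buffering trade-off can be made concrete: given the arrival time of each packet and a fixed playout interval, the client must delay playback just long enough that no packet is needed before it has arrived. A toy sketch (the times and function name are illustrative, not from any real player):

```python
def min_startup_delay(arrival_times, playout_interval):
    """Smallest startup delay (same unit as the inputs) such that packet i,
    played at start_delay + i * playout_interval, has always arrived."""
    return max(
        max(t - i * playout_interval for i, t in enumerate(arrival_times)),
        0,
    )

# Packets are due every 20 ms; the third arrives 35 ms behind its slot,
# so playback must start at least 35 ms after the first packet arrives.
delay = min_startup_delay([0, 18, 75, 61, 82], playout_interval=20)
```

For video on demand a delay like this is harmless; for two-way conferencing, any buffering that pushes total delay past roughly 200 ms degrades the interaction, as noted above.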
Unicast protocols send a separate copy of the media stream from the server to each recipient. Unicast is the norm for
most Internet connections, but does not scale well when many users want to view the same program concurrently.
Multicasting broadcasts the same copy of the multimedia over the entire network to a group of clients.
Multicast protocols were developed to reduce the data replication (and consequent server/network loads) that occurs
when many recipients receive unicast content streams independently. These protocols send a single stream from the
source to a group of recipients. Depending on the network infrastructure and type, multicast transmission may or may
not be feasible. One potential disadvantage of multicasting is the loss of video on demand functionality. Continuous
streaming of radio or television material usually precludes the recipient's ability to control playback. However, this
problem can be mitigated by elements such as caching servers, digital set-top boxes, and buffered media players.
IP Multicast provides a means to send a single media stream to a group of recipients on a computer network. A
multicast protocol, usually Internet Group Management Protocol, is used to manage delivery of multicast streams to
the groups of recipients on a LAN. One of the challenges in deploying IP multicast is that routers and firewalls
between LANs must allow the passage of packets destined to multicast groups. If the organization that is serving the
content has control over the network between server and recipients (i.e., educational, government, and corporate
intranets), then routing protocols such as Protocol Independent Multicast can be used to deliver stream content to
multiple Local Area Network segments.
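At the application level, joining an IP multicast group is a matter of asking the kernel to send the IGMP membership report; the application only constructs the membership request. A minimal Python sketch using the standard socket API (the group address below is an example from the administratively scoped range):

```python
import socket

def membership_request(group: str, interface: str = "0.0.0.0") -> bytes:
    """Pack an ip_mreq structure (group address + local interface address)
    for use with the IP_ADD_MEMBERSHIP socket option."""
    return socket.inet_aton(group) + socket.inet_aton(interface)

def join_group(sock: socket.socket, group: str) -> None:
    # The kernel then speaks IGMP to the local router on our behalf.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    membership_request(group))

mreq = membership_request("239.1.2.3")  # 8 bytes: group addr + interface addr
```

Whether traffic for that group actually reaches the host still depends on the routers and firewalls between the LANs, as discussed above.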
Peer-to-peer (P2P) protocols arrange for prerecorded streams to be sent between computers. This prevents the server
and its network connections from becoming a bottleneck. However, it raises technical, performance, quality, and
business issues.
P2P computing or networking is a distributed application architecture that partitions tasks or workloads between
peers. Peers are equally privileged, equipotent participants in the application. They are said to form a peer-to-peer
network of nodes.
Peers make a portion of their resources, such as processing power, disk storage or network bandwidth, directly
available to other network participants, without the need for central coordination by servers or stable hosts.[1] Peers
are both suppliers and consumers of resources, in contrast to the traditional client–server model where only servers
supply, and clients consume.
The peer-to-peer application structure was popularized by file sharing systems like Napster. The peer-to-peer
computing paradigm has inspired new structures and philosophies in other areas of human interaction. In such
social contexts, peer-to-peer as a meme refers to the egalitarian social networking that is currently emerging
throughout society, enabled by Internet technologies in general
Definition: Peer-to-Peer (P2P) architectures for multimedia streaming have emerged in recent years; they can
eliminate the need for costly dedicated video servers required in the traditional client-server approach.
The basic concept of peer-to-peer (P2P) computing is not new and some techniques date back many years when the
Internet was first designed. However, the key phrase “peer-to-peer” has become widely and publicly recognized
mostly after the pioneering Napster file sharing network emerged in the late 1990s. Peer-to-peer is a very general
term and people associate different concepts with it. Various forms of P2P techniques have been used in the fields
of computing, networking, distributed file systems, and others. In this chapter we focus on how P2P techniques are
being used for streaming media distribution.
P2P systems have some key characteristics that distinguish them from the traditional and widely used client-server
model. The most prominent feature is that a P2P system is composed of a number of member nodes, each of which
combines the functionality that is traditionally associated with both the server and the client. As such, multiple P2P
nodes can form a collective that aggregates their resources and functionality into a distributed system. Node A may
act as a client to node B while at the same time functioning as a server to node C. Beyond this fundamental
characteristic, there are a number of features that are often associated with P2P systems. Note, however, that
usually only a subset of the following characteristics holds true for any practical system.
Reduced central control. Many P2P systems work in a fully decentralized fashion where all the nodes have equal
functionality. The members are connected based on a system-specific construction policy and form a distributed
topology. Exceptions to this model exist. For example, the original Napster file sharing network used a centralized
index to locate files; subsequently the data was exchanged directly between individual peers.
Heterogeneity. Members of a P2P system are usually heterogeneous in terms of their computing and storage capacity,
network bandwidth, etc. A system may include high performance nodes on a university network and computers
owned by residential users with broadband or modem connections.
Flat topology. Members of the P2P network are often treated equally which results in a flat connection topology.
However, hierarchical systems exist that introduce the concept of “super-peers.”
Autonomy. The time and resources that a member node can or will contribute to the system are dynamic and
unpredictable. Often, nodes are under different administrative control. Hence the enforcement of global policies is a
challenge.
Fault resilience. P2P members may join or leave the topology at any time. Therefore, not only is the formed
community very dynamic, but no assumptions should be made about the availability of resources or network paths. A
P2P system must be able to recover from the unexpected and ungraceful leave of any of its members at any time.
Members of a P2P system are also referred to as nodes because they are often represented as network nodes in a
topology graph.
Streaming is the process of generating and delivering a steady, isochronous flow of data packets over a networking
medium, e.g., the Internet, from a source to a destination. The rendering of the content starts as soon as a small
fraction of the data stream has been received. Streaming media usually denotes digital audio and video data,
although haptic or other data may be streamed as well. One of the main resource bottlenecks that afflicts large
client-server distribution architectures is the massive bandwidth that must be available from the server into the core
of the network. This network connection is often very costly (compared to the server and client hardware) and may
render a technically feasible solution economically not viable. Peer-to-Peer streaming is an alternative that
alleviates the bandwidth cost problem by offering a service to deliver continuous media streams directly between
peer nodes. However, the previously listed characteristics of P2P systems influence the design of such decentralized
streaming solutions.
Theoretically, P2P architecture can be built over any networking medium and at potentially different layers of the
network. However, most of the existing P2P implementations and their associated research have focused on
application-level overlay networks. The Internet, as the dominant networking medium for research, business and
entertainment, is also the preferred choice for P2P network substrates.
One of the virtues of today’s P2P systems is their scalable nature. Peer-to-peer technologies were first widely used
and accepted as file-sharing platforms in systems such as Napster, Gnutella and KaZaA. Subsequently, the P2P
architecture evolved and was adapted for store-and-forward streaming. Examples of streaming systems that may be
used to distribute previously stored content are Narada, HMTP, and Pastry. One distinguishing characteristic among
these proposals is the shape of the streaming topology they construct, which will be described later in this chapter.
Even though these designs promise good performance in terms of network link stress and control overhead, only a
few of them have been implemented in real systems. Next, P2P technology was adapted for live streaming. In this
scenario, media streams are generated by live sources (e.g., cameras and microphones) and the data is forwarded to
other nodes in real-time. We distinguish two types of live streaming: one-way and two-way. The requirements for
the two are quite different and more details follow below.
Streaming Process
A streaming process can be separated into three stages that overlap in time (Figure 1): data acquisition, data
delivery and data presentation. Data acquisition is the stage that determines how the streaming content is acquired,
packetized and distributed for streaming. The data presentation stage represents the methods on how to buffer,
assemble and render the received data. Data delivery is the process of how the stream data is transported from the
source to the destination. The source, the destination and all the intermediate nodes in a streaming system
participate in a topology that is constructed based on the specific system’s protocol. In a P2P streaming system, this
network architecture exhibits peer-to-peer characteristics.
One-way live applications have similar requirements as their on-demand cousins. One obvious difference is that the
source data is generated in real time by a source device such as a camera, a microphone or some other sensor. One
application is the broadcasting of live events such as sports games. Data may be cached for later on-demand
viewing. Two-way live applications have very different requirements. Here, the end-to-end latency is crucial to
enable interactive communications. Note that P2P topologies have a disadvantage in terms of minimizing the
latency among participants because application-level processing is often required at every node. Skype was
probably the first successful Internet telephony system built on a P2P streaming architecture. It demonstrated that
the latency problem can be solved and that P2P technology, with its many advantages, can indeed be used for live
streaming purposes. AudioPeer, which is built on top of the ACTIVE architecture, is another multiparty audio
conferencing tool. It is designed specifically for large user groups. Its design distinguishes active users from passive
users and provides low-latency audio service to active users.
Data Delivery
The transmission of one or multiple copies of the content from a source node to a destination node is called a
streaming session. A streaming session starts when a streaming request is made and ends when all associated
destination nodes have received the last byte of the content. Depending on the number of source and destination
nodes involved in a streaming session, we can distinguish three types of streaming systems: one-to-many, many-to-
one and many-to-many (see Figures 4, 5, 6). All of these three types apply to either live or on-demand streaming.
One-to-many streaming is also called broadcasting. It delivers content from a single source to multiple destination
nodes. Much research has focused on how to make the delivery process fast and efficient for one-to-many
streaming. P2P systems naturally produce a multi-cast distribution tree since any peer that receives a stream can
forward it to multiple other nodes. Many-to-one streaming delivers data from multiple sources to a single
destination. A good example is an on-demand movie viewer who simultaneously downloads fragments of the movie
clip from multiple peers. Many-to-many streaming combines the features of the previous two designs and usually
requires a more complicated delivery network, which we will discuss in detail in following sections.
The P2P network architecture represents the topology in which the nodes are interconnected in a P2P system. The P2P
streaming architecture is the data path over which the streaming content is delivered from source to destination
nodes. For a P2P streaming system, the network architecture is not necessarily the same as the streaming
architecture. For example, Pastry is a P2P network protocol constructing a ring-shaped network architecture, and
Scribe is the streaming architecture built on top of Pastry. For most P2P systems, however, these two architectures are
identical and can be represented in a single topology graph.
P2P streaming topologies, including the network architecture and the streaming architecture, can be categorized
into four types: tree, mesh, ring and hybrid (see Figure 7). Tree structures start with a root node and add new nodes
in a parent/child fashion. Many systems are built as tree topologies, e.g., AudioPeer, Yoid and HMTP. A mesh-based
topology builds a full interconnect from each node to every other node, constructing a fully-connected map.
For example, Narada builds a mesh structure among all the
peers and then for each peer constructs a single-source multicast tree from the mesh structure. Due to its centralized
nature, Narada does not scale well. A ring-shaped topology links every node in the graph sequentially. This is
usually done by assigning each node a unique node ID, which is generated by specific algorithms such as a
distributed hash table (DHT). Finally, a hybrid approach combines two or more of the previous designs into their
topology graph. Hybrid systems are usually divided into multiple hierarchical layers and different topologies are
built at each layer. For example, NICE was developed as a hierarchical architecture that combines nodes into
clusters. It then selects representative parents among these clusters to form the next higher level of clusters, which
then is represented as a tree topology.
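The scalability difference between these topologies can be made concrete by counting the connections each one needs for n peers: a tree needs n − 1 links, a ring n, and a full mesh like Narada's n(n − 1)/2, which grows quadratically. A small illustrative helper (the function name is my own):

```python
def link_count(topology: str, n: int) -> int:
    """Number of connections needed to join n peers in each basic topology."""
    if topology == "tree":
        return n - 1             # every non-root node has exactly one parent
    if topology == "mesh":
        return n * (n - 1) // 2  # every pair of nodes is directly connected
    if topology == "ring":
        return n                 # each node links to its successor
    raise ValueError(topology)

mesh_links = link_count("mesh", 100)  # quadratic growth limits mesh scale
tree_links = link_count("tree", 100)  # linear growth
```

This is why mesh-first designs maintain the full mesh only among a modest number of peers, while hierarchical hybrids such as NICE cluster nodes to keep per-layer link counts small.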
From the perspective of a peer, the life-cycle of a P2P streaming session can be decomposed into a series of four
major processes: finding the service, searching for specific content, joining or leaving the service, and failure
recovery when there is an error.
In most P2P systems, service discovery is accomplished through a bootstrap mechanism that allows new nodes to
join the P2P substrate. It may be accomplished through some dedicated “super-peers” to act as the well-known
servers to help new peers to find other member nodes. These “super-peers” are called Rendezvous Point (RP)
servers and are sometimes under the control of the administrator of the P2P system. A new peer finds the existence
of the running Rendezvous Point Server from its pre-loaded RP server list. The list can be updated once a peer is
connected to one of the RP servers. RP servers can also be used to collect statistic data and in some systems, these
“super-peers” are connected to form a backbone streaming platform to make the system more stable.
The next step for a peer, after joining the collective, is to locate a stream or session. The availability of specific
content can be discovered in two distinct ways. In an unstructured design, streams and files are located by flooding
the P2P network with search messages. This technique is obviously wasteful and may result in significant network
traffic. The second approach, called structured, is to index the content such that search messages can be forwarded
efficiently to specific nodes that have a high probability to manage the desired content. To keep with the distributed
theme of P2P systems, indexing is often achieved by hashing a content key and assigning that key to nodes with a
distributed hash table (DHT) mechanism.
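The structured, DHT-based lookup can be sketched with a toy consistent-hashing placement: hash both node IDs and the content key onto one circular ID space, and assign the key to the first node at or after it on the ring. The node names and content key below are invented for illustration; real DHTs such as Pastry add efficient multi-hop routing on top of this placement idea.

```python
import hashlib

def _hash(value: str) -> int:
    # Map an arbitrary string onto the circular ID space via SHA-1.
    return int(hashlib.sha1(value.encode()).hexdigest(), 16)

def responsible_node(content_key: str, nodes: list) -> str:
    """Toy DHT placement: the node whose hashed ID is the first at or after
    the hashed content key on the ring owns (indexes) that content."""
    ring = sorted((_hash(name), name) for name in nodes)
    key = _hash(content_key)
    for node_hash, name in ring:
        if node_hash >= key:
            return name
    return ring[0][1]  # wrap around the ring

owner = responsible_node("movie.mp4", ["peerA", "peerB", "peerC"])
```

Because placement depends only on the hashes, every peer computes the same owner independently, so a search message can be forwarded directly toward the responsible node instead of flooding the network.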
JOIN: After retrieving the necessary information from the RP server, or gaining enough information from the P2P
system through some methods such as flood-based search, a new peer can join an existing session by establishing
the necessary connections to already joined peers. After the join operation is done, a peer is considered to be a
legitimate member.
LEAVE: Every member of a P2P system is usually also serving some other peers as part of the duty to share the
load of the whole system. An unexpected departure of a peer can cause disruptions or loss of service for other peers
in the system. Ideally a peer should help to reconcile the disconnect in the streaming network caused by its
departure. If a system protocol is well designed, this process can be very fast and almost unnoticeable to the end
user application.
RECOVERY: In the dynamic environment of a P2P system where peers are under different administrative control,
the unexpected departure of peers is unavoidable. A P2P streaming system must cope with these failures and
include a robust and efficient recovery mechanism to repair the streaming topology. However, on the positive side,
since a robust recovery mechanism is an integral part of the design, this makes P2P systems naturally very tolerant
to faults.
The following tables compare general and technical information for a number of streaming media systems, both
audio and video. Please see the individual systems' linked articles for further information.
Name | Creator | First public release (yyyy-MM-dd) | Latest stable version (release date) | Cost (USD) | License | Media | Media player
PeerCast | Giles | ? | 0.1217 | Free | GPL | Audio/Video | ?
Flash Media Server | Macromedia/Adobe Systems | 2002-07-09 | 4.0 (2010-09-13) | $4,500 | proprietary | Video | Flash Player
Wowza Media Server | Wowza Media Systems | 2007-02-17 | 2.2.3 (2010-12-16) | Free developer license; $995 perpetual; $65/mo subscription | proprietary | Audio/Video/Data | Any with appropriate protocol support, including Flash players, Silverlight players, QuickTime players, VLC players, Safari (HTML5), iPhone/iPad/iPod touch, 3GPP (Android, Blackberry, Symbian, etc.), IPTV set-top boxes, game consoles (Wii, PS3 and others)
Darwin Streaming Server | Apple Inc. | 1999-03-16 | 6.0.3 (2007-05-10) | Free | APSL | Audio/Video | Any with appropriate protocol support
Flumotion Streaming Server | Flumotion | 2004-11-30 | 0.8.0 (2010-09-15) | Free | GPL | Audio/Video | Any with appropriate protocol support
Firefly | Ron Pedde | ? | 0.2.4.1 (2007-10-21) | Free | GPL | Audio | Any with appropriate protocol support
FreeCast | Alban Peignier | 2004-09-14 | 2006-06-29 | Free | GPL | Audio/Video | FreeCast client
Helix DNA Server | RealNetworks | 2003-01-22 | 11.1 (2006-06-10) | Free | RCSL/RPSL | Audio/Video | Any with appropriate protocol support
Helix Universal Server | RealNetworks | 1994-01-01 | 14.0 (2010-04-14) | Free for 12 months (Basic); $1,000-$10,000 | proprietary | Audio/Video | Any with appropriate protocol support (PC & mobile devices)
Windows Media Services | Microsoft | ? | ? | Free | proprietary | Video | Windows Media Player
Broadwave | NCH Software | 2006-07-21 | 1.01 | Free (personal); $136 (commercial) | proprietary | Audio | Any with appropriate protocol support
Icecast | Xiph.Org Foundation | 1998-12 | 2.3.2 (2008-06-02) | Free | GPL | Audio/Video | Any with appropriate protocol support
Red5 | http://www.red5.org/ | 2005-10 | 0.9.1 (2010-02-21) | Free | LGPL | Audio/Video | Flash
SHOUTcast | Nullsoft | 1998-12 | 1.9.8 (2007-02-28) | Free | proprietary | Audio | Any with appropriate protocol support
Unreal Media Server | Unreal Streaming Technologies | 2003-10 | 7.5 (2010-11-20) | Free, Commercial | proprietary | Audio/Video | Flash, Windows Media, UMedia players
CasparCG | http://www.casparcg.com/ | ? | ? | Free | GPL | Audio/Video | Flash
Feng | http://lscube.org/feng/ | 2007-05-31 | 2009-10-14 | Free | GPL | Audio/Video | Any with appropriate protocol support
Mammoth Server | http://mammothserver.org/ | ? | ? (development seems stalled before beta release) | Free | LGPL | Audio/Video | Flash
Container format support
Name | AVI | ASF | QuickTime | Ogg | OGM | Matroska | MP4 | MPEG transport stream | FLV
PeerCast | ? | ? | ? | Yes | ? | ? | ? | ? | ?
Firefly | No | ? | ? | Yes | ? | ? | ? | ? | ?
Flash Media Server | ? | ? | ? | ? | ? | ? | Yes | No | Yes
Wowza Media Server | No | No | Yes | No | No | No | Yes | Yes | Yes
Darwin Streaming Server | ? | ? | Yes | ? | ? | ? | Yes | ? | No
Flumotion Streaming Server | ? | Yes | Yes | Yes | Yes | Yes | Yes | ? | Yes
FreeCast | ? | ? | ? | Yes | ? | ? | ? | ? | ?
Helix DNA Server | ? | ? | ? | ? | ? | ? | No | ? | ?
Helix Universal Server | No | Yes | Yes | Yes | No | No | Yes | Yes | Yes
Windows Media Services | ? | Yes | ? | ? | ? | ? | ? | ? | No
Broadwave | ? | ? | ? | ? | ? | ? | ? | ? | ?
Icecast | Yes | ? | ? | Yes | ? | ? | ? | ? | No
Red5 | ? | ? | ? | ? | ? | ? | Yes | ? | Yes
SHOUTcast | Yes | ? | ? | Yes | ? | ? | ? | ? | No
Unreal Media Server | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | ?
Protocol support
Information about which internet protocols are supported for broadcasting streaming media content.
Since its requests use only standard HTTP transactions, HTTP Live Streaming is capable of traversing any firewall
or proxy server that lets through standard HTTP traffic, unlike UDP-based protocols such as RTP. This also allows
a content delivery network to be easily implemented for any given stream.
Apple has documented HTTP Live Streaming as an Internet-Draft, the first stage in the process of submitting it to
the IETF as a proposed Internet standard.
List of streaming media systems
Servers
Ampache
Broadwave Allows you to create your own broadcast from pre-recorded or live audio
Darwin Streaming Server
dyne:bolic GNU/Linux live CD ready for radio streaming
FFserver included in FFmpeg
Firefly Media Server
Flash Media Server
Flumotion Streaming Server
FreeJ video streamer for Icecast
GNUMP3d Streaming server for MP3s, Ogg Vorbis files, movies and other media formats.
Helix Universal Server Helix Universal Streaming Server for mobile phone and broadband RTSP, HTTP
(iPhone OS) and RTMP delivery, developed by RealNetworks
HelixCommunity RealNetworks Open Source development community
Icecast an open source streaming media server
Kaltura Full featured open source video platform running on your own servers or cloud.
Packet Ship Commercial Linux RTSP/MPEG-2 TS video server for OEM applications.
PlayOn a media server that runs on a PC and supports Netflix streaming
Parsifal Radio Streaming An easy to use software with a user interface for creating your Internet radio.
PS3 Media Server open source media server for streaming to a Playstation 3
QuickTime Broadcaster
Red5
Erlyvideo
Sirannon an open source media server and client
SHOUTcast audio streaming (HTTP and/or multicast)
Squeezebox Server Open source music streaming server, backboned by a music database (formerly known
as SlimServer)
Steamcast a freeware streaming media server
Subsonic is an open source, web-based media server
TVersity Media Server partially open source, web-based media server
Unreal Media Server - media server supporting UMS, MMS and RTMP protocols.
VideoLAN
Windows Media Encoder
Windows Media Services
Wowza Media Server Unified media server for Flash, Silverlight, Apple iOS (iPhone/iPad), QuickTime,
3GPP mobile, IPTV and game console video/audio streaming
Clients
Amarok
RTMP Stream Plugin
MediaMonkey
MPlayer
Screamer Radio
SDP Multimedia Open Source project to save streaming media to disk
StationRipper
Streamripper
Totem
VLC media player
Winamp a freeware media player for Microsoft Windows
XBMC, a free and open source media center software and framework platform
XMMS
Zinf
Campcaster Open source radio station management, live broadcast and remote automation
FFmpeg
FORscene Java video reviewing, logging, editing and publishing
LastBASH
Liquidsoap
Mod4Win
Muziic
Qtch
QuickTime
SAM Broadcaster Professional Internet broadcasting automation system
Select Station Internet Radio Player
SomaPlayer
Swarmcast
Traction
Streaming Methods
There are two ways to view media on the Internet (such as video, audio, and animations): downloading and
streaming.
Downloading
When you download a file the entire file is saved on your computer (usually in a temporary folder), which you then
open and view. This has some advantages (such as quicker access to different parts of the file) but has the big
disadvantage of having to wait for the whole file to download before any of it can be viewed. If the file is quite
small this may not be too much of an inconvenience, but for large files and long presentations it can be very off-
putting.
The easiest way to provide downloadable video files is to use a simple hyperlink to the file. A slightly more
advanced method is to embed the file in a web page using special HTML code.
Delivering video files this way is known as HTTP streaming or HTTP delivery. HTTP stands for Hypertext
Transfer Protocol, the same protocol used to deliver web pages. For this reason it is easy to set up and use on
almost any website, without requiring additional software or special hosting plans.
Streaming
Streaming media works a bit differently — the end user can start watching the file almost as soon as it begins
downloading. In effect, the file is sent to the user in a (more or less) constant stream, and the user watches it as it
arrives. The obvious advantage with this method is that no waiting is involved. Streaming media has additional
advantages such as being able to broadcast live events (sometimes referred to as a webcast or netcast).
Progressive Downloading
There is also a hybrid method known as progressive download. In this method the video clip is downloaded but
begins playing as soon as a portion of the file has been received. This simulates true streaming, but doesn't have all
the advantages.
The method you choose will depend on your situation, but most people will opt for HTTP streaming (download or
progressive download). This is the easiest and cheapest way to get started. If necessary you can upgrade to a
streaming server later.
Still, you will want to understand both options so the next two pages of this tutorial look at each one in a bit more
detail. After that we'll talk about how to create the actual video files.
Streaming Video Servers
A streaming media or streaming video server is a specialized application which runs on an Internet server. This is
often referred to as "true streaming", since other methods only simulate streaming. True streaming has advantages
such as:
Note: This is a serious step and is well beyond the needs of most websites.
To run your own streaming server, you can either purchase a standalone server machine or purchase a streaming
server software package and install it on an existing web server. Streaming software is available for all common
server platforms such as Linux, Windows, etc.
Helix Universal Server from RealNetworks. This server supports a variety of formats, including RealMedia, Windows
Media, Quicktime and MPEG-4.
Apple Quicktime Streaming Server, supporting a few formats including MPEG-4 and 3GPP.
Macromedia Communication Server, specializing in Flash-based video and interactive multimedia.
This is the simplest and cheapest way to stream video from a website. Small to medium-sized websites are more
likely to use this method than the more expensive streaming servers.
For this method you don't need any special type of website or host — just a host server which recognises common
video file types (most standard hosting accounts do this). You also need to know how to upload files and how to
create hyperlinks (see our website tutorials for more info).
HTTP streaming is a good option for websites with modest traffic, i.e. less than about a dozen people
viewing at the same time. For heavier traffic a more serious streaming solution should be considered.
You can't stream live video, since the HTTP method only works with complete files stored on the server.
You can't automatically detect the end user's connection speed using HTTP. If you want to create different
versions for different speeds, you need to create a separate file for each speed.
HTTP streaming is not as efficient as other methods and will incur a heavier server load.
These things won't bother most website producers — it's normally only when you get into heavy traffic that you
should be worried about them.
That's essentially all there is to it. When a user clicks the hyperlink, their media player opens and begins streaming
the video file. If the file is embedded, it plays right there on the page.
This page provides a brief overview of how streaming video files are created. The next tutorial will provide more
information about the most common streaming formats and how to include them on a web page.
Note: The methods below are for creating stored video files for the purposes of streaming, not for providing live
video broadcasts. Live events must use a streaming server.
1. Use a conversion utility program. This takes an existing digital video file and converts it into the streaming
format of your choice.
2. Export streaming files from video editing software such as Adobe Premiere, Final Cut Pro, etc.
Conversion Utilities
A conversion utility is a stand-alone program which imports a video clip and exports it to a different format.
Examples include RealNetworks RealProducer and Sorenson Squeeze
Basically, you simply open a file and select which format to save it as. You can set various parameters to
optimise the final video. The program will then chug away for some time while it makes the conversion.
In a typical conversion window, the pane on the left shows the original file, while the right pane shows the file as
it is being converted to a streaming media format.
Exporting a File
Most serious video editing applications have options to export video for the internet. This is often the easiest way to
create streaming video files.
We will use Adobe Premiere 6 to illustrate the process. Different applications have different procedures but this
will give you the general idea. You may need to consult your application's help file for specific instructions.
Once you have created your streaming video file, upload it to your website. It doesn't matter where you put it as
long as you can access it from a web page. For example, you might like to create a folder called "media" or "video"
and put all your video files there.
The next step will be to create the appropriate HTML code in your web pages to display the video.
This tutorial provides an overview of the main video streaming formats: Windows Media, Real Media, Quicktime,
MPEG-4 and Flash. We look at the pros and cons of each format and explain the basics of using them to deliver
video on the Internet.
Before you begin, you should understand how video streaming works, especially the difference between true
streaming and HTTP streaming.
In our experience, Windows Media performs well. Files are relatively high in quality and small in size.
On the downside, Microsoft is renowned for frequently changing formats and standards. As a video producer it can
be difficult keeping up with the latest version. Also, Windows Media is a very proprietary format, and platforms other
than Windows/IE may have problems.
1. Choose a format
Windows Media has several different file formats (see the links at the bottom of the page). If you're not sure which
format to use, WMV is the easiest for video. You may also be limited to the options in your editor, which brings us
to...
2. Create a video/audio file
The easiest way to create a Windows Media file is to export a file from your favourite editing application. Open the
original video clip, then look under File > Export to see what options you have.
For an example of how this works, see Exporting WinMedia from Adobe Premiere.
3. Place the files on a web page
This involves entering some HTML code, shown below.
The rest of this page assumes you have a file in your chosen format and you are ready to place it on a web page.
There are two ways to do this: Hyperlinking and embedding.
Make a hyperlink directly to the video file using the following code (change "videofilename.wmv" to your own file
name):
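The code sample from the original page is not reproduced above; a minimal link along these lines is the usual approach (videofilename.wmv is a placeholder for your own file, assumed here to sit in the same folder as the page):

```html
<a href="videofilename.wmv">Play the video</a>
```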
When the end user clicks this hyperlink, their Windows Media Player will open and load the video file for playing.
The video may begin playback almost immediately (simulating true streaming) or it may not. The results are
unpredictable, which is one reason you might choose to embed the file instead.
Embedding a Windows Media file places the video clip on the web page, as per the example on the right. To create
a simple embedded video file like this, use the code below (swap both instances of videofilename.wmv for your
own file name).
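The embed code itself is not shown above; the traditional approach pairs an <object> tag (for Internet Explorer, using the Windows Media Player ActiveX classid) with a fallback <embed> tag (for other browsers). A sketch along these lines, with videofilename.wmv as the placeholder file name and the dimensions chosen arbitrarily:

```html
<object classid="clsid:22D6F312-B0F6-11D0-94AB-0080C74C7E95" width="320" height="286">
  <param name="filename" value="videofilename.wmv">
  <param name="autostart" value="true">
  <!-- fallback for non-IE browsers -->
  <embed src="videofilename.wmv" width="320" height="286"
         autostart="true" type="application/x-mplayer2">
</object>
```

Note that the file name appears twice, once in the <param> and once in the <embed> — both must be changed.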
If you are keen to explore Quicktime's advanced features, you'll find that you can create interactive video,
panoramas, virtual reality settings and more.
Some producers swear that Quicktime is the king of video formats; others just can't seem to make it work. Our
opinion is somewhere in between. It is certainly a good format with many unique features. Getting good quality
video can be a challenge, however; we find that default settings are rarely good enough and experimentation is
essential.
One nice thing about Quicktime is its integration with other products. It is widely supported by many editing,
authoring and general interest applications.
The simplest way to add Quicktime to a web page is to make a hyperlink from the page directly to the video file
like so:
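The original code sample is missing above; a minimal link would look something like this (videofilename.mov is a placeholder for your own file name):

```html
<a href="videofilename.mov">Play the movie</a>
```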
When the end user clicks this hyperlink, the video file will begin downloading. In some cases it will begin playing
once a portion of the file has been retrieved, but in other cases the entire file will need to be downloaded first. For
this reason, we do not recommend using a simple link.
A better method is to embed the Quicktime movie in a web page with something like this:
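The embed code did not survive extraction; the traditional pattern pairs an <object> tag (using the QuickTime ActiveX classid, for Internet Explorer) with a fallback <embed> tag for other browsers. A sketch under those assumptions, with videofilename.mov as a placeholder and arbitrary dimensions (the extra 16 pixels of height leave room for the controller bar):

```html
<object classid="clsid:02BF25D5-8C17-4B23-BC80-D3488ABDDC6B" width="320" height="256"
        codebase="http://www.apple.com/qtactivex/qtplugin.cab">
  <param name="src" value="videofilename.mov">
  <param name="controller" value="true">
  <!-- fallback for non-IE browsers -->
  <embed src="videofilename.mov" width="320" height="256" controller="true"
         pluginspage="http://www.apple.com/quicktime/download/">
</object>
```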
The MPEG-4 standard is relatively complicated and can be confusing. There are many variations of the format,
some ISO-compliant and some not. Quicktime, for example, deals with both ISO-compliant .mp4 files and
non-compliant .mov files. Some MPEG-4 files play in any player; others will only work in certain players.
Numerous applications are available to create MPEG-4, the best-known being Apple Quicktime Pro.
Flash uses two main formats: .swf for standard Flash files which are used in web pages, and .flv which is a special
Flash video format. .flv files can be called from within .swf files. As of late 2008 Flash also supports H.264 files,
which is a significant leap forward.
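To give an idea of what's involved, placing a .swf on a page traditionally looks something like this — a sketch only, with myvideo.swf as a hypothetical file name (the classid identifies the Flash Player ActiveX control; the <embed> tag covers non-IE browsers):

```html
<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" width="320" height="240">
  <param name="movie" value="myvideo.swf">
  <!-- fallback for non-IE browsers -->
  <embed src="myvideo.swf" width="320" height="240"
         type="application/x-shockwave-flash"
         pluginspage="http://www.adobe.com/go/getflashplayer">
</object>
```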
Flash has the disadvantage of being expensive. To get the most from this format you need to own the Adobe Flash
authoring program. As well as being pricey, there's a lot to learn (although if you have experience with video
editors you will pick it up quickly).
On the plus side, if you can afford it and you're prepared for a steep learning curve, Flash will give you power and
flexibility beyond your wildest dreams! Custom controls and menus, interactive video and animations, advanced
integration with web pages... the sky is the limit.
Progressive Download
This is a relatively simple method and only requires a standard website hosting service. Although it does have some
limitations, it simulates true streaming reasonably well. This is the most realistic option for most websites.