_CD -unit 1
Compiler Design
Unit 1: Introduction to Compiler Design
SEMESTER: 7
PREPARED BY: Prof. Ramya
Compiler
• A given source language is either compiled or
interpreted for execution.
• A compiler is a program that translates a source
program written in a high-level language (HLL,
e.g., C or Java) into target code: relocatable
machine code or assembly code.
– The generated machine code can later be
executed many times, against different data
each time.
– The generated code is not portable to other
systems.
Interpreter
◆ In an interpreted language, implementations
execute instructions directly, without first
compiling the program into machine-code
instructions.
◆ Translation occurs at the same time as the
program is being executed.
◆ An interpreter reads a source program written
in an HLL, together with data for this
program, and runs the program against the
data to produce results.
Interpreter
◆ Common interpreters include Perl, Python, and
Ruby interpreters, which execute Perl, Python,
and Ruby code respectively.
◆ Others include the Unix shell interpreter, which
runs operating system commands interactively.
◆ Source program is interpreted every time it is
executed (less efficient).
Interpreter
◆ Interpreted languages are portable since they
are not machine dependent. They can run on
different operating systems and platforms.
◆ They are translated on the spot and thus
optimized for the system on which they’re
being run.
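The "translation at the same time as execution" idea can be sketched with a toy interpreter. The three-command language below (`set`, `add`, `print`) is invented purely for illustration; the point is that each statement is decoded and executed on the spot, with no machine-code output ever produced.

```python
# A toy interpreter: each statement is decoded and executed immediately,
# with no separate compilation step. The command syntax is hypothetical.
def interpret(program, env=None):
    env = {} if env is None else env
    for line in program.splitlines():
        line = line.strip()
        if not line:
            continue
        op, *args = line.split()
        if op == "set":              # set x 5   -> x = 5
            env[args[0]] = int(args[1])
        elif op == "add":            # add x y   -> x = x + y
            env[args[0]] += env[args[1]]
        elif op == "print":          # print x
            print(env[args[0]])
        else:
            raise ValueError(f"unknown command: {op}")
    return env

interpret("set x 2\nset y 3\nadd x y\nprint x")   # prints 5
```

Note that the program string is re-decoded on every run, which is exactly why interpreted execution is less efficient than running pre-compiled machine code.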
Compilers and Interpreters
• “Compilation”
– Translation of a program written in a source
language into a semantically equivalent
program written in a target language
Source Program → Compiler → Target Program
(input is supplied to the target program when it later runs)
• “Interpretation”
– Performing the operations implied by the
source program
Source Program + Input → Interpreter → Output
(error messages are reported during execution)
The Analysis-Synthesis Model of
Compilation
There are two parts to compilation:
– Analysis Phase
This is also known as the front end of the compiler. It reads the source
program, divides it into its core parts, and checks for lexical, grammatical, and
syntactic errors. The analysis phase generates an intermediate representation of
the source program and a symbol table, which are fed to the synthesis
phase as input.
– Synthesis Phase
It is also known as the back end of the compiler.
It generates the target program with the help of the intermediate code
representation and the symbol table.
Preprocessors, Compilers, Assemblers and
Linkers
• A preprocessor, often considered part of the compiler, is
a tool that produces input for the compiler. It handles
macro processing, file inclusion, language
extension, etc.
• Assembler
An assembler translates assembly language programs
into machine code. The output of an assembler is called
an object file, which contains a combination of
machine instructions as well as the data required to
place these instructions in memory.
Preprocessors, Compilers, Assemblers and
Linkers
• Linker
A linker is a computer program that links and merges
various object files together in order to make an
executable file.
● These files may have been produced by separate
assemblers. The major task of a linker is to search for
and locate referenced modules/routines in a program
and to determine the memory locations where these
codes will be loaded, so that program instructions
have absolute references.
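The two linker tasks above (locating referenced routines, then fixing up absolute addresses) can be sketched with a toy model. The object-file format here, dicts with a `defines` symbol and a `code` list where `CALL` names a symbol, is invented for illustration only.

```python
# A toy "linker": object files define one symbol each, and CALL instructions
# refer to symbols that may live in another file. Pass 1 lays the files out
# in memory; pass 2 patches symbolic CALLs with absolute addresses.
def link(object_files, base=0):
    addresses, layout = {}, []
    loc = base
    for obj in object_files:              # pass 1: assign load addresses
        addresses[obj["defines"]] = loc
        layout.append(obj)
        loc += len(obj["code"])
    image = []
    for obj in layout:                    # pass 2: resolve references
        for instr in obj["code"]:
            if instr[0] == "CALL":
                image.append(("CALL", addresses[instr[1]]))
            else:
                image.append(instr)
    return image

main_o = {"defines": "main", "code": [("CALL", "helper"), ("RET",)]}
helper_o = {"defines": "helper", "code": [("NOP",), ("RET",)]}
image = link([main_o, helper_o], base=100)
# main loads at 100 (2 instructions), so helper loads at 102 and the
# CALL is patched to the absolute address 102.
```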
Compiler Design - Architecture of a
Compiler
• A compiler can have many phases and passes.
• Pass : A pass refers to the traversal of a compiler
through the entire program.
• Phase : A phase of a compiler is a distinguishable
stage, which takes input from the previous stage,
processes and yields output that can be used as input
for the next stage. A pass can have more than one
phase.
Phases of a Compiler
• The compilation process is a sequence of various
phases.
• Each phase takes input from the previous stage, has
its own representation of the source program, and
feeds its output to the next phase of the compiler.
Traditional three pass compiler
[Figure: Front End → Middle End → Back End, each stage reporting errors]
Phases of a Compiler - Front end
▪ The front end analyzes the source code to
build an internal representation of the
program, called the intermediate
representation (IR).
▪ It also manages the symbol table, a data
structure mapping each symbol in the source
code to associated information such as
location, type and scope.
Phases of Compiler
Phases of a Compiler - Front end cont’d
The front end includes all analysis phases and
the intermediate code generator.
• Lexical analysis is the first phase of compiler
which is also termed as scanning.
• During this phase, the source program is scanned to
read the stream of characters, and those characters
are grouped into sequences called lexemes, each of
which produces a token as output. Tokens are defined
by regular expressions, which are understood by the
lexical analyzer.
Lexical Analysis
● lexical analysis: The process of converting a sequence
of characters (such as in a computer program) into a
sequence of tokens (strings with an identified
"meaning")
● Lexical analysis takes the modified source code from
language preprocessors, written in the form of
sentences.
● The lexical analyzer breaks this text into a series of
tokens, removing any whitespace and comments in
the source code.
Lexical Analysis
● The lexical analyzer (either generated automatically
by a tool like lex, or hand-crafted) reads in a stream
of characters, identifies the lexemes in the stream, and
categorizes them into tokens.
● This is called "tokenizing". If the lexer finds an
invalid token, it will report an error.
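A lex-style tokenizer can be sketched in a few lines: each token class is defined by a regular expression, whitespace and comments are discarded, and an unmatched character is reported as a lexical error. The token set (NUMBER, ID, OP) is illustrative, not a real language's.

```python
import re

# Token classes defined by regular expressions, as a hand-written
# stand-in for what a generator like lex would produce. SKIP comes
# first so "//" comments are consumed before "/" can match as OP.
TOKEN_SPEC = [
    ("SKIP",   r"[ \t\n]+|//[^\n]*"),   # whitespace and // comments
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=;]"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    tokens, pos = [], 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if not m:   # no rule matches: report a lexical error
            raise SyntaxError(f"invalid token at {pos}: {source[pos]!r}")
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens

print(tokenize("c = a + b * 5;  // an assignment"))
```

Running it on `c = a + b * 5;` yields the (kind, lexeme) pairs for `c`, `=`, `a`, `+`, `b`, `*`, `5`, `;`, with the comment and spaces dropped.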
Front end: Terminologies
• Token: A token is a sequence of characters that
represents a lexical unit matching a pattern, such
as a keyword, operator, or identifier.
• Lexeme: A lexeme is an instance of a token, i.e.,
the actual group of characters forming a token.
• Pattern: A pattern describes the rule that the
lexemes of a token must follow. It is the structure
that must be matched by strings.
Token and Lexeme
▪ Once a token is generated the corresponding
entry is made in the symbol table.
At lexical analysis phase,
Input: stream of characters
Output: Token
Token Template:
<token-name, attribute-value>
▪ For example, for c = a + b * 5; the lexical
analyzer produces:
<id, 1><=><id, 2><+><id, 3><*><number, 5>
where 1, 2, and 3 are the symbol-table entries
for c, a, and b.
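The `<token-name, attribute-value>` pairs and symbol-table entries for this example can be reproduced with a short sketch. The tuple encoding below is an assumption for illustration: identifiers carry their symbol-table index as the attribute, operators carry none (the semicolon is kept here, though the slide's stream omits it).

```python
# Building the <token-name, attribute-value> stream for  c = a + b * 5;
# Each identifier is entered in the symbol table on first sight and its
# attribute is the (1-based) table index, as in <id,1><=><id,2>...
lexemes = ["c", "=", "a", "+", "b", "*", "5", ";"]
symbol_table, tokens = [], []
for lx in lexemes:
    if lx.isidentifier():                  # identifier -> symbol table
        if lx not in symbol_table:
            symbol_table.append(lx)
        tokens.append(("id", symbol_table.index(lx) + 1))
    elif lx.isdigit():
        tokens.append(("number", int(lx)))
    else:
        tokens.append((lx,))               # operators carry no attribute
print(tokens)
print(symbol_table)    # entries 1..3 are c, a, b
```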
Token and Lexeme Cont’d
Syntax Analysis
▪ Syntax analysis is sometimes called parsing.
The syntax analyzer (parser) constructs the
parse tree: it takes the tokens one by one and
uses a context-free grammar to build the tree.
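A minimal recursive-descent parser shows how tokens are consumed one by one against a context-free grammar. The tiny grammar below (expr → term (+ term)*, term → factor (* factor)*, factor → ID | NUM) and the nested-tuple tree encoding are assumptions for illustration.

```python
# A recursive-descent parser sketch: one function per grammar rule,
# consuming tokens left to right and building the tree as nested tuples.
#   expr   -> term (+ term)*
#   term   -> factor (* factor)*
#   factor -> ID | NUM
def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def eat(kind):
        nonlocal pos
        tok = peek()
        if tok is None or tok[0] != kind:
            raise SyntaxError(f"expected {kind}, got {tok}")
        pos += 1
        return tok
    def factor():
        if peek() and peek()[0] in ("ID", "NUM"):
            return eat(peek()[0])
        raise SyntaxError(f"unexpected token {peek()}")
    def term():
        node = factor()
        while peek() and peek()[0] == "*":
            eat("*")
            node = ("*", node, factor())
        return node
    def expr():
        node = term()
        while peek() and peek()[0] == "+":
            eat("+")
            node = ("+", node, term())
        return node
    tree = expr()
    if peek() is not None:
        raise SyntaxError(f"trailing input: {peek()}")
    return tree

# a + b * 5: because term handles *, it binds tighter than +.
print(parse([("ID", "a"), ("+", "+"), ("ID", "b"), ("*", "*"), ("NUM", 5)]))
```

Note how operator precedence falls out of the grammar's shape: `*` is handled one rule deeper than `+`, so `a + b * 5` parses as `a + (b * 5)`.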
Why Grammar ?
▪ The rules of a programming language can be
entirely represented by a small set of
productions. Using these productions we can
describe what a valid program is; the input is
then checked against them to see whether it is
in the desired format.
Syntax Analysis cont’d
• Scanner:
– Maps characters into tokens – the basic unit of syntax
• x = x + y becomes <id, x> = <id, x> + <id, y>
– Typical tokens: number, id, +, -, *, /, do, end
– Eliminate white space (tabs, blanks, comments)
• A key issue is speed, so instead of using a
generator like lex it is sometimes necessary to
write your own scanner.
Front end
Source code → Scanner → tokens → Parser → IR
(both phases report errors)
• Parser:
– Recognize context-free syntax
– Guide context-sensitive analysis
– Construct IR
– Produce meaningful error messages
– Attempt error correction
• Parser generators like YACC automate much
of this work.
Phases of a Compiler cont’d
Middle End – The Optimizer