Unit 4
To discuss various types of Tree ADT such as binary tree, binary search tree, AVL tree
A (rooted) tree consists of a set of nodes (or vertices) and a set of arcs (or edges). Each arc links a parent
node to one of the parent's children. A special root node has no parent. Every other node has exactly one
parent. It is possible to reach any node by following a unique path of arcs from the root.
Tree Definition
A single node is a tree and this node is the root of the tree.
Suppose r is a node and T1, T2,..., Tk are trees with roots r1, r2,..., rk, respectively, then we can construct a
new tree whose root is r and T1, T2,..., Tk are the subtrees of the root. The nodes r1, r2,..., rk are called the
children of r.
A tree is a collection of N nodes and N-1 edges. Each edge connects some node to its parent, and every node except
the root node has exactly one parent. There is a starting node known as the root node. Nodes may have any number of children.
Path -> A sequence of nodes n1, n2,..., nk, such that ni is the parent of ni + 1 for i = 1, 2,..., k - 1. The length of a path
is 1 less than the number of nodes on the path. Thus there is a path of length zero from a node to itself.
Siblings -> The children of a node are called siblings of each other.
Ancestor and Descendant -> If there is a path from node a to node b, then a is called an ancestor of b, and b is called a
descendant of a.
Subtree -> A subtree of a tree is a node in that tree together with all its descendants.
Height -> The height of a node in a tree is the length of a longest path from the node to a leaf. The height of a tree
is the height of its root.
Depth ->The depth of a node is the length of the unique path from the root to the node.
Node declarations for trees
typedef struct TreeNode *PtrToNode;
struct TreeNode
{
    ElementType Element;
    PtrToNode FirstChild;
    PtrToNode NextSibling;
};
First child next sibling representation
Tree representation
In the tree of Figure, the root is A. Node F has A as a parent and K, L, and M as children. Each node may have an
arbitrary number of children, possibly zero. Nodes with no children are known as leaves; the leaves in the tree
above are B, C, H, I, P, Q, K, L, M, and N. Nodes with the same parent are siblings; thus K, L, and M are all
siblings. Grandparent and grandchild relations can be defined in
a similar manner.
A binary tree is a tree in which no node can have more than two children. A binary tree consists of a root and two
subtrees, TL and TR, both of which could possibly be empty.
A property of binary trees is that the depth of an average binary tree is considerably smaller than N.
An analysis shows that the average depth is O(√N),
and for a binary search tree, the average value of the depth is O(log N).
Unfortunately, the depth can be as large as N - 1.
Implementation
Since a binary tree node has at most two children, we can keep direct pointers to them. Here a node is a structure consisting of the
key information plus two pointers (Left and Right) to other nodes.
typedef struct TreeNode *PtrToNode;
typedef struct TreeNode *Tree;
struct TreeNode
{
    ElementType Element;
    Tree Left;
    Tree Right;
};
Many of the rules that apply to linked lists apply to trees as well. When an insertion is performed, a node will
have to be created by a call to malloc. Nodes can be freed after deletion by calling free. Trees are generally drawn as
circles connected by lines, because they are actually graphs. We never explicitly draw NULL pointers when
drawing trees, because every binary tree with N nodes would require N + 1 NULL pointers.
Binary trees have many important uses not associated with searching. One of the principal uses of binary trees is in
the area of compiler design.
Expression Trees
Any expression that is represented by a tree structure is called an expression tree. In the following
expression tree, the leaves are operands, such as constants or variable names, and the other nodes
contain operators. This particular tree happens to be binary, because all of the operators are binary. It is possible
for nodes to have more than two children. It is also possible for a node to have only one child, as is the case with the unary
minus operator. We can evaluate an expression tree T by applying the operator at the root to the values obtained by
recursively evaluating the left and right subtrees.
Next, c, d, and e are read, and for each a one-node tree is created and a pointer to the corresponding tree is pushed
onto the stack.
Continuing, a '*' is read, so we pop two tree pointers and form a new tree with a '*' as root.
Finally, the last symbol is read, two trees are merged, and a pointer to the final tree is left on the stack.
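As a rough illustration of this stack-based construction, the following C sketch builds an expression tree from a postfix string. The ExprNode structure, the fixed-size stack, and the function name are illustrative assumptions, not part of the original material.
#include <stdlib.h>
#include <ctype.h>

struct ExprNode {                        /* illustrative expression-tree node */
    char symbol;                         /* operand letter/digit or operator character */
    struct ExprNode *left, *right;
};

/* Build an expression tree from a postfix string such as "ab+cde+**". */
struct ExprNode *BuildExpressionTree(const char *postfix)
{
    struct ExprNode *stack[100];         /* stack of subtree pointers */
    int top = 0;

    for (const char *p = postfix; *p != '\0'; p++) {
        struct ExprNode *node = malloc(sizeof *node);
        node->symbol = *p;
        node->left = node->right = NULL;

        if (!isalnum((unsigned char)*p)) {   /* operator: pop two subtrees */
            node->right = stack[--top];
            node->left = stack[--top];
        }
        stack[top++] = node;             /* push operand or new operator tree */
    }
    return stack[--top];                 /* pointer to the final tree */
}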
4.3.Tree Traversals (Preorder, Inorder, Postorder)
Tree-traversal refers to the process of visiting (examining and/or updating) each node in a tree data structure,
exactly once, in a systematic way.
If a tree is null, then the empty list is the preorder, inorder, and postorder listing of T
If T comprises a single node, that node itself is the preorder, inorder, and postorder list of T
Otherwise
1. The preorder listing of T is the root of T, followed by the nodes of T1 in preorder, . . . , and the nodes of
Tk in preorder.
2. The inorder listing of T is the nodes of T1 in inorder, followed by the root r, followed by the nodes of T2
in inorder, . . . , and the nodes of Tk in inorder.
3. The postorder listing of T is the nodes of T1 in postorder, . . . , the nodes of Tk in postorder, all followed
by the root r.
Inorder traversal
This procedure produces the inorder traversal. This is done by recursively producing a (parenthesized) left
expression, then printing out the operator at the root, and finally recursively producing a (parenthesized) right
expression. This general strategy (left, node, right) is known as an inorder traversal. It is easy to remember because of the
type of expression it produces.
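The inorder routine itself is not listed in this material; the following is a minimal sketch, assuming the same struct tnode (with data, lchild, and rchild fields) used in the preorder routine below.
/* Print the keys of the tree rooted at p in inorder (left, node, right). */
void inorder(struct tnode *p)
{
    if (p != NULL)
    {
        inorder(p->lchild);
        printf("%d\t", p->data);
        inorder(p->rchild);
    }
}
Moving the printf call after both recursive calls would give the corresponding postorder routine (left, right, node).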
Preorder traversal
This procedure produces the preorder traversal. This is done by printing out the operator at the root, then
recursively producing the left expression, and then recursively producing the right
expression. This general strategy (node, left, right) is known as a preorder traversal.
/* Print the keys of the tree rooted at p in preorder (node, left, right). */
void preorder(struct tnode *p)
{
    if (p != NULL)
    {
        printf("%d\t", p->data);
        preorder(p->lchild);
        preorder(p->rchild);
    }
}
Example:
Preorder + - A * / C 5 2 % * D 5 4
Postorder A C 5 / 2 * - D 5 * 4 % +
Inorder A - C / 5 * 2 + D * 5 % 4
5. Delete()
This routine is used to delete an element X from the search tree T. This is done by first locating the node X in the search
tree T:
If T is empty, then return an error.
Else if X < T, then search for X in the left subtree of T.
Else if X > T, then search for X in the right subtree of T.
Once X has been found:
If X is a leaf node, then simply delete it by setting its parent's pointer to NULL.
Else if X has only one child (left or right), then delete the node X by assigning the address of
that child to its parent node.
Else if X has both a left and a right child, then delete the node X by replacing its element with the
minimum element of the right subtree of X, and then deleting that minimum node from the right subtree.
SearchTree Delete(ElementType X, SearchTree T)
{
    Position TmpCell;

    if (T == NULL)
        printf("Element not found\n");
    else if (X < T->Element)              /* go left */
        T->Left = Delete(X, T->Left);
    else if (X > T->Element)              /* go right */
        T->Right = Delete(X, T->Right);
    else if (T->Left && T->Right)         /* found; two children */
    {
        /* Replace with the smallest element in the right subtree,
           then delete that element from the right subtree. */
        TmpCell = FindMin(T->Right);
        T->Element = TmpCell->Element;
        T->Right = Delete(T->Element, T->Right);
    }
    else                                  /* found; zero or one child */
    {
        TmpCell = T;
        if (T->Left == NULL)
            T = T->Right;
        else if (T->Right == NULL)
            T = T->Left;
        free(TmpCell);
    }
    return T;
}
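The Delete routine above calls FindMin, which is not listed in this material; a minimal sketch, assuming the same SearchTree and Position pointer types:
/* Return the node holding the smallest element: keep following left children. */
Position FindMin(SearchTree T)
{
    if (T != NULL)
        while (T->Left != NULL)
            T = T->Left;
    return T;
}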
4.5.AVL tree
An AVL tree is another balanced binary search tree. Named after their inventors, Adelson-Velskii and Landis, they
were the first dynamically balanced trees to be proposed. Like red-black trees, they are not perfectly balanced, but
pairs of sub-trees differ in height by at most 1, maintaining an O(log n) search time. Addition and deletion
operations also take O(log n) time.
An AVL tree is a binary search tree which has the following property:
1. The sub-trees of every node differ in height by at most one.
Rotation
o Single rotation
o Double rotation
Balance is restored by tree rotations. A single rotation switches the roles of the parent and child while maintaining
the search order. The first case, in which the insertion occurs on the “outside” (that is, left-left or right-right), is fixed
by a single rotation of the tree. The second case, in which the insertion occurs on the “inside” (that is, left-right or right-left), is handled by the
slightly more complex double rotation.
Single Rotation
Case 1: An insertion into the left subtree of the left child of X.
The figure shows an AVL tree that satisfies the balance property. If we want to insert an element A that is less
than X, we insert node A into the left subtree of X. After inserting node A, node k2 violates the balance
property, so we rotate k2 clockwise. Node k1 then becomes the root, and k2 is attached as the right
subtree of k1. All other subtrees X, Y and Z are re-attached based on their values. Now the tree becomes
an AVL tree again after performing a single rotation.
Case 2: An insertion into the right subtree of the right child of X.
The figure shows an AVL tree that satisfies the balance property. If we want to insert an element A that is
greater than Z, we insert node A into the right subtree of Z. After inserting node A, node k1 violates the
balance property, so we rotate k1 anticlockwise. Node k2 then becomes the root, and k1 is attached
as the left subtree of k2. All other subtrees X, Y and Z are re-attached based on their values. Now the
tree becomes an AVL tree again after performing a single rotation.
// Rotate binary tree node with left child
// For AVL trees, this is a single rotation for case 1
template <class Etype>
BinaryNode<Etype> *
RotateWithLeftChild( BinaryNode<Etype> *K2 )
{
    BinaryNode<Etype> *K1 = K2->Left;
    K2->Left = K1->Right;
    K1->Right = K2;
    return K1;
}
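For reference, the same rotation and its mirror image for case 2 can be sketched in plain C as follows. This sketch assumes the struct TreeNode type declared earlier; a full AVL implementation would also store and update node heights, which are omitted here.
/* Case 1 (left-left): rotate K2 with its left child K1. */
struct TreeNode *RotateWithLeftChild(struct TreeNode *K2)
{
    struct TreeNode *K1 = K2->Left;
    K2->Left  = K1->Right;   /* K1's right subtree becomes K2's left subtree */
    K1->Right = K2;          /* K2 moves down to become K1's right child */
    return K1;               /* K1 is the new root of this subtree */
}

/* Case 2 (right-right): rotate K1 with its right child K2. */
struct TreeNode *RotateWithRightChild(struct TreeNode *K1)
{
    struct TreeNode *K2 = K1->Right;
    K1->Right = K2->Left;    /* K2's left subtree becomes K1's right subtree */
    K2->Left  = K1;          /* K1 moves down to become K2's left child */
    return K2;               /* K2 is the new root of this subtree */
}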
Example
Here, after inserting 3, node 1 violates the property. So, perform a single rotation by rotating anticlockwise.
Now 2 becomes the root.
Here, after inserting 5, nodes 3 and 2 violate the property. So, perform a single rotation by rotating node 3
anticlockwise. Now 4 becomes the root of that subtree. All the remaining nodes, 1 and 2, stay the same. Now it is an AVL
tree again.
Here, after inserting 7, nodes 5 and 4 violate the property. So, perform a single rotation only on 5, 6, 7, rotating
anticlockwise. Now 6 becomes the root of 5 and 7. All the remaining nodes stay the same.
Double Rotation
A double rotation is performed whenever a single rotation fails to restore balance. It is carried out by performing two single
rotations.
Case 3: An insertion into the right sub tree of the left child of X.
The figure shows an AVL tree in which all the nodes satisfy the property. Now, if we want to insert a node
X that is smaller than B, we insert node X into the left subtree of B. After performing the insertion, the
nodes k1 and k3 violate the property. So first perform a rotation only on k1, anticlockwise: k2 becomes
the root of that subtree and k1 is attached as the left subtree of k2. All other subtrees A, B and C are re-attached based on their values.
Even now the tree is not an AVL tree, because k3 still violates the property. So perform a second single rotation, on k3,
clockwise: now k2 becomes the root and k3 is attached as the right subtree of k2. Now it becomes an AVL tree.
Case 4: An insertion into the left sub tree of the right child of X.
The figure shows an AVL tree in which all the nodes satisfy the property. Now, if we want to insert a node
X that is greater than C, we insert node X into the right subtree of C. After performing the insertion, the
nodes k1 and k3 violate the property. So first perform a rotation only on k1, clockwise: k2 becomes the
root of that subtree and k1 is attached as the right subtree of k2. All other subtrees A, B and C are re-attached based on their values. Even
now the tree is not an AVL tree, because k3 still violates the property. So perform a second single rotation, on k3, anti-
clockwise: now k2 becomes the root and k3 is attached as the left subtree of k2. Now it becomes an AVL tree.
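Using the C single-rotation sketches given earlier, a double rotation can be sketched as two single rotations. Again, this is an illustrative sketch, not code from the original text.
/* Case 3 (left-right): rotate K3's left child with its right child,
   then rotate K3 with its new left child. */
struct TreeNode *DoubleRotateWithLeftChild(struct TreeNode *K3)
{
    K3->Left = RotateWithRightChild(K3->Left);
    return RotateWithLeftChild(K3);
}

/* Case 4 (right-left): the mirror image. */
struct TreeNode *DoubleRotateWithRightChild(struct TreeNode *K1)
{
    K1->Right = RotateWithLeftChild(K1->Right);
    return RotateWithRightChild(K1);
}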
4.6.Graph
A graph consists of a set of nodes (or vertices) and a set of arcs (or edges). Each arc in a graph is
specified by a pair of nodes.
A node n is incident to an arc x if n is one of the two nodes in the ordered pair of nodes that constitute x.
The degree of a node is the number of arcs incident to it. The indegree of a node n is the number of arcs that
have n as the head, and the outdegree of n is the number of arcs that have n as the tail. Loop is an edge that
connects a vertex to itself.
A graph is a nonlinear data structure. The graph shown in the figure has 7 vertices and 12
edges. The vertices are { 1, 2, 3, 4, 5, 6, 7 } and the arcs are { (1,2), (1,3), (1,4), (2,4), (2,5), (3,4), (3,6), (4,5),
(4,6), (4,7), (5,7), (6,7) }. Node 4 in the figure has indegree 3, outdegree 3 and degree 6.
4.7.Graph Representation:
There are several different ways to represent graphs in a computer. The two main representations are
Adjacency Matrix
Adjacency List
(i) Adjacency Matrix Representation:
An adjacency matrix of a graph G = (V, E) (let V = { v1, v2, ..., vn }) is an n x n matrix A such that
A[i, j] = 1 if there is an edge between vi and vj, and A[i, j] = 0 otherwise.
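A minimal sketch of this representation in C follows; MAX and AddEdge are illustrative names, not from the text.
#define MAX 100

int A[MAX][MAX];            /* A[i][j] == 1 iff there is an edge (vi, vj) */

/* Record an undirected edge between vertices i and j. */
void AddEdge(int i, int j)
{
    A[i][j] = 1;
    A[j][i] = 1;            /* omit this line for a directed graph */
}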
Accessing any element can be done quickly, in constant time. The size of the adjacency matrix is n x n for a graph of n
vertices, which is independent of the number of edges, so for a sparse graph much of the matrix is wasted. The following
representation, by contrast, uses memory efficiently for a sparse graph.
(ii) Adjacency List Representation:
It consists of a list of vertices, which can be represented either by linked list or array. For each vertex, adjacent
vertices are represented in the form of a linked list.
An example is given below.
The size of the adjacency lists is proportional to the number of edges and vertices, so it is an efficient use of memory for sparse
graphs. Accessing an element takes linear time, since it involves traversal of a linked list.
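A minimal C sketch of an adjacency list follows, reusing the MAX constant from the sketch above; Graph, AdjNode, and AddListEdge are illustrative names.
#include <stdlib.h>

struct AdjNode {
    int vertex;                 /* index of the adjacent vertex */
    struct AdjNode *next;       /* next neighbour of the same vertex */
};

struct Graph {
    int numVertices;
    struct AdjNode *adj[MAX];   /* adj[u] heads the list of u's neighbours */
};

/* Insert a directed edge u -> v at the front of u's adjacency list. */
void AddListEdge(struct Graph *g, int u, int v)
{
    struct AdjNode *node = malloc(sizeof *node);
    node->vertex = v;
    node->next = g->adj[u];
    g->adj[u] = node;
}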
4.8.Topological Sort
• Topological sort is a method of arranging the vertices of a directed acyclic graph (DAG) as a sequence, such
that no vertex appears in the sequence before its predecessors.
• Topological sort is not unique.
• The following are all topological sorts of the graph below:
• One way to find a topological sort is to consider in-degrees of the vertices; a sketch of this approach is given after the example orderings below.
• The first vertex must have in-degree zero -- every DAG must have at least one vertex with in-degree zero.
s1 = {a, b, c, d, e, f, g, h, i}
s2 = {a, c, b, f, e, d, h, g, i}
s3 = {a, b, d, c, e, g, f, h, i}
s4 = {a, c, f, b, e, h, d, g, i}
etc.
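The in-degree idea above can be sketched in C as follows. This sketch assumes the adjacency matrix convention and MAX constant from the previous section, uses a simple array as a queue, and all names are illustrative.
#include <stdio.h>

/* Print one topological ordering of an n-vertex DAG given as an adjacency
   matrix adj (adj[u][v] == 1 iff there is an edge u -> v). */
void TopologicalSort(int adj[MAX][MAX], int n)
{
    int indegree[MAX] = {0};
    int queue[MAX], head = 0, tail = 0;

    for (int u = 0; u < n; u++)              /* compute in-degrees */
        for (int v = 0; v < n; v++)
            if (adj[u][v])
                indegree[v]++;

    for (int v = 0; v < n; v++)              /* start with in-degree-zero vertices */
        if (indegree[v] == 0)
            queue[tail++] = v;

    while (head < tail) {
        int u = queue[head++];
        printf("%d ", u);                    /* u appears only after its predecessors */
        for (int v = 0; v < n; v++)          /* "remove" u's outgoing edges */
            if (adj[u][v] && --indegree[v] == 0)
                queue[tail++] = v;
    }
}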
The tables below show successive stages of a vertex-by-vertex computation on an example graph (the graph itself is not reproduced here): Known marks the vertices already declared known, dv is the current value computed for v, and pv is the previous vertex recorded for v.

v    Known   dv   pv
--------------------
v1     1      0    0
v2     0      2    v1
v3     0      ∞    0
v4     0      1    v1
v5     0      ∞    0
v6     0      ∞    0
v7     0      ∞    0

v    Known   dv   pv
--------------------
v1     1      0    0
v2     0      2    v1
v3     0      3    v4
v4     1      1    v1
v5     0      3    v4
v6     0      9    v4
v7     0      5    v4

v    Known   dv   pv
--------------------
v1     1      0    0
v2     1      2    v1
v3     0      3    v4
v4     1      1    v1
v5     0      3    v4
v6     0      9    v4
v7     0      5    v4

v    Known   dv   pv
--------------------
v1     1      0    0
v2     1      2    v1
v3     1      3    v4
v4     1      1    v1
v5     1      3    v4
v6     0      6    v7
v7     1      5    v4

After v7 is declared known:

v    Known   dv   pv
--------------------
v1     1      0    0
v2     1      2    v1
v3     1      3    v4
v4     1      1    v1
v5     1      3    v4
v6     1      6    v7
v7     1      5    v4
Definition:-
Properties of Trees
° A graph is a tree if and only if there is one and only one path joining any two of its vertices.
° A connected graph is a tree if and only if every one of its edges is a bridge.
Definitions:-
° A subgraph that spans (reaches out to) all vertices of a graph is called a spanning subgraph.
° A subgraph that is a tree and that spans (reaches out to) all vertices of the original graph is called a spanning tree.
° Among all the spanning trees of a weighted and connected graph, the one (possibly more) with the least total
weight is called a minimum spanning tree (MST).
Figure 1
The best way of connecting them is shown in the following figure.
Figure 2
The cost of laying the above pipeline connecting these oil wells is 20. Any other way of connecting them increases
the cost. The resultant graph is a tree which spans all vertices of the graph, hence it is called a spanning tree. The
spanning tree with minimum cost is called a minimum spanning tree (MST).
Implementation of Minimum spanning trees
Prim’s algorithm
Kruskal's algorithm
(i) Prim's algorithm
Prim's algorithm is a greedy algorithm that finds a minimum spanning tree for a connected weighted undirected
graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight
of all the edges in the tree is minimized.
This is done by
Step 0: Pick any vertex as a starting vertex. (Call it S). Mark it with any given colour, say red.
Step 1: Find the nearest neighbour of S (call it P1), that is, the vertex joined to S by the cheapest edge. Mark both P1
and the edge SP1 red.
Step 2 : Find the nearest uncoloured neighbour to the red subgraph (i.e., the closest vertex to any red vertex). Mark
it and the edge connecting the vertex to the red subgraph in red.
Step 3 : Repeat Step 2 until all vertices are marked red. The red subgraph is a minimum spanning tree.
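A rough C sketch of these steps is given below. The dist, parent, and known arrays correspond to the dv, pv, and Known columns of the trace tables that follow; W is an adjacency matrix of edge weights with INF meaning "no edge", MAX is taken from the graph representation sketches, and all names are illustrative.
#include <limits.h>

#define INF INT_MAX

/* Grow a minimum spanning tree from vertex start; on return, the edge
   (parent[v], v) belongs to the tree for every vertex v other than start. */
void Prim(int W[MAX][MAX], int n, int start)
{
    int dist[MAX], parent[MAX], known[MAX];

    for (int v = 0; v < n; v++) {
        dist[v] = INF; parent[v] = -1; known[v] = 0;
    }
    dist[start] = 0;

    for (int i = 0; i < n; i++) {
        int u = -1;                          /* cheapest unknown vertex */
        for (int v = 0; v < n; v++)
            if (!known[v] && (u == -1 || dist[v] < dist[u]))
                u = v;
        known[u] = 1;                        /* add u to the red subgraph */

        for (int v = 0; v < n; v++)          /* update distances to the tree */
            if (!known[v] && W[u][v] < dist[v]) {
                dist[v] = W[u][v];
                parent[v] = u;
            }
    }
}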
The tables below trace the algorithm on an example graph: Known marks vertices already added to the tree, dv is the weight of the cheapest edge currently connecting v to a known vertex, and pv is the vertex at the other end of that edge.

v    Known   dv   pv
--------------------
v1     1      0    0
v2     0      2    v1
v3     0      4    v1
v4     0      1    v1
v5     0      ∞    0
v6     0      ∞    0
v7     0      ∞    0

v    Known   dv   pv
--------------------
v1     1      0    0
v2     0      2    v1
v3     0      2    v4
v4     1      1    v1
v5     0      7    v4
v6     0      8    v4
v7     0      4    v4
(ii) Kruskal's algorithm
Kruskal's algorithm is an algorithm in graph theory that finds a minimum spanning tree for a connected weighted
graph. This means it finds a subset of the edges that forms a tree that includes every vertex, where the total weight
of all the edges in the tree is minimized. If the graph is not connected, then it finds a minimum spanning forest (a
minimum spanning tree for each connected component). Kruskal's algorithm is an example of a greedy
algorithm. This is done as follows:
Step 1 : Find the cheapest edge in the graph (if there is more than one, pick one at random). Mark it with any given
colour, say red.
Step 2 : Find the cheapest unmarked (uncoloured) edge in the graph that doesn't close a coloured or red circuit.
Mark this edge red.
Step 3 : Repeat Step 2 until you reach every vertex of the graph (or until you have N - 1 coloured edges, where N
is the number of vertices). The red edges form the desired minimum spanning tree.
This algorithm can also be viewed as follows: initially, each vertex is a component by itself. Repeatedly add a
least-weight edge between a pair of components, making a bigger component without creating any cycle,
until only one component remains.
An implementation of the algorithm can be done using a set data structure. Each component is represented as
a set. Add the next least-weight edge (u, v) only if u and v are in two different components (sets); that is, make a
bigger component by taking the union of the sets containing u and v.
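A rough C sketch of this set-based view follows. Edge, Find, and the simple union by assignment are illustrative; a production version would use union by rank with path compression. MAX is taken from the graph representation sketches.
#include <stdlib.h>

struct Edge { int u, v, weight; };

static int setParent[MAX];                  /* setParent[x] == x means x is a root */

static int Find(int x)                      /* which component does x belong to? */
{
    while (setParent[x] != x)
        x = setParent[x];
    return x;
}

static int CompareEdges(const void *a, const void *b)
{
    return ((const struct Edge *)a)->weight - ((const struct Edge *)b)->weight;
}

/* Select the cheapest edges that never close a cycle. */
void Kruskal(struct Edge edges[], int numEdges, int n)
{
    for (int v = 0; v < n; v++)
        setParent[v] = v;                   /* each vertex starts as its own component */

    qsort(edges, numEdges, sizeof(struct Edge), CompareEdges);

    int accepted = 0;
    for (int i = 0; i < numEdges && accepted < n - 1; i++) {
        int ru = Find(edges[i].u), rv = Find(edges[i].v);
        if (ru != rv) {                     /* different components: edge is safe */
            setParent[ru] = rv;             /* union the two components */
            accepted++;                     /* edge (u, v) joins the spanning tree */
        }
    }
}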
What is NP?
NP is the set of all decision problems (questions with a yes-or-no answer) for which the 'yes' answers
can be verified in polynomial time (O(n^k), where n is the problem size and k is a constant) by a
deterministic Turing machine. Polynomial time is sometimes used as the definition of fast or quick.
What is P?
P is the set of all decision problems which can be solved in polynomial time by a deterministic Turing
machine. Since such problems can be solved in polynomial time, their solutions can also be verified in polynomial time. Therefore P is a
subset of NP.
What is NP-Complete?
A problem x that is in NP is also NP-Complete if and only if every other problem in NP can be quickly
(i.e., in polynomial time) transformed into x. In other words:
1. x is in NP, and
2. Every problem in NP is reducible to x
What is NP-Hard?
NP-Hard problems are problems that are at least as hard as the hardest problems in NP. Note that NP-Complete
problems are also NP-hard. However, not all NP-hard problems are in NP (or are even decision problems),
despite having 'NP' as a prefix. That is, the 'NP' in NP-hard does not mean 'non-deterministic polynomial
time'. Yes, this is confusing, but its usage is entrenched and unlikely to change.