DS_Assi-1
Ans 1:
Let the running time of A(n) be T(n). When A is called again with argument √n, that call costs T(√n); the remaining operations are constant in number, so we can count them as 1. The final recurrence relation is therefore
T(n) = T(√n) + 1
Ans 2:
Yes, the Master theorem can be applied to this recurrence: a = 4, b = 2, and f(n) = n^2 log(n). Here n^(log_b a) = n^(log_2 4) = n^2, and f(n) = Θ(n^2 log^1 n), which satisfies the extended case 2 of the Master theorem, giving T(n) = Θ(n^2 log^2 n).
Ans 4:
T(n) = T(n-1) + log(n)
     = T(n-2) + log(n) + log(n-1)
     = T(1) + log(n) + log(n-1) + … + log(3) + log(2)
     = T(1) + log(n · (n-1) · … · 3 · 2 · 1)
     = 1 + log(n!)
Since log(n!) = Θ(n log n), this gives T(n) = Θ(n log n).
Ans 5 :
The algorithm we use is backtracking: we go back and forth, exploring both of the possibilities at each step.
Ans 6(a) :
No, the Master theorem cannot be applied to this recurrence, because f(n) = n/log(n) differs from n^(log_b a) by only a logarithmic factor. That gap is not polynomial, so none of the three cases of the Master theorem covers it.
Ans 6(b) :
T(n) = T(n-1) + log(n)
     = T(n-2) + log(n) + log(n-1)
     = T(1) + log(n) + log(n-1) + … + log(3) + log(2)
     = T(1) + log(n · (n-1) · … · 3 · 2 · 1)
     = 1 + log(n!)
Since log(n!) = Θ(n log n), this gives T(n) = Θ(n log n).
Ans 6(c) :
100n^2 < 2^n for all sufficiently large n; the inequality holds for every n ≥ 15.
Ans 8(b) :
We traverse the linked list to the (k-1)th node, using a for loop that runs k-1 times. Then we set the next pointer of the (k-1)th node to the node after the kth node (its next-to-next element). In this way the kth node is unlinked from the list and can be freed.
Ans 8(c) :
To count the even numbers in the array, we create a count variable initialised to zero. Then we traverse the entire array with a for loop and test each element for evenness using arr[i] % 2 == 0, where arr[i] is the element at index i. If the test holds true, we increment the counter by 1.
Ans 9(a):
Here we traverse both arrays and put their elements into a single array of sufficient size. First we traverse the first array and copy all of its elements into the new array; then we traverse the second array and append all of its elements.
If the size of the first array is m and the size of the second array is n, traversing both arrays takes O(m + n) time.
We also have to create a new array whose size is the sum of the two sizes in order to store all these elements, so the space complexity is O(m + n).
2) In place merging
Ans 9(b) :
For merge sort, we divide the array into two halves, sort each half recursively, and then merge the two sorted halves.
Merge_sort(A, left, right)
    if left < right then
        mid = (left + right) / 2
        Merge_sort(A, left, mid)
        Merge_sort(A, mid + 1, right)
        Merge(A, left, mid, right)
Merge(A, left, mid, right)
    create temp arrays l and r
    l = A[left .. mid]
    r = A[mid + 1 .. right]
    i = j = 0
    k = left
    while i < length(l) and j < length(r)
        if l[i] <= r[j] then A[k] = l[i]; i = i + 1
        else A[k] = r[j]; j = j + 1
        k = k + 1
    copy any remaining elements of l, and then of r, into A
Ans 10(a) :
The output of the code is 30, because we passed the address of the variable y into the function, and the function changed the value stored at that address.
Ans 10(b) :
#include <stdio.h>
#include <stdlib.h>

struct Node {
    int data;
    struct Node *next;
};

void traverse(struct Node *head) {
    if (head == NULL) {
        printf("Head not found\n");
        return;
    }
    struct Node *ptr = head;
    while (ptr != NULL) {
        printf("Element : %d\n", ptr->data);
        ptr = ptr->next;
    }
}

void push(struct Node *head, int value) {
    if (head == NULL) {
        printf("Error , head not found\n");
        return;
    }
    struct Node *new = malloc(sizeof(struct Node));
    if (!new) {
        printf("Memory limit exceeds\n");
        return;
    }
    new->data = value;
    new->next = NULL;
    struct Node *ptr = head;
    while (ptr->next != NULL) {
        ptr = ptr->next;
    }
    ptr->next = new;
}

void pop(struct Node *head) {
    if (head == NULL || head->next == NULL) {
        return;
    }
    struct Node *ptr = head;
    while (ptr->next->next != NULL) {
        ptr = ptr->next;
    }
    free(ptr->next);
    ptr->next = NULL;
}

int main() {
    struct Node *head = malloc(sizeof(struct Node));
    struct Node *second = malloc(sizeof(struct Node));
    struct Node *third = malloc(sizeof(struct Node));
    struct Node *fourth = malloc(sizeof(struct Node));
    head->data = 1;
    second->data = 2;
    third->data = 3;
    fourth->data = 4;
    head->next = second;
    second->next = third;
    third->next = fourth;
    fourth->next = NULL;
    traverse(head);
    push(head, 30);
    push(head, 40);
    pop(head);
    return 0;
}
Ans 11:
For searching, the best case time complexity is O(1), when the key is found at the 1st node; e.g. searching for 1 in the linked list 1->2->3->4->NULL finds it at the first node.
The worst case time complexity is O(n), where n is the number of nodes in the linked list; in this case the key element is found only at the last node (or not at all).
For insertion, the best case is O(1), when the insertion is at the 1st position: moving ptr to the head takes O(1), and the insertion itself takes a constant number of steps, so the overall time complexity is O(1).
The worst case time complexity is O(n), where n is the number of nodes in the list; in this case the insertion is at the last node, and we have to traverse to the last node before inserting the new node, which takes O(n) time.
Ans 12:
log(f(n)) = log(n) · log(n) = (log n)^2
log(g(n)) = n · log(2) = n
Since (log n)^2 grows much more slowly than n, we have log(f(n)) ≤ log(g(n)) for all sufficiently large n, and hence by the definition of big-O the statement f(n) = O(g(n)) holds true.
Ans 13: