
DS_Assi-1

The document contains various answers to algorithmic problems, including recurrence relations, time complexities, and data structure operations. Key points include the application of the Master Theorem, the analysis of linked list operations, and the conversion of infix expressions to postfix. Additionally, it discusses merging arrays and the complexities associated with different operations.


Ans 1:

Let T(n) be the running time of the function A(n). When A is called recursively with the argument √n, that call costs T(√n); the remaining operations are constant in number, so we count them as 1. The recurrence relation is therefore

T(n)=T(√n)+1
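
As an illustration only (the body of A is not reproduced in this document, so this is just a sketch of its shape), a C function with this cost structure could look like:

#include <math.h>

// Illustrative sketch: a recursive function whose argument shrinks to sqrt(n)
// on each call, so its running time satisfies T(n) = T(sqrt(n)) + O(1).
int A(int n) {
    if (n <= 2)                            // base case: constant work
        return n;
    return 1 + A((int)sqrt((double)n));    // one recursive call on sqrt(n), plus O(1) work
}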

Ans 2:

Yes, the Master Theorem (in its extended form) can be applied to this recurrence: a = 4, b = 2, so n^(log_b a) = n^2, and f(n) = n^2 log n matches the extended Case 2 with an extra log factor (k = 1), giving T(n) = Θ(n^2 log^2 n).

Ans 4:

T(1) = 1, and

T(n) = T(n-1) + log n  for n > 1

Substituting n-1 for n in the equation, we get T(n-1) = T(n-2) + log(n-1).

Putting this value of T(n-1) into T(n), we get

T(n) = T(n-2) + log(n) + log(n-1)

After k such substitutions this becomes

T(n) = T(n-k) + log n + log(n-1) + log(n-2) + ... + log(n-k+1)    ...(eqn 1)

We know the value of T(1), so put n-k = 1, i.e. k = n-1.

Putting these values into eqn 1, we get

T(n) = T(1) + log n + log(n-1) + ... + log 4 + log 3 + log 2

By the product property of logarithms,

T(n) = T(1) + log(n * (n-1) * ... * 3 * 2 * 1)
T(n) = 1 + log(n!)

Since log(n!) <= n log n, in upper-bound notation T(n) = O(n log n).
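
As a purely illustrative check of this bound, a small C program can accumulate the recurrence directly and compare it with n*log2(n):

#include <stdio.h>
#include <math.h>

// Numerically checks the closed form derived above: unrolling
// T(n) = T(n-1) + log2(n) with T(1) = 1 gives T(n) = 1 + log2(n!),
// which stays below 1 + n*log2(n), i.e. O(n log n).
int main(void) {
    double T = 1.0;                              // T(1) = 1
    for (int n = 2; n <= 1000; n++) {
        T += log2((double)n);                    // T(n) = T(n-1) + log2(n)
        if (n == 10 || n == 100 || n == 1000)
            printf("n=%4d  T(n)=%10.2f  n*log2(n)=%10.2f\n",
                   n, T, n * log2((double)n));
    }
    return 0;
}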

Ans 5 :

The algorithm explores back and forth in both directions, doubling the distance each time.

Move 1 step to the right; if the door is found, stop.

Otherwise return and move 2 steps to the left of the original position; if the door is found, stop.

Continue alternating directions in this way, doubling the distance each time. This covers both directions while keeping the total distance walked within a constant factor of the distance to the door.

So, in this way we reach the door in at most O(n) steps, where n is the distance from the starting point to the door.
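
A minimal C sketch of this doubling strategy follows. The door_at predicate and the fixed door position are made up purely for the demonstration; in the real problem we can only test the cell we are standing on.

#include <stdio.h>

// Illustrative stand-in for "is the door here?": the door sits at a fixed
// signed position chosen only for this demo.
static const long DOOR = -37;
static int door_at(long p) { return p == DOOR; }

// Back-and-forth search with doubling: sweep to +1, then -2, then +4, -8, ...
// The total distance walked is a constant factor times the distance to the
// door, so the door is found in O(n) steps.
long find_door(void) {
    long radius = 1;
    int dir = +1;                                          // alternate direction each round
    while (1) {
        for (long p = 0; p != dir * radius + dir; p += dir) // walk out to dir*radius
            if (door_at(p))
                return p;
        dir = -dir;                                        // turn around and
        radius *= 2;                                       // double the sweep radius
    }
}

int main(void) {
    printf("door found at %ld\n", find_door());
    return 0;
}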

Ans 6(a) :

No, the Master Theorem cannot be applied to this recurrence, because f(n) = n/log n differs from n^(log_b a) by only a logarithmic factor with a negative exponent: it is neither polynomially smaller nor polynomially larger than n^(log_b a), and it is not of the form n^(log_b a) * log^k(n) with k >= 0, so none of the cases of the Master Theorem (even the extended Case 2) covers it.

Ans 6(b) :

T(1) = 1, and

T(n) = T(n-1) + log n  for n > 1

This is the same recurrence as in Ans 4. Unrolling it k times gives

T(n) = T(n-k) + log n + log(n-1) + ... + log(n-k+1)

and putting n-k = 1 (so k = n-1),

T(n) = T(1) + log n + log(n-1) + ... + log 3 + log 2 = 1 + log(n!)

Since log(n!) <= n log n, in upper-bound notation T(n) = O(n log n).

Ans 6(c) :

We want the running time 100n^2 to be smaller than 2^n, i.e. the quadratic algorithm to become the faster one.

So,

100n^2 < 2^n

By hit and trial: at n = 14, 100n^2 = 19600 > 2^14 = 16384, while at n = 15, 100n^2 = 22500 < 2^15 = 32768. So the minimum value is n = 15.
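
A small C loop confirms the hit-and-trial result:

#include <stdio.h>

// Finds the smallest n with 100*n*n < 2^n.
int main(void) {
    for (unsigned n = 1; n <= 63; n++) {
        unsigned long long lhs = 100ULL * n * n;
        unsigned long long rhs = 1ULL << n;      // 2^n (fits for n <= 63)
        if (lhs < rhs) {
            printf("smallest n = %u (100n^2 = %llu, 2^n = %llu)\n", n, lhs, rhs);
            break;                               // prints: smallest n = 15 ...
        }
    }
    return 0;
}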


Ans 8(a) :

The infix expression is ((A+B)/D)^((E-F)*G).

The first step is (A+B) -> AB+

Then E-F -> EF-
((E-F)*G) -> EF-G*
((A+B)/D) -> AB+D/

So ((A+B)/D)^((E-F)*G) can be written as AB+D/EF-G*^
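
The same conversion can also be done mechanically with an operator stack. Below is a minimal sketch of such a stack-based (shunting-yard) converter for single-letter operands; it is an illustration, not part of the assignment, and it treats '^' as right-associative, as in the answer above.

#include <stdio.h>
#include <ctype.h>

static int prec(char op) {
    switch (op) {
        case '^': return 3;
        case '*': case '/': return 2;
        case '+': case '-': return 1;
        default:  return 0;
    }
}

void infix_to_postfix(const char *in, char *out) {
    char stack[128];
    int top = -1, k = 0;
    for (size_t i = 0; in[i]; i++) {
        char c = in[i];
        if (isalpha((unsigned char)c)) {
            out[k++] = c;                              // operands go straight to output
        } else if (c == '(') {
            stack[++top] = c;
        } else if (c == ')') {
            while (top >= 0 && stack[top] != '(')
                out[k++] = stack[top--];               // pop until the matching '('
            if (top >= 0) top--;                       // discard '('
        } else if (prec(c) > 0) {
            // pop operators of higher precedence, or equal precedence if
            // left-associative (everything except '^')
            while (top >= 0 && stack[top] != '(' &&
                   (prec(stack[top]) > prec(c) ||
                    (prec(stack[top]) == prec(c) && c != '^')))
                out[k++] = stack[top--];
            stack[++top] = c;
        }
        // any other characters (spaces, etc.) are ignored
    }
    while (top >= 0)
        out[k++] = stack[top--];                       // flush remaining operators
    out[k] = '\0';
}

int main(void) {
    char out[128];
    infix_to_postfix("((A+B)/D)^((E-F)*G)", out);
    printf("%s\n", out);                               // prints: AB+D/EF-G*^
    return 0;
}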

Ans 8(b) :

For deleting the kth node from the linked list:

We traverse the linked list to the (k-1)th node using a for loop. Then we set the next pointer of node k-1 to the next-of-next node (node k+1). In this way node k is unlinked from the list and can be freed, as in the sketch below.
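
A minimal C sketch of this deletion; k is taken as 1-based, and the struct definition and error handling are assumptions added for the illustration:

#include <stdlib.h>

struct Node { int data; struct Node *next; };

// Walk to the (k-1)-th node and unlink its successor.
void delete_kth(struct Node **head, int k) {
    if (*head == NULL) return;
    if (k == 1) {                          // deleting the first node moves the head
        struct Node *old = *head;
        *head = old->next;
        free(old);
        return;
    }
    struct Node *ptr = *head;
    for (int i = 1; i < k - 1 && ptr->next != NULL; i++)   // stop at node k-1
        ptr = ptr->next;
    if (ptr->next == NULL) return;         // fewer than k nodes: nothing to delete
    struct Node *victim = ptr->next;
    ptr->next = victim->next;              // bypass the k-th node
    free(victim);
}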

Ans 8(c) :

For counting the even numbers in the array, we create a count variable initialised to zero. Then we traverse the entire array with a for loop and check each element with arr[i] % 2 == 0, where arr[i] is the element at index i. If the condition holds, we increment the counter by 1.

So at the end of the loop the counter holds the number of even numbers in the array.
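
A minimal C sketch of this counting loop:

#include <stdio.h>

// Counts the even numbers in arr[0..n-1].
int count_even(const int arr[], int n) {
    int count = 0;                     // counter starts at zero
    for (int i = 0; i < n; i++)
        if (arr[i] % 2 == 0)           // element at index i is even
            count++;
    return count;
}

int main(void) {
    int a[] = {1, 2, 3, 4, 6, 7};
    printf("%d\n", count_even(a, 6));  // prints 3
    return 0;
}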

Ans 9(a):

Three ways to merge two arrays are:

1) Concatenating the two arrays into a single new array

We traverse both arrays and copy their elements into a single array of sufficient size: first all the elements of the first array, then all the elements of the second. (A short C sketch of this approach appears below, after the list.)

If the first array has size m and the second has size n, traversing both arrays takes O(m+n) time.

We also have to create a new array whose size is the sum of the sizes of the two arrays to store all these elements, so the space complexity is O(m+n).

2) In-place merging
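
A minimal C sketch of approach (1), assuming plain int arrays:

#include <stdio.h>
#include <stdlib.h>

// Approach (1): copy both input arrays, one after the other, into a freshly
// allocated array of size m + n.  Time O(m+n), extra space O(m+n).
int *concat_merge(const int *a, int m, const int *b, int n) {
    int *out = malloc((size_t)(m + n) * sizeof *out);
    if (!out) return NULL;
    for (int i = 0; i < m; i++) out[i] = a[i];         // first array
    for (int j = 0; j < n; j++) out[m + j] = b[j];     // then the second
    return out;
}

int main(void) {
    int a[] = {1, 3, 5}, b[] = {2, 4};
    int *c = concat_merge(a, 3, b, 2);
    for (int i = 0; i < 5; i++) printf("%d ", c[i]);   // prints: 1 3 5 2 4
    printf("\n");
    free(c);
    return 0;
}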
Ans 9(b) :

For merge sort, we divide the array into two halves, sort each half recursively, and then merge the two sorted halves.

Merge_sort(A, left, right)
    if left < right then
        mid = (left + right) / 2
        Merge_sort(A, left, mid)
        Merge_sort(A, mid+1, right)
        Merge(A, left, mid, right)

Merge(A, left, mid, right)
    create temp arrays l and r
    l = A[left .. mid]
    r = A[mid+1 .. right]
    i = 0; j = 0
    k = left
    while i < size(l) and j < size(r)
        if l[i] <= r[j] then
            A[k] = l[i]
            i = i + 1
        else
            A[k] = r[j]
            j = j + 1
        k = k + 1
    while i < size(l)
        A[k] = l[i]
        i = i + 1
        k = k + 1
    while j < size(r)
        A[k] = r[j]
        j = j + 1
        k = k + 1

Ans 10(a) :

The output of the code is 30, because the address of the variable y is passed into the function, and the function changes the value stored at that address.
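
The code for this question is not reproduced in this document; a minimal example of the pattern described (passing the address of y so the function can change it) could look like this:

#include <stdio.h>

// The callee receives the address of the caller's variable, so the
// assignment through the pointer changes the caller's y.
void update(int *p) {
    *p = 30;
}

int main(void) {
    int y = 10;
    update(&y);          // pass the address of y
    printf("%d\n", y);   // prints 30
    return 0;
}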

Ans 10(b) :

#include <stdio.h>
#include <stdlib.h>

struct Node {
    int data;
    struct Node *next;
};

/* Print every element of the list. */
void traverse(struct Node *head) {
    if (head == NULL) {
        printf("Head not found\n");
        return;
    }
    struct Node *ptr = head;
    while (ptr != NULL) {
        printf("Element : %d\n", ptr->data);
        ptr = ptr->next;
    }
}

/* Remove the last node of the list (assumes at least two nodes). */
void pop(struct Node *head) {
    struct Node *ptr = head;
    while (ptr->next->next != NULL) {
        ptr = ptr->next;
    }
    free(ptr->next);
    ptr->next = NULL;
}

/* Append a new node with the given value at the end of the list. */
void push(struct Node *head, int value) {
    if (head == NULL) {
        printf("Error, head not found\n");
        return;
    }
    struct Node *newNode = (struct Node *)malloc(sizeof(struct Node));
    if (!newNode) {
        printf("Memory allocation failed\n");
        return;
    }
    newNode->data = value;
    newNode->next = NULL;

    struct Node *ptr = head;
    while (ptr->next != NULL) {
        ptr = ptr->next;
    }
    ptr->next = newNode;
}

int main() {
    struct Node *head   = (struct Node *)malloc(sizeof(struct Node));
    struct Node *second = (struct Node *)malloc(sizeof(struct Node));
    struct Node *third  = (struct Node *)malloc(sizeof(struct Node));
    struct Node *fourth = (struct Node *)malloc(sizeof(struct Node));

    head->data = 1;
    second->data = 2;
    third->data = 3;
    fourth->data = 4;

    head->next = second;
    second->next = third;
    third->next = fourth;
    fourth->next = NULL;

    traverse(head);

    push(head, 30);
    push(head, 40);
    pop(head);

    return 0;
}

Ans 11:

For the search operation in a linked list:

The best-case time complexity is O(1), when the key element is found at the first node.

e.g. searching for 1 in the linked list 1->2->3->4->NULL (it is found at the 1st node).

The worst-case time complexity is O(n), where n is the number of nodes in the linked list. In this case the key element is found at the last node (or is not present at all).

e.g. searching for 4 in the linked list 1->2->3->4->NULL.

For the insertion operation in a linked list:

The best-case time complexity is O(1), when the insertion is at the 1st position. No traversal is needed, and the insertion itself takes a constant number of steps, so the overall time complexity is O(1).

e.g. inserting 30 at the 1st position in the linked list 1->2->3->4->NULL.

After inserting: 30->1->2->3->4->NULL.

The worst-case time complexity is O(n), where n is the number of nodes in the list. In this case the insertion is after the last node, so we have to traverse to the last node before inserting the new node, which takes O(n) time.

e.g. inserting 40 after the last node of the linked list 1->2->3->4->NULL.

After inserting: 1->2->3->4->40->NULL.
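
A minimal sketch of the search these bounds describe, using the same kind of node structure as earlier:

struct Node { int data; struct Node *next; };

// Linear search over the list: returns the 1-based position of key, or -1 if absent.
// Best case O(1): key is in the first node.  Worst case O(n): key is in the
// last node or not present at all.
int list_search(struct Node *head, int key) {
    int pos = 1;
    for (struct Node *ptr = head; ptr != NULL; ptr = ptr->next, pos++)
        if (ptr->data == key)
            return pos;
    return -1;
}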

Ans 12:

Let f(n) = n^(log2 n).

Taking log (base 2) on both sides, we get

log(f(n)) = (log2 n) * (log2 n) = (log2 n)^2

And let g(n) = 2^n.

Taking log (base 2) on both sides, we get

log(g(n)) = n * log2(2) = n

Since (log2 n)^2 grows much more slowly than n, we have log(f(n)) <= log(g(n)) for all sufficiently large n, hence f(n) <= g(n), and by the definition of big-O the statement f(n) = O(2^n) holds true.

Ans 13:

In the first step it becomes ([AB+]-C)*([DE-]/[FG+H-]^J).


Proceeding in the same way, the final answer is AB+C-DE-FG+H-J^/* (the trailing * comes from the top-level multiplication).
