Space Complexity 2
Section: 2
Group Members:
1. Anansi Sime : UGR/9691/15
2. Meron Sisay : UGR/0752/15
3. Selamawit Shimeles : UGR/8982/15
4. Tsion Shimelis : UGR/0654/15
5. Yabtsega Kinfe : UGR/2887/15
6. Yedi Worku : UGR/1035/15
7. Yordanos Abay : UGR/0919/15
Space complexity is a fundamental concept in computer science that describes the amount of
memory an algorithm needs to run as a function of the length of its input. It is a measure of
the efficiency of an algorithm in terms of memory usage, which includes all the memory the
algorithm requires: the input data and any auxiliary space the algorithm needs for temporary
storage during execution.
1. Instruction Space: the amount of memory used to store the compiled version of the
program's instructions.
2. Environmental Stack: the memory used to keep track of partially executed functions.
For example, if a function A() calls a function B() inside it, then all the variables of
function A() are stored on the system stack temporarily while function B() is called
and executed inside function A().
3. Data Space: the amount of space used by variables and constants. When calculating
the space complexity of an algorithm, we usually consider only the Data Space and neglect
the Instruction Space and the Environmental Stack.
To calculate space complexity, we need to know how much memory each type of variable
occupies. This generally varies across operating systems and compilers, but the method for
calculating space complexity remains the same.
Space complexity in Big O notation measures the amount of memory used by an algorithm
with respect to the size of its input. It represents the worst-case memory consumption as the
input size increases. The common classes are:
1. O(1) — Constant Space: The algorithm uses a fixed amount of memory that does not
depend on the input size.
Example 1:
int sum = 0;
for (int i = 0; i < N; i++)
    sum += i;
return sum;
Space complexity -> O(1). The space used by the variables sum and i is constant with respect
to the input (N).
Example 2:
int a = 0, b = 0;
for (int i = 0; i < N; i++)
    a = a + 5;
for (int j = 0; j < M; j++)
    b = b + 6;
Space complexity -> O(1). The space used by a, b, i, and j is constant with respect to the
inputs (N, M).
2. O(n) — Linear Space: The algorithm’s memory usage grows linearly with the input size.
Example
// n is the length of array a[]
int sum (int arr[], int n)
{
int sum = 0; // 4 bytes for sum
for (int i = 0; i < n; i++) // 4 bytes for i
{
sum = sum + arr[i];
}
return(sum);
}
In the above example, the array arr needs 4*n bytes of space (4 bytes per element),
plus 4 bytes each for sum, n, i, and the return value.
So the total amount of memory is (4n + 16) bytes, which increases
linearly with an increase in the input value n. This is called
linear space complexity. A single loop variable such as i, on its
own, contributes only a constant amount of space.
3. O(n²) — Quadratic Space: The algorithm’s memory usage increases proportionally to the
square of the input size.
Note that nested loops on their own do not use quadratic space: the loop counters i and j
occupy only constant memory, so scanning every pair of elements while storing nothing extra
is O(1) space. Quadratic space arises when the algorithm stores a value for every pair of
indices, or for every cell of an n-by-n grid.
- Example: Creating a two-dimensional array of size n*n to represent a matrix, where each
element occupies space. This results in O(n^2) space complexity.
matrix = [[0] * n for _ in range(n)]
Space complexity is not the same for all programming languages. It depends on various
factors, including the programming language, the implementation of data structures and
algorithms, and the underlying runtime environment. Below we will see how each of these
factors affects the space complexity of a program.
Choosing the right algorithm and the right data structure can dramatically affect the
performance of our code (time and memory). In this paragraph we will talk a lot about how it
affects our memory. The space complexity helps to determine the efficiency and scalability
of a solution, and it is an important factor to consider when choosing a data structure or
designing an algorithm.
By understanding these space complexities, programmers can make informed decisions about
which data structures to use in their software. Careful selection can help optimize
memory usage and ensure the smooth operation of critical systems within limited memory.
An algorithmic paradigm is a general approach or strategy for solving a
class of problems, which defines the main idea or concept behind an algorithm without
specifying the exact details or implementation. Algorithm design paradigms can significantly
influence space usage in a program. Here's how some common paradigms impact space
complexity:
-Divide and conquer involves breaking down a large problem into smaller and simpler sub
problems, solving them recursively, and combining the solutions to get the final answer. For
example, to sort a given list of n natural numbers, split it into two lists of about n/2 numbers
each, sorts each of them in turn, and interleave both results appropriately to obtain the sorted
version of the given list. This approach is known as the merge sort algorithm. It often requires
storing solutions to those subproblems before combining them. Merge Sort is a good example:
merging uses temporary arrays of linear size, O(n), plus O(log n) stack space for the recursion.
The other factors affecting space utilization in a program are programming languages and
runtime environment. Different programming languages provide varying levels of low-level
control, different standard libraries, and different language features, which can affect the
efficiency and complexity of algorithms implemented in those languages.
It's important to consider the specific characteristics and performance considerations of the
programming language being used when analysing time and space complexity. Additionally,
different implementations and optimizations within a programming language can also impact
the efficiency and complexity of algorithms. The runtime environment is the environment in
which a program or application is executed. It's the hardware and software infrastructure that
supports the running of a particular codebase in real time.
The environment where your program runs can significantly impact how efficiently it uses
memory. Factors like virtual memory versus physical memory, automatic garbage collection,
and pre-loaded libraries can all influence space usage. Even the operating system's memory
management can play a role. By understanding these environmental factors and writing
memory-conscious code, you can ensure your program gets the most out of its available
space.
Case study
-Choose a simple algorithm with O(1) space complexity and elucidate its characteristics.
Provide a detailed explanation of why its space complexity remains constant irrespective of
input size.
Selection sort arranges an array's elements in ascending or descending order without extra
memory allocation, maintaining a constant space complexity of O(1). It iteratively selects the
smallest (or largest) element from the unsorted part and swaps it with the first (or last)
element in the sorted portion. This process reduces the unsorted segment while expanding the
sorted one. It operates solely on the input array, using a fixed number of variables for
comparisons and swaps, making it space-efficient for large datasets with limited memory
resources.
Illustrative Example:
Consider an array of integers: [5, 2, 9, 1, 5, 6]. Initiating the algorithm, we observe the
following progression:
1. Initial State: Unsorted: [5, 2, 9, 1, 5, 6], Sorted: []
2. First Pass: Identify smallest (1) and swap with first element (5).
Updated array: [1, 2, 9, 5, 5, 6]
3. Second Pass: Smallest in remaining unsorted is 2. Swap with second element (2). No
change.
4. Third Pass: Identify smallest (5) and swap with third element (9).
Updated array: [1, 2, 5, 9, 5, 6]
5. Fourth Pass: Identify smallest (5) and swap with fourth element (9).
Updated array: [1, 2, 5, 5, 9, 6]
6. Fifth Pass: Identify smallest (6) and swap with fifth element (9).
Updated array: [1, 2, 5, 5, 6, 9]
The sorted array: [1, 2, 5, 5, 6, 9].
def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        # Find the index of the minimum element in the remaining unsorted part
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap the found minimum element with the first unsorted element
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

# Given array
arr = [5, 2, 9, 1, 5, 6]
sorted_arr = selection_sort(arr)
# Sorted array: [1, 2, 5, 5, 6, 9]
The space complexity of selection sort remains constant at O(1) as it operates solely on the
input array, avoiding additional data structures. It conducts in-place sorting by rearranging
elements within the existing array, eliminating the need for extra memory allocation.
Additionally, the algorithm employs a fixed number of variables for comparisons and swaps,
regardless of input size, ensuring consistent space usage. This optimization makes selection
sort suitable for sorting large datasets efficiently with limited memory resources.
-Select a moderately complex algorithm with O(n) space complexity and analyse its
memory utilization pattern. Discuss the factors contributing to its linear space complexity
and its implications for scalability.
Merge Sort:
Merge Sort, a comparison-based algorithm, divides an array into halves, sorts each half
recursively, and merges them, using O(n) auxiliary space. Its divide-and-conquer approach
simplifies complex sorting tasks. Recursion splits the array until each subarray holds a
single element, and maintaining the call stack consumes memory. During merging, temporary
arrays store the sorted elements, which is what gives Merge Sort its O(n) space complexity.
Merge Sort is also stable: it preserves the relative order of equal elements. Its memory
usage pattern involves stack space proportional to the recursion depth (O(log n)) and
temporary arrays sized to the input (O(n)), so the linear space complexity arises from
merging rather than from recursion alone. While efficient for large datasets, Merge Sort's
memory usage can hinder scalability on memory-limited systems. Optimizations like in-place
merging or iterative (bottom-up) approaches can mitigate memory demands but may complicate
the implementation.
Step 1: Divide
The array is recursively divided into halves until each subarray contains only one element.
[7, 2, 5, 3, 9, 1, 6, 8]
↓
[7, 2, 5, 3] [9, 1, 6, 8]
↓ ↓
[7, 2] [5, 3] [9, 1] [6, 8]
↓ ↓ ↓ ↓
[7] [2] [5] [3] [9] [1] [6] [8]
Step 2: Merge
The sorted subarrays are merged back together while maintaining the sorted order.
[7, 2] [5, 3] [9, 1] [6, 8]
↓ ↓ ↓ ↓
[2, 7] [3, 5] [1, 9] [6, 8]
↓ ↓ ↓ ↓
[2, 3, 5, 7] [1, 6, 8, 9]
↓ ↓
[1, 2, 3, 5, 6, 7, 8, 9]
Memory Utilization:
During divide, memory is allocated for the call stack with a max depth of log₂(n), where n is
the array size. In merging, temporary arrays of size n are created, contributing to O(n) space
complexity.
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves into a temporary array (O(n) space)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # Append remaining elements
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Example usage
arr = [7, 2, 5, 3, 9, 1, 6, 8]
sorted_arr = merge_sort(arr)  # [1, 2, 3, 5, 6, 7, 8, 9]
Space complexity refers to the amount of memory a data structure consumes during its
operations. It's crucial alongside time complexity for efficient program design. We express
space complexity using Big O notation.
Arrays: Arrays offer constant time access (O(1)) for any element using indexing.
However, their space complexity is directly tied to their size. They require contiguous
memory allocation to store all elements, resulting in a space complexity of O(n),
where n is the number of elements in the array. The size of the array directly affects
its space complexity. More elements require more memory allocation.
Example: consider an array declared as int numbers[n]. The space complexity is dominated
by the array numbers; since its size depends on n, the space complexity is O(n). This means
the memory usage grows linearly with the number of elements in the array: if we change n
from 10 (int n = 10;) to 24 (int n = 24;), the memory used grows linearly with it.
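As a small sketch of this linear growth (using a Python list to stand in for the array, and CPython's sys.getsizeof purely as an illustration; exact byte counts are implementation details):

```python
import sys

def array_bytes(n):
    # Allocate a list of n integers, analogous to int numbers[n]
    numbers = [0] * n
    # The list object's size grows linearly with n: O(n)
    return sys.getsizeof(numbers)

# More elements -> proportionally more memory
assert array_bytes(24) > array_bytes(10)
```

Doubling n roughly doubles the memory the list occupies, which is exactly what O(n) space predicts.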
Space complexity of two-dimensional array:
Similar to single-dimensional arrays, multidimensional arrays also have a space
complexity related to the number of elements they store. However, for
multidimensional arrays, space complexity is the product of the space complexities of
each dimension.
Formula: O(n1 * n2 * ... * nk), where n1, n2, ..., nk represent the sizes of each
dimension (k dimensions).
For example, consider a 2D array named matrix with 3 rows (first dimension) and 4 columns
(second dimension).
Here's the key point: memory is allocated for all elements of the array. Since it's a 2D
array, the total space used depends on both the number of rows and the number of columns.
The space complexity for this 2D array is O(3 * 4), which simplifies to O(12), because
the space used is proportional to the product of the number of rows (3) and
the number of columns (4).
Generally, the concept extends to arrays with more dimensions. A 3D array with
dimensions (m x n x p) would have a space complexity of O(m * n * p).
Each dimension contributes to the overall space used by the multidimensional array.
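The 3-rows-by-4-columns example above can be sketched in Python (the helper name make_matrix is illustrative, not from the original):

```python
def make_matrix(rows, cols):
    # Allocates one cell per (row, column) pair: O(rows * cols) space
    return [[0] * cols for _ in range(rows)]

matrix = make_matrix(3, 4)
total_cells = sum(len(row) for row in matrix)  # 3 * 4 = 12 cells
```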
Linked List Space complexity: Linked lists don't have fixed sizes like arrays.
Each node stores data and a reference (pointer) to the next node. This dynamic
allocation allows for insertions and deletions at any point. However, space
complexity is O(n) because each node uses memory to store data and the pointer.
Additionally, random access is inefficient (O(n)) as you need to traverse the list to
find a specific element.
For example, each node in a linked list uses a constant amount of space to store its data and
the reference (next). As we add n nodes to the list, the total space grows linearly with n,
because the number of nodes directly determines the amount of memory used by the linked list.
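A minimal Python sketch of a singly linked list (the Node, build_list, and count_nodes names are illustrative helpers, not from the original):

```python
class Node:
    # Each node stores its data plus one 'next' reference: constant space per node
    def __init__(self, data):
        self.data = data
        self.next = None

def build_list(values):
    # n values -> n nodes, so total space grows linearly: O(n)
    head = None
    for v in reversed(values):
        node = Node(v)
        node.next = head
        head = node
    return head

def count_nodes(head):
    n = 0
    while head is not None:
        n += 1
        head = head.next
    return n
```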
Let’s see the difference between the space complexity of singly and doubly linked lists:
Both singly and doubly linked lists have a space complexity of O(n), but doubly
linked lists use a constant amount of extra space per node due to the additional "prev"
pointer.
In a singly linked list, each node stores two pieces of data:
- The actual data value (an int in this example).
- A reference (next) to the next node in the list. This reference typically occupies the
same amount of space as a pointer (e.g., 4 bytes on a 32-bit system).
As you add more nodes (n) to the list, the total space used increases linearly. Each
additional node contributes a constant amount of space for its data and the next
reference.
The space complexity is still O(n) because the memory usage is still directly
proportional to the number of nodes.
However, there is a slight difference from singly linked lists: the additional prev
reference adds a constant overhead per node. This overhead doesn't change the
overall space complexity of O(n), but it does mean a doubly linked list uses
slightly more memory per node than a singly linked list.
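The extra per-node reference can be sketched as follows (DoublyNode is an illustrative name):

```python
class DoublyNode:
    # Same data and 'next' as a singly linked node, plus one extra 'prev'
    # reference: a constant overhead per node; total space is still O(n)
    def __init__(self, data):
        self.data = data
        self.next = None
        self.prev = None

# Linking two nodes wires both directions
a = DoublyNode(1)
b = DoublyNode(2)
a.next, b.prev = b, a
```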
Stacks:
Stacks follow a Last-In-First-Out (LIFO) principle. They typically have two main
implementation approaches: array-based and linked list-based.
Space Complexity:
1. Array-Based Stack: O(n)
o The entire array may be allocated upfront, even if not fully used.
o The space complexity is O(n), tied to the array size.
2. Linked List-Based Stack: O(n)
Each node uses a constant amount of space for data and a reference, so memory usage
grows linearly with the number of elements on the stack.
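A minimal sketch using a Python list as the stack (append acts as push, pop removes the most recently added element):

```python
# A stack sketched with a Python list: LIFO, O(n) space for n stored elements
stack = []
stack.append(10)
stack.append(20)
stack.append(30)
top = stack.pop()  # removes 30, the last element pushed
```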
Queues: Queues follow a First-In-First-Out (FIFO) principle. Similar to stacks, they can be
implemented using arrays or linked lists.
Space Complexity:
1. Array-Based Queue: O(n) (similar to array-based stacks)
o The entire array needs to be allocated upfront, even if not fully used.
o The space complexity is O(n), tied to the array size, regardless of the number
of elements in the queue.
2. Linked List-Based Queue: O(n)
Each node in the linked list uses a constant amount of space for data and a reference.
The space complexity is O(n) as memory usage grows linearly with the number of
elements in the queue.
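A minimal sketch of the FIFO behavior using Python's collections.deque (one common linked-structure-backed queue):

```python
from collections import deque

# A FIFO queue; memory grows linearly with the number of queued items: O(n)
queue = deque()
for item in [1, 2, 3]:
    queue.append(item)   # enqueue at the back
first = queue.popleft()  # dequeue from the front: first in, first out
```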
Space complexity of Trees: Trees have a hierarchical structure with nodes containing data
and references to child nodes. Storing n nodes always takes O(n) space, regardless of the
tree's shape. What the shape affects is the auxiliary space needed to traverse the tree
(for example, the recursion stack): a well-balanced tree (like a balanced binary search
tree) has height O(log n), while a skewed tree (where most nodes lean to one side) can
have height, and therefore auxiliary traversal space, as bad as O(n).
Balanced Tree: with 10 nodes in a well-balanced arrangement, the tree has about 4 levels,
so the auxiliary space for traversal is closer to O(log 10) (around 4 in this simplified
example).
Skewed Tree: here, all 10 nodes might lie on a single path, resulting in a very deep tree.
The auxiliary space for traversal is then O(10), which is O(n).
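A small illustrative sketch of the skewed case (TreeNode and height are assumed helper names, not from the original):

```python
class TreeNode:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

def height(node):
    # Recursion depth while traversing equals the tree height, so the
    # auxiliary stack space is O(log n) for balanced trees, O(n) for skewed
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

# Skewed tree: every node has only a right child, so height equals n
skewed = TreeNode(1)
skewed.right = TreeNode(2)
skewed.right.right = TreeNode(3)
```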
Space complexity plays a crucial role in various real-world scenarios, especially in systems
where memory resources are limited or expensive. Understanding and managing space
complexity is vital to ensure the efficient operation of these systems. Some practical
applications include:
-Embedded systems: devices like microcontrollers in embedded systems often have very
limited memory. Algorithms used in such systems must be optimized for space to ensure they
fit within the constraints of the hardware.
-Mobile applications: mobile devices, though increasingly powerful, still have limitations in
terms of memory, especially when multiple applications run simultaneously. Space-efficient
algorithms are essential to avoid exhausting the device’s memory and to ensure smooth
application performance.
-Big data applications: in big data scenarios, where the volume of data is enormous, even
algorithms with linear space complexity can become impractical. Algorithms in these
contexts need to be especially mindful of space usage to handle large datasets effectively.
-Browser applications: web applications running in browsers have to operate within the
memory constraints of the browser and the underlying devices. Optimizing algorithms for
space can lead to faster and more responsive web applications.
Conclusion
Space complexity is a measure of how much memory or space an algorithm needs to execute.
It is important to consider space complexity when dealing with large datasets or limited
memory resources. The space complexity of an algorithm is typically expressed in terms of
the amount of memory it uses relative to the size of its input.
In resource-constrained environments like mobile devices or embedded systems, space
efficiency is vital. These devices often have limited memory capacities, necessitating space-
efficient programming to ensure optimal performance, minimize power consumption, and
provide a smooth user experience.
Optimizing space usage improves performance. Efficient memory utilization reduces time
spent on memory allocation, deallocation, and garbage collection. This results in faster
execution, reduced latency, and improved overall responsiveness. Space efficiency is
especially crucial in performance-critical applications like real-time systems or high-
throughput data processing.
Considering the importance of space efficiency is vital for building robust and efficient
software solutions in today’s data-intensive and resource-constrained environments. By
emphasizing optimized memory utilization, developers can create lean, scalable, and high-
performing applications.