Time Complexity
Asymptotic Notation
When the input size is small, an algorithm's efficiency has little effect on its overall running time. Efficiency becomes crucial, however, as the input grows larger. For this reason, when analyzing an algorithm's runtime we focus on its behavior for large input sizes, a process called asymptotic analysis.
The Need for Asymptotic Analysis
Asymptotic analysis addresses this need: it provides a standard for objectively comparing an algorithm's runtime or space usage as a function of its input size. There are several forms of asymptotic notation, including Big O, Big Omega, and Theta; of these, Big O and Theta are the most commonly used.
Big O Notation (O Notation)
This represents the asymptotic upper bound, meaning it defines the maximum runtime we can expect from an algorithm, regardless of how poorly it performs. If an algorithm's runtime is O(n) for an input size of n, this means it will take, at most, a time proportional to n. So, O(f(n)) is the set of all functions whose asymptotic growth rate does not exceed that of f(n). For instance, expressions such as n, n/2, 2n + 3, and log n all fall under the category of O(n).
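As a quick illustration, consider summing the elements of an array: the loop does a constant amount of work per element, so the total runtime is proportional to n no matter what the data is. The C++ sketch below (the function name sumAll is illustrative, not from the original text) makes this concrete.

```cpp
#include <iostream>
#include <vector>

// Summing a vector performs a constant amount of work per element,
// so the total runtime grows proportionally to n: it is O(n).
long long sumAll(const std::vector<int>& values) {
    long long total = 0;
    for (int v : values) {
        total += v;          // one addition per element -> about n operations
    }
    return total;
}

int main() {
    std::vector<int> data{3, 1, 4, 1, 5, 9, 2, 6};
    std::cout << sumAll(data) << "\n";  // work scales linearly with data.size()
}
```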
Some Important Big-Oh's
| Big-Oh | Name | Example |
| --- | --- | --- |
| O(1) | constant | assignment statement, invoking a function |
| O(log n) | logarithmic | binary search, inserting into a red-black tree |
| O(n) | linear | searching in an unordered list, inserting a new item into a sorted vector |
| O(n log n) | n log n | mergesort |
| O(n²) | quadratic | insertion sort |
| O(n³) | cubic | brute-force maximum subsequence sum algorithm |
| O(2ⁿ) | exponential | finding all subsets of n items |
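To make two rows of the table concrete, the sketch below (the names linearSearch and binarySearch are illustrative) contrasts searching an unordered list, which may scan all n elements, with binary search on sorted data, which halves the remaining range at each step.

```cpp
#include <iostream>
#include <vector>

// Linear search may examine every element in the worst case: O(n).
bool linearSearch(const std::vector<int>& v, int target) {
    for (int x : v) {
        if (x == target) return true;
    }
    return false;
}

// Binary search on a sorted vector halves the search range each step: O(log n).
bool binarySearch(const std::vector<int>& v, int target) {
    int lo = 0, hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (v[mid] == target) return true;
        if (v[mid] < target)  lo = mid + 1;
        else                  hi = mid - 1;
    }
    return false;
}

int main() {
    std::vector<int> sorted{1, 3, 5, 7, 9, 11};
    std::cout << linearSearch(sorted, 7) << " " << binarySearch(sorted, 7) << "\n";
}
```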
Time Complexity Comparison
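The difference between these growth rates can also be seen numerically. The short program below (an illustrative sketch) prints how log n, n log n, n², and 2ⁿ grow as n doubles.

```cpp
#include <cmath>
#include <iostream>

// Prints how the common complexity classes from the table above grow with n.
int main() {
    std::cout << "n\tlog n\tn log n\tn^2\t2^n\n";
    for (int n : {1, 2, 4, 8, 16, 32}) {
        std::cout << n << '\t'
                  << std::log2(n) << '\t'
                  << n * std::log2(n) << '\t'
                  << static_cast<long long>(n) * n << '\t'
                  << (1LL << n) << '\n';   // 2^n dwarfs everything else quickly
    }
}
```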

Big Omega Notation (Ω Notation)
This notation represents the asymptotic lower bound, defining the minimum runtime we can expect from an algorithm, no matter how optimally it performs. If an algorithm's runtime is Ω(n) for an input size of n, it implies that it will take at least a time proportional to n. Therefore, Ω(f(n)) is the set of all functions whose asymptotic growth rate is at least that of f(n). Thus, we can say that n, n log n, n², and 2ⁿ all fall under the category of Ω(n).
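For example, finding the maximum of an unsorted array must inspect every element even in the best case, so its runtime is Ω(n). The sketch below (the name maxElement is illustrative) shows such a function.

```cpp
#include <iostream>
#include <vector>

// Finding the maximum of an unsorted vector must inspect every element,
// even in the best case, so its runtime is at least proportional to n: Ω(n).
int maxElement(const std::vector<int>& v) {
    int best = v.front();            // assumes v is non-empty
    for (int x : v) {
        if (x > best) best = x;      // exactly n comparisons, never fewer
    }
    return best;
}

int main() {
    std::vector<int> data{4, 8, 15, 16, 23, 42};
    std::cout << maxElement(data) << "\n";
}
```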
Theta Notation (Θ Notation)
This notation signifies the intersection of the asymptotic upper and lower bounds. This means it provides a tight bound on runtime, regardless of how well or poorly the algorithm performs. If an algorithm's runtime is Θ(n) for an input size of n, it means it will take approximately a time proportional to n. Hence, Θ(f(n)) is the set of all functions whose asymptotic growth rate is about the same as that of f(n). Therefore, expressions such as n, 2n + 1, n + log n, and 5n can all be classified as Θ(n).
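For instance, counting how many times a value appears in an array always scans all n elements, so the runtime is bounded above and below by a multiple of n, i.e. Θ(n). A minimal C++ sketch (the name countOccurrences is illustrative):

```cpp
#include <iostream>
#include <vector>

// Counting occurrences always scans all n elements, no matter what the
// data looks like, so the runtime is both O(n) and Ω(n), i.e. Θ(n).
int countOccurrences(const std::vector<int>& v, int target) {
    int count = 0;
    for (int x : v) {
        if (x == target) ++count;    // one comparison per element, always n total
    }
    return count;
}

int main() {
    std::vector<int> data{1, 2, 2, 3, 2, 4};
    std::cout << countOccurrences(data, 2) << "\n";  // prints 3
}
```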
Small o Notation (o Notation)
This notation defines a relaxed upper bound in the asymptotic notation. Regardless of how poorly an algorithm performs, it will always perform better than the function being compared. The small o notation is used when you want to express that a function's growth rate is strictly smaller than a certain limit. For instance, the functions log n, √n, and n are o(n²) because their growth rates are strictly smaller than that of n².
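One common way to make "strictly smaller growth rate" precise is the limit characterization below, shown as a sketch in LaTeX.

```latex
% f(n) is in o(g(n)) exactly when the ratio f(n)/g(n) tends to zero.
\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
% For example, \log n \in o(n), since (\log n)/n \to 0 as n \to \infty.
```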
Small omega Notation (ω Notation)
This notation defines a relaxed lower bound within the asymptotic notation. No matter how optimally an algorithm performs, it cannot perform better than the function being compared. The small omega notation is used when you want to express that a function's growth rate is strictly larger than a certain limit. For example, the function n² can be said to be ω(n) because the growth rate of n² is greater than the growth rate of n.
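The corresponding limit characterization, again shown as a LaTeX sketch:

```latex
% f(n) is in \omega(g(n)) exactly when the ratio f(n)/g(n) grows without bound.
\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty
% For example, n^2 \in \omega(n), since n^2 / n = n \to \infty.
```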