Time Complexity

Asymptotic Notation

When the input size for an algorithm is small, its efficiency has little effect on overall performance. As the input grows, however, efficiency becomes crucial. As a result, when analyzing an algorithm's runtime we focus on the cases where the input size is large, a process called asymptotic analysis.

The Need for Asymptotic Analysis

Asymptotic analysis addresses this by providing an objective standard for comparing an algorithm's runtime or space usage as a function of its input size. There are several forms of asymptotic notation, including Big O, Big Omega, and Theta; Big O and Theta are the most commonly used.

Big O Notation (O Notation)

This represents the asymptotic upper bound: it defines the maximum runtime we can expect from an algorithm, no matter how poorly it performs. If an algorithm's runtime is O(n^2) for an input of size n, it will take at most time proportional to n^2. Formally, O(f(n)) is the set of all functions whose asymptotic growth rate does not exceed that of f(n). For instance, expressions such as n^2 + 3, 3n^2 + 5, n + 4, and 2n log n + 3n all fall under the category of O(n^2).
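As an illustration (not a proof), here is a small Python check that each of these expressions stays below c * n^2 for an arbitrarily chosen constant c = 4 and threshold n0 = 10; the constants are witnesses picked for this sketch, not values from the text:

```python
import math

# Numerical sanity check: each f below is O(n^2), i.e. there exist
# constants c and n0 with f(n) <= c * n^2 for all n >= n0.
candidates = {
    "n^2 + 3":       lambda n: n**2 + 3,
    "3n^2 + 5":      lambda n: 3 * n**2 + 5,
    "n + 4":         lambda n: n + 4,
    "2n log n + 3n": lambda n: 2 * n * math.log(n) + 3 * n,
}

c, n0 = 4, 10  # illustrative witnesses; c = 4 covers the 3n^2 term
for name, f in candidates.items():
    ok = all(f(n) <= c * n**2 for n in range(n0, 10_000))
    print(f"{name}: bounded by {c}*n^2 on [{n0}, 10000)? {ok}")
```

A finite check like this only suggests the bound; the actual claim is about all sufficiently large n.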

Some Important Big-Oh's

| Big-Oh | Name | Algorithms |
| --- | --- | --- |
| O(1) | constant | assignment statement; invoking a function |
| O(log n) | logarithmic | binary search; inserting into a red-black tree |
| O(n) | linear | searching in an unordered list; inserting a new item into a sorted vector |
| O(n log n) | n log n | mergesort |
| O(n^2) | quadratic | insertion sort |
| O(n^3) | cubic | brute-force maximum subsequence sum algorithm |
| O(2^n) | exponential | finding all subsets of n items |
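To make two of the table's rows concrete, here is a minimal Python sketch (function names are illustrative, not from the text) contrasting linear search, which is O(n), with binary search on a sorted list, which is O(log n):

```python
def linear_search(items, target):
    """O(n): may scan every element in the worst case."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): halves the remaining search range each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 100, 2))       # sorted even numbers 0..98
print(linear_search(data, 42))      # 21
print(binary_search(data, 42))      # 21
```

On 50 elements the difference is invisible; on millions of elements the halving strategy needs only a few dozen comparisons where the scan may need millions.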

Time Complexity Comparison

[Figure: comparison of the growth rates of the common time-complexity classes above]

Big Omega Notation (Ω Notation)

This notation represents the asymptotic lower bound: it defines the minimum runtime we can expect from an algorithm, no matter how optimally it performs. If an algorithm's runtime is Ω(n) for an input of size n, it will take at least time proportional to n. Therefore, Ω(f(n)) is the set of all functions whose asymptotic growth rate is at least that of f(n). Thus, we can say that 5n^2, 3n^2, 5n^3, 4n^4 + 23, and 2n^6 - 7n all fall under the category of Ω(n^2).
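The same kind of numerical sanity check (in Python, with the arbitrarily chosen witnesses c = 1 and n0 = 2; again an illustration, not a proof) works for the lower bound:

```python
# Each g below is Ω(n^2), i.e. there exist constants c > 0 and n0
# with g(n) >= c * n^2 for all n >= n0.
candidates = {
    "5n^2":      lambda n: 5 * n**2,
    "3n^2":      lambda n: 3 * n**2,
    "5n^3":      lambda n: 5 * n**3,
    "4n^4 + 23": lambda n: 4 * n**4 + 23,
    "2n^6 - 7n": lambda n: 2 * n**6 - 7 * n,
}

c, n0 = 1, 2  # illustrative witnesses
for name, g in candidates.items():
    ok = all(g(n) >= c * n**2 for n in range(n0, 2_000))
    print(f"{name}: at least {c}*n^2 on [{n0}, 2000)? {ok}")
```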

Theta Notation (Θ Notation)

This notation signifies the intersection of the asymptotic upper and lower bounds. It therefore provides a tight bound on runtime, regardless of how well or poorly the algorithm performs. If an algorithm's runtime is Θ(n) for an input of size n, it will take time roughly proportional to n. Hence, Θ(f(n)) is the set of all functions whose asymptotic growth rate matches that of f(n). Therefore, expressions such as 5n, 3n, 5n - 3, 4n + 4, and 7n - 2n can all be classified as Θ(n).
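As a sketch of the two-sided bound (with illustrative constants c1 = 2, c2 = 5, and n0 = 1 chosen for this example, not taken from the text), 5n - 3 can be squeezed between two linear functions, which is exactly what Θ(n) asserts:

```python
# 5n - 3 is Θ(n): it is bounded below by c1*n and above by c2*n
# for all n >= n0, with the witnesses c1 = 2, c2 = 5, n0 = 1.
def f(n):
    return 5 * n - 3

c1, c2, n0 = 2, 5, 1
ok = all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10_000))
print(ok)  # True: the sandwich holds on the whole tested range
```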

Small o Notation (o Notation)

This notation defines a strict (non-tight) upper bound. If f(n) = o(g(n)), then f's growth rate is strictly smaller than g's; equivalently, f(n)/g(n) tends to 0 as n grows. The small o notation is used when you want to express that a function's growth rate is strictly below a certain limit. For instance, the functions 5n, 5n + 3, and 2n + 7 are o(n^2) because their growth rates are strictly smaller than that of n^2.

Small omega Notation (ω Notation)

This notation defines a strict (non-tight) lower bound. If f(n) = ω(g(n)), then f's growth rate is strictly larger than g's; equivalently, f(n)/g(n) grows without bound. The small omega notation is used when you want to express that a function's growth rate is strictly above a certain limit. For example, the function 5n^3 is ω(n^2) because the growth rate of 5n^3 is strictly greater than that of n^2.
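Both strict bounds can be illustrated numerically (in Python; the sample values of n are arbitrary): for f = o(g) the ratio f(n)/g(n) shrinks toward 0, while for f = ω(g) it grows without bound:

```python
# Ratios f(n)/g(n) at increasing n illustrate the two strict bounds.
ns = (10, 100, 1000, 10000)
ratios_o = [(5 * n + 3) / n**2 for n in ns]  # 5n + 3 = o(n^2): ratio -> 0
ratios_w = [(5 * n**3) / n**2 for n in ns]   # 5n^3 = ω(n^2): ratio -> infinity
print(ratios_o)  # steadily shrinking
print(ratios_w)  # steadily growing
```

A few sample points only suggest the trend; the formal definitions are limits as n goes to infinity.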
