# Analysis of algorithms

A language and a metric to describe the efficiency of algorithms.

## Asymptotic Analysis

- The main idea of asymptotic analysis is to measure the efficiency of algorithms in a way that does not depend on machine-specific constants and does not require implementing the algorithms and timing the resulting programs.

- Asymptotic notations are mathematical tools to represent time and space complexity.

## Notations

1. **Big O**
   - Useful for evaluating the worst-case performance of an algorithm.
   - Asymptotic upper bound: the algorithm takes *at most* this long.
2. **Big Omega (Ω)**
   - Useful for evaluating the best-case performance of an algorithm.
   - Asymptotic lower bound: the algorithm takes *at least* this long.
3. **Big Theta (Θ)**
   - Useful for evaluating the average-case performance of an algorithm.
   - Tight bound: both an upper and a lower bound, i.e. the exact growth rate.
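
For example, linear search over n elements takes Ω(1) time in the best case (the target is the first element) and O(n) time in the worst case (the target is the last element or missing).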

## Time Complexity

### Rules

1. Keep only the fastest-growing term.

   Example

   ```
   O(1) < O(Log n) < O(n) < O(n Log n) < O(n^2) < O(2^n) < O(n!)
   ```

2. Drop the coefficients/constants.

   Example

   ```
   5n --> O(n)
   ```
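
Applying both rules to a full cost expression (a made-up polynomial for illustration):

```
T = 5n^2 + 3n + 20 --> keep 5n^2 (fastest-growing term) --> drop the 5 --> O(n^2)
```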

### Constant time

The running time is independent of the input size n; hence, O(1).

Example

```
T = c = 0.343 = 0.343 * 1 --> O(1)
```

```
x = 5 + (15 * 20)  # O(1)
y = 15 - 2         # O(1)
print(x + y)       # O(1)
```

Total time = O(1) + O(1) + O(1) = O(1)

### Logarithmic time

The running time grows with the logarithm of the input size: O(Log n). Each step shrinks the remaining input by a constant factor.
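
A minimal sketch, assuming a counter that is halved every iteration, so the loop body runs about Log2(n) times:

```
i = n
while i > 1:    # runs about Log2(n) times
    i = i // 2  # halve the remaining range, O(1) per iteration
```

Total time = O(Log n)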

### Linear time

n * O(1) = O(n)

Example

```
T = an + b = O(n)
```

```
for i in range(0, n):  # loop runs n times --> O(n)
    print(i)           # O(1)
```

```
x = 5 + (15 * 20)      # O(1)
for i in range(0, n):  # O(n)
    print(i)
```
Total time = O(1) + O(n) = O(n)

### Quadratic time

O(n^2)

Example

```
T = cn^2 + dn + e = O(n^2)
```

```
for x in range(0, n):      # outer loop runs n times --> O(n)
    for y in range(0, n):  # inner loop runs n times --> O(n)
        print(x * y)       # O(1)
```
Nested loops multiply: Total time = O(n) * O(n) * O(1) = O(n^2)

### Understanding total time

```
# O(1)
x = 5 + (15 * 20)          # O(1)

# O(n)
for i in range(0, n):      # O(n)
    print(i)

# O(n^2)
for i in range(0, n):      # O(n)
    for j in range(0, n):  # O(n)
        print(i * j)       # O(1)
```
Total time = O(1) + O(n) + O(n^2) = O(n^2)


```
if x > 0:
    pass  # O(1) work
elif x < 0:
    pass  # O(Log n) work
else:
    pass  # O(n^2) work
```
Only one branch executes per run, so the most expensive branch dominates: Total time = max(O(1), O(Log n), O(n^2)) = O(n^2)
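
A concrete instantiation of the same pattern, with hypothetical branch bodies chosen to match the stated costs:

```
if x > 0:
    print(x)                   # O(1)
elif x < 0:
    i = n
    while i > 1:               # halving loop --> O(Log n)
        i = i // 2
else:
    for i in range(0, n):
        for j in range(0, n):
            print(i * j)       # nested loops --> O(n^2) overall
```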

## From Slowest to Fastest Growing

| Function | Type | Example |
| --- | --- | --- |
| Constant | O(1) | 5, 25, 2500 |
| Logarithmic | O(Log n) | Log n |
| Linear | O(n) | 5n, 25n, 2500n |
| Linearithmic | O(n Log n) | n Log n |
| Polynomial | O(n^c) | 5n^2, 25n^4, 2500n^8 |
| Exponential | O(c^n) | 5^n, 25^25000n |
| Factorial | O(n!) | n! |

Where n = number of inputs and c = a constant
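
To make the ordering concrete, here is a small sketch (using Python's standard math module) that evaluates each function at n = 20:

```
import math

n = 20
for name, value in [
    ("O(1)", 1),
    ("O(Log n)", math.log2(n)),
    ("O(n)", n),
    ("O(n Log n)", n * math.log2(n)),
    ("O(n^2)", n ** 2),
    ("O(2^n)", 2 ** n),
    ("O(n!)", math.factorial(n)),
]:
    print(name, value)  # values grow from 1 up to 20! = 2432902008176640000
```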

## Algorithms

| Algorithm | Best | Average | Worst |
| --- | --- | --- | --- |
| **Searching** | | | |
| Binary search | O(1) | O(Log n) | O(Log n) |
| Sequential/Linear search | O(1) | O(n) | O(n) |
| **Sorting** | | | |
| Bubble sort | O(n) | O(n^2) | O(n^2) |
| Selection sort | O(n^2) | O(n^2) | O(n^2) |
| Insertion sort | O(n) | O(n^2) | O(n^2) |
| Merge sort | O(n Log n) | O(n Log n) | O(n Log n) |
| Quick sort | O(n Log n) | O(n Log n) | O(n^2) |
| Heap sort | O(n Log n) | O(n Log n) | O(n Log n) |
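
As an illustration of the table's O(Log n) search entry, a minimal binary search sketch (a hypothetical helper, not part of the original notes):

```
def binary_search(items, target):
    # items must be sorted; each probe halves the search range --> O(Log n)
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid          # best case: found on the first probe --> O(1)
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                   # target not present
```

For example, `binary_search([1, 3, 5, 7], 5)` returns `2`.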
