# Big O

Planted:

Status: seed

*Big O* is the language & metric used to describe the efficiency of an algorithm.
It describes:

- **time complexity**: the execution time.
- **space complexity**: the memory or disk space used.

Big O is important when dealing with algorithms that operate at **scale**.
If you're sorting an array of 5 elements, the algorithm you use will have no noticeable effect on time or space.
If you're sorting an array of millions of elements, the algorithm will have a big impact.

Analysing an algorithm is typically based on the **worst-case** scenario.
Imagine an algorithm that searches an array for an element, starting from the beginning & moving towards the end.
When analysing it, we would base our results on the element being in the last position of the array.
This would take the longest time, thus the worst-case scenario.
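The idea can be sketched with a hypothetical linear search that counts its comparisons — the element's position determines the cost, and the analysis assumes the worst placement:

```js
// Hypothetical linear search that counts comparisons.
// Worst case: the target sits in the last position (or is absent),
// so the search performs n comparisons.
function linearSearch(array, target) {
  let comparisons = 0
  for (let i = 0; i < array.length; i++) {
    comparisons++
    if (array[i] === target) return { found: true, comparisons }
  }
  return { found: false, comparisons }
}

const arr = [4, 8, 15, 16, 23, 42]
linearSearch(arr, 4)  // best case: 1 comparison
linearSearch(arr, 42) // worst case: 6 comparisons (n = 6)
```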

## Examples

An example of the notation is **O(n)**, where **n** represents the input size.
Below are examples of algorithms of varying complexity.

# Time Complexity

## O(1) - Constant Time

An algorithm whose execution time is constant, regardless of input size.
For example, a function that takes an array & returns the first element.

## /index.js

```js
function myAlgorithm(array) {
  return array[0]
}
```

## O(n) - Linear Time

An algorithm whose execution time increases linearly as the input size increases.
For example, a function that takes an array & logs each element.

## /index.js

```js
function myAlgorithm(array) {
  array.forEach(elem => {
    console.log(elem)
  })
}
```

## O(log n) - Logarithmic Time

An algorithm whose execution time increases in proportion to the logarithm of the input size.
Time barely increases even as the input grows exponentially.
For example, a binary search.
When you see a problem where the number of elements in the problem space gets halved each time, it will likely be an O(log n) runtime.

## /index.js

```js
function binarySearch(arr, x) {
  let start = 0
  let end = arr.length - 1
  while (start <= end) {
    const mid = Math.floor((start + end) / 2)
    const value = arr[mid]
    if (value === x) {
      return true
    } else if (value > x) {
      end = mid - 1
    } else {
      start = mid + 1
    }
  }
  return false
}
```

## O(n²) - Quadratic Time

An algorithm whose execution time grows with the square of the input size.
For example, a selection sort.
Commonly, 2 nested loops indicate an algorithm is O(n²),
O(n³) for 3 nested loops,
O(n⁴) for 4 nested loops, and so on.

## /index.js

```js
function selectionSort(array) {
  const n = array.length;
  for (let i = 0; i < n; i++) {
    let min = i;
    for (let j = i + 1; j < n; j++) {
      if (array[j] < array[min]) {
        min = j;
      }
    }
    if (min !== i) {
      const tmp = array[i];
      array[i] = array[min];
      array[min] = tmp;
    }
  }
  return array;
}
```

## O(2ⁿ) - Exponential Time

An algorithm whose execution time doubles with each addition to the input size.
For example, recursively calculating the Fibonacci sequence.

## /index.js

```js
function fibonacci(num) {
  if (num <= 1) return 1;
  return fibonacci(num - 1) + fibonacci(num - 2);
}
```

# Space Complexity

If we create an array of size **n**, it will require **O(n)** space.
If we create a 2-dimensional array of size **n x n**, it will require **O(n²)** space.
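A minimal sketch of these two cases (variable names are illustrative):

```js
// Illustrative only: allocated memory grows with n.
const n = 4
const list = new Array(n).fill(0)                                  // O(n) space
const grid = Array.from({ length: n }, () => new Array(n).fill(0)) // O(n x n) = O(n²) space
```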
Call stack space in recursive calls counts too:

## O(n) - Linear Space

Below, each call adds a frame to the stack, each requiring memory.

## /index.js

```js
function sum(n) {
  if (n === 0) {
    return n
  }
  return n + sum(n - 1)
}

const result = sum(3)
// call stack: sum(3) → sum(2) → sum(1) → sum(0)
```

# Drop Constant & Non-Dominant Terms

Big O describes the rate of increase.
For this reason, we drop the constants in runtime.
For example, an algorithm that iterates through an array twice could be described as O(2n).
However, we simplify it to O(n).
We also drop non-dominant terms:
O(n² + n) becomes O(n²).
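The constant-dropping rule can be sketched with a hypothetical function that makes two full passes over an array — 2n steps in total, but the growth rate is still linear:

```js
// Hypothetical example: two separate passes over the same array.
// Total work is 2n steps, but the growth rate is linear, so it is O(n), not O(2n).
function twoPasses(array) {
  let steps = 0
  for (const item of array) steps++ // first pass: n steps
  for (const item of array) steps++ // second pass: n steps
  return steps
}

twoPasses([1, 2, 3]) // 2n = 6 steps
```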

# Where to Next?

```
└── JavaScript
    └── Performance   ← YOU ARE HERE
        ├── Big O
```