Algorithms
- like a recipe
- concrete steps to solve the problem
- language-independent (the logic itself)
Complexity Analysis
- Objective criteria: time and space
- best, average, worst (always think in terms of the worst case!)
- Big O
Be aware of the complexity:
- Recognise patterns
- Think about whether you can bring it down (see the sketch below)
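For instance, a common win is replacing a nested scan with a hash-based lookup. A minimal sketch, assuming the problem is checking an array for duplicates (this example problem is an assumption, not from the notes):

// O(n^2): compare every pair
function hasDuplicateSlow(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}

// O(n): one pass, trading space for time with a Set
function hasDuplicateFast(arr) {
  const seen = new Set();
  for (const x of arr) {
    if (seen.has(x)) return true;
    seen.add(x);
  }
  return false;
}

Both return the same answers; the second just does linear work instead of quadratic.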
Big O
DROP the constant in front of the n (e.g. O(2n) is just O(n))
- Constant - O(1) (also written O(k) for a constant k)
: stays the same regardless of the size of the problem
- Linear - O(n) (n: number of items)
: grows proportionally to the size of the problem
e.g. a for loop, finding an item in a linked list
- Logarithmic - O(log n)
- sorted array => cut it in half => repeat until you find the element (Binary Search; see the sketch after the loop below)
- unsorted array => you can only check items one at a time, which is linear
// the upper bound is halved on every pass, so the loop
// body runs about log2(100) times, not 100 times
let count = 0;
let limit = 100;
for (let i = 0; i < limit; i++) {
  limit /= 2;
  count++;
}
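Binary Search itself is the canonical O(log n) algorithm. A minimal sketch over a sorted array (this implementation is an assumption, not from the notes):

// returns the index of target in a sorted array, or -1 if absent
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2); // cut the remaining range in half
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1;
}

binarySearch([1, 3, 5, 7, 9], 7); // => 3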
- Quadratic - O(n^2)
: nested for loop over the same input
- N log n - O(n log n)
e.g. mergeSort (see the sketch after the loop below)
// outer loop runs limit times (linear); the inner loop
// halves its range each step (logarithmic), so count ends
// up being roughly limit * log2(limit)
let count = 0;
let limit = 100;
for (let i = 0; i < limit; i++) {
  for (let j = limit; j > 1; j /= 2) {
    count++;
  }
}
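Since mergeSort is the cited example, here is a minimal sketch of it (this particular implementation is an assumption, not from the notes): the array is split in half about log n times, and each level does O(n) work merging.

function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));  // sort the left half
  const right = mergeSort(arr.slice(mid));    // sort the right half
  // merge the two sorted halves in O(n)
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

mergeSort([5, 2, 9, 1, 6]); // => [1, 2, 5, 6, 9]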
BAD Time Complexity
- Exponential - O(2^n)
e.g. naive recursive Fibonacci (see the sketch below)
- Factorial - O(n!)
e.g. the N-Queens problem (brute force over permutations)
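A naive recursive Fibonacci (an assumed illustration; the N-Queens problem itself is too long to sketch here) shows why O(2^n) blows up: every call spawns two more calls.

// the call tree has on the order of 2^n nodes
function fib(n) {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}

fib(10); // => 55 (fast); fib(50) is already impractically slow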
Hash Table: O(1) average and amortized time complexity; however, it suffers from O(n) worst-case time complexity.
- Worst case
- If too many elements are hashed to the same key, searching inside that key may take O(n) time.
- Once a hash table has exceeded its load factor, it has to rehash [create a new, bigger table and re-insert each element into it].
- Average and amortized case
- It is very rare that many items are hashed to the same key [if you choose a good hash function and the load factor is not too high].
- The rehash operation, which is O(n), can happen at most once every n/2 operations, which are all assumed O(1). So summing the average time per operation gives (n * O(1) + O(n)) / n = O(1).
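A minimal sketch of this behaviour, using separate chaining and rehashing past a 0.75 load factor (the class name, initial capacity, hash function, and threshold are all illustrative assumptions; JavaScript's built-in Map already handles this for you):

// toy hash table: buckets of [key, value] pairs, doubled on rehash
class ToyHashMap {
  constructor(capacity = 8) {
    this.buckets = new Array(capacity).fill(null).map(() => []);
    this.size = 0;
  }
  _index(key) {
    // a poor hash here is exactly what causes the O(n) worst case
    let h = 0;
    for (const ch of String(key)) h = (h * 31 + ch.charCodeAt(0)) | 0;
    return Math.abs(h) % this.buckets.length;
  }
  set(key, value) {
    const bucket = this.buckets[this._index(key)];
    const entry = bucket.find(([k]) => k === key);
    if (entry) { entry[1] = value; return; }
    bucket.push([key, value]);
    this.size++;
    // rehash is O(n), but amortized O(1) per insert
    if (this.size / this.buckets.length > 0.75) this._rehash();
  }
  get(key) {
    const bucket = this.buckets[this._index(key)];
    const entry = bucket.find(([k]) => k === key);
    return entry ? entry[1] : undefined;
  }
  _rehash() {
    const old = this.buckets.flat();
    this.buckets = new Array(this.buckets.length * 2).fill(null).map(() => []);
    this.size = 0;
    for (const [k, v] of old) this.set(k, v); // re-insert every element
  }
}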