The minute we hear "find the time or space complexity of this code", it intimidates us at first, and we might think it is better to just write code than to analyze it.
I always wondered why we should care about complexity if the code works well in my project context and gives me the output I want to see.
Little did I know that it is a fundamental pillar of any code or algorithm (an approach to make code work).
Let's go on a ride and break this time complexity into pieces.
Felt quite boring, I know.
So any time we google a solution to a problem, we find more than one approach or implementation to solve that particular problem.
Though we know they all lead to the same output, we still find, read, and understand the one that best fits our understanding or our project context.
Not every piece of a puzzle can solve my maze, right?
IMAGINE we had implementations of the same function F1 using different approaches:
example:
INPUT : SCOPE
OUTPUT : EPOCS
It can have multiple approaches: say, one with a loop, one with built-in string functions, and others as well.
Knowing which string-reversal solution is best is exactly what Big-O is about.
It is a system, a way of generalizing code and comparing its performance to other pieces of code.
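To make the example concrete, here is a sketch of two such implementations of the string reversal above (the function names are just illustrative):

```javascript
// Approach 1: build the reversed string character by character with a loop
function reverseWithLoop(str) {
  let reversed = "";
  for (let i = str.length - 1; i >= 0; i--) {
    reversed += str[i];
  }
  return reversed;
}

// Approach 2: lean on built-in string/array functions
function reverseWithBuiltins(str) {
  return str.split("").reverse().join("");
}

console.log(reverseWithLoop("SCOPE"));     // EPOCS
console.log(reverseWithBuiltins("SCOPE")); // EPOCS
```

Both produce the same output; Big-O is the tool for comparing how they perform as the input grows.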
Too much theory, I know.
Think of an analogy:
If a particular problem can have solutions s1 to s10, shouldn't there be a Richter scale to measure which solution is better than another?
Big-O is about exactly that. Instead of labels like good, better, best, we assign a mathematical notation, and that works as Big-O.
TAKING AN EXAMPLE:
Write a function that calculates the sum of all numbers up to n.
For n = 3: 1 + 2 + 3 = 6
What can the algorithm for this be?
//First Approach:
function addupton(n) {
  let total = 0;
  for (let i = 0; i <= n; i++) {
    total = total + i;
  }
  return total;
}
//Second Approach
function addupton(n) {
  return n * (n + 1) / 2;
}
So which approach is better in terms of being faster, using less memory, and being more readable?
BETTER: COUNT THE NUMBER OF SIMPLE OPERATIONS THE COMPUTER HAS TO PERFORM. Unlike raw timing, the operation count is a property of the code itself, not of the machine it runs on.
Keep your answer in mind and check. Approach 2 uses only 3 operations (one multiplication, one addition, one division), regardless of input size.
Approach 1 performs a lot of operations: n additions, n assignments, and n comparisons, where every count depends on the input size n.
But counting the exact number of operations here is tricky. Regardless of how precisely you count, the number of operations grows roughly in proportion to n.
THERE COMES BIG-O, THE SOLUTION TO THIS TRICKY COUNTING PROBLEM.
Big-O talks about how the runtime of an algorithm grows as the input grows.
Talking about Big-O is talking about the worst-case scenario, i.e. an upper bound on runtime!
That is all Big-O is.
THUMB RULES TO SOLVE BIG-O
- CONSTANTS DO NOT MATTER
  - O(5N) → O(N)
  - O(1000) → O(1)
  - O(15N^2) → O(N^2)
- SMALLER TERMS DO NOT MATTER
  - O(N + 15) → O(N)
  - O(10000N + 50) → O(N)
  - O(N^2 + 5N + 8) → O(N^2)
- Arithmetic operations are constant time.
- Variable assignments are constant time.
- Accessing an element in an array (by index) is constant time.
- In a loop: complexity = length of the loop × complexity of whatever happens inside the loop.
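Putting the rules together, here is a sketch of how a rough operation count collapses into a Big-O class (the function and its name are just illustrative):

```javascript
// Hypothetical example: count the simple operations, then simplify
function sumUpTo(n) {
  let total = 0;                 // 1 assignment
  for (let i = 0; i < n; i++) {  // 1 assignment, n comparisons, n increments
    total += i;                  // n additions and n assignments
  }
  return total;                  // 1 operation
}

// Roughly 4n + 3 operations. Constants and smaller terms drop out,
// so the whole thing simplifies to O(n).
console.log(sumUpTo(5)); // 0 + 1 + 2 + 3 + 4 = 10
```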
Now let's solve some examples:
function addupton(n) {
  return n * (n + 1) / 2;
}
console.log(addupton(5));
It has only 3 operations irrespective of the input size n → O(1)
function addupton(n) {
  let total = 0;
  for (let i = 0; i <= n; i++) {
    total = total + i;
  }
  return total;
}
console.log(addupton(5));
Its number of operations is bounded by a multiple of n → O(n)
function countUpandDown(n) {
  console.log("going up");
  for (let i = 0; i < n; i++) {
    console.log(i);
  }
  console.log("going down");
  for (let j = n - 1; j > 0; j--) {
    console.log(j);
  }
}
It is O(n) for the first loop and O(n) for the second loop as well: O(n) + O(n) = 2 · O(n) = O(n)
function displayatleast6(n) {
  for (var i = 1; i <= Math.max(6, n); i++) {
    console.log(i);
  }
}
What do you think its time complexity is? As n grows larger and larger, the constant 6 does not matter. Thus it is O(n).
function displayatmost6(n) {
  for (var i = 1; i <= Math.min(6, n); i++) {
    console.log(i);
  }
}
No matter how much n grows, the loop runs at most 6 times, which is a constant. THUS it is O(1).
function onlyElementsAtEvenIndex(array) {
  var newArray = Array(Math.ceil(array.length / 2));
  for (var i = 0; i < array.length; i++) {
    if (i % 2 === 0) {
      newArray[i / 2] = array[i];
    }
  }
  return newArray;
}
It is O(n) time complexity: the loop runs once per element.
function subtotals(array) {
  var subtotalArray = Array(array.length);
  for (var i = 0; i < array.length; i++) {
    var subtotal = 0;
    for (var j = 0; j <= i; j++) {
      subtotal += array[j];
    }
    subtotalArray[i] = subtotal;
  }
  return subtotalArray;
}
It is O(n^2) time complexity! The inner loop runs inside the outer loop, so their lengths multiply. The better you count operations, the better you will pin down the time complexity.
SPACE COMPLEXITY
Time complexity focused on analyzing the runtime of an algorithm as the size of the input increases,
whereas space complexity targets how much additional memory we need to allocate in order to run the code. Space complexity here means auxiliary space complexity, i.e. the space required by the algorithm itself, not the space taken up by the inputs.
THUMB SPACE COMPLEXITY RULES
- Primitive data types (booleans, numbers, undefined, null) require constant space.
- Strings require O(n) space, where n is the string length.
- Arrays and objects generally take O(n) space, where n is the array length or number of keys.
function addarrayelements(array) {
  let total = 0;
  for (let i = 0; i < array.length; i++) {
    total += array[i];
  }
  return total;
}
So no matter what the array length is, we allocate one variable "total" and one loop counter i. That is all the space the algorithm needs!
This results in constant space: O(1) space.
function doublesum(arr) {
  let newarray = [];
  for (let i = 0; i < arr.length; i++) {
    newarray.push(2 * arr[i]);
  }
  return newarray;
}
As the input size increases, the new array grows longer and longer in proportion to the old array, so the space taken up is directly proportional to the input size.
This results in O(n) space!!
That covers time and space complexity as a foundation for getting started with algorithms.
DO NOT GO, THERE IS MORE!
BIG-O LENS FOR OBJECTS AND ARRAYS
Object: an unordered data structure of key-value pairs; quick, but with no order. When we do not need any ordering, objects are the choice, with:
- INSERTION - O(1)
- REMOVAL - O(1)
- SEARCHING - O(N)
- ACCESS - O(1)
JS can add to, remove from, and retrieve from objects in constant time, with no order; the key acts as a gateway. But searching takes linear time, where searching means checking whether a given piece of information exists among the values.
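A small sketch of these costs in action (the object and the helper function are just illustrative):

```javascript
const user = { name: "Asha", role: "admin" };

user.age = 30;          // insertion by key: O(1)
delete user.role;       // removal by key: O(1)
console.log(user.name); // access by key: O(1)

// Searching for a VALUE means walking every entry: O(N)
function hasValue(obj, target) {
  for (let key in obj) {
    if (obj[key] === target) return true;
  }
  return false;
}

console.log(hasValue(user, 30)); // true
```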
BIG-O OF OBJECT METHODS:
//All of these have O(N) time complexity
1. Object.keys()
2. Object.values()
3. Object.entries()
But
4. Object.hasOwnProperty() has constant time O(1).
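For reference, here is what those methods return on a small sample object (the object itself is just illustrative):

```javascript
const scores = { math: 90, science: 85 };

console.log(Object.keys(scores));    // ["math", "science"]            — O(N)
console.log(Object.values(scores));  // [90, 85]                       — O(N)
console.log(Object.entries(scores)); // [["math", 90], ["science", 85]] — O(N)
console.log(scores.hasOwnProperty("math")); // true                    — O(1)
```

The first three visit every key, so they scale with the number of entries; hasOwnProperty only checks one key.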
Arrays: the big difference is that they come with an order. But that order comes at a cost through the lens of Big-O performance. For arrays, the cost of insertion and removal depends on where in the array it happens (the end is cheap, the beginning forces every element to be re-indexed), search takes O(N), and accessing an element by index is quick constant time O(1).
BIG-O OF ARRAY OPERATIONS:
1. push/pop : O(1)
2. shift/unshift : O(N)
3. concat/slice/splice : O(N)
4. sort : O(N * log N)
5. forEach/map/filter/reduce : O(N)
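A quick sketch of why the ends of an array behave so differently (the sample data is just illustrative):

```javascript
const nums = [10, 20, 30];

nums.push(40);   // add at the end: O(1), nothing re-indexes
nums.pop();      // remove from the end: O(1)

nums.unshift(5); // add at the beginning: O(N), every element shifts right
nums.shift();    // remove from the beginning: O(N), every element shifts left

console.log(nums.indexOf(20)); // search: O(N)
console.log(nums[0]);          // access by index: O(1)
```

push/pop only touch the last slot, while shift/unshift must re-index the whole array, which is exactly the cascading cost the table above warns about.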
Objects are best if you want to be quick and have no need for order, whereas arrays should be used wisely when you need order: prefer adding and removing at the end, since adding or removing at the beginning triggers the cascading re-indexing of shift/unshift and hurts performance.
This completes the essence of what time and space complexity are about, in terms of JAVASCRIPT.