Understanding Time Complexity: A Beginner's POV
Time complexity is simply the time it takes for an algorithm to run, measured not in seconds but in steps, with respect to the size of the input.
For instance, say we have an array:
- fig 1.0 image from geekforgeeks.
Analogy 1: Let's say we want to get the value at index 0 in our array (list of items). The time, or should I say steps, it takes the computer to get it will be very small, because it already knows where that value is in memory; and that applies to getting the value at any index.
- And the time it takes to do that is constant, which is denoted as O(1).
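A minimal sketch of Analogy 1 in Python (the array values here are made up for illustration, since the original uses a figure):

```python
# A small example array (hypothetical values; the post's figure is not shown here).
arr = [7, 1, 4, 9, 3]

# Indexing into a list is a constant-time operation: the computer already
# knows where each element lives in memory, so no searching is needed.
first = arr[0]  # one step, no matter how long the list is: O(1)
last = arr[4]   # also one step

print(first, last)
```

Whether the list has 5 elements or 5 million, `arr[0]` costs the same number of steps.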
Analogy 2: Now let's assume we have no idea what the index of a particular value in the array above is. Let's pick the value (array element) 3; as you can see, it's the last element in the list.
The time complexity (steps) it takes to get the value 3 is 5 steps, because we have to compare each index in turn until we reach our desired value. Now imagine we had a thousand array elements and the item we were searching for, 3, sat at index 999; we'd have to compare every value from the first index to the last.
Now we can agree that the time complexity here is not constant; it's relative to the size of the input (the number of array items), which we denote as O(n).
Obviously the time complexity of this analogy is greater than the former.
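The search in Analogy 2 can be sketched as a simple linear search (again, the array values are assumed for illustration, with 3 as the last element to match the description above):

```python
# Example array, assumed to end with 3 as described above.
arr = [7, 1, 4, 9, 3]

def linear_search(items, target):
    """Compare each element in turn; in the worst case we check every item: O(n)."""
    steps = 0
    for i, value in enumerate(items):
        steps += 1          # one comparison per element
        if value == target:
            return i, steps  # found it: return the index and the step count
    return -1, steps         # not found after checking all n items

index, steps = linear_search(arr, 3)
print(index, steps)  # 3 is the last of 5 elements, so it takes 5 steps
```

If the list grew to a thousand items with the target at the end, `steps` would be 1000; the work grows with the input.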
So what the kcuf is this 👉 Big O notation I keep using to describe the time complexities above?
Well, Big O notation is like a function that describes how the number of steps an algorithm takes grows as its input grows.
So when you say an algorithm has O(n) time complexity, you are essentially saying the number of operations it performs grows linearly with respect to its input size (n), and so on for the other notations.
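One way to see "grows linearly" concretely is to count the worst-case steps of the linear search above for a few input sizes (a toy sketch; the input values here are assumed):

```python
def worst_case_steps(n):
    """Search an n-item list for its last element, counting comparisons."""
    items = list(range(n))
    steps = 0
    for value in items:
        steps += 1
        if value == n - 1:  # the target is the very last element
            break
    return steps

# The step count grows in lockstep with the input size: that's O(n).
for n in (10, 100, 1000):
    print(n, worst_case_steps(n))
```

Double the input, double the steps; that straight-line relationship is exactly what the O(n) curve on a complexity graph shows.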
Here is a graphical representation of each time complexity:
- fig 1.1 image by ajak cycer on medium
This was a quick brain dump; I hope I was able to communicate something to someone. Cheers!
- Make sure to always aim for the green lines in the graph above. Happy coding 🤖 #dsa