// The essence of logarithms, specifically logs with base 2, is partitioning: a log in the expression of an algorithm's time complexity usually means the algorithm repeatedly partitions its input space.
Many mathematical constructs show up in computer science, but browse the time complexities of well-known algorithms and you'll keep running into log time.
For example, binary search --> O(log n), quicksort (average case) --> O(n log n), etc.
But why log? What does log time complexity actually mean?
To answer those questions, we'll have to take a short detour to understand logarithms.
So, logx(Y) = Z (log of Y to the base x) mathematically means that Z is the power to which x must be raised to get Y. For example, log2(8) asks for the power of 2 required to get 8; since 2^3 = 8, log2(8) = 3.
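To see that definition concretely, here's a tiny Python check using the standard math module (nothing here is specific to any algorithm):

```python
import math

# log2(8) asks: to what power must we raise 2 to get 8?
print(2 ** 3)        # 8
print(math.log2(8))  # 3.0 -- the answer to that question
```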
The above explanation is what you'll encounter 99.99% of the time from anyone who sets out to explain logarithms. But shifting our POV a little can help us answer our question.
Bear with me while I do that:
if log2(8) = 3, which means 2^3 = 8, that also means 8 can be halved only 3 times before we reach 1:
8 --> 4 --> 2 --> 1
Similarly, log10(100) = 2 means "tenthing" (a word that I just made up :P) 100 until we reach 1:
100 --> 10 --> 1
So, we can conclude that a logarithm also represents the number of times a number can be halved/tenthed (divided by the base) until it reaches 1.
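We can turn that "repeated division" view into code. Here's a minimal sketch (the function name count_divisions is my own, just for illustration) that counts how many times n can be divided by a base until it reaches 1, and compares the count against the logarithm:

```python
import math

def count_divisions(n: int, base: int) -> int:
    """Count how many times n can be divided by `base` until it reaches 1."""
    count = 0
    while n > 1:
        n //= base
        count += 1
    return count

print(count_divisions(8, 2))          # 3 -- halving 8 three times reaches 1
print(count_divisions(100, 10))       # 2 -- tenthing 100 twice reaches 1
print(math.log2(8), math.log10(100))  # 3.0 2.0 -- the logs agree
```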
And these algorithms, binary search for example, halve their input space (n) at every step. So in the worst case, binary search keeps halving the input space until only the target element is left, and we just saw that the number of halvings is exactly log2(n).
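Here's a minimal binary search sketch in Python with a step counter bolted on (the counter is purely for illustration), so you can watch the number of halvings stay close to log2(n):

```python
import math

def binary_search(arr, target):
    """Return (index of target in sorted arr or -1, number of halving steps)."""
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2       # each iteration discards half the range
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

n = 1_000_000
arr = list(range(n))
index, steps = binary_search(arr, n - 1)  # near-worst case: last element
print(steps, math.ceil(math.log2(n)))     # steps stays within ~log2(n) + 1
```

Each iteration throws away half of the remaining range, which is exactly the halving picture from the detour above.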
Therefore, algorithms that repeatedly partition their input space in this way tend to have a "log" in their time complexity.