Logarithms In Time Complexities

Moushumi Pardesi
May 27, 2021

Let’s start with the VERY BASICS. What is a log?

Log of ‘a’ to the base ‘b’ asks: what power must ‘b’ be raised to in order to get ‘a’?

Log (base 10) of 100 asks: what do you need to raise 10 to in order to get 100? The answer is 2.
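A quick sanity check of both examples, as a minimal Python sketch using only the standard math module:

```python
import math

# log base 10 of 100: what do we raise 10 to, to get 100?
print(math.log10(100))  # 2.0

# log base 2 of 8: what do we raise 2 to, to get 8?
print(math.log2(8))     # 3.0
```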

Why do we use log base 2 in computer science?

Whenever we deal with a binary structure, such as a binary tree, the asymptotic complexity of traversing its height is logarithmic.

In balanced binary trees, we’re asking how many times we can split the original tree in half. For example, if we start with N = 8, we split it in half repeatedly (8 → 4 → 2 → 1) to create a balanced binary search tree.

To know how many levels this tree has, we ask what we need to raise 2 to in order to get 8. The answer is 3, so the tree has 3 levels.
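To make the halving concrete, here is a small Python sketch (my own illustration) that counts how many times 8 can be halved before a single element remains:

```python
n = 8
levels = 0
while n > 1:
    n //= 2       # split the remaining elements in half: 8 -> 4 -> 2 -> 1
    levels += 1

print(levels)     # 3, i.e. log (base 2) of 8
```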

So MergeSort runs in O(n log n) time, and so does QuickSort on average. When the data is unsorted, O(n log n) is the best a comparison-based sort can do. The recursion splits the data into log n levels, with O(n) work done at each level, which multiplies out to n log n.
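To illustrate where that comes from, here is a minimal MergeSort sketch (my own example, not tuned for performance): each recursive call halves the input, giving roughly log n levels, and each level does O(n) work merging:

```python
def merge_sort(arr):
    """Sort by halving (log n levels of recursion), then merging (O(n) work per level)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # Merge the two sorted halves back together.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 8, 1, 7, 3, 6, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```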

Where else do logarithms show up?

If you have an indexed, sorted array with N = 8, log n (here log (base 2) of 8 = 3) gives the maximum number of operations one might perform to find a number.

Consider the below array:

[1,2,3,4,5,6,7,8]

If the task is to find 1, we start with the midpoint 4:

[1,2,3,4,5,6,7,8]

Since 4 > 1, we move to the midpoint of the left subarray:

[1,2,3]

Since 2>1, we move further left and find 1.

[1]

This took 2 operations, but searching for an element that is not in the array would take one more, for a total of 3 operations, which is log (base 2) of 8.
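The walkthrough above is just binary search. A minimal Python sketch of it (my own illustration, assuming a sorted list as input):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if it is absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # check the midpoint
        if arr[mid] == target:
            return mid
        elif arr[mid] > target:    # target is smaller: keep only the left half
            hi = mid - 1
        else:                      # target is larger: keep only the right half
            lo = mid + 1
    return -1

print(binary_search([1, 2, 3, 4, 5, 6, 7, 8], 1))   # 0  (checked 4, then 2, then 1)
print(binary_search([1, 2, 3, 4, 5, 6, 7, 8], 9))   # -1 (not in the array)
```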

So log n gives the maximum number of operations one might have to perform in a binary search (or a balanced binary search tree) to get to the desired element.

The reason log base 2 is important here is that it answers the question we keep asking: how many times can we split the data in half?
