
O(log n) Time Complexity: A Comprehensive Analysis

Kuldeep Singh
8 min read · Jul 3, 2023


Time complexity is a way of measuring the performance of algorithms by estimating how much time they take to run for a given input size. It is important to analyze the time complexity of algorithms because it helps us to compare different solutions, optimize our code, and understand the trade-offs and limitations of our algorithms.

One of the most common ways of expressing the time complexity of an algorithm is Big O notation. Big O notation describes the worst-case scenario of an algorithm’s growth rate, or how fast its running time increases as the input size increases. For example, an algorithm with a time complexity of O(n) has a running time proportional to the input size n, meaning it takes roughly n steps to complete. An algorithm with a time complexity of O(n²) has a running time proportional to the square of the input size, meaning it takes roughly n² steps to complete.
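To make the difference concrete, here is a minimal Python sketch (not from the original article) contrasting an O(n) function with an O(n²) one. The function names are illustrative only:

```python
def linear_sum(values):
    """O(n): touches each element once, so the work grows linearly with len(values)."""
    total = 0
    for v in values:          # n iterations
        total += v
    return total


def all_pairs(values):
    """O(n^2): the nested loop runs n * n times, so doubling n quadruples the work."""
    pairs = []
    for a in values:          # n iterations
        for b in values:      # n iterations for each outer iteration
            pairs.append((a, b))
    return pairs
```

Doubling the length of `values` roughly doubles the work in `linear_sum`, but roughly quadruples it in `all_pairs`.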

There are many different time complexities that can be expressed using the Big O notation, such as O(1), O(log n), O(n log n), O(n²), O(2^n), etc. Each of these time complexities represents a different class of algorithms that have different performance characteristics and implications. In this article, we will focus on one of the most common and useful time complexities: logarithmic time complexity, or O(log n).
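As a rough, illustrative comparison (assuming we simply count idealized “steps”), the snippet below prints how these classes grow for a few input sizes; O(2^n) is omitted because it becomes astronomically large almost immediately:

```python
import math

# Approximate step counts for common complexity classes (illustrative only).
for n in (8, 64, 1024):
    print(f"n={n:5d}  O(1)=1  O(log n)={int(math.log2(n)):2d}  "
          f"O(n)={n:5d}  O(n log n)={int(n * math.log2(n)):6d}  O(n^2)={n * n:8d}")
```

Notice how slowly the O(log n) column grows compared to every other column: going from n = 8 to n = 1024 multiplies the input size by 128, yet the logarithmic count only grows from 3 to 10.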

Logarithms and Exponents

Logarithmic time complexity means that the algorithm’s running time grows proportionally to the logarithm of the input size. But what is a logarithm and how does it relate to the input size?
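One intuitive way to see this (a sketch I am adding for illustration, not from the article) is to count how many times an input of size n can be halved before only one element remains. That count is roughly log₂(n), and it is exactly the kind of work a typical O(log n) algorithm does:

```python
def halving_steps(n):
    """Count how many times n can be halved before reaching 1: roughly log2(n) steps."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps


print(halving_steps(8))       # 3
print(halving_steps(1024))    # 10
print(halving_steps(10**6))   # 19, close to log2(1,000,000)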

A logarithm is a mathematical operation that is the inverse of exponentiation. Exponentiation is raising a number to a power, such as 2³ = 8. A logarithm finds what power we need to raise a number to in order to get another number, such as log_2(8) = 3. In other words, a logarithm asks the question: how many times do we need to multiply a number by itself to get another number?
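A quick worked check of this inverse relationship, using Python’s standard `math` module:

```python
import math

print(2 ** 3)              # 8     exponentiation: 2 raised to the power 3
print(math.log2(8))        # 3.0   logarithm: the power 2 must be raised to in order to get 8
print(math.log2(2 ** 10))  # 10.0  log base 2 undoes exponentiation with base 2
```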

Logarithms can have different bases, where the base is the number being repeatedly multiplied. For example, log_2(n) asks how many times we need to multiply 2 by itself to get n, while log_10(n) asks how many times we need to multiply 10 by itself to get n. The base of a logarithm…
