
Time Complexity Explained: From O(1) to O(n²) With Real Code Examples

Time complexity is the single most important concept in DSA interviews. This guide teaches you to calculate and compare complexities with real Python examples — no maths degree required.

SCS Team · 5 March 2026 · 10 min read

In every DSA interview, the interviewer will ask: "What is the time complexity of your solution?"

Most students memorise answers. This article teaches you to derive them — so you can answer for any code you write, even code you've never seen before.


What Time Complexity Actually Means

Time complexity is not the number of seconds your code takes. It's how the number of operations scales as the input size n grows.

We use Big-O notation to describe the worst-case upper bound. When n = 1,000,000:

| Complexity | Operations | Verdict |
|---|---|---|
| O(1) | 1 | Instant |
| O(log n) | ~20 | Instant |
| O(n) | 1,000,000 | Fast |
| O(n log n) | ~20,000,000 | Fast |
| O(n²) | 1,000,000,000,000 | Too slow |
| O(2ⁿ) | ~10^301,030 | Never finishes |

This is why an O(n²) solution that works on n=100 will TLE on n=100,000. The interviewer isn't being cruel — they're checking if you understand scale.
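You can verify these orders of magnitude yourself. A quick sketch computing the operation counts from the table above:

```python
import math

n = 1_000_000

# Operation counts for each complexity class at n = 1,000,000
print(f"O(log n):   {math.log2(n):.0f}")          # ~20
print(f"O(n):       {n:,}")                        # 1,000,000
print(f"O(n log n): {n * math.log2(n):,.0f}")      # ~20,000,000
print(f"O(n^2):     {n**2:,}")                     # 1,000,000,000,000
```

At a (generous) billion operations per second, the O(n²) row alone would take around 17 minutes, which is why online judges reject it.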


The 3 Rules for Calculating Complexity

You don't need to count every operation. These three rules cover 95% of cases.

Rule 1: Drop Constants

```python
# O(3n) → O(n)
def triple_pass(arr):
    for x in arr: print(x)   # n steps
    for x in arr: print(x)   # n steps
    for x in arr: print(x)   # n steps
```

Three passes over arr is still O(n). Constants don't affect the growth rate.

Rule 2: Drop Smaller Terms

```python
# O(n² + n) → O(n²)
def mixed(arr):
    for i in arr:          # O(n)
        for j in arr:      # O(n) for each i → O(n²) total
            print(i, j)
    for x in arr:          # O(n)
        print(x)
```

When n = 1000, n² = 1,000,000 and n = 1,000. The n term is irrelevant. Total: O(n²).

Rule 3: Different Variables Stay Separate

```python
# O(a * b) — NOT O(n²)
def nested_different(arr_a, arr_b):
    for x in arr_a:      # a iterations
        for y in arr_b:  # b iterations
            print(x, y)
```

Only simplify to O(n²) if both loops iterate over arrays of the same size n.


O(1) — Constant Time

The operation takes the same time regardless of input size.

```python
def get_first(arr):
    return arr[0]        # always 1 step, regardless of arr length

def is_even(n):
    return n % 2 == 0    # always 1 step
```

Hash map lookups are O(1). This is why they're so powerful:

```python
seen = {}
seen[5] = "hello"  # O(1) insert
val = seen[5]      # O(1) lookup — doesn't matter if dict has 10 or 10 million entries
```

O(log n) — Logarithmic Time

The input is halved (or divided by a constant factor) at each step.

```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1

    while left <= right:
        mid = (left + right) // 2

        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1   # discard left half
        else:
            right = mid - 1  # discard right half

    return -1
```

Each iteration halves the search space. For n = 1,000,000, this takes only ~20 iterations, since log₂(1,000,000) ≈ 20. That's O(log n): the base of the logarithm is a constant factor, so we drop it.

How to spot it: If the loop variable doubles (i *= 2) or the input is halved each step, it's O(log n).

```python
i = 1
while i < n:
    print(i)
    i *= 2    # 1, 2, 4, 8, 16 ... → log₂(n) steps
```

O(n) — Linear Time

You look at each element exactly once (or a constant number of times).

```python
def find_max(arr):
    max_val = arr[0]
    for x in arr:        # exactly n iterations
        if x > max_val:
            max_val = x
    return max_val

def two_sum(nums, target):
    seen = {}
    for i, num in enumerate(nums):   # n iterations
        complement = target - num
        if complement in seen:       # O(1) lookup
            return [seen[complement], i]
        seen[num] = i
    return []
```

Note: the inner hash map lookup is O(1), so the total is O(n × 1) = O(n).


O(n log n) — Linearithmic Time

This is the best possible time for comparison-based sorting (Merge Sort, Heap Sort, Python's built-in sorted()).

```python
arr = [5, 3, 8, 1, 9, 2]
sorted_arr = sorted(arr)   # O(n log n) — this is as fast as comparison sorting gets
```

Many optimal solutions sort first (O(n log n)) and then do a linear pass (O(n)). The total is O(n log n) — dominated by the sort.
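As a small illustration of the sort-then-scan pattern (the function name here is ours, not from the article), checking an array for duplicates:

```python
def has_duplicate(nums):
    """Return True if any value appears more than once. O(n log n) total."""
    nums = sorted(nums)              # O(n log n) — duplicates become adjacent
    for i in range(1, len(nums)):    # O(n) linear pass over the sorted copy
        if nums[i] == nums[i - 1]:
            return True
    return False

print(has_duplicate([3, 1, 4, 1, 5]))  # True
print(has_duplicate([3, 1, 4]))        # False
```

The sort dominates, so the total is O(n log n). (A hash set gives O(n) time here, at the cost of O(n) extra space.)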

```python
def three_sum(nums):
    nums.sort()          # O(n log n)
    result = []

    for i in range(len(nums)):      # O(n)
        left, right = i + 1, len(nums) - 1
        while left < right:         # O(n) per i → O(n²) across the outer loop
            ...

    return result

# Total: O(n log n) + O(n²) → O(n²)
# The two-pointer inner loop makes this O(n²) overall, not O(n log n)
```
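For completeness, here is one runnable way to fill in that skeleton, using the standard two-pointer logic with duplicate skipping (this is a sketch of the classic approach, not code from the article):

```python
def three_sum(nums):
    """Return all unique triplets that sum to zero. O(n²) time."""
    nums.sort()                          # O(n log n)
    result = []
    for i in range(len(nums) - 2):       # O(n) anchors
        if i > 0 and nums[i] == nums[i - 1]:
            continue                     # skip duplicate anchor values
        left, right = i + 1, len(nums) - 1
        while left < right:              # O(n) per anchor → O(n²) total
            total = nums[i] + nums[left] + nums[right]
            if total < 0:
                left += 1
            elif total > 0:
                right -= 1
            else:
                result.append([nums[i], nums[left], nums[right]])
                left += 1
                right -= 1
                while left < right and nums[left] == nums[left - 1]:
                    left += 1            # skip duplicate middle values
    return result

print(three_sum([-1, 0, 1, 2, -1, -4]))  # [[-1, -1, 2], [-1, 0, 1]]
```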

O(n²) — Quadratic Time

A loop inside a loop, both iterating over the input.

```python
# Bubble sort — the classic O(n²) algorithm
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):              # n iterations
        for j in range(n - i - 1):  # ~n iterations
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
```

O(n²) is usually a sign you need a better approach. For most interview problems, O(n²) gets TLE beyond n=10,000.

The classic optimisation: replace the inner loop with an O(1) hash map lookup.

```python
# O(n²) brute force
def two_sum_slow(nums, target):
    for i in range(len(nums)):             # n
        for j in range(i + 1, len(nums)):  # n
            if nums[i] + nums[j] == target:
                return [i, j]

# O(n) with hash map
def two_sum_fast(nums, target):
    seen = {}
    for i, num in enumerate(nums):   # n
        if target - num in seen:     # O(1)
            return [seen[target - num], i]
        seen[num] = i
```

Space Complexity — The Other Half

Space complexity measures extra memory used (not counting the input itself).

```python
def reverse(arr):
    result = []
    for x in arr:
        result.append(x)    # storing n elements → O(n) space
    return result[::-1]

def reverse_inplace(arr):
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]  # O(1) extra space
        left += 1
        right -= 1
    return arr
```

Recursion uses stack space. A recursive function that calls itself n times deep uses O(n) space even if no extra variables are created.
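A minimal sketch of that effect (the helper name is ours, not from the article): each pending call sits on the stack until the base case is reached, so depth n means O(n) space.

```python
def recursive_sum(arr, i=0):
    """Sum arr recursively — builds n stack frames → O(n) space."""
    if i == len(arr):
        return 0
    return arr[i] + recursive_sum(arr, i + 1)

print(recursive_sum([1, 2, 3, 4]))  # 10

# Python caps recursion depth (~1000 frames by default), so a long
# input makes the O(n) stack cost very concrete:
# recursive_sum(list(range(10_000)))  # raises RecursionError
```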


Quick Reference

| Code Pattern | Time Complexity |
|---|---|
| One loop over arr | O(n) |
| Two nested loops over arr | O(n²) |
| Binary search | O(log n) |
| Hash map insert/lookup | O(1) |
| sorted() / .sort() | O(n log n) |
| Recursive function halving input | O(log n) |
| Recursive function with one branch per element | O(n) |
| Recursive function with two branches per element | O(2ⁿ) |
| Building all subsets | O(2ⁿ) |
| Building all permutations | O(n!) |
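The subsets row is worth seeing concretely, since it's the table's only O(2ⁿ) pattern with a one-line explanation: each element doubles the number of subsets. A hedged sketch (function name ours):

```python
def subsets(nums):
    """Generate all subsets. Output has 2^n lists → O(2^n) time and space."""
    result = [[]]
    for x in nums:
        # every existing subset spawns a copy with x appended → count doubles
        result += [s + [x] for s in result]
    return result

print(len(subsets([1, 2, 3])))        # 8  = 2^3
print(len(subsets(list(range(10)))))  # 1024 = 2^10
```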


How to Answer in an Interview

After writing your solution, proactively say:

"The time complexity is O(n) because I iterate through the array once and each hash map operation is O(1). The space complexity is O(n) for the hash map in the worst case where no pair is found until the end."

Interviewers aren't looking for perfection — they're looking for structured thinking. Walking them through your reasoning out loud is more impressive than just stating the correct answer.

Practice applying this: Every time you solve a problem in the Practice section, calculate the complexity before submitting. Use the AI Tutor to verify your reasoning — it'll explain where you went wrong if you're off.

Ready to practice what you just learned?

Apply these concepts with AI-powered tools built for CS students.