
# DSA Radix Sort Time Complexity

See this page for a general explanation of what time complexity is.

The Radix Sort algorithm sorts non-negative integers, one digit at a time.

There are $$n$$ values that need to be sorted, and $$k$$ is the number of digits in the highest value.

When Radix Sort runs, every value is moved to the radix array, and then every value is moved back into the initial array. So $$n$$ values are moved into the radix array, and $$n$$ values are moved back. This gives us $$n + n=2 \cdot n$$ operations.

This moving of values must be done once for each of the $$k$$ digits, which gives us a total of $$2 \cdot n \cdot k$$ operations.

This gives us time complexity for Radix Sort:

$O(2 \cdot n \cdot k) = \underline{\underline{O(n \cdot k)}}$
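The counting argument above can be sketched as a minimal Python implementation that tallies element moves (the function and variable names here are our own, not part of the tutorial):

```python
def radix_sort(arr):
    """LSD Radix Sort for non-negative integers.
    Also counts element moves to illustrate the 2*n*k estimate."""
    if not arr:
        return arr, 0
    k = len(str(max(arr)))  # number of digits in the highest value
    moves = 0
    exp = 1
    for _ in range(k):
        buckets = [[] for _ in range(10)]  # the "radix array"
        for value in arr:                  # n values moved into the radix array
            buckets[(value // exp) % 10].append(value)
            moves += 1
        # n values moved back into the initial array
        arr = [value for bucket in buckets for value in bucket]
        moves += len(arr)
        exp *= 10
    return arr, moves

values = [170, 45, 75, 90, 802, 24, 2, 66]
sorted_values, ops = radix_sort(values)
print(sorted_values)  # [2, 24, 45, 66, 75, 90, 170, 802]
print(ops)            # n=8 values, k=3 digits: 2 * 8 * 3 = 48 moves
```

Each digit pass moves every value twice (into a bucket and back), so the counter lands exactly on $$2 \cdot n \cdot k$$.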

Radix Sort is perhaps the fastest sorting algorithm there is, as long as the number of digits $$k$$ is kept relatively small compared to the number of values $$n$$.

We can imagine a scenario where the number of digits $$k$$ is the same as the number of values $$n$$. In such a case we get time complexity $$O(n \cdot k)=O(n^2)$$, which is quite slow, and the same time complexity as for example Bubble Sort.

We can also imagine a scenario where the number of digits $$k$$ grows as the number of values $$n$$ grows, so that $$k(n)= \log n$$. In such a scenario we get time complexity $$O(n \cdot k)=O(n \cdot \log n )$$, which is the same as for example Quicksort.
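The two scenarios above can be compared numerically with the $$2 \cdot n \cdot k$$ operation estimate (a rough model, not an exact operation count):

```python
import math

def radix_ops(n, k):
    """Estimated Radix Sort operations under the 2*n*k model."""
    return 2 * n * k

n = 1024
print(radix_ops(n, 4))                   # k small and fixed: grows linearly with n
print(radix_ops(n, n))                   # k = n: 2*n^2, Bubble Sort territory
print(radix_ops(n, int(math.log2(n))))   # k = log n: on the order of n*log n
```

With $$n$$ fixed at 1024, the estimate jumps from 8,192 operations for 4-digit values to over 2 million when $$k = n$$, which shows why Radix Sort only pays off while the digit count stays small.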

See the time complexity for Radix Sort in the image below.

Run different simulations of Radix Sort to see how the number of operations falls between the worst-case scenario $$O(n^2)$$ (red line) and the best-case scenario $$O(n)$$ (green line).


The bars representing the different values are scaled to fit the window, so that it looks ok. This means that values with 7 digits look like they are just 5 times bigger than values with 2 digits, but in reality, values with 7 digits are actually 5000 times bigger than values with 2 digits!

If we hold $$n$$ and $$k$$ fixed, the "Random", "Descending" and "Ascending" alternatives in the simulation above result in the same number of operations. This is because Radix Sort performs exactly the same moves regardless of the initial order of the values.
