This post is a quick, comprehensive Time Complexity Cheat Sheet summarizing the Worst-Case Time Complexity of the most important algorithms and data structure operations used in computer science. This is a vital resource for coding interviews, academic study, and software design.
Understanding Big O notation is crucial for predicting an algorithm's performance and scalability as the input size grows.
1. Sorting Algorithms Time Complexity
Efficient sorting is a cornerstone of computer science. Here is the complexity analysis for standard comparison-based and non-comparison-based sorts.
| Algorithm | Worst-Case Time Complexity | Auxiliary Space Complexity | Notes / Keyword Focus |
|---|---|---|---|
| Insertion Sort | O(n^2) | O(1) | Simple sort, good for small arrays. |
| Merge Sort | O(n log n) | O(n) | Stable sort with guaranteed performance. |
| Heapsort | O(n log n) | O(1) | In-place sort using a Binary Heap. |
| Quicksort (Worst-Case) | O(n^2) | O(log n) or O(n) | High performance on the average case: O(n log n). |
| Counting Sort | O(n + k) | O(k) | Linear time. Requires input integers in the range 0 to k. |
| Radix Sort | O(d(n + k)) | O(n + k) | Linear time. Sorts digit by digit (non-comparison sort). |
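To make the O(n log n) guarantee from the table concrete, here is a minimal Merge Sort sketch in Python. The function names `merge_sort` and `merge` are illustrative, not from any library; the recursion splits the input into log n levels, and merging costs O(n) per level.

```python
def merge_sort(arr):
    """Sorts a list in O(n log n) worst-case time using O(n) auxiliary space."""
    if len(arr) <= 1:                   # base case: 0 or 1 elements are already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])        # recursively sort each half (log n levels deep)
    right = merge_sort(arr[mid:])
    return merge(left, right)           # merging the halves costs O(n) per level

def merge(left, right):
    """Merges two sorted lists into one sorted list in linear time."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= keeps equal elements in order (stable sort)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])             # append whatever remains in either half
    result.extend(right[j:])
    return result

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```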
2. Data Structure Operations Complexity
The time cost of fundamental operations on core Data Structures.
| Data Structure / Operation | Worst-Case Time Complexity | Keyword Focus / Notes |
|---|---|---|
| Binary Search (Sorted Array) | O(log n) | Efficient searching on sorted data. |
| Hash Table (Insert/Delete/Search) | O(1) (Average) / O(n) (Worst) | Crucial for fast lookups. Worst case due to collisions. |
| Red-Black Tree (Insert/Delete/Search) | O(log n) | Self-balancing BST; guarantees logarithmic operations. |
| B-Tree (Insert/Delete/Search) | O(log n) | Optimized for disk I/O (external storage). |
| Heap (MAX-HEAPIFY) | O(log n) | Used for Priority Queues and Heapsort. |
| Disjoint Sets (FIND-SET / UNION) | O(α(n)) (Amortized) | The inverse Ackermann function α(n) is nearly constant in practice. |
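As a concrete example of the logarithmic entries above, here is a minimal iterative Binary Search sketch in Python (`binary_search` is an illustrative name, not a library call). It assumes the input list is already sorted; each iteration halves the remaining search range, which is exactly where the O(log n) bound comes from.

```python
def binary_search(sorted_arr, target):
    """Returns the index of target in sorted_arr, or -1 if it is absent.
    Each iteration halves the search range, giving O(log n) worst-case time."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1                 # discard the left half
        else:
            hi = mid - 1                 # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
```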
3. Graph Algorithms Time Complexity
Complexity analysis for common Graph Algorithms, where V is the number of vertices and E is the number of edges.
| Algorithm / Problem | Worst-Case Time Complexity | Keyword Focus / Notes |
|---|---|---|
| Breadth-First Search (BFS) | O(V + E) | Finds shortest paths in unweighted graphs. |
| Depth-First Search (DFS) | O(V + E) | Used for Topological Sort and cycle detection. |
| Prim's Algorithm (MST) | O(E log V) (Binary Heap) | Finds the Minimum Spanning Tree (MST). |
| Kruskal's Algorithm (MST) | O(E log E) or O(E log V) | MST algorithm using the Disjoint Set structure. |
| Dijkstra's Algorithm (SSSP) | O((V + E) log V) (Binary Heap) | Single-Source Shortest Paths for non-negative weights. |
| Bellman-Ford Algorithm (SSSP) | O(V · E) | Handles graphs with negative edge weights. |
| Floyd-Warshall Algorithm (APSP) | O(V^3) | Finds All-Pairs Shortest Paths using Dynamic Programming. |
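To illustrate the O(V + E) bound shared by BFS and DFS, here is a minimal BFS sketch in Python over an adjacency-list graph. The function name and the sample graph are illustrative; every vertex enters the queue at most once, and every edge is scanned at most once.

```python
from collections import deque

def bfs_shortest_paths(adj, source):
    """Breadth-First Search over an adjacency-list graph.
    Visits each vertex and each edge at most once, giving O(V + E) time."""
    dist = {source: 0}                  # distance in edges from the source
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):        # scan each outgoing edge once
            if v not in dist:           # first visit = shortest unweighted path
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Hypothetical example graph as an adjacency list.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
print(bfs_shortest_paths(graph, 'A'))   # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```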
4. Dynamic Programming and Other Techniques
| Algorithm / Problem | Worst-Case Time Complexity | Technique Used / Keyword |
|---|---|---|
| Matrix-Chain Multiplication | O(n^3) | Dynamic Programming optimization. |
| Longest Common Subsequence (LCS) | O(m · n) | Dynamic Programming for sequence comparison. |
| Activity Selection | O(n log n) | Greedy approach (dominated by the initial sort). |
| Fast Fourier Transform (FFT) | O(n log n) | Divide-and-Conquer for signal processing and polynomial multiplication. |
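As an example of the Dynamic Programming entries above, here is a minimal bottom-up LCS sketch in Python (`lcs_length` is an illustrative name). It fills an (m+1) × (n+1) table, which is exactly where the O(m · n) time and space bound comes from.

```python
def lcs_length(a, b):
    """Length of the Longest Common Subsequence of strings a and b.
    Fills an (m+1) x (n+1) table, so time and space are O(m * n)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1             # extend the common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one character and take the better option
    return dp[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))   # 4 (e.g. "BCBA")
```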
5. Summary of Common Big O Complexity Classes
This table visually summarizes the hierarchy of performance for algorithm analysis.
| Complexity Class | Name | Performance Rank | Example Algorithm |
|---|---|---|---|
| O(1) | Constant Time | Best | Accessing an array element by index. |
| O(log n) | Logarithmic Time | Excellent | Binary Search. |
| O(n) | Linear Time | Good | Searching an unsorted array. |
| O(n log n) | Linearithmic Time | Standard/Optimal for comparison sorting | Merge Sort, Heapsort. |
| O(n^2) | Quadratic Time | Poor/Inefficient | Bubble Sort (Worst-Case). |
| O(2^n) | Exponential Time | Worst | Naive Traveling Salesperson Problem (TSP). |
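A quick way to internalize this hierarchy is to print rough operation counts for a few input sizes. The short Python snippet below does exactly that; constants and lower-order terms are ignored, and the exponential class is omitted because 2^n grows far beyond any readable column width.

```python
import math

# Rough operation counts per complexity class at a few input sizes (constants ignored).
for n in (10, 100, 1000):
    print(f"n={n:>5}: log n={math.log2(n):6.1f}  n={n:>6}  "
          f"n log n={n * math.log2(n):10.0f}  n^2={n**2:>9}")
```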
Conclusion
This Algorithm Complexity Cheat Sheet provides the foundational Big O values essential for any developer or computer science student. Mastering these complexities is the first step toward writing efficient, scalable code.
For full-stack developers, data scientists, and engineers: always prioritize algorithms with logarithmic (O(log n)) or quasilinear (O(n log n)) complexity when possible!

