Time complexity


The time complexity of an algorithm is a function \(T(n) : \mathbb Z^+ \rightarrow \mathbb Z^+\) that maps input size to runtime, in units of atomic operations. Its order of growth can be analyzed with asymptotic notation.
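As a minimal sketch of what "counting atomic operations" looks like in practice (the function and its comparison counter below are illustrative, not part of the text), a linear scan that tallies its comparisons gives a concrete \(T(n)\): when the target is absent, every element is compared, so \(T(n) = n\).

```python
def count_comparisons(values, target):
    """Linear scan that also reports how many comparisons it performed.

    Illustrative sketch: the counter stands in for T(n), the number of
    atomic operations as a function of the input size n = len(values).
    """
    comparisons = 0
    for value in values:
        comparisons += 1              # one atomic comparison per element examined
        if value == target:
            return True, comparisons
    return False, comparisons         # worst case: T(n) = n comparisons


# Searching for an absent value hits the worst case: T(4) = 4.
print(count_comparisons([3, 1, 4, 1], 5))   # (False, 4)
```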

Here's a list of common time complexities sorted by increasing order of growth:

Name | Order | Example
Constant | \(O(1)\) | Determining if an integer is even or odd; printing "Discretopia" 1,000,000 times
Logarithmic | \(O(\log n)\) | Binary search: halving until finding the target value in a sorted array
Linear | \(O(n)\) | Linear search: checking each value in an array until the target value is found
Linearithmic | \(O(n \log n)\) | Merge sort, and many other comparison-based sorting algorithms
Quadratic | \(O(n^2)\) | Insertion sort: building a sorted array part-by-part
Polynomial | \(O(n^k)\), \(k > 2\) | Most problems that can be solved "efficiently"
Exponential | \(O(c^n)\), \(c > 1\) | Using a truth table to determine the logical equivalence of two compound propositions
Factorial | \(O(n!)\) | Solving the travelling salesman problem by brute force
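To make the logarithmic row concrete, here is a minimal sketch of binary search in Python (the function name and test values are illustrative, not from the text): each iteration halves the remaining portion of the sorted array, so roughly \(\log_2 n\) iterations suffice.

```python
def binary_search(sorted_values, target):
    """Return the index of target in sorted_values, or -1 if absent.

    Each loop iteration halves the search interval, so the loop runs
    about log2(n) times: O(log n) in the worst case.
    """
    low, high = 0, len(sorted_values) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_values[mid] == target:
            return mid
        elif sorted_values[mid] < target:
            low = mid + 1             # discard the lower half
        else:
            high = mid - 1            # discard the upper half
    return -1


print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1
```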

To quickly determine the worst-case time complexity of your algorithm, take its runtime function \(T(n)\), discard any lower-order terms and constant factors, and keep the dominant term.
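Here is a small worked example (the polynomial is illustrative). If counting atomic operations gives \(T(n) = 3n^2 + 5n + 7\), then the lower-order terms \(5n\) and \(7\) and the constant factor \(3\) are discarded, leaving the dominant term \(n^2\):

\[
T(n) = 3n^2 + 5n + 7 \in O(n^2).
\]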

Here are a few more simplification rules you may find helpful (a worked example follows the list):

  • If \(f \in O(h)\) and \(g \in O(h)\), then \(f + g \in O(h).\)
  • If \(f \in \Omega(h)\) or \(g \in \Omega(h)\), then \(f + g \in \Omega(h).\)
  • If \(f \in O(g)\) and \(c \in \mathbb R^+\), then \(c \cdot f \in O(g).\)
  • If \(f \in \Omega(g)\) and \(c \in \mathbb R^+\), then \(c \cdot f \in \Omega(g).\)
  • If \(f \in O(g)\) and \(g \in O(h)\), then \(f \in O(h).\)
  • If \(f \in \Omega(g)\) and \(g \in \Omega(h)\), then \(f \in \Omega(h).\)
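As a small worked example (the functions chosen here are illustrative): suppose one phase of an algorithm performs \(f(n) = 2n \log n\) operations and another performs \(g(n) = 7n\). The constant-factor rules let us ignore the \(2\) and the \(7\), and since \(f \in O(n \log n)\) and \(g \in O(n \log n)\), the first rule gives

\[
f(n) + g(n) = 2n \log n + 7n \in O(n \log n).
\]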

If an algorithm's worst-case scenario rarely occurs, worst-case analysis can produce overly pessimistic time complexities. That's why it's sometimes more reasonable to look at the average-case scenario instead.
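A small worked example (the uniformity assumption is ours, not from the text): for linear search on an array of \(n\) elements, the worst case is \(n\) comparisons, but if the target is equally likely to be at any of the \(n\) positions, the expected number of comparisons is

\[
\frac{1}{n}\sum_{i=1}^{n} i = \frac{n+1}{2},
\]

which is still \(O(n)\), just with a smaller constant factor than the worst case.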
