Atomic operation

An atomic operation (ao) is a simple command, such as an assignment or an arithmetic operation, that takes a constant amount of time to execute regardless of the input. The runtime of an algorithm is determined by the total number of atomic operations it executes.
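For instance, each simple statement below could be counted as one atomic operation. (This Python sketch, like the counting convention in its comments, is just an illustration; the examples that follow use pseudocode.)

x = 5           # assignment: 1 ao
y = x + 2       # arithmetic plus assignment, counted here as 1 ao
is_big = y > 3  # comparison plus assignment, counted here as 1 ao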

For algorithms with compound commands, like loops, the number of atomic operations executed is often a function of the input size; that function is the algorithm's time complexity. Here's an example of how time complexity can be obtained by counting up atomic operations:

result ← 0 (1 ao)
for i = 1 to n (runs n times)
    result ← result + i (1 ao)
    result ← result * i (1 ao)

By recognizing that the loop's block of \(2\) atomic operations runs \(n\) times, and adding the \(1\) assignment outside the loop, we get a time complexity of \(T(n) = 2n + 1 \in O(n).\)
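To check that count empirically, here's a minimal Python sketch of the same algorithm. The ops counter is an addition of mine for illustration, not part of the algorithm itself.

def count_ops(n):
    ops = 0
    result = 0                 # 1 ao
    ops += 1
    for i in range(1, n + 1):  # body runs n times
        result = result + i    # 1 ao
        result = result * i    # 1 ao
        ops += 2
    return ops

print(count_ops(10))  # prints 21, matching T(10) = 2(10) + 1

Now, try this example: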

for i = 1 to n (runs n times)
    for j = 1 to n (runs n times)
        x ← j (1 ao)

This can be tricky. Let's look at just the inner loop. By itself, it runs \(1\) atomic operation \(n\) times. However, since it is nested in a loop that also runs \(n\) times, the total number of atomic operations executed is \(n \cdot n\), or \(n^2.\) Therefore, the time complexity \(T(n) = n^2 \in O(n^2).\)
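The same counting trick in Python (again with an assumed ops counter) confirms the quadratic growth:

def count_nested_ops(n):
    ops = 0
    for i in range(1, n + 1):      # outer loop: n times
        for j in range(1, n + 1):  # inner loop: n times per outer pass
            x = j                  # 1 ao
            ops += 1
    return ops

print(count_nested_ops(10))  # prints 100, matching T(10) = 10^2

Try this last example: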

for i = 1 to n (runs n times)
    x ← i (1 ao)
for j = 1 to n (runs n times)
    x ← j (1 ao)

The time complexity here is not \(n^2.\) The second loop is not nested inside the first; the two loops run one after the other. Each loop runs \(1\) atomic operation \(n\) times, so each executes \(n\) atomic operations, which in total is \(n + n\), or \(2n.\) So then, the time complexity \(T(n) = 2n \in O(n).\)
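One more Python sketch (with the same hypothetical ops counter) makes the contrast with the nested case concrete:

def count_sequential_ops(n):
    ops = 0
    for i in range(1, n + 1):  # first loop: n times
        x = i                  # 1 ao
        ops += 1
    for j in range(1, n + 1):  # second loop: another n times
        x = j                  # 1 ao
        ops += 1
    return ops

print(count_sequential_ops(10))  # prints 20, matching T(10) = 2(10)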

To optimize an algorithm, you'll want to reduce its total number of atomic operations. This can be done in two ways. First, you can remove a few unnecessary lines of code, which makes the algorithm a little faster. Second, you can rethink the problem and come up with an entirely new approach that lowers the order of the algorithm's time complexity. If you manage to accomplish this, a dramatic amount of time can be saved for large inputs. However, if you're exclusively dealing with small inputs, the optimization may not be worth the effort.
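As a sketch of that second kind of optimization, consider summing the integers from \(1\) to \(n\). The loop below executes on the order of \(n\) atomic operations, while Gauss's closed-form formula \(n(n+1)/2\) computes the same value in a constant number of operations. (The function names are hypothetical, chosen for illustration.)

def sum_loop(n):
    # O(n): roughly one addition and one assignment per iteration
    total = 0
    for i in range(1, n + 1):
        total = total + i
    return total

def sum_formula(n):
    # O(1): a fixed handful of atomic operations, no matter how large n is
    return n * (n + 1) // 2

print(sum_loop(1000) == sum_formula(1000))  # True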
