The first quantum speedup — one query instead of two
Can you determine a global property of a function (constant or balanced?) by evaluating it only once, when classically you need two evaluations?
Deutsch’s algorithm (1985, refined by Deutsch and Jozsa in 1992) was the first proof that a quantum computer can solve a specific problem with strictly fewer queries than any classical algorithm. The problem it solves is simple: given a black-box function f(x) on one bit, decide whether f is constant (f(0)=f(1)) or balanced (f(0)≠f(1)). Classically, you must query f twice. Deutsch’s algorithm does it with one query. The speedup is modest, but the principle — encoding the answer in phase and extracting it through interference — is the blueprint for every quantum algorithm that followed.
The trick has three parts. First, prepare an ancilla in |−⟩ so that the oracle kicks its answer into phase rather than flipping a visible bit. Second, put the input in superposition so both f(0) and f(1) are queried simultaneously — but the result is encoded in relative phase, not in two separate answers. Third, apply Hadamard to convert the phase difference into a definite measurement outcome: |0⟩ means constant, |1⟩ means balanced.
The oracle does not return f(x) as a bit you read. It encodes f(x) as a phase (±1) via kickback. Interference then converts the global property (constant vs balanced) into a single deterministic bit.
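A minimal numpy sketch of the kickback mechanism (the helper name `kicked_phase` is illustrative): with the target qubit in |−⟩, a CNOT oracle leaves the state unchanged except for a sign that depends on the control, so the oracle's answer lands in the phase rather than in a readable bit.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
minus = H @ np.array([0.0, 1.0])   # |-> = (|0> - |1>)/sqrt(2)

# A CNOT oracle (basis order |q0 q1>, control q0, target q1):
# it XORs the target with the control, i.e. y -> y XOR x.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def kicked_phase(x):
    ctrl = np.zeros(2); ctrl[x] = 1.0      # |x>
    before = np.kron(ctrl, minus)          # |x>|->
    after = CNOT @ before                  # oracle call
    return int(round(after @ before))      # overlap with the input: +1 or -1

print(kicked_phase(0), kicked_phase(1))    # 1 -1
```

The state is unchanged up to a sign, and that sign carries the oracle's answer — exactly the phase encoding the text describes.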
Apply X then H to qubit 1. This creates |−⟩ = (|0⟩−|1⟩)/√2, the −1 eigenstate of X, which enables phase kickback.
Apply H to qubit 0. Now the input queries both x=0 and x=1 simultaneously.
The oracle acts on both branches at once; via phase kickback, branch |x⟩ picks up the factor (−1)^{f(x)}. If f is balanced, the two branches get opposite signs. If f is constant, both branches get the same sign.
Hadamard converts the phase pattern into a definite computational basis state. Same phases → |0⟩ (constant). Opposite phases → |1⟩ (balanced).
Read qubit 0. The result is f(0)⊕f(1): 0 means constant, 1 means balanced. One query determined a global property.
Measuring |0⟩ on qubit 0 means f is constant (f(0)=f(1)). Measuring |1⟩ means f is balanced (f(0)≠f(1)). The result is deterministic, not probabilistic.
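The full circuit can be simulated in a few lines of numpy. The sketch below (function names are illustrative, not from any quantum library) follows the steps above exactly: prepare the ancilla in |−⟩, put the input in superposition, apply the oracle once, interfere, and read qubit 0.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
I2 = np.eye(2)

def oracle(f):
    """U_f |x, y> = |x, y XOR f(x)>; basis index = 2*x + y (qubit 0 = input x)."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    return U

def deutsch(f):
    state = np.zeros(4)
    state[0] = 1.0                        # start in |00>
    state = np.kron(I2, H @ X) @ state    # qubit 1: X then H  ->  |0>|->
    state = np.kron(H, I2) @ state        # qubit 0: H         ->  |+>|->
    state = oracle(f) @ state             # one query; branch x gets (-1)^f(x)
    state = np.kron(H, I2) @ state        # interference
    p1 = state[2] ** 2 + state[3] ** 2    # probability qubit 0 measures 1
    return 1 if p1 > 0.5 else 0           # 0 = constant, 1 = balanced

# One oracle call per function, deterministic result:
for name, f in [("const0", lambda x: 0), ("const1", lambda x: 1),
                ("identity", lambda x: x), ("negation", lambda x: 1 - x)]:
    print(name, deutsch(f))               # constants -> 0, balanced -> 1
```

The measurement probability comes out exactly 0 or 1, confirming that the answer is deterministic, not statistical.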
The oracle U_f maps |x,y⟩ → |x, y⊕f(x)⟩. With the ancilla in |−⟩, the action becomes |x⟩|−⟩ → (−1)^{f(x)}|x⟩|−⟩ (phase kickback). Applying this to the input in |+⟩ gives: if f is constant, the phase is the same on both branches, leaving |+⟩. If f is balanced, the phases differ, creating |−⟩. A final Hadamard maps |+⟩→|0⟩ (constant) or |−⟩→|1⟩ (balanced). One query, deterministic answer.
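In display form, the same derivation reads as one line of state evolution (the final global phase ± is unobservable):

```latex
\begin{align*}
|0\rangle|0\rangle
&\xrightarrow{\,I\otimes HX\,} |0\rangle|-\rangle
\xrightarrow{\,H\otimes I\,} \tfrac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr)|-\rangle \\
&\xrightarrow{\,U_f\,} \tfrac{1}{\sqrt{2}}\bigl((-1)^{f(0)}|0\rangle + (-1)^{f(1)}|1\rangle\bigr)|-\rangle \\
&= (-1)^{f(0)}\,\tfrac{1}{\sqrt{2}}\bigl(|0\rangle + (-1)^{f(0)\oplus f(1)}|1\rangle\bigr)|-\rangle
\xrightarrow{\,H\otimes I\,} \pm\,|f(0)\oplus f(1)\rangle\,|-\rangle .
\end{align*}
```

Factoring out (−1)^{f(0)} makes explicit that only the relative phase (−1)^{f(0)⊕f(1)} survives, which is why the measurement yields f(0)⊕f(1) directly.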
Classically, you must evaluate f(0) and f(1) separately and compare. Two queries are always required, regardless of strategy.
Quantumly, superposition lets you evaluate both inputs at once, and interference converts the global property (same or different) into a single measurable bit. One query suffices.
Deutsch’s algorithm does not compute f(0) and f(1) and compare them. It never learns the individual values. It directly determines the global property (constant vs balanced) through interference, without ever extracting f(0) or f(1) separately.
Deutsch-Jozsa generalizes this to n-bit functions: one query vs 2^{n−1}+1 classical queries. The same phase-kickback-then-interference pattern scales up to Bernstein-Vazirani, Simon’s algorithm, and eventually Shor’s factoring algorithm.
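The n-bit generalization can be sketched compactly because, with the ancilla in |−⟩, the oracle's only effect is to multiply the amplitude of each |x⟩ by (−1)^{f(x)} — so we can simulate the input register alone. A sketch under that assumption (`deutsch_jozsa` is an illustrative name):

```python
import numpy as np

def deutsch_jozsa(f, n):
    """Simulate only the n-qubit input register; the oracle with a |->
    ancilla just flips the sign of amplitude x when f(x) = 1."""
    N = 2 ** n
    amps = np.full(N, 1 / np.sqrt(N))                        # H^n |0...0>
    amps = amps * np.array([(-1) ** f(x) for x in range(N)]) # phase kickback
    # After the final H^n, the amplitude of |0...0> is (1/sqrt(N)) * sum of
    # all amplitudes: +-1 if f is constant, exactly 0 if f is balanced.
    amp_all_zero = amps.sum() / np.sqrt(N)
    return "constant" if abs(amp_all_zero) > 0.5 else "balanced"

print(deutsch_jozsa(lambda x: 1, 3))                         # constant
print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))     # balanced (parity)
```

Measuring all zeros on the input register means constant; any other outcome means balanced. One query replaces the 2^{n−1}+1 queries a deterministic classical algorithm needs in the worst case.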
In Deutsch’s algorithm, the oracle is implemented as a CNOT. What type of function does this CNOT oracle represent?