Reliable Quantum Memories with Unreliable Components

Quantum memory systems are vital in quantum information processing for dependable storage and retrieval of quantum states. Inspired by classical reliability theories that synthesize reliable computing systems from unreliable components, we formalize the problem of reliably storing quantum information using noisy components. We introduce the notion of stable quantum memories and define the storage rate as the ratio of the number of logical qubits to the total number of physical qubits, as well as the circuit complexity of the decoder, which includes both quantum gates and measurements. We demonstrate that a strictly positive storage rate is achievable by constructing a quantum memory system from quantum expander codes. Moreover, by reducing the reliable-storage problem to reliable quantum communication, we obtain upper bounds on the achievable storage capacity. When the physical qubits are corrupted by noise satisfying hypercontractivity conditions, we derive a tighter upper bound on the storage capacity using an entropy dissipation argument. Furthermore, because the time complexity of the decoder scales non-trivially with the number of physical qubits, the noise acquires an induced dependence on the blocklength, and asymptotic rates may not be achievable. In this constrained non-asymptotic setting, we derive upper bounds on the storage capacity using finite-blocklength communication bounds. Finally, we numerically analyze the gap between the upper and lower bounds in both the asymptotic and non-asymptotic regimes, and suggest directions for tightening it.


💡 Research Summary

The paper addresses the fundamental question of whether reliable quantum memories can be built from noisy, unreliable components—qubits, gates, and measurements—by formalizing a model that incorporates both information‑theoretic and circuit‑complexity considerations. The authors introduce the notion of a stable quantum memory: a sequence of memory devices whose encoding and decoding maps are ideal, but whose internal dynamics (the “memory”) are subject to stochastic noise. They define the complexity χ of a memory as the total number of physical qubits, quantum gates, and periodic measurements required, and the storage overhead θ as the ratio χ/k where k is the number of logical qubits stored. The storage capacity Q is then the reciprocal of the smallest achievable overhead across all stable memory families.
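In symbols, and writing the infimum over all stable memory families, these definitions read:

```latex
% chi: total count of physical qubits, quantum gates, and measurements
% k:   number of logical qubits stored
\theta = \frac{\chi}{k},
\qquad
Q = \left( \inf_{\text{stable memory families}} \theta \right)^{-1}
```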

The system model consists of repeated “wait‑refresh” cycles. During a wait interval of duration τ the physical qubits decohere under a specified noise model; during a refresh interval a syndrome measurement circuit, a classical decoder, and an error‑correction block are applied. Two noise models are considered: (i) a local stochastic model with parameters (p, q), which bounds the probability of any fixed set of X‑type errors on data qubits and of measurement errors on ancillas, and (ii) independent depolarizing noise with parameter p̃. The authors explicitly include measurement errors, reflecting realistic experimental constraints.
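For reference, the single‑qubit depolarizing channel with parameter p̃ has the standard form below, and a common way to formalize (p, q) local stochastic noise is the tail bound that follows; the exact conditions used in the paper may differ in minor details such as the error type considered:

```latex
% Depolarizing channel with parameter \tilde p:
\mathcal{D}_{\tilde p}(\rho) = (1 - \tilde p)\,\rho
  + \frac{\tilde p}{3}\left( X \rho X + Y \rho Y + Z \rho Z \right)

% (p, q) local stochastic noise: for every set S of data-qubit
% locations and every set T of measurement locations,
\Pr\left[ S \subseteq \operatorname{supp}(E_{\mathrm{data}})
  \;\wedge\; T \subseteq \operatorname{supp}(E_{\mathrm{meas}}) \right]
  \le p^{|S|} \, q^{|T|}
```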

Two families of quantum error‑correcting codes are employed to construct concrete memory schemes. The first is the quantum expander code, obtained as the hypergraph product of two classical expander codes. Its parity‑check matrices have constant row weight d_A + d_B, yielding an LDPC structure with parameters [[n, k = Θ(n), d = Θ(√n)]], i.e., a constant encoding rate and a minimum distance growing as the square root of the blocklength.
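To make the hypergraph product concrete, the following NumPy sketch (illustrative only, not the authors' code) builds the CSS parity‑check matrices H_X = [H₁ ⊗ I | I ⊗ H₂ᵀ] and H_Z = [I ⊗ H₂ | H₁ᵀ ⊗ I] from two classical parity‑check matrices and verifies that the X and Z checks commute over GF(2); feeding in expander codes with check degrees d_A and d_B gives the constant row weight d_A + d_B noted above.

```python
import numpy as np

def hypergraph_product(H1: np.ndarray, H2: np.ndarray):
    """Hypergraph product of two classical parity-check matrices over GF(2).

    H1 is m1 x n1 and H2 is m2 x n2; the resulting CSS code acts on
    n1*n2 + m1*m2 physical qubits.
    """
    m1, n1 = H1.shape
    m2, n2 = H2.shape
    # X-type checks: [H1 (x) I_n2 | I_m1 (x) H2^T]
    HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                    np.kron(np.eye(m1, dtype=int), H2.T)]) % 2
    # Z-type checks: [I_n1 (x) H2 | H1^T (x) I_m2]
    HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                    np.kron(H1.T, np.eye(m2, dtype=int))]) % 2
    # CSS condition: X and Z checks must commute, i.e. HX @ HZ^T = 0 mod 2.
    assert not ((HX @ HZ.T) % 2).any()
    return HX, HZ

# Toy example: checks of the [3,1] repetition code as both classical inputs.
H = np.array([[1, 1, 0],
              [0, 1, 1]])
HX, HZ = hypergraph_product(H, H)
print(HX.shape, HZ.shape)  # (6, 13) (6, 13): 13 physical qubits
```

The commutation check passes identically for any pair of inputs, since H_X H_Zᵀ = H₁ ⊗ H₂ᵀ + H₁ ⊗ H₂ᵀ = 0 (mod 2) by construction.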

