Description
The precise quantification of the ultimate efficiency in manipulating quantum resources lies at the core of quantum information theory. However, purely information-theoretic measures fail to capture the actual computational complexity involved in performing certain tasks. In this work, we rigorously address this issue within the realm of entanglement theory, a cornerstone of quantum information science. We consider two key figures of merit: the computational distillable entanglement and the computational entanglement cost, which quantify the optimal rate of entangled bits (ebits) that can be extracted from, or are needed to prepare, many identical copies of n-qubit bipartite pure states using computationally efficient local operations and classical communication (LOCC).

We demonstrate that computational entanglement measures diverge significantly from their information-theoretic counterparts. While the von Neumann entropy captures the information-theoretic rates for pure-state transformations, we show that, under computational constraints, the min-entropy instead governs optimal entanglement distillation. Meanwhile, efficient entanglement dilution incurs a major cost, requiring a maximal (order n) number of ebits even for nearly unentangled states. Surprisingly, in the worst-case scenario, even if an efficient description of the state exists and is fully known, one gains no advantage over state-agnostic protocols. Our results reveal a stark, maximal separation between computational and information-theoretic entanglement measures.

Finally, our findings yield new sample-complexity bounds for measuring and testing the von Neumann entropy, fundamental limits on efficient state compression, and efficient LOCC tomography protocols.

https://arxiv.org/abs/2502.12284
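To pin down the entropic quantities being contrasted, the following LaTeX lines collect the standard definitions. This is a minimal sketch; the notation (|\psi\rangle_{AB} for the bipartite pure state, \rho_A for its reduced state, \lambda_{\max} for the largest eigenvalue) is assumed here rather than taken from the description above.

% Reduced state of the bipartite pure state on one side
\rho_A = \operatorname{Tr}_B |\psi\rangle\langle\psi|_{AB}

% Classic pure-state result without computational constraints:
% distillable entanglement and entanglement cost both equal the
% von Neumann entropy of the reduced state
E_D(\psi) = E_C(\psi) = S(\rho_A) = -\operatorname{Tr}[\rho_A \log \rho_A]

% Min-entropy of the reduced state, set by the largest Schmidt coefficient
H_{\min}(\rho_A) = -\log \lambda_{\max}(\rho_A)

Since H_{\min}(\rho_A) \le S(\rho_A) holds for every state, the claim that the min-entropy governs efficient distillation means the computational rate never exceeds the information-theoretic one, and, per the abstract, the gap between the two can be maximal.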