We adopt the beam-splitter model for losses to analyze the performance of a recent compact continuous-variable entanglement distillation protocol implemented with realistic quantum memories. We show that the decoherence undergone by a two-mode squeezed state while stored in a quantum memory can strongly modify the outcome of the preparatory step of the protocol. We find that the well-known method for locally increasing entanglement, phonon subtraction, may not yield an entanglement gain once losses are taken into account. We therefore investigate the critical number m_c of phonon-subtraction attempts from the matter modes of the quantum memory: if the initial state is not de-Gaussified within m_c attempts, the protocol should be restarted to obtain any entanglement increase. Moreover, the condition m_c > 1 imposes an additional constraint on the transmissivity of the subtraction beam-splitter interaction, viz., it should be about 50% for a wide range of protocol parameters. Additionally, we consider the average entanglement rate, which accounts for both the unavoidably probabilistic nature of the protocol and its possible failure after a large number of unsuccessful subtraction attempts. We find that a higher average entanglement can be achieved by increasing the transmissivity of the subtraction beam-splitter interaction. We conclude that the compact distillation protocol, subject to the practical constraints imposed by realistic quantum memories, admits a feasible experimental realization with existing technologies.
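The beam-splitter loss model invoked above can be illustrated with a short numerical sketch. The snippet below (an illustration, not the paper's actual computation) builds the covariance matrix of a two-mode squeezed vacuum with squeezing parameter r, applies equal beam-splitter loss of transmissivity eta to each mode via the standard map sigma -> eta*sigma + (1-eta)*I, and evaluates the logarithmic negativity from the smallest symplectic eigenvalue of the partially transposed state. The function name and the choice of equal loss on both modes are assumptions for the sake of the example; the vacuum-variance-one convention is used throughout.

```python
import numpy as np

def log_negativity(r, eta):
    """Logarithmic negativity of a two-mode squeezed vacuum after
    equal beam-splitter loss (transmissivity eta) on each mode.
    Convention: vacuum quadrature variance = 1."""
    I2 = np.eye(2)
    Z = np.diag([1.0, -1.0])
    # Covariance matrix blocks of the lossy two-mode squeezed vacuum:
    # sigma -> eta * sigma + (1 - eta) * I  (beam-splitter loss model)
    A = (eta * np.cosh(2 * r) + (1 - eta)) * I2   # local block, both modes
    C = eta * np.sinh(2 * r) * Z                  # cross-correlation block
    sigma = np.block([[A, C], [C, A]])
    # Partial transposition flips the sign of det C in the invariant:
    delta = 2 * np.linalg.det(A) - 2 * np.linalg.det(C)
    nu_minus = np.sqrt((delta - np.sqrt(delta**2 - 4 * np.linalg.det(sigma))) / 2)
    # Entangled iff nu_minus < 1; E_N = max(0, -log2(nu_minus))
    return max(0.0, -np.log2(nu_minus))
```

For eta = 1 this reproduces the lossless value E_N = 2r / ln 2, and as eta decreases the entanglement degrades monotonically, vanishing at eta = 0, which is the qualitative effect of memory decoherence discussed in the abstract.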
Journal: Physical Review A - Atomic, Molecular, and Optical Physics
Publication status: Published - 11 Oct 2013