Publication Date
2-2003
Description
We consider a fully quantized model of spontaneous emission, scattering, and absorption, and study the propagation of a single photon from an emitting atom to a detector atom, both with and without an intervening scatterer. We find an exact quantum analogue of the classical complex analytic signal describing an electromagnetic wave scattered by a medium of charged oscillators. This quantum signal exhibits classical phase delays. We define a time of detection that, in the appropriate limits, exactly matches the predictions of a classically defined delay for light propagating through a medium of charged oscillators. The fully quantized model provides a simple, unambiguous, and causal interpretation of delays that seemingly imply speeds greater than c in the region of anomalous dispersion.
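For context on the classical benchmark invoked above: the "classically defined delay" for light in a dilute medium of charged (Lorentz) oscillators is conventionally the group delay obtained from the medium's refractive index. The display below is a standard textbook sketch, not an equation quoted from the paper; the symbols \omega_0 (resonance frequency), \gamma (damping rate), \omega_p (plasma frequency), and L (propagation distance) are illustrative labels rather than notation from this record.

\[
n(\omega) \approx 1 + \frac{\omega_p^{2}}{2\,(\omega_0^{2} - \omega^{2} - i\gamma\omega)},
\qquad
\tau_g = \frac{L}{c}\left[\operatorname{Re} n(\omega) + \omega\,\frac{d\,\operatorname{Re} n(\omega)}{d\omega}\right].
\]

Near resonance, d(\operatorname{Re} n)/d\omega < 0 (anomalous dispersion), so \tau_g can fall below L/c; this is the apparent speed greater than c that the fully quantized model of the abstract interprets causally.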
Journal
Journal of Optics B: Quantum and Semiclassical Optics
Volume
5
First Page
85
Last Page
94
Department
Physics & Astronomy
Link to Published Version
https://iopscience.iop.org/article/10.1088/1464-4266/5/1/312/meta
DOI
https://doi.org/10.1088/1464-4266/5/1/312
Recommended Citation
Purdy, Thomas; Taylor, Daniel; and Ligare, Martin. "Manifestation of classical wave delays in a fully-quantized model of the scattering of a single photon." Journal of Optics B: Quantum and Semiclassical Optics 5 (2003): 85-94.