Theory of Large-Momentum-Transfer Atom Interferometry in the Quasi-Bragg Regime
- Author
- Jan-Niclas Kirsten-Siemß
- Supervised by
- Klemens Johannes Hammerer
- Abstract
Atom interferometers are versatile instruments offering great accuracy and stability, suitable for fundamental science and practical applications. In typical setups, the sensitivity of the sensor to inertial forces, including gravitational signals, scales with the spatial separation of the two atomic wave packets. Consequently, increasing this separation using large momentum transfer (LMT) promises to enhance the performance of today's devices by orders of magnitude. To date, despite several proof-of-principle experiments, only a handful of Bragg-diffraction-based LMT implementations have yielded actual metrological gain. In this thesis we therefore investigate, in two parts, the current sensitivity limits of Bragg interferometers that result from insufficient control of the atom-light interaction.

In the first part we develop an analytical theory for Bragg pulses based on the pivotal insight that the elastic scattering of atoms from time-dependent optical lattices can be accurately described using the adiabatic theorem. We show that efficient Bragg operations can be realized with any smooth pulse shape, suggesting that adiabaticity may be a necessary requirement. Moreover, we find that high-quality Gaussian pulses are exclusively adiabatic. Our model incorporates corrections to the adiabatic evolution due to Landau-Zener processes, as well as the effects of a finite atomic velocity distribution. We verify its accuracy by comparison with exact numerical descriptions of Gaussian pulses transferring four, six, eight, and ten photon recoils (ℏk). We then extend our formalism to study the rich phenomenology of Bragg interferometers, which differs markedly from that of a standard two-mode interferometer. We confirm the accuracy of our analysis through extensive numerical simulations for the example of a Mach-Zehnder interferometer. In particular, we determine the atomic projection noise limit of the interferometer and provide the means to saturate it. Furthermore, we evaluate the systematic errors intrinsic to the Bragg diffraction process, commonly known as the diffraction phase, and demonstrate their suppression by two orders of magnitude, down to a few µrad, using appropriate pulse parameters.

In the second part of this thesis, we present twin-lattice interferometry, which combines symmetric Bragg diffraction and Bloch oscillations with slowly expanding Bose-Einstein condensates. This method promises to overcome many of the constraints of previous LMT implementations, enabling unprecedented momentum separations of up to 408 ℏk in the QUANTUS-1 experiment. We model the experimental contrast decay with increasing momentum transfer and conclude that, in particular, the interaction of the atomic ensemble with a distorted laser beam leads to spatial decoherence and contrast loss. The results presented in this thesis indicate that technical imperfections currently limit the scalability of the experiment, and our theoretical analysis will be highly instrumental in the design of future sensors with momentum separations of up to one thousand photon recoils or more.
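As a minimal worked example of the scaling argument in the abstract (using the textbook Mach-Zehnder phase formula rather than any expression specific to the thesis, with generic symbols n, k, g, T):

```latex
% Leading-order gravitational phase of a Mach-Zehnder atom interferometer
% driven by n-th order Bragg pulses.
%   k   : single-photon wave number of the lattice laser
%   n   : Bragg diffraction order (2n photon recoils transferred per pulse)
%   g   : gravitational acceleration
%   T   : pulse separation time
\Phi_{\mathrm{MZ}} = k_{\mathrm{eff}}\, g\, T^{2},
\qquad k_{\mathrm{eff}} = 2 n k .
% The phase grows linearly with the transferred momentum 2n\hbar k,
% which is why large momentum transfer directly enhances the sensitivity
% to inertial signals.
```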
- Organisational unit(s)
- QUEST Leibniz Forschungsschule
- Type
- Dissertation
- Number of pages
- 214
- Publication date
- 2023
- Publication status
- Published
- Electronic version(s)
- https://doi.org/10.15488/14603 (Access: Open)