Overview

Emerging quantum technologies play a key role in future developments of information transfer, processing and storage, and may revolutionise the way we treat information. Just as classical information is not restricted to electrical signals, quantum information can be carried by atoms, ions, molecules, photons or micro-mechanical oscillators, and can be stored in discrete or continuous variables. Processing quantum information requires not only a high degree of quantum control, but also a reliable and robust means of gauging its operational success. With the increasing complexity of the quantum devices being implemented, verification and validation become ever more indispensable, as the correct operation of such devices requires an intricate and time-consuming fine-tuning of the parameters of their components and of their alignment. To overcome the bottleneck of this tuning process, one has to find ways to automate it, and that in turn requires a proper characterisation of the system. Moreover, the ability to diagnose problems and pinpoint error mechanisms becomes highly desirable. In this context, quantum tomography of states, processes and detectors plays a central role as a means of characterising a quantum device, both as a diagnostic tool and as a component in quantum control setups.


In recent years, a variety of statistical tools have been developed that are capable not only of reconstructing quantum states, processes and detectors accurately, but also of providing statistical information about the reliability of the reconstructions. Most notably, engineering methods such as Kalman filtering have been adapted for quantum tomographic purposes, and tools from entanglement theory provide means of fast reconstruction of quantum states of many-constituent systems whose accessible Hilbert space had previously been thought too vast to make tomography viable. This set of tools now paves the way for statistically reliable tomographic reconstruction of any component of a quantum engineering device, largely independent of its physical implementation and its complexity.


Aims

The school aims to provide an overview of the very latest theoretical and experimental developments in quantum tomographic reconstruction, with an emphasis on the processing of real data. The lectures will cover a variety of both established and novel methods, such as maximum-likelihood methods, Kalman filtering and approximation methods for the efficient reconstruction of quantum-mechanical many-body systems. The aim is to bring together researchers in the fields of quantum information processing and quantum engineering from all over Europe and to give them the opportunity to gain vital experience in statistical methods for the verification and validation of their experimental efforts.
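To give a flavour of the maximum-likelihood methods mentioned above, the following is a minimal, illustrative sketch of iterative maximum-likelihood state tomography for a single qubit, using simulated Pauli-basis measurement data. It is not school material: the state, sample size and iteration count are arbitrary choices, and the fixed-point iteration shown (often called the RρR algorithm) is just one standard variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def projectors(P):
    """Projectors onto the +1 and -1 eigenstates of a Pauli operator."""
    return (I + P) / 2, (I - P) / 2

# POVM: measure X, Y or Z, each chosen with probability 1/3 -> six weighted effects
effects = [Pi / 3 for P in (X, Y, Z) for Pi in projectors(P)]

# "True" state to be reconstructed: a slightly mixed |+> state (illustrative choice)
plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
rho_true = 0.9 * (plus @ plus.conj().T) + 0.1 * I / 2

# Simulate N measurement outcomes and record relative frequencies
N = 10000
probs = np.real([np.trace(E @ rho_true) for E in effects])
freqs = rng.multinomial(N, probs) / N

# Iterative maximum-likelihood reconstruction: rho <- R rho R / Tr(...),
# with R = sum_k (f_k / p_k) E_k; the maximally mixed state is the seed
rho = I / 2
for _ in range(200):
    p = np.real([np.trace(E @ rho) for E in effects])
    R = sum(f / pk * E for f, pk, E in zip(freqs, p, effects))
    rho = R @ rho @ R
    rho /= np.trace(rho)

# Overlap with the true state as a rough quality measure
overlap = np.real(np.trace(rho @ rho_true))
print(np.round(overlap, 3))
```

The iteration preserves positivity and unit trace by construction, which is the main practical advantage of maximum-likelihood reconstruction over naive linear inversion, where finite statistics can produce unphysical (negative-eigenvalue) estimates.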


The selection of invited speakers ensures that the full variety of theoretical methods and experimental implementations is represented at the school. Each set of lectures will be accompanied by exercise classes in which real data from quantum tomography experiments will be used to demonstrate the various tools. Participants are encouraged to bring their own experimental data, with additional data sets being provided by the school organisers.


Intended audience

The school is targeted at students and researchers of all levels, but with a strong emphasis on early-stage researchers (Diploma, Master's and PhD students). Participants' main scientific interest should lie in the realm of implementations of quantum information processing and quantum engineering, whether experimental or theoretical. The structure of the lectures requires a solid knowledge of quantum mechanics and basic notions of statistics and quantum information.