
Granted Scholarships

In the first scholarship calls, a total of 21 PhD and 17 Postdoc applications were received. 12 of these were granted, and you can read more about them below.

A new call is now open, with a deadline of 7 January 2025.

Read more about Grants and Funding here

Research projects by PhD Scholarship recipients

University of Southern Denmark

Jacob Kongsted, Peter Reinholdt, and Erik Rosendahl Kjellgren

Quantum chemistry is one of the scientific disciplines foreseen to be significantly impacted by the introduction of quantum computers. However, several challenges need to be overcome before quantum chemistry can truly benefit from the opportunities brought by quantum computers. In this project we will focus on one of the factors limiting real quantum chemistry applications on quantum computers, namely the lack of efficient algorithms for error mitigation. Based on our recent algorithmic development of quantum chemical computing approaches to the calculation of molecular properties, we suggest a novel and efficient error mitigation strategy designed for current noisy intermediate-scale quantum (NISQ) computers. Recently, we performed a pilot implementation of our novel approach to error mitigation (unpublished), and very encouraging initial results showed that our approach leads to superior performance compared to other related error mitigation strategies. The current research project will investigate several aspects of this novel approach, including robustness, scalability, performance of the error mitigation strategy across different quantum computing architectures, and synergistic effects obtained from combining our suggested error mitigation strategy with other previously published approaches. Our goal is to establish the suggested error mitigation approach as the "standard" error mitigation strategy broadly used for quantum computing chemistry, thereby making a quantum leap in realizing accurate and scalable quantum chemical computing based on the opportunities brought by quantum computing.
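
As an illustration of what an error mitigation strategy does, the sketch below shows a generic, well-known technique (zero-noise extrapolation) applied to hypothetical noisy expectation values; it is not the project's own method, and all numbers are assumed.

import numpy as np

def zero_noise_extrapolate(noise_factors, expectation_values, degree=1):
    # Fit a polynomial to (noise factor, expectation value) pairs and
    # evaluate it at zero noise to estimate the noiseless result.
    coeffs = np.polyfit(noise_factors, expectation_values, degree)
    return np.polyval(coeffs, 0.0)

# Hypothetical energies measured at artificially amplified noise levels.
factors = [1.0, 2.0, 3.0]            # noise amplification factors
energies = [-1.10, -1.02, -0.94]     # noisy expectation values (a.u.)
print(zero_noise_extrapolate(factors, energies))  # estimate at zero noise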

Aalborg University

Kim Guldstrand Larsen, Christian Schilling, and Max Tschaikowski

We study the problem of equivalence checking between two quantum circuits. This scenario is motivated by the compilation process of a high-level quantum circuit down to a low-level quantum circuit that can be implemented on a concrete quantum computer. Since current quantum computers are severely limited, compilers apply aggressive optimizations, for instance to minimize the circuit depth. It is evident that these optimizations must preserve the functional equivalence of the circuit. However, the problem of deciding equivalence between quantum circuits is known to be hard.
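
As a minimal illustration of the equivalence-checking problem (a brute-force check, not the TDD-based approach proposed below), two circuits are functionally equivalent if their unitaries agree up to a global phase; the exponential cost of this naive check is exactly why better data structures are needed.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def circuit_unitary(gates):
    # Multiply single-qubit gates in the order they are applied.
    U = np.eye(2)
    for g in gates:
        U = g @ U
    return U

def equivalent(U, V, tol=1e-9):
    # U and V are equivalent if U = exp(i*phi) * V for some phase phi.
    phase = np.vdot(V.flatten(), U.flatten())  # proportional to Tr(V^dagger U)
    if abs(phase) < tol:
        return False
    return np.allclose(U, (phase / abs(phase)) * V, atol=tol)

# H Z H implements X, so these two "circuits" are equivalent.
print(equivalent(circuit_unitary([H, Z, H]), circuit_unitary([X])))  # True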

In this project, we will investigate the problem from a new angle based on tensor decision diagrams (TDDs). TDDs promise to combine the strengths of two highly successful research directions from the literature: tensor networks, which offer low-dimensional reasoning, and decision diagrams, which offer efficient encoding. However, to apply TDDs to equivalence checking, we must overcome several challenges. First, we need an efficient implementation with strong parallelization to effectively make use of high-performance computers. Second, the main operation on TDDs - contraction - needs to be executed in a certain order following a so-called contraction plan. While each such plan will lead to the same result, different plans lead to different intermediate representations, some of which may become prohibitively large. Thus, it is crucial to select an efficient contraction plan. However, it is currently unknown how this choice can be made. To this end, we will employ machine learning to find efficient contraction plans for TDDs. Finally, we will demonstrate the developed tools in case studies certifying quantum compilations.
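
The role of the contraction plan can be seen already in a plain tensor (matrix) contraction, sketched below with assumed shapes: both orders give the same result, but one builds a much larger intermediate and costs roughly ten times as many multiplications. This illustrates the generic phenomenon and is not TDD-specific code.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 100))   # contract A_ij B_jk C_kl over j and k
B = rng.standard_normal((100, 5))
C = rng.standard_normal((5, 50))

# Plan 1: (A B) C -> intermediate of shape (10, 5), about 7,500 multiplications.
plan1 = (A @ B) @ C
# Plan 2: A (B C) -> intermediate of shape (100, 50), about 75,000 multiplications.
plan2 = A @ (B @ C)

print(np.allclose(plan1, plan2))  # True: same result, very different cost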

Overall, this project will accelerate the validation of quantum compilers, which has immediate benefits for quantum computing across application domains.

University of Copenhagen

Matthias Christandl

Quantum communication through noisy quantum channels is possible with the help of encoding and decoding procedures. The design of the encoding and decoding procedures falls under the field of quantum Shannon theory. In quantum Shannon theory, it is often assumed that the encoding and decoding procedures can be implemented with noise-free gates. However, this assumption does not hold true in practice, as quantum computers are inherently noisy. Recently, Christandl and Müller-Hermes (Christandl and Müller-Hermes, IEEE Trans. Inf. Th. 70, 282 (2024)), and later together with Belzig (Belzig, Christandl and Müller-Hermes, IEEE Trans. Inf. Th. (2024)), established a high-level theory of fault-tolerant quantum communication, which also accounts for the noise in the realization of encoders and decoders. This has been done by combining fault-tolerant computation with quantum Shannon theory, with the help of so-called "interfaces". This PhD thesis aims to extend fault-tolerant quantum communication to practical codes, such as surface and colour codes, which are promising for fault-tolerant quantum computing due to their planar layout with nearest-neighbour connectivity. The fault-tolerant communication will be analysed against practical noise models such as amplitude and phase damping, and coherent errors. Finally, some fundamental questions will also be explored: for example, what is the threshold value of fault-tolerant quantum communication with practical codes? What are the fault-tolerant capacity and the rate of communication? Is it possible to achieve the threshold in a uniform way, such that it does not depend on the quantum communication channel?
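
For concreteness, the sketch below applies one of the practical noise models mentioned above, the amplitude damping channel, to a single-qubit density matrix via its Kraus operators; the damping parameter is an assumed example value, and the code only illustrates the noise model, not the fault-tolerant constructions studied in the project.

import numpy as np

def amplitude_damping(rho, gamma):
    # Apply the channel rho -> K0 rho K0^dagger + K1 rho K1^dagger.
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]])
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# The excited state |1><1| partially decays towards the ground state |0><0|.
rho_excited = np.array([[0, 0], [0, 1]], dtype=complex)
print(amplitude_damping(rho_excited, gamma=0.1))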

University of Copenhagen

Michael James Kastoryano

The development of practical, fault-tolerant quantum algorithms remains the critical challenge for the success of the field. The Quantum Metropolis (qMet) algorithm, which my collaborators and I have recently developed, stands as an especially promising candidate in this direction. By engineering the preparation of thermal states of quantum many-body Hamiltonians, qMet mirrors the classical Metropolis algorithm's utility in simulating materials, molecules, and optimization tasks. Previous attempts at quantum Metropolis algorithms faced significant challenges, lacking mathematical rigor and practicality. Our work overcomes these hurdles, providing a foundation for practical applications.

This PhD project aims to delve into qMet's practical applications, focusing on resource estimation, simulations of materials, and optimization tasks. The Metropolis algorithm has been pivotal in simulating complex systems across various disciplines, leveraging system configurations to explore statistical distributions. The quantum version, qMet, promises similar revolutionary impacts, especially in quantum simulations and potentially in optimization and quantum inference.
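
For readers unfamiliar with the classical algorithm that qMet generalizes, the sketch below samples a one-dimensional Ising chain with the textbook Metropolis rule; all parameters are illustrative, and the code has no direct connection to the quantum algorithm itself.

import numpy as np

def metropolis_ising(n_spins=50, beta=1.0, n_steps=10_000, seed=0):
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=n_spins)
    for _ in range(n_steps):
        i = rng.integers(n_spins)
        # Energy change from flipping spin i (periodic boundary conditions).
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
        # Accept the flip with the Metropolis probability min(1, exp(-beta*dE)).
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i] *= -1
    return spins

print(metropolis_ising().mean())  # magnetization of one thermal sample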

The project will tackle key challenges such as resource estimation for practical quantum advantage and exploring qMet's efficacy in simulating quantum many-body systems and optimizing classical problems. The feasibility of qMet for simulating ground states and its application in thermal annealing and dynamical systems will also be investigated. This includes a novel approach to optimization that leverages quantum dynamics, aiming to surpass classical and quantum annealer efficiency.

University of Copenhagen

Laura Mancinska

Learning properties of physical systems is an essential task across various scientific disciplines. Given that quantum states cannot be observed directly, understanding their properties introduces unique challenges. The core focus of this project is on creating new quantum subroutines and algorithms to tackle three fundamental quantum learning problems: tomography, spectrum estimation, and purification. For tomography and spectrum estimation, our objective is to derive a classical description. In contrast, for the purification problem, we aim to produce a purified version of a quantum state. Our three focus problems are well known and studied. A key quantity of interest is the number of copies of the quantum state needed to learn the desired quantity. A novel aspect of our project is the investigation of these three problems within a model that allows for the retention of quantum memory across different measurements, a concept that, despite its natural appeal, has been underexplored, partly because proving lower bounds in this context is challenging. Another reason this model has remained less explored could be that in current experiments it is challenging to perform mid-circuit measurements. The aim of this research is to understand and quantify the benefits of retaining quantum memory between measurements and to explore the trade-offs between sample and memory complexity in quantum algorithms. This could pave the way for new, more efficient algorithms for quantum learning tasks, with broader implications for quantum computing and information theory.
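
The simplest of the three learning tasks, tomography, can be sketched for a single qubit as below: Pauli expectation values are estimated from a finite number of copies and combined by linear inversion. The state, the copy counts, and the measurement simulation are all assumed for illustration; the project's interest lies in far more general settings with quantum memory between measurements.

import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def estimate_pauli(rho, P, n_copies, rng):
    # Simulate n_copies single-shot +/-1 measurements of the Pauli P.
    p_plus = (1 + np.real(np.trace(rho @ P))) / 2
    outcomes = rng.choice([1, -1], size=n_copies, p=[p_plus, 1 - p_plus])
    return outcomes.mean()

def tomography(rho, n_copies_per_basis, seed=0):
    rng = np.random.default_rng(seed)
    ex = {name: estimate_pauli(rho, P, n_copies_per_basis, rng)
          for name, P in (("X", X), ("Y", Y), ("Z", Z))}
    # Linear inversion: rho = (I + <X> X + <Y> Y + <Z> Z) / 2.
    return (I + ex["X"] * X + ex["Y"] * Y + ex["Z"] * Z) / 2

rho_true = np.array([[0.7, 0.3], [0.3, 0.3]], dtype=complex)  # example state
print(tomography(rho_true, n_copies_per_basis=5000))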

University of Copenhagen

Albert Helmut Werner and Daniel Malz

Motivated by branch prediction in classical computing, this PhD project explores branch prediction in the context of quantum computing. Every modern computer chip contains branch predictors - circuits that try to guess the outcome of conditional statements and switches (i.e., which branch of the computation must be done next). This allows the chip to pre-load larger chunks of the program. If the prediction was correct, this can save a tremendous amount of time. If it was wrong, the computed result is discarded, and the actual branch is executed. In Rydberg tweezer arrays, currently perhaps the most advanced quantum computing platform, there is a significant difference (about an order of magnitude) between gate times and measurement times. Since operations after a measurement depend on the measurement outcome, the computer must idle for significant amounts of time until the measurement result becomes available and the computation can proceed, which introduces errors. A simple back-of-the-envelope calculation shows that branch prediction may lead to significant speed-ups, but there are many open questions, both of a practical and a fundamental nature, which are the topic of the proposed PhD project. The first goal will be to study this new paradigm in fundamental quantum circuits: those used for active error correction and for the quantum singular value transformation. Both are fundamental operations that employ measurements and subsequent conditional operations and run extensively as subroutines of larger algorithms such as phase estimation. Thus, speeding up these routines would lead to large and generally applicable benefits. The goal will be to establish a theory of quantum branch prediction, complete with the basic principle, key examples, and resource estimates on realistic platforms.
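
The back-of-the-envelope argument can be made concrete with a toy latency model, sketched below; the gate time, measurement time, branch length, and prediction accuracy are all assumed, illustrative numbers rather than platform specifications.

def expected_branch_time(t_gate, t_meas, gates_per_branch, p_correct):
    # Expected wall-clock time spent per conditional branch.
    t_branch = gates_per_branch * t_gate
    # Without prediction: always wait for the measurement, then execute.
    no_prediction = t_meas + t_branch
    # With prediction: a correct guess overlaps the branch with the wait;
    # a wrong guess pays the wait plus a re-execution of the right branch.
    with_prediction = (p_correct * max(t_meas, t_branch)
                       + (1 - p_correct) * (max(t_meas, t_branch) + t_branch))
    return no_prediction, with_prediction

# Assumed numbers: 1 us gates, 10 us measurement, 10 gates per branch,
# 90% prediction accuracy.
no_pred, pred = expected_branch_time(1e-6, 10e-6, 10, 0.9)
print(f"without prediction: {no_pred * 1e6:.1f} us, with prediction: {pred * 1e6:.1f} us")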

Revised
27 Nov 2024