Ulysse Réglade: An introduction to quantum computing

Over the last few years, the field of quantum computation has exploded. While the first quantum algorithms were theorized during the last century, it is only during the last decade that the scientific community has come to believe that implementing them is within reach.

Indeed, thanks to technological breakthroughs, the qubit coherence time has grown from a few nanoseconds to a few seconds. Intel, IBM, and Google are investing heavily in the development of superconducting quantum chipsets, and one can easily understand why. The fields of application seem endless: cryptography, protein-folding calculations for the medical industry, machine learning...

In this presentation, I will give a quick overview of the technology behind quantum computing, and then introduce the basics of how a quantum algorithm works. Such an algorithm is inherently parallel at the hardware level, and one can imagine using it to perform time-efficient fluid-dynamics computations.
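To give a flavor of this hardware-level parallelism, here is a minimal statevector sketch (an illustration of the standard textbook idea, not material from the presentation itself): applying a Hadamard gate to each of n qubits starting in |0...0⟩ puts the register into an equal superposition of all 2**n basis states, which is the parallelism a quantum algorithm then exploits.

```python
import numpy as np

# Single-qubit Hadamard gate.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

def uniform_superposition(n):
    """Statevector of n qubits after applying H to each qubit of |0...0>.

    Built as the tensor (Kronecker) product of n copies of H|0>.
    """
    ket0 = np.array([1.0, 0.0])  # single qubit in state |0>
    state = np.array([1.0])      # empty register (scalar amplitude)
    for _ in range(n):
        state = np.kron(state, H @ ket0)
    return state

state = uniform_superposition(3)
# All 2**3 = 8 basis states carry the same amplitude 1/sqrt(8),
# and the squared amplitudes sum to 1 (a valid quantum state).
print(state)
```

A classical simulation like this needs memory exponential in n, which is precisely why the same computation is attractive on quantum hardware.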

Published 16 Aug. 2018 14:10 - Last modified 16 Aug. 2018 14:10