A short story of quantum and information thermodynamics

This Colloquium is a fast journey through the build-up of key thermodynamical concepts -- work, heat and irreversibility -- and how they relate to information. Born at the time of the industrial revolution to optimize the exploitation of thermal resources, these concepts have been adapted to small systems, where thermal fluctuations are predominant. Extending the framework to quantum fluctuations is a great challenge of quantum thermodynamics, one that opens exciting research lines, e.g. measurement-fueled engines or the thermodynamics of driven-dissipative systems. On a more applied side, it provides the tools to optimize the energy consumption of future quantum computers.

1 Legacy of classical thermodynamics

1.1 Macroscopic thermodynamics
Thermodynamics was developed in the XIXth century, providing a unified framework between mechanical sciences and thermometry. At the time, the motivation was very practical, namely, to use temperature to put bodies into motion - as clearly indicated by the name itself. In other words, the goal was to design and optimize thermal engines, i.e. devices that exploit the transformations of some "working substance" to convert heat into work. Work and heat are two ways to exchange energy, and from the first law of thermodynamics, there is nothing wrong in converting one into the other.
However, turning heat into work is like turning lead into gold: It has severe caveats. The most famous is Kelvin's no-go statement: It is not possible to extract work cyclically from a single hot bath. This no-go statement turned out to become one of the expressions of the second law of thermodynamics, which deals with (ir)reversibility. This is how an initially applied area of physics came to deliver fundamental concepts like entropy and the arrow of time.
As a matter of fact, the first boundary between work and heat was intimately related to the (ir)reversible nature of their exchanges. The concept of work comes from mechanical sciences, and represents a form of energy that can be exchanged reversibly: In principle, there is no time arrow associated with work exchanges - at least those associated with conservative forces. Conversely, the heat exchanges between a body and thermal baths are in general not reversible: Heat spontaneously circulates from hot to cold bodies. In particular, if a body cyclically exchanges an amount of heat Q with a hot bath of temperature T_h and −Q with a cold bath of temperature T_c, the irreversible nature of heat transfers is captured by the phenomenological inequality Q(1/T_c − 1/T_h) ≥ 0, with equality if T_c = T_h.
This observation led to defining the entropy change of a body in contact with a bath at temperature T as ∆S = Q_rev/T, where Q_rev is the amount of heat exchanged reversibly. More generally, any isothermal heat exchange follows the Clausius inequality ∆S − Q/T = ∆_i S ≥ 0, where ∆_i S is the so-called entropy production that quantifies the irreversibility of the transformation. Introducing the system's internal energy U and its free energy F = U − TS, Clausius inequality becomes

W ≥ ∆F.    (1)

The meaning of Eq. 1 is transparent: It is not possible to extract more work than the free energy of the system. Reciprocally, to increase the free energy of a system, one has to pay at least the same amount of work. Since they are natural consequences of the thermodynamic arrow of time, these inequalities are called fundamental bounds. Extending these bounds to the quantum realm is an important motivation of quantum thermodynamics.
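The bound W ≥ ∆F can be derived in two lines by combining the first law for an isothermal transformation with Clausius inequality (a standard derivation, spelled out here for completeness):

```latex
% First law: \Delta U = W + Q ; Clausius: \Delta S - Q/T = \Delta_i S \ge 0
\begin{align}
  W &= \Delta U - Q
     = \Delta U - T\,(\Delta S - \Delta_i S) \nonumber\\
    &= \Delta F + T\,\Delta_i S
     \;\ge\; \Delta F ,
  \qquad F = U - TS .
\end{align}
```

The work paid in excess of ∆F is exactly T ∆_i S, the energy irreversibly dissipated during the transformation.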
Eq. 1 provides intuition about Kelvin's no-go statement, as exemplified by the Carnot engine. In this paradigmatic device, work is extracted during the expansion of the gas while it is coupled to the hot bath - increasing its entropy and thus lowering its free energy by an amount T_h ∆S. ∆S is fixed by the settings of the engine (number of particles of the gas, minimal and maximal volume of the chamber). Therefore, once the maximal volume has been reached, it is not possible to extract work anymore and one has to "reset" the engine, i.e. bring it back to its initial settings by compressing the gas. If the compression is performed at the same temperature, no net work is extracted from the cycle, hence the need for at least two baths at two different temperatures.

Information thermodynamics
Later, Maxwell suggested that information could be used to sort the molecules of the gas and lower its entropy, apparently at no work cost. Such a mechanism blatantly violates the Second Law and Kelvin's no-go statement, since no cold bath would be needed to reset the engine - compression being realized for free, using only information.
It took one century to exorcize Maxwell's demon paradox. With the rise of information theory after the Second World War, it became clear that information was not some immaterial concept that could escape the laws of thermodynamics. This idea is captured by the famous motto "Information is physical" attributed to Landauer, one of the fathers of information thermodynamics (together with Bennett, Lloyd...). To understand how information and thermodynamics are related, Carnot engines are enlightening - however, instead of considering a gas made of a large number of particles, one can consider a "single-particle gas", where the particle is positioned either on the left or on the right of the chamber. Left or right can be used to encode one bit of information. Denoting by p the probability that the particle is on the left, the Shannon entropy of the probability distribution reads (in bits) H(p) = −p log_2(p) − (1 − p) log_2(1 − p). Now imagine that we know the particle is on the left. While it expands to eventually fill the whole volume, one bit of information is lost, such that the Shannon entropy change reads ∆H = 1 bit. Conversely, from this elementary amount of information, one can extract some work. In agreement with the Second Law, the amount of extractable work is bounded by W_0 = k_B T log 2, where T is the temperature of the chamber in which the expansion takes place and k_B stands for the Boltzmann constant. This is the basic principle of the so-called Szilard engine, which evidences the conversion of information into work.
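These numbers are easy to check. A minimal sketch computing the Shannon entropy of the single-particle gas and the bound W_0 at room temperature (the 300 K value is an illustrative choice, not from the text):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(p: float) -> float:
    """Shannon entropy H(p) of a binary distribution, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# One bit of information is lost when the known particle position
# (p = 1, H = 0) relaxes to an unknown one (p = 1/2, H = 1 bit).
delta_H = shannon_entropy_bits(0.5) - shannon_entropy_bits(1.0)

# Szilard work / Landauer erasure bound at room temperature
T = 300.0  # K
W0 = k_B * T * math.log(2)

print(delta_H)  # 1.0 bit
print(W0)       # ~2.87e-21 J
```

The same W_0 reappears below as the minimal work cost of Landauer's erasure, since the RESET operation is exactly the reverse of this expansion.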
This conversion is reversible, and its reverse has even stronger practical implications. Indeed, starting from an initial configuration where the particle has equal chances to be on the left or on the right, and then compressing it, e.g. to the left of the chamber, is what is called in information theory a RESET operation: Whatever the initial state of the bit, it ends up in the state "0". This operation is logically irreversible: Once it is performed, the initial state cannot be traced back. However, it is extremely useful, since initializing bits is the beginning of any computation. Eq. 1 evidences that resetting a bit has a work cost that cannot be lower than W_0 - the bound being reached when the operation is thermodynamically reversible: This is the famous Landauer erasure work [1].
Of course, the single-particle model is a convenient approach to quickly get an intuition of the main equations, but it is idealized. The first experimental evidences of information-to-energy conversions (Szilard engines, Landauer's erasure) were obtained around 2010-2012 [2,3].

Stochastic thermodynamics
While introducing information thermodynamics, we have departed from the usual scenery of macroscopic thermodynamics, which involved large numbers of particles as working substances. Within information thermodynamics, the working substance is now elementary, since it solely involves one particle whose phase space reduces to two micro-states, 0 and 1. This new scenery is the one of "stochastic thermodynamics", which deals with working substances small enough that fluctuations become predominant [4].
In this new realm, the dynamics of the system results from the action of some external operator that drives the system to implement some protocol. The system's evolution is perturbed by a thermal bath that induces random, "stochastic" fluctuations. Thus, the dynamics of the system is described by Markovian, stochastic trajectories in its phase space - one trajectory consisting in continuous sequences where the drive controls the system, intertwined by stochastic jumps imposed by the bath.
This new realm sheds new light on the First Law. Work now corresponds to the part of energy exchanged with the controller during the continuous sequences. On the other hand, heat is defined as the part of energy stochastically exchanged during the jumps induced by the bath. From an energetic point of view, it appears that the heat/work boundary now reflects the boundary between noise and control. From that perspective, an engine is a device made to extract energy from noise, by rectifying the fluctuations it induces.
The framework of stochastic thermodynamics also invites to reconsider the meaning of the Second Law. As a matter of fact, the laws of physics at the level of single particles are expected to be reversible, so where does irreversibility come from? There is a simple, operational answer to this question. Let us suppose that the system is initially prepared in a well-defined micro-state. The controller now implements a protocol aimed to bring the system into another micro-state. In the absence of a bath, the trajectory is perfectly deterministic. The controller is thus able to reverse the protocol, to bring back the system to its initial micro-state. However, if a bath remains coupled to the system during the protocol, the random perturbations it induces prevent the controller from perfectly reversing the trajectory followed by the system, making the protocol irreversible.

Fluctuation theorems
Interestingly, stochastic thermodynamics allows us to quantify the amount of irreversibility per trajectory γ. This is captured by the so-called stochastic entropy production ∆_i S[γ], defined as

∆_i S[γ] = log(P_F[γ]/P_B[γ*]).    (2)

We have introduced the time-reversed trajectory γ*, and the probability P_F[γ] (resp. P_B[γ*]) of the trajectory γ (resp. γ*) while the protocol is run forward (resp. backward). With this definition, ∆_i S has no dimension. The meaning of Eq. 2 is obvious, entropy production being positive if γ is more probable forward than backward. Some particular trajectories may lead to a negative entropy production. However, this is not contradictory with the Second Law, which deals with average values: The quantity that should remain positive is the average entropy production ⟨∆_i S[γ]⟩_γ = Σ_γ P_F[γ] ∆_i S[γ]. This condition is automatically fulfilled, as is obvious by averaging e^{−∆_i S[γ]} over the forward trajectories: From Eq. 2, Σ_γ P_F[γ] e^{−∆_i S[γ]} = Σ_γ P_B[γ*]. In most situations (with the noticeable exception of so-called absolute irreversibility [5]), this boils down to

⟨e^{−∆_i S[γ]}⟩_γ = 1,    (3)

which is called the Integral Fluctuation Theorem (IFT). From the convexity of the exponential, one easily gets that ⟨∆_i S[γ]⟩_γ ≥ 0, in agreement with the Second Law. The bound is saturated if and only if ∆_i S[γ] = 0 for all γ. This strong condition defines the equilibrium distribution. As we show below, it is fully equivalent to define the equilibrium distribution by imposing the micro-reversibility condition. From Eq. 2, Eq. 3 is a tautology - but it is also the seed of so-called "fluctuation theorems", the Jarzynski Equality (JE) probably being the most famous [6]. To recover it, one considers a system initially prepared at thermal equilibrium, then driven out of equilibrium by some external operator. It can be shown that the expression of the stochastic entropy production exactly matches the classical one, namely ∆_i S[γ] = (W[γ] − ∆F)/k_B T, such that Eq. 3 becomes ⟨e^{−(W[γ]−∆F)/k_B T}⟩_γ = 1. This is JE, which has been experimentally verified on many different platforms.
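JE can be checked exactly on a minimal model (an illustrative sudden-quench model, chosen here purely for simplicity; the specific energies are arbitrary): a two-level system prepared at thermal equilibrium whose level spacing is instantaneously quenched, so that the stochastic work is simply the energy difference between the two spectra.

```python
import math

def jarzynski_check(E_initial, E_final, beta):
    """Verify <exp(-beta*W)> = exp(-beta*dF) for a sudden quench.

    For an instantaneous quench, the stochastic work of a trajectory
    starting in micro-state s is W[s] = E_final[s] - E_initial[s],
    sampled from the initial thermal (Boltzmann) distribution.
    """
    Z0 = sum(math.exp(-beta * E) for E in E_initial)
    Z1 = sum(math.exp(-beta * E) for E in E_final)
    p0 = [math.exp(-beta * E) / Z0 for E in E_initial]

    lhs = sum(p * math.exp(-beta * (Ef - Ei))
              for p, Ei, Ef in zip(p0, E_initial, E_final))
    dF = -(1 / beta) * (math.log(Z1) - math.log(Z0))  # F = -kT log Z
    rhs = math.exp(-beta * dF)
    return lhs, rhs

lhs, rhs = jarzynski_check(E_initial=[0.0, 1.0], E_final=[0.0, 2.5], beta=1.0)
assert abs(lhs - rhs) < 1e-12  # JE holds, although <W> > dF in general
```

Both sides reduce to Z1/Z0, which is why JE holds exactly for any quench amplitude, arbitrarily far from equilibrium.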

Two-point trajectories
While simple, two-point trajectories are interesting since they allow us to build many useful intuitions. Let us consider a system with micro-states denoted σ_i, i being an integer. The stochastic behavior of the system is fully captured by the probability of a jump from the state j to the state i, P[σ_i|σ_j]. Here we take this probability as constant in time for the sake of simplicity. Denoting as γ an elementary trajectory γ = (σ_i, σ_j), starting in σ_i and ending in σ_j, its forward (resp. backward) probability reads P_F[γ] = p_0(σ_i) P[σ_j|σ_i] (resp. P_B[γ*] = p_1(σ_j) P[σ_i|σ_j]). We have introduced p_0(σ) (resp. p_1(σ)), the probability distribution before (resp. after) the jump. The entropy produced by the trajectory γ reads

∆_i S[γ] = log(p_0(σ_i)/p_1(σ_j)) + log(P[σ_j|σ_i]/P[σ_i|σ_j]).    (4)

The term on the left is called the boundary term, the term on the right the conditional term. Let us first characterize the equilibrium distribution p_∞(σ). According to the definition above, it is characterized by ∆_i S[γ] = 0 for all (i, j), i.e. p_∞(σ_i) P[σ_j|σ_i] = p_∞(σ_j) P[σ_i|σ_j]. This evidences that the equilibrium distribution fulfills the micro-reversibility condition. Reciprocally, any distribution fulfilling this condition is an equilibrium distribution.
It is now possible to rewrite Eq. 4 using the micro-reversibility condition:

∆_i S[γ] = log(p_0(σ_i)/p_∞(σ_i)) − log(p_1(σ_j)/p_∞(σ_j)).    (5)

Averaging Eq. 5 over all forward trajectories yields ⟨∆_i S[γ]⟩_γ = D(p_0||p_∞) − D(p_1||p_∞) ≥ 0, where D(p||q) = Σ_σ p(σ) log[p(σ)/q(σ)] is the relative entropy. This result evidences that each application of a stochastic map brings the system nearer to its equilibrium distribution, which characterizes a relaxation. At this point it is interesting to consider the textbook case where the stochastic behavior is induced by a thermal bath of temperature T. Introducing E(σ), the internal energy of the system in the micro-state σ, the probability distribution characterizing thermal equilibrium is simply the Boltzmann distribution p_∞(σ) = Z^{−1} exp(−E(σ)/k_B T). Z is the partition function, which allows us to define the system's free energy F = −k_B T log(Z). Let us introduce ∆S[γ] = log(p_0(σ_i)) − log(p_1(σ_j)). ∆S[γ] can be called the system's "stochastic entropy change": Once averaged over all forward trajectories, it gives back the standard expression for the system's entropy change. Introducing the heat received during the jump, Q[γ] = E(σ_j) − E(σ_i), Eq. 4 becomes ∆_i S[γ] = ∆S[γ] − Q[γ]/k_B T, whose average value is in agreement with the classical definition (Eq. 1). If the system's initial states of the forward and the backward protocols both correspond to thermal equilibrium, one recovers the entropy production involved in JE, ∆_i S[γ] = (W[γ] − ∆F)/k_B T.
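The relaxation statement - the average entropy production over one application of the map equals the decrease of the relative entropy to equilibrium - can be checked on a minimal two-state model (the energies, temperature and initial distribution below are illustrative choices):

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two micro-states with energies (0, eps), bath at inverse temperature beta.
beta, eps = 1.0, 1.5
Z = 1 + math.exp(-beta * eps)
p_eq = [1 / Z, math.exp(-beta * eps) / Z]       # Boltzmann distribution

# Jump probabilities chosen to satisfy micro-reversibility:
# p_eq[j] * P[i|j] = p_eq[i] * P[j|i]
c = 0.9
P = [[1 - c * p_eq[1], c * p_eq[0]],
     [c * p_eq[1],     1 - c * p_eq[0]]]        # P[i][j] = P[sigma_i | sigma_j]

# One application of the stochastic map to an out-of-equilibrium state p0:
p0 = [0.95, 0.05]
p1 = [sum(P[i][j] * p0[j] for j in range(2)) for i in range(2)]

# Average entropy production = decrease of relative entropy to equilibrium
avg_entropy_production = relative_entropy(p0, p_eq) - relative_entropy(p1, p_eq)
assert avg_entropy_production >= 0                              # Second Law
assert relative_entropy(p1, p_eq) < relative_entropy(p0, p_eq)  # relaxation
```

Iterating the map drives D(p_t||p_∞) monotonically to zero, i.e. the system relaxes to the Boltzmann distribution while producing entropy at every step.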

Generalized integral fluctuation theorem
A very important achievement of stochastic thermodynamics has been to incorporate information in the expression of a fluctuation theorem, giving rise to the so-called "Generalized Integral Fluctuation Theorem" [7]. A simple intuition can be grasped, again by considering the case of a two-point trajectory that now involves a system and a demon's memory. The system has been read by the demon beforehand, such that the system and memory states are correlated. Denoting as x and m the system and memory micro-states, and p(x, m) (resp. p(x), p(m)) the joint (resp. marginal) probabilities, the correlation is quantified by the stochastic mutual information I(x, m) = log(p(x, m)) − log(p(x)p(m)). Averaged over the joint distribution, one recovers the usual expression of the mutual information between the system and the demon, I(S : M) = Σ_{x,m} p(x, m) log[p(x, m)/(p(x)p(m))]. One then focuses on the feedback operation: The demon exploits its knowledge of the system to perform some operation on it. Supposing that the memory state is not altered by the feedback, the forward (resp. backward) trajectory reads γ = (x, m, y) (resp. γ* = (y, m, x)), and their respective probabilities read P_F[γ] = p_0(x, m) P[y|x, m] and P_B[γ*] = p_1(y, m) P[x|y, m]. An ideal feedback is perfectly defined by the memory state, yielding P[y|x, m] = P[x|y, m]. We eventually get

∆_i S[γ] = ∆S[γ] − ∆I[γ],    (6)

where ∆S[γ] = log(p_0(x)) − log(p_1(y)) is the stochastic entropy change of the system, and ∆I[γ] = I_1(y, m) − I_0(x, m) is the change of the stochastic mutual information between the system and the memory, I_k characterizing the joint probability distribution p_k. Importantly, this expression puts information and entropy production on an equal footing, allowing one to quantitatively address the work value of information. The same kind of argument as developed for the IFT can indeed be used to demonstrate that ∆S ≥ ∆I(S : M), opening a rigorous path to exorcize Maxwell's demon.
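The definition of the stochastic mutual information is easy to check on any joint distribution. The sketch below uses an illustrative readout channel (a demon reading a balanced bit with an assumed error probability of 10%):

```python
import math

# Illustrative joint distribution p(x, m) of a system bit x read by a
# demon with error probability err (keys are (x, m) pairs).
err = 0.1
p_joint = {(0, 0): 0.5 * (1 - err), (0, 1): 0.5 * err,
           (1, 0): 0.5 * err,       (1, 1): 0.5 * (1 - err)}

p_x = {x: sum(p for (xx, _), p in p_joint.items() if xx == x) for x in (0, 1)}
p_m = {m: sum(p for (_, mm), p in p_joint.items() if mm == m) for m in (0, 1)}

def stochastic_I(x, m):
    """Stochastic mutual information I(x,m) = log p(x,m) - log p(x)p(m)."""
    return math.log(p_joint[(x, m)]) - math.log(p_x[x] * p_m[m])

# Averaged over the joint distribution, it gives the usual mutual information
I_avg = sum(p * stochastic_I(x, m) for (x, m), p in p_joint.items())
assert I_avg >= 0
# A perfect readout (err -> 0) would give I_avg -> log 2, i.e. one full bit.
```

Note that individual values of I(x, m) can be negative (for erroneous readouts), while the average I(S : M) never is - the same structure as the stochastic entropy production and the Second Law.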
It basically states that information, as quantified by I(S : M), is a resource that can be consumed (∆I(S : M) ≤ 0) to lower the entropy of a system at no work cost. Equivalent expressions can be derived when work is extracted from the protocol, leading to generalized fundamental bounds W ≥ ∆F + k_B T ∆I. This can be used to define efficiencies of Maxwell's demons, e.g. η = W/(∆F + k_B T ∆I). Just like usual engines, maximal efficiency is reached when the bound is saturated, i.e. when the process is run reversibly.

First batch of take-home messages
Macroscopic thermodynamics has given rise to the concept of engine, as a device that converts heat into work. Maximal efficiency is reached when the device is operated reversibly, connecting the notion of energetic performance to the thermodynamic arrow of time.
These concepts have been extended to the level of single particles by stochastic thermodynamics. Here noise plays a key role in defining heat, work, and the time arrow. For historical reasons, thermal noise due to the action of thermal baths was considered first. Thus, thermal engines were the first to be designed and experimentally implemented, and they still remain the most studied kind of nano-engines. In the same way, the concepts of entropy production and equilibrium are still widely understood with respect to a thermal bath playing the role of a reference.
However, the framework brought by stochastic thermodynamics is sufficiently general and flexible, such that other kinds of noise can be used as seeds to build "other" thermodynamical frameworks and explore new physics. We shall adopt this strategy below (See Section 3.2).

3 Quantum thermodynamics

3.1 Motivations
Quantum thermodynamics is the converging point of many areas of research, making it a very exciting field where new and transversal concepts are built. Many current lines of research are exposed in reviews and books (See e.g. [8]) and it is not my purpose to summarize them here. I shall rather put them in perspective with respect to the legacy of classical thermodynamics presented above, and focus on original research topics developed in my group in the past few years.
Firstly, quantum thermodynamics is the natural follow-up of stochastic thermodynamics, where systems switch from nano to quantum. What are irreversibility, work and heat in the quantum realm? is one of the most important questions. More technically, one aims to evidence new, genuinely quantum components in fluctuation theorems, which could be related to quantum coherence or entanglement. On the more applied side, one important research line investigates whether quantum coherence and correlations can be a resource for nano-engines, which would lead them to outperform their classical counterparts. Reciprocally, what is the energetic cost of fighting against quantum noise?
To answer these questions, a natural scenery is provided by quantum open systems - namely, driven quantum systems interacting with one or several baths. In the equations describing the system's dynamics, the action of the drive is usually modeled by some time-dependent Hamiltonian. Hence, the drive can exchange energy with the system without changing its von Neumann entropy: This is consistent with the classical definition of work. Conversely, the action of the bath(s) is non-unitary, such that the system's entropy is not necessarily conserved by the interaction: This is reminiscent of a heat exchange.
However, there is still no consensus on the definitions of heat and work in the quantum realm. One important reason is that the baths interacting with the system are not necessarily at thermal equilibrium. Therefore new thermodynamical concepts must be built, in the absence of temperature. Another reason is genuinely quantum: Quantum measurement perturbs. This is well known, but the energetic consequences of this effect had not been drawn until recently.

3.2 Rebuilding quantum thermodynamics on quantum measurement
As mentioned above, a consistent thermodynamical framework can be built for any driven system subjected to noise (See Fig. 1). For historical reasons, thermal noise was first considered - but there are many other kinds of noise in the quantum world. A fundamental one is the noise induced by projective quantum measurement. The scenery in this case is as simple as it can be: A quantum system evolving under some time-dependent Hamiltonian on the one hand, and projectively measured at discrete times on the other hand.

Fig. 1: Classical versus quantum thermodynamics. A system S exchanges work W with an external controller, and heat Q with a stochastic entity. a) Historical framework: The stochastic entity is a thermal reservoir whose action is symbolized by the dice k_B. b) Rebuilding quantum thermodynamics on quantum measurement: The stochastic entity is a measuring device whose action is symbolized by the dice ħ.

Knowing the outcomes of the measurement and the applied Hamiltonian, it is possible to reconstruct at any time the trajectory of pure quantum states followed by the system, which consists in continuous sequences intertwined by the measurement-induced, stochastic quantum jumps. These quantum trajectories are the quantum counterpart of the stochastic trajectories introduced above, which provide the bread and butter of stochastic thermodynamics. However, in the present situation, the stochasticity is genuinely quantum, since it is due to measurement back-action. For a given quantum trajectory, the system's internal energy along time is identified with the expectation value of the Hamiltonian along the trajectory. In the spirit of stochastic thermodynamics, work can be defined as the system's energy change during the unitary sequences - "heat", on the other hand, being identified with the sudden energy changes during the quantum jumps.
This heat has no classical equivalent, since it comes from the fluctuations induced by measurement back-action. Such fluctuations can only take place if the measured state has coherences in the basis of the measured observable. It is thus a quantum effect, due to quantum coherences. For this reason, my coworkers and I dubbed it "quantum heat" [9].
Let us now focus on the time arrow. For the sake of simplicity, we focus on a protocol defined by some initial eigenstate |m_0⟩ of the observable M̂, a unitary evolution defined by the operator Û, and a final measurement of M̂ with stochastic result m_{k_γ}. This elementary quantum trajectory γ is perfectly defined by the two points γ = (m_0, m_{k_γ}). Its probability reads P_F[γ] = P[m_{k_γ}|m_0], i.e. P_F[γ] = |⟨m_{k_γ}|Û|m_0⟩|². Averaged over all trajectories, the final system state is a mixture ρ of the pure states |m_k⟩ with probabilities p_k = |⟨m_k|Û|m_0⟩|².
Reciprocally, the backward protocol consists in measuring the observable M̂ while the system is initially prepared in the mixture ρ, and then reversing the unitary evolution. Averaged over all trajectories, the entropy production thus reads

⟨∆_i S[γ]⟩_γ = ∆S_VN,    (7)

where ∆S_VN is the increase of the von Neumann entropy of the system along the forward protocol. This expression can easily be extended to multi-point trajectories, or to the case where the protocol does not start with a pure state. Eq. 7 is extremely important, since it connects the von Neumann entropy - widely used in quantum physics - to entropy production, which is a purely thermodynamical concept. It provides a rigorous demonstration of the well-known "irreversibility of quantum measurement" that fully exploits the relevant framework of stochastic thermodynamics. In particular, it now allows measuring "how much irreversible" a measurement is.
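The simplest instance of this entropy production can be evaluated directly: a qubit prepared in the pure state |+⟩ = (|0⟩ + |1⟩)/√2 and projectively measured in the {|0⟩, |1⟩} basis produces, on average, one bit of entropy. A minimal check:

```python
import math

# Qubit prepared in |+> = (|0> + |1>)/sqrt(2), measured in the {|0>, |1>} basis.
amp = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # amplitudes <k|psi>

# Born rule: outcome probabilities p_k = |<k|psi>|^2
p = [a * a for a in amp]

# Averaged over outcomes, the final state is the mixture rho = diag(p).
# The initial state is pure (S_VN = 0), so the average entropy production
# equals the von Neumann entropy of this mixture.
S_VN = -sum(pk * math.log(pk) for pk in p if pk > 0)

assert abs(S_VN - math.log(2)) < 1e-12  # ln 2 in nats, i.e. one bit
```

Measuring an eigenstate of the observable (amp = [1, 0]) would instead give S_VN = 0: no coherence in the measurement basis, no back-action, no irreversibility.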

Measurement driven engines
It should now be clear that projective measurement, just like any stochastic process, can be seen as a source of irreversibility and energy - playing a role quite similar to the good old thermal bath. In particular, a non-zero amount of quantum heat can be exchanged on average between the system and the measurement channel, as soon as the measured observable does not commute with the system's Hamiltonian. Building on this analogy, my coworkers and I have suggested using quantum measurement as a new kind of energetic resource that could fuel quantum engines (See [?] and Fig. 2). The experiment that we have suggested involves, as a working substance, a qubit of energy eigenstates denoted |0⟩ and |1⟩ and of transition frequency ω_0, which exchanges work with some resonant driving field. The mechanism is a classical Rabi oscillation, where the qubit's state evolves as |ψ(t)⟩ = cos(Ωt/2)|0⟩ + sin(Ωt/2)|1⟩, with Ω the classical Rabi frequency. Work extraction takes place during stimulated emission, when the qubit provides energy to the field. Maximal power extraction is reached when the qubit is in the coherent superposition |+⟩ = (|0⟩ + |1⟩)/√2, which gives rise to the maximal slope of the Rabi oscillation.
To extract work cyclically, the strategy we suggested was to use a measurement followed by a feedback loop, to stabilize the qubit in the state |+⟩. One cycle consists of the following steps: (i) Work extraction: After the qubit is initialized in the state |+⟩, it evolves into the state |φ(τ)⟩ = cos(Ωτ/2)|+⟩ + sin(Ωτ/2)|−⟩, providing work to the field. (ii) Measurement: The qubit is projectively measured in the {|+⟩, |−⟩} basis, the back-action providing quantum heat Q_q on average. (iii) Feedback: Depending on the outcome, a pulse brings the qubit back to |+⟩. (iv) Erasure: The demon's memory is reset, at a work cost W_L. Actually, this machine is nothing but a new kind of Maxwell's demon engine. The feature that makes it really quantum is that it does not extract energy from a hot thermal bath, but from the measurement process itself. Thus the two facets of quantum measurement are exploited in this device: Measurement not only allows the demon to extract information, but it also provides energy since it back-acts on the system's state. Stated in fancy words, with such an engine you can put a body into motion "just by looking at it".
For an engine, an important figure of merit is its yield. It is computed by comparing the net extracted work W − W_L to the consumed resource Q_q, i.e. η = 1 − W_L/Q_q. Interestingly, η → 1 when Ωτ ≪ 1. This corresponds to the Zeno regime, where measurements are performed at such a fast rate that the qubit is "frozen" in the |+⟩ state. In this situation, the measurement outcome is certain and the memory's entropy vanishes, such that no erasure is needed. Reaching such a yield means that the quantum heat provided by the measurement channel is fully converted into work, the engine behaving as a transducer. The power is another relevant figure of merit. It turns out that maximal power is also reached in the Zeno regime, as a simple consequence of the fact that |+⟩ is the best state for power extraction - as mentioned above. Unlike classical engines, where one has to choose between maximal efficiency and maximal power, the present device allows operation at maximal efficiency and power simultaneously. This is typical of the fact that we have now departed from standard thermodynamics and that new intuitions must be built.

Discussion
The engine presented above was the first to explicitly exploit "quantum heat", i.e. measurement induced back-action, for the sake of work extraction. Obviously, the main value of the proposal is not its practical interest. It is a proof of concept, that evidences the reality of energy exchanges with a measurement channel. Since then, the concept of quantum heat has bloomed to give rise to new proposals for measurement driven engines [10,11], to cool down qubits [12], or to track entanglement generation [13]. Since decoherence is nothing but an unwanted measurement performed by some uncontrolled environment, quantum heat is also expected to be a relevant concept to estimate the energetic costs related to feedback-based stabilization [9].
The quantum heat is the energetic counterpart of the measurement postulate and, as such, it can be perceived differently by different users of quantum theory. It can be seen as a practical, effective quantity allowing one to take into account quantitatively the effect of a measurement on thermodynamical quantities. An interesting line of research will now consist in "opening the black box", i.e. modeling the measurement process itself and tracking the energetic and entropic fluxes within the measurement channel. Just like in classical thermodynamics, where heat and irreversibility are expected to vanish when one reaches a complete, ultimate description of the system, one could thus expect to find that quantum heat is not a fundamental concept.
On the other hand, one can be of the opinion that quantum mechanics entirely relies on some act of measurement, and that whatever the degree of precision of the described mechanism, its very meaning is built on the irreversible records of measurement outcomes. In this view, the quantum heat rather appears as a fundamental concept that will always be part of the thermodynamical description. Current debates about heat and work in quantum thermodynamics reflect in some sense the still ongoing debates about the status of the measurement postulate - quantum thermodynamics providing a new playground and, why not, new ideas and new experiments to explore interpretations of quantum mechanics [11,14].

Thermodynamics of quantum computing
As a final promising field of investigation for quantum thermodynamics, it is worth mentioning the thermodynamical study of quantum computing in the context of the present school. As a matter of fact, the "quantum advantage" usually put forward to motivate quantum versus classical computing is its reduction in complexity. However, another advantage is that quantum computing is in principle reversible. Being more specific, an ideal quantum algorithm like the one solving the Deutsch problem consists in initializing the data register in the state |0⟩ of the computational basis, a unitary operation, and a final measurement in the computational basis. In the absence of noise, the algorithm can be seen as a quantum interference: The quantum state of the register before the final measurement pertains to the computational basis, and the outcome provides a noise-free answer to the asked question. In particular, there is no back-action associated with this final measurement, i.e. the act of measurement has no effect on the system's state. Therefore, it is conceivable that the agent performing the computation records the outcome, and then reverses the whole protocol to bring back the register to its initial state |0⟩, such that there is no heat dissipation associated with the reset.
Conversely, activating the gates of the quantum circuit is not energetically free. Simply considering a single-qubit gate, the minimal energetic cost to run it is to let the qubit interact resonantly with a coherent field during a well-defined time. The energetic cost associated with this operation can be quantified as the minimal number of photons put in the coherent field. To reduce the cost, it is tempting to work with a small number of photons. However, small fields get entangled with the qubit, leading to some fundamental noise affecting the process as soon as the field is traced out [15,16]. Preliminary studies show that at least 1000 photons are needed in the field to keep a sufficiently good fidelity on the gate. For microwave photons interacting with superconducting qubits, the energetic bill to activate a single gate is typically 10^−21 J. This is the same order of magnitude as the ultimate heat dissipated by the erasure of a single bit. Large-scale quantum computers will involve a large number of gates, in particular for error correction. As a consequence, energy will quite certainly play a key role to benchmark future quantum computing architectures.
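The quoted order of magnitude can be recovered with a back-of-the-envelope estimate (the 6 GHz qubit frequency is an assumed typical value for superconducting circuits, not taken from the text):

```python
import math

hbar, k_B = 1.054571817e-34, 1.380649e-23

# Driving field of ~1000 microwave photons at an assumed frequency of 6 GHz
n_photons = 1000
omega = 2 * math.pi * 6e9          # rad/s
E_gate = n_photons * hbar * omega  # energy stored in the drive field

# Landauer bound for erasing one bit at room temperature, for comparison
E_landauer = k_B * 300 * math.log(2)

print(f"gate drive: {E_gate:.1e} J, Landauer bound: {E_landauer:.1e} J")
# Both come out in the 1e-21 J range, consistent with the estimate above.
```

Multiplying this per-gate figure by the gate counts of error-corrected algorithms gives a first handle on the energy budget of large-scale architectures.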