
Foundational questions of quantum information
April 4-5, 2012


Jointly organized by LARSIM and QuPa

Venue: Amphi Opale, 46 rue Barrault, Paris 13e


April 4

9:30-9:45 Coffee and Opening

9:45-10:45 Robert Raussendorf (University of British Columbia)

10:45-11:00 Coffee

11:00-12:00 Oscar Dahlsten (University of Oxford)


14:15-15:15 Matthew Pusey (Imperial College London)

15:15-16:15 Michel Bitbol (CREA, CNRS-Ecole Polytechnique)

16:15-16:45 Coffee

16:45-17:45 Virginie Lerays (LRI, Université Paris Sud)


April 5

9:30-9:45 Coffee

9:45-10:45 Damian Markham (LTCI, CNRS-Télécom ParisTech)

10:45-11:00 Coffee

11:00-12:00 Kavan Modi (University of Oxford and Centre for Quantum Technologies, National University of Singapore)


14:15-15:15 Giacomo Mauro d'Ariano (University of Pavia)

15:15-16:15 Caslav Brukner (University of Vienna)

16:15-16:45 Coffee

16:45-17:45 Alexei Grinbaum (LARSIM, CEA-Saclay)



Robert Raussendorf
"Symmetry constraints on temporal order in measurement-based quantum computation"

We discuss the interdependence of resource state, measurement setting and temporal order in measurement-based quantum computation. The possible temporal orders of measurement events are constrained by the principle that the randomness inherent in quantum measurement should not affect the outcome of the computation. We provide a classification for all temporal relations among measurement events compatible with a given initial quantum state and measurement setting, in terms of a matroid. Conversely, we show that classical processing relations necessary for turning the local measurement outcomes into computational output determine the resource state and measurement setting up to local equivalence. Further, we find a symmetry transformation related to local complementation that leaves the temporal relations invariant. 

Oscar Dahlsten
"Tsirelson’s bound from a Generalised Data Processing Inequality"

The strength of quantum correlations is bounded from above by Tsirelson’s bound. We establish a connection between this bound and the fact that correlations between two systems cannot increase under local operations, a property known as the data processing inequality. More specifically, we consider arbitrary convex probabilistic theories. These can be equipped with an entropy measure that naturally generalizes the von Neumann entropy, as shown recently by Short and Wehner. We prove that if the data processing inequality holds with respect to this generalized entropy measure then the underlying theory necessarily respects Tsirelson’s bound. We moreover generalise this statement to any entropy measure satisfying certain minimal requirements. Based on arXiv:1108.4549.
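As a numerical illustration of the bound discussed above (a sketch, not part of the talk): the singlet-state correlation E(a, b) = -cos(a - b), evaluated at the standard CHSH measurement angles, saturates Tsirelson's bound of 2√2.

```python
import math

def corr(a, b):
    # Singlet-state correlation for measurement angles a and b: E(a, b) = -cos(a - b)
    return -math.cos(a - b)

# Standard CHSH angle choices (illustrative values, not specific to the talk)
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, -math.pi / 4

# CHSH combination S = E(a0,b0) + E(a0,b1) + E(a1,b0) - E(a1,b1)
S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2): Tsirelson's bound
```

No local (classical) strategy can push this combination above 2, which is what makes the value 2√2 a genuinely quantum signature.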

Matthew Pusey
"Comparing two explanations for qubits"

I will discuss two long-standing realist models for qubits, one due to Bell and the other to Kochen and Specker. I will argue that the latter provides a much more compelling explanation of various quantum information phenomena, mainly thanks to the feature that multiple quantum states can apply to the same real state. Finally, I will show that, on the other hand, it is precisely this feature that prevents the Kochen-Specker model from explaining one very particular phenomenon. Based on arXiv:1111.3328.

Michel Bitbol
"Kant and quantum mechanics: a middle way between the ontic and epistemic approaches"

Instead of either formulating new metaphysical images of the so-called "quantum reality" or rejecting any metaphysical attempt in an empiricist spirit, the case of quantum mechanics might require a redefinition of metaphysics. The sought redefinition will be performed in the spirit of Kant, according to whom metaphysics is the discipline of the boundaries of human knowledge. This can be called a "reflective" conception of metaphysics. Within this perspective, theoretical structures are neither ontic nor purely epistemic. They express neither exclusively the structure of reality out there, nor the form of our own knowledge, but their active interface. Our understanding of the structure of quantum mechanics then works in two steps:
(1) The most basic structures of quantum mechanics are neither imposed on us (by some pre-structured reality) nor arbitrary (just meant to "save the phenomena"), but made necessary by the general characteristics of our demand for knowledge.
(2) Yet there can also be additional features of theoretical structures corresponding to special characteristics of our demand for knowledge, adapted to certain directions of research or to cultural prejudice. The "surplus structure" of some of the most popular interpretations of quantum mechanics will be understood in this way.
Finally, it will be shown that some of the major "paradoxes" of quantum mechanics, such as the measurement problem, can easily be dissolved by way of this reflective attitude.

Virginie Lerays
"Detector efficiency and communication complexity"

In the standard setting of communication complexity, two players each hold an input and wish to compute some function of the joint inputs. This problem has been studied extensively in computer science, and a wide variety of methods have been introduced to prove lower bounds on the amount of communication required. Physicists have considered a closely related scenario in which two players share a predefined entangled state. Each is given a measurement as input, which they perform on their share of the system. The outcomes of the measurements follow a distribution predicted by quantum mechanics. The goal is to rule out the possibility of a classical explanation for the distribution through loopholes such as communication or detector inefficiency. In an experimental setting, Bell inequalities are used to distinguish truly quantum from classical behavior.
Bell tests and communication complexity both measure how far a distribution is from the set of local distributions (those requiring no communication), and one would expect that if a Bell test shows a large violation for a distribution, simulating it should require a lot of communication, and vice versa.
We present a new lower bound technique for communication complexity, based on the notion of detector inefficiency, for the setting of simulating distributions, and show that it coincides with the best previously known lower bound in communication complexity. We show that it amounts to constructing an explicit Bell inequality. Joint work with Sophie Laplante and Jérémie Roland.
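To make the set of local distributions concrete (a sketch, not from the paper): a brute-force enumeration of all deterministic local strategies for the CHSH expression recovers the classical bound of 2, which players using no communication cannot exceed.

```python
from itertools import product

# Each deterministic local strategy fixes Alice's outputs (a0, a1) for her two
# possible inputs and Bob's outputs (b0, b1), each in {+1, -1}; no communication.
best = 0
for a0, a1, b0, b1 in product([+1, -1], repeat=4):
    # CHSH combination of the four correlators
    S = a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    best = max(best, abs(S))
print(best)  # 2: the local (Bell) bound; quantum strategies reach 2*sqrt(2)
```

Shared randomness only mixes deterministic strategies, so the bound of 2 also holds for all local distributions; any distribution violating it lies outside the local set.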

Damian Markham
"On non-linear extensions of quantum mechanics"

We present some observations on the restrictions that non-signaling imposes on non-linear extensions of quantum mechanics. We see that non-signaling can be understood as imposing the destruction of correlations, a property noticed for closed time-like curves by Bennett et al., arising from the 'non-linearity trap'. We discuss in what sense such theories can still allow for 'local' cloning and state discrimination. Joint work with Julien Degorre.

Kavan Modi
"Entanglement distribution with quantum communication"

Two distant labs cannot increase the entanglement between them via classical communication. However, they can do so via quantum communication. Surprisingly, the communicated system need not be entangled with either of the labs, but it must be quantum correlated with them (as determined by quantum discord). We show that quantum discord bounds the increase in entanglement achievable via quantum communication. The bound also leads to the subadditivity of entropy and gives an interpretation for negative conditional entropy.

Giacomo Mauro d'Ariano
"Physics from Informational Principles"

Recently, quantum theory has been derived from six principles of a purely informational nature. The "(epistemo)logical" nature of these principles makes them rock solid. We now want to pause and reflect on the general foundations of physics, and re-examine how solid principles such as Galilean relativity and the Einsteinian equivalence principle really are. Are they truly compelling? Why are they under dispute, and why are violations considered? Following the route of the informational paradigm, I will suggest three new candidate principles, all of an informational nature: 1) the Church–Turing–Deutsch principle, namely that the theory must allow any physical process to be simulated by a universal finite computer (this implies that the information involved in any process is locally bounded); 2) topological locality of interactions; 3) topological homogeneity of interactions. Together with the six principles for quantum theory, these suggest a new foundation of quantum field theory as the theory of quantum cellular automata. I will show how this framework can provide an extension of quantum field theory that includes localized states and observables, in which Galilean and Einsteinian covariance and other symmetries are only approximate, recovered in the field limit, and whose violation makes the extended theory in principle falsifiable. The new informational principles open totally unexpected routes to re-defining mechanical notions (such as inertial mass, the Planck constant, the Hamiltonian, and the Dirac equation as the free flow of information), to Minkowskian space-time as emergent, and to an unexpected role for the Majorana field in the solution of the so-called Feynman problem of simulating anti-commuting fields by the automaton.

Caslav Brukner
"Tests distinguishing between quantum and more general probabilistic theories"

Historical experience teaches us that every theory accepted at a certain time was later inevitably replaced by a deeper and more fundamental theory. There is no reason why quantum theory should be an exception in this respect. At present, quantum theory has been tested against very specific alternative theories, such as hidden variables, non-linear Schrödinger equations or collapse models. The common feature of all of these is that they keep one or another basic principle of the classical world intact. Yet it is very unlikely that a post-quantum theory will be based on pre-quantum concepts; rather, it is likely to break not only principles of classical physics but also those of quantum physics. This motivates the following research program: 1) reconstruct quantum mechanics from a set of axioms; 2) weaken the axioms and look for broader structures; 3) test quantum theory against them. Following this approach, I will present two tests that can distinguish between quantum theory and more general probabilistic theories.

Alexei Grinbaum
"Quantum observers and Kolmogorov complexity"

Different observers do not have to agree on how they identify a quantum system. We explore a condition based on algorithmic complexity that allows a system to be described as an objective "element of reality". We also suggest an experimental test of the hypothesis that any system, even much smaller than a human being, can be a quantum mechanical observer.




Last updated: 28-03-2012