Physics and Computation
June 28-29, 2012

Workshop "Physics and Computation"

Dates: June 28-29, 2012

Jointly organized by LARSIM and QuPa with support from ANR CausaQ

Venue: Institut Henri Poincaré

 

June 28

 

9:00-9:15 Opening / Coffee served

9:15-10:15 Dan Browne (University College London)

10:15-10:30 Coffee

10:30-11:30 Marc Kaplan (LTCI, CNRS and Télécom Paristech)

11:30-12:30 Thomas Lawson (LTCI, CNRS and Télécom Paristech)

Lunch

14:30-15:30 Karl Svozil (Technical University of Vienna)

15:30-16:30 Jeffrey Barrett (University of California, Irvine)

16:30-17:00 Coffee

17:00-18:00 Bob Coecke (University of Oxford)

 

June 29

9:15-10:15 Gilles Dowek (INRIA)

10:15-10:30 Coffee

10:30-11:30 Pablo Arrighi (Université de Grenoble)

11:30-12:30 Maël Pégny (Université Paris I and LARSIM, CEA-Saclay)

Lunch

14:30-15:30 José Félix Costa (Technical University of Lisbon)

15:30-16:30 Florent Franchette (Université Paris I)

16:30-17:00 Coffee

17:00-18:00 Olivier Bournez (Ecole Polytechnique)

 

Titles/Abstracts

Dan Browne
"Bell Inequalities and Computation"
What does it mean to violate a Bell inequality? In my talk, I will report on recent work [1,2,3] which shows that a clean operational answer to this question can be given in the framework of measurement-based quantum computation.
[1] J. Anders and D. E. Browne, Phys. Rev. Lett. 102, 050502 (2009).
[2] M. J. Hoban et al., New J. Phys. 13, 023014 (2011).
[3] M. J. Hoban and D. E. Browne, Phys. Rev. Lett. 107, 120402 (2011).
[4] H. J. Briegel et al., Nature Physics 5, 19-26 (2009).
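As background to the computational reading of Bell violations (a standard textbook illustration, not the specific constructions of [1-3]): the three-qubit GHZ state exhibits deterministic correlations that no local assignment of outcomes can reproduce,

    |\mathrm{GHZ}\rangle = (|000\rangle + |111\rangle)/\sqrt{2},
    (X \otimes X \otimes X)\,|\mathrm{GHZ}\rangle = +|\mathrm{GHZ}\rangle,
    (X \otimes Y \otimes Y)\,|\mathrm{GHZ}\rangle = (Y \otimes X \otimes Y)\,|\mathrm{GHZ}\rangle = (Y \otimes Y \otimes X)\,|\mathrm{GHZ}\rangle = -|\mathrm{GHZ}\rangle.

Local values m_X^k, m_Y^k \in \{\pm 1\} would have to satisfy m_X^1 m_X^2 m_X^3 = +1, yet multiplying the three -1 constraints forces m_X^1 m_X^2 m_X^3 = -1, a contradiction of exactly the kind that the works above reinterpret in computational terms.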

Marc Kaplan
"Simulating equatorial measurements with finite expected communication cost"
The communication cost of simulating probability distributions obtained by measuring quantum states is a natural way to quantify quantum non-locality. While much is known in the case of bipartite entanglement, little has been done in the multipartite setting. In this work, we focus on the GHZ state, for which equatorial measurements lead to correlations similar to those obtained with Bell states. We give a protocol to simulate these measurements on the n-partite GHZ state using O(n^2) bits of communication on average.
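For orientation (a standard property of the GHZ state, recalled here for context rather than claimed by the talk): if each of the n parties measures along an equatorial direction of the Bloch sphere at angle \varphi_j, the full n-party correlation of the \pm 1 outcomes is

    E(\varphi_1, \ldots, \varphi_n) = \cos(\varphi_1 + \cdots + \varphi_n),

and it is joint outcome distributions with this correlation that a classical protocol must reproduce using shared randomness and communication.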

Thomas Lawson
"Adversarial entanglement verification in realistic settings"
We address the problem of guaranteeing entanglement between several distrustful parties under various assumptions. A test is provided such that, given n parties each in possession of one part of a multipartite quantum state, an honest player can guarantee and quantify entanglement. Furthermore, this is extended to provide a guarantee for all honest parties. We then demonstrate that this model can be made viable in a realistic setting with experimental losses and errors. Finally, a novel method of witnessing entanglement when the parties' reference frames are unaligned is described. It is shown that two parties can check for entanglement even if one party's equipment is untrusted and no shared reference frame is available to them. Indeed, the lack of a shared reference frame can even contribute to the security of the protocol.

Karl Svozil 
"Can there be randomness in quantum coin tosses involving beam splitters? On the relationship between quantum randomness and Turing computability"
Although quantum randomness, or at least Turing incomputability from quantum coin tosses involving beam splitters, is often judged to be the "ultimate" randomness, and even "the message of the quantum," some questions with regard to its conceptual foundation arise. For instance, an ideal beam splitter can be modelled by a unitary transformation, which is a (Laplacian-type causal) one-to-one isometric transformation in Hilbert space. No information is lost in such devices, which are incapable of irreversible behaviour. Operationally this can be demonstrated by the serial composition of two beam splitters, which effectively constitutes a Mach-Zehnder interferometer and yields the original quantum state in its output ports. One may say that the "randomness resides" in the (classical) detection of, say, the photons after the half-silvered mirror. But, as pointed out by Everett, this is just an idealization: quantum mechanics, and in particular the unitary quantum evolution of all components involved in the detection, must, in order to be universally valid, hold uniformly, at least in principle. This also relates to the issue of "quantum jellification" posed by the late Schroedinger with regard to the coherent superposition and co-existence of classically contradictory physical states, at least in the absence of irreversible measurements. But how can irreversibility "emerge" from reversibility? Measurements may be effectively irreversible "for all practical purposes," but in principle there does not seem to exist any conceivable unitary route to irreversibility, at least if quantum theory is universally valid. One possibility to circumvent this conundrum is the postulate of "context translation," in which a mismatch between the preparation and the measurement results in the "translation" of the original information encoded by a quantum system into the answer requested, thereby introducing noise through the many degrees of freedom of a suitable "quasi-classical" measurement apparatus.
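The reversibility point can be made concrete with a toy calculation. Modelling the ideal lossless 50/50 beam splitter as a Hadamard-type unitary on the two spatial modes (one common idealization; phase conventions vary between texts), two splitters in series, i.e. a Mach-Zehnder interferometer with no relative phase, return the input state exactly, even though detection after a single splitter looks perfectly random. A minimal numpy sketch under that assumption:

    import numpy as np

    # Idealized lossless 50/50 beam splitter, modelled as a Hadamard-type
    # unitary acting on the two spatial modes (phase conventions vary).
    BS = np.array([[1.0, 1.0],
                   [1.0, -1.0]]) / np.sqrt(2)

    psi_in = np.array([1.0, 0.0])     # photon enters through the upper port

    after_one = BS @ psi_in           # equal superposition of both paths
    after_two = BS @ after_one        # second splitter (Mach-Zehnder) recombines them

    print(np.abs(after_one) ** 2)     # [0.5, 0.5]  -- looks like a fair coin toss
    print(np.abs(after_two) ** 2)     # [1.0, 0.0]  -- the input state is recovered
    print(np.allclose(BS.conj().T @ BS, np.eye(2)))   # unitarity: no information is lost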

Jeffrey Barrett
 "On the Limits of Physical Computation"
An α-Turing machine, for an arbitrary ordinal α, is a generalization of a standard Turing machine, and typically much stronger. An α-Turing machine can complete a Θ-step computation for any ordinal Θ<α. A natural question, then, concerns what α-Turing machines might be physically realizable. In general, this depends on (1) the details of one's physical theory and (2) the details of one's computational model. More specifically, there is a strict limitation on the strength of α-Turing machines that may arise from the structure of spacetime itself. In particular, one can show that an α-Turing machine is realizable in standard spacetimes only if α is at most countably infinite. Further, while there are spacetimes where uncountably infinite computations would be possible, there is good reason to suppose that such nonstandard spacetimes are nonphysical. This suggests a natural revision of Church's thesis regarding the limits of physical computation.
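For readers unfamiliar with transfinite machines, one common convention (roughly, Koepke-style ordinal machines; the α-machines of the talk may differ in detail) extends the Turing machine by a rule for limit stages: at a limit ordinal \lambda the configuration is taken to be the inferior limit of the earlier ones,

    \mathrm{cell}_i(\lambda) = \liminf_{\beta \to \lambda} \mathrm{cell}_i(\beta), \qquad \mathrm{state}(\lambda) = \liminf_{\beta \to \lambda} \mathrm{state}(\beta),

so that a computation can pass through \lambda steps and continue, which is what allows such machines to decide sets no ordinary Turing machine can.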

Bob Coecke
"Canonical complementarity, non-locality and exponential speedup in generalized process theories"
We establish a tight relationship between three key quantum-theoretical notions: non-locality, complementarity, and exponential speed-up. In particular, we establish a direct connection between Mermin-type non-locality scenarios (which we generalise to an arbitrary number of parties, systems of arbitrary dimension, and arbitrary measurements), a new, stronger `canonical' notion of complementarity, and hidden subgroup algorithms. Our derivation of the fact that canonical complementarity is a necessary condition for a Mermin scenario provides a crisp operational interpretation for canonical complementarity. We also provide a complete classification of canonical complementary observables for quantum theory. Since our main results are expressed in a diagrammatic language (that of dagger-compact categories), they can be applied outside of quantum theory, in any setting which supports the purely algebraic notion of canonical complementary observables. The diagrammatic calculus substantially simplifies (and sometimes even trivialises) many of the derivations, and provides new insights. In particular, the diagrammatic computation of correlations clearly shows how local measurements interact to yield a global overall effect. In other words, we depict non-locality.
Part of this is joint work with Ross Duncan, Aleks Kissinger and Quanlong (Harny) Wang. Based on arXiv:1203.4988.

Gilles Dowek
"The Principle of Finite Information Density"
Some arguments in favor of the hypothesis of a computable universe are based on two symmetric hypotheses: a finite velocity and a finite density of information. I will try, in this talk, to investigate the impact of the second hypothesis on the algebraic formalism used to describe quantum theory. I will compare this hypothesis to related ideas in the theory of formal languages: in both cases, a notion of an actual but modest infinity emerges.

Pablo Arrighi
"Causal Graph Dynamics"
We extend the theory of Cellular Automata to arbitrary, time-varying graphs. In other words we formalize, and prove theorems about, the intuitive idea of a labelled graph which evolves in time - but under the natural constraint that information can only ever be transmitted at a bounded speed, with respect to the distance given by the graph. The notion of translation-invariance is also generalized. The definition we provide for these `causal graph dynamics' is simple and axiomatic. The theorems we provide also show that it is robust. For instance, causal graph dynamics are stable under composition and under restriction to radius one. In the finite case some fundamental facts of Cellular Automata theory carry through: causal graph dynamics admit a characterization as continuous functions, and they are stable under inversion. The provided examples suggest a wide range of applications of this mathematical object, from complex systems science to theoretical physics.
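To convey the flavour of a radius-one, bounded-speed dynamics, here is a toy sketch of my own on a fixed graph (the talk's causal graph dynamics also let the graph itself evolve): each vertex carries a label and is updated from its own label and those of its immediate neighbours only, so influence propagates by at most one edge per time step.

    # Toy "radius-one" dynamics on a fixed graph: each vertex updates its label
    # from its own label and those of its immediate neighbours only, so
    # information propagates at bounded speed with respect to graph distance.
    def step(labels, adjacency):
        new = {}
        for v, nbrs in adjacency.items():
            neighbourhood = [labels[v]] + [labels[u] for u in nbrs]
            new[v] = sum(neighbourhood) % 2      # an arbitrary local rule (parity)
        return new

    # A small cycle graph; a single '1' spreads one vertex per step in each direction.
    adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    labels = {0: 1, 1: 0, 2: 0, 3: 0}
    for t in range(3):
        print(t, labels)
        labels = step(labels, adjacency)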

Maël Pégny
"Aaronson's No Supersearch Principle: Its Meaning for Physics and Complexity theory"
In recent years, the MIT quantum complexity theorist S. Aaronson has defended the view that physicists should adopt a new physical principle, the No SuperSearch Principle: there is no physical means to solve NP-complete problems in polynomial time. I will discuss this proposition and try to guess what it might reveal about the relationships between physics and complexity theory.

José Félix Costa
"The Computational Power of Experiments in Physics"
Earlier we developed a theory of combining algorithms with physical systems, based upon using physical experiments as oracles to algorithms. Although our concepts and methods are general, each physical oracle requires its own analysis based upon some fragment of physical theory that specifies the equipment and its behaviour. For specific examples of physical systems (mechanical, optical, electrical), the computational power has been characterised using non-uniform complexity classes. The power of the known examples varies according to assumptions on precision and timing, but seems to lead to the same complexity classes, namely P/log* and BPP//log*. In this talk we address sets of axioms for the interface between physical equipment and algorithms that allow us to prove general characterisations, in terms of P/log* and BPP//log*, for large classes of physical oracles, in a uniform way. Sufficient conditions on physical equipment are given that ensure a physical system satisfies the axioms. Joint work with Edwin Beggs and John V. Tucker.
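For reference, the non-uniform classes mentioned use logarithmic-size prefix advice; in the usual formulation from this line of work (recalled from memory, so details may differ from the talk's exact definitions):

    A \in \mathrm{P/log*} \iff \exists\ \text{poly-time } M \text{ and advice } f \text{ with } |f(n)| = O(\log n),\ f(n) \text{ a prefix of } f(n+1),
    \text{such that for all } n \text{ and all } x \text{ with } |x| \le n:\ x \in A \iff M(x, f(n)) \text{ accepts},

with BPP//log* the analogous class in which M is probabilistic and must answer correctly with probability at least, say, 3/4 on every such input.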

Florent Franchette
"Is the ‘verification problem’ a true problem for hypercomputation?"
Alan Turing devised the Turing machine, which is the formalisation in logic of the notion of a computable function. Turing is also behind another model, the "oracle Turing machine", which is able to compute more functions than the Turing machine; such computation beyond the Turing limit is called "hypercomputation". Although many hypercomputation models have been devised, the notion of hypercomputation is not fully accepted by scientists and philosophers. The debate concerns the claim that it is possible to physically build a hypercomputation model (the "hypercomputation thesis"). To prove this thesis, one possible strategy could be to physically build an oracle Turing machine. However, there is a recurring epistemological problem about the physical construction of this hypercomputation model. This problem, called the "verification problem", may be phrased as follows: even if we assume that an oracle Turing machine has been physically built, it would be impossible to verify that this machine is able to compute a non-Turing-computable function. In my presentation I will propose an analysis of the verification problem in order to show that it does not actually dispute the strategy of physically constructing an oracle Turing machine.
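The epistemic worry can be phrased in computational terms. An oracle machine is an ordinary machine enriched with one extra primitive, a query to a set it need not be able to compute itself; from the outside, only finitely many input/output pairs are ever observed. A minimal sketch (hypothetical names, purely illustrative):

    # Minimal "oracle machine" sketch: the machine's power is entirely fixed by
    # the callable handed to it as the oracle.  From finitely many observed runs
    # one cannot tell whether that callable answers a non-Turing-computable
    # question or merely agrees with such an oracle on the instances tested.
    def oracle_machine(x, oracle):
        query = x.strip()          # some ordinary, computable pre-processing
        return oracle(query)       # the oracle call, treated as a primitive step

    # A computable stand-in: a finite lookup table posing as a halting oracle.
    finite_table = {"program_1": True, "program_2": False}
    fake_halting_oracle = lambda q: finite_table.get(q, False)

    print(oracle_machine("program_1", fake_halting_oracle))   # True

On any finite set of test runs, a genuinely hypercomputational oracle and a suitably chosen computable stand-in are observationally indistinguishable, which is one way of stating the verification problem.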

Olivier Bournez
"Comparing analog models of computation to digital models of computation"
In this talk, we will survey a few results comparing some analog models of computation to classical modern digital models such as Turing machines. The analog models considered will include the General Purpose Analog Computer of Claude Shannon, a model of the Differential Analysers first built in the 1930s. We will ask whether these models satisfy the Church-Turing thesis, as well as the Effective Church-Turing Thesis. This will lead us to talk about the complexity of solving differential equations using digital means.
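As a small, generic illustration of that last point (not the analysis of the talk): the simplest GPAC circuit, a single integrator fed back into itself, generates the differential equation y' = y, and a digital machine can only approximate its output by discretizing time, e.g. with Euler steps whose number controls the accuracy.

    import math

    # The simplest GPAC circuit -- one integrator wired back to its own input --
    # generates y'(t) = y(t), y(0) = 1, i.e. the exponential function.
    # A digital simulation must discretize time; the step count controls accuracy
    # (and is where complexity questions about digital simulation enter).
    def euler_exponential(t_end, steps):
        h = t_end / steps
        y = 1.0
        for _ in range(steps):
            y += h * y          # Euler update for y' = y
        return y

    print(euler_exponential(1.0, 10))       # ~2.5937
    print(euler_exponential(1.0, 100000))   # ~2.71827
    print(math.e)                           # 2.718281828...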

 

Organization: Damian Markham, Eleni Diamanti, Alexei Grinbaum

 

 
