Jean-Jacques Lévy's Bio

  • I worked in Operating Systems [1969-72], Semantics of Programming Languages and Syntax of the lambda-calculus [1972-78], Term Rewriting Systems [1978-1980], CAD for VLSI [1981-84], the sm90 Unix kernel [1985-87], Parallel Functional Languages [1987-93], and Distribution and Mobility [1994-2000]. I taught at Ecole Polytechnique (Palaiseau, France) [1989-2006] and managed the new Microsoft Research-INRIA Joint Centre in Saclay [2006-2012]. I have been an INRIA Emeritus researcher since 2012. I joined the Pi.r2 Inria research-team [2014] at PPS (Université Paris-Diderot) and now work on the lambda-calculus and formal proofs of programs.

  • Lambda-calculus:
    This area is related to Mathematical Logic and Models of Computation. Church invented the lambda-calculus in the 1930s as a model of computation based on functions. For instance, the identity function I is such that I(x) = x for all x; therefore I(I) = I. Similarly, the constant function K is such that K(x,y) = x for all x and y; hence K(K,I) = K. Again, let D(x) = x(x) for all x; then D(D) = D(D)! These equations are nonsense in standard mathematics, where a function cannot be applied to itself, but in computer science they look natural: the evaluation of D(D) is simply a loop. Church proved that the lambda-calculus is as expressive as the other models of computation, a striking result for a calculus built from functions alone. This led to his famous Church-Turing thesis, claiming that all general models of computation have the same expressive power. Yet the lambda-calculus remained an obscure object reserved to logicians for a long period. In 1970, Scott renewed interest in the calculus by constructing its first model (in the logical sense). This was the starting point for the study of the meaning of programs and the denotational semantics of programming languages. Scott's idea was to incorporate non-termination inside functions at every order of functionality and to restrict denotations to "continuous" functions. Scott's model was also a strong hint for the discovery of new properties of the syntax of the lambda-calculus.
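The combinators above can be transcribed directly into any language with first-class functions. A minimal Python sketch (an illustration only, using Python's untyped lambdas to mirror the untyped calculus; K is curried, so K(x,y) is written K(x)(y)):

```python
# I(x) = x: the identity function
I = lambda x: x

# K(x, y) = x: the constant function, written in curried form
K = lambda x: lambda y: x

# D(x) = x(x): self-application, impossible in set-based mathematics
D = lambda x: x(x)

print(I(I) is I)      # I(I) = I       -> True
print(K(K)(I) is K)   # K(K, I) = K    -> True

# D(D) reduces to itself: evaluation loops until Python's
# recursion limit stops it.
try:
    D(D)
except RecursionError:
    print("D(D) loops")
```

Running D(D) makes the looping behavior concrete: each call to D immediately applies its argument to itself, so the evaluation never reaches a result.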

  • Optimal reductions in the lambda-calculus:
    My PhD work focused on safe and optimal evaluation strategies for the lambda-calculus. Safe computations are terminating; optimal computations minimize the number of computation steps to the result. For the latter, the difficulty is to define and implement sharing in the evaluation of functions. In my PhD, I designed a theory of Redex Families which extends the classic notion of residuals and characterizes the redexes (reducible expressions) to share. These families are invariant under permutations of reductions, which themselves form another interesting theory. The so-called labeled lambda-calculus gives computable names to redex families. But optimal strategies were difficult to implement until 1990, when Lamping provided a solution. With Abadi and Gonthier, we simplified Lamping's algorithm by relating it to Girard's geometry of interaction. In practice, these optimal strategies correspond (for weaker calculi) to the lazy evaluators of functional programming languages such as Haskell or LML. My work on reduction strategies is the basis of chapter 13 of Barendregt's reference book on the syntax and semantics of the lambda-calculus and of chapter 8 of Klop's Terese book. I also worked on free models with Böhm trees and proved the continuity theorem, as a corollary of the completeness of inside-out strategies.

  • From the lambda-calculus to other calculi:
    I was a promoter (with Abadi, Cardelli and Curien) of Explicit Substitutions [1990]. It is a refinement of the lambda-calculus, studied at length by several authors, mainly to model the run-time environments of programming languages. Another calculus related to my labeled lambda-calculus was the calculus of dependencies and incremental computations with Abadi and Lampson [1997], which was implemented and patented in the DEC/SRC Vesta makefile system (Levin et al). This dependency calculus has also been used and extended in various works on Information Flow for the Security of Computing Systems. With Huet, we ported these results about the lambda-calculus to the general theory of Term Rewriting Systems [1980] and developed strongly sequential systems, a basis for further theories of efficient pattern matching in programming languages such as ML or Haskell. Klop wrote a book on these sequential systems. Finally, Boudol, Castellani and Laneve showed the connection between permutations of reductions and Winskel's theory of event structures in concurrent systems. Indeed, the labeled calculus reflects causality within various models of computation (e.g. the reversible calculi for biology of Danos and Krivine).

  • Concurrency and distributed systems:
    Milner invented calculi to model concurrent processes, namely programs with several processes running in parallel. All computer scientists know that concurrency is a nightmare where debugging is Mission Impossible; concurrency is therefore a wide area of research in search of the right tools for writing correct programs. Our contribution was the Join Calculus, with Fournet, Gonthier, Maranget and Rémy [1997]. The Join Calculus is claimed to be the right language for abstract distributed implementations of Milner's pi-calculus, avoiding the distributed consensus problem. In this calculus, synchronization is based on Join Patterns. The Join Calculus led to the JoCaml implementation by Le Fessant and Maranget on top of OCaml, to Polyphonic C# by Russo on top of C# 2.0, and to the Join Patterns of Visual Basic 9.0 included in Microsoft Visual Studio. Another output of the Join Calculus was the applied pi-calculus of Abadi and Fournet, used in the study of security protocols.
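A join pattern synchronizes on several channels at once: the reaction body fires only when a message is pending on every channel of the pattern, consuming one message from each. A minimal single-threaded Python sketch of the idea (the Join2 class and its send_a/send_b methods are a hypothetical toy API, not JoCaml or Polyphonic C# syntax):

```python
from collections import deque

# Two-channel join pattern: the reaction fires only when both channels
# hold a pending message, atomically consuming one from each.
class Join2:
    def __init__(self, reaction):
        self.a, self.b = deque(), deque()
        self.reaction = reaction

    def _try_fire(self):
        while self.a and self.b:
            self.reaction(self.a.popleft(), self.b.popleft())

    def send_a(self, msg):
        self.a.append(msg)
        self._try_fire()

    def send_b(self, msg):
        self.b.append(msg)
        self._try_fire()

log = []
j = Join2(lambda x, y: log.append(x + y))
j.send_a(1)      # nothing fires yet: channel b is empty
j.send_b(41)     # both channels now hold a message: reaction fires
print(log)       # -> [42]
```

In JoCaml the same synchronization is declared as a single join definition over the two channels; the runtime then handles the matching concurrently, which is exactly the part this sequential sketch elides.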

  • Real Programming languages:
    I was lucky to work in a central area with applications to compilers, run-time systems and the design of programming languages. Functional languages (Lisp, Haskell, OCaml, ML, F#, Vesta, etc.) use pattern matching and higher-order functions. The semantics of imperative and object-oriented programming languages is also influenced by the semantics of the lambda-calculus. The study of evaluation rules in the lambda-calculus suggests optimizations for the efficiency of modern proof assistants such as Coq, HOL or Isabelle. More generally, the lambda-calculus makes a tight connection between computer science and mathematical logic. As a practical outcome, in my Para group at INRIA, we developed several concurrent implementations of functional programming languages (Concurrent LML, Concurrent Caml, Jocaml). In 1996, Gilles Kahn asked us to work on the debugging of the embedded software of the Ariane 5 rocket after its explosion due to a software bug (a loss of about 300 million euros). At EADS (Les Mureaux, France), we made a static analysis of concurrent accesses to the numerous shared variables of the on-board flight programs. This successful Ada code inspection was based on the fantastic alias analyzer of Deutsch and the outstanding review of Gonthier. We were declared experts of space programs: I headed an international review of the embedded code of the ISS Columbus European module in 1997 at Matra Marconi Space (Toulouse) and DASA (Bremen); we also contributed to the new programming rules for CNES and ESA!

  • More general programming:
    A less important part of my research has been related to Operating Systems, Interactive Graphics and CAD. In my early INRIA days, I wrote the LP70 compiler (LP70 was the implementation language of the Esope time-sharing system) [1970] and a text editor for Esope [1971]. Later I wrote a raster-op graphics package for LeLisp [1983], the Luciole layout editor for VLSI circuits [1984], also written in LeLisp, and a Unix event driver (multiplexing keyboard and mouse events) for the French sm90 workstation [1986]. Although less prestigious than theoretical papers, I have always considered this part of my work to be among my most difficult achievements.

  • Program verification checked by computer:
    In my research teams at INRIA and MSR-INRIA, we have experience with very long computer-checked proofs: the correctness of the unobtrusive concurrent garbage collector of Concurrent Caml (proved with the Larch prover by Doligez and Gonthier), the four-color theorem in planar graph theory (proved with Coq by Gonthier), and the Feit-Thompson theorem in finite group theory (proved with Coq/Ssreflect by Gonthier and his Math-Components group). There are long proofs in mathematics, and there are long proofs in computer science showing properties of programs or the security of software. Our goal is to make critical software provably 100% correct. My current research aims at correctness proofs of standard algorithms, totally checked by computer in Why3 + Coq/Ssreflect. Why3 is (in my opinion) the best system (with Frama-C, Spec#, F*) for generating proof obligations on top of a programming language with Hoare logic assertions. I want to study the feasibility of mixing automatic tools and interactive systems to prove properties of standard algorithms (as already achieved in Filliâtre's repository).

  • Teaching:
    At Ecole Polytechnique, I was in charge (with Cori) of the large first-year Programming Course ("Algorithmes et Programmation", 430 students/year) [1991-2000]; I taught the second-year course on the Basics of Computer Science ("Informatique Fondamentale", 200 students/year) [2000-2006] and several third-year courses in the Majeures; I coordinated the Entrance CS Examination for 13 years. I also gave graduate-level courses at the MPRI (Univ. of Paris 7).

  • Administration:
    I served on the program committees of POPL'91 (Orlando), PLDI'92 (Toronto), FPCA'93 (Copenhagen), LICS'94 (Paris), ACSC Asian'95 and '96 (Pathumthani and Singapore), TACS'97 (Sendai), PLILP'98 (Pisa) and ICALP'98 (Aalborg); I co-chaired IFIP TCS 2004 (Toulouse) with E. Mayr; I organized the ICALP'01 BOTH workshop (Heraklion). I have been a member of the Scientific Council of the Fondation Sciences Mathématiques de Paris [2007-2010]; coordinator of three European projects (Confer 1-2-3); head of the INRIA research-teams Para [1988-2000] and Moscova [2004, 2012]; chairman of the INRIA-Rocquencourt Comité des projets [1994-1997]; and vice-president of the INRIA Commission d'Evaluation [1997-2000]. I am also proud of having managed the new Microsoft Research-INRIA Joint Centre in Saclay, France [2006-2012].

  • CV:
    I was born in Toulouse; I live in Paris and have two sons (Rémi, a lawyer, and Laurent, a math teacher). I was educated in Nancy; graduated from Ecole Polytechnique [1968]; PhD at University of Paris 7 [1978]; researcher at INRIA [1970-2012], first junior (Chargé de Recherche) then senior researcher (Directeur de Recherche); Member of the Research Staff at Digital Equipment Corporation (PRL) [1987-1989]; Professor, Ecole Polytechnique, Palaiseau [1992-2006]; Director of the Microsoft Research-INRIA Joint Centre, Saclay [2006-2012]; INRIA Emeritus [2012]; Visiting Professor at the Chinese Academy of Sciences (SKLCS-ISCAS, Beijing) [2013-2014]; consultant at Xerox PARC (Palo Alto, USA) [1984] and DEC/SRC (Palo Alto) [1990].