## Types 2003 Schedule

### Wednesday, April 30

** 9: Opening of the congress **
** Programming Language Design. **
9.30-10. Containers and their derivatives. * Thorsten Altenkirch *.

* We shall give an overview of some recent joint work
(FoSSaCS 03, TLCA 03) with Michael Abbott, Neil Ghani and Conor
McBride on container types. We shall also discuss Conor's notion of a
derivative of a type in this context. *
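The abstract leaves the notion of derivative implicit; as a hedged illustration (my own example, not taken from the talk), the derivative of a data type yields its one-hole contexts, which for lists gives the familiar zipper:

```haskell
-- One-hole contexts: the "derivative" of the list type.
-- A one-hole context in a list is a pair of lists: the elements
-- before the hole (stored reversed) and the elements after it.
data ListCtx a = ListCtx [a] [a] deriving (Eq, Show)

-- Plugging an element back into the hole recovers a list.
plug :: ListCtx a -> a -> [a]
plug (ListCtx before after) x = reverse before ++ [x] ++ after

-- Splitting a list at every position yields all (context, element) pairs.
holes :: [a] -> [(ListCtx a, a)]
holes = go []
  where
    go _      []     = []
    go before (x:xs) = (ListCtx before xs, x) : go (x:before) xs
```

Plugging each hole back gives the original list, which is the defining property of a one-hole context.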

10-10.30. Inductive families need not store their indices. * Edwin Brady *.

* Dependently typed programming presents many advantages to the programmer, including type safety, additional structural information and the ability to prove properties of programs within the language. However, these advantages come with run-time overheads. In particular, information which is used to check the type correctness of a program is unused at run time, but is still stored in the program. For example, the indices of an inductive family must be stored at run time if we are to use the structural information they give. This talk is about such run-time issues in dependently typed programs. I will show how program transformations can change inefficient but proof-friendly data structures into more efficient run-time structures. I will give natural numbers and the simply typed lambda calculus as examples of structures which have more efficient run-time representations than those given by the programmer. *
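As a hedged illustration of the idea (in Haskell with GADTs, rather than the talk's own setting): the length index of a vector is fully determined by its spine, so erasing it to a plain list loses no run-time information:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Type-level natural numbers used as indices.
data Nat = Z | S Nat

-- A length-indexed vector: the index n is checked at compile time.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- At run time the index carries no information beyond the spine of
-- the vector itself, so "forgetting" it loses nothing.
forget :: Vec n a -> [a]
forget VNil         = []
forget (VCons x xs) = x : forget xs
</imports>
```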

* 10.30-11. Coffee break. *

11-11.30. Epigram: an overview. * Conor McBride *. Co-author: James McKinna.

* Epigram is a platform for dependently typed programming. Its programming language is based on that defined in `The view from the left' (McBride & McKinna, 2003), a high-level programming syntax which elaborates incrementally into a conventional type theory based on Luo's UTT. This elaboration proceeds by mapping programming constructs onto tactics in an interactive theorem prover, hence both interactive and `batch-mode' development are supported. Epigram supports inductive families with freely chosen indices. Its pattern-matching programs are constructed internally to the type theory, with respect not to a hard-wired notion of constructor case, but to the types of arbitrary elimination operators programmable by the user. This presents us with a wide variety of data structures and an enriched language for programming with them. Epigram is thus a platform for experimenting with new kinds of programming. *

11.30-12. Type Inference for Nested Self Types. * Viviana Bono *. Co-authors: Jerzy Tiuryn, Pawel Urzyczyn.

* We address the issue of decidability of the type inference problem for a type system of an object-oriented calculus with general self types. The fragment considered in the present paper is obtained by restricting the set of operators to method invocation only. The resulting system, despite its syntactic simplicity, is sufficiently complicated to merit the study of the intricate constraints emerging in the process of type reconstruction. The main result of the paper is the decidability of type reconstruction, together with a certain form of principal type property.
*

12-12.30. Specification and Verification of Programs With Nested Datatypes: Illuminating Examples. * Ralph Matthes *, University of Munich.

* Nested datatypes such as powerlists or higher-order de Bruijn representations of lambda calculi are families of inductive or coinductive types, indexed over types, where the defining clauses refer to other members of the family, typically with a more complex index. In joint work with A. Abel and T. Uustalu, the author has developed programming schemes within the framework of higher-order parametric polymorphism, i.e., Girard's system F^omega. The question arises how to reason within this syntactical framework about the correctness of programs. The main obstacle is the following: one cannot program a function for one of the family members in isolation, although one often only wants to use a specific one. Consequently, a specification for the other members is missing. But since verification likewise cannot process one of the datatypes in isolation, these specifications have to be provided.
*
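The higher-order de Bruijn representation mentioned above can be sketched as a Haskell nested datatype (an illustration of the phenomenon, not the authors' F^omega development). Note how a function such as renaming cannot be written for one family member in isolation: the recursive call under a binder is at a different member of the family.

```haskell
-- A higher-order de Bruijn representation of lambda terms as a
-- nested datatype: under a Lam, the type of free variables grows
-- by one (Nothing is the newly bound variable).
data Term a = Var a
            | App (Term a) (Term a)
            | Lam (Term (Maybe a))
  deriving (Eq, Show)

-- Renaming must be defined for the whole family at once: the
-- recursive call under Lam is at type Term (Maybe a), not Term a.
rename :: (a -> b) -> Term a -> Term b
rename f (Var x)   = Var (f x)
rename f (App s t) = App (rename f s) (rename f t)
rename f (Lam t)   = Lam (rename (fmap f) t)
```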

* 12.30-14 Lunch *

** Formalizing Theories. **
14-14.30. Formalization of Imperative Object-Calculi in (Co)Inductive Type Theories. * Alberto Ciaffaglione *. Co-authors: Luigi Liquori, Marino Miculan.

* We describe a formal development of the (meta)theory of Abadi and Cardelli's impsigma, an object-based calculus with side effects, in the proof assistant Coq. In order to ease the formal development of the theory of impsigma, we reformulate the static and dynamic semantics, taking full advantage of the features provided by CC(Co)Ind, the coinductive type theory underlying Coq. The new presentation is thus in the style of Natural Deduction Semantics, the counterpart of Kahn's Natural Semantics in Natural Deduction style, using higher-order abstract syntax and hypothetical-general premises a la Martin-L\"of. For a significant fragment of impsigma, we can even use coinductive typing systems, avoiding having to deal with ``store types'', thus simplifying the formal proofs of subject reduction and type soundness. The solutions we have devised in the encoding of, and metareasoning on, impsigma can be readily applied to other imperative calculi featuring similar issues.
*

14.30-15: C-CoRN, the Constructive Coq Repository of Nijmegen. * Luís Cruz-Filipe *. Co-author: Herman Geuvers.

* We will present C-CoRN, a documented library of constructive mathematics formalized in Coq, maintained by the Foundations group of the Computer Science Department of the University of Nijmegen. In the talk we will first present the motivations behind this activity (why formalize mathematics on a computer, why work constructively, why build a big library?). Then we will describe the setup of the repository and the way it is organised: the repository is "open" in the sense that people can download it and are invited to contribute to it. However, we only want to add contributions that obey certain standards, which we will describe in the talk. Finally we will give an overview of the current content of the repository and the developments we plan for the near future.
*

15-15.30. Typed Ambients in Coq. * Ivan Scagnetto *. Co-author: Marino Miculan.

* We present an implementation and formal development of core properties of the Typed Ambient Calculus [CGG00] in Coq, using higher-order abstract syntax and natural deduction semantics. Similarly to the untyped case [SM02], the higher-order abstract syntax approach allows us to formally represent the calculus in a cleaner way, focusing only on the essential features and delegating to the metalanguage the machinery of alpha-conversion, capture-avoiding substitution of names for names, and schemata instantiation. In order to give a smooth formal representation of the typing judgment, we first rephrase the original system in Natural Deduction style. This allows us to delegate to the Coq metalanguage the (tedious) treatment of typing environments and the related machinery (e.g. swapping of type declarations). As a consequence, the formal proof of Subject Reduction is substantially simplified, since the number of auxiliary lemmata is greatly reduced and, moreover, we can avoid an explicit treatment of the appearance of new groups during reduction. The core of the formal development is built upon some general rewriting results stating the invariance of the typing judgment w.r.t. the replacement of names with fresh ones having the same type. This is where the Theory of Contexts [HMS01] plays a fundamental role, allowing a smooth treatment of those structural rewriting lemmata. This work allowed us to discover and fix a previously unknown (minor?) flaw in the original presentation of Typed Ambients. *
** References: **
[CGG00] Luca Cardelli, Giorgio Ghelli, and Andrew D. Gordon
Types for the Ambient Calculus.
To Appear in I&C special issue on TCS'2000.
[HMS01] Furio Honsell, Marino Miculan, Ivan Scagnetto
An axiomatic approach to metareasoning on systems in
higher-order abstract syntax.
In Proceedings of ICALP'01, Number 2076 in Lecture Notes in
Computer Science, pages 963-978, 2001.
[SM02] Ivan Scagnetto and Marino Miculan
Ambient Calculus and its Logic in the Calculus of Inductive
Constructions.
In Proceedings of LFM 2002. ENTCS 70.2, Elsevier, 2002.

* 15.30-16. Coffee break. *

16-16.30. An operational formalization of square matrices in Coq. * Nicolas Magaud * (INRIA Sophia).

* In this work, we build an operational formalization of square matrices. (m,n)-matrices are represented as vectors of length n; each vector (a row) is itself a vector whose length is m. Vectors are actually implemented as dependent lists. Square matrices are simply (m,n)-matrices with n=m. We define basic operations for this datatype (addition, product, neutral elements O_n and I_n). We then prove the ring properties for these operations. The development uses Coq modules to specify the interface (the ring structure properties) as a signature. This development deals with dependent types and partial functions. Most of the functions are defined by dependent case analysis, and some functions, such as getting a column, require the use of preconditions (to check whether we are within the bounds of the matrix).
*
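The Coq development relies on dependent lists; as a rough, non-dependent Haskell sketch of the same operations (my own illustration: here the well-formedness invariant that all rows have equal length is maintained by hand rather than by the types):

```haskell
import Data.List (transpose)

-- A matrix as a list of rows; the invariant that every row has the
-- same length is a comment here, where Coq would enforce it by typing.
type Matrix a = [[a]]

-- Pointwise addition of two matrices of the same shape.
madd :: Num a => Matrix a -> Matrix a -> Matrix a
madd = zipWith (zipWith (+))

-- The identity matrix I_n.
identityM :: Num a => Int -> Matrix a
identityM n = [ [ if i == j then 1 else 0 | j <- [1..n] ] | i <- [1..n] ]

-- Matrix product, pairing each row with each column of the second factor.
mmul :: Num a => Matrix a -> Matrix a -> Matrix a
mmul a b = [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]
```

The ring laws the abstract mentions (e.g. I_n as a neutral element for the product) become propositions about these functions.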

16.30-17. Nested Data Types in Constructive Type Theory. * Peter Aczel *.

* In a recent paper Abel, Matthes and Uustalu have shown how to
represent nested data types of any finite rank in the impredicative
system F^omega. I will start by presenting what seems to be a
simplification of their representation. In the main part of the talk I
will review a few of the natural examples of nested data types and
outline how they can be represented in a rather weak predicative
dependent type theory that has a scheme for introducing a restricted
kind of finitary inductive indexed type. It would seem that all the
natural examples of nested data types that have been considered so
far can be represented in the same way. *

### Thursday, May 1

** Logical Frameworks. **
9-10. Invited talk: A Concurrent Logical Framework. * Frank Pfenning *.

* The logical framework LF has been designed to support the representation of judgments as types and deductions as objects. In prior work we have extended this in the linear logical framework (LLF) to represent state via linear hypotheses and imperative computations as linear objects. In this talk we highlight some of the inadequacies of LLF as a framework for concurrency and show how to extend it to incorporate further connectives from intuitionistic linear logic to overcome them. The resulting concurrent logical framework (CLF) uses a monad in order to ensure conservativity over LF and LLF and to obtain a practical definitional equality which identifies computations that differ only in the order of independent steps. The underlying representation principle is that concurrent computations become monadic expressions. We illustrate the expressive power of CLF through several examples.
This talk describes joint work with Kevin Watkins, Iliano Cervesato (NRL), and David Walker (Princeton).
*

10-10.30. A Logical Framework with Dependently Typed Records. Authors: * Coquand, Pollack, Takeyama *.

* Our long-term goal is a system to formally represent complex structured mathematical objects, and proofs and computation on such objects; e.g. a foundational computer algebra system. Our approach is informed by the long development of module systems for functional programming based on dependent record types as signatures. For our logical purposes, we use a dependently typed base language as well. In this paper we propose an extension of Martin-Lof's logical framework with dependently typed records, and present the semantic foundation and the typechecking algorithm of our system. While the type theory has eta-conversion, a unit type, singleton types and records with surjective pairing, the semantics of types is PERs over pure untyped lambda terms with equality up to beta-conversion only. Some of the work is formally checked in Coq. We have also implemented and experimented with several related systems. Our proposal combines a semantic foundation, provably sound typechecking, good expressiveness (e.g. subtyping, sharing) and first-class higher-order modules.
*

* 10.30-11. Coffee break. *


11-11.30. PAL+, a lambda-free logical framework, and its implementation.
* P.C. Callaghan, Yong Luo, and Z. Luo *.

* Logical frameworks such as that developed by Per Martin-Lof may be viewed as foundations for type theory and the associated proof systems. We shall give an introduction to PAL+, a lambda-free logical framework, and explain that it is an adequate calculus for specifying type theories and a useful foundation for designing proof development systems based on type theory. Recent work on a prototype implementation of PAL+ will be discussed.
*

11.30-12. A Modular Approach to Logical Frameworks. * Robin Adams *.

* It is not surprising that certain features of a logical framework, such as local and global definitions, are 'optional', in the sense that the system that includes such a feature is a conservative extension of the one that does not. Perhaps more surprising is that judgemental equality and first-class lambda abstractions are optional in this sense, and that removing them may have several advantages over including them. In this talk, I shall present a hierarchy of logical frameworks constructed by choosing a set of features to add to a basic type-assignment system, and show that several well-known logical frameworks, including ELF and PAL+, are among the systems that appear in this hierarchy.
*

12-12.30. A Framework for Typed HOAS and Semantics.
* Marino Miculan and Ivan Scagnetto *.

* We investigate a framework for representing and reasoning about syntactic and semantic aspects of typed languages with variable binders. First, we introduce typed binding signatures and develop a theory of typed abstract syntax with binders. Each signature is associated with a category of ``presentation'' models, where the language of the typed signature is the initial model. At the semantic level, types can also be given a computational meaning in a (possibly different) semantic category. In general, this category is connected to the presentation category by an adjunction, which allows for reflecting at the syntactic level semantic aspects of terms and variables. Thus, the category of presentation models is expressive enough to represent both the syntactic and the semantic aspects of languages. We then introduce a metalogical system, based on the internal language of the presentation category, which can be used for reasoning about both the syntax and the semantics of languages. The metalogical system is composed of a core equational logic tailored for reasoning about the syntactic aspects; when a specific semantics is chosen, the system can be modularly extended with further ``semantic'' notions, as needed.
*

* 12.30-14 Lunch *

** Type System Design. **
14-14.30. Modules in Coq with rewriting. * Jacek Chrzaszcz *.

* We add ML-style modules to a calculus consisting of an arbitrary pure type system, inductive types and definitions by rewriting. We show that the module extension is conservative, provided the syntactic conditions on inductive types and definitions by rewriting are closed under substitutions of variables for variables. Moreover, any typing algorithm for the system without modules can be extended to account for modules as well. The syntactic restrictions on module expressions are similar to those presented by Xavier Leroy at POPL'94. The theoretical analysis is inspired by the work of Judicael Courant. The presented module system has been successfully implemented in the Coq proof assistant version 7.4. It is compatible with a future extension of Coq with a rewriting mechanism.
*

14.30-15. Combining Testing and Proving in Dependent Type Theory.
* Qiao Haiyan *. Co-authors: Peter Dybjer, Makoto Takeyama

* We extend the proof assistant Agda/Alfa for dependent type theory with a modified version of Claessen and Hughes' tool QuickCheck for random testing of functional programs. In this way we combine testing and proving in one system. Testing is used for debugging programs and specifications before a proof is attempted. Furthermore, we demonstrate by example how testing can be used repeatedly during a proof to test suitable subgoals. Our tool uses test data generators which are defined inside Agda/Alfa. We can therefore use the type system to prove properties about them, in particular surjectivity, stating that all possible test cases can indeed be generated.
*
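The Agda/Alfa integration itself cannot be reproduced here; as a hedged stand-in, the core idea of checking a property against generated test data before attempting a proof can be sketched in plain Haskell (exhaustively over small cases rather than randomly, to keep the sketch dependency-free):

```haskell
-- A minimal stand-in for property-based testing in the spirit of
-- QuickCheck, but exhaustive over small cases: a hand-written
-- generator enumerates all Boolean lists up to a given length.
smallLists :: Int -> [[Bool]]
smallLists 0 = [[]]
smallLists n = smallLists (n - 1)
            ++ [ b : xs | b <- [False, True], xs <- smallLists (n - 1) ]

-- A property passes if it holds for every generated test case.
checkProp :: ([Bool] -> Bool) -> Int -> Bool
checkProp prop n = all prop (smallLists n)

-- The property under test: reverse is an involution.
propRevRev :: [Bool] -> Bool
propRevRev xs = reverse (reverse xs) == xs
```

A failed check yields a counterexample before any proof effort is spent; a passed check makes attempting the proof worthwhile.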

15-15.30. Induction and Co-induction in Sequent Calculus.
* Alberto Momigliano *. Co-authors: Dale Miller, Alwen Tiu

* Proof search has been used to specify a wide range of computation systems. In order to build a framework for reasoning about such specifications, we make use of a sequent calculus involving induction and co-induction. These proof principles are based on a proof-theoretic notion of definition, following work by Hallnas & Schroeder-Heister, Girard, and McDowell & Miller. Definitions are essentially logic programs which are stratified so as to avoid cyclic calling through negations. The introduction rules for defined atoms treat the definitions as defining fixed points. The use of definitions also makes it possible to reason intensionally about syntax, since the inference rules that introduce defined atoms make use of unification and matching, and both of these operations can be used on first-order terms or higher-order lambda-terms. The full system thus allows inductive and co-inductive proofs involving higher-order abstract syntax. We extended the earlier work by allowing induction on general definitions and show that cut-elimination holds for this extension. For example, proofs about the operational semantics for mini-ML are quite natural using such induction. We extend the logic further by adding co-induction rules. To maintain consistency, mutually recursive definitions involving induction and co-induction need to be avoided. While we have proved cut-elimination for some interesting uses of co-induction, we have not yet proved that it holds for the full system. We present some examples involving bisimulation. A prototype implementation of much of this logic is available in the Hybrid system implemented on top of Isabelle/HOL. We consider two additional extensions to this proof system. One is the use of circular proofs as a way to lazily determine the post-fixpoint to use in co-inductive proofs. Another is to integrate the nabla quantifier, a recently proposed quantifier that provides a direct way to deal internally with the notion of freshness, thereby allowing proofs involving higher-order abstract syntax to be used in an even more direct way.
*

* 15.30-16: Coffee break. *

16-16.30. Why: a multi-language multi-prover verification tool.
* Jean-Christophe Filliatre * (LRI, Université Paris Sud)

* This talk introduces the verification tool Why. This tool produces verification conditions from annotated programs given as input. It differs from other systems in that it accepts several input languages (currently C and ML, and Java with the help of the companion tool Krakatoa) and outputs conditions for several existing provers (currently Coq, PVS, HOL Light and haRVey). It also provides strong safety guarantees through a de Bruijn criterion: once the obligations are established, a proof that the program satisfies its specification is built and type-checked automatically.
*

16.30-17. Rank 2 intersection types for modules. * Ferruccio Damiani *.

* We propose a rank 2 intersection type system for a language of
modules built on a core ML-like language. The principal typing
property of the rank 2 intersection type system for the core
language plays a crucial role in the design of the type system
for the module language. We first consider a "plain" notion of
module, where a module is just a set of mutually recursive
top-level definitions, and illustrate the notions of:
- module intrachecking (each module is typechecked in isolation
and its interface, that is the set of typings of the defined
identifiers, is inferred);
- interface interchecking (when linking modules, typechecking is
done just by looking at the interfaces);
- interface specialization (interface intrachecking may require
specializing the typings listed in the interfaces);
- principal interfaces (the principal typing property for the type
system of modules); and
- separate typechecking (looking at the code of the modules
does not provide more type information than looking at their
interfaces).
Then we illustrate some limitations of the "plain" framework
and extend the module language and the type system in order to
overcome these limitations. The decidability of the system is
shown by providing algorithms for the fundamental operations
involved in module intrachecking and interface interchecking.
*
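As a standard textbook illustration (not taken from the talk) of what rank 2 buys: self-application is untypable in ML, but receives a type once an intersection is allowed to the left of a single arrow:

```latex
% The variable x is used both as a function and as its own argument;
% rank 2 permits recording this as an intersection in argument position.
\lambda x.\, x\,x \;:\; ((\alpha \to \beta) \wedge \alpha) \to \beta
```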

** 17.30-19.30 BUSINESS MEETING **

### Friday, May 2

** Constructivism. **
9-10. Invited Talk. Are the objects of propositional attitudes propositions in the sense of propositional and predicate logic? * Per Martin-Löf *.

* On the one hand, propositions serve as the objects of propositional attitudes. On the other hand, propositions and propositional functions are the objects on which the logical operators, the connectives and the quantifiers, operate. The question to be discussed is whether we are really dealing with one and the same concept of proposition in the two cases.
*

10-10.30. TBA. * Giovanni Sambin *.

* *

* 10.30-11. Coffee break. *

11-11.30. TBA. * Stefano Berardi *.

* *

11.30-12. Witnesses in Constructive Analysis. * Helmut Schwichtenberg *.

* We modify some seminal notions from constructive analysis by providing witnesses for (strictly) positive quantifiers occurring in their definitions. For instance, a real number is viewed as a modulated Cauchy sequence of rationals, and a continuous function (which is given by its values on rationals only, and hence is a type one object) comes with (uniform) moduli of continuity and of Cauchyness. This approach makes it possible in many cases to avoid the axiom of (countable or dependent) choice, and seems suitable for program extraction.
*
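As a hedged Haskell sketch of this "witnessed" style (my own choice of modulus convention, not the talk's formal definitions): a real number carries an explicit approximation function, and each operation states the precision it demands of its arguments:

```haskell
import Data.Ratio ((%))

-- A real number as a modulated Cauchy sequence of rationals:
-- the invariant (maintained by hand here, by proof in the talk's
-- setting) is that approx k is within 2^(-k) of the real represented.
newtype CReal = CReal { approx :: Int -> Rational }

-- A rational embeds as a constant sequence.
fromQ :: Rational -> CReal
fromQ q = CReal (const q)

-- Addition: to be within 2^(-k), ask each summand for 2^(-(k+1)).
addR :: CReal -> CReal -> CReal
addR x y = CReal (\k -> approx x (k + 1) + approx y (k + 1))
```

Because the modulus is part of the data, no choice principle is needed to extract approximations.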

12-12.30. Induction and co-induction at work in formal topology.
* S. Valentini *. Co-author: M.E.Maietti.

* We enhance the method for the inductive generation of a cover relation by adding a co-inductive generation of a positivity predicate. This method makes it possible to construct, for any inductively generated cover, the formal topology whose cover extends the given one in the minimal way. In an impredicative setting, this result states that open locales are coreflective in locales, that is, the inclusion functor of open locales into locales has a right adjoint. All our proofs are developed within Martin-Löf's predicative type theory, but at present, to justify the co-inductive definition of the positivity predicate, we use Tarski's fixed-point theorem for monotone operators. *

** Category Theory. **
12.30-13. Abstract Stone Duality. * Paul Taylor * (di.unito.it/~pt/ASD).

*
Abstract Stone Duality (ASD) is a type theory in which the topology on
a space is an exponential with a $\lambda$-calculus, not an infinitary
lattice. But instead of rewriting old proofs in a pre-conceived
logic, it exploits a deep mathematical theme, Stone duality,
reconciling conceptual and computational traditions in mathematics.
Its term model is equivalent to the category of recursively based
locally compact spaces and recursive continuous functions, wherein
the topology on the natural numbers is the lattice of RE sets.
It has a very natural treatment of many basic notions in topology.
Since directed unions are automatic, it highlights the lattice duality
between open and closed notions, which must normally be studied quite
separately.
This talk will concentrate on the type-theoretic formulation of ASD.
I will show how the original categorical ideas (Sigma classifies both
open and closed subspaces, and the self-adjoint functor Sigma^(-) is
monadic) have been developed into a lambda calculus that is usable
by both programmers and topologists. In particular, the "sobriety"
axiom provides general recursion, and there is an elimination theorem
for the "Sigma-split subtypes".
Finally, an extra axiom (that gives any space an "underlying set" but
is not computable) makes ASD equivalent to the theory of locally
compact locales over an elementary topos.
*

* 13-14 Lunch *

** Type Theory and Semantics. **
14-14.30. Reduction Systems and Isomorphisms of Types. * Sergei Soloviev *. Co-author: David Chemouil

* We shall present our work on a project in which we explore extensions of standard reduction systems, preserving strong normalisation and confluence, by new reductions of the form f^{-1}(f t) --> t, where f^{-1} and f are in a certain sense mutually inverse. In particular, isomorphisms of products (considered as inductive types) and isomorphisms between isomorphic copies of inductive types are studied.
*

14.30-15. Set-theoretic Functors. * Furio Honsell *. Co-authors: Daniela Cancila, Marina Lenisa.

* We investigate the interplay between the two conceptual
frameworks of set-theory and category theory, in the context
of coalgebras and non-wellfounded sets.
Starting from Aczel's Special Final Coalgebra Theorem and
Aczel-Mendler's Final Coalgebra Theorem, we carry out an analysis of
some important families of functors on Class, that is the category of
sets and classes of a possibly non-wellfounded universe.
In particular, we study the following families of functors: inclusion
preserving, set-based, uniform on maps. Moreover, we solve a number of
open questions concerning the existence of set-theoretic functors,
given their value on objects, e.g. constant or identity.
*

15-15.30. Incomplete Proofs and Terms. * G. Jojgov *.

* In this talk we will consider the problem of representing incomplete proofs and terms. We analyze the possible kinds of incompleteness and the translation of incomplete objects to type theory. This analysis yields a novel kind of incomplete objects where one can represent terms with variables that are intended to be bound, but whose binders are not constructed yet. We propose an extension of the logic and type theory based on hereditary parametrization that can handle such incomplete objects. We re-establish the formulas-as-types interpretation for this extension.
*

* 15.30-16. Coffee break. *

16-16.30. Classical proofs and typed processes. * Silvia Ghilezan * (University of Novi Sad, Serbia). Co-author: Pierre Lescanne (Ecole Normale Superieure de Lyon, France).

* Curien and Herbelin provided a Curry-Howard correspondence between classical propositional logic and a computational model called the lambda-mu-mu calculus, which is a calculus for interpreting sequents. A new terminology for the lambda-mu-mu calculus in terms of pairs of callers--callees, which we name capsules, highlights a natural link between the lambda-mu-mu calculus and process calculi. In this work we propose an intersection type assignment system for the lambda-mu-mu calculus and investigate its properties.
*

16.30-17. Termination of rewriting in the Calculus of Constructions. * Daria Walukiewicz-Chrzaszcz *.

* It is shown how to incorporate rewriting into the Calculus of Constructions, and it is proven that the resulting system is strongly normalizing with respect to beta and rewrite reductions. More precisely, a decidable criterion called HORPO (Higher Order Recursive Path Ordering) on rewrite rules is presented, and it is shown that in the Calculus of Constructions extended by any set of rules satisfying HORPO all reductions are finite. The criterion is general enough to accept definitions by rewriting of many well-known higher-order functions, for example dependent recursors for inductive types or proof-carrying functions. The subject of the work concerns both research on higher-order rewriting and interesting extensions of the Calculus of Constructions. A practical motivation of this work is the perspective of extending proof assistants based on the Calculus of Constructions with a rewriting mechanism. The talk will briefly present the results described in Walukiewicz's doctoral thesis, defended in April 2003. *

** 19.30: SOCIAL DINNER. **

### Saturday, May 3

** Linguistics. **
9-9.30. Type Theory on a PDA: From Grammars to Gramlets.
* Kristofer Johannisson *. Co-authors: Markus Forsberg, Janna Khegai, Aarne Ranta.

* A gramlet is a program automatically produced from a GF (Grammatical Framework) grammar. GF is a grammar formalism based on Type Theory. Given a GF grammar, the corresponding gramlet provides some of the functionality of GF specialized for that grammar. The gramlet could for instance function as a syntax editor in which you can edit and linearize the syntax trees described by the grammar. One application of gramlets is for using GF on handheld computers.
*

** Formalizing Proofs **.
9.30-10. Extracting Test Cases from Correctness Proofs. * Savi Maharaj *. Co-author: Jeremy Bryans

* We have been investigating the idea that, given a formal proof of the correctness of an abstract model of a system to be developed, one can then extract data to be used in testing concrete implementations of that system. Our hypothesis is that this approach will address shortcomings inherent in both proof and testing: it will bolster the proof approach by bridging the gap between the abstract model whose correctness is proved and the actual implementation, and it will improve testing by providing test data which is especially effective at finding flaws, by virtue of the way that it has been derived.
To make this idea more specific, we have been looking at correctness proofs carried out within Coq, using techniques such as Filliatre's approach for verifying imperative programs. Our intention is to extract test cases from the proof objects associated with these correctness proofs. The work is in its preliminary stages.
*

10-10.30. Examples of applications of a proof assistant: From free modules on principal rings to a boiler. * Frédéric Ruyer *, Université de Chambéry.

* What can we do with a proof assistant? On a particular example (the Phox proof assistant), we will develop two different topics related to this question:
*

- The verification of mathematical proofs: developing the proof
of a non-trivial result in algebra, we have pointed out some
problems with expressivity in HOL, and with the translation of
intuitionistically obvious proofs into less understandable formal proofs.

- Application of a prover to program verification: we are
developing a tool linked to the B method, an industrial method of
programming based on Hoare logic, in the Bcaml project developed at
INRETS. The aim is to get a verifier of proof obligations that is
both powerful and user-friendly.

We will compare our work with what has been done in other provers.

* 10.30-11. Coffee break. *

11-11.30. Formal proof sketches versus Lamport-style proofs. * Freek Wiedijk *.

* In his "How to Write a Proof", Leslie Lamport presented
a method for the development of hierarchical proofs using
top-down refinement. His approach has been implemented in
the IMP project of Paul Cairns and Jeremy Gow. They created
on-line topology course notes in which the proofs can be
interactively browsed in Lamport's format.
A related way of presenting a proof is through a "formal
proof sketch". In a formal proof sketch one uses the proof
language of the Mizar system, but leaves out many of the
proof steps and most of the references between steps that are
needed to turn it into a fully checkable Mizar formalization.
In contrast with Lamport's proofs, formal proof sketches
look similar to the proofs of ordinary textbook mathematics.
In this talk I will compare both proof styles. I will
present some examples, and will discuss the main differences
between the two approaches.
*

11.30-12. Formalizing Higman's lemma in Isabelle.
* Stefan Berghofer (Muenchen) *. Co-author: Monika Seisenberger (Swansea).

* Higman's lemma, which can be seen as a specific instance of Kruskal's theorem, is an interesting result from the area of combinatorics, which has often been used as a test case for theorem provers. We present a constructive proof of Higman's lemma in Isabelle, based
on a paper proof by Coquand and Fridlender. Making use of Isabelle's newly-introduced infrastructure for program extraction, we show how a program can automatically be extracted from this proof in Isabelle, and analyze its computational behaviour.
*

12-12.30. Program extraction from large proof developments.
* Bas Spitters *. Co-author: Luis Cruz-Filipe.

* It is well known that mathematical proofs often contain (abstract) algorithms. Although these algorithms can be understood by a human, it still takes a lot of time and effort to implement them on a computer; moreover, one runs the risk of making mistakes in the process. From a fully formalized constructive proof one can automatically obtain a computer implementation of such an algorithm, together with a proof that the program is correct. As an example we consider the fundamental theorem of algebra, which states that every non-constant polynomial has a root. This theorem has been fully formalized in the Coq proof assistant. Unfortunately, when we first tried to extract a program, the computer ran out of resources. We will discuss how we used logical techniques to make it possible to extract a feasible program. This example is used as a motivation for a broader perspective on how the formalization of mathematics should be done with program extraction in mind. *

* 12.30-14: lunch. *

** Demo Session **: 14-15.30.
PAF!, a proof assistant for the verification of ML programs.
* Sylvain Baro, Yves Legrangérard *, PPS, Université Paris 7.

* This demonstration will present a prototype of the PAF! system, a proof assistant dedicated to proofs of programs written in a functional subset of the ML language including: `let rec' definitions, inductive polymorphic data types and pattern matching. The underlying logic is a second-order multi-sorted predicate calculus whose terms are those of ML programs. The evaluation rules of ML (whr) are embedded in the logic and play the role of an equational theory. The system supports structural induction over user-defined recursive data types. It provides the user with all classical second-order logic features via a set of tactics. From an implementation viewpoint, our system takes an object-oriented approach, which allows the programmers of the system to extend the set of tactics in a rather simple and fully safe way. The prototype will be presented with its graphical user interface, which allows "full-page editing", i.e. insertion or deletion at any point in the edited document (definitions, proofs, etc.), as opposed to usual command-line interfaces. The graphical user interface is independent of the logic kernel; the two communicate using a dedicated binary client/server protocol. The logic kernel is written in Objective Caml and the graphical user interface in Python/Tk.
*

A prototype of PAL+, a lambda-free logical framework.
* P.C. Callaghan, Y. Luo, and Z. Luo *.

* We will demonstrate a recent implementation based on PAL+, a lambda-free logical framework, which delivers a different flavour from lambda calculi.
*

Epigram: the story so far. * Conor McBride. * Co-author: James McKinna.

* Epigram is currently under development. I shall demonstrate its implementation, in whatever state it has reached, and discuss its
possibilities.*

Tmcoq: Coq Integration within TeXmacs. * Philippe Audebaud *.

* *

Modules in Coq 7.4. * Jacek Chrzaszcz *. Co-author: Pierre Letouzey.

* Pierre Letouzey will present the extraction mechanism. *

* 15.30-16: coffee break. *

** Optional Session (extra room for demos and talks) **: 16-17.

Talks and some demos will take place in Room A, using a projector. In Room B there are 10 network cables with IP addresses: you may use them for demos and e-mail, but you will need your own laptop. In Room C there are 20 computers arranged in a classroom, for e-mail and for experimenting with proof assistants. A projector is available for demos.