Venanzio Capretta
Postdoctoral researcher
Computer Science Institute (ICIS)
Radboud University Nijmegen
P.O. Box 9010,
NL-6500 GL Nijmegen
e-mail: venanzio @ cs.ru.nl
We know nothing, not even whether we know or do not know, or what it is to know or not to know, or in general whether anything exists or not.
-- Metrodorus of Chios
Reflections on Type Theory, Lambda Calculus, and the Mind
Essays Dedicated to Henk Barendregt on the Occasion of his 60th Birthday
Erik Barendsen, Herman Geuvers, Venanzio Capretta, Milad Niqui (Eds.)
A collection of essays associated with the celebration of Henk Barendregt's 60th Birthday
These are articles that my coauthors and I are working on. Click on a title to get the PDF file.
Corecursive Algebras: A Study of General Structured Corecursion
Coauthors: Tarmo Uustalu and Varmo Vene
Slides of two seminars about this topic:
Higher Order Abstract Syntax in Type Theory
Coauthor: Amy Felty
Coq formalization and example application.
A polymorphic representation of induction-recursion
Also in PostScript format.
Common Knowledge as a Coinductive Modality
Reflections on Type Theory, Lambda Calculus, and the Mind. Essays Dedicated to Henk Barendregt on the Occasion of his 60th Birthday, December 2007, pages 51-61 (bibtex entry).
I prove in Coq Aumann's Theorem: In perfect information games, common knowledge of rationality implies backward induction equilibrium. The notion of common knowledge is formalized, using a coinductive definition, as a modality containing an infinite amount of information.
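The coinductive modality can be sketched in Coq in Kripke style (a minimal sketch; the names and the exact formulation are illustrative and may differ from the paper's):

```coq
Section CommonKnowledge.

Variable agent : Type.
Variable world : Type.
(* R a w w' : world w' is epistemically accessible for agent a from w *)
Variable R : agent -> world -> world -> Prop.

(* Common knowledge of P at w: P holds at w and, coinductively, at every
   world reachable by any finite chain of accessibility steps -- an
   infinite amount of information packed into one coinductive proof. *)
CoInductive CK (P : world -> Prop) : world -> Prop :=
  ck_intro : forall w,
    P w ->
    (forall a w', R a w w' -> CK P w') ->
    CK P w.

End CommonKnowledge.
```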
Computation by Prophecy
Coauthor: Ana Bove.
TLCA 2007, pages 70-83 (bibtex entry).
We describe a new method to represent (partial) recursive functions in type theory. For every recursive definition, we define a co-inductive type of prophecies that characterises the traces of the computation of the function. The structure of a prophecy is a possibly infinite tree, which is coerced by linearisation to a type of partial results defined by applying the delay monad to the co-domain of the function. Using induction on a weight relation defined on the prophecies, we can reason about them and prove that the formal type-theoretic version of the recursive function, resulting from the present method, satisfies the recursive equations of the original function. The advantages of this technique over the method previously developed by the authors via a special-purpose accessibility (domain) predicate are: there is no need for extra logical arguments in the definition of the recursive function; the function can be applied to any element in its domain, regardless of termination properties; we obtain a type of partial recursive functions between any two given types; and composition of recursive functions can be easily defined.
Formal Correctness of Conflict Detection for Firewalls
Coauthors: Bernard Stepien, Amy Felty, and Stan Matwin.
ACM Workshop on Formal Methods in Security Engineering, November 2007, pages 22-30 (bibtex entry).
We describe the formalization of a correctness proof for a conflict detection algorithm for firewalls in the Coq Proof Assistant. First, we give formal definitions in Coq of a firewall access rule and of an access request to a firewall. Formally, two rules are in conflict if there exists a request on which one rule would allow access and the other would deny it. We express our algorithm in Coq, and prove that it finds all conflicts in a set of rules. We obtain an OCaml version of the algorithm by direct program extraction. The extracted program has successfully been applied to firewall specifications with over 200,000 rules.
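In a minimal Coq sketch (the record fields and the boolean matching function are illustrative simplifications of the paper's definitions), the conflict condition reads:

```coq
(* A request, reduced to a few numeric fields for illustration. *)
Record request := { src : nat; dst : nat; port : nat }.

Inductive action := Accept | Deny.

Record rule := {
  matches : request -> bool;  (* does the rule apply to this request? *)
  act : action
}.

(* Two rules conflict when some request matches both
   while one rule allows it and the other denies it. *)
Definition conflict (r1 r2 : rule) : Prop :=
  exists q : request,
    matches r1 q = true /\ matches r2 q = true /\ act r1 <> act r2.
```

The verified algorithm then decides this existential property, and program extraction turns the decision procedure into OCaml code.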
Combining de Bruijn Indices and Higher-Order Abstract Syntax in Coq
Coauthor: Amy Felty
TYPES 2006, LNCS 4502, pages 63-77 (bibtex entry).
The use of higher-order abstract syntax is an important approach for the representation of binding constructs in encodings of languages and logics in a logical framework. Formal meta-reasoning about such object languages is a particular challenge. We present a mechanism for such reasoning, formalized in Coq, inspired by the Hybrid tool in Isabelle. At the base level, we define a de Bruijn representation of terms with basic operations and a reasoning framework. At a higher level, we can represent languages and reason about them using higher-order syntax. We take advantage of Coq's constructive logic by formulating many definitions as Coq programs. We illustrate the method on two examples: the untyped lambda calculus and quantified propositional logic. For each language, we can define recursion and induction principles that work directly on the higher-order syntax.
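The flavour of the two-level encoding can be sketched in Coq as follows (constructor names loosely follow the Hybrid tool; the details are illustrative):

```coq
(* Base level: untyped de Bruijn terms over a type of object-level constants. *)
Inductive expr (con : Type) : Type :=
  | CON : con -> expr con               (* object-language constant *)
  | VAR : nat -> expr con               (* free variable *)
  | BND : nat -> expr con               (* bound variable, de Bruijn index *)
  | APP : expr con -> expr con -> expr con
  | ABS : expr con -> expr con.         (* binder: its body uses BND indices *)
```

At the higher level one defines an operator of type `(expr con -> expr con) -> expr con` that converts a function on terms into a de Bruijn abstraction, so object languages can be written in higher-order style while meta-reasoning happens on the first-order representation.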
Recursive Coalgebras from Comonads (long version)
Coauthors: Tarmo Uustalu and Varmo Vene
Information and Computation, volume 204, issue 4 (2006), pages 437-468. (bibtex entry).
The concept of recursive coalgebra of a functor was introduced in the 1970s by Osius in his work on categorical set theory to discuss the relationship between wellfounded induction and recursively specified functions. In this paper, we motivate the use of recursive coalgebras as a paradigm of structured recursion in programming semantics, list some basic facts about recursive coalgebras and, centrally, give new conditions for the recursiveness of a coalgebra based on comonads, comonad-coalgebras and distributive laws of functors over comonads. We also present an alternative construction using countable products instead of cofree comonads.
General Recursion via Coinductive Types
Also in PostScript format.
Logical Methods in Computer Science, Vol. 1, Iss. 2 (2005), pages 1-28. (bibtex entry)
Coq formalization (requires vectors).
A fertile field of research in theoretical computer science investigates the representation of general recursive functions in intensional type theories. Among the most successful approaches are: the use of wellfounded relations, implementation of operational semantics, formalization of domain theory, and inductive definition of domain predicates. Here, a different solution is proposed: exploiting coinductive types to model infinite computations. To every type A we associate a type of partial elements A^, coinductively generated by two constructors: the first, return(a), simply returns an element a:A; the second, step(x), adds a computation step to a recursive element x:A^. We show how this simple device is sufficient to formalize all recursive functions between two given types. It allows the definition of fixed points of finitary, that is, continuous, operators. We compare this approach with others from the literature. Finally, we mention that the formalization, with appropriate structural maps, defines a strong monad.
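The type of partial elements described above has a direct Coq rendering (a sketch; `rtrn` avoids the reserved word `return`):

```coq
CoInductive partial (A : Type) : Type :=
  | rtrn : A -> partial A           (* immediate result *)
  | step : partial A -> partial A.  (* one more computation step *)

Arguments rtrn {A} a.
Arguments step {A} x.

(* The undefined element: an infinite run of steps. *)
CoFixpoint bottom {A : Type} : partial A := step bottom.

(* The bind of the strong monad mentioned in the abstract:
   run x and, if it ever returns a, continue with f a. *)
CoFixpoint bind {A B : Type} (x : partial A) (f : A -> partial B) : partial B :=
  match x with
  | rtrn a  => f a
  | step x' => step (bind x' f)
  end.
```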
Recursive Functions with Higher Order Domains
Coauthor: Ana Bove.
In TLCA 2005, LNCS 3461, pages 116-130. (bibtex entry)
In a series of articles, we developed a method to translate general recursive functions written in a functional programming style into constructive type theory. Three problems remained: the method could not properly deal with functions taking functional arguments, the translation of terms containing λ-abstractions was too strict and partial application of general recursive functions was not allowed. Here, we show how the three problems can be solved in an impredicative type theory. The solution hinges on a definition of the type of partial functions between given types. Every function, including arguments to higher order functions, λ-abstractions and partially applied functions, is translated as a pair consisting of a domain predicate and a function dependent on the predicate. Higher order functions are assigned domain predicates that inherit termination conditions from their functional arguments. The translation of a λ-abstraction does not need to be total anymore, but generates a local termination condition. The domain predicate of a partially applied function is defined by fixing the given arguments in the domain of the original function. As in our previous articles, simultaneous induction-recursion is required to deal with nested recursive functions. Impredicativity is essential for the method to apply to all functions, since the inductive definition of the domain predicate can refer globally to the domain predicate itself.
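The central type of the paper, a partial function as a pair of a domain predicate and a function depending on it, can be sketched in Coq as (field names illustrative):

```coq
Record partial_fun (A B : Type) : Type := {
  dom : A -> Prop;                 (* termination / definedness predicate *)
  app : forall a : A, dom a -> B   (* the function, defined only on its domain *)
}.
```

A partially applied general recursive function is then a `partial_fun` whose `dom` fixes the given arguments in the domain of the original function.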
Privacy in Data Mining Using Formal Methods
Coauthors: Stan Matwin, Amy Felty, and Istvan Hernadvolgyi.
In TLCA 2005, LNCS 3461, pages 278-292. (bibtex entry)
There is growing public concern about personal data collected by both private and public sectors. People have very little control over what kinds of data are stored and how such data is used. Moreover, the ability to infer new knowledge from existing data is increasing rapidly with advances in database and data mining technologies. We describe a solution which allows people to take control by specifying constraints on the ways in which their data can be used. User constraints are represented in formal logic, and organizations that want to use this data provide formal proofs that the software they use to process data meets these constraints. Checking the proof by an independent verifier demonstrates that user constraints are (or are not) respected by this software. Our notion of "privacy correctness" differs from general software correctness in two ways. First, properties of interest are simpler and thus their proofs should be easier to automate. Second, this kind of correctness is stricter; in addition to showing that a certain relation between input and output is realized, we must also show that only operations that respect privacy constraints are applied during execution. We have therefore an intensional notion of correctness, rather than the usual extensional one. We discuss how our mechanism can be put into practice, and we present the technical aspects via an example. Our example shows how users can exercise control when their data is to be used as input to a decision tree learning algorithm. We have formalized the example and the proof of preservation of privacy constraints in Coq.
Recursive Coalgebras from Comonads (short version)
Coauthors: Tarmo Uustalu and Varmo Vene.
Electronic Notes in Theoretical Computer Science, Volume 106, Proceedings of CMCS 2004, pages 43-61. (bibtex entry)
We discuss Osius's concept of a recursive coalgebra of a functor from the perspective of programming semantics and give some new sufficient conditions for the recursiveness of a functor-coalgebra that are based on comonads, comonad-coalgebras and distributive laws.
Setoids in type theory
Coauthors: Gilles Barthe and Olivier Pons.
In Journal of Functional Programming, 13(2), pages 261-293, 2003. (bibtex entry)
Formalising mathematics in dependent type theory often requires the use of setoids, i.e. types with an explicit equality relation, as a representation of sets. This paper surveys some possible definitions of setoids and assesses their suitability as a basis for developing mathematics. In particular, we argue that a commonly advocated approach to partial setoids is unsuitable, and more generally that total setoids seem better suited for formalising mathematics.
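A total setoid in the sense discussed here is simply a type packaged with an equivalence relation; a minimal Coq sketch (field names illustrative):

```coq
Record setoid : Type := {
  carrier  : Type;
  st_eq    : carrier -> carrier -> Prop;
  st_refl  : forall x, st_eq x x;
  st_sym   : forall x y, st_eq x y -> st_eq y x;
  st_trans : forall x y z, st_eq x y -> st_eq y z -> st_eq x z
}.
```

In a partial setoid, by contrast, `st_refl` is dropped and only the elements related to themselves count as members, which is the approach the paper argues against.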
Type-theoretic functional semantics
Coauthors: Yves Bertot and Kuntal Das Barman.
In TPHOLs 2002, LNCS 2410, pages 83-97. (bibtex entry)
We describe the operational and denotational semantics of a small imperative language in type theory with inductive and recursive definitions. The operational semantics is given by natural inference rules, implemented as an inductive relation. The realization of the denotational semantics is more delicate: the nature of the language presents a few difficulties. First, the language is Turing-complete, and therefore the interpretation function we consider is necessarily partial. Second, the language contains strict sequential operators, and therefore the function necessarily exhibits nested recursion. Our solution combines and extends recent work by the authors and others on the treatment of general recursive functions and partial and nested recursive functions. The first new result is a technique to encode the approach of Bove and Capretta for partial and nested recursive functions in type theories that do not provide simultaneous induction-recursion. A second result is a clear understanding of the characterization of the definition domain for general recursive functions, a key aspect in the approach by iteration of Balaa and Bertot. In this respect, the work on operational semantics is a meaningful example, but the applicability of the technique should extend to other circumstances where complex recursive functions need to be described formally.
Nested General Recursion and Partiality in Type Theory
Coauthor: Ana Bove.
In TPHOLs 2001, LNCS 2152, pages 121-135. (bibtex entry)
We extend Bove's technique for formalising simple general recursive algorithms in constructive type theory to nested recursive algorithms. The method consists of defining an inductive special-purpose accessibility predicate that characterizes the inputs on which the algorithm terminates. As a result, the type-theoretic version of the algorithm can be defined by structural recursion on the proof that the input values satisfy this predicate. This technique results in definitions in which the computational and logical parts are clearly separated; hence, the type-theoretic version of the algorithm is given by its purely functional content, similarly to the corresponding program in a functional programming language. In the case of nested recursion, the special predicate and the type-theoretic algorithm must be defined simultaneously, because they depend on each other. This kind of definition is not allowed in ordinary type theory, but it is provided in type theories extended with Dybjer's schema for simultaneous inductive-recursive definitions. The technique applies also to the formalisation of partial functions as proper type-theoretic functions, rather than relations representing their graphs.
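For a simple (non-nested) instance, division by repeated subtraction, the special-purpose accessibility predicate looks like this in Coq (a sketch; the paper's examples differ):

```coq
(* div_acc n m holds exactly when computing m divided by n terminates:
   always when n > 0, and immediately when m < n. *)
Inductive div_acc (n : nat) : nat -> Prop :=
  | acc_base : forall m, m < n -> div_acc n m
  | acc_step : forall m, n <= m -> div_acc n (m - n) -> div_acc n m.
```

The type-theoretic division function is then defined by structural recursion on a proof of `div_acc n m`, satisfying the equations `div m n = 0` when `m < n` and `div m n = S (div (m - n) n)` otherwise; note that no proof of `div_acc 0 m` exists, reflecting the non-termination of division by zero.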
Certifying the Fast Fourier Transform with Coq
In TPHOLs 2001, LNCS 2152, pages 154-168. (bibtex entry)
We program the Fast Fourier Transform in type theory, using the tool Coq. We prove its correctness and the correctness of the Inverse Fourier Transform. A type of trees representing vectors with interleaved elements is defined to facilitate the definition of the transform by structural recursion. We define several operations and proof tools for this data structure, leading to a simple proof of correctness of the algorithm. The inverse transform, on the other hand, is implemented on a different representation of the data, that makes reasoning about summations easier. The link between the two data types is given by an isomorphism. This work is an illustration of the two-level approach to proof development and of the principle of adapting the data representation to the specific algorithm under study. CtCoq, a graphical user interface of Coq, helped in the development. We discuss the characteristics and usefulness of this tool.
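The tree representation of vectors with interleaved elements can be sketched as a Coq inductive family indexed by depth (names illustrative; a tree of depth n stores 2^n elements, its two subtrees holding the even- and odd-indexed ones):

```coq
Inductive itree (A : Type) : nat -> Type :=
  | leaf : A -> itree A 0
  | node : forall n, itree A n -> itree A n -> itree A (S n).
```

Splitting a vector into its even- and odd-indexed parts, the key step of the usual FFT recursion, is then just taking the two immediate subtrees, so the transform can be defined by plain structural recursion.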
The logic and mathematics of occasion sentences
Coauthors: Pieter A. M. Seuren and Herman Geuvers.
In Linguistics and Philosophy, 24(5), 2001. (bibtex entry)
The prime purpose of this paper is, first, to restore to discourse-bound occasion sentences their rightful central place in semantics and secondly, taking these as the basic propositional elements in the logical analysis of language, to contribute to the development of an adequate logic of occasion sentences and a mathematical (Boolean) foundation for such a logic, thus preparing the ground for more adequate semantic, logical and mathematical foundations of the study of natural language. Some of the insights elaborated in this paper have appeared in the literature over the past thirty years, and a number of new developments have resulted from them. The present paper aims at providing an integrated conceptual basis for this new development in semantics. In Section 1 it is argued that the reduction by translation of occasion sentences to eternal sentences, as proposed by Russell and Quine, is semantically and thus logically inadequate. Natural language is a system of occasion sentences, eternal sentences being merely boundary cases. The logic has fewer tasks than is standardly assumed, as it excludes semantic calculi, which depend crucially on information supplied by cognition and context and thus belong to cognitive psychology rather than to logic. For sentences to express a proposition and thus be interpretable and informative, they must first be properly anchored in context. A proposition has a truth value when it is, moreover, properly keyed in the world, i.e. is about a situation in the world. Section 2 deals with the logical properties of natural language. It argues that presuppositional phenomena require trivalence and presents the trivalent logic PPC_3, with two kinds of falsity and two negations. It introduces the notion of Σ-space for a sentence A (or /A/, the set of situations in which A is true) as the basis of logical model theory, and the notion of /P^A/ (the Σ-space of the presuppositions of A), functioning as a `private' subuniverse for /A/. 
The trivalent Kleene calculus is reinterpreted as a logical account of vagueness, rather than of presupposition. PPC_3 and the Kleene calculus are refinements of standard bivalent logic and can be combined into one logical system. In Section 3 the adequacy of PPC_3 as a truth-functional model of presupposition is considered more closely and given a Boolean foundation. In a noncompositional extended Boolean algebra, three operators are defined: 1_a for the conjoined presuppositions of a, Nc for the complement of a within 1_a, and Nb for the complement of 1_a within Boolean 1. The logical properties of this extended Boolean algebra are axiomatically defined and proved for all possible models. Proofs are provided of the consistency and the completeness of the system. Section 4 is a provisional exploration of the possibility of using the results obtained for a new discourse-dependent account of the logic of modalities in natural language. The overall result is a modified and refined logical and model-theoretic machinery, which takes into account both the discourse-dependency of natural language sentences and the necessity of selecting a key in the world before a truth value can be assigned.
Recursive Families of Inductive Types
Also in PostScript format.
In TPHOLs 2000, LNCS 1869, pages 73-89. (bibtex entry)
Families of inductive types defined by recursion arise in the formalization of mathematical theories. An example is the family of term algebras on the type of signatures. Type theory does not allow the direct definition of such families. We state the problem abstractly by defining a notion, strong positivity, that characterizes these families. Then we investigate its solutions. First, we construct a model using wellorderings. Second, we use an extension of type theory, implemented in the proof tool Coq, to construct another model that does not have extensionality problems. Finally, we apply the two level approach: We internalize inductive definitions, so that we can manipulate them and reason about them inside type theory.
Universal Algebra in Type Theory
In TPHOLs 1999, LNCS 1690, pages 131-148. (bibtex entry)
We present a development of Universal Algebra inside Type Theory, formalized using the proof assistant Coq. We define the notion of a signature and of an algebra over a signature. We use setoids, i.e. types endowed with an arbitrary equivalence relation, as carriers for algebras. In this way it is possible to define the quotient of an algebra by a congruence. Standard constructions over algebras are defined and their basic properties are proved formally. To overcome the problem of defining term algebras in a uniform way, we use types of trees that generalize wellorderings. Our implementation gives tools to define new algebraic structures, to manipulate them and to prove their properties.
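The first two definitions of such a development can be sketched in Coq as follows (using a bare carrier type instead of a setoid to keep the sketch short; field names illustrative):

```coq
Require Import Vector.

Record signature : Type := {
  operation : Type;          (* names of the operations *)
  arity : operation -> nat   (* number of arguments of each operation *)
}.

Record algebra (S : signature) : Type := {
  carrier : Type;
  interp : forall o : operation S,
             Vector.t carrier (arity S o) -> carrier
}.
```

In the full development the carrier is a setoid and each `interp o` is required to respect the setoid equality, which is what makes quotients by congruences possible.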
A general method for proving the normalization theorem for first and second order typed λ-calculi
Coauthor: Silvio Valentini.
In Mathematical Structures in Computer Science, volume 9, issue 6, pages 719-739, 1999. (bibtex entry)
In this paper we describe a method for proving the normalization property for a large variety of typed lambda calculi of first and second order, based on a proof of equivalence of two deduction systems. We first illustrate the method on the elementary example of simply typed lambda calculus and then we show how to extend it to a more expressive dependent type system. Finally we use it for proving the normalization theorem for Girard's system F.