A Coalgebraic Theory of Sequences in PVS
We present an axiomatic description of (possibly infinite) sequences in
PVS, using (final)
coalgebras. Various theories are based on this axiomatisation.
You can look at version 2.0 of the theories, and also download the whole dump file.
It should run in PVS version 2.414.
(Version 1.0 is still available as a
gzipped dump file.)
We make this work publicly available, but
we appreciate an acknowledgement if you make use
of these files in your publications. If you have any comments,
or suggestions for additions, please let us know.
A finite or infinite sequence of values from a parameter type
A can be seen as generated by a machine or automaton with state space
S, a transition function f : [S -> Lift[[A,S]]]---where Lift[X]
is the type X augmented with a bottom element---and a start state s
in S . This f is a partial function providing either the
value bot (for undefined) or an output in A together with a next state
in S. The behavior of such a machine is given by
consecutively applying f (as long as possible) and observing the
emerging output in A. Such a behavior corresponds to a sequence of
elements of A.
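As an illustrative sketch (not part of the PVS development), such a machine and its behavior can be modelled in Python, rendering Lift[X] as Optional so that None plays the role of bot; the names Coalg, behavior and countdown are ours:

```python
from typing import Callable, Iterator, Optional, Tuple, TypeVar

A = TypeVar("A")
S = TypeVar("S")

# A coalgebra of type Lift[[A,-]]: a transition either yields None
# (bot, i.e. undefined) or an output in A together with a next state in S.
Coalg = Callable[[S], Optional[Tuple[A, S]]]

def behavior(f: Coalg, s: S) -> Iterator[A]:
    """Apply f as long as possible, observing the emerging outputs in A."""
    step = f(s)
    while step is not None:
        a, s = step
        yield a
        step = f(s)

# A machine on state space nat: state n emits n and moves to n - 1,
# becoming undefined at 0; its behavior is a finite sequence.
countdown: Coalg = lambda n: None if n == 0 else (n, n - 1)
print(list(behavior(countdown, 3)))  # [3, 2, 1]
```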
The above machine with state space S and transition function f
is an example of a coalgebra of type
Lift[[A,-]]. The collection Seq[A] of sequences of elements of
A also carries such a coalgebra structure next: Seq[A] ->
Lift[[A,Seq[A]]]. The function next decomposes a
sequence into the car and the cdr (head,tail), if these
exist (otherwise it produces the value bot). This
coalgebra next is very special; it is the final coalgebra of
this type: for an arbitrary coalgebra ``struct'' there exists a
unique homomorphism ``coreduce(struct)'' to it. This is the content
of the (single) axiom that we use for sequences.
The finality axiom provides us with powerful definition schemes
and proof mechanisms. In order to define a function g
taking values in Seq[A] it suffices to describe a
coalgebra structure on the domain of g. Uniqueness is
used to prove equality of such functions.
Bisimulation, invariant, simulation (for prefix
order on sequences), and admissibility proof methods
are developed based on the uniqueness property. The main advantage
of this coinductive approach is that it allows us to deal
with sequences in a step-by-step manner: we always define and
prove by looking at the next step in a sequence. This is
similar (but dual, in a suitable sense) to inductive definitions
and proofs, where one also only takes single steps.
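This definition scheme can be sketched in the same illustrative Python model: coreduce unfolds a coalgebra into the sequence it generates, so defining a function into sequences amounts to describing a coalgebra on its domain. The name nats_from and the Optional encoding of Lift are our assumptions, not PVS identifiers:

```python
from itertools import islice
from typing import Callable, Iterator, Optional, Tuple, TypeVar

A = TypeVar("A")
X = TypeVar("X")

def coreduce(struct: Callable[[X], Optional[Tuple[A, X]]], x: X) -> Iterator[A]:
    """The (unique) homomorphism from the coalgebra struct into sequences."""
    step = struct(x)
    while step is not None:
        a, x = step
        yield a
        step = struct(x)

# To define g(n) = the sequence n, n+1, n+2, ... it suffices to give a
# coalgebra on nat: output the state, step to its successor.  The resulting
# behavior is infinite, so we only inspect a finite prefix.
nats_from = lambda n: (n, n + 1)
print(list(islice(coreduce(nats_from, 5), 4)))  # [5, 6, 7, 8]
```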
For more information and
references, there is a paper
which explains the formalisation in detail. It will appear
in the Journal of Logic and Computation. Also, there is a tutorial
on (Co)algebras and (Co)induction providing further background
information (from: Bulletin of the European
Association for Theoretical Computer Science 62 (1997), pp. 222-259).
We briefly describe the various theories divided into four groups.
- Lift[X], Lift_prop[X]. Definition of the type Lift[X], which augments X with a bottom element bot, together with its basic properties.
- Seq_defn[A], Seq_ax[A,X]. Definition of the type
Seq[A] and its destructor next. Introduction of the
concept of homomorphism (to the final coalgebra). Finality axiom and
definition of the unique homomorphism coreduce(struct) for
each coalgebra struct. The (simple) coinductive
proof method struct_map_unique which identifies two
homomorphisms by the uniqueness argument. Explicit rewrite rule for
the destructor next applied to a value generated by coreduce.
- Intermezzo, in which the validity of the
finality axiom is demonstrated. As implementation
of the type Seq[A] of sequences we use the set of functions
f : [nat -> Lift[A]] which satisfy the property
that once they are undefined, they remain undefined.
Hence, our finality axiom
introduces no inconsistencies. We take it
as an axiom to make sure that none of our results depend on
this particular implementation.
- Bisim[A,X1,X2]. Bisimulation proof method
derived from uniqueness. A relation is a bisimulation if it is closed
under application of the destructor of a coalgebra. Two states of a
coalgebra generate the same sequence in the terminal coalgebra iff they
are bisimilar. This is reflected by the proof rule
bisim_finality which is heavily used in what follows.
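The bisimulation idea can be illustrated in the same hedged Python model: the relation R = {(n, 2n)} below is a bisimulation between two coalgebras on nat, so related states generate the same sequence. The code can of course only check a finite prefix; the full statement is a proof obligation discharged in PVS, and the names unfold, evens_a, evens_b are ours:

```python
from itertools import islice

def unfold(struct, x):
    """coreduce: unfold a coalgebra into the sequence it generates."""
    step = struct(x)
    while step is not None:
        a, x = step
        yield a
        step = struct(x)

def take(it, k):
    return list(islice(it, k))

# Two coalgebras on nat generating the even numbers.  The relation
# R = {(n, 2n)} is a bisimulation: destructing related states yields equal
# outputs (2n and 2n) and R-related next states (n+1 and 2n+2 = 2(n+1)).
evens_a = lambda n: (2 * n, n + 1)   # state n outputs 2n, steps to n + 1
evens_b = lambda m: (m, m + 2)       # state m outputs m,  steps to m + 2

# Bisimilar states generate the same sequence; we sample a prefix.
assert take(unfold(evens_a, 0), 5) == take(unfold(evens_b, 0), 5) == [0, 2, 4, 6, 8]
```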
- Seq_functoriality[A,B], Seq_map_prop[A,B,C],
Seq_strength[A,B], Seq_shape[A], Seq_zip[A,B]. Some properties
from the theory of (co)datatypes. Seq can be extended to a
mapping on functions seq_map (similar to map for
lists). This mapping preserves identities and composition, thus
Seq is (in
categorical terms) a functor (it is strong, as witnessed by
seq_strength). The shape of a sequence is its length if
it is finite or just infinity. Sequences of equal shape can be zipped
to a sequence of pairs.
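A hedged Python sketch of these (co)datatype properties, checking the functor laws and zipping on finite prefixes (seq_map, nats and take are our illustrative names, not the PVS definitions):

```python
from itertools import islice
from typing import Callable, Iterator, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def seq_map(g: Callable[[A], B], xs: Iterator[A]) -> Iterator[B]:
    """Apply g elementwise, like map on lists, on possibly infinite sequences."""
    return (g(x) for x in xs)

def nats() -> Iterator[int]:
    """The infinite sequence 0, 1, 2, ..."""
    n = 0
    while True:
        yield n
        n += 1

def take(it, k):
    return list(islice(it, k))

# Functor laws, checked on a finite prefix: identities and composition
# are preserved by seq_map.
assert take(seq_map(lambda x: x, nats()), 4) == take(nats(), 4)
f = lambda x: x + 1
g = lambda x: 2 * x
assert (take(seq_map(lambda x: g(f(x)), nats()), 4)
        == take(seq_map(g, seq_map(f, nats())), 4))

# Sequences of equal shape can be zipped to a sequence of pairs.
assert take(zip(nats(), seq_map(str, nats())), 3) == [(0, "0"), (1, "1"), (2, "2")]
```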
- Seq_prop[A], Seq_comp[A]. A variety of
properties, for instance: next is an isomorphism.
A bisimulation proof
principle on sequences (final_bisim_finality). Reading
the nth value of a sequence (at), providing the nth
tail. Induction on natural numbers for checking equality of
sequences (at_eqn). Sequences are either finite or infinite
(fin_inf). Composition (concatenation) of sequences.
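The reading and composition operations have straightforward lazy analogues; a sketch, where at and comp play the roles described above but are not the PVS definitions:

```python
from itertools import chain, islice
from typing import Iterator, Optional, TypeVar

A = TypeVar("A")

def at(xs: Iterator[A], n: int) -> Optional[A]:
    """Read the nth value of a sequence (counting from 0); None if absent."""
    return next(islice(xs, n, None), None)

def comp(xs: Iterator[A], ys: Iterator[A]) -> Iterator[A]:
    """Composition (concatenation): ys is only reached if xs is finite."""
    return chain(xs, ys)

assert at(iter([7, 8, 9]), 1) == 8
assert at(iter([7]), 3) is None
assert list(comp(iter([1, 2]), iter([3]))) == [1, 2, 3]
```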
- Final_invariant[A],Invariant[A,X]. Definition of
invariant on sequences, least and greatest invariants generated
by an arbitrary predicate, proof principle for (greatest) invariants.
- Prefix Order and algebraic cpo
Basic results about the prefix order on sequences. Powerful
prefix_simulation proof principle (which is often used).
- Seq_alg_cpo[A]. Least upper bounds of ascending chains
of sequences exist, and every sequence is the least upper bound of
its finite prefixes. This gives a proof principle for admissible
predicates.
- Special operations: Filter and flatten
- Seq_filter[A]. The function filter takes a
predicate on elements and a sequence and produces the sequence of
elements for which the predicate is true. This function is specified
coalgebraically. Lemma filter_comp
states that filter commutes with concatenation of a finite sequence
with an arbitrary one. Lemma filter_and expresses the fact
that a consecutive filtering with two predicates is the same as filtering
once with the conjunction of the predicates. A useful consequence is
that filtering is idempotent (filter_idempotent).
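A lazy Python model of filtering that exhibits the filter_and and filter_idempotent laws on a small example (seq_filter and the sample predicates are ours; the PVS function is specified coalgebraically rather than by a generator):

```python
from typing import Callable, Iterator, TypeVar

A = TypeVar("A")

def seq_filter(p: Callable[[A], bool], xs: Iterator[A]) -> Iterator[A]:
    """Produce the subsequence of elements for which p is true (lazily)."""
    return (x for x in xs if p(x))

even = lambda x: x % 2 == 0
small = lambda x: x < 4
xs = [0, 1, 2, 3, 4, 5]

# filter_and: consecutive filtering with two predicates equals filtering
# once with their conjunction.
assert (list(seq_filter(even, seq_filter(small, iter(xs))))
        == list(seq_filter(lambda x: even(x) and small(x), iter(xs))))

# filter_idempotent follows by taking both predicates equal.
assert (list(seq_filter(even, seq_filter(even, iter(xs))))
        == list(seq_filter(even, iter(xs))) == [0, 2, 4])
```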
- Flattening[A], Flatten_naturality[A,B]. The function
flatten transforms a sequence of lists into a sequence of
all elements (pictorially, by removing the inner brackets). The lemma
flatten_comp states that flatten commutes with
concatenation comp. Lemma filter_flatten says that
filtering a flattened sequence of lists (finite sequences) is the same as
first filtering each individual list and then flattening the result.
flatten_chop states that flattening an arbitrarily partitioned
sequence (into a sequence of lists) returns the original sequence.
flatten_naturality expresses that flatten is
a natural transformation.
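Flattening has a simple lazy sketch as well; the example below also checks the filter_flatten interchange on a small input (flatten here is our illustrative Python, not the PVS definition):

```python
from typing import Iterator, List, TypeVar

A = TypeVar("A")

def flatten(xss: Iterator[List[A]]) -> Iterator[A]:
    """Remove the inner brackets: a sequence of lists becomes one sequence."""
    for xs in xss:
        for x in xs:
            yield x

# flatten_chop, in miniature: flattening a partition of a sequence into
# lists returns the original sequence.
assert list(flatten(iter([[1], [], [2, 3], [4]]))) == [1, 2, 3, 4]

# filter_flatten: filtering a flattened sequence of lists equals first
# filtering each individual list and then flattening the result.
even = lambda x: x % 2 == 0
xss = [[0, 1], [2, 3], [4]]
lhs = [x for x in flatten(iter(xss)) if even(x)]
rhs = list(flatten(iter([[x for x in xs if even(x)] for xs in xss])))
assert lhs == rhs == [0, 2, 4]
```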
Last revised 24 June 1998, Bart Jacobs and Ulrich Hensel