PVS, temporal logic and stacks


fl

May 31, 2016, 1:24:00 PM5/31/16
to tlaplus



Having a look at PVS (a nice system, but with no temporal logic embedded), and reading the pages in their tutorials where a stack is specified, I have come across axioms like

> top(push(x, s)) = x

or

> pop(push(x, s)) = s

and theorems like

>  pop(pop(push(x, push(y, s)))) = s 

So far so good. But it occurs to me that the kind of theorem you really want is more like

 > if you push an x and then apply any sequence of "push" and "pop" ("push" and "pop" in equal number)
 > you will retrieve your x with "top".

And I think that you need a temporal logic to express this property (easily). But I don't
know exactly which one.

    So here is my question: WHICH ONE?

(To tell the truth, I suppose we can express this theorem in PVS with a sequence of "push" and "pop",
 but with a temporal logic it would look more natural.)

-- 
FL

Ron Pressler

Jun 2, 2016, 2:07:32 PM6/2/16
to tlaplus
To use a temporal logic you need a temporal modality, i.e. a machine. In TLA, the machine is explicit. What you have is a specification of three mathematical functions, not a machine. You can describe the stack itself as a machine (one that interprets "push", "pop" and "top"), which, in TLA(+), would require s to be a variable, say, a sequence, which you can then hide with the temporal existential operator. Alternatively, you can keep push/pop/top as functions or operators, and describe just your theorem as a machine. The specification of your theorem would contain a constant N, which is the number of push/pop operations. You would then initialize two variables, m and n, in the starting state to N, and in each step nondeterministically do a push or a pop, decrementing m or n respectively until they reach zero.
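A sketch of that second alternative (hypothetical constant names; the stack set S and the functions push, pop, top are taken as uninterpreted constants, as in the algebraic spec, and s0 and x0 stand for the initial stack and the first element pushed):

```tla
CONSTANTS S, X, push, pop, top,
          N,        \* number of pushes and of pops to perform
          s0, x0    \* initial stack and first pushed element (hypothetical)

VARIABLES s, m, n

Init == /\ s = push[<<x0, s0>>]      \* first push the element to retrieve
        /\ m = N /\ n = N            \* m pushes and n pops remaining

Next == \/ /\ m > 0                  \* nondeterministically push ...
           /\ m' = m - 1 /\ n' = n
           /\ \E e \in X : s' = push[<<e, s>>]
        \/ /\ n > 0                  \* ... or pop
           /\ n' = n - 1 /\ m' = m
           /\ s' = pop[s]

Spec == Init /\ [][Next]_<<s, m, n>>
```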

Leslie Lamport

Jun 2, 2016, 4:34:18 PM6/2/16
to tlaplus

What you have come across is an example of what's called algebraic
specification.  This is a way of specifying data structures that was
popular in the 70s and 80s.  I believe that most people, including
most mathematicians, would describe a stack as a finite
sequence of elements and describe the operations of push, pop, and top
in terms of sequences.  In algebraic specification, one starts with a
bunch of axioms on push, pop, top, and the empty stack, and defines a
stack as something you get by performing a sequence of push operations
on the empty stack.  This is supposed to be better because it's more
abstract and doesn't depend on a concrete representation of a stack
as a sequence.


Thinking that this kind of abstraction is better is rather silly,
since the two ways of defining stack are isomorphic and one can define
the "concrete" representation in terms of the "abstract" one.  In
practice, algebraic specifications don't work for anything but a few
simple data structures like stacks.  It's very hard for mathematicians
to figure out the correct set of axioms, and impossible for engineers.
I remember years ago talking to someone who worked on this kind of
specification, and he said that you should specify a small number of
primitive data types this way and define anything else "concretely" in
terms of them.  You can think of TLA+ this way, where sets and
functions are defined algebraically and other data structures are
defined in terms of them.  Algebraic specification seems to have
fallen out of fashion.


   if you push an x and then applies any sequence of "push" and "pop"

   ("push" and "pop" in equal number) you will retrieve your x with
   "top".


You need an additional requirement--e.g., it's not true if you perform
all the pops first.  Perhaps this illustrates the difficulty of using
algebraic specification.
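For instance, with the axioms above, performing one pop and then one push after the initial push of x uses equal numbers of pushes and pops, yet

> top(push(y, pop(push(x, s)))) = top(push(y, s)) = y

which need not equal x. (A sketch of the missing side condition: at no intermediate point may the pops performed after pushing x outnumber the pushes performed after it.)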


As Ron pointed out, whoever wrote the spec you found was defining a
data structure, not a machine/program.  For any data
structure, you can define a machine that takes its operations as input
and produces results as output.  I suggest doing that for an arbitrary
data structure rather than specializing it for a stack.


Leslie




Ron Pressler

Jun 2, 2016, 4:43:42 PM6/2/16
to tlaplus
Even though TLA+ encourages machine specification (and, indeed, this works better for arbitrary data structures, and allows powerful use of refinement), I think that in this case, just for the sake of exercise, we can keep the algebraic/functional/denotational axiomatization, yet state the theorem in TLA. Perhaps like so:

------------------------------------------ MODULE Stack  ------------------------------------------
CONSTANTS X, \* The set of elements
          S, \* The set of stacks (intentionally left abstract)
          push,
          pop,
          top

\* Algebraic structure
ASSUME /\ push \in [X \X S -> S]
       /\ pop \in [S -> S]
       /\ top \in [S -> X]

\* Algebraic axioms
ASSUME \A x \in X, s \in S :
         /\ top[push[<<x, s>>]] = x
         /\ pop[push[<<x, s>>]] = s

------------------------------------------ MODULE PushPop ------------------------------------------
EXTENDS Naturals

CONSTANTS S1, x \* initial stack and "x"

ASSUME S1 \in S /\ x \in X

VARIABLES s, n

Init == /\ s = push[<<x, S1>>]
        /\ n = 0
       
Next == \/ n' = n + 1 /\ \E y \in X : s' = push[<<y, s>>]
        \/ n > 0 /\ n' = n - 1 /\ s' = pop[s]

Spec == Init /\ [][Next]_<<s, n>>

THEOREM PP == Spec => [](n = 0 => top[s] = x)
====================================================================================================

PP(S1, x, s, n) == INSTANCE PushPop
THEOREM
    \A s1 \in S, x \in X : \EE s, n : PP(s1, x, s, n)!PP

====================================================================================================


Stephan Merz

Jun 3, 2016, 3:23:49 AM6/3/16
to tla...@googlegroups.com
Hi Ron,

PP(S1, x, s, n) == INSTANCE PushPop
THEOREM 
    \A s1 \in S, x \in X : \EE s, n : PP(s1, x, s, n)!PP

this is not what you want, and in fact, that theorem doesn't mean much: PP(...)!PP is an implication, so for it to be true it's enough to choose s and n such that PP(...)!Spec is false. For example, it is true for a variable n that is 42 initially. What you really mean is

    \A s1 \in S, x \in X : \AA s, n : PP(s1, x, s, n)!PP

but there is no need to state this theorem: it is just an instance of theorem PP in module PushPop.

Now, coming back to the original discussion, the question remains how you would actually prove theorem PP. Obviously, you need some form of induction, intuitively relying on an invariant saying that n equals the length of the sequence of elements that the stack contains above the element that you pushed initially, and that no execution satisfying Spec ever touches the initial part of the stack. But since S is kept abstract, there is no way of talking about that length or that part of the stack directly. People working on algebraic specifications introduced a whole theory of initial models (and also, dually, of final models) that give you suitable inductive proof rules, and various conditions that ensure when a specification has such models.

In "model-based" specification formalisms such as TLA+, you explicitly specify the variable s to be a sequence of X's and implement push, pop, and top using corresponding sequence operations. Then, equating S with Seq(X), the necessary invariant can be stated precisely as

Inv == /\ n \in Nat
       /\ s \in Seq(X)
       /\ Len(s) > Len(S1)
       /\ n = Len(s) - Len(S1) - 1
       /\ SubSeq(s, 1, Len(S1)+1) = Append(S1, x)

THEOREM Spec => []Inv

The proof of that theorem is a routine invariant proof, and theorem PP is an immediate consequence. No need to resort to specific "initial models".

It is true that the specification stated like this commits to implementing the stack as a sequence, which may not be what you want. The abstract spec is easily obtained by hiding s and n, writing

AbsSpec == \EE s, n : Spec

Now, you can implement the stack as you like and need only prove that it behaves as if there were a sequence s and counter n. I believe that is essentially what Leslie meant when he wrote that the algebraic and the model-based styles of specifying a stack were isomorphic.

Best regards,
Stephan


--
You received this message because you are subscribed to the Google Groups "tlaplus" group.
To unsubscribe from this group and stop receiving emails from it, send an email to tlaplus+u...@googlegroups.com.
To post to this group, send email to tla...@googlegroups.com.
Visit this group at https://groups.google.com/group/tlaplus.
For more options, visit https://groups.google.com/d/optout.

Ron Pressler

Jun 3, 2016, 3:41:20 AM6/3/16
to tlaplus


On Friday, June 3, 2016 at 10:23:49 AM UTC+3, Stephan Merz wrote:
this is not what you want, and in fact, that theorem doesn't mean much: PP(...)!PP is an implication, so for it to be true it's enough to choose s and n such that PP(...)!Spec is false. For example, it is true for a variable n that is 42 initially. What you really mean is

    \A s1 \in S, x \in X : \AA s, n : PP(s1, x, s, n)!PP

Oh, of course. Thanks.
 

Now, coming back to the original discussion, the question remains how you would actually prove theorem PP. Obviously, you need some form of induction, intuitively relying on an invariant saying that n equals the length of the sequence of elements that the stack contains above the element that you pushed initially, and that no execution satisfying Spec ever touches the initial part of the stack. But since S is kept abstract, there is no way of talking about that length or that part of the stack directly. People working on algebraic specifications introduced a whole theory of initial models (and also, dually, of final models) that give you suitable inductive proof rules, and various conditions that ensure when a specification has such models.

This I don't understand. Is there no way to do the induction structurally, based on the axioms (as you'd do in algebraic proofs)? The invariant is true in the initial condition based on the first axiom, and then you can introduce a sequence of elements (ys) for the sake of the proof, while making no further assumptions about S. So you can talk about that part of the stack as a sequence (literally) of algebraic operations (push) without exploring S (and then Len(ys) >= 0).
 
Now, you can implement the stack as you like and need only prove that it behaves as if there were a sequence s and counter n. I believe that is essentially what Leslie meant when he wrote that the algebraic and the model-based styles of specifying a stack were isomorphic.

Yeah, I know, I just wanted to see if a hybrid approach is possible (I know it's not what you'd do or want to do in TLA+, but just to test the limits of the logic), i.e. to introduce a machine only for the theorem, and keep the spec algebraic, as that is what FL was asking about.


Ron

fl

Jun 3, 2016, 4:55:13 AM6/3/16
to tlaplus

Thinking that this kind of abstraction is better is rather silly,
since the two ways of defining stack are isomorphic and one can define
the "concrete" representation in terms of the "abstract" one. 


I didn't know that. Very interesting.


Your post and the other ones raise many questions. I'm realizing that the style of specification
is worth thinking about.

You need an additional requirement--e.g., it's not true if you perform
all the pops first.  Perhaps this illustrates the difficulty of using
algebraic specification.



OK. I just wanted to give a light sketch of the theorem I had in mind.
 
--
FL

fl

Jun 3, 2016, 5:01:30 AM6/3/16
to tlaplus

It is true that the specification stated like this commits to implementing the stack as a sequence, which may not be what you want. The abstract spec is easily obtained by hiding s and n, writing

AbsSpec == \EE s, n : Spec

Now, you can implement the stack as you like and need only prove that it behaves as if there were a sequence s and counter n.


Yes. In many specification engines now

> http://frama-c.com/download/acsl-implementation-Aluminium-20160501.pdf (p. 72 & 74)

they have what they call model variables or ghost variables and the formula


> AbsSpec == \EE s, n : Spec

seems to be the exact way of translating the concept into TLA+.

 
I believe that is essentially what Leslie meant when he wrote that the algebraic and the model-based styles of specifying a stack were isomorphic.


It looks so.

--
FL

fl

Jun 3, 2016, 5:24:11 AM6/3/16
to tlaplus

Now, coming back to the original discussion, the question remains how you would actually prove theorem PP. Obviously, you need some form of induction, intuitively relying on an invariant saying that n equals the length of the sequence of elements that the stack contains above the element that you pushed initially, and that no execution satisfying Spec ever touches the initial part of the stack. But since S is kept abstract, there is no way of talking about that length or that part of the stack directly.


Except that you can count the number of "push" and the number of "pop",
as was done in Ron's specification. That way you don't need to refer to the model.
And that is the interesting part of the exercise, since it means that you may need to use
the intriguing operators of linear temporal logic to formalize it: next, globally,
eventually, until.

For instance if you try to formalize

> top(pop(push(y, push(x, s)))) = x

I think you need to use the operator "next"

> if you push x in the first state, push y in the "next" one, pop in the "next" one, and use "top"
> in the "next" one, you get the value pushed in the first state.

Other theorems requiring temporal operators can be imagined. That is the interesting part: otherwise
the temporal operators are never used, and only axiomatic specifications make them reappear.

--
FL

fl

unread,
Jun 3, 2016, 5:55:32 AM6/3/16
to tlaplus

they have what they call model variables or ghost variables and the formula

 

An attempt at using model-based specification while at the same time being
independent of the implementation. Thus it has the ease of model-based
specification and the generality of algebraic specification.

--
FL

Stephan Merz

Jun 4, 2016, 3:31:03 AM6/4/16
to tla...@googlegroups.com
Now, coming back to the original discussion, the question remains how you would actually prove theorem PP. Obviously, you need some form of induction, intuitively relying on an invariant saying that n equals the length of the sequence of elements that the stack contains above the element that you pushed initially, and that no execution satisfying Spec ever touches the initial part of the stack. But since S is kept abstract, there is no way of talking about that length or that part of the stack directly. People working on algebraic specifications introduced a whole theory of initial models (and also, dually, of final models) that give you suitable inductive proof rules, and various conditions that ensure when a specification has such models.

This I don't understand. Is there no way to do the induction structurally, based on the axioms (as you'd do in algebraic proofs)? The invariant is true in the initial condition based on the first axiom, and then you can introduce a sequence of elements (ys) for the sake of the proof, while making no further assumptions about S. So you can talk about that part of the stack as a sequence (literally) of algebraic operations (push) without exploring S (and then Len(ys) >= 0).

I understand the intention, but your axioms

\* Algebraic structure
ASSUME /\ push \in [X \X S -> S]
       /\ pop \in [S -> S]
       /\ top \in [S -> X]

\* Algebraic axioms
ASSUME \A x \in X, s \in S : 
         /\ top[push[<<x, s>>]] = x
         /\ pop[push[<<x, s>>]] = s

are not strong enough for justifying a principle of structural induction. In particular, there is no formal basis for asserting that the parameter set S is constructed by (the empty stack and) the push operation. Also, there could be other operations on stacks. It seems to me that if you want to simulate an algebraic specification of a stack in TLA+, you need to write something like

StackAxioms(X,S,mt,ps,pp,tp) ==
  /\ mt \in S
  /\ ps \in [S \X X -> S]
  /\ pp \in [S -> S]
  /\ tp \in [S -> X]
  /\ \A x \in X, s \in S : pp[ps[s,x]] = s /\ tp[ps[s,x]] = x
  /\ \A x \in X, s \in S : mt # ps[s,x]
  /\ \A T \in SUBSET S : mt \in T /\ (\A x \in X, s \in T : ps[s,x] \in T) => S = T

push(X) == CHOOSE ps : \E S, mt, pp, tp : StackAxioms(X,S,mt,ps,pp,tp)
Stack(X) == DOMAIN push(X)
empty(X) == CHOOSE mt : \E pp, tp : StackAxioms(X, Stack(X), mt, push(X), pp, tp)
pop(X) == CHOOSE pp : \E tp : StackAxioms(X, Stack(X), empty(X), push(X), pp, tp)
top(X) == CHOOSE tp : StackAxioms(X, Stack(X), empty(X), push(X), pop(X), tp)

In order to justify that these definitions are sensible, you first need to prove

THEOREM StackExistence == \A X : \E S,mt,ps,pp,tp : StackAxioms(X,S,mt,ps,pp,tp)

which is possible by exhibiting the standard model based on sequences. Then, prove similar theorems justifying the definitions involving CHOOSE. Finally, you can derive the desired induction rule

THEOREM StackInduction ==
  ASSUME NEW X, NEW P(_),
         P(empty(X)),
         \A s \in Stack(X), x \in X : P(s) => P(push(X)[s,x])
  PROVE  \A s \in Stack(X) : P(s)

Finally, it looks like you can augment your specification with a history variable y that records the sequence of elements that have been pushed on top of the element x pushed in the initial condition but not yet popped off and prove that the stack s in the specification behaves like that sequence y (i.e., top and pop applied to s yield the same values as the analogous operations applied to y). If you want to use TLA+, the standard way of specifying a stack directly as a sequence (and possibly hiding the sequence using \EE) looks much more straightforward.
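That history variable might, hypothetically, be added to Ron's PushPop machine along these lines (y records the elements currently sitting above the initially pushed x; Head, Tail and Len require the Sequences module):

```tla
EXTENDS Naturals, Sequences

VARIABLES s, n, y

InitH == /\ s = push[<<x, S1>>]
         /\ n = 0
         /\ y = << >>                          \* nothing above x yet

NextH == \/ /\ n' = n + 1
            /\ \E e \in X : /\ s' = push[<<e, s>>]
                            /\ y' = <<e>> \o y \* record the new top
         \/ /\ n > 0
            /\ n' = n - 1
            /\ s' = pop[s]
            /\ y' = Tail(y)                    \* forget the popped top

\* s behaves like y stacked on top of the frozen prefix; in particular:
InvH == /\ n = Len(y)
        /\ IF y = << >> THEN top[s] = x ELSE top[s] = Head(y)
```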

Formalisms for algebraic specifications hide all this complexity in customized patterns whose correctness has been proved once and for all at the meta-level. For example, in CASL you'd write

spec STACK =
   sorts X
   free type Stack ::= empty | push(Stack; X)
   ops top : Stack -> X
          pop : Stack -> Stack
   vars s : Stack; x : X
   axioms
       top(push(s,x)) = x
       pop(push(s,x)) = s
end

and the semantic analyzer would check that this specification is well-formed and generate the corresponding induction rule. It ought to be possible to pre-define corresponding patterns within TLA+, but it's not clear to me how often they would be useful. When it comes to more complex data structures (say, balanced trees), you tend to need more control than algebraic specifications can provide.

NB: The above construction of stacks is similar to the definition of natural numbers in TLA+'s standard modules.

Best regards,
Stephan

Ron Pressler

Jun 5, 2016, 2:06:01 AM6/5/16
to tla...@googlegroups.com


On Saturday, June 4, 2016 at 10:31:03 AM UTC+3, Stephan Merz wrote:

In particular, there is no formal basis for asserting that the parameter set S is constructed by (the empty stack and) the push operation. Also, there could be other operations on stacks.

That's true, and indeed the axioms aren't complete. I believe I understand how your axioms specify S in terms of the operations operating on it and in terms of X. Suppose we have this operator:

    RECURSIVE _ ** _
    f ** k ==
        IF k = 0 THEN [t \in DOMAIN f |-> t]
                 ELSE [t \in DOMAIN f |-> f[(f ** (k-1))[t]]]


Then your axioms are absolutely necessary to prove the theorem:

    \A s \in Stack(X) : \E n \in Nat : (pop(X) ** n)[s] = empty(X)

So I think I understand your point about the necessity for a "framework" to support algebraic reasoning (and agree that it is probably not worthwhile in TLA+).

But just for the sake of the exercise, I'm not sure that all of that is required for FL's particular theorem. More below.


Finally, it looks like you can augment your specification with a history variable y that records the sequence of elements that have been pushed on top of the element x pushed in the initial condition but not yet popped off and prove that the stack s in the specification behaves like that sequence y (i.e., top and pop applied to s yield the same values as the analogous operations applied to y).

 Right, that was my original idea, but perhaps we can do something much simpler:
            
    Inv == top[(pop ** n)[s]] = x


where n and s are the variables in my PushPop module. I think that the invariant Inv is an inductive invariant, that it proves the theorem, and that no further axioms than "my" two (specifically with regards to the empty stack and the construction of S) are necessary to prove Inv.
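Indeed, a proof sketch for the push step needs only the second axiom plus the fact that pop commutes with its own iterates (pop[(pop ** k)[t]] = (pop ** k)[pop[t]], provable by induction on k): with n' = n + 1 and s' = push[<<y, s>>],

> top[(pop ** n')[s']] = top[(pop ** n)[pop[push[<<y, s>>]]]] = top[(pop ** n)[s]] = x

and the pop step (n > 0, n' = n - 1, s' = pop[s]) is the same calculation.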

Finally, while your axiomatization constructs S and its operations "from scratch" in terms of X and the axioms, wouldn't it be easier in practice to do something like the following, instead of CHOOSEing the functions etc.:

CONSTANTS X, \* The set of elements
          S, \* The set of stacks
          push, pop, top, EMPTY
         
ASSUME /\ EMPTY \in S

       /\ push \in [X \X S -> S]
       /\ pop \in [S \ {EMPTY} -> S]
       /\ top \in [S \ {EMPTY} -> X]
       /\ \A x \in X, s \in S : push[<<x, s>>] \in S \* closure
       /\ \A x \in X, s \in S : push[<<x, s>>] /= EMPTY
       /\ \A x \in X, s \in S : top[push[<<x, s>>]] = x
       /\ \A x \in X, s \in S : pop[push[<<x, s>>]] = s

       /\ \A s \in S : s = EMPTY \/ \E s1 \in S, x \in X : s = push[<<x, s1>>] 

Doesn't this specify S sufficiently in terms of X and the operations, rather than constructing S as you've done?

Ron

Ron Pressler

Jun 5, 2016, 2:57:44 AM6/5/16
to tlaplus
P.S.

My closure axiom is unnecessary as it is covered by the first axiom for push. Also, due to my lack of algebra/logic/model-theory knowledge, I don't understand why your axiom about the uniqueness of S is necessary. Anyway, I think that this exercise proves Leslie's point about the difficulty of the algebraic/axiomatic approach for engineers (like me), not to speak of the difficulty of model-checking such specifications (BTW, are there model checkers for that approach?)

Ron

Stephan Merz

Jun 5, 2016, 3:21:06 AM6/5/16
to tla...@googlegroups.com
Thanks – I agree with almost everything you say, in particular the invariant for proving the original property via iterated function application.

On constructing the data structure vs. relying on assumptions: I think both have their place. For example, in mathematics one typically constructs, say, the real numbers via Dedekind cuts, but presents algebraic structures such as groups through characteristic axioms. For our purposes, presentation through ASSUME leaves more freedom for later implementations of the data structure, since only the stated properties have to be satisfied – but knowing when you have stated enough properties can be difficult.

Case in point: your last conjunct

       /\ \A s \in S : s = EMPTY \/ \E s1 \in S, x \in X : s = push[<<x, s1>>]  

(trivially) follows from the induction rule for S that was included in my specification, but does not imply that rule. To see this, consider a structure where S is the set of finite *and infinite* sequences over X, where EMPTY is the empty sequence, and where the operations work on the first element of the sequence. This model satisfies all of your axioms, but not the induction rule: otherwise, you'd prove that the length of all elements of S is finite since

Len(EMPTY) \in Nat  and
\A s \in S, x \in X : Len(s) \in Nat => Len(push[<<x, s>>]) \in Nat

but the conclusion \A s \in S : Len(s) \in Nat is not true in the model. The existence of such non-standard models can cause frustration when proofs do not go through. Personally, I tend to think about data structures in terms of intended models, so I find it easier to specify them in model-based style.

P.S.

Also, due to my lack of algebra/logic/model-theory knowledge, I don't understand why your axiom about the uniqueness of S is necessary.

See above.

Anyway, I think that this exercise proves Leslie's point about the difficulty of the algebraic/axiomatic approach for engineers (like me), not to speak of the difficulty of model-checking such specifications (BTW, are there model checkers for that approach?)

Fully agreed. Tool support for algebraic specifications comes mainly in the form of proof assistants, although I believe that some animation techniques similar to Alloy are possible for restricted fragments.

Best regards,
Stephan



Ron Pressler

Jun 5, 2016, 5:55:48 AM6/5/16
to tlaplus

On Sunday, June 5, 2016 at 10:21:06 AM UTC+3, Stephan Merz wrote:

On constructing the data structure vs. relying on assumptions: I think both have their place. For example, in mathematics one typically constructs, say, the real numbers via Dedekind cuts, but presents algebraic structures such as groups through characteristic axioms. For our purposes, presentation through ASSUME leaves more freedom for later implementations of the data structure, since only the stated properties have to be satisfied – but knowing when you have stated enough properties can be difficult.

Doesn't your approach essentially use CHOOSE to magically choose a model (basically, pull it out of thin air) in case the axiomatization isn't complete, in a way that is equally opaque to proofs? I mean, in the end, all the proofs have to work with is your assumptions, not any other property of a model that's been CHOsen (unless maybe those properties are known through existing algebras, like Nat  ???)
 

Case in point: your last conjunct (trivially) follows from the induction rule for S that was included in my specification, but does not imply that rule. To see this, consider a structure where S is the set of finite *and infinite* sequences over X

Ah, right. Whenever you're confused about axioms, the answer lies with uncountable models! But I still don't see how your approach addresses this. You might as well have mistakenly included my axiom instead of yours, and CHOOSE would still work (wouldn't it?), and so the problem of knowing when you've stated enough properties remains, no?

Ron

Stephan Merz

Jun 6, 2016, 2:56:51 PM6/6/16
to tla...@googlegroups.com


On constructing the data structure vs. relying on assumptions: I think both have their place. For example, in mathematics one typically constructs, say, the real numbers via Dedekind cuts, but presents algebraic structures such as groups through characteristic axioms. For our purposes, presentation through ASSUME leaves more freedom for later implementations of the data structure, since only the stated properties have to be satisfied – but knowing when you have stated enough properties can be difficult.

Doesn't your approach essentially use CHOOSE to magically choose a model (basically, pull it out of thin air) in case the axiomatization isn't complete, in a way that is equally opaque to proofs? I mean, in the end, all the proofs have to work with is your assumptions, not any other property of a model that's been CHOsen (unless maybe those properties are known through existing algebras, like Nat  ???)

Absolutely. I said that I'd justify the existence of a structure satisfying the characteristic predicate by proving the theorem

THEOREM StackExistence == \A X : \E S,mt,ps,pp,tp : StackAxioms(X,S,mt,ps,pp,tp)

You'll also want to prove

StackAxioms(X, Stack(X), empty(X), ...)

so that you can actually reason about the specification. In that sense, you need to prove well-definedness upfront – but of course, that doesn't mean that your specification is complete, since you can only rely on what the axioms assert.

When you ASSUME the structure, you are not forced to justify the assumptions, although it would perhaps be good practice to make sure that you haven't introduced an inconsistency.

Stephan