Improved definition checking of set.mm, and df-sbc no longer an exception


David A. Wheeler

Sep 24, 2019, 2:54:42 PM
to metamath
FYI:

@wlammen recently posted an example where an unsound definition causes problems: https://github.com/metamath/set.mm/pull/1089
Details below if you'd like.

This is expected per the Metamath book, because definitions are just axiomatic ($a) statements, and Metamath verifiers merely confirm that proven ($p) statements follow from $a statements. There are far fewer definitions than proven statements, and definitions are generally far simpler than proofs.

That said, we don't *really* want unsound definitions. mmj2 already has a number of checks on definitions. Thanks to Mario, mmj2 version v2.5.3 now has a tweaked definition checker that does a definitional soundness check, which rejects this kind of nonsense. This version of mmj2 is now incorporated in our CI build, so every merged change into set.mm will be checked to prevent these kinds of crazy things.

A related improvement: previously mmj2's checkers were configured to exclude df-sbc , because it reused a constant that had already been introduced, namely [ A / x ] ph. However, Norm changed it to use new constants in Jan 2017: df-sb is '[ y / x ] ph' and df-sbc is now '[. A / x ]. ph'. Thus df-sbc can be removed from the exception list, which is one less thing that needs to be handled specially.

For more about this, see: https://github.com/metamath/set.mm/pull/1089

--- David A. Wheeler

P.S. Here's the definition that caused the problem.

$( Declare a new symbol. $)
$c BustVerifier $. $( A meaningless symbol $)

$( Extend wff definition to include BustVerifier. $)
wbustverifier $a wff BustVerifier $.

${
$d x y $.
$( Try to bust the proof verifier using this definition. (Contributed by
Wolf Lammen, 9-Sep-2019.) $)
df-bust-verifier $a |- ( BustVerifier <-> x = y ) $.
$}

${
$d x y $.
$( A contradiction. (Contributed by Wolf Lammen, 9-Sep-2019.) $)
bust-verifier $p |- ( -. x = y -> F/ x x = y ) $=
( weq wnf wn wbustverifier df-bust-verifier bicomi nfv nfxfr a1i ) ABCZAD
LELFAFLABGHFAIJK $.
$}

Jim Kingdon

Sep 24, 2019, 6:29:03 PM
to meta...@googlegroups.com
Thanks to everyone who worked on this issue. As described in that github
issue, we now have a (better) working definition checker as part of the
Travis build.

The definition checker means that unsound definitions are a matter of
"well, we might or might have an elegant design but we at least have a
way of catching mistakes or people who don't understand the system well
enough" as opposed to "OMG we need to rearchitect the whole thing
because how can we trust the hundreds of definitions in set.mm?".

Mario Carneiro

Sep 24, 2019, 7:18:12 PM
to metamath
I think that this incident has made it clearer that we need a proper spec for definition checking though, as well as at least one other implementation, suitable as a reference implementation and not tied to a complex system. Unfortunately, definition checking imposes higher parsing requirements than verification, because axioms don't come with parse trees, plus you have to be able to read $j rules (or be willing to hardcode lots of things like the names of sorts).

--
You received this message because you are subscribed to the Google Groups "Metamath" group.
To unsubscribe from this group and stop receiving emails from it, send an email to metamath+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/metamath/a0108bbd-b35e-71ec-223a-6b9e0c591be7%40panix.com.

Giovanni Mascellani

Sep 25, 2019, 3:17:53 AM
to meta...@googlegroups.com
Hi,

On 25/09/19 01:17, Mario Carneiro wrote:
> I think that this incident has made it clearer that we need a proper
> spec for definition checking though, as well as at least one other
> implementation, suitable as a reference implementation and not tied to a
> complex system. Unfortunately definition checking requires higher
> parsing requirements than verification, because axioms don't come with
> parse trees, plus you have to be able to read $j rules (or be willing to
> hardcode lots of things like the names of sorts).

mmpp more or less has these capabilities, and I have thought about
reimplementing the definition checker in the past, but never did it.
This could be an occasion to resume that project. It should not be hard.
In case there isn't one already, though, I would need a specification of
what has to be checked. I have a fairly good idea of what it is, but I
might be mistaken, and comparing with an external reference definition
is always good in these cases.

Giovanni.
--
Giovanni Mascellani <g.masc...@gmail.com>
Postdoc researcher - Université Libre de Bruxelles


David A. Wheeler

Sep 25, 2019, 9:06:49 PM
to metamath, metamath
On Tue, 24 Sep 2019 19:17:58 -0400, Mario Carneiro <di....@gmail.com> wrote:
> I think that this incident has made it clearer that we need a proper spec
> for definition checking though, as well as at least one other
> implementation, suitable as a reference implementation and not tied to a
> complex system.

I agree, and I don't feel confident about creating such a definition myself.

Someone :-) seems to have implemented a definition check in mmj2, current version here:
https://github.com/digama0/mmj2/blob/master/mmj2jar/macros/definitionCheck.js

I think we should try to formalize what that is doing.

Note that the recommended way to run this check is the following
(which excludes axioms & some specific definitions):
RunMacro,definitionCheck,ax-*,df-bi,df-clab,df-cleq,df-clel

Thoughts?

--- David A. Wheeler

Benoit

Sep 26, 2019, 9:49:49 AM
to Metamath
Here is my understanding of the definition checker:

The $a-statement <label> with typecode |- passes the test if and only if it respects the following five rules.
Rule 1: <label> should be a biconditional or an equality (i.e. its root-symbol should be <-> or =).
If <label> passes Rule 1, then we can define its definiendum as its LHS and its definiens as its RHS.
We define the *defined symbol* as the root-symbol of the LHS.
We define a *dummy variable* as a variable occurring in the RHS but not in the LHS.
Rule 2: The defined symbol should not occur in an earlier $a-statement with typecode |- or in another place in <label>.
Rule 3: No two variables occurring in the LHS should share a DV condition.
Rule 4: All dummy variables are required to be disjoint from any other (dummy or not) variable occurring in <label>.
Rule 5: If X is a dummy variable, then there should be an earlier justification theorem (I presume this justification theorem is a biconditional (resp. equality) whose LHS is the definiens and whose RHS is the definiens with X replaced by some Y of the same type), or the definition checker should be able to generate one automatically.

Remarks:
* The "root-symbol" is the root of the considered tree; it need not correspond to a single token in the database (e.g. w3o or wsb).
* "DV" stands for "disjoint variable".
* I think that Rule 3 is not necessary.  It's true that a definition not satisfying Rule 3 is a bit strange, but Rules 1,2,4,5 seem to suffice for conservativity and eliminability.
* In the mmj2 implementation, Rule 5 is actually decomposed into two rules (5 and 6) according to whether X is of type setvar or not.  If not, then the definition checker will not try to generate a justification theorem (so there had better be one in the database); if X is of type setvar and there is no justification theorem in the database, then the definition checker will try to generate one of the form "|- F/ x definiens" (actually, "|- ( definiens -> A. x definiens )").
* Note that because of Rule 5, "passing the test" may actually depend on the effectiveness of the algorithm trying to generate a justification theorem.
* By the way, are there any non-setvar dummy variables in set.mm?  An example would be "df-nul2 $a |- (/) = ( A \ A ) $." or the former definition ~dftru2 of T.  Actually, any non-setvar dummy variable could be replaced by _V (or (/)) or T. (or F.), depending on its type.
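The five rules above can be sketched in a few lines of Python. This is a hypothetical illustration, not mmj2's implementation: the `Node` parse-tree class, the toy convention that lowercase leaves are variables, and the `has_justification` flag are all invented here.

```python
# Hedged sketch of Benoit's five rules, on toy parse trees.
# Convention invented for illustration: a leaf whose symbol is all
# lowercase counts as a variable; everything else is a constant.
from dataclasses import dataclass, field

@dataclass
class Node:
    sym: str                       # root symbol, e.g. "<->", "=", "x"
    kids: list = field(default_factory=list)

def variables(node):
    """Variable leaves of a parse tree (toy convention: lowercase leaves)."""
    if not node.kids:
        return {node.sym} if node.sym.islower() else set()
    return set().union(*(variables(k) for k in node.kids))

def walk(node):
    yield node
    for k in node.kids:
        yield from walk(k)

def check_definition(tree, earlier_symbols, dv_pairs, has_justification):
    """Return the list of violated rule numbers for one |- $a-statement."""
    # Rule 1: the root symbol must be <-> or =.
    if tree.sym not in ("<->", "="):
        return [1]                 # the later rules need the LHS/RHS split
    lhs, rhs = tree.kids
    errors = []
    # Rule 2: the defined symbol (root of the LHS) must not occur in an
    # earlier statement or anywhere else in this one.
    if lhs.sym in earlier_symbols or any(n.sym == lhs.sym for n in walk(rhs)):
        errors.append(2)
    # Rule 3: no two LHS variables may share a DV condition.
    lv = variables(lhs)
    if any(a in lv and b in lv for (a, b) in dv_pairs):
        errors.append(3)
    # Rule 4: every dummy must be disjoint from every other variable.
    dummies = variables(rhs) - lv
    others = variables(tree)
    if any(not ((d, o) in dv_pairs or (o, d) in dv_pairs)
           for d in dummies for o in others - {d}):
        errors.append(4)
    # Rule 5: dummies require a justification theorem (possibly generated).
    if dummies and not has_justification:
        errors.append(5)
    return errors
```

On this toy encoding, df-bust-verifier (LHS `BustVerifier`, RHS `x = y`, with $d x y and no justification theorem) fails exactly Rule 5, matching the thread.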

Benoit

ookami

Sep 26, 2019, 10:56:31 AM
to meta...@googlegroups.com
The definition df-bust-verifier is not per se "crazy"; it becomes so in the presence of ax-5. Should you ever strive for a set.mm without distinct variable constraints, usage of df-bust-verifier is at your discretion. In FOL (without sets), distinct variable constraints are a mere convenience extension, because requiring that two variables be distinct can be achieved by a distinctor hypothesis -. A. x x = y, and requiring that x be distinct from ph can be replaced with a F/ x ph hypothesis (I leave out the boring special case of a one-object universe).

This gives rise to the question of how the definitionChecker works: does it assume the current axiomatic foundation is in place, or does it really do a semantic check of the axioms it finds in the file and individually derive requirements for soundness? In other words: if I remove ax-5 from set.mm and discard theorems based on it, is df-bust-verifier accepted then?

I am curious, and don't have the time to look at the sources. I am not too disappointed, if I receive no answer.

Wolf

Benoit

Sep 26, 2019, 11:29:47 AM
to meta...@googlegroups.com
This gives rise to the question of how the definitionChecker works: does it assume the current axiomatic foundation is in place, or does it really do a semantic check of the axioms it finds in the file and individually derive requirements for soundness? In other words: if I remove ax-5 from set.mm and discard theorems based on it, is df-bust-verifier accepted then?

The definition checker does not assume an axiomatic foundation: it reads the provided mm database (which has to be similar to set.mm in terms of typecodes (e.g. "|-" and "setvar"), but it also works e.g. on iset.mm).  It checks Rules 1--5 I gave in the post above, because we can prove, outside of Metamath, that a definition respecting these rules is conservative and eliminable (a similar metatheorem is proved for instance in Kleene's Introduction to Metamathematics).
 
In particular, if ax-5 is removed, df-bust-verifier will still be rejected, because it violates Rule 5, regardless of previous axioms (as long as no justification theorem for it is provable, but in that case the axiomatization would not be consistent).  It is instructive to look at how $a-statements violate the various rules.  To do this, remove the statements excluded from the check in the RunParms.txt file.  Here is an example of an mmj2 output (most ax-* are rejected because of Rule 1):
------------
I-UT-0015 **** Processing RunParmFile Command #4 = SetMMDefinitionsCheckWithExclusions,ax-*
I-PA-0201 Axiom df-bi has failed the definitional soundness check. The root symbol is not = or <->.
I-PA-0202 Axiom df-clab has failed the definitional soundness check. The previous axiom ax-c14 uses the symbol being defined in this axiom.
I-PA-0202 Axiom df-cleq has failed the definitional soundness check. The previous axiom ax-8d uses the symbol being defined in this axiom.
I-PA-0202 Axiom df-clel has failed the definitional soundness check. The previous axiom df-clab uses the symbol being defined in this axiom.
I-PA-0206 Axiom df-bust-verifier has failed the definitional soundness check. Dummy variables [y] are possibly free in the definiendum, and no justification is available.
-----------
[By the way: the failure to pass the test for df-clab and df-cleq should not be due to the obsolete axioms mentioned, but this is another matter.]

Benoit

Mario Carneiro

Sep 26, 2019, 2:17:23 PM
to metamath
On Thu, Sep 26, 2019 at 9:49 AM Benoit <benoit...@gmail.com> wrote:
* I think that Rule 3 is not necessary.  It's true that a definition not satisfying Rule 3 is a bit strange, but Rules 1,2,4,5 seem to suffice for conservativity and eliminability.

Rule 3 is not necessary for it to be a conservative extension, but it is required for it to be a definitional extension, in the same way that the "definition"

cfoo $a class Foo $.

(with no associated definitional theorem) posits the existence of a constant but does not pin it down. This does not lead to contradiction or the ability to prove new theorems, because you can always just replace Foo with (/) everywhere it appears, but you do not have eliminability, in that there are propositions which contain Foo that are not provably equal to any proposition in the original language (for example "(/) e. Foo").

If you leave out rule 3, you can define something like

cbar $a wff Bar x y $.
${ $d x y $. df-bar $a |- ( Bar x y <-> x = y ) $. $}

and now "Bar x x" is one of these unmentionables: there is no way to prove that it means anything in particular, because the definitional theorem that is supposed to be responsible for connecting it to the original language wants nothing to do with this expression, even though it is well formed.

You could go further and make it not even well formed:

${ $d x y $.
  cbar $a wff Bar x y $.
  df-bar $a |- ( Bar x y <-> x = y ) $.
$}

but now the grammar is not context free because the term formers have side conditions, so parsing becomes harder. (If it's just $d preconditions appearing in syntax axioms then you can probably get a reasonable method by generating all parse trees and rejecting all those that violate a DV condition, but if you add hypotheses then it becomes outright undecidable.)
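One way to make the Rule 3 failure mode concrete is to look at substitution: the definitional theorem df-bar carries the $d x y condition, so any attempted instantiation that identifies x and y is simply not an instance of the theorem, and "Bar x x" can never be rewritten. A minimal Python sketch (the representation of substitutions as variable-to-variable-list maps is invented here):

```python
# Hedged sketch: a substitution instantiates a theorem only if it
# respects the theorem's $d conditions, i.e. the terms substituted
# for a DV pair share no variables.  Representation invented here.
def instantiate(dv_pairs, subst):
    """Return subst if it respects every $d condition, else None."""
    for a, b in dv_pairs:
        if set(subst[a]) & set(subst[b]):
            return None            # DV violated: not an instance
    return subst

# df-bar has $d x y.  Rewriting "Bar x y" to "x = y" is allowed:
assert instantiate({("x", "y")}, {"x": ["x"], "y": ["y"]}) is not None
# ...but "Bar x x" would need both x and y to become x, which the
# $d condition forbids, so no rewrite of "Bar x x" exists:
assert instantiate({("x", "y")}, {"x": ["x"], "y": ["x"]}) is None
```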

On Thu, Sep 26, 2019 at 10:56 AM 'ookami' via Metamath <meta...@googlegroups.com> wrote:
The definition df-bust-verifier is not per se "crazy", it becomes so in the presence of ax-5. Should you ever strive for a set.mm without distinct variable constraints, usage of df-bust-verifier is at your discretion. Distinct variable constraints are in FOL (without sets) a convenience extension, because requiring two variables be distinct can be achieved by a distinctor hypothesis -. A. x x = y, and x be distinct from ph is replaceable with a F/ x ph hypothesis (I leave out the boring special case of a one-object universe).

I believe this is false. While I don't think you can derive false, you can at least derive that the universe has one element and hence distinctors don't work:

$d x y  $d x z
1 |- ( BustVerifier <-> x = y )
2 |- ( BustVerifier <-> x = z )
3:1,2 |- ( x = y <-> x = z )
4 |- ( x = y -> x = y )
5:3 |- ( x = y -> x = z )
6:4,5 |- ( x = y -> y = z )
7:6 |- ( E. x x = y -> y = z )
8 |- E. x x = y
9:7,8 |- y = z
10:9 |- A. y y = z
 
I haven't carefully analyzed these supporting theorems for ax-5 usage, in particular step 7 (exlimiv), but it's a bit weird to have $d hypotheses from df-bust-verifier when the axiom system itself doesn't deal with them.

This gives rise to the question of how the definitionChecker works: does it assume the current axiomatic foundation is in place, or does it really do a semantic check of the axioms it finds in the file and individually derive requirements for soundness? In other words: if I remove ax-5 from set.mm and discard theorems based on it, is df-bust-verifier accepted then?

Deleting theorems will only make more definitions be rejected, because the theorem prover that discharges side conditions will be able to prove less things. The reasoning in the mmj2 definition checker is tailored to set.mm theorems (for example, step 1 - "every definition must have <-> or = at the head" - is very specific to set.mm). The next generation of definition checker will be based on the $j annotations, which attempt to define generically properties of certain constants which suffice for the equality metatheorem ( x = y -> P(x) ~ P(y) ) mentioned in df-cdeq (http://us2.metamath.org/mpeuni/mmtheorems32.html#mm3162s).



On Thu, Sep 26, 2019 at 11:29 AM Benoit <benoit...@gmail.com> wrote:
The definition checker does not assume an axiomatic foundation: it reads the provided mm database (which has to be similar to set.mm, in terms of typecodes (e.g. "|-" and "setvar") but it also works e.g. on iset.mm).  It checks Rules 1--5 I gave in the post above because we can prove, outside of Metamath, that a definition respecting these rules is conservative and eliminable (a similar metatheorem is proved for instance in Kleene's Introduction to metamathematics).

This proof relies on the database being significantly set.mm-like, down to the names of certain constants (and especially their types). You can probably fool the definition checker or cause it to crash if you define "wb" to have three arguments, or not act like an equivalence relation, or other things like that.

Mario

David A. Wheeler

Sep 26, 2019, 2:28:17 PM
to metamath, metamath
Here's a first cut at modifying the "conventions" text to
explain the additional rules for the definition check and also the LRParser check.
I took existing text as a starting point. I think it's
important to have "Rule 5" and "Rule 6" as separate items, since
that is how users will see error reports.

Mario: I know you're working to generalize this, but I think it's
important to document "as it is" now, and refine it as other
refinements occur.

--- David A. Wheeler

============================


diff --git a/set.mm b/set.mm
index 7ff58bf..4b4c9cb 100644
--- a/set.mm
+++ b/set.mm
@@ -369106,7 +369106,81 @@ $)
renamer theorem like ~ cbval . The section
~ mmtheorems32.html#mm3146s also describes the
metatheorem that underlies this.
- </p></HTML>
+ </p>
+
+ <p>
+ Standard Metamath verifiers do not distinguish between axioms and
+ definitions (both are $a statements).
+ In practice, we require that definitions (1) not be creative
+ (a definition should not allow an expression
+ that previously qualified as a wff but was not provable
+ to become provable) and be eliminable
+ (there should exist an algorithmic method for converting any
+ expression using the definition into
+ a logically equivalent expression that previously qualified as a wff).
+ To ensure this, we have additional rules on almost all definitions
+ ($a statements not beginning with the name ax-).
+ These additional rules are not applied in a few cases where the rules
+ are too strict ( ~ df-bi , ~ df-clab , ~ df-cleq , ~ df-clel );
+ see those definitions for more information.
+ These additional rules are checked by at least
+ <A HREF="https://github.com/digama0/mmj2/blob/master/mmj2jar/macros/definitionCheck.js"
+ >mmj2's definition check</A>.
+ This definition check relies on the database being very much like
+ set.mm, down to the names of certain constants and types, so it
+ cannot apply to all Metamath databases but is useful in set.mm.
+ In this definition check, a $a-statement with a given label and
+ typecode ` |- ` passes the test if and only if it
+ respects the following rules in order (these rules imply that we have
+ an unambiguous tree parse, which is also checked):
+ </p>
+
+ <ul>
+ <li>Rule 1:
+ The expression must be a biconditional or an equality (i.e. its
+ root-symbol must be ` <-> ` or ` = `).
+ We define its definiendum as its left hand side (LHS) and
+ its definiens as its right hand side (RHS).
+ We define the *defined symbol* as the root-symbol of the LHS.
+ We define a *dummy variable* as a variable occurring
+ in the RHS but not in the LHS.
+ Note that the "root-symbol" is the root of the considered tree;
+ it need not correspond to a single token in the database
+ (e.g. ~ w3o or ~ wsb ).
+ <li>Rule 2: The defined symbol should not occur in an earlier
+ $a-statement with typecode ` |- ` or in any other place
+ in this labelled expression.
+ <li>Rule 3: No two variables occurring in the LHS should share a
+ distinct variable (DV) condition.
+ Without this rule, you can define something like
+ cbar $a wff Bar x y $.
+ ${ $d x y $. df-bar $a |- ( Bar x y <-> x = y ) $. $}
+ and now "Bar x x" is not eliminatible;
+ there is no way to prove that it means anything in particular,
+ because the definitional theorem that is supposed to be
+ responsible for connecting it to the original language wants
+ nothing to do with this expression, even though it is well
+ formed.
+ <li>Rule 4: All dummy variables are required to be disjoint from any
+ other (dummy or not) variable occurring in this labelled expression.
+ <li>Rule 5: Either
+ (1) there must be no non-set dummy variables, or
+ (2) there must be a justification theorem.
+ The justification theorem must be of form
+ ` |- ( ` definiens root-symbol definiens' ` ) `
+ where definiens' is definiens but the dummy variables are replaced
+ with other unused dummy variables of the same type.
+ Note that root-symbol is ` <-> ` or ` = ` , and that set variables
+ are simply variables with the ` setvar ` typecode.
+ <li>Rule 6: One of the following must be true:
+ (1) there must be no set dummy variables,
+ (2) there must be a justification theorem as described in Rule 5, or
+ (3) we must prove that each dummy set variable is
+ not free, that is, we must prove that
+ ` ( ph -> A. x ph ) ` for each dummy variable ` x `
+ where ` ph ` is the definiens.
+ Option (3) of Rule 6 is typically how most dummy variables are handled.
+ </ul></HTML>

Here is more information about our processes for checking and
contributing to this work:
@@ -369135,6 +369209,34 @@ $)
(triggering further review).
</li>

+ <li><b>LRParser check.</b>
+ Metamath verifiers ensure that $p statements follow from previous
+ $a and $p statements.
+ However, by itself the Metamath language permits certain kinds of
+ syntactic ambiguity that we want to avoid in this database.
+ Thus, we require that this database unambiguously parse
+ using the "LRParser" check (implemented by at least mmj2).
+ Details are in
+ <A HREF="https://github.com/digama0/mmj2/blob/master/src/mmj/verify/LRParser.java"
+ >LRParser.java</A>.
+ This check
+ <A HREF="https://github.com/metamath/set.mm/pull/754"
+ >counters, for example, a devious ambiguous construct
+ developed by saueran at oregonstate dot edu</A>
+ posted on Mon, 11 Feb 2019 17:32:32 -0800 (PST)
+ based on creating definitions with mismatched parentheses.
+ <!-- Devious Construct:
+ ${
+ wleftp $a wff ( ( ph ) $.
+ wbothp $a wff ( ph ) $.
+ df-leftp $a |- ( ( ( ph ) <-> -. ph ) $.
+ df-bothp $a |- ( ( ph ) <-> ph ) $.
+ anything $p |- ph $=
+ ( wbothp wn wi wleftp df-leftp biimpi df-bothp mpbir mpbi simplim
+ ax-mp) ABZAMACZDZCZMOEZOCQAEZNDZRNAFGSHIOFJMNKLAHJ $.
+ $}
+ -->
+
<li><b>Proposing specific changes.</b>
Please propose specific changes as pull requests (PRs) against the
"develop" branch of set.mm, at:

Mario Carneiro

Sep 26, 2019, 4:56:40 PM
to metamath
Benoit's description is a good explanation of how mmj2's definition checker works today. The main worry I have about a pure reimplementation is the automated theorem prover part.

If you aren't worried about dependence on set.mm metatheory, or on actually producing proofs, the easiest algorithm to implement is to keep track of all binding operators, and ensure that all dummies are not free in the true sense (while also calculating the binding structure of the new definition).
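That binding-operator bookkeeping can be sketched in Python as follows. This is a hypothetical illustration, not mmj2's code: the `BINDERS` table contents and the `(symbol, [children])` tree encoding are invented, though wal and wsb are real set.mm syntax labels.

```python
# Hedged sketch: compute "free in the true sense" from a table of
# binding operators.  Each constructor records which argument
# positions are binder slots and which sibling arguments they bind.
BINDERS = {
    # constructor: {binder argument position: [positions it binds in]}
    "wal": {0: [1]},   # "A. x ph": arg 0 (x) is bound in arg 1 (ph)
    "wsb": {1: [2]},   # "[ y / x ] ph": arg 1 (x) is bound in arg 2 (ph)
}

def free_vars(tree):
    """Free variables of a (symbol, [children]) parse tree."""
    sym, kids = tree
    if not kids:
        return {sym}               # a variable is free in itself
    binder_slots = set(BINDERS.get(sym, {}))
    bound_in = {}                  # child index -> binder variables in scope
    for pos, scopes in BINDERS.get(sym, {}).items():
        for s in scopes:
            bound_in.setdefault(s, set()).add(kids[pos][0])
    fv = set()
    for i, k in enumerate(kids):
        if i in binder_slots:
            continue               # the binder slot itself contributes nothing
        fv |= free_vars(k) - bound_in.get(i, set())
    return fv

def dummies_not_free(definiens, dummy_vars):
    """Rule-5-style check: no dummy may occur free in the definiens."""
    return not (set(dummy_vars) & free_vars(definiens))
```

For df-bust-verifier's definiens `x = y`, neither dummy is bound by any operator, so `dummies_not_free` rejects it; under a `wal` binder the same dummy would pass.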


David A. Wheeler

Sep 26, 2019, 6:01:05 PM
to metamath, metamath
On Thu, 26 Sep 2019 16:56:26 -0400, Mario Carneiro <di....@gmail.com> wrote:
> Benoit's description is a good explanation of how mmj2's definition checker
> works today.

I used Benoit's description as a starting point. I tweaked the last definition rule text
to be (hopefully) clearer & to map more cleanly to the error messages
if a user sees them.

> The main worry I have about a pure reimplementation is the
> automated theorem prover part.

I added a hypertext link from the conventions text to the JavaScript code.
That way, someone who wants to know more will be able to quickly
see more information. That's not the best way to define things,
but I think the current problem is that it's not obvious that this situation
even exists or how to get *any* information about it.
If someone wants to rewrite that to be clearer/better later, I'd be delighted.

> If you aren't worried about dependence on set.mm metatheory, or on actually
> producing proofs, the easiest algorithm to implement is to keep track of
> all binding operators, and ensure that all dummies are not free in the true
> sense (while also calculating the binding structure of the new definition).

Is that algorithm adequate?

The code seems to all depend on proveBoundVar (copied below), but
I've never written an mmj2 JavaScript plugin so I wasn't sure how
to interpret that code (and it'd take me some time to drill in).

--- David A. Wheeler



=========================

// Returns whether variable v is bound at every occurrence in root;
// boundVars caches, per syntax constructor, which setvar arguments
// are bound in which other arguments.
function proveBoundVar(w, boundVars, v, dummy, root, fast)
{
    var bound = new boolArr(root.child.length);
    var bound2 = new boolArr(bound.length);
    var allBound = true;
    for (var i = 0; i < root.child.length; i++)
        allBound &= bound[i] = bound2[i] = proveBoundVar(w, boundVars,
            v, dummy, root.child[i], fast);
    if (allBound)
        return v.stmt != root.stmt;

    var val = boundVars.get(root.stmt);
    if (val == null) {
        if (!fast)
            return isBound(w, v, dummy, root);

        var proto = root.stmt.getExprParseTree().getRoot();
        val = new boolArrArr(proto.child.length);
        for (var i = 0; i < val.length; i++) {
            if (!proto.child[i].stmt.getTyp().getId().equals("setvar"))
                continue;

            val[i] = new boolArr(val.length);
            var assignments = new HashMap();
            for (var j = 0; j < val.length; j++)
                if (proto.child[j].stmt instanceof VarHyp) {
                    assignments.clear();
                    assignments.put(
                        proto.child[j].stmt,
                        boxToType(proto.child[i],
                            boxToType(proto.child[i], null, "class"),
                            proto.child[j].stmt.getTyp().getId()));
                    val[i][j] = proveBoundVar(w, boundVars, proto.child[i],
                        dummy, reassignVariables(assignments, proto), false);
                }
        }
        boundVars.put(root.stmt, val);
    }

    for (var i = 0; i < val.length; i++)
        if (val[i] != null && !bound[i])
            for (var j = 0; j < val.length; j++)
                bound2[j] |= val[i][j];

    for (var i = 0; i < val.length; i++)
        if (!bound2[i])
            return !fast && isBound(w, v, dummy, root);
    return true;
}

Mario Carneiro

Sep 26, 2019, 7:08:35 PM
to metamath
On Thu, Sep 26, 2019 at 6:01 PM David A. Wheeler <dwhe...@dwheeler.com> wrote:
On Thu, 26 Sep 2019 16:56:26 -0400, Mario Carneiro <di....@gmail.com> wrote:
> The main worry I have about a pure reimplementation is the
> automated theorem prover part.

I added a hypertext link from the conventions text to the JavaScript code.
That way, someone who wants to know more will be able to quickly
see more information. That's not the best way to define things,
but I think the current problem is that it's not obvious that this situation
even exists or how to get *any* information about it.

The definition of correctness here is not hard: you just assert that a particular theorem is true, or provable. The hard part is having an algorithm that is sufficiently good to actually be able to find such proofs sufficiently often that it isn't spuriously rejecting definitions all the time.
 
> If you aren't worried about dependence on set.mm metatheory, or on actually
> producing proofs, the easiest algorithm to implement is to keep track of
> all binding operators, and ensure that all dummies are not free in the true
> sense (while also calculating the binding structure of the new definition).

Is that algorithm adequate?

It is, and it's less flaky than one that is dependent on there being enough theorems in the database to be able to prove the relevant F/ theorems. There is a metatheorem that asserts that whenever a variable is properly not free in a term then the appropriate F/ theorem is provable, and from there you can prove that P(x) <-> P(y) holds where x is the variable that is not free in P. The metatheorem corresponds to a particular proof, that can be constructed by recursion on the term P, but we aren't putting that proof anywhere right now and for conservativity it only matters that such a proof exists, so we may as well be satisfied with the metatheorem and not generate the proof at all.

The code seems to all depend on proveBoundVar (copied below), but
I've never written an mmj2 JavaScript plugin so I wasn't sure how
to interpret that code (and it'd take me some time to drill in).

Oh, I guess I forgot; boundVars in that algorithm is actually the information about whether each set variable is bound in a syntax constructor. So actually mmj2 isn't proving F/ theorems all the time; it only does so when it can't derive the bound variable status of a definition, usually because it's an axiomatic term like "A. x ph". In these cases it falls back on the "fast = false" case where it actually attempts to search the database for a proof of "A. x ph -> A. x A. x ph", and if it finds a proof then it marks x as not free in ph. If it was not able to find a proof then it says that x is (potentially) free in ph in "A. x ph", which would then cause later definitions using a dummy under a forall binder to fail.

The "theorem proving" in this case is actually trivial; it's looking for an exact match with a given theorem statement. There is no index saying "here is the proof of |- ( E. x ph -> A. x E. x ph )" so I'm using the mmj2 one-step theorem prover to do the matching, but this is the sort of thing where a well positioned $j annotation can just say "you want the F/ theorem for df-ex? Try nfex" and skip the library-wide search.

Mario

ookami

Sep 29, 2019, 3:24:27 AM
to Metamath


On Thursday, September 26, 2019 at 20:17:23 UTC+2, Mario Carneiro wrote:

I believe this is false. While I don't think you can derive false, you can at least derive that the universe has one element and hence distinctors don't work:

...

Mario

You are right, of course.  I really thought, that evening, that a definition provides a means of replacing parts literally.  I should not post after work; I need to be in better shape for this.

@all Thanks for your answers

Wolf

 