Roughly a fortnight ago, I wrote a C++ procedure to convert Fitch-style natural deduction proofs of propositional theorems to Hilbert-style proofs.
I released it today as part of pmGenerator 1.2.2.
Since Metamath concerns Hilbert systems, I'd like to share a few insights I learned along the way, in addition to presenting the new feature. In particular, I noticed that Mario Carneiro gave a talk on "Natural Deduction in the Metamath Proof Language", but it doesn't seem to have gained much traction; at least I do not see many theorems in Metamath that would be crucial to apply proof design based on natural deduction.
In my opinion, the option to design such proofs gives a great advantage since they are very easy for humans to find (at least at the propositional level), whereas Hilbert-style proofs are rather complicated (even at the propositional level).
For demonstration, here's a proof constructor to try out natural deduction proof design: https://mrieppel.github.io/FitchFX/
My converter uses the same syntax (with "Deduction Rules for TFL" only). As examples, there are proofs of the seven minimal single axioms of classical propositional logic with operators {→,¬}: m_ffx.txt, w1_ffx.txt, …, w6_ffx.txt. These files can also be imported via copy/paste into the FitchFX tool under the "Export" tab.
[Note: In the following, exponents ⁿ (or ^n) mean n-fold concatenation of sequences, and D stands for (2-ary) condensed detachment in prefix notation, i.e. most general application of modus ponens, taking a proof of the conditional as first and a proof of the antecedent as second argument.]
The core of the translation algorithm can be found at nd/NdConverter.cpp#L815-L947 (definition and call of recursive lambda function translateNdProof).
Usage

By default, my converter (pmGenerator --ndconvert) uses aliases (as mentioned in the header file), i.e. the same aliases used by pmproofs.txt, rather than treating connectives other than {→,¬} as real symbols and operators; this is unlike Metamath. There is the option -h to use a heterogeneous language (i.e. with extra axioms to define additional operators), but then the user must also provide rule-enabling theorems in order to enable the corresponding rules for translation.
My procedure can translate into all kinds of propositional Hilbert systems, as long as the user provides proofs of (ax-1) and (ax-2) together with sufficient information for the rules that are used. When using the {→,¬}-pure language, providing a proof of (ax-3) in addition to (ax-1) and (ax-2) is already sufficient to enable all rules.
For example, m.txt (which is data/m.txt in the release files) can be used via
pmGenerator --ndconvert input.txt -n -b data/m.txt -o result.txt

to generate a proof based on (meredith) as the sole axiom for whichever theorem has a FitchFX proof in input.txt. All rules are supported since m.txt contains proofs of (ax-1), (ax-2), and (ax-3). Since it also contains a proof of p→p that is shorter than one built via DD211, resulting proofs use the corresponding shortcut.
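For illustration (my own unpacking of the D-notation from the note above, writing 1 and 2 for proofs of (ax-1) p→(q→p) and (ax-2) (p→(q→r))→((p→q)→(p→r))): DD211 parses as D(D21)1. First, D21 applies 2 to 1, unifying the antecedent p→(q→r) with p→(q→p) and yielding (p→q)→(p→p). Applying that result to 1 once more, with q specialized to q→p, yields p→p.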
Results can then be transformed via
pmGenerator --transform result.txt -f -n […options…] -o transformedResult.txt

and optionally be compressed with -z or -x to potentially find fundamentally shorter proofs. When exploring new systems, the hardest part can be finding the first proofs of sufficient theorems (or figuring out that they do not exist).
A few notes on how the translation works:
- Most rules can be enabled by using a corresponding theorem. For example, p→(q→(p∧q)) (pm3.2) can be used — in combination with two modus ponens applications — to apply conjunction introduction, i.e. ∧I: Γ∪{p,q}⊢(p∧q); a worked example follows below this list. There may be multiple rule-enabling theorems; for example, p→(q→(q∧p)) (pm3.21) accomplishes the same thing with the order of arguments swapped. I provided a table of rule-enabling theorems at nd/NdConverter.h, many of which have no set.mm counterparts: of the six ∨E-enabling theorems, only (jao) is present, and I did not find any of (p→⊥)→¬p, ¬p→(p→⊥), p→(¬p→⊥), (¬p→⊥)→p, i.e. no enabling theorems for ¬I, ¬E, or IP.
- However, in natural deduction proofs there are blocks at certain depths, each starting with an assumption.
For example, ∧I: Γ∪{p,q}⊢(p∧q) at depth 3 is actually Γ∪{a→(b→(c→p)), a→(b→(c→q))}⊢a→(b→(c→(p∧q))). Fortunately, such variants can easily be constructed from the zero-depth rule-enabling theorems:
For symbols 1 := (ax-1) and 2 := (ax-2), the proof σ_mpd(d), defined by σ_mpd(0) := D and σ_mpd(n+1) := (σ_mpd(n))²(D1)ⁿ2, can be used to apply modus ponens at depth d (a small code sketch of this recursion follows after this list). For example, σ_mpd(0) is (ax-mp), σ_mpd(1) is (mpd), and σ_mpd(2) is (mpdd). But the σ_mpd(d) for d ≥ 3 still appear to be missing from Metamath.
Every theorem can be shifted one level deeper by adding an antecedent, which is done by preceding its proof with D1, i.e. by a single application of (a1i).
In combination with σ_mpd, rule-enabling theorems can thereby be applied at any depth. I gave detailed constructions of all supported rules at nd/NdConverter.cpp#L538-L769.
- We cannot simply use a rule-enabling theorem to translate conditional introduction, i.e. →I: from Γ∪{p}⊢q infer Γ⊢(p→q), since →I itself handles the elimination of blocks and depth, which is necessary because Hilbert-style proofs operate in a single global scope. The other rules just call it in order to eliminate a block and then operate on the resulting conditional.
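To illustrate the first point at depth zero (my own unpacking, not taken from the converter): let c be any proof of p→(q→(p∧q)) (pm3.2), X a proof of φ, and Y a proof of ψ. Then DcX is a proof of q→(φ∧q) (p gets specialized to φ), and D(DcX)Y, i.e. DDcXY, is a proof of φ∧ψ (q gets specialized to ψ). The two D's are exactly the two modus ponens applications mentioned above; the deeper variants additionally use σ_mpd and D1 as described in the second point.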
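To make the σ_mpd recursion concrete, here is a small standalone C++ sketch (my own illustration, not code from pmGenerator; sigma_mpd stands for σ_mpd) that builds these D-proof strings:

#include <iostream>
#include <string>

// Builds the D-proof string sigma_mpd(d) from the recurrence given above:
//   sigma_mpd(0)   = D
//   sigma_mpd(n+1) = (sigma_mpd(n))^2 (D1)^n 2
// where 1 and 2 abbreviate proofs of (ax-1) and (ax-2), respectively.
std::string sigma_mpd(unsigned d) {
    std::string s = "D";
    for (unsigned n = 0; n < d; ++n) {
        std::string next = s + s;      // (sigma_mpd(n))^2
        for (unsigned i = 0; i < n; ++i)
            next += "D1";              // (D1)^n
        next += "2";
        s = next;
    }
    return s;
}

int main() {
    for (unsigned d = 0; d <= 3; ++d)
        std::cout << "sigma_mpd(" << d << ") = " << sigma_mpd(d) << "\n";
    return 0;
}

Running it prints D, DD2, DD2DD2D12 and DD2DD2D12DD2DD2D12D1D12, i.e. the proofs corresponding to (ax-mp), (mpd), (mpdd) and the first of the deeper variants that appear to be missing.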
Question

Regarding Metamath's databases, shouldn't there be more elements to support proof design based on natural deduction? One cannot prepare all the elements, of course, since translations of Fitch-style proofs of unlimited depth require infinitely many derivations — at least one per depth per rule-enabling theorem.

The lack of more intuitive tools for proof design in set.mm is something that bothered me when I discovered Metamath years ago, but I couldn't quite make use of it. I hope we can shed some light on the matter.

— Samiro Discher
On 08/06/2025 15:52, Mario Carneiro wrote:
On Sat, Jun 7, 2025 at 8:20 PM samiro.d...@rwth-aachen.de <samiro....@rwth-aachen.de> wrote:
In particular, I noticed that Mario Carneiro gave a talk on "Natural Deduction in the Metamath Proof Language", but it doesn't seem to have gained much traction; at least I do not see many theorems in Metamath that would be crucial to apply proof design based on natural deduction.
To the contrary, I would say that the idea behind that talk was quite influential on set.mm; writing large theorems in "deduction style" basically became the default, replacing the more limited usage of inference style + weak deduction theorem as well as "theorem style" closed inferences which do a lot of context manipulation. Personally I consider writing deduction proofs in metamath / set.mm to be basically a solved problem.
This was also my first reaction.
As far as I'm concerned, virtually all of my recent additions are deduction style.
Here is a discussion which basically calls for even more deduction from theorems: https://github.com/metamath/set.mm/issues/3990, with some strong support.
A quick look at changes-set.mm will show that recently we've been moving a lot of deduction-style proofs to the main part of set.mm (those with labels ending in `d`).
BR,
As I explain in the talk, you don't actually need depth > 1 since you can use conjunction instead. That's exactly the reason why this whole setup works pragmatically and doesn't require an infinite list of axioms. Instead of stacking things up as
ph -> a -> b -> c -> p,
ph -> a -> b -> c -> q
|- ph -> a -> b -> c -> (p /\ q)

if you use
ph /\ a /\ b /\ c -> p,
ph /\ a /\ b /\ c -> q
|- ph /\ a /\ b /\ c -> (p /\ q)

then this is just a substitution instance of jca again. You use imp/ex in order to push and pop assumptions to the context but otherwise you never have to worry about larger depths.
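Concretely, imp turns |- ( ph -> ( ps -> ch ) ) into |- ( ( ph /\ ps ) -> ch ) and ex goes back the other way, so the whole assumption stack only ever lives inside a single conjunction in the antecedent.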