I am just now writing up slides for a tutorial on reals in HoTT and I
had to explain in what sense judgmental equality is "not logic".
The equality rules are thus not about some prior notion of equality.
Whether this is a satisfactory "foundational system" depends on what
one is looking for. There is certainly ample evidence that a lot of
mathematics (including logic) can be derived within such a system.
I guess it depends on what you mean by “self-contained”.

When studying HoTT as a logical system (an “object theory”), judgemental equality is a proposition in the *meta-theory*, and it is not invariant under the propositional equality of the object theory. But that’s fine, and it doesn’t prevent the meta-theory itself from also being HoTT.
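To make that contrast concrete, here is a minimal sketch in Lean 4 syntax (Lean is used here only as a convenient stand-in for an intensional type theory, not as a claim about any particular HoTT system):

    -- Judgmental (definitional) equality is checked at the meta-level:
    -- the kernel computes 2 + 2 to 4, so the bare `rfl` is accepted.
    example : 2 + 2 = 4 := rfl

    -- For a variable n, `0 + n = n` is only a propositional equality:
    -- `rfl` is rejected, and we must supply a proof term, here a library
    -- lemma that is itself proved by induction.
    example (n : Nat) : 0 + n = n := Nat.zero_add n

The judgment that 2 + 2 and 4 are convertible is never stated as a theorem inside the theory; it is a side condition the checker verifies, which is the sense in which it lives in the meta-theory.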
It seems parallel to the observation that in the proof theory of ZFC, syntactic equality is a non-extensional notion — the terms “{0,1}” and “{1,0}” are syntactically different, although by extensionality, they must denote the same set, i.e. must be provably equal in the object theory. But this doesn’t cause problems with using ZFC as its own meta-theory, or make ZFC less self-contained as a foundation.
It’s also worth noting that when using HoTT as its own meta-theory, syntax provably forms a *set* (at least under the more straightforward constructions of syntax), so syntactic equality is very well-behaved.
So for these reasons, I wouldn’t see this as making HoTT less self-contained as a foundation.

On the other hand, subtleties of judgemental equality do become an issue when using HoTT to study its own *model* theory — see e.g. Mike’s blog post http://homotopytypetheory.org/2014/03/03/hott-should-eat-itself/ for a lot of details on this. Essentially, it’s unproblematic (working in HoTT as a meta-theory) to redo the standard theory of what models of type theory are and how they behave; but it seems difficult to actually *construct* such models from universes in the way one might expect to, and the non-truncatedness of the universe is the major obstacle there.
Hello,

1. First of all, HoTT is not a foundational system. The foundations of mathematics that HoTT is related to are called the Univalent Foundations.

HoTT is the study of a meta-theory of the Univalent Foundations. The analog of "Homotopy Type Theory" in the current set-theoretic foundations is "Set Theory" as a theory of predicate logic - the study of axioms, models, etc.

It is unfortunate that the set-theoretic foundations of mathematics do not have a name that clearly distinguishes them from the study of formal set theory in predicate logic. In our foundations there is such a distinction - HoTT is about a particular approach to the meta-theory, the one based on Type Theory.

The Univalent Foundations are the foundations based on the idea of h-levels, on the stratification of mathematical objects into propositions, sets and higher types that correspond to higher groupoids. It is possible that in the future an entirely different approach to the meta-theory of the Univalent Foundations will be found, just as the formalization of sets as types of h-level 2 in MLTT provides an approach to the meta-theory of the set-theoretic foundations that is entirely different from the one provided by "set theory".
2. There is a more narrow reading of HoTT in which HoTT is the name of particular type system. This is probably what you had in mind in your questions. This reading, in particular for the reason you mention, does not make much sense.
The more we study HoTT (as a subject), the more we understand that there can be no single type system that is the best one. On the more abstract level this is due to Goedel's results. On a more concrete level it is due to the fact that there can be a system A of strict equalities and a system B of strict equalities, both A and B consistent with the Univalence Axiom, but the system A + B inconsistent.

It is similar to the phenomenon, first observed probably by Carlos Simpson, that when trying to replace an n-category by an equivalent strict n-category it may be possible to make associativity strict but then the units must be weak, or to make the units strict but then the associativity must be weak.

Whether a particular system A of strict equalities can be combined with another system B can be a very difficult question. For example, it is currently a conjecture that the system of strict equalities validated by the cubical set model - which includes, for example, that the application of functions to paths is strictly associative - can be extended by adding the strict reduction rule for the J eliminator that exists in the original MLTT. But this conjecture remains open despite a lot of effort to prove it.
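For concreteness, the strict reduction rule for J mentioned above is the judgmental equation J(C, c, refl) = c; here is a small sketch in Lean 4 syntax (whose kernel happens to validate this rule) of what it says:

    -- Martin-Löf's J eliminator, defined from Lean's primitive Eq.rec.
    def J {A : Type} {a : A} (C : (x : A) → a = x → Type)
        (c : C a rfl) {x : A} (p : a = x) : C x p :=
      Eq.rec (motive := C) c p

    -- The computation rule J C c refl = c holds *judgmentally* (an iota rule
    -- the kernel applies silently), so the bare `rfl` below is accepted.
    example {A : Type} {a : A} (C : (x : A) → a = x → Type) (c : C a rfl) :
        J C c rfl = c := rfl

The open question is whether this judgmental rule can be added on top of the other strict equalities validated by the cubical set model.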
3. The way I see things is that we will have many type systems that can be used to do mathematics in the univalent style. We will create an infrastructure that will help us to describe such systems formally and to construct models and interpretations of these systems in each other or in ZFC, formally.
The system of mathematical objects and constructions created by us and certified for correctness with this infrastructure will then be "the HoTT".
I think there are several different questions here that are being confused.
Question 1: Is HoTT (or, more specifically, any particular type theory
used for HoTT) a foundational system?
Answer 1: Yes: it suffices to represent practically all known
mathematics, and plenty of as-yet-unknown mathematics. It is indeed
the case that any Turing-complete programming language could
equivalently be called a "foundation of programming". Computer
scientists don't generally speak that way, but by analogy they could.
Question 2: Is there a foundational system in which absolutely
everything is homotopy invariant?
Answer 2: This is an interesting question, and I don't think the
answer is known. You're quite right that the presence of strict
equality in all known type theories for HoTT means that they are not
such.
It's reasonable to hope for such a thing,
but I think it's
fairly likely that the answer is "yes, but it's too cumbersome to
actually use for anything." But even if the answer is "no", it
doesn't detract at all from the fact that current type theories for
HoTT can serve as foundations for mathematics, and are *extremely*
homotopy invariant even if not *completely* so.
Question 3: Does the presence of judgmental equality in type theories
for HoTT make them any less "self-contained"?
Answer 3: No. A type theory is a set of rules and axioms formulated
in a deductive system, and judgmental equality, being a judgment, is
part of that deductive system in the same way that a typing judgment
a:A is. It exists at a different level *within* the system from
propositional equality, but it's still part of the type theory.
Analogously, set theory is generally formulated with axioms inside the
deductive system of first-order logic, in which the primary judgment
is "P is true/provable", existing at a different level from the
proposition "P" itself. When we construct a model of such a type
theory, we have to interpret judgmental equality and propositional
equality by different things, and usually we interpret them by a
"strict" equality and by "homotopies", but we are interpreting the
whole system of which both are a part.
On Sun, Sep 14, 2014 at 10:53 PM, Matt Oliveri <atm...@gmail.com> wrote:
> But I still would've hoped the
> conditions for a foundation of math would be stronger than just the ability
> to develop enough math with it.
Do you have any suggestions for what other conditions one might
impose? I've never heard of any. Of course one foundation of math
can be a *better* one than another, just like one programming language
can be better than another, in the sense of being easier to use, more
powerful, more convenient for many purposes, etc., but that's a
different question.
>> It's reasonable to hope for such a thing,
>
> Is it though? I didn't ask that, but it's worth asking.
I think that from a higher-category-theorist's perspective, without
knowing much about foundations, it's reasonable to hope for.
> I give up, how would you even begin to try defining a system where literally
> everything is invariant?
I don't really know; that's why I said it was an interesting problem.
One might start by taking an existing type theory for HoTT and making
absolutely all the judgmental equalities into propositional ones. But
would that be a workable system? I don't know.
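To give a flavour of what that would mean, here is a hypothetical fragment in Lean 4 syntax: a recursor for the naturals postulated together with a *propositional* computation rule in place of the usual judgmental one (the names myNatRec and myNatRec_zero are made up for illustration):

    -- A recursor whose computation rules are only propositional.
    axiom myNatRec : (C : Nat → Type) → C 0 →
        ((n : Nat) → C n → C (n + 1)) → (n : Nat) → C n

    -- The zero case no longer computes silently; the equation has to be
    -- applied (and transported along) explicitly wherever it is needed.
    axiom myNatRec_zero : ∀ (C : Nat → Type) (z : C 0)
        (s : (n : Nat) → C n → C (n + 1)), myNatRec C z s 0 = z

Every use of the recursor would then drag such equations around explicitly, which is one reason to fear the "too cumbersome to actually use" outcome.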
> Just to check: being "*extremely* homotopy invariant" is no reason to be a
> foundation.
Homotopy-invariance is an unrelated question to being a foundation or
not. ZFC is a foundation, but it is decidedly not homotopy-invariant.
> It sounds like you're pointing out that judgmental equality is still part of
> the language even though it can't be reflected as a type, and therefore the
> language is self-contained. But I hope not, because that seems tautological.
Yes, that's what I was saying. It is tautological. What do *you*
mean by "self-contained"?
It's important to separate semantics from foundations. When using
HoTT as a foundation, we don't need to think about "interpreting" it
anywhere; we just take it as it is.
On Mon, Sep 15, 2014 at 2:53 AM, Matt Oliveri <atm...@gmail.com> wrote:
> Can you explain what you mean by that? I understand how the role of
> judgmental equality is to allow certain constructions, but it still seems to
> do that by providing an appropriate notion of equality.
If we understand substitution as a form of construction (which we
certainly should in type theory) then judgmental equality gives us
preconditions for certain substitution-constructions. And since
substitution itself requires no notion of equality (except being able
to recognize variables and to handle alpha-equivalence), we may
explain judgmental equality in terms of its role in
substitution-construction.
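A concrete instance of that role, in Lean 4 syntax (the Vec family is written out here just for illustration; it is not meant as any particular library's definition):

    -- Length-indexed vectors.
    inductive Vec (A : Type) : Nat → Type where
      | nil  : Vec A 0
      | cons : {n : Nat} → A → Vec A n → Vec A (n + 1)

    -- Because 2 + 2 and 4 are judgmentally equal, the checker silently
    -- converts Vec Nat (2 + 2) into Vec Nat 4: the identity function below
    -- type-checks with no appeal to propositional equality or transport.
    def convert (v : Vec Nat (2 + 2)) : Vec Nat 4 := v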
> It's part of the definition of the system whether it's considered "prior" or not.
Whether a particular concept in
a foundational system is "prior to the system" is precisely the kind
of philosophy that may lurk around the foundational system, but should
never be allowed to enter it -- at least not under the current
conception of what a formal system may be.
That's one of the things I want to understand. Why does the ability to derive a lot of math make it a foundation? I can write a lot of programs in Java (or Haskell, or OCaml...), but it's no foundation of software.
Interesting. I'm actually asking about the idea that "everything" in a foundational system should be homotopy invariant. In HoTT (no specific system, as long as it's something like MLTT) at least judgmental equality is not homotopy invariant.
> If we understand substitution as a form of construction (which we
> certainly should in type theory) then judgmental equality gives us
> preconditions for certain substitution-constructions. And since
> substitution itself requires no notion of equality (except being able
> to recognize variables and to handle alpha-equivalence), we may
> explain judgmental equality in terms of its role in
> substitution-construction.
With math, I suppose you get to pick which concepts are the low-level ones, but not with programming.
On Sep 15, 2014, at 7:56 AM, Matt Oliveri <atm...@gmail.com> wrote:
> It seems to me that a foundation should have as little conceptual and technical baggage as possible, while still being able to encode all math. The encoding can be complex and cumbersome, as long as it can be done once and for all. Notice that ZF has HoTT beat hands down on conceptual and technical simplicity.
In this sense all maths can be encoded in sequences of 0's and 1's, and this "foundation" beats any other in conceptual and technical simplicity.
Here is what I wrote for a general audience lecture on foundations at the IAS. I said that any usable foundation of mathematics should have at least three components:
> The first component is a formal deduction system: a language and rules of manipulating sentences in this language that are purely formal, such that a record of such manipulations can be verified by a computer program.
>
> The second component is a structure that provides a meaning to the sentences of this language in terms of mental objects intuitively comprehensible to humans.
>
> The third component is a structure that enables humans to encode mathematical ideas in terms of the objects directly associated with the language.
A foundation of maths is something like an accounting system that can be used to ensure that things are recorded properly and that if there is a problem it can be localized and dealt with.
Martin-Loef Type Theory.
How would you teach an undergrad HoTT? It seems harder than teaching set theory.
On Sep 15, 2014, at 10:52 AM, Matt Oliveri <atm...@gmail.com> wrote:
> How would you teach an undergrad HoTT? It seems harder than teaching set theory.

HoTT is the study of the type-theoretic approach to the meta-theory of the Univalent Foundations. Why would you want to teach it to an undergrad?
Here is what I wrote for a general audience lecture on foundations at the IAS. I said that any usable foundation of mathematics should have at least three components:
> The first component is a formal deduction system: a language and rules of manipulating sentences in this language that are purely formal, such that a record of such manipulations can be verified by a computer program.
>
> The second component is a structure that provides a meaning to the sentences of this language in terms of mental objects intuitively comprehensible to humans.
>
> The third component is a structure that enables humans to encode mathematical ideas in terms of the objects directly associated with the language.
The first two components I agree are definitely necessary for a foundation of math. It's because of the second component that conceptual complexity should be low. Much better to intuitively explain a simple system and then formalize complex ideas, than to have to intuitively explain a complex system from scratch.
It's the third component that I'm not convinced should be built into a foundation. That can also be done within the foundational system.
In the ZFC-based foundations, natural numbers have a complex definition. In particular, natural numbers are not among the objects that are directly associated with the language. But I would say that a foundation of maths should provide an explanation of what natural numbers are and how to work with them.
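For comparison: in ZFC the naturals are usually built as the von Neumann ordinals (0 = ∅, n+1 = n ∪ {n}, with N carved out of an infinite set supplied by the axiom of infinity), several encoding steps away from the raw language, whereas in a type-theoretic foundation they can be declared directly; a sketch in Lean 4 syntax:

    -- The naturals as a primitive inductive type: the formation, introduction,
    -- elimination and computation rules for this declaration are part of the
    -- deduction system itself, so the numbers are directly associated with
    -- the language rather than encoded.
    inductive MyNat : Type where
      | zero : MyNat
      | succ : MyNat → MyNat

    -- Working with them is immediate: definition by structural recursion.
    def double : MyNat → MyNat
      | .zero   => .zero
      | .succ n => .succ (.succ (double n))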
On Sep 15, 2014, at 9:38 AM, Matt Oliveri <atm...@gmail.com> wrote:
> With math, I suppose you get to pick which concepts are the low-level ones, but not with programming.

Not with programming so far, while we only have one type of hardware... But in maths there are indeed choices, and "foundation" has become a misleading name, because maths can exist quite well without any foundation and there can be several foundations at a given time.
> A foundation of maths is something like an accounting system that can be used to ensure that things are recorded properly and that if there is a problem it can be localized and dealt with.

That's why it is important that a foundation be convenient in practical use.
Or a foundation can be any sufficiently strong formal system used to formally justify other formal systems and thus everything they justify, at any level of indirection. This is what I mean.
For notion two, with no emphasis on convenience, all emphasis is on simplicity, and on efficiency of computer implementations.
From: Vladimir Voevodsky <vlad...@ias.edu>
Date: Mon, 15 Sep 2014 09:11:56 +0100
To: Matt Oliveri <atm...@gmail.com>
Cc: "Prof. Vladimir Voevodsky" <vlad...@ias.edu>, "HomotopyT...@googlegroups.com" <HomotopyT...@googlegroups.com>
Subject: Re: [HoTT] Possibly silly question about HoTT as a foundation
On Sep 15, 2014, at 1:53 AM, Matt Oliveri <atm...@gmail.com> wrote:
> That's one of the things I want to understand. Why does the ability to derive a lot of math make it a foundation? I can write a lot of programs in Java (or Haskell, or OCaml...), but it's no foundation of software.

What is a foundation of software?
V.
On Monday, September 15, 2014 6:01:34 AM UTC-4, Peter LeFanu Lumsdaine wrote:
> On Mon, Sep 15, 2014 at 11:14 AM, Vladimir Voevodsky <vlad...@ias.edu> wrote:
> > A foundation of maths is something like an accounting system that can be used to ensure that things are recorded properly and that if there is a problem it can be localized and dealt with.
>
> For this reason, I'd add another desideratum for a foundation, though I realise some might disagree: correctness of proofs should be decidable (and ideally, efficiently so).
How about semi-decidable? That is, valid proofs are accepted, but invalid proofs might make the checker loop.
Is Andromeda checking decidable or semi-decidable? I figured semi-decidable, unless it can somehow avoid loops in the equality-reflected reduction rules.
On Mon, Sep 15, 2014 at 12:16 PM, Matt Oliveri <atm...@gmail.com> wrote:
> How about semi-decidable? That is, valid proofs are accepted, but invalid proofs might make the checker loop.

One can certainly argue for semi-decidability being enough, but I don't find it quite satisfactory. Or, at least, I find it considerably *less* satisfactory than decidability.
At an extreme, consider [...]
Practically, as Vladimir says, one thing one wants for purported proofs is that “if there is a problem it can be localised and dealt with”. And this certainly seems to fit better with decidability than semi-decidability. If proof-checking is only semi-decidable, then you may not ever be able to know that a problem exists — just that you haven’t found a way to certify this proof *yet*.

I want my proof assistant — or the referees of my papers — to be able to either say “yes, it’s sound”, or “no, here’s a gap”. It’s frustrating enough in practice when Coq or a referee seems to be taking forever. It would be quite disheartening if, even in principle, that were all we could expect!
> Or a foundation can be any sufficiently strong formal system used to formally justify other formal systems and thus everything they justify, at any level of indirection. This is what I mean.

I see now. This is a useful distinction. What you call a foundation I call "an etalon of consistency". For example ZFC is used as an etalon of consistency relative to which the consistency of the MLTT is shown. It is also used as the etalon of consistency for the type systems of HoTT through the use of univalent models.
In this sense neither the original MLTT nor its strengthened versions that are used in the Univalent Foundations are foundational systems.
> For notion two, with no emphasis on convenience, all emphasis is on simplicity, and on efficiency of computer implementations.

I am not sure why you would care about efficiency of an implementation. You are not going to actually use it for anything, because the translation from the system that you actually use to an etalon of consistency may involve non-linear expansion in size.
On 15/09/14 21:34, Matt Oliveri wrote:
> As Martin and Maxime have excellently pointed out, decidability is no
> guarantee of practicality, or anywhere close.
This is not what I wanted to point out.
In practice, one would like "most naturally occurring proofs" to be
verified as proofs fast, and any accidental (or not so accidental)
nonsense to be rejected, again fast.
But I think that the primary things are the derivations.
On Mon, Sep 15, 2014 at 4:47 AM, Vladimir Voevodsky <vlad...@ias.edu> wrote:
> In this sense neither the original MLTT nor its strengthened versions that
> are used in the Univalent Foundations are foundational systems.
Why not? We can and do use them to prove the relative consistency of
other theories. For instance, Peter Aczel proved the relative
consistency of CZF by constructing a model in MLTT (right?), and in
chapter 10 of the book we prove the relative consistency of ZF by
constructing a model in HoTT.
Ok, so when you say that HoTT "is not" an etalon, you mean it is not a
good choice of etalon *for you*, because you find set-theoretic
concepts more believable than homotopy-theoretic ones.
But in the
formal sense, it can certainly be an etalon of consistency,
and for those of us that you consider weird, it can be an etalon of
"believability" or "meaning"
(I don't think it makes much sense to
talk about mathematical ideas being "true" in any absolute sense).
On Mon, Sep 15, 2014 at 4:10 PM, Matt Oliveri <atm...@gmail.com> wrote:
> Yes. And I can say more. It seems to me that HoTT actually involves all the
> concepts of set theory and many more.
No; it doesn't involve the slightly-bizarre concept of global
membership that ZF-style set theory is built on. It does involve most
or all of the concepts of "structural" set theory like ETCS, and both
of them seem to me preferable to ZF.
> I don't think this idea of etalon can be formalized. Choosing an etalon is a
> social process. So I guess yes, vacuously?
I thought Vladimir was saying that an "etalon of consistency" is just
something relative to which you can prove other theories consistent.
That's certainly true of HoTT.
If your only goal is to prove things consistent, or to have fewer
basic notions, then you may want to choose a simpler theory as your
etalon. But as Andrej has pointed out, there are many other
considerations that are at least as important as a simplicity (some
might say "paucity") of basic notions.
On Mon, Sep 15, 2014 at 4:53 PM, Matt Oliveri <atm...@gmail.com> wrote:
> And I figure that some set of ideas would
> not be considered fundamental if a strict subset is fundamental.
This claim only makes sense if "fundamentality" is a property of sets
of ideas rather than individual ideas (otherwise the set of all
fundamental ideas would certainly contain strict subsets).
But in that case, I don't see any reason to believe it.
For instance (probably the case you had in mind), HoTT could be said
to contain as a "strict subset" structural set theory, and both can be
argued to be "fundamental sets of ideas". But SST's fundamentality is
not just a restriction of the fundamentality of HoTT, nor is HoTT's
fundamentality simply obtained by forgetting down to SST, so I don't
see any problem with regarding both as independently fundamental.
On 15/09/14 14:27, Thorsten Altenkirch wrote:
> > What is a foundation of software?
>
> Martin-Loef Type Theory.
Is there any theory of constructive mathematics, other than MLTT, in
which computation is built-in (at the meta-level), and doesn't have to
be extracted, and, more importantly, can be left unmentioned when
writing proofs?
This is very important for some people (including myself): because
computation is not mentioned when proofs in MLTT are written up, the
theory is compatible with e.g. excluded middle as a postulate (sure,
then the computational content is lost).
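Concretely, the postulate is just one more constant with no computation rule attached; in Lean 4 syntax (Lean already ships this as Classical.em, so the axiom below is purely illustrative):

    -- Excluded middle as a postulate. Proofs written against it still
    -- type-check; the cost, as noted above, is that computational content
    -- is lost, since a postulated constant comes with no reduction rule.
    axiom em : (P : Prop) → P ∨ ¬P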
While there are constructive counterparts of ZFC, such as IZF and CZF,
their constructive content is not seen directly (in the case of CZF it
is seen by translation to MLTT).
In some sense, MLTT is not just a foundation of constructive
mathematics: it is a foundation of neutral or absolute mathematics. It
just happens that (by careful design), in the context of type theory,
neutrality implies constructivity.
From this point of view, I find it very difficult to advocate ZFC as a
foundation (sure, it is a consistency etalon): it is difficult to find
a neutral core, with inherent computational content.
I no longer know what people in this thread mean by "etalon". I
thought Vladimir introduced the word to distinguish two different
meanings that are often attached to the word "foundation": (1) a
context in which to formalize informal mathematics (which he proposed
to call "foundation") and (2) a system with respect to which to
measure the consistency of other systems (which he proposed to call
"etalon"). But now it seems that many of us are using "etalon" for
meaning (1); or, at least, most of the criteria being applied seem to
me mainly applicable to that usage.
So as regards the question
> Is this purely hypothetical, or do you consider both to be good etalons? If
> you do, can you explain why?
If by "etalon" you mean a yardstick of consistency, then it's a fact
that they are equally well-suited to that as ZFC is, simply because a
version of each of them is equiconsistent with ZFC.
If you mean a foundation, whose role is to formalize informal
mathematics, provide a unified conceptual framework in which to
describe mathematics, basic tools for mathematicians to use in their
daily work, etc. then they are certainly both such a thing.
Long experience of mathematics shows that SST is as good as ZFC at
doing these things; indeed it is closer to what mathematicians do and
use on a daily basis than ZFC is.
And HoTT includes SST, so it is
equally adequate for those things, while also supplying more.
On Mon, Sep 15, 2014 at 10:14 PM, Matt Oliveri <atm...@gmail.com> wrote:
> I brought up the idea of an
> "etalon of meaning" which is a formal system of basic concepts that are used
> to establish the meaning of other concepts, not just their consistency.
I would consider that part of the role of a foundation (1). For instance, here:
>> If you mean a foundation, whose role is to formalize informal
>> mathematics, provide a unified conceptual framework in which to
>> describe mathematics, basic tools for mathematicians to use in their
>> daily work, etc. then they are certainly both such a thing.
By "a unified conceptual framework in which to describe mathematics" I
think I meant more or less the same thing as "a system of basic
concepts used to establish the meaning of other concepts".
Are you saying you don't understand how ETCS or HoTT can be regarded
as such a system of basic concepts?
On Tue, Sep 16, 2014 at 12:35 AM, Matt Oliveri <atm...@gmail.com> wrote:
> Back up. I asked "do you consider both to be good etalons?". Specifically, I
> don't understand why someone would have two etalons. It seems like one or
> the other would be considered more intuitive.
No, I don't see any reason why "intuitiveness" should be a linear ordering.
It's important that in mathematics, "intuitiveness" is not something
you have to take a priori. We can and generally do learn new
intuitions for new concepts.
I don't see any reason why "intuitiveness" should be a lattice either. (-:
More generally, notion two is an etalon of meaning, in that it provides the most primitive ideas that all the ideas you trust can be reduced to.
On Monday, September 15, 2014 7:47:17 AM UTC-4, v v wrote:
> > Or a foundation can be any sufficiently strong formal system used to formally justify other formal systems and thus everything they justify, at any level of indirection. This is what I mean.
>
> I see now. This is a useful distinction. What you call a foundation I call "an etalon of consistency". For example ZFC is used as an etalon of consistency relative to which the consistency of the MLTT is shown. It is also used as the etalon of consistency for the type systems of HoTT through the use of univalent models.
OK good, we have a term for this. Actually, my notion two can be more than an etalon of consistency. It can also be an etalon of truth. For example, many intuitionistic systems are still consistent if you add excluded middle, but for someone who's really, philosophically a constructivist (not me), it's presumably not good to use an etalon with excluded middle. They would want to know that none of their ideas depend on it, since it's not true for them.
More generally, notion two is an etalon of meaning, in that it provides the most primitive ideas that all the ideas you trust can be reduced to. These reductions may well be too complex to actually carry out routinely, but their existence still provides some assurance that people are understanding each other.
> In this sense neither the original MLTT nor its strengthened versions that are used in the Univalent Foundations are foundational systems.

I think a simple formulation of MLTT might be a sensible etalon for constructivists. Some variant of CZF might be better. I'm not sure. I agree that Univalent Foundations is not an etalon.

> > For notion two, with no emphasis on convenience, all emphasis is on simplicity, and on efficiency of computer implementations.
>
> I am not sure why you would care about efficiency of an implementation. You are not going to actually use it for anything because the translation from the system that you actually use to an etalon of consistency may involve non-linear expansion in size.
Well, an implementation of an etalon should at least be efficient enough for it to be practical to maintain the interpretation of fancier systems you want to use. But basically you're right; the etalon wouldn't be used often, so it wouldn't need to be highly efficient.
I don't think any practical etalon would actually have a problem with non-linear expansion. Since it needs to have a practical formalization of the models, and interpretations are compositional, the models basically provide the systems that you actually use, but with a linear expansion plus some constant overhead.
> (I don't think it makes much sense to talk about mathematical ideas being "true" in any absolute sense).
As I have elaborated on previously (https://github.com/vladimirias/2010_80ies_aniversary_lecture) it is entirely possible that Peano arithmetic is inconsistent (in fact I made a formal conjecture that it is inconsistent).
Vladimir.
On Tuesday, September 16, 2014 6:42:07 PM UTC+6, v v wrote:
> As I have elaborated on previously (https://github.com/vladimirias/2010_80ies_aniversary_lecture) it is entirely possible that Peano arithmetic is inconsistent (in fact I made a formal conjecture that it is inconsistent).
>
> Vladimir.
In my opinion, the problem with formal systems is even deeper. The main question concerns the metatheory of the Goedel incompleteness theorems (and of any metatheorems). These metatheories exploit diagonal arguments, that is, an impredicativity whose resolution is problematic even in a first-order setting (it needs an infinite object). Moreover, Goedelization is a higher-order function: a proposition/sentence is the definition of the class of its own proofs/disproofs, or of both in the inconsistent case. And this is a higher-order structure: proofs, proved equivalences of proofs, and so on; "proofs between proofs" of all degrees, something like classes of classes of classes, or a "higher groupoid of proofs". These higher-order objects are themselves subject to diagonal arguments! The consistency of the Goedel metatheory is a big question: might there be proofs of "anti-Goedel-incompleteness theorems"? In summary, a metatheory is a theory with its own consistency problem (undecidable within the concept of a formal system); that is, the concept of a formal system needs external foundations. The liar paradox appears at every metalevel.
A more appropriate name analogous to "Set Theory" would be ... um ...
"Homotopy Type Theory"? (-:O
> More generally, notion two is an etalon of meaning, in that it provides the most primitive ideas that all the ideas you trust can be reduced to.

Not necessarily. For example, in HoTTs (I am using the plural to distinguish HoTTs, which are various type theories, from HoTT, which is the mathematical study of these type theories) types cannot be built from sets unless one adds a strong form of the axiom of choice.
The univalent models do indeed provide us with intuition about the objects that HoTTs directly address, but it is a limited intuition, not the possibility of formally reducing everything.