AI is possible, here is the proof


IvanV

May 4, 2013, 3:42:42 PM
to general-in...@googlegroups.com
Here is the proof that AI exists already, an "ACE" reasoner: http://attempto.ifi.uzh.ch/acerules/interface
(Click program->load example to play with the beast :))

Right now I'm working on an intelligent encyclopedia.
  • It should be crowdsourced, like a wiki.
  • As bait to gather the crowd, it should provide answers in the form of simple explanations that can be expanded further on demand (terms the reader already knows stay unexpanded, so the reader isn't flooded with already-known facts).
  • It should answer simple wh-questions, and it should provide proofs as answers to complex how-questions (deduction); the simple part of this, AceRules already has.
  • It should be able to induce new rules from exemplars in some fruitful theory (I'm hoping for an explanation of life phenomena, or a finder of new formulas, as in quantum physics). This can be done by induction of systematically constructed new formulas over exemplars in the KB. A very time-consuming task, I predict, and combinatorial explosion is to be expected; still, a lot of computers out there are connected to the Internet.
  • It should be able to solve problems in any user-defined theory, such as math, chemistry, or biology (DNA) problems (deduction and a customized universal parser would do this).
  • It should be able to implement any language besides English, crowdsourced through the universal parser.
If they can do it with ACE, and they have public documentation, it shouldn't be a problem to extend it to the complete natural language of English, or to other languages.

Current stage of the project: after a while, I'm back to writing a universal top-down parser with left recursion solved through a seed-growing algorithm. I don't like the older shift-reduce parser I made; it is not universal enough. Cross your fingers for me.
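Roughly, the seed-growing idea for direct left recursion works like this (a simplified Python sketch of the general technique, not my actual parser; the full algorithm also memoizes seeds per rule and position):

```python
# Grammar: Expr := Expr "-" Num | Num   (directly left-recursive).
# A naive top-down parser would recurse forever on Expr. Seed-growing
# instead takes the base case (Num) as the first "seed", then re-applies
# the recursive alternative, growing the seed until the match stops
# getting longer.
def parse_expr(tokens, pos=0):
    seed = parse_num(tokens, pos)            # first seed: the base case
    if seed is None:
        return None
    while True:
        value, end = seed
        if end < len(tokens) and tokens[end] == "-":
            nxt = parse_num(tokens, end + 1)
            if nxt is not None:
                num, end2 = nxt
                seed = (value - num, end2)   # grow the seed
                continue
        return seed                          # no further growth possible

def parse_num(tokens, pos):
    if pos < len(tokens) and tokens[pos].isdigit():
        return int(tokens[pos]), pos + 1
    return None

value, _ = parse_expr("10 - 2 - 3".split())
print(value)   # 5  (left associative, as the left-recursive rule demands)
```

Note that the result is 5, not 11: growing the seed left to right preserves the left associativity that the left-recursive rule encodes.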

Lots of promises, but I'm a tough guy; I will deliver at least some of it, I believe.

Matt Mahoney

May 4, 2013, 4:22:00 PM
to general-intelligence
How do you plan to translate natural language into ACE? Translating
natural languages to formal languages is an advanced skill in humans
that happens only after they learn natural language. If your
translator already knows natural language, then you have solved the
problem.

--
-- Matt Mahoney, mattma...@gmail.com

Ivan Vodišek

May 4, 2013, 4:55:24 PM
to general-in...@googlegroups.com
I plan to translate NL to Synth (a programming language I developed for AI purposes, a sort of BNF language enhanced with EventActions). Once it is translated by the universal parser, the rest is easy: it is just a matter of running queries and transforming expressions by rules (deduction, like transforming math expressions).

An example of Synth definition is this math definition: 

        Sum {
            Add (Left -> @Sum, In -> "+", Right -> @Fact) -> @Add.Left + @Add.Right;
            Sub (Left -> @Sum, In -> "-", Right -> @Fact) -> @Sub.Left - @Sub.Right;
            Fact {
                Mul (Left -> @Fact, In -> "*", Right -> @Exp) -> @Mul.Left * @Mul.Right;
                Div (Left -> @Fact, In -> "/", Right -> @Exp) -> @Div.Left / @Div.Right;
                Exp {
                    @Num;
                    @Var;
                    Braces(Left -> "(", In -> @Sum, Right -> ")") -> @Braces.In
                }
            }
        }

From this definition it is possible to parse *and* calculate simple math expressions.
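As an illustration of what such a definition amounts to, here is a plain Python recursive-descent sketch of the same grammar (my own analogue, not generated from Synth): Sum handles + and -, Fact handles * and /, and Exp handles numbers and braces. The left recursion in the Synth rules becomes iteration here.

```python
import re

def tokenize(text):
    # Numbers and the operator/brace characters from the grammar.
    return re.findall(r"\d+|[-+*/()]", text)

def parse_sum(toks):
    # Sum: Add / Sub over Fact (lowest precedence, left associative).
    value = parse_fact(toks)
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        right = parse_fact(toks)
        value = value + right if op == "+" else value - right
    return value

def parse_fact(toks):
    # Fact: Mul / Div over Exp.
    value = parse_exp(toks)
    while toks and toks[0] in "*/":
        op = toks.pop(0)
        right = parse_exp(toks)
        value = value * right if op == "*" else value / right
    return value

def parse_exp(toks):
    # Exp: @Num or Braces(Left -> "(", In -> @Sum, Right -> ")").
    tok = toks.pop(0)
    if tok == "(":
        value = parse_sum(toks)
        toks.pop(0)          # consume ")"
        return value
    return int(tok)          # @Num

def calc(text):
    return parse_sum(tokenize(text))

print(calc("2+3*4"))         # 14
print(calc("(2+3)*4"))       # 20
```

Like the Synth definition, this both parses and calculates in one pass; a real implementation would build a tree first and evaluate via the `-> ...` actions.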

Rules for transformation can be made in a somewhat similar way, like these logic rules (the first swaps the operands of a conjunction; the second is De Morgan's law):

@And (
    Left -> @Boolean,
    Right -> @Boolean
) -> @And (
    Left -> @Super.Right,
    Right -> @Super.Left
);

@Not (
    Param -> @And (
        Left -> @Boolean, 
        Right -> @Boolean
    )
) -> @Or (
    Left -> @Not (
        Param -> @Super.Param.Left
    ),
    Right -> @Not (
        Param -> @Super.Param.Right
    )
);

Deduction is about transforming expressions with such rules; induction is more complicated - it requires semantic tables.
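In plain Python, with expressions as nested tuples instead of Synth terms, those two rules amount to roughly this (an illustrative sketch):

```python
# Expressions as nested tuples: ("and", l, r), ("or", l, r), ("not", x).

def demorgan(expr):
    """Apply Not(And(a, b)) -> Or(Not(a), Not(b)) recursively, bottom-up."""
    if not isinstance(expr, tuple):
        return expr                          # atom (a @Boolean variable)
    expr = tuple(demorgan(e) for e in expr)  # rewrite subterms first
    if expr[0] == "not" and isinstance(expr[1], tuple) and expr[1][0] == "and":
        _, (_, left, right) = expr
        return ("or", demorgan(("not", left)), demorgan(("not", right)))
    return expr

def swap_and(expr):
    """The commutativity rule: And(Left, Right) -> And(Right, Left)."""
    if isinstance(expr, tuple) and expr[0] == "and":
        return ("and", expr[2], expr[1])
    return expr

print(demorgan(("not", ("and", "p", "q"))))  # ('or', ('not', 'p'), ('not', 'q'))
print(swap_and(("and", "p", "q")))           # ('and', 'q', 'p')
```

A deduction engine then just keeps applying such rewrites until a goal form (or a fixed point) is reached.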


2013/5/4 Matt Mahoney <mattma...@gmail.com>

Ivan Vodišek

May 4, 2013, 5:06:36 PM
to general-in...@googlegroups.com
The NL definition should look like this, but a lot more complicated:

Sentence (
    Subject -> { I, You, He, She, ...},
    Predicate -> {am, want, pick, ...},
    Object -> {apple, apricot, food, ...}
)

Now we can parse sentences this way:

Sentence(~I pick food~);

and the parser distributes the words into the properties Subject, Predicate and Object.

Writing the full set of English grammar rules will be a tough task for me, since I speak only pidgin English. I guess the grammar book I downloaded from the web will do.
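In plain Python, the slot-filling idea above amounts to something like this (just a toy sketch; a real NL grammar needs recursion, agreement, and far more than fixed word lists):

```python
# Each slot of the Sentence definition has a set of allowed words, and
# parsing assigns each token to its slot in order.
SENTENCE = {
    "Subject":   {"I", "You", "He", "She"},
    "Predicate": {"am", "want", "pick"},
    "Object":    {"apple", "apricot", "food"},
}

def parse_sentence(text):
    words = text.split()
    slots = list(SENTENCE)               # insertion order: S, P, O
    if len(words) != len(slots):
        return None
    result = {}
    for slot, word in zip(slots, words):
        if word not in SENTENCE[slot]:
            return None                  # word not allowed in this slot
        result[slot] = word
    return result

print(parse_sentence("I pick food"))
# {'Subject': 'I', 'Predicate': 'pick', 'Object': 'food'}
```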


2013/5/4 Ivan Vodišek <ivan....@gmail.com>

Matt Mahoney

May 4, 2013, 5:20:25 PM
to general-intelligence
I mean that translating natural language to *any* formal language
requires that you have solved the natural language problem first. For
example, who does "he" refer to in the following sentences?

Jim punched Bob because he cheated.
Jim punched Bob because he was mad.

At what point do you solve the problem, during the translation or
during the interpretation of the formal statement?

Another example. How do you parse the following?

I ate pizza with sausage.
I ate pizza with chopsticks.
I ate pizza with Bob.
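Syntax alone gives the "with" phrase two attachment sites in each of those sentences; choosing the right one takes world knowledge. A tiny illustrative sketch (hypothetical tuple trees, not any real parser's output):

```python
# PP-attachment ambiguity: "I ate pizza with X" always has two parse
# trees, and the grammar cannot choose between them by itself.
def pp_attachments(verb, obj, prep, noun):
    return [
        (verb, (obj, (prep, noun))),   # PP modifies the object: pizza-with-sausage
        ((verb, obj), (prep, noun)),   # PP modifies the verb: ate-with-chopsticks
    ]

for noun in ("sausage", "chopsticks", "Bob"):
    parses = pp_attachments("ate", "pizza", "with", noun)
    print(noun, "->", len(parses), "parses")
```

Disambiguation then has to consult facts about sausage, chopsticks, and Bob, which is exactly the problem.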

Ivan Vodišek

May 4, 2013, 5:33:09 PM
to general-in...@googlegroups.com
I guess those are problems I will have to deal with.

Maybe some concrete verbs bind to the subject, others to the object. There are also problems of ambiguity. The grammar book I've mentioned has some answers; it is pretty thick, about 500 pages. A century of reading is coming.

Conditional parsing rules will also be in my way. For example, if the first parameter is X, then the second can be Y or Z, and so on. Here I'll have to use some queries on parameters; I've already worked that part out, so it shouldn't be a problem.

I'm more confident now that I've seen AceRules.

I still have some issues with conditional expanding of segments of text that need more explanation for the reader.


2013/5/4 Matt Mahoney <mattma...@gmail.com>

Matt Mahoney

May 4, 2013, 5:40:59 PM
to general-intelligence
Maybe you can add common sense rules to help you parse, like:

Sausage is a pizza topping.
Chopsticks are eating utensils.
Bob is a person.

But how many rules do you need to add? This was the approach used by
Cyc. But they had no idea how many rules would be needed. They have
been adding rules for 29 years and still have not solved the problem.
They still do not know if they are 10% finished, or 1%, or 0.1%. What
do you think the answer is? How would you know?

Ivan Vodišek

May 4, 2013, 6:04:21 PM
to general-in...@googlegroups.com
When NL is input, the program can ask questions about ambiguous passages, i.e. a dialog box can pop up. The more rules are entered, the fewer dialogs pop up.

Also, NL is alive; it changes over time. So different flavors of NL could be offered as options on ambiguous input.

Nevertheless, most of the definitions should come from the crowd, so I will have to build a parser for the ongoing changes entered by the people who write articles. Some initial work from me is planned, defining starting rules and copying wikis I find interesting, but for the rest I will somehow have to interest scientists and teachers in entering new articles and building up new rules.

Here I find the Math solver interesting because it gathers scholars, a crowd that potentially reads articles. I still don't know how the product will behave as a tool for helping different kinds of academic research. Maybe automatic induction and deduction would pollute some results; I hope not. On-demand expanding of paragraphs and an answer machine should also be features that help gather people who don't want to lose time reading the full text but want quick answers. I hope that would be enough to bring a crowdsourced project to life.


2013/5/4 Matt Mahoney <mattma...@gmail.com>

Mike Dougherty

May 4, 2013, 6:22:50 PM
to general-in...@googlegroups.com
On Sat, May 4, 2013 at 6:04 PM, Ivan Vodišek <ivan....@gmail.com> wrote:
> Here I find the Math solver interesting because it gathers scholars, a crowd
> that potentially reads articles. I still don't know how the product will
> behave as a tool for helping different kinds of academic research. Maybe
> automatic induction and deduction would pollute some results; I hope not.
> On-demand expanding of paragraphs and an answer machine should also be
> features that help gather people who don't want to lose time reading the
> full text but want quick answers. I hope that would be enough to bring a
> crowdsourced project to life.

I think you might be asking the crowd to provide value to you without
providing much to them in return.

However, if your application wrote papers for those students from a
meager dialog with them about things like topic, depth of research,
etc., then you might have something people will use - not because
they're excited to contribute to your project but because they get
something useful from it. Over time, it should get better, and that
would attract more users.

Ivan Vodišek

May 4, 2013, 6:38:54 PM
to general-in...@googlegroups.com
I think the starting set should be rich enough to attract enough students, so I'll probably spend a year or two filling in data before inviting students to the site.

I hope that the many free, already-coded ontologies on the web will help me, so if you know of some quality resources, please let me know.


2013/5/5 Mike Dougherty <msd...@gmail.com>

Matt Mahoney

May 4, 2013, 7:14:35 PM
to general-intelligence
On Sat, May 4, 2013 at 6:38 PM, Ivan Vodišek <ivan....@gmail.com> wrote:
> I hope that the many free, already-coded ontologies on the web will help me,
> so if you know of some quality resources, please let me know.

What about OpenCyc and WordNet?

But I think you would be better off if you could figure out the
algorithm for learning natural language and train it on plain text.


-- Matt Mahoney, mattma...@gmail.com

YKY (Yan King Yin, 甄景贤)

May 4, 2013, 11:51:15 PM
to general-in...@googlegroups.com
Ivan,

Thanks for showing ACE. I've been aware of it but never tried a demo.

Natural language has some features that require "abductive" interpretation (abductive meaning "finding the best explanation").  For example, how do we resolve the pronouns in "John tells his son the story of his life"?  Or how do we interpret compound words like "cyber bullying"?

That makes natural language a bad knowledge-representation format -- because it's inefficient to interpret.  I am searching for a universal logic underlying natural language.

YKY