> For a real world scenario, MindForth needs embodiment in a robot. Forthers?
> Because MindForth learns new words, it is an excellent chatbot.
Perhaps we have to define what a chatbot is and what "learning of new
words" means. A chatbot is a computer program which fulfills external
requirements. An example requirement is: talk about the weather. Is
MindForth able to make small talk about the weather? Not yet. Instead,
MindForth asks the Forth programmer to develop a standalone program or
a subroutine for MindForth which implements a chatbot.
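To make "fulfilling an external requirement" concrete, here is a minimal
sketch in Python (the rules and phrases are invented for illustration, not
taken from MindForth): a chatbot that satisfies exactly one requirement,
talking about the weather, and nothing else.

```python
# Minimal rule-based chatbot sketch: it fulfills exactly one
# external requirement ("talk about the weather") and nothing else.
# Keywords and replies are invented for illustration.

RULES = {
    "rain": "Better take an umbrella with you.",
    "sun": "A good day for a walk.",
    "cold": "Winter is coming.",
}

def weather_bot(utterance: str) -> str:
    """Answer small talk about the weather with a canned phrase."""
    for keyword, reply in RULES.items():
        if keyword in utterance.lower():
            return reply
    return "I can only talk about the weather."
```

Whether such a program is "intelligent" is beside the point; it can be
judged simply by checking whether it meets the requirement.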
The same is true for another feature: learning of new words. Learning
means that the software improves itself by storing new vocabulary in
its main memory. With this definition in mind it is possible to judge
whether a computer program can do such a thing. In the case of MindForth
no such feature is available, and it was never designed for this purpose.
Instead, MindForth asks the user to extend its routines ... and so on.
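Under the definition given above, "learning of new words" can itself be
sketched in a few lines: the program stores vocabulary it has not seen
before and can later report whether a word is known. This is only a toy
model of the definition in Python, not a description of MindForth's
internals.

```python
class VocabularyLearner:
    """Toy model of 'learning': store new words in memory."""

    def __init__(self):
        self.known_words = set()

    def hear(self, sentence: str) -> list:
        """Store every unseen word; return the words that were new."""
        new = []
        for word in sentence.lower().split():
            if word not in self.known_words:
                self.known_words.add(word)
                new.append(word)
        return new

    def knows(self, word: str) -> bool:
        return word.lower() in self.known_words
```

A program with this feature really does change its own state when it
reads new input, which is what the definition demands.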
Perhaps it makes sense to generalize the problem a bit. A narrow AI is
a computer program which fulfills a task defined in advance. For example,
the challenge is to translate a sentence from Russian into English, and
the narrow AI has to fulfill that task. MindForth is not a narrow AI,
because it defines its own domain instead of fulfilling an external one.
It's important to explain that it's not very complicated to program
software which creates an action space. All the programmer has to do
is imagine a game and formulate it as a computer program. All games
ask the user to perform certain actions, and the game provides the
rules which force the user to act inside the game. The problem is that
Artificial Intelligence was invented with the aim of solving a game,
because this is what humans do: they provide the answer to a
problem. To judge whether a certain type of software is capable of doing
so, we have to investigate whether the software is able to play not its
own game, but external games.
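The claim that a game defines an action space and rules can be shown
with a toy example, again a Python sketch with an invented game: the
game enumerates the legal actions, and its rules reject everything
outside that space.

```python
# Toy "game" that defines an action space and enforces its rules.
# A player (human or AI) may only act inside this space. An AI that
# solves this game proves nothing about external games.

class CountingGame:
    """Reach exactly the target by adding 1 or 2 per move."""

    ACTIONS = (1, 2)  # the entire action space

    def __init__(self, target=5):
        self.target = target
        self.total = 0

    def step(self, action: int) -> bool:
        """Apply one action; return True when the game is won."""
        if action not in self.ACTIONS:
            raise ValueError("action outside the action space")
        if self.total + action > self.target:
            raise ValueError("rule violation: target overshot")
        self.total += action
        return self.total == self.target
```

Writing such a game is easy; the hard question from the paragraph above
is whether a given AI can play a game it did not define itself.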