Re: The Future of Strongtalk

Gilad Bracha

Oct 27, 2006, 7:36:45 PM
to strongtal...@googlegroups.com
Adding to what Dave said:

JVMs are currently very, very poorly suited to running dynamic languages.

A. The verifier is in the way: your byte code must statically type
check. You cannot realistically compile Smalltalk (or Ruby, Python)
directly to such code. A lot of cleverness and complexity is needed
(cf. IronPython on .Net), and you still don't reach really high
performance. One can turn off verification and open the VM up to all
sorts of security risks, but then you can never be sure if your code
will run on a given installation - it depends on their settings.
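
To make this concrete, here is a rough Java sketch of the kind of code
a straightforward translator is forced to emit to keep the verifier
happy. The names are made up and this is nobody's actual scheme, just
an illustration:

    // Hypothetical hand-translation (not any real translator's output) of
    // the Smalltalk expression  aPoint x + anotherPoint x  into code the
    // verifier will accept: every value is erased to Object, and every
    // send needs a cast before the call.
    public class ErasedSendExample {

        // Stand-in for a translated Smalltalk Point class.
        public static class StPoint {
            private final int x;
            public StPoint(int x) { this.x = x; }
            public Object stX() { return Integer.valueOf(x); }
        }

        // One statically typed helper per send the translator emits; a
        // wrong receiver shows up as a ClassCastException instead of a
        // clean doesNotUnderstand:.
        static Object sendX(Object receiver) {
            return ((StPoint) receiver).stX();
        }

        static Object sendPlus(Object receiver, Object argument) {
            return Integer.valueOf(((Integer) receiver).intValue()
                                 + ((Integer) argument).intValue());
        }

        public static void main(String[] args) {
            Object aPoint = new StPoint(3);
            Object anotherPoint = new StPoint(4);
            System.out.println(sendPlus(sendX(aPoint), sendX(anotherPoint))); // 7
        }
    }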

B. Worse, the JVM has very limited support for modifying code on the
fly (hotswapping).

The effect of these two items is that you are pretty much forced to
write your own interpreter and pay a penalty, in performance and
complexity, for dancing around these limitations.
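
Here is a sketch of the sort of dispatch machinery you end up
hand-rolling (again hypothetical, not Strongtalk's or any particular
system's design): every send goes through your own reflective lookup
and cache rather than a direct call the VM can inline:

    import java.lang.reflect.Method;
    import java.util.HashMap;
    import java.util.Map;

    // A hand-rolled message dispatcher: each dynamic send pays for a map
    // lookup and a reflective invocation, where an optimizing Smalltalk VM
    // would compile a direct, often inlined, call.
    public class ReflectiveSend {
        private static final Map<String, Method> cache = new HashMap<String, Method>();

        public static Object send(Object receiver, String selector, Object... args)
                throws Exception {
            String key = receiver.getClass().getName() + "#" + selector;
            Method m = cache.get(key);
            if (m == null) {
                // Lookup by name and arity only; a real runtime also has to
                // model doesNotUnderstand:, super sends, visibility, etc.
                for (Method candidate : receiver.getClass().getMethods()) {
                    if (candidate.getName().equals(selector)
                            && candidate.getParameterTypes().length == args.length) {
                        m = candidate;
                        break;
                    }
                }
                if (m == null) {
                    throw new NoSuchMethodException(selector);
                }
                cache.put(key, m);
            }
            return m.invoke(receiver, args);
        }

        public static void main(String[] args) throws Exception {
            // 'hello world' size  becomes a reflective call to String.length()
            System.out.println(send("hello world", "length"));  // prints 11
        }
    }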

JSR292 is an initiative I started to address these problems. I hope it
will continue in my absence, but it will be some years until it comes
to fruition (if ever). It is frankly much harder to add support for
these features to a JVM than to stabilize Strongtalk.

There are additional issues.

C. No built-in support for closures. Strongtalk does a pretty amazing
job of optimizing closures, and that won't happen on a JVM.
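
For instance, a block like [:each | sum := sum + each] has to become
something like the following on today's JVM. This is only a sketch,
but the shape is hard to avoid: a heap-allocated object per block,
with captured mutable variables boxed by hand, and little for the VM
to optimize away:

    // Sketch of a Smalltalk block compiled to the JVM: a one-method
    // interface plus an anonymous inner class, with the captured mutable
    // variable hand-boxed in a one-element array because inner classes can
    // only capture final locals.
    public class BlockExample {
        interface OneArgBlock {
            Object value(Object arg);
        }

        public static void main(String[] args) {
            final int[] sum = { 0 };  // hand-boxed captured variable
            OneArgBlock addToSum = new OneArgBlock() {
                public Object value(Object arg) {
                    sum[0] += ((Integer) arg).intValue();
                    return Integer.valueOf(sum[0]);
                }
            };
            for (int i = 1; i <= 4; i++) {
                addToSum.value(Integer.valueOf(i));
            }
            System.out.println(sum[0]);  // prints 10
        }
    }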

D. Mixins. If you want good support for the various multiple
inheritance schemes other languages have, it's pretty hard to do that
on a JVM, and there are no plans so far to deal with that. Ruby mixins
are very close to Strongtalk mixins; mixins are a good general purpose
primitive for supporting a wide variety of inheritance semantics.
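
To illustrate (a sketch with made-up names, not a recommended design),
the closest you get on the JVM is an interface plus a helper class
holding the mixin's behavior, with every class that wants it repeating
the same forwarding boilerplate by hand; there is no way to apply the
mixin to an arbitrary superclass the way Strongtalk or Ruby can:

    // Simulating a mixin on the JVM: the mixin's derived behaviour lives in
    // a helper class keyed to an interface, and each "mixed-in" class has
    // to declare the interface and forward to the helper itself.
    public class MixinExample {
        interface Magnitude2 {
            int compareValue();
        }

        static class MagnitudeOps {
            static boolean lessThan(Magnitude2 a, Magnitude2 b) {
                return a.compareValue() < b.compareValue();
            }
        }

        static class Money implements Magnitude2 {
            private final int cents;
            Money(int cents) { this.cents = cents; }
            public int compareValue() { return cents; }
            // Forwarding boilerplate, repeated in every class that wants the mixin.
            boolean lessThan(Money other) { return MagnitudeOps.lessThan(this, other); }
        }

        public static void main(String[] args) {
            System.out.println(new Money(100).lessThan(new Money(250)));  // true
        }
    }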

E. Footprint and startup. JVMs make big bloated clients that start up
slowly. This may not be the JVM's fault. Mostly, it's the bloat in the
Java platform. But disentangling these things has eluded a lot of
people so far.

Strongtalk is a vastly superior basis for dynamically typed languages.
I really hope the community gets it to the point where it's ready for
prime time.
--
Cheers, Gilad

azgolfer

Oct 28, 2006, 12:43:06 AM
to Strongtalk-general

Gilad Bracha wrote:
> Adding to what Dave said:
>
> JVMs are currently very, very poorly suited to running dynamic languages.
>
> A. The verifier is in the way: your byte code must statically type
> check. You cannot realistically compile Smalltalk (or Ruby, Python)
> directly to such code. A lot of cleverness and complexity is needed
> (cf. IronPython on .Net), and you still don't reach really high
> performance. One can turn off verification and open the VM up to all
> sorts of security risks, but then you can never be sure if your code
> will run on a given installation - it depends on their settings.
>

Couldn't you use type inference to create interfaces for all the
dynamic types? I think you said this was 'brittle' in some
presentation I read, but I wasn't sure why. Couldn't the compiler just
create, say, Interface Set, Interface Dictionary implements Set, etc.,
for Collections?
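
Something along these lines is what I had in mind, just as a sketch
with made-up names:

    // Hypothetical interfaces an inferencer might emit for the collection
    // protocols, with Dictionary treated as a subtype of Set.
    interface InferredSet {
        void add(Object element);
        void doEach(Object oneArgBlock);
    }

    interface InferredDictionary extends InferredSet {
        Object at(Object key);
        void atPut(Object key, Object value);
    }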

David Griswold

Oct 28, 2006, 6:58:22 AM
to strongtal...@googlegroups.com

First of all, the relationships between the "interfaces" in dynamic
languages often don't follow subtyping relationships, so you would have to
do a dynamic cast at nearly every point that such an interface was
used, which would be extremely slow.

Secondly, inference is based on the *use* of the interfaces, not just on the
definitions, so anytime you use an object in a new way, existing inferred
interface relationships could change, requiring global analysis and
recompilation of arbitrarily large parts of the code in the system.

And thirdly, inference isn't nearly powerful enough to model all the things
that people might do in dynamic languages. Anytime the inference algorithm
falls short, your dynamic code would not compile, so to the programmer it
would appear that the compiler/system arbitrarily rejects code that
ought to work according to the fully dynamic language semantics. The
inference algorithm, which is purely an implementation detail, could never
be fully encapsulated, and would limit the language semantics in
hard-to-understand ways.

For example, an inference algorithm might reasonably conclude that + for
integers and + used for string concatenation in some language aren't really
the same operation, but some new piece of weird code could always use + in
some limited way that would work for both kinds of objects, so the
inference mechanism would have to be able to construct some common interface
just for that strange use.
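
As a sketch of what that looks like (made-up names), the conjured-up
interface doesn't buy much anyway, because the casts come straight
back in the implementations:

    // Sketch of the ad-hoc interface an inferencer would have to invent
    // once a single piece of code applies + to both integers and strings.
    public class PlusExample {
        interface Plusable {
            Plusable plus(Plusable other);
        }

        static class IntValue implements Plusable {
            final int n;
            IntValue(int n) { this.n = n; }
            public Plusable plus(Plusable other) {
                return new IntValue(n + ((IntValue) other).n);  // cast creeps back in
            }
        }

        static class StrValue implements Plusable {
            final String s;
            StrValue(String s) { this.s = s; }
            public Plusable plus(Plusable other) {
                return new StrValue(s + ((StrValue) other).s);  // and again here
            }
        }

        public static void main(String[] args) {
            Plusable three = new IntValue(1).plus(new IntValue(2));
            Plusable ab = new StrValue("a").plus(new StrValue("b"));
            System.out.println(((IntValue) three).n + " " + ((StrValue) ab).s);  // 3 ab
        }
    }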

The problems are demonstrated even in the examples you mentioned: the
Dictionary interface *ISN'T* a subtype of the Set interface, even if
Dictionary is implemented by inheritance from Set in most Smalltalks
(example: for Dictionaries, #do: iterates over the elements associated with
the keys, not the associations themselves, and #add: takes an association,
not an element, whereas for Sets both #do: and #add: use the elements, not
associations), so you couldn't generate typesafe interfaces in Java unless
all arguments and return values were just specified as Object and cast
*everywhere* they were used.
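
Rendered as Java interfaces (again just a sketch with made-up names),
the mismatch looks like the following; the only way to make the second
a subtype of the first is to weaken both to Object and cast at every
use site:

    // Dictionary's protocol is not a subtype of Set's: #add: and #do: want
    // different kinds of arguments, so keeping the precise types breaks the
    // subtype relation, and keeping the subtype relation forces Object
    // everywhere.
    public class ProtocolMismatch {
        interface ElementBlock { void value(Object element); }

        static class Association {
            final Object key, value;
            Association(Object key, Object value) { this.key = key; this.value = value; }
        }

        interface SetProtocol {
            void add(Object element);          // takes an element
            void doEach(ElementBlock block);   // iterates over the elements
        }

        interface DictionaryProtocol {         // cannot usefully extend SetProtocol
            void add(Association assoc);       // takes an association, not an element
            void doEach(ElementBlock block);   // iterates over the values, not associations
            Object at(Object key);
        }
    }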

-Dave

