> In turn, thanks for your time and feedback, Thomas.
> (I'm replying to yours and Tom's separately, here)
> On Jul 27, 6:18 am, Thomas de Grivel <billi...@gmail.com> wrote:
> > Thanks for taking the time to write up your intuitions; I enjoy your
> > writing style and others surely will too. But what are you trying to
> > say? For instance, UNIX platforms have compilers for almost every
> > useful language out there. Are they first-class citizens enough for
> > your definition?
> Short answer: no, they aren't, per the specific meaning I give to
> "first class citizens" as applied to languages (i.e., in the context
> of this discussion).
> Longer answer: they aren't because, from my point of view, it's worth
> trying something new and investigating how to step up to a higher
> level of abstraction when approaching "competing" languages/paradigms
> (I'd rather say, more positively: "collaborating" ones).
> Specifically, I'm talking about something much like what you
> encounter in PLs offering closures (or method pointers, or delegates,
> whatever you call them) to make functions "first class" (as opposed
> to PLs which don't), thus able to be passed as just another kind of
> value between the activation environments of other functions, etc.,
> but with the (somewhat bizarre/adventurous/unusual, yes, I suppose)
> "twist" of actually being willing to speak, there, of formal
> languages/paradigms themselves as such values.
> But yes, I anticipate this also comes with the cost of providing some
> actual, concrete support to this programme/agenda: if one wants to
> speak of/look at languages (in their syntactic and semantic
> definitions) as first-class values, one also has to provide enough of
> an infrastructure (WWW-like, in my contemplation) to be able,
> likewise, to identify/name/compose/reuse these (computing- or
> modeling-related) language definitions. In the case of first-class
> function values, such infrastructure is conveniently, practically
> provided by the underlying type systems (often VM-based, such as
> Java, the CTS/CLR, JS engines, etc.), and that's not so surprising,
> as we are then talking about first-class values of this specific
> abstraction: computation.
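To make the analogy concrete, here is a minimal sketch (names and code mine, purely for illustration) of what "functions as first-class values" means in practice: a function escapes the activation environment that created it, carrying that environment along as a closure, and functions are passed around like any other value.

```python
# Minimal sketch: functions as first-class values, passed between
# the activation environments of other functions via closures.

def make_counter(start):
    # 'count' lives in this function's activation environment...
    count = start

    def step(delta):
        nonlocal count
        count += delta
        return count

    # ...and 'step' escapes it as an ordinary value, closing over 'count'.
    return step


# A function received as just another argument:
def twice(f, x):
    return f(f(x))


counter = make_counter(10)
print(counter(1))                  # 11
print(counter(5))                  # 16
print(twice(lambda n: n + 3, 0))   # 6
```

The point of the "twist" in the text above is to ask for the same treatment, not for functions, but for language definitions themselves.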
> Thus, I'm just trying to figure out how to step up a little(?)
> further, I hope not unreasonably. My core belief (and here, take it
> literally: I can't provide any more rationale than that; take it just
> as the "axiom" I need to bet on if the topic I'm concerned with is a
> long-term one, as seems to also be the case for TUNES) is that we
> won't have "one PL/paradigm to rule them all" any time soon. So I
> prefer to "humbly" accept the idea that imaginative/innovative new
> PLs and paradigms will keep breeding no matter what we try to do to
> unify them, and therefore my concern is:
> "OK, then: languages will keep being invented on top of Turing's
> theorems, concretized in everyday computers. How can we at least do
> better (than so far) at having them, their designs, their
> implementations, their processing tools, and the applications built
> with them scale better, compose better, be reused better, be
> pipelined better, etc.?"
> That's how I figure you then "just need" to look at it (that
> ever-growing population of languages and their usages) from a higher
> point of view, not just at the grounded level of the computation they
> express.
> Hope it makes better sense to you, but I confess I myself have a hard
> time conveying these ideas without the support of many predecessor
> attempts (obviously).
> > The so-called short-termed industry is still using UNIX and a wide
> > variety of languages covering most use cases and paradigms; the only
> > problem is high maintenance cost, from the fragmentation you
> > described. How would "first-class citizen languages" solve this
> > problem and not make it worse?
> I think I've actually replied to this above already; but just in
> case, in short: not by trying to unify anything, but by no longer
> letting things go mostly unmanaged, as is still the case for now. A
> simple summary of the agenda I have, which I hope is related/useful
> to TUNES, is: devise a way to manipulate language definitions
> (repositories, registries thereof, etc.) and their derived artifacts
> after transforming their phrases into others via computations, and
> give the tools for language L1 an optional, simple, standardized way
> of learning about "meta-linguistic" properties of a language L2: are
> all L2 implementations conformant to its definition? Is its
> evaluation strict or lazy? Can I embed L1 in this or that syntactic
> construct of L2, with the benefit of reusing this or that part of
> L2's semantics (as found in its reference implementation, for
> instance)? Etc., etc.
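A toy sketch of what such a registry of meta-linguistic properties might look like, under heavy assumptions: every name here (LanguageDef, Registry, the particular properties tracked) is invented for illustration, not part of any existing system or of the agenda's actual design.

```python
# Hypothetical sketch: a registry that a tool for language L1 could
# query for "meta-linguistic" properties of a language L2.
from dataclasses import dataclass, field


@dataclass
class LanguageDef:
    name: str
    evaluation: str                                   # "strict" or "lazy"
    conformant_impls: list = field(default_factory=list)
    embeddable_in: set = field(default_factory=set)   # names of host languages


class Registry:
    def __init__(self):
        self._defs = {}

    def register(self, lang: LanguageDef):
        self._defs[lang.name] = lang

    def is_lazy(self, name: str) -> bool:
        return self._defs[name].evaluation == "lazy"

    def can_embed(self, guest: str, host: str) -> bool:
        # Can a phrase of 'guest' be embedded in a construct of 'host'?
        return host in self._defs[guest].embeddable_in


reg = Registry()
reg.register(LanguageDef("L2", evaluation="lazy",
                         conformant_impls=["l2-reference"],
                         embeddable_in={"L1"}))

print(reg.is_lazy("L2"))           # True
print(reg.can_embed("L2", "L1"))   # True
```

The interesting (and hard) part, which this sketch sidesteps entirely, is standardizing what counts as a property and how conformance claims are verified across implementations.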
> > And if you want my counter-intuitive point of view on the software
> > industry:
> > I believe the fact that most of the industry is high on Java and
> > crappy designs is mostly a cultural problem, not a technical one:
> > companies do not have the knowledge to make good design decisions,
> > and have a hard time hiring good developers and replacing them. I
> > think a good rationale (if rationale is any good) for that is in
> > this paper:
> > http://www.winestockwebdesign.com/Essays/Lisp_Curse.html
> > I guess the solution lies in open source, and despite its low
> > activity TUNES is a remarkable project. Lisp communities are slowly
> > gaining mass, and the number of people publicly scratching their
> > itch is growing thanks to GitHub and others. Crazy, unifying designs
> > might become something we can implement some day, but much hacking
> > is needed. There is a lot of crazy work to do, and we're all short
> > on time.
> You received this message because you are subscribed to the Google Groups
> "TUNES Project" group.
-Brian T. Rice