Thanks for taking the time to write up your intuitions; I enjoy your
writing style and others surely will too. But what are you trying to
say? For instance, UNIX platforms have compilers for almost every
useful language out there. Are they first-class citizens enough for
your definition?
The so-called short-term industry is still using UNIX and a wide
variety of languages covering most use cases and paradigms; the only
problem is high maintenance cost, from the fragmentation you described.
How would "first-class citizen languages" solve this problem rather
than make it worse?
And if you want my counter-intuitive point of view on the software
industry: I believe the fact that most of the industry is high on Java
and crappy designs is mostly a cultural problem, not a technical one.
Companies do not have the knowledge to make good design decisions and
have a hard time hiring good developers and replacing them. I think a
good rationale (if rationale is any good) for that is in this paper:
http://www.winestockwebdesign.com/Essays/Lisp_Curse.html
I guess the solution lies in open source, and despite its low activity
TUNES is a remarkable project. Lisp communities are slowly gaining
mass, and the number of people publicly scratching their itch is
growing thanks to GitHub and others. Crazy, unifying designs might
become something we can implement some day, but much hacking is needed.
There is a lot of crazy work to do, and we're all short on time.
On 27/07/11 07:47, CJJ wrote:
I had the following intuition: since (1) abstraction is one of our
most powerful levers to fight back against accidental complexity in
design and implementation, and since (2) with that dear REST-flavored
core of a WWW+DNS (yes, from day one actually; read Tim BL back in the
early 90s, even before Fielding's thesis/assessment 10 years later) we
can now, in reasonably delimited knowledge use cases, at least for a
good number of applications, make the [CWA], why not go one step
farther and make *languages* themselves first-class citizens *within*
the platform/infrastructure?
--
You received this message because you are subscribed to the Google Groups "TUNES Project" group.
To post to this group, send email to tunes-...@googlegroups.com.
To unsubscribe from this group, send email to tunes-projec...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/tunes-project?hl=en.
Is the #lang keyword in the Racket implementation of Scheme a bit
closer to what you have in mind?
-- hendrik
Maybe that's a dream I dream too: "let's formalize the semantics of
all languages and compile any of them into any other". Maybe we should
determine and dump the semantics of all first-class languages into
Prolog. Is that a hypothetical implementation of your idea? That's a
heap of work though, and a heavy weight, as you have a very large
dataset to reason upon.
Or "let's have all the compilers compile to a good internal
representation", like Lisp, or Shen when it's ready. Actually,
languages as first-class citizens do remind me of something that gives
Lisp its power: macros, and reader macros. In Lisp we could define a
reader macro "C++" which would read and parse C++, converting it into
Lisp code. This is not a special construct of Lisp; macros are the core
of Lisp. If we can also translate back to the source language then we
can do pretty funny things, but that almost doubles the work. Actually,
you can almost do this with JavaScript in Common Lisp with Parenscript.
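For concreteness, here is a minimal sketch of that read-time idea in
Common Lisp. The #i dispatch character and the toy one-operator infix
grammar are illustrative assumptions of mine; a real "C++" reader would
of course need a full parser behind it.

```lisp
;; Install a reader macro on #i that parses a (trivially small) foreign
;; infix syntax at read time and yields ordinary Lisp code.
(set-dispatch-macro-character
 #\# #\i
 (lambda (stream sub-char arg)
   (declare (ignore sub-char arg))
   (let ((lhs (read stream t nil t))   ; e.g. 1
         (op  (read stream t nil t))   ; e.g. +
         (rhs (read stream t nil t)))  ; e.g. 2
     ;; Re-emit the foreign expression as a Lisp form.
     (list op lhs rhs))))

;; After this, the text  #i 1 + 2  reads as the form (+ 1 2).
```

Parenscript works in the other direction: its ps macro takes Lisp forms
and emits a JavaScript string, which is the "translate back" half of
the picture.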
This is first-class enough for me in the context of an abstract idea.
But that means all compilers for complex languages would have to be
ported, and they would not be optimized.
The problem is that when you try to compile most languages to something
other than x86, you realize they're completely ill-defined and produce
broken abstractions, etc. Or the produced code is impossible to read or
to optimize.
On 28/07/11 02:50, Cyril Jandia wrote:
I think we can imagine these datasets coming to life partially in the
semantic web. I guess what is lacking is the motivation to produce
formalized semantics in RDF. We usually don't doubt that we can compile
one PL into another; only we usually do it in an informal way: it's a
huge enough work already without taking the time needed to expose the
underlying semantics in a formalized representation.
>> Or "let's have all the compilers compile to a good internal
>> representation", like Lisp or Shen when it's ready.
>
> I'm sorry if I confused you, but this is NOT a strategy I'm thinking
> of EITHER, really.
Sorry for my naive interpretation; I'm just trying to understand.
> If I understand correctly your point that's roughly equivalent to say
> I would be trying, this time, to unify the various languages' virtual
> execution environments into one VM / intermediary language only, let's
> say, e.g., dropping the JVM once in favor of the CLR/CIL or the other
> way round.
No, that's not what I meant. I should explain a bit more: I was not
really talking about execution but about the *internal representation*
used by a compiler, providing an ad hoc formal representation for all
languages. The advantage of using Lisp is that we already have a good
compiler for this representation, and that it is meta-programmable,
meaning we can reuse the imported languages to support further ones.
> That's definitely not something I would consider reasonable either,
> for all the obvious reasons related to the portability and/or
> efficiency of this or that VM implementation's source code over this
> or that target processor architecture. Plus, imho, it's very much a
> matter of tastes/personal preferences more than of objective argument
> that this VM or intermediary language addresses better the portability
> to/the optimal use of a given processor than that other VM or
> intermediary language.
Lisp can be compiled to any VM. We can think of the internal
representation as in LLVM: multiple languages, multiple targets.
>> Actually languages as first-class citizens do remind me of something which gives its power
>> to Lisp : macros, and reader macros. In Lisp we could define a reader
>> macro "C++" which would read and parse C++ converting it into Lisp code.
>> Now this is not a special construct of Lisp, macros are the core of Lisp.
>
> So, it seems I need to rephrase the core idea maybe differently, one
> more time to clarify. I'll give it a try later/soon.
Your idea sounds interesting; I'm trying to imagine what you mean to
say, and I definitely hope to read more about it.
Cheers,
--
Thomas de Grivel
http://b.lowh.net/billitch
"I must plunge into the water of doubt again and again."
This is the kind of thing the #lang line in Racket is good for.
(Racket is the new name for PLT Scheme; I don't know if this feature is
part of Scheme now, or just part of Racket.) You put a line like
#lang fooo
at the start of a file of Racket code, and it tells Racket that the
file is written in the language 'fooo'. It fetches the language
definition from wherever it keeps them, and uses it for the rest of the
file.
Multimodule programs can easily be written in multiple languages.
Though I doubt the mechanism is up to the rather insecure semantics of
C++ -- probably a good thing.
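To illustrate the multi-language point, here is a sketch; the file
names and the greet function are made up, but the two languages shown
(racket and typed/racket) both ship with Racket. The two files are
shown as one listing for brevity; each #lang line starts a separate
file.

```racket
;; greet.rkt -- an ordinary Racket module
#lang racket
(provide greet)
(define (greet name) (string-append "hello, " name))

;; main.rkt -- a second module, written in a different language
;; (typed/racket), that imports the first; Racket picks each file's
;; language from its #lang line.
#lang typed/racket
(require/typed "greet.rkt" [greet (-> String String)])
(displayln (greet "world"))
```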
-- hendrik