Yesterday I released tink_markup, a cross-platform way to build
hierarchical structures, which incidentally means it can be used
as a template language. In fact, at its current stage, it is not much
more than that. Documentation can be found here:
https://github.com/back2dos/tinkerbell/wiki/tink_markup
If you're familiar with haml or jade, you might find it isn't that
different (although I had to change the syntax to get it to run in
haXe). It offers two outputs: one is an Xml tree, the other a string
output optimized for runtime performance (at the cost of
significantly higher compile times - although I do have ideas on how
to reduce those).
I did a bit of benchmarking, comparing the fast output, templo, haXe
templates, the xml output and erazor.
The source is here:
https://github.com/back2dos/tinkerbell/blob/master/tests/src/markup/SpeedTest.hx
Here are the numbers I got on my machine:
neko (20000 runs):
- fast: ~0.55 seconds
- templo: ~0.70 seconds
- template: ~0.90 seconds
- xml: ~2.30 seconds
- erazor: ~69.5 seconds
php (2000 runs - sad but true):
- fast: ~0.33 seconds
- templo: failed with an internal Xml parse error from the generated
temporary file that I couldn't track down further.
- template: ~3.90 seconds
- xml: ~5.80 seconds
- erazor: ~41.5 seconds
js (chrome 50000 runs):
- fast: ~0.21 seconds
- templo: not available?
- template: ~1.75 seconds
- xml: ~1.94 seconds
- erazor: ~60.0 seconds
js (firefox 50000 runs):
- fast: ~0.27 seconds
- templo: not available?
- template: ~3.85 seconds
- xml: ~13.70 seconds
- erazor: ~173 seconds
flash9 (20000 runs in chrome bundled version):
- fast: ~0.22 seconds
- templo: not available?
- template: ~2.32 seconds
- xml: ~5.22 seconds
- erazor: didn't get it to work: the release build caused
EUnknownVariable(name) and the debug build says
"C:\Motion-Twin\haxe\lib\hscript/1,6/hscript/Parser.hx:716: characters
10-15 : Local variable c used without being initialized"
Remarks:
As for templo and template, the template string is parsed once for all
successive runs, so all that is really measured here is execution speed.
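For reference, the measurement loop has roughly this shape (a simplified sketch, not the actual SpeedTest.hx; `render` and `RUNS` stand in for the parsed template call and the per-target run count):

```haxe
// Simplified benchmark sketch: "parse" once, time only the execution loop.
class Bench {
	static inline var RUNS = 20000; // run count varies per target
	static function main() {
		// Stands in for a template parsed ahead of the loop:
		var render = function (name:String) return 'Hello ' + name;
		var start = haxe.Timer.stamp();
		for (i in 0...RUNS)
			render('World');
		trace(haxe.Timer.stamp() - start); // elapsed seconds
	}
}
```

Engines that parse inside `render` (as erazor currently does) pay the parsing cost on every iteration, which is what the numbers above reflect.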
As for erazor, as it stands that is not possible, because the
template is parsed during execution. Separating the two (or introducing
caching) is rather straightforward, but since work is currently being
done on a macro-based implementation, I didn't feel like playing
around with it too much. It is reasonable to assume that a macro-based
version of erazor will be very close in speed to tink_markup and
templo.
A pure measurement of execution speed also only makes sense for neko
and node.js, where the parsed template can be kept in memory. In PHP you
can expect your template to be parsed on every request that needs it,
although templo has support for template caching.
Please note that tink_markup involves no runtime parsing. It
transforms your expression directly into the haXe code that generates
the desired output.
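To illustrate the principle (this is not tink_markup's literal output, just a sketch of what macro-generated rendering code could look like; the markup shape and names are made up):

```haxe
// Hypothetical shape of generated code: no parser runs at runtime,
// because the macro has already turned the markup expression into
// plain StringBuf concatenation at compile time.
function renderUser(name:String, age:Int):String {
	var buf = new StringBuf();
	buf.add('<div class="user"><h1>');
	buf.add(StringTools.htmlEscape(name));
	buf.add('</h1><p>Age: ');
	buf.add(age);
	buf.add('</p></div>');
	return buf.toString();
}
```

Since the structure is fixed at compile time, the runtime cost is just appending strings, which is why the "fast" output leads the benchmarks above.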
Looking ahead, I plan to soon add another output for js that will
directly create DOM objects. And after I've had the opportunity to meet
the Cocktail team at the WWX, hear their talk and see what the new
version brings, I plan to create a flavor of this language that will
output Cocktail object structures. Also in the context of GUI
declarations, there will be an integration with tink_reactive, allowing
the definition of bindings in an MXML-ish fashion.
Any feedback is hugely appreciated :)
Regards,
Juraj
I'll take that as a compliment ;)
It looks awesome :)
The latest git version of erazor allows you to parse templates at
compile-time through macros, though (see :
https://github.com/ciscoheat/erazor/blob/master/test/erazor/TestMacro.hx)
I'd be very interested to see how it performs in your benchmark, even
if it's likely to be very close to other "pre-compiled" systems.
tink_markup only supports compile-time parsing, no run-time parsing, right?
If so, are there plans, or ways, to work around this limit?
Regards,
Clément
Yeah, I'll give it a try tomorrow and hit you with some new numbers. I
was just too busy (and too lazy) to use anything not on haxelib ;)
> @:template("Hello @name")
> class MacroTest extends erazor.macro.Template<{name:String}>{}
>
> And execute like this:
>
> var template = new MacroTest();
> template.execute({ name : "Franco" });
>
> In any case congratulations for the new wave of code ... you are really
> doing some amazing things ;)
Thank you :)
Regards,
Juraj
Thanks :)
> The latest git version of erazor allows you to parse templates at
> compile-time through macros, though (see :
> https://github.com/ciscoheat/erazor/blob/master/test/erazor/TestMacro.hx)
> I'd be very interested to see how it performs in your benchmark, even
> if it's likely to be very close to other "pre-compiled" systems.
Will update you as soon as I've had time to try it out.
> tink_markup only supports compile-time parsing, no run-time parsing, right?
> If so, are there plans, or ways, to work around this limit?
You are right, and that's on purpose. In contrast to bulky languages
like Java, haXe has a blazing fast compiler and is a very expressive
language. Externalizing templates, configs and the like is largely a
workaround for limitations that haXe doesn't actually have (not since
we got macros, anyway). Currently I am focusing on exploring how far
you can get with this approach.
It is possible to have a parallel effort that processes the same
syntax at runtime, but it would give up most of the strengths of
tink_markup.
As an alternative, you can always split your project by defining
something like:
extern class Templates {
	function renderUser(user:User):String;
	function renderArticle(article:Article):String;
}
And have a concrete implementation of this in a different project,
combining the two at runtime. Speaking of which, I will look into how
this could be automated in a convenient way.
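A sketch of what the other half of that split could look like (the class body and the `User.name`/`Article.title` fields are hypothetical; in practice the render functions would be generated with tink_markup rather than written by hand):

```haxe
// Concrete implementation living in the separate project. It is
// compiled against the same User/Article types and fulfils the
// extern declaration from the main project at runtime.
class Templates {
	public function new() {}
	public function renderUser(user:User):String {
		return '<li class="user">' + StringTools.htmlEscape(user.name) + '</li>';
	}
	public function renderArticle(article:Article):String {
		return '<article><h1>' + StringTools.htmlEscape(article.title) + '</h1></article>';
	}
}
```

The main project only ever sees the extern signatures, so the template project can be recompiled and swapped independently.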
The long term plan for me is to continue hoping for haXe compiler
bootstrapping :)
Regards,
Juraj
If you're ok with the idea, you could also be part of the people
auditioning the cocktail presentation, so you can have an earlier
preview and then see with the Silex Labs team what the best output
for cocktail could be.
Raph