Also, does anyone have experience with the grammars supplied with
Abraxas? Are they complete and complex enough to use to develop
transformation engines (e.g., C to VB)?
Any advice is greatly appreciated.
TIA
[On 16 bit platforms, the commercial systems worked a lot better than
the freeware. On 32 bit platforms, there's a lot less difference.
MKS has a bunch of debugging stuff that can be nice, but the basic
yacc and lex function in the commercial versions is about the same as
Berkeley yacc and flex. I looked at Abraxas' sample grammars a while ago,
and they're not a whole lot more than the yacc file. -John]
--
Send compilers articles to comp...@iecc.com, meta-mail to
compiler...@iecc.com. Archives at http://www.iecc.com/compilers
>Does anyone have recommendations as to whether the commercially
>available Lex/YACC packages are significantly better than Flex and
>Bison. Specifically, the tools from MKS and Abraxas? I've been using
>Flex and Bison with much success (WinTel platform) and am trying to
>understand the benefits of these packages (which cost $400 - $1000).
If all you want is the basic functionality, then flex and byacc (or
bison) are fine. If you want something more, I highly recommend
Yacc++ from Compiler Resources. I found out about them from our
FAQ, and although the package is not cheap, it is really well
implemented. And it has one of the best manuals I've had the
pleasure of reading in my 31 years of programming. Five stars.
Dwight
[Chris Clark, who wrote much of that package, posts here from time to time. -John]
Daniel
Daniel Dittmar (mailto:dit...@berlin.snafu.de)
There were several real reasons why people continued to hand-code
scanners despite the presence of lex. However, none of them is really
a *good* reason any more.
The most obvious one was that the original lex had a (well-justified)
reputation for poor performance. Van Jacobson fixed that a decade
ago, although some of the authors of lex clones may not have known
about it.
A less obvious one was that in the days of the PDP11, the size of the
lex tables was a serious problem when trying to squeeze a compiler
into a 16-bit address space. This hasn't been an issue for most
people for a long time, but hassles of this sort contributed to
persistent folklore about lex being unsuited to production
applications.
Non-Unix(TM) C compilers appeared well before non-Unix(TM) lexes, so
in the early days of writing compilers that weren't tied to Unix
licensing, there was a tendency to avoid lex because it often wasn't
available in other environments. That's been fixed by freeware
implementations, notably flex.
Finally, people have a lamentable tendency to treat lex as a panacea
rather than a tool, and then complain and abandon lex when they find
that they can't easily write a one-line pattern for a tricky case like
(the canonical bad example) C comments. A judicious leavening with
bits of C code can deal with this without throwing out the baby with
the bathwater.
A decade or so ago, when I was interested in writing a C compiler, I
used a hand-coded scanner for several reasons... but I also wrote a
lex scanner and used it for checking the correctness of the hand-coded
version, since it was a lot easier to verify the lex version against
the manuals. (In fact, I think I posted the lex version to
comp.compilers...)
--
| Henry Spencer
| he...@zoo.toronto.edu