Hi Haoyi,
On Mon, Apr 29, 2013 at 9:18 PM, Haoyi Li <
haoy...@gmail.com> wrote:
>> string ~ ":" ~ json_exp ~~> ((_, _))
> How is it I do not need to worry about the return value of the ":"? I would
> have expected to have to write:
>
>> string ~ ":" ~ json_exp ~~> (a, b, c) => (a, c)
In parboiled every rule can parse to a String, but only explicitly:
if not specified otherwise, the parser for a string literal is a
`Rule0`, which doesn't produce any output. You can use the `~>`
combinator to create a `Rule1[X]` from the String it has parsed.
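To make the Rule0/Rule1 distinction concrete, here's a toy sketch (this is NOT parboiled's real API, just a self-contained model of the idea; the names `str` and `capture` are mine):

```scala
// Toy model: a Rule0 only matches input, a Rule1 additionally carries a value.
final case class Rule0(run: String => Option[String])            // returns rest of input
final case class Rule1[A](run: String => Option[(A, String)])    // returns value + rest

// A string literal matches without producing a value (parboiled's default).
def str(s: String): Rule0 =
  Rule0(in => if (in.startsWith(s)) Some(in.drop(s.length)) else None)

// Roughly what lifting with `~>` achieves: keep the matched text as a value.
def capture(s: String): Rule1[String] =
  Rule1(in => if (in.startsWith(s)) Some((s, in.drop(s.length))) else None)
```

So `":"` in a rule like `string ~ ":" ~ json_exp` contributes nothing to the value stack, which is why the transformer never sees it.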
> The example I gave didn't really show what I meant (I picked it to be
> simple) so here's another try. This is something that's meant to parse a
> JSON array. Bear with me for the pseudo-scala (a mix between parboiled and
> scala-parser-combinators), since my parser-combinatory skills are rusty:
>
> val array = "[" ~ json_exp ~ rep("," ~ json_exp) ~ opt(space) ~ "]" ~~>
> {case (_, first, rest_pairs, _, _) => first +: rest_pairs.map(_._2)}
>
> The huge converter is required because the parser would naturally return a
>
> (String, JsVal, Seq[(String, JsVal)], String, String)
I understand. As I said above, parboiled chose to parse to Strings
only explicitly, which makes separator strings disappear from
parameter lists.
See
https://github.com/spray/spray-json/blob/master/src/main/scala/spray/json/JsonParser.scala
for a real parboiled json parser.
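To illustrate why the value-less separators never clutter the transformer, here's a self-contained toy (again not parboiled itself; `lit`, `digit`, `dropLeft`, `rep` and `array` are all my own illustrative names) that parses `[1,2,3]` into a `List[Int]` without the `(String, JsVal, Seq[(String, JsVal)], ...)` mess:

```scala
// parser: input => Some((value, remaining input)) on success
type P[A] = String => Option[(A, String)]

// Like a Rule0: matches a literal, yields no useful data (just Unit).
def lit(s: String): P[Unit] =
  in => if (in.startsWith(s)) Some(((), in.drop(s.length))) else None

def digit: P[Int] =
  in => in.headOption.filter(_.isDigit).map(c => (c.asDigit, in.tail))

// Sequencing that drops the separator's Unit, so only real values survive,
// which is the effect of `"," ~ json_exp` in parboiled.
def dropLeft[A](sep: P[Unit], p: P[A]): P[A] =
  in => sep(in).flatMap { case (_, rest) => p(rest) }

// zero-or-more repetition
def rep[A](p: P[A]): P[List[A]] = in =>
  p(in) match {
    case Some((a, rest)) => rep(p)(rest).map { case (as, r) => (a :: as, r) }
    case None            => Some((Nil, in))
  }

// `"[" ~ digit ~ rep("," ~ digit) ~ "]"` -- the brackets and commas
// contribute no values, so the result is simply first :: rest.
def array: P[List[Int]] = in =>
  lit("[")(in).flatMap { case (_, r1) =>
    digit(r1).flatMap { case (first, r2) =>
      rep(dropLeft(lit(","), digit))(r2).flatMap { case (rest, r3) =>
        lit("]")(r3).map { case (_, r4) => (first :: rest, r4) }
      }
    }
  }
```

Only `first` and `rest` carry data, so there is nothing to throw away in a transformer.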
> So I need to do a bunch of messy stuff in order to extract the T and Seq[T]
> from the messy parse tree. Now, if I use name-binding, I am able to convert
> it to:
>
> val array = "[" ~ (json_exp is first) ~ rep("," ~ (json_exp is rest)) ~
> opt(space) ~ "]" ~~> (first +: rest)
I agree, and I think your argument may still hold even with parboiled:
e.g. if the function you pass to `~~>` doesn't match the order of the
values you have parsed before, the syntax you propose would make
things more readable. My point was that you can already improve the
syntax considerably over standard Scala parser combinators without
having to resort to macros.
> Does this make any sense? It would need to bind new names within the
> parser-expression which are available within the scope of the
> transformer-function, something which AFAIK is impossible without macros.
I think it could make sense. Something related was considered to
improve the syntax of sbt setting definitions. As I said, another way
to bind new names without macros is using monadic style (which
sometimes won't look as clean as a possible macro-based custom syntax).
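For completeness, here's what I mean by monadic style, as a minimal self-contained sketch (a toy `Parser` monad of my own, not any real library): `flatMap`/`map` let you give each parsed value a name in a for-comprehension, which is the effect your proposed `is` syntax is after, just more verbosely.

```scala
// Minimal monadic parser: run: input => Some((value, remaining input))
final case class Parser[A](run: String => Option[(A, String)]) {
  def flatMap[B](f: A => Parser[B]): Parser[B] =
    Parser(in => run(in).flatMap { case (a, rest) => f(a).run(rest) })
  def map[B](f: A => B): Parser[B] =
    Parser(in => run(in).map { case (a, rest) => (f(a), rest) })
}

def token(s: String): Parser[String] =
  Parser(in => if (in.startsWith(s)) Some((s, in.drop(s.length))) else None)

def number: Parser[Int] = Parser { in =>
  val digits = in.takeWhile(_.isDigit)
  if (digits.nonEmpty) Some((digits.toInt, in.drop(digits.length))) else None
}

// `first` and `second` are bound by name; separators are bound to `_`
// and simply ignored -- no positional tuple juggling in the result.
val pair: Parser[(Int, Int)] =
  for {
    _      <- token("(")
    first  <- number
    _      <- token(",")
    second <- number
    _      <- token(")")
  } yield (first, second)
```

The downside, as noted, is that the monadic version sequentializes the grammar and reads less like a declarative rule than either parboiled's DSL or a macro-based syntax would.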