I understand your counterexample, but I don't think it applies on the code generation side of things.
I think expecting case class semantics on symbols and types in the context of treehugger is reasonable, i.e.:
RootClass.newClass("aaa") == RootClass.newClass("aaa")
//> res13: Boolean = false
I find this pretty surprising. I understand that once the compiler sees the aaa symbol its interpretation is context dependent, but that's not the case on the code generation side. There we're using symbols as pumped-up strings: if we want them to represent a type we make a type out of them, i.e. TYPE_REF("aaa"), and if we want them to represent a variable we make a ref out of them: REF("aaa"). So I think it's pretty realistic to expect value semantics from these things.
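To make the distinction concrete, here's a minimal sketch (using hypothetical `PlainSym`/`CaseSym` classes, not treehugger's actual symbol types) of reference equality versus the case-class value semantics I'd expect:

```scala
// A plain class compares by reference identity, like compiler symbols do:
class PlainSym(val name: String)
val plainEqual = new PlainSym("aaa") == new PlainSym("aaa")  // false

// A case class compares structurally, which is what you'd want
// when symbols are really just pumped-up strings:
case class CaseSym(name: String)
val caseEqual = CaseSym("aaa") == CaseSym("aaa")  // true
```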
I already went for the 'self' symbol table approach, but it imposes some cost and complexity on my code; for example, I just hit a failing test with the following message:
java.lang.IllegalArgumentException: requirement failed: Iterable[com.actimize.bs.BeanCaseClass] == Iterable[com.actimize.bs.BeanCaseClass]
I think you'd agree with me that this is surprising at first, and irritating once you understand what's going on :-)
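For reference, the 'self' symbol table approach boils down to interning: look every name up in a single map so repeated requests return the same instance, making reference equality hold. A hypothetical sketch (the `Sym`/`SymbolTable` names are mine, not treehugger's):

```scala
import scala.collection.mutable

// Identity-based symbol, standing in for a compiler-style symbol.
class Sym(val name: String)

// Intern symbols by name: the same name always yields the same instance,
// so reference equality (eq) behaves like value equality.
class SymbolTable {
  private val table = mutable.Map.empty[String, Sym]
  def newClass(name: String): Sym =
    table.getOrElseUpdate(name, new Sym(name))
}

val st = new SymbolTable
val interned = st.newClass("aaa") eq st.newClass("aaa")  // true: same instance
```

This works, but it means threading the table through every piece of code that mints a symbol, which is exactly the cost I was referring to.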