Re: tables with more than 22 columns?

Stefan Zeiger Jan 10, 2013 9:10 AM
Posted in group: Slick / ScalaQuery
On 2012-12-30 23:40, Michael Slinn wrote:
> Here is my latest attempt, complete with error message: https://gist.github.com/4415732
>
> Instead of using nested tuples, I defined two helper classes. Now I have two problems:
>   1. How to define TypeMappers for the helper classes

You can't. TypeMappers are for columns only.
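A TypeMapper in 1.0 maps a single column's Scala type to a single database type, roughly like this (a hypothetical Boolean-stored-as-INT mapping, just to show the shape):

    import scala.slick.lifted.MappedTypeMapper

    // Hypothetical column-level mapping: store a Boolean column as an INT
    implicit val boolAsInt =
      MappedTypeMapper.base[Boolean, Int](b => if (b) 1 else 0, i => i != 0)

There is no equivalent mechanism for mapping a case class like your helper classes onto several separate columns.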

This works in 1.0:

    // 2 classes for the nested structure
    case class Part(i1: Int, i2: Int, i3: Int, i4: Int, i5: Int, i6: Int)
    case class Whole(id: Int, p1: Part, p2: Part, p3: Part, p4: Part)

    // Note that it's a Table[Int] -- we only map the primary key in *
    object T extends Table[Int]("t_wide") {
      def id = column[Int]("id", O.PrimaryKey)
      def p1i1 = column[Int]("p1i1")
      def p1i2 = column[Int]("p1i2")
      def p1i3 = column[Int]("p1i3")
      def p1i4 = column[Int]("p1i4")
      def p1i5 = column[Int]("p1i5")
      def p1i6 = column[Int]("p1i6")
      def p2i1 = column[Int]("p2i1")
      def p2i2 = column[Int]("p2i2")
      def p2i3 = column[Int]("p2i3")
      def p2i4 = column[Int]("p2i4")
      def p2i5 = column[Int]("p2i5")
      def p2i6 = column[Int]("p2i6")
      def p3i1 = column[Int]("p3i1")
      def p3i2 = column[Int]("p3i2")
      def p3i3 = column[Int]("p3i3")
      def p3i4 = column[Int]("p3i4")
      def p3i5 = column[Int]("p3i5")
      def p3i6 = column[Int]("p3i6")
      def p4i1 = column[Int]("p4i1")
      def p4i2 = column[Int]("p4i2")
      def p4i3 = column[Int]("p4i3")
      def p4i4 = column[Int]("p4i4")
      def p4i5 = column[Int]("p4i5")
      def p4i6 = column[Int]("p4i6")
      // This is just the default projection -- It doesn't have to contain all columns
      def * = id
      // Instead, we use nested tuples for a full projection:
      def all = (
        id,
        (p1i1, p1i2, p1i3, p1i4, p1i5, p1i6),
        (p2i1, p2i2, p2i3, p2i4, p2i5, p2i6),
        (p3i1, p3i2, p3i3, p3i4, p3i5, p3i6),
        (p4i1, p4i2, p4i3, p4i4, p4i5, p4i6)
      )
      // And override create_* to get the DDL for all columns.
      // Yeah, this is ugly. It used to be much simpler in ScalaQuery.
      // We can add a helper method to simplify it.
      override def create_* =
        all.shaped.packedNode.collect {
          case Select(Ref(IntrinsicSymbol(in)), f: FieldSymbol) if in == this => f
        }.toSeq.distinct
    }

    T.ddl.create
    // Insert into T.all. The extra ".shaped" call is needed because we cannot
    // get the types in an implicit conversion due to SI-3346
    T.all.shaped.insert(
      0,
      (11, 12, 13, 14, 15, 16),
      (21, 22, 23, 24, 25, 26),
      (31, 32, 33, 34, 35, 36),
      (41, 42, 43, 44, 45, 46)
    )

    // Get the nested tuples in a query
    val q1 = T.map(_.all)
    println(q1.first)

    // Map the result to the case classes
    val i2 = q1.mapResult { case (id, p1, p2, p3, p4) =>
      Whole(id, Part.tupled.apply(p1), Part.tupled.apply(p2), Part.tupled.apply(p3), Part.tupled.apply(p4))
    }
    println(i2.first)
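
Going the other way, from the case classes back into an insert, currently has to be done by hand. A rough sketch, with hypothetical helpers toTuple and insertWhole:

    // Flatten the case classes back into the nested tuples that
    // T.all.shaped expects (same shape as the insert above)
    def toTuple(p: Part) = (p.i1, p.i2, p.i3, p.i4, p.i5, p.i6)
    def insertWhole(w: Whole) =
      T.all.shaped.insert(w.id, toTuple(w.p1), toTuple(w.p2), toTuple(w.p3), toTuple(w.p4))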

What you still cannot do is define a bidirectional mapping with <> over a shaped value. That would allow us to define a Table[Whole] and do inserts directly from the mapped types. There is an existing ticket for this feature: https://github.com/slick/slick/issues/40
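
For comparison, this is the <> mapping that does work today on a flat projection of up to 22 columns (a hypothetical two-column table, only to show what issue 40 would enable for the nested case):

    case class User(id: Int, name: String)

    object Users extends Table[User]("users") {
      def id = column[Int]("id", O.PrimaryKey)
      def name = column[String]("name")
      // <> makes this a Table[User], so Users.insert(User(...)) works directly
      def * = id ~ name <> (User.apply _, User.unapply _)
    }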

--
Stefan Zeiger
Typesafe - The software stack for applications that scale
Twitter: @StefanZeiger