tables with more than 22 columns?


george

Dec 9, 2010, 9:07:58 AM
to ScalaQuery
Hi,

I'm just starting out with ScalaQuery and I ran into a problem because
my table has more than 22 columns, so I can't list them all in the
tuple needed to define the table object.

How do I work with tables where there are more than 22 columns?

Thanks

George

Stefan Zeiger

Dec 10, 2010, 12:49:41 PM
to scala...@googlegroups.com

A few choices:

- If the tables already exist and you only need a subset of up to 22
columns, just leave the other columns out.

- If the tables already exist and you need more than 22 columns for your
queries, don't include all columns in the "*" projection and the table's
type parameter. You can still use all columns in queries as long as each
individual projection contains fewer than 22 columns. You shouldn't run
into any problems as long as you don't pass a complete table out of a
subquery, because that only passes the columns from * in the generated
SQL code. If you need to pass other columns out of a subquery, construct
an explicit projection.

- Additionally, if you want to create the tables with ScalaQuery,
override AbstractTable.create_* to return an Iterable of all
NamedColumns that should be created. The default implementation of
create_* takes the columns from *. You can also override create_* if
your table contains DBMS-specific special columns like ROWID which are
always present and must not be created explicitly.

-sz

Michael Slinn

Dec 26, 2012, 10:24:55 AM
to scala...@googlegroups.com, szei...@googlemail.com
Does this limitation still exist? I'm looking at an existing table with >22 columns, and I need to write something that allows users to edit each field of each record.

Mike

Jason Schmitt

Dec 27, 2012, 9:46:59 AM
to scala...@googlegroups.com, szei...@googlemail.com
We are looking into this as well. The limitation comes from Scala: you can't have a tuple larger than 22 elements, and the * projection uses tuples to match up with the columns.
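
Jason's point about the limit can be seen in plain Scala, independent of ScalaQuery or Slick. This is a minimal, library-free sketch of the nested-tuple workaround; the values are illustrative:

```scala
// Scala's standard library stops at Tuple22, so a flat row of more than 22
// columns cannot be a single tuple:
//   val flat = (c1, c2, ..., c23)   // does not compile: there is no Tuple23
// Nesting keeps every individual tuple at 22 elements or fewer while still
// carrying all the values; this is the same trick nested projections use.
val row = (
  (1, 2, 3, 4, 5, 6, 7, 8),
  (9, 10, 11, 12, 13, 14, 15, 16),
  (17, 18, 19, 20, 21, 22, 23, 24)
)

// All 24 values stay reachable through the nesting:
val first = row._1._1
val last  = row._3._8
val total = (row._1.productIterator ++ row._2.productIterator ++ row._3.productIterator)
  .map(_.asInstanceOf[Int])
  .sum
```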

I would like to know more about the second suggestion, which says to construct an explicit projection. I'm not exactly sure how that is done.

Jason

Jason Schmitt

Dec 27, 2012, 5:22:49 PM
to Mike Slinn, scala...@googlegroups.com, szei...@googlemail.com
I found this:


I haven't had a chance to test it out. We may end up restructuring our tables so they don't need so many columns.


Jason


On Thu, Dec 27, 2012 at 9:30 AM, Mike Slinn <msl...@gmail.com> wrote:
Apparently Slick now has support for nested tuples, but I have not seen a code example of how to use them with Slick. The second file of this gist (PaypalTransaction.scala) is a partial step towards an example.

https://gist.github.com/4382183

Mike

virtualeyes

Dec 27, 2012, 6:10:43 PM
to scala...@googlegroups.com, Mike Slinn, szei...@googlemail.com
Given the 22-tuple limit, normalizing tables can be a good call, particularly when working with web frameworks (like Play) that support nested forms/validation.

For example, I inherited a client site where customer membership, address, payment and login details were all globbed into a single table. It worked; the original site had simple requirements and no real need to normalize the data. The new site required additional functionality (hierarchical memberships), and the 22-tuple limit had been exceeded anyway, so there was no choice but to break the data out into separate tables.

In the end it worked out really well, but YMMV; in some cases the mega table makes sense, you just have to find a way around the 22-tuple limit...

Mike Slinn

Dec 27, 2012, 6:13:28 PM
to virtualeyes, scala...@googlegroups.com, szei...@googlemail.com
Noah,

While I agree with you for the general case, in this case the best way
forward would be to figure out how to get >22 columns to persist into a
table with Slick.

Thanks,

Mike

virtualeyes

Dec 27, 2012, 7:21:29 PM
to scala...@googlegroups.com, virtualeyes, szei...@googlemail.com
Have not checked, but wasn't there a push to raise the 22-tuple limit in the to-be-released Scala 2.10?

This topic has come up previously; search the group archive. There were a couple of workarounds as I recall, though none of them provided what I was looking for.

Good luck; the limitation is a hassle, and the enterprise world would likely enjoy a solution...

Simon Ochsenreither

Dec 27, 2012, 8:37:16 PM
to scala...@googlegroups.com, virtualeyes, szei...@googlemail.com
Wouldn't it work to define the required (extension) methods/TupleXX types yourself?

Michael Slinn

Dec 30, 2012, 5:40:39 PM
to scala...@googlegroups.com, virtualeyes, szei...@googlemail.com
Here is my latest attempt, complete with error message: https://gist.github.com/4415732

Instead of using nested tuples, I defined two helper classes. Now I have two problems:
  1. How to define TypeMappers for the helper classes
  2. Dealing with <> overloading
Mike

Noah Nordrum

Dec 31, 2012, 1:03:21 PM
to scala...@googlegroups.com, virtualeyes, szei...@googlemail.com
I could be wrong, but it looks like TypeMappers are (currently) restricted to a single column.

trait TypeMapperDelegate[T] { self =>
  /**
   * A zero value for the type. This is used as a default instead of NULL when
   * used as a non-nullable column.
   */
  def zero: T
  /**
   * The constant from java.sql.Types that is used for setting parameters of
   * the type to NULL.
   */
  def sqlType: Int
  /**
   * The default name for the SQL type that is used for column declarations.
   */
  def sqlTypeName: String
  /**
   * Set a parameter of the type.
   */
  def setValue(v: T, p: PositionedParameters): Unit
  /**
   * Set an Option parameter of the type.
   */
  def setOption(v: Option[T], p: PositionedParameters): Unit
  // (the trait's remaining value-reading methods are elided here)
}

However, I'm not sure yet exactly what sqlType/sqlTypeName are used for. sqlType looks like it would be ignored if you manually created your own schema (I'm guessing here based on a quick scan), and sqlTypeName seems to be used in a ridiculously long match statement (specifically, in "case a @ Library.Cast(ch @ _*) =>").

Maybe somebody can expand on this further, but I'm guessing you could (un)marshal multiple columns in the set/next methods. However, if that works, it's clearly a fortuitous side-effect, so who knows when it would stop working...

I think the correct answer is to get TypeMappers officially supporting mapping to/from n columns.

Stefan Zeiger

Jan 10, 2013, 12:10:33 PM
to scala...@googlegroups.com
On 2012-12-30 23:40, Michael Slinn wrote:
Here is my latest attempt, complete with error message: https://gist.github.com/4415732

Instead of using nested tuples, I defined two helper classes. Now I have two problems:
  1. How to define TypeMappers for the helper classes

You can't. TypeMappers are for columns only.

This works in 1.0:

    // 2 classes for the nested structure
    case class Part(i1: Int, i2: Int, i3: Int, i4: Int, i5: Int, i6: Int)
    case class Whole(id: Int, p1: Part, p2: Part, p3: Part, p4: Part)

    // Note that it's a Table[Int] -- we only map the primary key in *
    object T extends Table[Int]("t_wide") {
      def id = column[Int]("id", O.PrimaryKey)
      def p1i1 = column[Int]("p1i1")
      def p1i2 = column[Int]("p1i2")
      def p1i3 = column[Int]("p1i3")
      def p1i4 = column[Int]("p1i4")
      def p1i5 = column[Int]("p1i5")
      def p1i6 = column[Int]("p1i6")
      def p2i1 = column[Int]("p2i1")
      def p2i2 = column[Int]("p2i2")
      def p2i3 = column[Int]("p2i3")
      def p2i4 = column[Int]("p2i4")
      def p2i5 = column[Int]("p2i5")
      def p2i6 = column[Int]("p2i6")
      def p3i1 = column[Int]("p3i1")
      def p3i2 = column[Int]("p3i2")
      def p3i3 = column[Int]("p3i3")
      def p3i4 = column[Int]("p3i4")
      def p3i5 = column[Int]("p3i5")
      def p3i6 = column[Int]("p3i6")
      def p4i1 = column[Int]("p4i1")
      def p4i2 = column[Int]("p4i2")
      def p4i3 = column[Int]("p4i3")
      def p4i4 = column[Int]("p4i4")
      def p4i5 = column[Int]("p4i5")
      def p4i6 = column[Int]("p4i6")
      // This is just the default projection -- It doesn't have to contain all columns
      def * = id
      // Instead, we use nested tuples for a full projection:
      def all = (
        id,
        (p1i1, p1i2, p1i3, p1i4, p1i5, p1i6),
        (p2i1, p2i2, p2i3, p2i4, p2i5, p2i6),
        (p3i1, p3i2, p3i3, p3i4, p3i5, p3i6),
        (p4i1, p4i2, p4i3, p4i4, p4i5, p4i6)
      )
      // And override create_* to get the DDL for all columns.
      // Yeah, this is ugly. It used to be much simpler in ScalaQuery.
      // We can add a helper method to simplify it.
      override def create_* =
        all.shaped.packedNode.collect {
          case Select(Ref(IntrinsicSymbol(in)), f: FieldSymbol) if in == this => f
        }.toSeq.distinct
    }

    T.ddl.create
    // Insert into T.all. The extra ".shaped" call is needed because we cannot
    // get the types in an implicit conversion due to SI-3346
    T.all.shaped.insert(
      0,
      (11, 12, 13, 14, 15, 16),
      (21, 22, 23, 24, 25, 26),
      (31, 32, 33, 34, 35, 36),
      (41, 42, 43, 44, 45, 46)
    )

    // Get the nested tuples in a query
    val q1 = T.map(_.all)
    println(q1.first)

    // Map the result to the case classes
    val i2 = q1.mapResult { case (id, p1, p2, p3, p4) =>
      Whole(id, Part.tupled.apply(p1), Part.tupled.apply(p2), Part.tupled.apply(p3), Part.tupled.apply(p4))
    }
    println(i2.first)

What you still cannot do is define a bidirectional mapping with <> over a shaped value. That would allow us to define a Table[Whole] and do inserts from the mapped types. There is an existing ticket for this feature: https://github.com/slick/slick/issues/40

--
Stefan Zeiger
Typesafe - The software stack for applications that scale
Twitter: @StefanZeiger

Simon Ochsenreither

Jan 11, 2013, 10:18:13 AM
to scala...@googlegroups.com
Considering that this comes up every few weeks, wouldn't it make sense to file an issue against Scala to increase 22 to something larger, e.g. 255?

Stefan Zeiger

Jan 11, 2013, 10:30:47 AM
to scala...@googlegroups.com
On 2013-01-11 16:18, Simon Ochsenreither wrote:
Considering that this comes up every few weeks, wouldn't it make sense to file an issue against Scala to increase 22 to something larger, e.g. 255?

This is already on the table and we want to look into it for Scala 2.11. Increasing the limit to 255 would add a huge amount of overhead (tuples, products, functions, etc.). What we need is a way to get rid of the limit entirely and allow abstracting over arity without negatively affecting code size or performance (and we already have some ideas how to accomplish that goal).

Michael Slinn

Jan 12, 2013, 8:01:28 AM
to scala...@googlegroups.com
Stefan,

There are two issues here:
  1. Your pseudo-code would be much more useful if it were a standalone working code example that left less to the imagination
  2. A second Slick code example needs to also incorporate a Play form because these two products have yet to be integrated. To see what I mean, look at this posting from the Play mailing list. I will post this message on the Play mailing list so the Play developers are aware also.

Thanks,

Mike

Lukas Eder

Jan 12, 2013, 1:50:35 PM
to scala...@googlegroups.com

This is already on the table and we want to look into it for Scala 2.11. Increasing the limit to 255 would add a huge amount of overhead (tuples, products, functions, etc.). What we need is a way to get rid of the limit entirely and allow abstracting over arity without negatively affecting code size or performance (and we already have some ideas how to accomplish that goal).

That sounds quite interesting. Can you already hint at how those ideas will work, roughly? Has there been a preliminary publication by the Scala folks about these ideas?

Cheers
Lukas

Stefan Zeiger

Jan 14, 2013, 7:49:34 AM
to scala...@googlegroups.com
On 2013-01-12 14:01, Michael Slinn wrote:
Stefan,

There are two issues here:
  1. Your pseudo-code would be much more useful if it were a standalone working code example that left less to the imagination

The code I posted compiles and runs. Just add some imports (the obvious ones plus scala.slick.ast.Util._) and a database connection. We can put this into slick-examples when the create_* isn't needed anymore. That could be a feature of Slick 1.1.


  2. A second Slick code example needs to also incorporate a Play form because these two products have yet to be integrated. To see what I mean, look at this posting from the Play mailing list. I will post this message on the Play mailing list so the Play developers are aware also.

Thanks. We're working on a sample app for Play + Slick, but this is driven by the Play guys at the moment. My experience with Play is pretty much zero.

Stefan Zeiger

Jan 14, 2013, 7:52:57 AM
to scala...@googlegroups.com
On 2013-01-12 19:50, Lukas Eder wrote:

This is already on the table and we want to look into it for Scala 2.11. Increasing the limit to 255 would add a huge amount of overhead (tuples, products, functions, etc.). What we need is a way to get rid of the limit entirely and allow abstracting over arity without negatively affecting code size or performance (and we already have some ideas how to accomplish that goal).

That sounds quite interesting. Can you already hint at how those ideas will work, roughly? Has there been a preliminary publication by the Scala folks about these ideas?

Not all ideas that I've seen have been made public yet. A new discussion just started here: http://groups.google.com/group/scala-internals/browse_thread/thread/a8ccb6ce74944e7e

Stefan Zeiger

Jan 14, 2013, 9:15:52 AM
to scala...@googlegroups.com
On 2013-01-14 13:49, Stefan Zeiger wrote:
The code I posted compiles and runs. Just add some imports (the obvious ones plus scala.slick.ast.Util._) and a database connection.

Mike Slinn

Jan 14, 2013, 9:52:29 AM
to scala...@googlegroups.com
Excellent, thank you!

Mike

Mike Slinn

Jan 14, 2013, 8:45:19 AM
to scala...@googlegroups.com

> The code I posted compiles and runs. Just add some imports (the
> obvious ones plus scala.slick.ast.Util._) and a database connection.
> We can put this into slick-examples when the create_* isn't needed
> anymore. That could be a feature of Slick 1.1.

Why not post what you have as a complete working project to GitHub now?

Mike

Daniel Krieg

Feb 19, 2013, 10:04:43 PM
to scala...@googlegroups.com
I am not sure if this thread is still active, but I was wondering if anyone has considered using an HList (heterogeneous list) rather than tuples? I am not sure what additional overhead this would pose, but it would eliminate the 22-column issue.
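
The HList idea fits in a few lines of plain Scala. This is a hypothetical sketch of the data structure, not Slick's or shapeless' implementation:

```scala
// A minimal heterogeneous list: a cons cell plus a terminator. Because the
// structure nests a two-field cell, its length is unbounded; there is no
// 22-element ceiling like with TupleN.
sealed trait HList {
  def length: Int
}
final case class HCons[H, T <: HList](head: H, tail: T) extends HList {
  def ::[X](x: X): HCons[X, HCons[H, T]] = HCons(x, this)
  def length: Int = 1 + tail.length
}
sealed trait HNil extends HList {
  def ::[X](x: X): HCons[X, HNil] = HCons(x, this)
  def length: Int = 0
}
case object HNil extends HNil

// Heterogeneous: each element keeps its own static type.
val hrow = 42 :: "george" :: true :: HNil
val hid: Int = hrow.head
val hname: String = hrow.tail.head
```

The trade-off, compared with tuples, is that columns are reached positionally (head, tail.head, ...) rather than by name.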

Simon Ochsenreither

Feb 25, 2013, 1:01:35 AM
to scala...@googlegroups.com
I filed a bug some time ago (https://issues.scala-lang.org/browse/SI-7099) and linked to past discussions. I think HLists were one of the ideas mentioned. I hope this helps a bit in figuring out the current state.

Jason Giedymin

Mar 3, 2013, 12:14:28 AM
to scala...@googlegroups.com
+1

Idan Waisman

May 3, 2013, 12:55:20 PM
to scala...@googlegroups.com
If I were to use this method of mapping a wide table, how would I use it as if it were a fully mapped table? For example, how could I create a foreign key to it from another table? In this example's case, how would I do something like:

case class Other(id: Option[Int], wholeId: Int)
object Others extends Table[Other]("whatever") {
  def id = column[Int]("id", O.PrimaryKey)
  def wholeId = column[Int]("wholeId")

  def * = id.? ~ wholeId <> (Other, Other.unapply _)

  def whole = foreignKey("fk_whatever", wholeId, T)(_.id) // This won't work! (What would?)
}

val joinQuery = for{
  p1i1 <- Parameters[Int]
  other <- Others 
  whole <- other.whole if (whole.p1i1 === p1i1)
} yield(other, whole)

Since foreignKey requires a subclass of TableNode, I am not sure how to give it something that will satisfy us both. Even if I eschew the foreign key mapping and just do an explicit join, I still wouldn't know how to make use of a particular projection rather than the table itself.

Am I just missing something basic?

Thanks for any assistance!
Idan

John Ky

Nov 13, 2013, 11:15:36 PM
to scala...@googlegroups.com
Hi Stefan,

Have you got an example for update?

Cheers,

-John

Jacob Lyles

Jan 16, 2014, 3:49:57 PM
to scala...@googlegroups.com
Hi, 

I am running into this exact same problem. Is there a solution for it?

- Jacob

Naftoli Gugenheim

Jan 19, 2014, 3:46:44 AM
to scala...@googlegroups.com
Yes, 2.0 has HLists. (Unfortunately I don't know more at the moment, but hopefully that will give you something to search for.)


--
 
---
You received this message because you are subscribed to the Google Groups "Slick / ScalaQuery" group.
To unsubscribe from this group and stop receiving emails from it, send an email to scalaquery+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/scalaquery/76bf53d9-8368-45ef-ae5f-7829a19a4470%40googlegroups.com.

For more options, visit https://groups.google.com/groups/opt_out.

John Ky

Jan 19, 2014, 5:09:25 PM
to scala...@googlegroups.com
Hi Jacob,

The approach suggested by Stefan Zeiger earlier in the thread worked for me.

Native support for more than 22 columns would be much better, though.

Cheers,

-John







Anthony Wong

Jan 20, 2014, 6:36:41 AM
to scala...@googlegroups.com

Sven Pedersen

Jan 20, 2014, 7:42:15 PM
to scala...@googlegroups.com
Beware that the current testing release (RC2) of Slick 2.0 has an HList implementation that exposes a weakness in the Scala 2.10 compiler.  It is fixed in the repository and should be usable in the 2.0 final release. 

I tried to use it for > 22 columns and had compiler trouble (very long to infinite compile time and memory use). You will not be able to access columns by name, but that might not be an issue for you.
--Sven

Stefan Zeiger

Jan 21, 2014, 7:05:03 AM
to scala...@googlegroups.com
On 2014-01-16 21:49, Jacob Lyles wrote:
Hi, 

I am running into this exact same problem. Is there a solution for it?

- Jacob

On Wednesday, November 13, 2013 8:15:36 PM UTC-8, John Ky wrote:
Hi Stefan,

Have you got an example for update?

There shouldn't be any specific issues with updates. It works the same as any other operation once the mappings are defined. For example, extending JdbcMapperTest.testWideMappedEntity:

    val oData = Whole(0,
      Part(11, 12, 13, 14, 15, 16),
      Part(21, 22, 23, 24, 25, 26),
      Part(31, 32, 33, 34, 35, 36),
      Part(41, 42, 43, 44, 45, 46)
    )
    val oData2 = Whole(10,
      Part(111, 12, 13, 14, 15, 16),
      Part(121, 22, 23, 24, 25, 26),
      Part(131, 32, 33, 34, 35, 36),
      Part(141, 42, 43, 44, 45, 46)
    )

    ts.ddl.create

    ts.insert(oData)
    assertEquals(oData, ts.first)

    ts.filter(_.p1i2 === 12).update(oData2)
    assertEquals(oData2, ts.first)



--
Stefan Zeiger
Slick Tech Lead
Typesafe - Build Reactive Apps!
Twitter: @StefanZeiger

Michael Slinn

Jan 21, 2014, 10:48:12 AM
to scala...@googlegroups.com
A complete working example available in GitHub would be most helpful.

Mike

Christopher Vogt

Jan 22, 2014, 3:51:22 AM
to scala...@googlegroups.com

Michael Slinn

Jan 22, 2014, 9:26:45 AM
to scala...@googlegroups.com
Thank you, that is indeed helpful. Comments would make the code example even better. If all CRUD operations and any issues specific to using wide mapped entities were explored, this code example would be complete.

Has this code example been tested against Postgres?

Thank you,

Mike

Christopher Vogt

Jan 22, 2014, 9:36:02 AM
to scala...@googlegroups.com
This code is an excerpt from our tests.

> Thank you, that is indeed helpful. Comments would make the code example
> even better. If all CRUD operations and any issues specific to using
> wide mapped entities were explored then this code example would be complete.

Accepting pull requests, if something is not covered by the tests :).

> Has this code example been tested against Postgres?

Yes. We run all our tests against all supported backends.

Michael Slinn

Jan 22, 2014, 9:54:20 AM
to scala...@googlegroups.com
Jan, 

I have wasted untold hours due to poor Slick 1.x docs and spotty testing. For a product to be adopted, the primary developers need to write usable and complete docs, provide decent and comprehensive sample code, and test the product against common use cases and deployment scenarios. No user is going to care more about a product than the primary development team does. If the primary development team cannot be bothered to do this work, then no one else will either. If Slick is to succeed, the primary developers must think and act in ways that encourage adoption. Look at Akka for a shining example.

Mike

Christopher Vogt

Jan 23, 2014, 5:42:55 AM
to scala...@googlegroups.com
Mike,

> In order for a product to be adopted the primary developers
> need to write usable and complete docs, provide decent and comprehensive
> sample code and test the product against common use cases and deployment
> scenarios.

(you make it sound different, but) we are aware of this and are constantly
working on improving the user experience of Slick. However, not everything
happens overnight, and we need to prioritize tasks and work with the
resources we have. Slick is in active development and there are certainly
things to be desired. FYI, improving documentation is high on our priority
list right now.

For the case of this particular thread, there is nothing special about
querying, inserting or updating nested tuples compared to other values.
It just works as described in the documentation for querying, inserting
and updating.

Chris

Khaled Saleh

Jul 23, 2014, 12:52:30 PM
to scala...@googlegroups.com
Found this code from Daniel Frei:

case class Person(name: Name, address: Address)
case class Name(given: String, family: String)
case class Address(street: String, city: String)

def * = (name, address) <> (Person.tupled, Person.unapply)
def name = (givenName, familyName) <> (Name.tupled, Name.unapply)
def address = (street, city) <> (Address.tupled, Address.unapply)

I modified the code by having 10 fields in each case class, and I was able to have a table with 30 columns.
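
The nested-mapping idea can be shown in plain Scala, without Slick: each case class round-trips through a small tuple, so no single tuple needs more than 22 elements even though the flattened row is wide. The names and values below are illustrative, not from the thread:

```scala
case class Name(given: String, family: String)
case class Address(street: String, city: String)
case class Person(name: Name, address: Address)

// Building the nested value from per-group tuples, the direction a
// bidirectional mapping uses when reading rows:
val person = Person(
  (Name.apply _).tupled(("Ada", "Lovelace")),
  (Address.apply _).tupled(("12 Example St", "London"))
)

// Flattening back out into the per-group pieces, the direction used for
// inserts and updates:
val Person(personName, personAddress) = person
```

With 10 fields per case class, three groups cover a 30-column table, which matches the result described above.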

-khaled

Mike Slinn

Jul 23, 2014, 12:58:56 PM
to scala...@googlegroups.com
That is a very nice example. It seems Slick's HList can also handle records with more than 22 fields, although I have not tried it.

The problem that I and many others face is that Play does not know about
Slick's HList, and Play uses 4 fields for housekeeping for every HTTP
form, so Play is still limited to 18 fields per form.

No product is an island :)

Mike

virtualeyes

Jul 24, 2014, 6:41:35 AM
to scala...@googlegroups.com
> and Play uses 4 fields for housekeeping for every HTTP form, so Play is still limited to 18 fields per form

Right, per form. So why not use Play's nested form support in the same way that the above example uses embedded case classes? You could have 324 form fields (18 * 18) if you really needed them...

Mike Slinn

Jul 24, 2014, 10:24:18 AM
to scala...@googlegroups.com
Noah,

I did not know about that! I wonder if there is an example of Play +
Slick that demonstrates this? Seems straightforward enough, though...

Thanks,

Mike

neverminder

Sep 7, 2016, 6:20:04 AM
to Slick / ScalaQuery
So it's been two more years, and I'd like to distil the question a little: is it possible, as of the latest Slick version (3.1.1), to insert/update when a table has more than 22 columns AND the table classes are generated with the out-of-the-box Slick code generator? Just to clarify: I have to deal with databases reaching hundreds of tables with hundreds of columns, and I have no control over the schema. It is therefore obvious that no manual customisation is made to Tables.scala and that the code generator is used.