DDL Generation


pha...@dtc.com.au

May 7, 2013, 10:15:27 AM
to mapp...@googlegroups.com
Does mapperdao include functionality for generating DDL to create tables etc. from entity definitions?

If not, is there a recommended library or do most people just do it manually?

Is there any built in support for database schema evolution?

Konstantinos Kougios

May 7, 2013, 10:22:06 AM
to mapp...@googlegroups.com
Hi,

Not right now. All DDL and schema evolution have to be done
externally; I am not sure schema evolution can be auto-generated.
Currently both are done manually. Of course, for schema evolution you
can use your favorite tool.

Cheers

pha...@dtc.com.au

May 7, 2013, 10:43:51 AM
to mapp...@googlegroups.com, kostas....@googlemail.com
We're currently looking at moving our architecture framework from Java with a custom ORM 
developed approximately 10 years ago to Scala with an industrial strength external ORM.

Doing some proof of concept work and started with Hibernate.  Then took a brief look at Slick,
now looking at both mapperdao and Activate.  I'm only just learning Scala but the language is
great and each of the ORM / persistence libraries has some fantastic stuff.  Of course, none
of them do everything I want :-)

Some of these other frameworks support DDL generation.  Can you recommend any framework
or tool that plays nicely with mapperdao?  e.g., Use mapperdao for ORM and Slick for generating
DDL.  Activate also seems to have some level of support for generating DDL and managing schema
evolution.

MANY THANKS
Peter

Tim Pigden

May 7, 2013, 11:08:24 AM
to mapp...@googlegroups.com
Peter,
I'm not convinced you'd gain from having separate DDL generation. Would all the naming conventions be the same? If you just use mapperdao you'll end up specifying everything twice (DDL + Scala), but with something like Slick added in you'll specify it twice in Scala and still have to fix up the DDL.
If I had a lot of tables to do, I'd probably use mapperdao plus some sort of UML or database tool to manage it, with the benefit that you've got all those pretty diagrams for your documentation.
Tim
--
Tim Pigden
Optrak Distribution Software Limited
+44 (0)1992 517100
http://www.linkedin.com/in/timpigden
http://optrak.com

pha...@dtc.com.au

May 7, 2013, 11:27:32 AM
to mapp...@googlegroups.com
Hi Tim,

I suspect mixing frameworks will indeed make it a bit messy.  I'll try a few things out on the proof of concept code 
in the lab tomorrow (wee hours of the morning here now).

The main reason for wanting DDL generation in the framework is that our main product is a stand-alone Java application 
that runs on multiple JDBC databases.  So if we update the database schema, each installation must automatically update
and the database product may be different.  Our current custom ORM handles this automatically for us.

I could, for example, use Hibernate annotations on our entity classes and use Hibernate to generate DDL.  I've only played 
with this briefly and I'm not sure how far I can push it.  I don't think the annotations would get in the way of mapperdao, and it 
wouldn't require too much duplication.  We would need to be careful with the approaches used for auto-generated keys.
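As a rough sketch of that idea (hedged: this assumes the Hibernate 4-era SchemaExport API, and the Product entity is purely illustrative, not one of our classes):

```scala
// Sketch only: JPA annotations on an entity, Hibernate's SchemaExport for DDL.
// Assumes Hibernate 3/4-style hbm2ddl API; not tested against mapperdao.
import javax.persistence.{Entity, GeneratedValue, Id}
import org.hibernate.cfg.Configuration
import org.hibernate.tool.hbm2ddl.SchemaExport

@Entity
class Product {
  @Id @GeneratedValue var id: java.lang.Long = _
  var name: String = _
}

object GenerateDdl extends App {
  val cfg = new Configuration()
    .addAnnotatedClass(classOf[Product])
    .setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect")
  // create(script = true, export = false): print the DDL, don't touch a database
  new SchemaExport(cfg).create(true, false)
}
```

Since mapperdao reads its own entity declarations rather than annotations, the JPA annotations should be inert as far as it is concerned; the duplication is limited to the annotations themselves.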

Similarly, Slick would appear to work off the case class fields.  However, we would need to be careful in dealing with the
different approaches as to when the id field is included in the entity class.
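For what it's worth, Slick can emit CREATE statements from its table definitions, so the "mapperdao for ORM, Slick for DDL" idea might look something like this sketch (assuming the Slick 1.0 API; the users table is illustrative):

```scala
// Sketch only: a Slick table definition kept solely for DDL generation;
// mapperdao would map the same table for the actual ORM work.
import scala.slick.driver.H2Driver.simple._

object Users extends Table[(Long, String)]("users") {
  def id   = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def *    = id ~ name
}

// Print the dialect-specific CREATE statements.
Users.ddl.createStatements.foreach(println)
```

Swapping the driver import (e.g. PostgresDriver) changes the emitted dialect, which speaks to the multiple-JDBC-databases requirement, but the table shape is still specified twice, as Tim points out.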

Activate has a different approach again.

Devil no doubt in the detail but I'll have a look at a few scenarios tomorrow.

REGARDS
Peter

Konstantinos Kougios

May 7, 2013, 4:17:02 PM
to mapp...@googlegroups.com
On 07/05/13 16:27, pha...@dtc.com.au wrote:
> Hi Tim,
>
> I suspect mixing frameworks will indeed make it a bit messy.  I'll try a few things out on the proof of concept code
> in the lab tomorrow (wee hours of the morning here now).
>
> The main reason for wanting DDL generation in the framework is that our main product is a stand-alone Java application
> that runs on multiple JDBC databases.  So if we update the database schema, each installation must automatically update
> and the database product may be different.  Our current custom ORM handles this automatically for us.

How about table modification and existing data? I.e. does it drop columns, create new columns and missing tables? And what about the situation where patching of data is required?

DDL generation is worth considering and would be a valuable addition to mapperdao. At the moment my focus is on developing the library itself, but anyone volunteering to make a DDL generator (or a reverse engineering tool) is more than welcome :)

Cheers

Konstantinos Kougios

May 7, 2013, 4:18:15 PM
to mapp...@googlegroups.com
On 07/05/13 15:43, pha...@dtc.com.au wrote:
We're currently looking at moving our architecture framework from Java with a custom ORM 
developed approximately 10 years ago to Scala with an industrial strength external ORM.

Doing some proof of concept work and started with Hibernate.  Then took a brief look at Slick,
now looking at both mapperdao and Activate.  I'm only just learning Scala but the language is
great and each of the ORM / persistence libraries has some fantastic stuff.  Of course, none
of them do everything I want :-)

I'd be interested to hear what attracts you in each framework. Please let me know if you get some time.

Cheers

pha...@dtc.com.au

May 7, 2013, 11:08:06 PM
to mapp...@googlegroups.com, kostas....@googlemail.com
I'll feed back what I can, when I can.  Evaluating all the current frameworks for use in our new technology 
stack is a big job, particularly when you need to consider more than just best-of-breed functionality:
maturity vs. bleeding edge, future enhancement, and support all matter too.

Data persistence is only one aspect as we also need to re-evaluate UI frameworks, security, and so on.
What we currently have is very good but starting to show its age now.

Overall, what we're looking for in frameworks is support for application code that is easy to write and
maintain.  We're not too concerned if integrating the frameworks is complex, provided that it is largely
transparent to the application developer.

In the case of the persistence framework, a clean interface with the business object model is probably the 
major requirement, with database portability high on the list as well.  So something like Slick is nice
but still leaves the need for a layer (à la an ORM) to integrate cleanly with the business object layer.

Hibernate is nice because of its maturity, but it would be a legacy component in the new technology stack, 
particularly given that Scala is almost certainly going to be our language of choice.  We haven't used
Hibernate previously, so legacy support isn't really necessary.

Activate was looking good but the whole STM thing seemed to add complexity due to functionality that
we probably won't need.

While researching Activate I came across mapperdao, which looked like everything we needed,
sans the DDL generation.  It still appears to be the front-runner.

However, all my comments so far are only based on project documentation and what I have been able to
learn from public discussions.  Hoping to get stuck into some concrete POC development in the next few
weeks.  Will keep you in the loop with findings / questions re mapperdao as we go.

REGARDS
Peter

pha...@dtc.com.au

May 7, 2013, 11:26:57 PM
to mapp...@googlegroups.com, kostas....@googlemail.com
Our current framework checks the schema version when the application starts.  If the user's database schema isn't at the current
application version, they are given the option to upgrade.  Depending on the schema update configuration, the application may
start without upgrading the schema, but in most cases it will refuse to start, forcing the user to either allow the schema update
or fall back to an earlier version of the application.

If they elect to update the schema, the framework applies each custom script, stepping through the versions until the current
schema version is reached.  So the actual creation of new tables, adding of columns, and migration of data is performed by scripts
written by the application developer.  However, the scripts are implemented using the business entities as much as possible,
mixed with JDBC queries for the "once off" migration stuff.  So it's not automatic to the point where you can just give it an old
table definition and a new table definition and have it move from one to the other.  As you point out, most schema
evolutions will require custom modifications (e.g., migrating existing data).
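The version-stepping logic described above is small enough to sketch in plain Scala (names like Migration and SchemaUpgrader are illustrative, not from our framework or mapperdao):

```scala
// Sketch of the version-stepping upgrade: apply each migration script in
// order until the schema reaches the application's current version.
object SchemaUpgrader {
  // In practice a migration would use the business entities plus raw JDBC;
  // here it is just a side-effecting function.
  type Migration = () => Unit

  // `scripts` is keyed by the version each script upgrades *to*.
  // Returns the version reached; fails fast if a step is missing.
  def upgrade(current: Int, target: Int, scripts: Map[Int, Migration]): Int = {
    ((current + 1) to target).foreach { v =>
      val step = scripts.getOrElse(v, sys.error(s"no migration to version $v"))
      step()
    }
    target
  }
}
```

A database at version 1 being brought to version 3 would run the scripts keyed 2 and then 3, in that order; the application would persist the reached version back to the schema-version table afterwards.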

I suspect the schema evolution stuff will just be ported (design-wise) from our current implementation.  Being able to generate 
DDL for table creation etc. in a database-agnostic way would be nice to get from a library.

It looks like the Slick team is investigating something akin to reverse engineering the schema (using Scala macros) in a future
version, though I prefer to create the schema from the business model rather than the other way around.

REGARDS
Peter

samzil...@gmail.com

May 8, 2013, 4:47:16 AM
to mapp...@googlegroups.com
If you are planning to use the Play framework, just use its "evolutions" feature to update the db.
It's easy and it works.
Of course, you'll then have to write all the DDL yourself, or use a tool to generate it, and copy the result into the 1.sql evolution...
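For reference, a Play evolution lives in conf/evolutions/default/1.sql and pairs the upgrade with a revert; the users table here is purely illustrative:

```sql
# --- !Ups
CREATE TABLE users (
    id   BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(255) NOT NULL
);

# --- !Downs
DROP TABLE users;
```

Play tracks which numbered scripts have been applied and offers to run the missing ones at startup, which is close in spirit to the version-stepping approach discussed above.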

Peter Hancox

May 8, 2013, 4:54:20 AM
to mapp...@googlegroups.com, samzil...@gmail.com
I looked at the Play framework predominantly as a web framework.  Will probably use it in the future but initially 
plan on using Vaadin for its UI functionality.

I noticed its database evolution approach which is somewhat similar to our current custom one.  Given the 
components of the Typesafe stack, I suspect Play is either based on or using Slick for implementing schema
evolution.

kostas....@googlemail.com

May 8, 2013, 1:15:55 PM
to mapp...@googlegroups.com
Thanks for the feedback. Certainly Hibernate is the most mature one. Its main drawbacks are mutability and, of course, that it doesn't play well with Scala.

Sent from my self