
Copy new tables from test to prod server - how?


aj

Feb 20, 2008, 11:20:24 AM

DB2 LUW v8.2 FP 14 Red Hat EL 3.4.6-2

I need to occasionally copy new tables from a test server to a
production server, and I want to automate it.. What makes this
difficult is that the tables sometimes (usually) have auto-generated
surrogate primary keys in the form of IDENTITY columns, ala:

CREATE TABLE FOO (
FOO_ID INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY
(START WITH +1 , INCREMENT BY +1 , CACHE 20 ) , .......
I want to KEEP the surrogate key values.

If I export the source table, then try to use the IMPORT command (from
Control Center), I get:
SQL3550W The field value in row "1" and column "1" is not NULL, but the
target column has been defined as GENERATED ALWAYS.

This creates the table, but with no data (and also no unique indexes as
far as I can see....why?). OK - The IMPORT command has IDENTITYIGNORE
and IDENTITYMISSING options, but neither will let me keep the surrogate
key values.
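
A rough sketch of that sequence, with a made-up file name (the real paths
don't matter here):

EXPORT TO /tmp/foo.ixf OF IXF SELECT * FROM FOO
IMPORT FROM /tmp/foo.ixf OF IXF CREATE INTO FOO

The IMPORT step is the one that throws SQL3550W, since FOO_ID is GENERATED
ALWAYS.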

If I use the LOAD command, it has the IDENTITYOVERRIDE option, which is
exactly what I want, but there is no CREATE INTO option for LOAD - it
assumes the table is already there, and you want to either append to
existing data, or replace existing data. Plus, with LOAD, if I don't
want my prod tablespace to go into BACKUP PENDING, I have to do an
unrecoverable load, which makes me nervous.

Am I missing something? Is there a better way to go about this? These
are not big tables..

Any help appreciated.

aj

Colin Booth

Feb 20, 2008, 11:47:40 AM

aj wrote:

Create your tables via DDL. It is unwise to let something like import create
production tables.

If you use a non-recoverable load, you can always do an on-line backup of
your table space (or even the database) after the load.

Don't forget to alter the loaded table to reflect the next value to be used
for the identity column.
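
A minimal sketch of both steps, assuming archive logging is enabled (needed
for an online backup); the database name, tablespace name, backup path, and
restart value are placeholders:

BACKUP DATABASE proddb TABLESPACE (DATA_TS) ONLINE TO /db2/backups
ALTER TABLE FOO ALTER COLUMN FOO_ID RESTART WITH 1001

Pick the RESTART WITH value as one more than the highest FOO_ID you loaded.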

Colin

aj

Feb 20, 2008, 12:27:41 PM

Colin Booth wrote:
> aj wrote:
>> DB2 LUW v8.2 FP 14 Red Hat EL 3.4.6-2
>>
>> I need to occasionally copy new tables from a test server to a
>> production server, and I want to automate it.. What makes this
>> difficult is that the tables sometimes (usually) have auto-generated
>> surrogate primary keys in the form of IDENTITY columns, ala:
>>
>> CREATE TABLE FOO (
>> FOO_ID INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY
>> (START WITH +1 , INCREMENT BY +1 , CACHE 20 ) , .......
>> I want to KEEP the surrogate key values.
...snip...

> Create your tables via DDL. It is unwise to let something like import create
> production tables

Well.....OK, but maybe you could address my questions. Are you saying
that I should use db2look to generate the CREATE TABLE DDL? Once I do
that, how do I get the same surrogate values in? Or is that just an
impossibility? Or maybe I should just type all the DDL from scratch.
I do so love to type...
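
For what it's worth, a db2look call along these lines should pull the DDL,
identity clause and all (database name and output file are made up):

db2look -d TESTDB -e -t FOO -o foo_ddl.sql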

Once the table is created, I guess I could ALTER the table to remove the
IDENTITYness, then INSERT INTO....SELECT * ???? Then ALTER the table to
recreate the IDENTITY.. Then reset the identity value...

So I have to script/code this all up myself? Why does DB2 make this so
bloody difficult?

> If you use a non-recoverable load, you can always do an on-line backup of
> your table space (or even the database) after the load.

Did I mention this was a production DB? If it's unavailable for even 10
minutes I'll have people w/ pitchforks and torches at my office door..

Serge Rielau

Feb 20, 2008, 2:22:58 PM

Aj,

You can unset the identity generation using ALTER TABLE ALTER COLUMN.
Then IMPORT, then ALTER TABLE ALTER COLUMN to set it again.
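
A minimal sketch of that sequence (table, file, and restart value are
placeholders; check the exact ALTER COLUMN syntax on your release):

ALTER TABLE FOO ALTER COLUMN FOO_ID SET GENERATED BY DEFAULT
IMPORT FROM /tmp/foo.ixf OF IXF INSERT INTO FOO
ALTER TABLE FOO ALTER COLUMN FOO_ID SET GENERATED ALWAYS
ALTER TABLE FOO ALTER COLUMN FOO_ID RESTART WITH 1001

The RESTART WITH value should be one more than the highest imported FOO_ID.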

I have posted in the past procedures that synch up identity values.
Cheers
Serge


--
Serge Rielau
DB2 Solutions Development
IBM Toronto Lab

thatche...@gmail.com

Feb 21, 2008, 1:55:06 PM

At the end of your load statement use the "COPY YES TO" syntax to avoid
the backup pending. You'll need to identify the path where your data will
be copied, or shoot it to TSM. I use this statement repeatedly throughout
the week to post meta-data to our production unit and it works well.
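
Roughly what that looks like, with IDENTITYOVERRIDE added so the generated
key values are kept; the file and copy paths are placeholders:

LOAD FROM /tmp/foo.ixf OF IXF
  MODIFIED BY IDENTITYOVERRIDE
  MESSAGES /tmp/foo.msgs
  INSERT INTO FOO
  COPY YES TO /db2/loadcopies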
