public static void transferRecords() {
    try (Connection connection = sourceConnectionPool.getConnection()) {
        DSLContext create = DSL.using(connection, SQLDialect.MYSQL);

        // Create object table entries
        BatchBindStep objectBatch = create.batch(
            create.insertInto(targetdb.Tables.OBJTABLE, OBJTABLE.TYPE, OBJTABLE.PARENT)
                  .values(null, (ULong) null));

        for (sourcedb.tables.records.ObjectRecord record :
                sourceDB.fetch(sourcedb.Tables.OBJTABLE, OBJTABLE.TYPE.equal(ObjectType.account))) {
            currentUUID = ULong.valueOf(currentUUID.longValue() + 1);
            uuidMap.put(record.getUid().intValue(), currentUUID.intValue());
            System.out.println("oldUUID: " + record.getUid() + " currentUUID: " + currentUUID.longValue());
            objectBatch.bind(1, targetDB.enums.ObjectType.account);
            objectBatch.bind(2, record.getParent());
        }
        objectBatch.execute();
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
Attempting to reduce the problem, I dropped down to a single database. I've tried the enum itself and every permutation of name/literal and ordinal:

    objectBatch.bind(1, ObjectType.account.getLiteral());
    objectBatch.bind(1, ObjectType.account.getName());
    objectBatch.bind(1, ObjectType.account.ordinal());

All end with the same exception, generated not upon the bind but on execute. The bind method appears to be asking for a generic object. The way I would normally do this with MySQL is to bind the string name of the enumeration to the statement parameter.
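Part of the confusion here may be that jOOQ's BatchBindStep.bind(...) takes varargs (Object...), not a (parameterIndex, value) pair the way JDBC's PreparedStatement setters do. A plain-Java sketch of that pitfall (the bind method and ObjectType enum below are hypothetical stand-ins, not jOOQ's actual classes):

```java
import java.util.Arrays;
import java.util.List;

public class BindDemo {
    // Hypothetical MySQL-style enum, mirroring ObjectType in the thread
    enum ObjectType { account, group }

    // Stand-in for a varargs bind method such as BatchBindStep.bind(Object...)
    static List<Object> bind(Object... bindValues) {
        return Arrays.asList(bindValues);
    }

    public static void main(String[] args) {
        // With varargs, the leading 1 is NOT a parameter index --
        // it silently becomes the first bind value
        List<Object> values = bind(1, ObjectType.account);
        System.out.println(values);                        // [1, account]

        // For a MySQL ENUM column, the usual JDBC approach is the string name
        System.out.println(ObjectType.account.name());     // account
        System.out.println(ObjectType.account.ordinal());  // 0
    }
}
```

So a call like `bind(1, ObjectType.account)` binds two values per row rather than binding one value at index 1, which would explain the failure surfacing only on execute.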
public static void transferRecords() {
    try (Connection connection = sourceConnectionPool.getConnection()) {
        DSLContext create = DSL.using(connection, SQLDialect.MYSQL);

        // Create object table entries
        BatchBindStep objectBatch = create.batch(
            create.insertInto(targetdb.Tables.OBJTABLE, OBJTABLE.TYPE, OBJTABLE.PARENT)
                  .values(null, (ULong) null));

        for (sourcedb.tables.records.ObjectRecord record :
                sourceDB.fetch(sourcedb.Tables.OBJTABLE, OBJTABLE.TYPE.equal(ObjectType.account))) {
            currentUUID = ULong.valueOf(currentUUID.longValue() + 1);
            uuidMap.put(record.getUid().intValue(), currentUUID.intValue());
            System.out.println("oldUUID: " + record.getUid() + " currentUUID: " + currentUUID.longValue());
            objectBatch.bind(targetDB.enums.ObjectType.account, record.getParent());
        }
        objectBatch.execute();
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
--
You received this message because you are subscribed to the Google Groups "jOOQ User Group" group.
On Thursday, November 27, 2014 7:31:14 AM UTC-5, Lukas Eder wrote:
> On a more serious note (sorry, couldn't resist 5 minutes ago), I'm sorry to see you go because of this. I do believe that you will find the missing features limiting after a while, but I certainly understand that moving a project forward can be essential. I'm curious about the "bit slower" part. How does this manifest?

This project was just a database migration, so QueryDSL worked. Fewer features, true, but the overall interface was simpler as a result. QueryDSL doesn't really have a manual like jOOQ does, but I didn't need one.
I hope you find time to make a pass over the manual by the time I come back.
I really want to use jOOQ for CRUD, but most of the examples look like they are from the Java 5-6 era and are in serious need of being brought up to date. Other than my binding issue, I couldn't find an example in the manual that used try-with-resources, and everything is missing the surrounding boilerplate code.
I also couldn't find an example that leveraged lambda expressions.
Nor could I find best practices for using the same schema with two different databases. It worked out of the box with QueryDSL, but jOOQ didn't appear to like it at all. Unlike QueryDSL, the jOOQ-generated Java objects appeared inherently tied to the source database. My workaround was to generate two separate schemas, renaming the MySQL enum fields on the target DB. Not pretty, but it did work!
> Thanks for the responses, they really help a great deal.

We constantly pass over the manual. Anything you find is greatly appreciated. We hardly need to read our own manual anymore, so feedback from users who do read it for the first time is very useful! Manuals should assume no tribal knowledge. People with tribal knowledge don't read them!
> Because there isn't any surrounding boilerplate code! :-) With jOOQ, you don't need any try-with-resources statements. jOOQ manages all JDBC resources for you, internally (except the JDBC Connection, if you want to manage that).

My confusion with try-with-resources was partly that some jOOQ objects didn't appear to have implemented Closeable.
If all I need to do is manage the connection object then that's perfect and makes me want to use jooq even more.
IMHO, they should shoot the guy who invented finally blocks. I was doing backflips in and out of cubicles when they came out with try-with-resources.
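To illustrate the point about Closeable (a plain-Java sketch, not jOOQ-specific): try-with-resources only accepts types that implement AutoCloseable, which is why a resource that doesn't implement it can't appear in the parenthesized header. The FakeConnection class below is a made-up stand-in for a JDBC Connection:

```java
public class ResourceDemo {
    // A minimal AutoCloseable resource, standing in for a JDBC Connection
    static class FakeConnection implements AutoCloseable {
        boolean closed = false;

        void query(String sql) {
            System.out.println("executing: " + sql);
        }

        @Override
        public void close() {
            // Invoked automatically at the end of the try block, even on exception,
            // replacing the manual finally { connection.close(); } dance
            closed = true;
            System.out.println("connection closed");
        }
    }

    public static void main(String[] args) {
        FakeConnection leaked;
        try (FakeConnection c = new FakeConnection()) {
            leaked = c;
            c.query("SELECT 1");
        }
        System.out.println("closed after try: " + leaked.closed);  // true
    }
}
```

A type that lacks AutoCloseable simply fails to compile inside the try header, which matches the observation above about some objects not being usable there.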
> Is that what you had in mind? Or in what context would you expect us to provide lambda examples in the manual?

In terms of lambdas, a lot of DB work obviously involves processing sets of records, which lambdas fit well. I'm starting to see them employed in framework documentation everywhere, but when I read the jOOQ manual I just have this sense of reading Java 6 code instead of Java 8 code. It felt... dusty.
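As an example of the kind of record processing that lambdas suit (plain Java 8 streams over a hypothetical record type, not jOOQ's actual fetch API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class LambdaDemo {
    // Hypothetical record type, standing in for a generated jOOQ record
    static class ObjectRecord {
        final int uid;
        final String type;
        ObjectRecord(int uid, String type) { this.uid = uid; this.type = type; }
    }

    public static void main(String[] args) {
        List<ObjectRecord> records = Arrays.asList(
            new ObjectRecord(1, "account"),
            new ObjectRecord(2, "group"),
            new ObjectRecord(3, "account"));

        // Filter and transform a result set with lambdas instead of explicit loops
        List<Integer> accountUids = records.stream()
            .filter(r -> r.type.equals("account"))
            .map(r -> r.uid)
            .collect(Collectors.toList());

        System.out.println(accountUids);   // [1, 3]
    }
}
```

Since jOOQ's Result type extends java.util.List, this stream-based style should apply to fetched results as well.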
> And perhaps includes / excludes patterns. jOOQ is a database-first API, so yes, it's true that generated objects are inherently tied to the source database. We see this as a feature, but perhaps we've missed something - e.g. a specific use-case that we cannot cover yet?

Perhaps I'm missing the "feature" aspect of it. A schema is not inherently tied to any physical database; it is a record mapping with added validation. Conceptually, I think of schemas as data structures in languages such as C, RPG IV, or COBOL that you could use as a template, moving it around in memory. There is no reason I can see why a person should be forcibly restricted from pointing a valid schema abstraction at multiple connections. I'm sure I'm not the only person in the world doing a data migration. QueryDSL was able to handle this use case out of the box, without any further XML configuration. You simply reference the schema object and the connection when creating the query.
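A rough plain-Java sketch of the use case being described (a hypothetical schema abstraction; neither QueryDSL's nor jOOQ's API): the same table definition is just metadata, so it can be applied against two independent connections, which is exactly the migration scenario above.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SharedSchemaDemo {
    // A schema abstraction: column metadata only, tied to no physical database
    static class TableSchema {
        final String name;
        final List<String> columns;
        TableSchema(String name, List<String> columns) {
            this.name = name;
            this.columns = columns;
        }
    }

    // A stand-in "connection": maps table names to in-memory rows
    static class FakeConnection {
        final Map<String, List<Map<String, Object>>> tables = new HashMap<>();

        // The same schema object works against any connection
        void insert(TableSchema schema, Object... values) {
            Map<String, Object> row = new HashMap<>();
            for (int i = 0; i < schema.columns.size(); i++) {
                row.put(schema.columns.get(i), values[i]);
            }
            tables.computeIfAbsent(schema.name, t -> new ArrayList<>()).add(row);
        }
    }

    public static void main(String[] args) {
        TableSchema objTable = new TableSchema("OBJTABLE", Arrays.asList("TYPE", "PARENT"));

        FakeConnection source = new FakeConnection();
        FakeConnection target = new FakeConnection();

        // One schema definition, two databases -- the migration use case
        source.insert(objTable, "account", 0L);
        target.insert(objTable, "account", 42L);

        System.out.println(source.tables.get("OBJTABLE").size()); // 1
        System.out.println(target.tables.get("OBJTABLE").size()); // 1
    }
}
```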