I'm trying to batch inserts to reduce the number of round trips when pushing millions (or billions) of records into MySQL 5.7 with jOOQ 3.11.9.
I wanted to compare the two approaches below against a table with this schema:
CREATE TABLE `Table1` (
`BytesKey` binary(16) NOT NULL,
`StringData` varchar(2600) CHARACTER SET utf8 NOT NULL,
UNIQUE KEY `UniqueKey` (`BytesKey`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
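Both snippets below read from a simple holder class along these lines (a minimal sketch, not my real class; only the two accessors matter):
class Record {
    private final String hexStringKey; // 32 hex chars = 16 bytes for BINARY(16)
    private final String data;

    Record(String hexStringKey, String data) {
        this.hexStringKey = hexStringKey;
        this.data = data;
    }

    public String getHexStringKey() { return hexStringKey; }
    public String getData() { return data; }
}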
This first approach works correctly:
// Shared imports for both snippets:
import static org.jooq.impl.DSL.*;
import org.jooq.BatchBindStep;
import org.jooq.DSLContext;
import org.jooq.InsertValuesStep2;

class BulkInsertBuffer {
    // One growing multi-row INSERT; each add() appends a VALUES row
    public InsertValuesStep2<org.jooq.Record, Object, Object> insertBuffer =
        insertInto(table("Table1")).columns(field("BytesKey"), field("StringData"));

    public void add(Record record) {
        insertBuffer = insertBuffer.values(
            // unhex() converts the hex string key to BINARY(16) on the server
            field("unhex({0})", String.class, record.getHexStringKey()),
            record.getData());
    }

    public int flush(DSLContext mysql) {
        return mysql.execute(insertBuffer);
    }
}
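For context, this is roughly how I drive the buffer (a sketch; the batch size of 1000 and the record source are placeholders):
// Hypothetical driver loop: flush every 1000 rows to bound statement size
void load(DSLContext mysql, Iterable<Record> records) {
    BulkInsertBuffer buffer = new BulkInsertBuffer();
    int pending = 0;
    for (Record r : records) {
        buffer.add(r);
        if (++pending == 1000) {
            buffer.flush(mysql);
            buffer = new BulkInsertBuffer(); // start a fresh multi-row INSERT
            pending = 0;
        }
    }
    if (pending > 0) {
        buffer.flush(mysql); // flush the remainder
    }
}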
I then wanted to try a second, batch-based approach, but got various errors with the variants I tried:
static class BulkInsertBufferV2 {
    private final BatchBindStep insertBuffer;

    BulkInsertBufferV2(DSLContext mysql) {
        // One parameterized INSERT run as a JDBC batch; the nulls are dummy placeholders
        insertBuffer = mysql.batch(insertInto(table("Table1"), field("BytesKey"), field("StringData"))
            .values((Object) null, null));
    }

    public void add(Record record) {
        // Attempt to pass an unhex() expression as a bind value
        insertBuffer.bind(field("unhex({0})", String.class, record.getHexStringKey()), record.getData());
    }

    public int[] flush() {
        return insertBuffer.execute();
    }
}
This throws:
org.jooq.exception.DataAccessException: SQL [insert into Table1 (BytesKey, StringData) values (?, ?)]; Data truncation: Data too long for column 'BytesKey' at row 1
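For comparison, the JDBC-level batching I'm hoping jOOQ will produce would look roughly like this (a sketch; connection and records are assumed to be in scope, and PreparedStatement is java.sql.PreparedStatement):
// Sketch of the intended JDBC batch: unhex() stays in the SQL text,
// and only plain string values are bound per row
try (PreparedStatement ps = connection.prepareStatement(
        "insert into Table1 (BytesKey, StringData) values (unhex(?), ?)")) {
    for (Record r : records) {
        ps.setString(1, r.getHexStringKey());
        ps.setString(2, r.getData());
        ps.addBatch();
    }
    ps.executeBatch();
}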
Thanks!