Greetings,
What is the correct way to allow MyBatis to commit a large batch insert operation where some of the values being inserted violate a primary key constraint?
That is, a primary key constraint violation should be ignored (and ideally logged) rather than causing the entire (very large) transaction to be rolled back.
Put another way, the incoming data may be 'dirty': some records are duplicates and can safely be skipped, while the many other valid records should still be preserved (committed).
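For illustration, since the error below suggests MySQL, the behavior I'm after at the SQL level resembles what INSERT IGNORE would give. A hypothetical mapper entry might look like this (the table and column names here are assumptions, not my actual schema):

```xml
<!-- Hypothetical mapper sketch: 'data_core', 'id' and 'payload' are assumed names. -->
<insert id="batchInsertdataCore" parameterType="DataCore">
  INSERT IGNORE INTO data_core (id, payload)
  VALUES (#{id}, #{payload})
</insert>
```

I'd prefer not to hard-code a MySQL-specific statement if MyBatis offers a portable way to handle this.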
Right now, I am sending a list of three objects, one of which has the same value in its primary key field as a record already in the DB.
This causes:
### Error committing transaction. Cause: org.apache.ibatis.executor.BatchExecutorException: dataCore.batchInsertdataCore (batch index #1) failed. Cause: java.sql.BatchUpdateException: Duplicate entry '222' for key 'PRIMARY'
### Cause: org.apache.ibatis.executor.BatchExecutorException: dataCore.batchInsertdataCore (batch index #1) failed. Cause: java.sql.BatchUpdateException: Duplicate entry '222' for key 'PRIMARY'
The rows that would have been inserted for the other two valid objects are lost because this single record fails.
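As a partial fallback, I could de-duplicate the batch client-side before handing it to MyBatis. A minimal sketch (the `Row` record, its fields, and `dedupByKey` are hypothetical names for illustration); note this only removes duplicates within the batch itself and would not help with keys that already exist in the table:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DedupBatch {
    // Hypothetical stand-in for the real mapped object; 'id' plays the primary key role.
    record Row(int id, String payload) {}

    // Keep only the first row seen for each primary key value, preserving order.
    static List<Row> dedupByKey(List<Row> rows) {
        Map<Integer, Row> byId = new LinkedHashMap<>();
        for (Row r : rows) {
            byId.putIfAbsent(r.id(), r);
        }
        return new ArrayList<>(byId.values());
    }

    public static void main(String[] args) {
        List<Row> incoming = List.of(
                new Row(111, "a"),
                new Row(222, "b"),
                new Row(222, "dup"));
        // The duplicate key 222 is dropped before the batch insert.
        System.out.println(dedupByKey(incoming).size()); // prints 2
    }
}
```

Even with this in place, a row whose key already exists in the DB would still abort the batch, which is why I'm asking about the MyBatis-level solution.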
Thanks once again,
Aaron