Problem inserting large amount of rows


Bruno Kamiche

Feb 20, 2015, 11:31:39 PM
to percona-d...@googlegroups.com
Hi, I currently work with a 3-node Percona XtraDB Cluster.

- Node 1 is ONLY used for write operations
- Nodes 2 & 3 are used for read operations

Everything works fine normally, but I'm currently having problems with the following situation:

Whenever I insert a large number of rows into a table (>100K), all other DML operations get stuck for several seconds (sometimes even minutes), even though the table is InnoDB and no other process is using it. This happens when the insert finishes on the writing node and the change has to replicate to the other nodes.

Is there any way to avoid this situation?

Bruno

Jay Janssen

Feb 23, 2015, 10:53:08 AM
to percona-d...@googlegroups.com
Yes, send smaller transactions. Galera currently has issues with big transactions: they can cause stalls at several Galera serialization points (replication, certification, and commit) in ways that small transactions don't. This can indeed back up other DML, regardless of which tables it modifies. The symptoms you describe are consistent with that, and the bigger the transaction, the worse the symptoms.
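A minimal sketch of what "send smaller transactions" means in practice: commit the bulk load in fixed-size chunks so each transaction (and hence each Galera write-set) stays small. This uses sqlite3 as a stand-in driver so the example is self-contained; with PXC you would use a MySQL driver and the same batching pattern. The table name, column, and BATCH_SIZE are illustrative, not from the thread.

```python
import sqlite3

BATCH_SIZE = 1000  # rows per transaction; tune for your workload


def bulk_insert(conn, rows):
    # Commit every BATCH_SIZE rows instead of one giant transaction,
    # so no single write-set has to carry 100K+ rows through
    # replication, certification, and commit.
    cur = conn.cursor()
    for i in range(0, len(rows), BATCH_SIZE):
        chunk = rows[i:i + BATCH_SIZE]
        cur.executemany("INSERT INTO t (val) VALUES (?)", chunk)
        conn.commit()  # each small transaction replicates quickly


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (val INTEGER)")
bulk_insert(conn, [(i,) for i in range(150_000)])
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 150000
```

The trade-off is that the load is no longer atomic: if it fails mid-way, some chunks are already committed, so the loader needs to be restartable (e.g. idempotent inserts or a progress marker).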









--
Jay Janssen
Managing Consultant, Percona