We’re using a Galera cluster, and I’m running the statement below to create a new table:
create table tab2 select distinct id, report from my_table;
It runs for some time and then fails with this error:
ERROR 1180 (HY000): Got error 5 during COMMIT
I searched for this error and found that setting the global variable binlog_row_image=minimal is supposed to make the problem go away. That didn’t happen in my case. I ran the statements below, and even after that, binlog_row_image was still at FULL when I checked:
mysql> select @@binlog_row_image;
+--------------------+
| @@binlog_row_image |
+--------------------+
| FULL               |
+--------------------+
1 row in set (0.00 sec)

mysql> SET GLOBAL binlog_row_image=minimal;
Query OK, 0 rows affected (0.00 sec)
And since it didn’t change even after that, my new table creation query keeps failing. Is there any other way of getting it done? I just need to create my new table from the existing table.
Thanks in advance!
How to solve:
binlog_row_image=MINIMAL has no effect for an INSERT. The rows will be just as large as they were.
With the minimal row image, an UPDATE’s binlog event contains only the primary key columns and the columns that were changed, and a DELETE’s binlog event contains only the primary key columns. But for an INSERT, the row in tab2 is a new row, so every column is new, and all of them are needed for the binlog row image.
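As an aside, here is one possible explanation for why the check still showed FULL (this is an assumption on my part, not something the question confirms): SET GLOBAL affects only sessions opened after the change, while the bare @@binlog_row_image reads the current session’s copy. You can read both scopes explicitly:

select @@global.binlog_row_image, @@session.binlog_row_image;

-- changing the current session too (it still would not help here,
-- since INSERT row images always contain every column):
set session binlog_row_image = 'MINIMAL';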
The problem you have is that your
select distinct id, report from my_table is producing too many rows, and their combined size is too large to fit in a single Galera transaction.
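Galera enforces this through the wsrep_max_ws_rows and wsrep_max_ws_size settings, which cap the rows and bytes a single writeset may contain (the values configured on your cluster are unknown here, so this is just a way to check, not a diagnosis):

show global variables like 'wsrep_max_ws%';

Raising wsrep_max_ws_size is sometimes an option, but splitting the copy into smaller transactions, as below, is the safer approach.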
You need to do this copy in several smaller transactions, by copying subsets of rows from
my_table and committing each subset.
The way to do this is to limit the number of rows per transaction. Start by creating the table with the first subset:
create table tab2 select id, report from my_table where id between 1 and 1000;
id is a primary key, therefore you don’t need the distinct: the id values are already unique.
Then insert the next subset, in a separate transaction:
insert into tab2 select id, report from my_table where id between 1001 and 2000;
And then the next, and so on:
insert into tab2 select id, report from my_table where id between 2001 and 3000;
Keep going until you have done enough of these subsets to reach the maximum id in my_table.
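If you’d rather not issue each batch by hand, here is a minimal sketch of automating the loop with a stored procedure. The procedure name copy_in_batches and the batch size of 1000 are illustrative assumptions, and it assumes id is an integer primary key and that the create table statement above has already copied ids 1 through 1000:

delimiter //
create procedure copy_in_batches()
begin
  declare max_id bigint;
  -- the create table above already handled ids 1..1000
  declare batch_start bigint default 1001;
  select max(id) into max_id from my_table;
  while batch_start <= max_id do
    -- each iteration commits on its own, so every batch is a
    -- separate, small Galera transaction
    insert into tab2
      select id, report from my_table
      where id between batch_start and batch_start + 999;
    commit;
    set batch_start = batch_start + 1000;
  end while;
end //
delimiter ;

call copy_in_batches();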