How do I take a mysqldump in batches of records?

All we need is an easy explanation of the problem, so here it is.

We have a 30 GB database and we want to take a backup of it.

Here the question is: we want to take the backup in batches of 10,000 records.

How can we take a batch-of-records backup using mysqldump?

How to solve:

We know this problem is frustrating, so take a deep breath and look at the explanation below. There are several solutions, but we recommend the first method because it is tested and known to work.

Method 1

Here is something you may find intriguing.

SUGGESTION #1: Let mysqldump create the batches

mysqldump already uses an option called --extended-insert by default. This groups a series of records so they can be loaded with a single INSERT command (INSERT INTO mytable VALUES (...),(...),(...);). While mysqldump gives you no direct control over the number of rows batched together this way, you can parse the mysqldump output. Of course, this means you will have to script it in the language of your choice (Python, Perl, C++, etc.).

For any given table in a mysqldump of a single database, you will see the following paradigm:

CREATE TABLE
LOCK TABLES ... WRITE;
INSERT INTO ...
INSERT INTO ...
INSERT INTO ...
...
...
INSERT INTO ...
INSERT INTO ...
UNLOCK TABLES;

You could parse the mysqldump as follows (a Python sketch follows this list):

  • Step01: Get the next table name in the mysqldump
  • Step02: Locate the next INSERT INTO or UNLOCK TABLES;
  • Step03: If you encounter UNLOCK TABLES;, go to Step01
  • Step04: If there are no more tables, you are done
  • Step05: Go to Step02
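
Here is a minimal sketch of that loop, assuming a dump made with the default --extended-insert that follows the CREATE/LOCK/INSERT/UNLOCK paradigm shown above; the file name mydb.sql is hypothetical:

import re

def split_by_table(dump_path):
    """Yield (table_name, insert_statement) pairs from a mysqldump file."""
    table = None
    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            m = re.match(r"CREATE TABLE `?(\w+)`?", line)
            if m:                        # Step01: found the next table name
                table = m.group(1)
            elif line.startswith("UNLOCK TABLES"):
                table = None             # Step03: table finished, back to Step01
            elif table and line.startswith("INSERT INTO"):
                yield table, line        # Step02: one batched INSERT statement

if __name__ == "__main__":
    for table, insert in split_by_table("mydb.sql"):  # hypothetical dump file
        print(table, len(insert), "bytes")

This can work line by line because mysqldump writes each extended INSERT as a single (possibly very long) physical line.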

Each INSERT command you read contains a certain number of rows. You cannot control this number unless you script the rows back together into batches of the size you choose. I would advise against that, because mysqldump already decided the best number of rows for each batch given the current configuration settings (perhaps max_allowed_packet).

SUGGESTION #2: Use --skip-extended-insert and batch them your way

When you deactivate --extended-insert (by passing --skip-extended-insert), each INSERT command contains exactly one row. You can then use a script to collect as many rows per batch as you like, as in the sketch below.
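
A minimal sketch of that batching, assuming a dump produced with --skip-extended-insert so that every INSERT holds exactly one row; the batch size of 10,000 matches the question, and the input and output file names are hypothetical:

def batch_inserts(dump_path, batch_size=10000):
    """Split single-row INSERT statements into files of batch_size rows each."""
    batch, n = [], 0
    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            if line.startswith("INSERT INTO"):
                batch.append(line)
                if len(batch) == batch_size:
                    n += 1
                    write_batch(batch, n)
                    batch = []
    if batch:                  # flush the final, smaller batch
        write_batch(batch, n + 1)

def write_batch(batch, n):
    # batch_NNNN.sql is a hypothetical naming scheme
    with open(f"batch_{n:04d}.sql", "w", encoding="utf-8") as out:
        out.writelines(batch)

if __name__ == "__main__":
    batch_inserts("mydb_single_row.sql")  # hypothetical dump file

Each batch_NNNN.sql file can then be replayed independently with the mysql client.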

CAVEATS

  1. Please be a little conservative about the number of rows you choose to batch together.
  2. Parsing will take much longer, because the mysqldump itself will be much larger (one INSERT per row).

Method 2

One can also consider mysqldump with the --where option to create multiple smaller dumps.

Whether this works well depends on the schema, for example on having a numeric key whose ranges you can dump one at a time.
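
A minimal sketch of that idea, assuming a table with an auto-increment integer primary key named id; the database name mydb, table name mytable, and maximum id are hypothetical, and mysqldump is assumed to pick up credentials from an option file such as ~/.my.cnf:

import subprocess

def dump_in_ranges(db, table, max_id, step=10000):
    """Run mysqldump once per id range, writing one file per batch."""
    for low in range(1, max_id + 1, step):
        high = low + step - 1
        outfile = f"{table}_{low}_{high}.sql"   # hypothetical naming scheme
        with open(outfile, "w", encoding="utf-8") as f:
            subprocess.run(
                ["mysqldump", "--no-create-info",
                 f"--where=id BETWEEN {low} AND {high}",
                 db, table],
                stdout=f, check=True)

if __name__ == "__main__":
    dump_in_ranges("mydb", "mytable", max_id=3_000_000)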

Note: Use and implement Method 1, because that method has been fully tested on our system.
Thank you 🙂

All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, or CC BY-SA 4.0.
