MySQL Shell 8.0.22 Data Export

MySQL Shell is an advanced client tool with many features, and it is easy to use. The recent MySQL release (MySQL 8.0.22) includes the utility “exportTable()”, which exports the data from a table into a file using MySQL Shell. The exported data file can then be imported using the utility “importTable()”, which was released in MySQL 8.0.17.

With “exportTable()”, you can export the data to a local server or to any S3-compliant object storage provider. In this blog, I am going to explain how the exportTable() & importTable() utilities work on a local server, and I also ran some tests with MySQL 5.7 and MySQL 5.6.

Overview of exportTable() & importTable()

exportTable():

  • Introduced in MySQL 8.0.22. 
  • The utility is used to export the data from the MySQL table into a data file. 
  • It can be used to export the table to a local server or any S3-compliant object storage provider. 
  • By default, the data will be exported in the default format for MySQL Shell’s table import utility. 
  • It supports CSV and TSV files.
  • Compression is supported with “gzip” or “zstd”.
  • JSON is not supported.

Below is the format of the query that will be used to export the data.
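In MySQL Shell's JavaScript mode, the utility takes the table name, an output URL, and an optional options dictionary:

```js
util.exportTable(table, outputUrl[, options])
```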

importTable():

  • Introduced in MySQL 8.0.17. 
  • The utility is used to import the data into a MySQL table from a data file.
  • It supports parallel loading, which makes it a very fast way to import a large data file into a table. 
  • It is similar to LOAD DATA INFILE. 
  • By default, the data is expected in the default format of MySQL Shell’s table export utility.
  • The data file will be chunked and processed with different threads. 
  • We can control the number of threads for parallel execution.
  • The system variable “local_infile” must be set to ON on the server. 
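If “local_infile” is disabled, it can be enabled from the shell session (this requires a privilege such as SYSTEM_VARIABLES_ADMIN); a quick check and enable, as a sketch:

```js
// From MySQL Shell's JavaScript mode: check, then enable local_infile
session.runSql("SHOW GLOBAL VARIABLES LIKE 'local_infile'");
session.runSql("SET GLOBAL local_infile = ON");
```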

 

Below is the format of the query that will be used to import the data.
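The corresponding JavaScript-mode signature takes the data file path and an optional options dictionary:

```js
util.importTable(filename[, options])
```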

How to Export the Table Into a File Using util.exportTable()

For testing, I have installed the latest Percona Server for MySQL 8.0.22 and created the table “percona.herc7”. The size of the table is around 2.3G. 

In this section, I am going to address three topics:

  • Default output file structure using util.exportTable()
  • Modified output file structure using util.exportTable() 
  • Compressed output using util.exportTable() 

Default Output File Structure Using util.exportTable()

I am going to export the table “percona.herc7” with the default option so that the utility will create the output file with default structure.
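A minimal invocation looks like the following; the output path and file name here are illustrative, not taken from the original test:

```js
// Export percona.herc7 in the default (tab-separated) format
util.exportTable("percona.herc7", "file:///home/vagrant/herc7_dump.txt")
```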

The table export is completed and the output seems pretty good. During the execution, the utility prints the progress, data size, execution time, rows written, throughput, etc.

The data in the output file is tab-separated:

Query from processlist: 

Note: Make sure to create the needed directories before executing the command. Otherwise, the command will fail. 

Modified Output File Structure Using util.exportTable()

In this section, I am going to play with the options and create a comma-separated (CSV) output file. I am going to modify the following options to create the CSV file.

  • fieldsOptionallyEnclosed: true
  • fieldsTerminatedBy: ","
  • linesTerminatedBy: "\n"
  • fieldsEnclosedBy: '"'
  • defaultCharacterSet: "utf8"
  • showProgress: true
  • dialect: "csv"
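Putting the options above together, the call would look something like this (the output path is illustrative):

```js
// CSV export of percona.herc7
util.exportTable("percona.herc7", "file:///home/vagrant/herc7_dump.csv", {
    dialect: "csv",
    fieldsTerminatedBy: ",",
    fieldsOptionallyEnclosed: true,
    fieldsEnclosedBy: '"',
    linesTerminatedBy: "\n",
    defaultCharacterSet: "utf8",
    showProgress: true
})
```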

The output looks like:

Compressed Output Using util.exportTable()

The exportTable() utility supports two compression types (compression: "gzip" or "zstd"):

  • Gzip
  • Zstd

Let’s generate the output using the gzip { compression: “gzip” }:
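A sketch of the call (the output path is illustrative; the ".gz" suffix is a naming convention, not a requirement):

```js
// gzip-compressed export of percona.herc7
util.exportTable("percona.herc7", "file:///home/vagrant/herc7_dump.txt.gz", {
    compression: "gzip",
    showProgress: true
})
```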

Note: The util.importTable() utility currently does not support compressed files; the output file needs to be unpacked first.

The output file size is 125M.

At the end of the export, the command prints the below line. It says that we can’t directly import the compressed file using the importTable() utility, so we need to decompress the file first and then import it.

How to Import the Data File using util.importTable()

I am going to show a single example using the CSV file, which was already generated in “Modified Output File Structure Using util.exportTable()”.

Before starting the import, I created an empty table to load the CSV file using the importTable() utility. 

I am going to use the following options to import the data file. I would suggest using the same options that were used to export the data.

  • schema: "percona"
  • table: "herc7load"
  • dialect: "csv"
  • skipRows: 0
  • showProgress: true
  • fieldsOptionallyEnclosed: true
  • fieldsTerminatedBy: ","
  • linesTerminatedBy: "\n"
  • fieldsEnclosedBy: '"'
  • threads: 3
  • bytesPerChunk: "100M"
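The options above combine into a call like the following (the file path is illustrative):

```js
// Parallel CSV import into percona.herc7load
util.importTable("/home/vagrant/herc7_dump.csv", {
    schema: "percona",
    table: "herc7load",
    dialect: "csv",
    skipRows: 0,
    fieldsOptionallyEnclosed: true,
    fieldsTerminatedBy: ",",
    linesTerminatedBy: "\n",
    fieldsEnclosedBy: '"',
    threads: 3,            // three parallel LOAD DATA sessions
    bytesPerChunk: "100M", // chunk size handed to each thread
    showProgress: true
})
```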

Below is the output from the processlist during the import. You can see three LOAD DATA INFILE commands running in parallel, as I specified “threads: 3”. You can set the number of threads based on your available CPUs. 

Will exportTable() and importTable() Work With MySQL 5.7?

Yes, it worked when I tested it with a local server.

exportTable():

importTable():

Will exportTable() & importTable() Work With MySQL 5.6?

Yes, it worked when I tested it with a local server.

exportTable():

importTable():

It is good to see that MySQL Shell supports exporting and importing table data, and that you can load large tables very fast using the importTable() utility. In my next blog, I plan to write about how these utilities support S3.

3 Comments
Ramesh

One advantage of this is, you can check the time to completion of the dump (at least approximately) and the parallel processing.

Good one.

Mike Rohland

Interesting article!
You mention that exportTable() “can be used to export the table to a local server or any S3-compliant object storage provider”
Do you have an example how to configure this to export into S3 directly? From the docs seems only OCI is supported? thx very much!

Manish Kumar Peddi

Can you share the syntax for using the export table utility from the command line?