Objective
A server needs to be retired. One of the applications it hosts is an old Matomo instance. Moving all the data in one piece would take too long, so the data is moved in several steps.
Warnings
Do not skip this section! Read carefully.
- Create a backup.
- Check that your backup contains current data.
- Always double-check that you are connected to the right host or database!
- Use the same Matomo version on the source and the target server.
- Use the same database version on the source and the target server.
Splitting the database
The tables in the source database fall into three categories: active tables, archive tables, and leftover tables from an installation that was configured to use a table name prefix.
The idea: in a first step, copy the archive tables, which will never be touched again. In a second, much quicker step, copy only the active tables.
To list all tables, run
```shell
docker exec -it matomo_db_1 mysql --user=matomo --password=${DBPASSWORD} matomo -sN -e 'show tables;' > all-tables.txt
```
To select only the archive tables, excluding the archives for the current and the next month:
```shell
current_or_next_month_pattern=$(python3 -c "from datetime import date, timedelta; next = date.today().replace(day=1) + timedelta(days=31); print('({0:_%Y_%m}|{1:_%Y_%m})'.format(date.today(), next))")
docker exec -it matomo_db_1 mysql --user=matomo --password=${DBPASSWORD} matomo -sN -e 'show tables;' | grep -e '^archive_' | grep -Ev "${current_or_next_month_pattern}" | sort > archive-tables.txt
```

Note that the assignment must run as a separate command first; written as a one-line prefix to the `docker` command, the shell would expand `${current_or_next_month_pattern}` before the variable is set.
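The Python one-liner that builds the month pattern can be sanity-checked on its own with a fixed date (2024-01-15 here, chosen purely for illustration): adding 31 days to the first of a month always lands in the next month.

```shell
# Build the exclusion pattern for a fixed "today" (illustrative date):
pattern=$(python3 -c "from datetime import date, timedelta
today = date(2024, 1, 15)
nxt = today.replace(day=1) + timedelta(days=31)
print('({0:_%Y_%m}|{1:_%Y_%m})'.format(today, nxt))")
echo "$pattern"   # (_2024_01|_2024_02)
```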
To list the active tables (i.e. everything that is neither an archive nor one of the prefixed leftovers):
```shell
diff --new-line-format="" --unchanged-line-format="" all-tables.txt archive-tables.txt | grep -ve 'matomo_' > live-tables.txt
```
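The GNU `diff` line-format options implement a line-wise set difference: lines unique to the first file are printed, everything else is suppressed. A minimal local demonstration (file names are made up):

```shell
# Only lines unique to the first file survive; common lines and lines
# unique to the second file are formatted as the empty string.
printf 'a\nb\nc\n' > /tmp/demo-all.txt
printf 'b\n'       > /tmp/demo-sub.txt
diff --new-line-format="" --unchanged-line-format="" /tmp/demo-all.txt /tmp/demo-sub.txt
# prints:
# a
# c
```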
Backing up the database in two parts
The previous part created three text files with table names. Because `docker exec -t` allocates a pseudo-TTY, each line ends in a stray `\r`, which would corrupt the table names when they are passed to `mysqldump`. A Perl substitution replaces each `\r` with a space.
To dump the archive tables, run
```shell
docker exec -it matomo_db_1 mysqldump --user=matomo --password=${DBPASSWORD} matomo --tables $(perl -pe 's/\r/ /' archive-tables.txt) | bzip2 > archive.sql.bz2
```
To dump the live tables, run
```shell
docker exec -it matomo_db_1 mysqldump --user=matomo --password=${DBPASSWORD} matomo --tables $(perl -pe 's/\r/ /' live-tables.txt) | bzip2 > live.sql.bz2
```
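In the spirit of the warnings above, it is worth verifying the dumps before moving them: `bzip2 -t` tests the compressed stream, and a successful `mysqldump` ends with a `-- Dump completed` comment. A self-contained round-trip of the compression step (no database involved, file path made up):

```shell
# Stand-in for mysqldump output, compressed and verified:
printf -- '-- Dump completed on 2024-01-01\n' | bzip2 > /tmp/demo.sql.bz2
bzip2 -t /tmp/demo.sql.bz2        # exit status 0 means the stream is intact
bzcat /tmp/demo.sql.bz2 | tail -n 1
# -- Dump completed on 2024-01-01
```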
The final file sizes look promising:
| File | Size |
|---|---|
| archive.sql.bz2 | 156M |
| live.sql.bz2 | 624K |
Importing the data to the new server
The new server also runs Matomo inside containers; note that its container is named `matomo-db-1` (with hyphens), the naming scheme of newer Docker Compose releases, rather than `matomo_db_1`.
Import the archives:

```shell
bzcat archive.sql.bz2 | docker exec -i matomo-db-1 mysql -D matomo -u matomo --password=${DBPASSWORD}
```
Import the live tables:

```shell
bzcat live.sql.bz2 | docker exec -i matomo-db-1 mysql -D matomo -u matomo --password=${DBPASSWORD}
```