Our Piwik instance, which has been running for 8 years, suffered a database failure, and we now need to restore it from a mysqldump file. Some of the tables are so massive, however, that the MySQL restore will take a week (assuming no errors)! For example, piwik_log_link_visit_action has 100 million rows. (Yes, we should have pruned the data, but we weren’t paying close attention to this system. Lesson learned.)
My question is: does anybody have a suggestion for how to restore this database faster? Ideally, I’d love to delete old data from the mysqldump file and restore only the last 2 years’ worth. However, I don’t want to break the integrity of the data (of course). Can this approach work, and if so, how?
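In case it helps frame answers: one thing I can already do is split the dump into one file per table, so the huge tables can be handled separately from the small ones. Here’s a rough sketch of what I mean — it assumes the default mysqldump section headers (`-- Table structure for table \`name\``), and `split_dump` is just my name for it:

```shell
# Rough sketch: split a mysqldump file into one .sql file per table, so
# huge tables (e.g. piwik_log_link_visit_action) can be restored
# separately or skipped. Assumes default mysqldump section headers.
split_dump() {
    mkdir -p split
    awk '/^-- Table structure for table `/ {
             name = $NF
             gsub(/`/, "", name)    # strip the backticks around the name
             if (out) close(out)    # avoid running out of file handles
             out = "split/" name ".sql"
         }
         out { print > out }' "$1"
}
```

Each `split/<table>.sql` could then be fed to `mysql` individually, which would at least let the small tables come back online first while we figure out what to do with the big ones.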
As a last resort, we can add a ton of CPU and RAM to the machine (it’s a VM), do the restore, delete everything but the last 2 years’ worth of data, then remove the extra CPU/RAM resources. But I’d rather restore only the data that we intend to keep, instead of restoring it all.
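If we do end up doing the full restore, my understanding is that the standard way to speed up a big InnoDB import is to relax the integrity checks for the duration of the session (this is the MySQL documentation’s bulk-loading advice; the file and database names here are just placeholders). Something like:

```shell
# Sketch: wrap the dump with session settings that speed up a large
# InnoDB import (unique/foreign-key checks and autocommit off), then
# commit at the end. Database name "piwik" is a placeholder.
wrap_dump() {
    echo 'SET unique_checks = 0;'
    echo 'SET foreign_key_checks = 0;'
    echo 'SET autocommit = 0;'
    cat "$1"
    echo 'COMMIT;'
}

# Usage (placeholder names):
#   wrap_dump dump.sql | mysql piwik
```

These are session-scoped settings, so they revert on their own when the import connection closes.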
Thank you for any tips.