Backup Efficiency

Hi,
We’ve been running Matomo for a couple of years and the database has reached 20 GB. I’ve gradually reduced the number of retained backups to accommodate the growth, but I now need a different strategy.

I moved from a ‘dump the entire database and then compress it’ strategy to a ‘work through the individual tables one by one, dumping and compressing each’ strategy. I’m guessing that some of the tables never change once their month is over, for example:

matomo_archive_numeric_YYYY_MM
matomo_archive_blob_YYYY_MM
Can anyone confirm that?
If so, maybe I don’t need to back up all of those tables every time and can just keep a single copy of each. The plan would then be: generate the list of tables, filter out the unchanging archive tables, and dump only what remains (I’ve sketched what I mean below). Does that sound like a plan?
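For concreteness, here is a minimal sketch of that approach. The database name (matomo), the backup directory (/var/backups/matomo), and the reliance on ~/.my.cnf for credentials are all placeholders for my real setup, and the whole thing rests on the assumption that archive tables for months before the current one really are immutable, which is exactly what I’m hoping someone can confirm:

```python
#!/usr/bin/env python3
"""Sketch: per-table Matomo backup that dumps a 'closed' archive table only once.

Assumptions: the database is called `matomo`, credentials come from
~/.my.cnf, and matomo_archive_{numeric,blob}_YYYY_MM tables stop
changing once their month has passed (unconfirmed!).
"""
import datetime
import re
import subprocess
from pathlib import Path

DB = "matomo"                              # assumed database name
BACKUP_DIR = Path("/var/backups/matomo")   # assumed destination

ARCHIVE_RE = re.compile(r"^matomo_archive_(numeric|blob)_(\d{4})_(\d{2})$")

def list_tables():
    """Return all table names in the database via the mysql CLI."""
    out = subprocess.run(
        ["mysql", "-N", "-e", "SHOW TABLES", DB],
        check=True, capture_output=True, text=True,
    ).stdout
    return out.split()

def is_closed_archive(table, today):
    """True for archive tables whose month is already over."""
    m = ARCHIVE_RE.match(table)
    if not m:
        return False
    year, month = int(m.group(2)), int(m.group(3))
    return (year, month) < (today.year, today.month)

def dump_table(table):
    """mysqldump one table and gzip it into BACKUP_DIR."""
    dest = BACKUP_DIR / f"{table}.sql.gz"
    with open(dest, "wb") as f:
        dump = subprocess.Popen(["mysqldump", DB, table], stdout=subprocess.PIPE)
        subprocess.run(["gzip"], stdin=dump.stdout, stdout=f, check=True)
        dump.stdout.close()
        if dump.wait() != 0:
            raise RuntimeError(f"mysqldump failed for {table}")

def main():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    today = datetime.date.today()
    for table in list_tables():
        dest = BACKUP_DIR / f"{table}.sql.gz"
        # Skip archive tables from past months if we already hold a copy.
        if is_closed_archive(table, today) and dest.exists():
            continue
        dump_table(table)

if __name__ == "__main__":
    main()
```

The idea is simply that a closed-out archive table gets dumped once and then kept, while every other table is dumped on every run.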

I occasionally re-read the documentation section about removing old reports or old visitor data from within the Matomo interface, but it isn’t clear to me what data would actually be lost (I don’t use Matomo myself, I just run the server), so I can’t decide whether that is an appropriate course of action. If someone could explain, ideally with examples, exactly what data is lost when running those options, then maybe I could make a decision about that.