Is there any way of archiving the reports other than cron?


Using cron in the infrastructure I am working on can be a bit troublesome, since the cron daemon has to run as root.

Besides running the command manually and using cron to auto-archive the reports at specific intervals, is there any other way this can be achieved?

(Lukas Winkler) #2


Well, if you can’t use cron on your OS, there are many other ways to run a process regularly.

I am pretty sure your environment provides one, as many system services need to be run regularly. One example is systemd timers, which, as far as I know, can be used similarly to cron.
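As a sketch of that idea, a systemd service/timer pair could replace the usual cron entry. The install path, URL, user name, and schedule below are assumptions to adapt to your setup:

```ini
; /etc/systemd/system/matomo-archive.service
; (path, URL, and user are assumptions -- adjust to your install)
[Unit]
Description=Run the Matomo archiver

[Service]
Type=oneshot
; Runs as the web-server user, not root
User=www-data
ExecStart=/usr/bin/php /var/www/matomo/console core:archive --url=https://matomo.example.org/

; /etc/systemd/system/matomo-archive.timer
[Unit]
Description=Run the Matomo archiver hourly

[Timer]
OnCalendar=hourly
Persistent=true

[Install]
WantedBy=timers.target
```

Enabling the timer with `systemctl enable --now matomo-archive.timer` then runs the archiver on the configured schedule, under whatever unprivileged user you set, so no root-owned cron daemon is involved.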

If there really is absolutely no way to call a program at regular intervals in your environment, you can start the archiving by accessing a URL:
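Roughly like the following, where the hostname and token are placeholders, and `misc/cron/archive.php` is the script Matomo’s docs describe for HTTP-triggered archiving (check the docs for your version for the exact path and parameters):

```shell
# Placeholder values -- substitute your own Matomo base URL and auth token.
MATOMO_URL="https://matomo.example.org"
TOKEN_AUTH="xyz123"

# Build the archive-trigger URL; fetching it (e.g. with curl) starts archiving:
ARCHIVE_URL="${MATOMO_URL}/misc/cron/archive.php?token_auth=${TOKEN_AUTH}"
echo "${ARCHIVE_URL}"
# curl -s "${ARCHIVE_URL}"
```

This could then be called from any external service that can fetch a URL on a schedule.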

But I really wouldn’t recommend this, as it may cause issues if the request times out (archiving can sometimes take quite a while).


Is archiving needed only when using the JavaScript tracker, or is it also needed when using the log analytics script?

(Lukas Winkler) #4


Archiving is always needed, as it is the process of generating all reports from the raw list of visits (as well as doing other jobs like checking for updates, updating GeoIP databases, etc.).

You can enable browser archiving, which means archiving runs when you open Matomo, but then Matomo will be much slower, as the reports haven’t been generated yet at the moment you want to look at them.
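For reference, browser-triggered archiving is toggled in Matomo’s configuration; the setting name below is the one I believe appears in Matomo’s `global.ini.php`, but verify it against your version before relying on it:

```ini
; config/config.ini.php
[General]
; Let viewing a report page trigger archiving for that report
enable_browser_archiving_triggering = 1
```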