We are running Piwik 1.5.1 and have a question concerning running the archive cronjob.
Our setup uses different web AND mysql servers for the frontend AND the backend. Since frontend and backend are separated environments, including the mysql databases, we have 2 Piwik installations.
The mysql data from the frontend environment is dumped (when newer) into the backend environment daily.
All this brings some problems.
When we change settings, users, or sites, we have to make sure those changes are made on the frontend system (as well). Otherwise we risk losing data when updating…
Optimizing tables in the backend makes no sense when the data will be overwritten with frontend data anyway…
It looks like only the following tables change on the live system and need to be dumped:
Maybe it would be a good idea to explicitly dump just those tables (preferably without the session table, I suppose).
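To illustrate the idea, a selective dump could look roughly like this. This is only a sketch: the host, database name, and table names are placeholders (assuming the default `piwik_` prefix) and would have to be replaced with the actual list of changing tables:

```shell
# Sketch only -- hostnames, database name, and table names are placeholders.
# Dump just the tables that change on the frontend, skipping the session table,
# so backend-only tables (settings, archives) are not overwritten.
mysqldump --single-transaction \
    --host=frontend-db --user=piwik_dump -p \
    piwik_db \
    piwik_log_visit \
    piwik_log_link_visit_action \
    piwik_log_action \
    piwik_log_conversion \
    > /backup/piwik_frontend_dump.sql
```

`--single-transaction` keeps the dump consistent for InnoDB tables without locking the live site.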
If admin settings are made on the admin system, maybe those tables should be dumped in the other direction.
Which tables are concerned? I assume
We have around 12 sites with a database size of around 4.3 GB.
How do we have to set up the archive cronjob?
We unfortunately have to admit that it has never been run until now…
My idea is to run it once daily, before the data is dumped to the admin system.
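For reference, the crontab entry for this could be sketched like so. The paths, user, and times below are assumptions for illustration (assuming Piwik is installed in /var/www/piwik and the dump to the admin system runs later in the night):

```shell
# /etc/cron.d/piwik-archive -- sketch only; adjust path, user, and time
# to your installation. Runs the Piwik archive script daily at 01:00,
# before the frontend data is dumped to the admin system.
0 1 * * * www-data /var/www/piwik/misc/cron/archive.sh > /var/log/piwik-archive.log 2>&1
```

Logging stdout and stderr to a file makes it easier to check whether archiving finished before the dump started.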
Once archive.sh has run successfully ONCE, is all older data archived correctly, so that we can delete the older logs and still keep the overviews?
Thanks in advance,